EP2834050A1 - Procédé pour commander un robot industriel - Google Patents

Procédé pour commander un robot industriel

Info

Publication number
EP2834050A1
Authority
EP
European Patent Office
Prior art keywords
operating
touch display
display
virtual
function
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP13714909.2A
Other languages
German (de)
English (en)
Other versions
EP2834050B1 (fr)
Inventor
Franz Som
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Reis Group Holding GmbH and Co KG
Original Assignee
Reis Group Holding GmbH and Co KG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Reis Group Holding GmbH and Co KG filed Critical Reis Group Holding GmbH and Co KG
Publication of EP2834050A1 publication Critical patent/EP2834050A1/fr
Application granted granted Critical
Publication of EP2834050B1 publication Critical patent/EP2834050B1/fr
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J9/161 Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 Controls for manipulators
    • B25J13/06 Control stands, e.g. consoles, switchboards
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/18 Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/409 Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by using manual data input [MDI] or by using control panel, e.g. controlling functions with the panel; characterised by control panel details or by setting parameters
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/35 Nc in input of data, input till input file format
    • G05B2219/35488 Graphical user interface, labview
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/36 Nc in input of data, input key till input tape
    • G05B2219/36159 Detachable or portable programming unit, display, pc, pda
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/36 Nc in input of data, input key till input tape
    • G05B2219/36168 Touchscreen

Definitions

  • The invention relates to a method for operating an industrial robot by means of an operating device having a graphical user interface with a touch display.
  • the handheld terminal may be coupled to a robot controller to program or control the industrial robot.
  • The handheld device includes electronics comprising a microprocessor in order to communicate with the robot controller. Furthermore, the handheld terminal comprises a display designed as a touch screen, an emergency stop button and a switch designed as a lock.
  • The handheld terminal comprises various input means or traversing means which can be operated independently of one another and which are embodied, for example, as a 6D mouse or as jog keys.
  • By means of the touch screen, it is possible to assign each of the traversing means its own reference coordinate system.
  • The control of the industrial robot takes place exclusively via the manually operable input means, so that the hand-held operating device is complex to manufacture and prone to errors in operation.
  • The handset in the form of a mobile phone has a touch screen which serves, on the one hand, as an output means for outputting information from the robot controller, in particular for displaying a user interface, and, on the other hand, as a command input means for inputting control commands by means of keys.
  • The handset is releasably attached to a portable safety device by means of a clamping device, as known in principle, for example, from brackets for mobile phones in motor vehicles, and is connected to it via a USB interface.
  • the safety input device has an emergency stop button, a consent button and a mode selector switch.
  • A disadvantage of this embodiment is that, for safe operation of the virtual buttons, an operator is always forced to look at the touch screen in order to avoid incorrect inputs. The same applies under adverse environmental conditions such as strong light or darkness, which complicate the operation of the touch screen.
  • A method for operating an industrial robot by means of an operating device having a graphical user interface with a touch display comprises the method steps: displaying at least one virtual operating element representing a function or operating mode of the industrial robot on the touch display; selecting a desired function or operating mode by actuating the at least one virtual operating element by an operator; detecting the actuation of the at least one virtual operating element and sending a control command corresponding to the selected function or operating mode to a safety means of the robot controller; evaluating the control command; and executing the selected function or operating mode if an acknowledgment button is correctly actuated or an identification matches.
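  • As an illustration only, the following minimal Python sketch traces that sequence of request, evaluation and acknowledgement; the class, method names and the simple string-based acknowledgement are hypothetical and not taken from the patent:

        class SafetyController:
            def __init__(self):
                self.pending = None

            def request(self, mode: str) -> None:
                # Evaluation of the control command: remember which mode awaits confirmation.
                self.pending = mode

            def acknowledge(self, confirmed_mode: str) -> str:
                # Execute the selected operating mode only if the acknowledgement matches.
                if self.pending is not None and confirmed_mode == self.pending:
                    result = f"operating mode '{self.pending}' activated"
                else:
                    result = "acknowledgement mismatch: request discarded"
                self.pending = None
                return result

        controller = SafetyController()
        controller.request("automatic")             # operator taps the virtual softkey
        print(controller.acknowledge("automatic"))  # matching confirmation: mode change executed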
  • From WO 2010/009488 A2 and WO 2011/150440 A2, methods for operating an industrial robot by means of an operating device having a graphical user interface with a touch display are known, in which the inputs are monitored by a safety controller.
  • The present invention has the object of developing a method of the type mentioned above in such a way that safety in the operation of an industrial robot is increased.
  • a preferred method is characterized in that the display position in which the graphic information is displayed within the image is determined at random.
  • The image with the integrated graphic information is transmitted as an image file, such as a bitmap, from the safety controller to the HMI device and is then displayed on the touch display in a position predetermined by the safety controller or known to the safety controller.
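  • The following Python sketch (standard library only, with purely illustrative image and icon sizes) shows one way such an image could be composed, with the random icon position retained on the safety-controller side:

        import random

        IMAGE_W, IMAGE_H = 400, 200   # size of the transmitted image (illustrative)
        ICON_W, ICON_H = 48, 48       # size of the embedded icon (illustrative)

        def compose_image(icon_pixels):
            # Random display position, known only to the safety controller.
            x = random.randint(0, IMAGE_W - ICON_W)
            y = random.randint(0, IMAGE_H - ICON_H)
            image = [[0] * IMAGE_W for _ in range(IMAGE_H)]   # blank background
            for dy in range(ICON_H):
                for dx in range(ICON_W):
                    image[y + dy][x + dx] = icon_pixels[dy][dx]
            return image, (x, y)   # (x, y) never leaves the safety controller

        icon = [[1] * ICON_W for _ in range(ICON_H)]   # placeholder icon content
        bitmap, secret_position = compose_image(icon)  # only the bitmap is sent to the HMI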
  • touch display is preferably a commercially available touchscreen with a smooth surface, which is preferably designed as a capacitive touchscreen, even if a resistive touchscreen is also suitable.
  • a virtual operating element corresponding to the function or operating mode is displayed as graphical information or a numerical code is pictorially represented.
  • The graphic information is stored in a memory of the safety controller using safe technology, preferably in two channels.
  • The evaluation of the control signal in the safety device likewise takes place using safe technology, preferably in two channels.
  • The touch of the virtual operating element on the surface of the touch display is detected by determining a first coordinate of the touch point. The function of the virtual operating element is triggered if the touch point, while remaining in permanent contact with the surface of the touch display, leaves a predetermined coordinate range as a result of a manual action by the operator.
  • the manual action can be triggered by a gesture of the operator.
  • the gesture can be performed by dragging a finger of the operator on the touch display in or out of the predetermined coordinate range.
  • The gesture is carried out in a defined direction, wherein the sensitivity to the finger movement and the intensity of the gesture required to trigger an action can be adjusted continuously.
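  • A minimal sketch of such a gesture check, assuming a simple projection of the finger movement onto the required direction; the threshold value stands in for the adjustable sensitivity and is purely illustrative:

        def gesture_triggers(touch_down, touch_current, direction=(1, 0), threshold=30.0):
            """touch_down / touch_current are (x, y) display coordinates in pixels;
            direction is the required drag direction; threshold models the sensitivity."""
            dx = touch_current[0] - touch_down[0]
            dy = touch_current[1] - touch_down[1]
            # Projection of the finger movement onto the required direction.
            travel = dx * direction[0] + dy * direction[1]
            return travel >= threshold

        # A drag to the right beyond the threshold triggers the function; pulling the
        # finger back below the threshold suppresses it again.
        print(gesture_triggers((100, 50), (140, 52)))   # True
        print(gesture_triggers((100, 50), (110, 52)))   # False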
  • The invention further relates to an independently inventive method for operating an industrial robot with a hand-held device. It is envisaged that the triggering of a touch function requires a manual action of the operator on the touch display. To prevent inadvertent triggering of virtual operating elements by unintentional contact, a function is only triggered if a special "small gesture" is carried out after touching the touch display, e.g. dragging the finger in a defined direction after the touch.
  • The gesture required to trigger a function can be adjusted continuously: it ranges from a simple finger touch, i.e. general operation of the touch display, to a defined gesture.
  • By means of haptic marks, such as specially shaped finger depressions in the display edge, the finger can slide in continuation of the finger depressions onto the touch display and thereby trigger a function. If the operator notices that he has initiated an unwanted function triggering, he can suppress the triggering of the function by pulling his finger back to the original position.
  • The device according to the invention is distinguished from the prior art in particular in that the number of hardware components is reduced to an absolute minimum. All operating functions are consistently implemented in touch software, with the exception of the safety-related switches “Emergency stop” and “Approval”. No other electrical components such as membrane keys, switches or signal lamps are required. As a result, the system is low-maintenance.
  • The space gained benefits a large, comfortable touch display.
  • the virtual controls and indicators displayed on the touch screen are designed for industrial use and are rich in contrast and large in size, allowing for reliable operation.
  • Fig. 2 shows a section of a display frame of the operating device adjoining the touch display.
  • Fig. 3 shows a second section of a display frame adjoining the touch display.
  • Fig. 6 schematically shows a method for operating an industrial robot with a hand-held operating device.
  • FIG. 7 shows a detail of a user interface of the operating device with virtual operating elements.
  • FIG. 8 shows a section of a user interface of the operating device with an image containing graphic information.
  • the handset 10 includes a graphical user interface 18 with a touch-sensitive display 20 - hereinafter called touch display.
  • the touch display 20 is used to display at least one virtual control element 22.1 ... 22.n, 24.1 ... 24.n, which represents a function for controlling, programming or operation of the industrial robot 12, wherein upon touching the virtual control element 22.1. ..22.n, 24.1 ... 24.n with a finger of an operator or a pen the assigned function is triggered.
  • the handset 10 further includes a control unit 30 for controlling the graphical user interface 18 and for communicating with the robot controller 16 and a position sensor for determining the position and inclination of the operating device.
  • the graphical user interface 18 with the touch display 20 is arranged together with the control unit 30 in a housing 32.
  • the housing 32 forms a display frame 34 which surrounds the touch display 20 at the edge.
  • a safety-oriented "emergency stop" switch 26 is furthermore arranged.
  • the virtual operating elements 22.1 .... 22.n and 24.1... 24.n are arranged along respective frame sections 36, 38 of the display frame adjoining the touch display 20.
  • Haptic marks 40.1 ... 40.n and 42.1 ... 42.n, respectively, are provided in the frame sections 36, 38.
  • Each haptic mark 40.1 ... 40.n, 42.1 ... 42.n is assigned a virtual operating element 22.1 ... 22.n, 24.1 ... 24.n.
  • the virtual operating element 22.1 ... 22.n, 24.1 ... 24.n adjoins directly at the haptic mark 40.1 ... 40.n or 42.1 ... 42. n, so that an immediate transition from the haptic mark 40.1 ... 40. n or 42.1 ... 42.n to the virtual control element 22.1 ... 22. n, 24.1 ... 24. n takes place.
  • a finger guided along a haptic mark 40.1... 40. n or 42.1... 42.n is guided virtually to the virtual operating element 22.1... 22.n, 24.1... 24.n.
  • The position of the virtual operating element is sensed with the aid of the haptic marks, and the function is then triggered by touching the virtual operating element. Furthermore, it is not necessary for the touch screen, i.e. the display 20, to be specially designed. In particular, and in contrast to the prior art, it is not necessary to attach special overlaid materials to the display, which would otherwise cause losses in transparency.
  • The haptic marks 40.1 ... 40.n or 42.1 ... 42.n form a guide by means of which a finger of an operator is led to the associated virtual operating element 22.1 ... 22.n, 24.1 ... 24.n.
  • FIG. 2 shows an enlarged representation of the operating elements 22.1 ... 22.n and the haptic marks 40.1 ... 40.n.
  • The haptic marks 40.1 ... 40.n are formed as finger depressions, which are shaped in such a way that they can be reliably sensed with the fingers and guide the finger from the frame sections 36, 38 in the direction of the assigned virtual operating element 22.1 ... 22.n or 24.1 ... 24.n.
  • haptic marks 43.1... 43. n are provided, which are formed as nubs and arranged on a surface of the display frame 34.
  • FIG. 3 shows an embodiment of a haptic mark 44 as a frame corner 46 of the display frame 34 adjoining the touch display 20.
  • a clear, exact position on the touch display 20 is defined by the frame corner 46 of the display frame 34.
  • a virtual operating element 48 is provided on the touch display 20, which is moved, for example, in a linear direction along a display-side frame section 50 or the other frame section 52 of the frame corner 44.
  • FIG. 4 shows a further embodiment of a haptic mark 54, which is designed as a display-side frame section 56 of the display frame 34.
  • the finger of an operator can perform a sliding movement over which a virtual sliding element 60 extending along the frame section 56 can be adjusted.
  • The haptic marks 40.1 ... 40.n, 42.1 ... 42.n shown in FIGS. 1 and 2 and designed as finger depressions form a haptic orientation on the display edge with high resolution, e.g. for sensing the positions of the virtual operating elements 22.1 ... 22.n, 24.1 ... 24.n, since these are arranged directly next to the finger depressions.
  • Each finger recess can be uniquely assigned to a virtual control element.
  • the triggering of a function assigned to the virtual operating element 22.1... 22n, 24.1... 24n requires a manual action of the operator on the touch display 20.
  • A function is triggered only when, after touching the touch display 20, a predefined gesture, such as dragging the finger in a defined direction, is executed.
  • the sensitivity of the reaction to the finger movement can be infinitely adjusted via a controller.
  • the intensity of the necessary gestures to trigger functions can be adjusted continuously. It ranges from a simple finger touch, general operation of the touch display 20, to a special little gesture.
  • the finger can slide in continuation of the finger hollows on the touch display and thereby trigger a function. If the operator notices that he initiated an unwanted function triggering, he can suppress the triggering of the function by pulling his finger back into the original position.
  • the corresponding coordinates of the contact point on the touch display are detected by the control unit 30.
  • The corresponding function is only triggered when the finger of the operator leaves a predetermined coordinate range or reaches a predefined coordinate range. If the virtual operating element is deflected and thus ready to trigger (triggering takes place when the finger is released), this is indicated by an optical identifier, for example a colored border, on the operating element. If an accidental deflection is reversed by pulling the operating element back to its origin, this is indicated by the disappearance of this optical identifier.
  • Virtual operating elements 48 which are placed, for example, in the frame corner 46 of the display frame 34 are associated with a special gesture control. They can be displaced along the frame sections 50, 52 in two directions 62, 64, as shown in Fig. 3. Each direction of movement 62, 64 is assigned a selectable function. Thus it is possible, for example, to activate a function "A" when moving along the frame section 52 and a function "B" when moving along the frame section 50. The degree of deflection is evaluated, with two evaluation options being provided.
  • the degree of deflection is immediately transferred to the function as an analog parameter, such as speed specification. If the finger is released in the deflected position, the analog value jumps immediately to zero. If the finger is slidingly guided back into the starting position 66, the parameter returns to zero analogously to the deflection.
  • This function can be used, for example, to start a movement program in the positive or negative direction while continuously varying the speed.
  • a switching function is triggered when a definable threshold value is exceeded.
  • The activation of the function takes place only when the finger leaves the touch display 20 in the deflected position. If, however, the finger is guided back to the zero position along the frame sections 50, 52 without being released, the triggering of the function is prevented.
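  • The two evaluation options described above can be sketched as follows; the function names and pixel values are hypothetical, not taken from the patent:

        def analog_value(deflection_px, max_deflection_px=100.0):
            # Continuous evaluation: 0.0 at the origin, 1.0 at full deflection.
            return max(0.0, min(1.0, deflection_px / max_deflection_px))

        def switching_function(deflection_px, finger_released, threshold_px=60.0):
            # Discrete evaluation: trigger only on release in the deflected position;
            # sliding back to the zero position before releasing prevents the trigger.
            return finger_released and deflection_px >= threshold_px

        print(analog_value(45))               # 0.45, e.g. used as a speed specification
        print(switching_function(80, True))   # True: assigned function is executed
        print(switching_function(80, False))  # False: finger still on the display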
  • A further idea of the invention relates to the realization of a so-called override function (speed controller), which is implemented via the sliding operating element 60 shown in FIG. 4.
  • the sliding control element 60 is placed along the frame section 56 and in the middle of the haptic mark 43.
  • the blind adjustment is additionally supported by the haptic marks 22.1 ... 22.n 43, 54 of the frame sections 38, 56.
  • The so-called override can be adjusted by a defined amount, e.g. 20, for a shift between two haptic marks.
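  • A small sketch of such a stepwise override adjustment, assuming one step per haptic mark and a 0-100 override range; both assumptions are illustrative only:

        def adjust_override(current_override, marks_passed, step=20):
            # marks_passed is positive when sliding towards "faster", negative towards "slower".
            return max(0, min(100, current_override + marks_passed * step))

        print(adjust_override(40, +1))   # 60: one haptic mark further
        print(adjust_override(40, -2))   # 0: two marks back, clamped at the lower limit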
  • Another inventive feature relates to the symmetrical arrangement of the haptic marks 22.1 ... 22.n, 24.1 ... 24.n with respect to the longitudinal or transverse center axis of the touch display 20.
  • the longitudinal center axis is the straight line which runs centrally and parallel to the longitudinal frame legs of the display frame 34. Perpendicular to this, the transverse center axis runs, ie centrally between and parallel to the shorter transverse limbs of the display frame 34. This ensures that the handset 10 is suitable both for a right-handed operation and a left-handed operation. This is achieved in particular by the consistent buttonless design of the graphical user interface and by the symmetrical arrangement of the haptic marks.
  • Fig. 5 shows a rear side 66 of the housing 32.
  • retaining strips 70, 72 are arranged, on which the handset 10 can be securely held with one or both hands.
  • the retaining strips 70, 72 may have an outer geometry corresponding to cylinder sections, wherein the retaining strips 70, 72 should emanate from the outer edge, ie from the longitudinal edges of the display frame 34.
  • In each retaining strip 70, 72, a consent switch or consent button 74, 76 is integrated, one of which must be actuated in order to enable traversing movements of the industrial robot.
  • Another inventive embodiment of the invention is characterized in that a previously customary key-operated switch for selecting the robot operating modes “set-up”, “automatic”, “automatic test” is replaced by a software function.
  • The touch display 20 is in principle a single-channel and thus non-safe device. Safe operation is achieved with the aid of a safety controller 78, hereinafter referred to as SafetyController 78, which is integrated into the robot controller 16. The SafetyController 78 may be designed, for example, according to European Patent Application 1 035 953, the disclosure of which is fully incorporated into the present application; however, the teaching according to the invention is not limited to a safety controller according to European Patent Application 1 035 953.
  • The user interface 18 offers the various operating modes for selection in the form of virtual operating elements 80, 82, 84 such as softkeys, as shown in FIG. 7.
  • the operator selects a new operating mode "X.”
  • the newly selected operating mode is sent to the SafetyController 78 as a command "Request new operating mode-X.”
  • the SafetyController 78 takes from its memory 86 graphic information corresponding to this operating mode, such as Icon 88, and places it in a randomly determined display position in a larger image 90.
  • the position of the icon 88 in the image 90 is known only to the SafetyController 78.
  • This image 90 is sent to the user interface 18 as an image file, such as a bitmap, and displayed there in a defined position, as shown in FIG. 8.
  • By a fingertip on the displayed icon 88, the operator must confirm the operating mode detected by the SafetyController 78.
  • A touch position on the touch display is recorded in the form of touch coordinates and sent back to the SafetyController 78. The SafetyController compares the touch position with the random display position of the icon 88 in the image 90, which is known only to the SafetyController 78. The comparison takes place taking into account the known position of the image 90 on the touch display 20. If the touch position is equal to the display position (within a defined tolerance), the initiated mode change is executed. Otherwise, the mode change is discarded and the previous mode is retained.
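  • The comparison performed in the SafetyController could look roughly as follows; the coordinates, icon size and tolerance used here are hypothetical:

        def mode_change_allowed(touch_xy, icon_xy_in_image, image_origin_on_display,
                                icon_size=(48, 48), tolerance=10):
            # Absolute position of the icon on the touch display.
            icon_x = image_origin_on_display[0] + icon_xy_in_image[0]
            icon_y = image_origin_on_display[1] + icon_xy_in_image[1]
            within_x = icon_x - tolerance <= touch_xy[0] <= icon_x + icon_size[0] + tolerance
            within_y = icon_y - tolerance <= touch_xy[1] <= icon_y + icon_size[1] + tolerance
            return within_x and within_y   # True: execute mode change, False: discard it

        print(mode_change_allowed((250, 140), (200, 100), (30, 20)))  # touch on the icon: True
        print(mode_change_allowed((40, 40), (200, 100), (30, 20)))    # touch elsewhere: False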
  • SafetyController 78 displays the detected operating mode on the HMI device 10,
  • SafetyController 78 sets the new operating mode.
  • the SafetyController 78 can display an iconized numerical code which has to be recognized by the operator and entered as a number via a displayed keyboard. The touch position of the displayed numbers of the keyboard is sent to the SafetyController, which checks the correctness of the input.
  • the icons 80, 82, 84 are stored in the SafetyController 78 in secure technology.
  • a request for mode change can also be made via a hardware key switch.
  • the insertion / removal of the key in the operation selector switch is simulated by a login / logout procedure using a PIN.
  • The industrial robot 12 can be moved in 6 degrees of freedom; for example, the positions X, Y, Z and the orientations A, B, C of a tool 91 can be sensitively controlled.
  • The industrial robot 12 can be traversed simultaneously in two coordinate directions, e.g. X and Y.
  • a deflection of the finger generates a speed command for the industrial robot 12: the more the finger is deflected, the faster the robot moves.
  • After touching a selected traversing surface 100 shown in FIG. 9, the user activates a virtual operating element 92 which is positioned in the region of the point of contact. Subsequently, the virtual operating element 92 can be dragged by the finger across the entire touch display 20, even beyond a boundary 94, and thus generate travel specifications. After the finger is released, the industrial robot 12 stops immediately. Afterwards, the desired surface 100 must be hit again for a new travel specification.
  • The sensitivity of the reaction to a finger movement can be adjusted continuously via a virtual operating element 96, such as a slide control (override), both for the position specification and for the speed specification.
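  • A possible sketch of the deflection-to-speed mapping for the 2D traversing field, with assumed gain, override and speed-limit values:

        def jog_speed(touch_down, touch_current, finger_on_display,
                      gain=0.5, override=1.0, v_max=250.0):
            """Returns a (vx, vy) speed command for the 2D traversing field."""
            if not finger_on_display:
                return (0.0, 0.0)   # releasing the finger stops the robot immediately
            vx = (touch_current[0] - touch_down[0]) * gain * override
            vy = (touch_current[1] - touch_down[1]) * gain * override
            clamp = lambda v: max(-v_max, min(v_max, v))
            return (clamp(vx), clamp(vy))

        print(jog_speed((100, 100), (180, 60), True))    # (40.0, -20.0): larger deflection, faster motion
        print(jog_speed((100, 100), (180, 60), False))   # (0.0, 0.0): robot stopped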
  • the sensitive area 100 for the 2D method is placed in the vicinity of the display edge 36 so that it can still be reached with a clear distance from the edge of the display with the finger (for example with the spread thumb).
  • An approximately finger-wide field 101 with a virtual operating element 98 is placed directly at the display edge 36, so that this field 101 can be "felt" with the finger by guiding the finger, in particular the thumb, along the display edge 36.
  • This field produces a one-dimensional travel specification, for example in the Z direction.
  • the operator can clearly differentiate and blindly reach the two travel fields 100, 101:
  • the field 100 which is placed about a finger or thumb width next to the display edge 36, activates the simultaneous travel presetting in two dimensions (XY dimension).
  • the industrial robot 12 has 6 degrees of freedom.
  • the display is divided into two zones.
  • In the upper zone are the traversing fields 100, 101 for the dimensions 1-3 (e.g. X, Y, Z).
  • In the lower zone are the traversing fields 102, 103 for the dimensions 4-6, e.g. A, B, C.
  • the user activates a virtual operating element 104 which is positioned in the region of the point of contact. Subsequently, the virtual control element 104 can be moved on the touch display in order to generate a driving instruction.
  • the industrial robot 12 can be moved simultaneously in all 6 degrees of freedom.
  • the traversing functions can only be used sequentially.
  • the touch display is ideally aligned collinear with the coordinate system of the industrial robot.
  • the robot movement optimally matches the finger movement on the touch display.
  • the coordinate system of the touch display must be recalibrated to the coordinate system of the robot.
  • a special virtual operating element 114 with a pointer 116 is provided on the touch display 20.
  • This operating element 114 must first be touched with a finger, and then the finger must be dragged in a selected direction of the robot coordinate system, e.g. the X-direction.
  • The X-direction can, for example, be marked in the working area of the robot by a marking on the floor surface.
  • The movement of the finger on the operating element 114, and thus the alignment of the pointer 116, takes place parallel to the marking present in the working area of the robot. Such a marking is indicated by "200" in Fig. 6.
  • the vector direction between the first touch point and the release point is calculated
  • From this vector, the selected robot coordinate axis and a common Z vector, a rotation matrix is calculated, by which all finger movements are transformed before being passed to the robot as a travel specification.
  • The recalibration is carried out as quickly as possible with a single gesture. After recalibration, both coordinate systems are again collinear with each other, quasi-aligned. For better verification, the direction of the calibrated display coordinate system is shown graphically on the touch display.
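  • A simplified 2D sketch of this recalibration step, assuming the gesture vector is taken to be parallel to the marked robot X-axis; the helper names are illustrative:

        import math

        def rotation_from_gesture(touch_down, release):
            # Direction drawn by the operator, assumed parallel to the marked robot X-axis.
            gx, gy = release[0] - touch_down[0], release[1] - touch_down[1]
            angle = math.atan2(gy, gx)                  # angle between display X and robot X
            c, s = math.cos(-angle), math.sin(-angle)   # rotate display vectors back by that angle
            return ((c, -s), (s, c))

        def to_robot_frame(rotation, finger_delta):
            (r00, r01), (r10, r11) = rotation
            dx, dy = finger_delta
            return (r00 * dx + r01 * dy, r10 * dx + r11 * dy)

        R = rotation_from_gesture((0, 0), (100, 100))   # operator drew the gesture at 45 degrees
        print(to_robot_frame(R, (1.0, 1.0)))            # approx. (1.414, 0.0): pure robot-X motion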
  • The coordinate system of the robot is shown in Fig. 6. It can be seen that the marking 200 is parallel to the X-axis.
  • The Y-axis extends in the plane of the footprint of the robot 12. Perpendicular to this extends the Z-axis, about which the robot 12 is rotatable (arrow A1).
  • In FIGS. 1 and 6, the arrows 1, 2, 3, 4, 5, 6 and A1, A2, A3, A4, A5, A6 indicate the pivoting or rotational movements of the robot 12 and of the arms holding the tool 91.
  • the robot 12 can move the tool 91 with 6 degrees of freedom.
  • This calibration method according to the invention, which works without sensors, can also be used for any other coordinate systems, such as freely definable frames.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manufacturing & Machinery (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to a method for controlling an industrial robot (12) by means of an operating device (10) having a graphical user interface (18) provided with a touch display (20). To improve the operating safety of the industrial robot, at least one virtual operating element representing a function of the industrial robot (12) is displayed on the touch display, a control signal assigned to it is sent to a safety controller, and an image is generated by means of the safety controller, which is then displayed on the touch display (20). When the image on the touch display is touched, a response is sent back to the safety controller in order to execute a function of the industrial robot when a match is detected between the displayed image and the touching of the image on the touch display.
EP13714909.2A 2012-04-05 2013-04-05 Procédé pour commander un robot industriel Active EP2834050B1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102012103030A DE102012103030B3 (de) 2012-04-05 2012-04-05 Verfahren zur Bedienung eines Industrieroboters
PCT/EP2013/057179 WO2013150130A1 (fr) 2012-04-05 2013-04-05 Procédé pour commander un robot industriel

Publications (2)

Publication Number Publication Date
EP2834050A1 true EP2834050A1 (fr) 2015-02-11
EP2834050B1 EP2834050B1 (fr) 2016-08-24

Family

ID=48050703

Family Applications (1)

Application Number Title Priority Date Filing Date
EP13714909.2A Active EP2834050B1 (fr) 2012-04-05 2013-04-05 Procédé pour commander un robot industriel

Country Status (8)

Country Link
US (1) US9333647B2 (fr)
EP (1) EP2834050B1 (fr)
KR (1) KR101536107B1 (fr)
CN (1) CN104379307B (fr)
DE (1) DE102012103030B3 (fr)
DK (1) DK2834050T3 (fr)
ES (1) ES2598242T3 (fr)
WO (1) WO2013150130A1 (fr)

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102012102749A1 (de) 2012-03-29 2013-10-02 Reis Group Holding Gmbh & Co. Kg Vorrichtung und Verfahren zur Bedienung eines Industrieroboters
DE102013216740A1 (de) * 2013-08-23 2015-02-26 Robert Bosch Gmbh Bedienvorrichtung, Steuervorrichtung und Anlage der Automationstechnik
DE102014200066A1 (de) * 2014-01-07 2015-07-09 Spinner Werkzeugmaschinenfabrik Gmbh Bediengerät für eine Werkzeugmaschine mit lageabhängiger Zuordnung von Bedienbefehlen zu einzelnen Bewegungsachsen der Werkzeugmaschine
CN104238418A (zh) * 2014-07-02 2014-12-24 北京理工大学 一种交互现实系统和方法
US10617234B2 (en) * 2014-09-03 2020-04-14 Key Infuser Device for interaction of an object exhibited with the aid of a robotic arm
DE102014014498A1 (de) * 2014-09-25 2016-03-31 Wavelight Gmbh Mit einem Touchscreen ausgestattetes Gerät sowie Verfahren zur Steuerung eines derartigen Geräts
US9597807B2 (en) * 2014-10-24 2017-03-21 Hiwin Technologies Corp. Robot teaching device
WO2016203364A1 (fr) * 2015-06-15 2016-12-22 Comau S.P.A. Dispositif de commande de sécurité portable pour machines industrielles, en particulier pour des robots
EP3371684B1 (fr) 2015-11-02 2023-10-11 The Johns Hopkins University Procédé, dispositif et support lisible par ordinateur pour la gestion, par un dispositif mobile, d'un robot industriel collaboratif
DE102016202881B4 (de) * 2016-02-24 2018-01-18 Kuka Roboter Gmbh Bediengerät für Manipulator
DE102016204137A1 (de) * 2016-03-14 2017-09-14 Kuka Roboter Gmbh Programmierbares Manipulatorsystem mit einer Funktionsschaltervorrichtung
DE102016208811B3 (de) 2016-05-20 2017-10-05 Kuka Roboter Gmbh Mobile Sicherheits-Grundsteuervorrichtung mit einer Kodiervorrichtung für ein mobiles Endgerät mit Multi-Touchscreen
DE102016211244B4 (de) * 2016-06-23 2018-01-18 Kuka Roboter Gmbh Roboter-Bedienhandgerätverbund mit einem Grundsteuerungs-Lagesensor
JP2018176320A (ja) * 2017-04-07 2018-11-15 株式会社ディスコ 加工装置
JP1606242S (fr) * 2017-11-22 2018-06-11
USD938960S1 (en) * 2019-03-27 2021-12-21 Teradyne, Inc. Display screen or portion thereof with graphical user interface
CN111709922B (zh) * 2020-06-10 2023-07-04 北京百度网讯科技有限公司 图像质量比较方法、装置、设备以及存储介质
CN113733087B (zh) * 2021-09-06 2023-03-07 深圳太力生物技术有限责任公司 细胞操作机器人的控制信息配置方法、装置、设备和介质

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5465215A (en) * 1994-07-07 1995-11-07 Cincinnati Milacron Inc. Numerical control method and apparatus
JP4014662B2 (ja) * 1995-09-18 2007-11-28 ファナック株式会社 ロボット教示操作盤
DE59809233D1 (de) * 1997-12-06 2003-09-11 Elan Schaltelemente Gmbh & Co Überwachungs- und steuergerät sowie verfahren zur überwachung einer technischen anlage mit erhöhten sicherheitsanforderungen, insbesondere eines handhabungsgerätes
DE10007308A1 (de) * 2000-02-17 2001-08-23 Bosch Gmbh Robert Verfahren und Vorrichtung zur Ermittlung der verbleibenden Betriebsdauer eines Produktes
US8388530B2 (en) * 2000-05-30 2013-03-05 Vladimir Shusterman Personalized monitoring and healthcare information management using physiological basis functions
AT412176B (de) * 2001-06-26 2004-10-25 Keba Ag Tragbare vorrichtung zumindest zur visualisierung von prozessdaten einer maschine, eines roboters oder eines technischen prozesses
ITTO20020862A1 (it) * 2002-10-04 2004-04-05 Comau Spa Sistema di programmazione per robot o simili apparati
ITTO20020863A1 (it) * 2002-10-04 2004-04-05 Comau Spa Terminale portatile di comando, programmazione e/ o
US20040090428A1 (en) 2002-11-08 2004-05-13 Xerox Corporation Overlays with raised portions for touch-sensitive screens
JP2005161498A (ja) * 2003-12-05 2005-06-23 National Institute Of Advanced Industrial & Technology ロボット遠隔操作制御装置
EP1786601A1 (fr) 2004-06-24 2007-05-23 Abb Ab Systeme robotique industriel pourvu d'un dispositif de commande operateur portable
EP1716982B1 (fr) * 2005-04-19 2008-05-07 COMAU S.p.A. Procédé de commande de robots industriels et robots, systèmes de robots et programmes d'ordinateur associés
EP1719588A1 (fr) * 2005-05-02 2006-11-08 Abb Research Ltd. Système de contrôle d'un robot comprenant un pendant de commande portable muni d'un dispositif de sécurité
DE102007018607A1 (de) * 2007-04-18 2008-10-30 Abb Research Ltd. Portable Bedienvorrichtung
US20110131002A1 (en) * 2008-05-15 2011-06-02 Simeon Falk Sheye Method for automatic testing of software
WO2009149740A1 (fr) * 2008-06-09 2009-12-17 Abb Technology Ab Procédé et système facilitant la calibration d'une cellule robotisée programmée hors ligne
AT10676U1 (de) * 2008-07-21 2009-08-15 Keba Ag Verfahren zum betreiben eines mobilen handbediengerätes für die abgabe oder freischaltung von potentiell gefahrbringenden steuerkommandos sowie entsprechendes handbediengerät
AT509932A3 (de) * 2010-05-31 2015-04-15 Keba Ag Verfahren und steuerungssystem zum programmieren oder vorgeben von bewegungen oder abläufen eines industrieroboters
DE102010025781B4 (de) * 2010-07-01 2022-09-22 Kuka Roboter Gmbh Tragbare Sicherheitseingabeeinrichtung für eine Robotersteuerung
DE102010039540C5 (de) 2010-08-19 2020-01-02 Kuka Deutschland Gmbh Handbediengerät zum manuellen Bewegen eines Roboterarms

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2013150130A1 *

Also Published As

Publication number Publication date
KR101536107B1 (ko) 2015-07-13
EP2834050B1 (fr) 2016-08-24
WO2013150130A1 (fr) 2013-10-10
ES2598242T3 (es) 2017-01-26
CN104379307B (zh) 2016-05-18
KR20140148474A (ko) 2014-12-31
US9333647B2 (en) 2016-05-10
DE102012103030B3 (de) 2013-05-23
DK2834050T3 (en) 2016-12-12
US20150066209A1 (en) 2015-03-05
CN104379307A (zh) 2015-02-25

Similar Documents

Publication Publication Date Title
EP2834050B1 (fr) Procédé pour commander un robot industriel
EP2834051B1 (fr) Procédé pour commander un robot industriel
EP2874789B1 (fr) Procédé de commande d'un robot industriel et dispositif permettant de mettre en oeuvre le procédé
EP2834715B1 (fr) Procédé pour commander un robot industriel
EP2977841B1 (fr) Procede de fonctionnement d'un robot industriel
DE112014004307B4 (de) Roboterbedienungsvorrichtung, Robotersystem, und Roboterbedienungsprogramm
EP2920656B1 (fr) Procédé pour activer de manière fiable et intentionnelle des fonctions et/ou des mouvements d'une installation technique commandable
EP3458232A1 (fr) Dispositif de commande de base de sécurité mobile comprenant un dispositif de codage pour un terminal mobile à écran tactile multipoint et procédé d'établissement d'une liaison de commande associée de façon univoque
EP3366434A1 (fr) Procédé de vérification d'une fonction d'un véhicule et/ou d'au moins un dispositif de commande
EP2953793A1 (fr) Système de commande d'une presse à imprimer
WO2016142378A1 (fr) Procédé de sélection ciblée d'éléments affichés sur un écran tactile

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20141103

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAX Request for extension of the european patent (deleted)
GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

INTG Intention to grant announced

Effective date: 20160330

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

Free format text: NOT ENGLISH

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 822628

Country of ref document: AT

Kind code of ref document: T

Effective date: 20160915

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

Free format text: LANGUAGE OF EP DOCUMENT: GERMAN

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 502013004167

Country of ref document: DE

REG Reference to a national code

Ref country code: CH

Ref legal event code: NV

Representative's name: FIAMMENGHI-FIAMMENGHI, CH

REG Reference to a national code

Ref country code: SE

Ref legal event code: TRGR

REG Reference to a national code

Ref country code: DK

Ref legal event code: T3

Effective date: 20161206

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20160824

REG Reference to a national code

Ref country code: ES

Ref legal event code: FG2A

Ref document number: 2598242

Country of ref document: ES

Kind code of ref document: T3

Effective date: 20170126

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161124

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161226

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161125

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 5

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 502013004167

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161124

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20170526

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

REG Reference to a national code

Ref country code: IE

Ref legal event code: MM4A

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170405

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 6

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20170430

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170405

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170430

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

REG Reference to a national code

Ref country code: AT

Ref legal event code: MM01

Ref document number: 822628

Country of ref document: AT

Kind code of ref document: T

Effective date: 20180405

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20130405

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

Ref country code: AT

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20180405

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160824

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161224

REG Reference to a national code

Ref country code: DE

Ref legal event code: R082

Ref document number: 502013004167

Country of ref document: DE

Representative's name: KILBURN & STRODE LLP, NL

P01 Opt-out of the competence of the unified patent court (upc) registered

Effective date: 20230528

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: ES

Payment date: 20230512

Year of fee payment: 11

Ref country code: DK

Payment date: 20230414

Year of fee payment: 11

Ref country code: DE

Payment date: 20230307

Year of fee payment: 11

Ref country code: CH

Payment date: 20230502

Year of fee payment: 11

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20240229

Year of fee payment: 12

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: SE

Payment date: 20240312

Year of fee payment: 12

Ref country code: IT

Payment date: 20240313

Year of fee payment: 12

Ref country code: FR

Payment date: 20240308

Year of fee payment: 12