WO2023138784A1 - System and method for configuring a robot - Google Patents

System and method for configuring a robot

Info

Publication number
WO2023138784A1
WO2023138784A1 (PCT/EP2022/051389)
Authority
WO
WIPO (PCT)
Prior art keywords
robot
measurement data
operator
operating mode
manipulator
Prior art date
Application number
PCT/EP2022/051389
Other languages
German (de)
English (en)
Inventor
Simon Mayer
Felix Wohlgemuth
Iori MIZUTANI
Original Assignee
Universität St. Gallen
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Universität St. Gallen filed Critical Universität St. Gallen
Priority to PCT/EP2022/051389 priority Critical patent/WO2023138784A1/fr
Publication of WO2023138784A1 publication Critical patent/WO2023138784A1/fr

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture
    • B25J9/161Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • B25J9/1689Teleoperation
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/42Recording and playback systems, i.e. in which the programme is recorded from a cycle of operations, e.g. the cycle of operations being manually controlled, after which this record is played back on the same machine
    • G05B19/427Teaching successive positions by tracking the position of a joystick or handle to control the positioning servo of the tool head, master-slave control
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/35Nc in input of data, input till input file format
    • G05B2219/35444Gesture interface, controlled machine observes operator, executes commands
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/35Nc in input of data, input till input file format
    • G05B2219/35472Mode selection
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/36Nc in input of data, input key till input tape
    • G05B2219/36435Electromyographical, myoelectric control signal

Definitions

  • the invention relates to a system and method for configuring a robot.
  • Robots, for example industrial robots, enable fundamental improvements in efficiency and automation.
  • Robots are usually associated with high acquisition costs, which is why small and medium-sized companies in particular have a strong interest in being able to use them as flexibly as possible. The robots must therefore be suitable for being trained in a large number of different activities and processes.
  • Classic systems rely on code-based programming, since it enables offline configuration without interrupting the production process.
  • However, this approach requires relevant IT knowledge and therefore specialist staff or external service providers. As a result, the approach is cost-intensive and not very economical.
  • US Pat. No. 9,821,457 B1 discloses a method for configuring a robot using touch-sensitive screens or simple programming tools. A target trajectory is specified. However, programming knowledge is also required and the movement to be carried out is not defined directly, but only via manual user input (i.e. indirectly).
  • US 2020/0 097 081 A1 discloses a method for setting virtual systems into specific states based on electromyographic measurement data. Here, too, the configuration is done indirectly. A manipulator of the robot is not moved directly.
  • US 2012/0 221 177 A1 discloses a combination of electromyographic sensors and inertial sensors (acceleration and yaw rate sensors) to control movements of a robot. Measurement data from the sensors must be determined, which must exceed predefined threshold values in order to activate the configuration mode.
  • CN 107 553 499 A also discloses a combination of electromyographic sensors and inertial sensors (acceleration and yaw rate sensors) in order to demonstrate movements and to configure the robot accordingly.
  • In these known approaches, the manipulator often cannot be moved or positioned with sufficient precision, or at all. In addition, to activate an operating mode via a software or hardware switch, the user often has to use both hands during the configuration tasks to set the correct mode. Alternatively, several users are required to configure the robot together. As a result, the configuration of robots has so far been complex.
  • a system for configuring a robot comprises at least one sensor device carried by an operator, a classification device, a control device and a robot with at least one manipulator.
  • the classification device receives at least measurement data from the sensor device.
  • the control device is coupled to at least the classification device and the robot.
  • the control device puts the robot into at least a first operating mode in which the at least one manipulator can be moved directly (immediately) by the operator if the classification device assigns the measurement data received as a result of a gesture by the operator to a predefined first reference gesture.
  • In the first operating mode, a gravity compensation mechanism of the robot can be activated.
  • the operator performs a specific gesture.
  • Measurement data are determined by the sensor device based on this gesture by the operator.
  • This measurement data is transmitted to the classification device, which determines whether the measurement data received can be assigned to a predefined reference gesture.
  • the classification device can have corresponding reference measurement data, which the classification device received during a training phase based on the reference gestures.
  • the measurement data actually received during operation are then related by the classification device to the reference measurement data of the corresponding reference gesture and the degree of correspondence is evaluated.
  • The measurement data received from the sensor device are based on a measured variable that can advantageously be recorded while the operator's hands remain free.
  • The present system advantageously enables the robot to be switched between different operating modes without manual actuation of a hardware or software switch being necessary for this purpose.
  • Switching between operating modes is simplified and linked to the interactive manipulation of the robot in an intuitive way.
  • In the first operating mode, the manipulator of the robot can be moved freely. This means that locking mechanisms that would otherwise prevent a free, direct (immediate) movement of at least the manipulator by the operator are essentially disabled.
  • the robot itself or its manipulators are not changed when the robot is switched to the first operating mode. So robots, especially collaborative robots, can be configured with less work.
  • the operator's hands remain free, so that the operator can advantageously use them to carry out other activities, for example to move the manipulator of the robot.
  • the complexity of configuring the robot can advantageously be reduced.
  • The configuration procedure enables a more streamlined and efficient handling of the robot. The advantages are particularly effective when the robot has to be configured to carry out activities that were not previously planned.
  • a robot is understood to mean a technical apparatus which has at least one manipulator.
  • the manipulator provides at least one articulated connection, so that a kinematic chain is generated, which enables a movement along one or more directions by means of at least one actuator.
  • the robot can have an end effector, which forms the last element of the kinematic chain of the manipulator.
  • the end effector is used to perform an activity, such as making a weld.
  • the end effector can be moved by an operator during the first operating mode of the robot in order to show the robot an activity to be carried out, for example to produce a welded connection along a specific trajectory.
  • the robot can also have a number of manipulators and/or a number of end effectors, which are designed in a corresponding manner and can be influenced by the operator.
  • a classification device can be understood to mean a hardware and/or software-based component which has a classification algorithm or is set up to execute one.
  • The classification algorithm can be understood as a set of assignment rules on the basis of which the measurement data received from the sensor device can be related to reference gestures.
  • corresponding reference measurement data are determined, which the classification device received from the sensor device in a training phase of the classification algorithm.
  • the measurement data received during operation are then compared with the reference measurement data by the classification device using the classification algorithm in order to determine the degree of agreement. If the degree of agreement is above a predetermined threshold value, the measurement data received during operation are determined as representing the corresponding reference gesture.
  • the corresponding reference gesture is then assigned to the measurement data received during operation.
  • For the assignment, the classification device uses non-parametric methods for estimating probability density functions (e.g. k-nearest-neighbor algorithms).
  • the classification device uses the measurement data received from the sensor device as input. As an output, the classification device provides an assignment variable that indicates whether an assignment to a predefined reference gesture has taken place based on the received measurement data.
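  • Purely as an illustration, such a classification device could be sketched as below in Python; the k-nearest-neighbor rule matches the non-parametric approach mentioned above, but the feature layout, the gesture labels, the value of k and the agreement threshold are assumptions, not details taken from this publication.

```python
import numpy as np

class GestureClassifier:
    """Illustrative sketch: assigns received measurement data to a reference gesture via k-NN."""

    def __init__(self, k=5, min_agreement=0.6):
        self.k = k                            # number of nearest reference samples considered
        self.min_agreement = min_agreement    # required fraction of agreeing neighbors
        self.reference_features = []          # reference measurement data from the training phase
        self.reference_labels = []            # gesture id (1..4) for each reference sample

    def add_reference(self, features, gesture_id):
        self.reference_features.append(np.asarray(features, dtype=float))
        self.reference_labels.append(int(gesture_id))

    def classify(self, features):
        """Returns the assignment variable: a gesture id (1..4), or 0 if no assignment is made."""
        refs = np.stack(self.reference_features)
        labels = np.asarray(self.reference_labels)
        dists = np.linalg.norm(refs - np.asarray(features, dtype=float), axis=1)
        nearest = labels[np.argsort(dists)[: self.k]]
        values, counts = np.unique(nearest, return_counts=True)
        best = int(np.argmax(counts))
        agreement = counts[best] / self.k     # degree of agreement among the nearest neighbors
        return int(values[best]) if agreement >= self.min_agreement else 0
```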
  • The control device can be a hardware- or software-based component, which in particular comprises a data processing device.
  • the control device can receive the allocation variable from the classification device. Based on the assignment variable, the control device can issue corresponding commands to the robot in order to put it into a specific operating mode.
  • the control device can preferably include the classification device.
  • the control device and the classification device can be combined in one device.
  • the classification device can be in the form of a computer program and can be executed in the control device.
  • the control device receives the measurement data from the sensor device and feeds them to the classification device.
  • a gesture by the operator can be understood to mean a specific posture of at least one part of the operator's body.
  • the sensor device can be set up to record a measured variable depending on the posture of the at least one part of the operator's body. Based on the detected measured variable, the sensor device can determine corresponding measurement signals from which measurement data are then derived.
  • the measurement data received by the classification device can thus essentially be determined by corresponding electromyographic action currents of the operator when performing a gesture, which will be explained in more detail below.
  • A gravitational compensation mechanism can be understood to mean that the at least one manipulator of the robot can be moved at the joint level based exclusively on torque control.
  • the gravitational terms are compensated during the movement.
  • the gravitational terms can be compensated for by at least one actuator of the robot's manipulator.
  • the gravity compensation mechanism thus provides compliance of the robot's manipulator. It allows the user to intuitively position the manipulator in the entire workspace. As a result, movements to be carried out and positions or orientations of the manipulator to be assumed can be demonstrated directly in a particularly simple manner while compensating for the gravitational force.
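  • To make the gravity compensation mechanism more concrete, the following sketch uses the standard rigid-body joint-space formulation (a general robotics convention, not equations quoted from this publication). With joint positions q, the manipulator dynamics and the compensating torque command can be written as:

```latex
% Standard rigid-body joint-space dynamics (assumed formulation, not from the publication)
M(q)\,\ddot{q} + C(q,\dot{q})\,\dot{q} + g(q) = \tau_{\mathrm{cmd}} + \tau_{\mathrm{ext}}
% Gravity-compensating torque command: the actuators supply exactly the gravitational terms
\tau_{\mathrm{cmd}} = g(q)
% Remaining dynamics felt by the operator: only inertia and Coriolis terms resist the hand-guided motion
M(q)\,\ddot{q} + C(q,\dot{q})\,\dot{q} = \tau_{\mathrm{ext}}
```

  • In this picture, the gravitational terms cancel and only the inertial and Coriolis terms resist the operator's externally applied torques, so the manipulator can be hand-guided through the entire workspace with little effort.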
  • the direct movement of the manipulator advantageously makes it possible to avoid code-based programming of the robot to train certain activities. This means that the robot can also be configured by non-specialist personnel. In particular, no IT knowledge is required.
  • the classification device is preferably set up to assign the measurement data received from the sensor device to a number of different predefined reference gestures.
  • the control device can then put the robot into a number of different operating modes depending on the reference gesture to which the respective measurement data are assigned.
  • This means that the system is advantageously set up to reliably identify several different gestures of the operator, because the measurement data output by the sensor device depend on the gesture made by the operator. The operator can thus perform certain gestures which entail different measurement data and which can subsequently be recognized by the classification device and assigned to different predefined reference gestures or their specific reference measurement data.
  • the operating mode can advantageously be switched over intuitively by the operator, while the operator's hands can remain free. For example, different gestures of the operator can be recognized based on different joint positions of a body joint of the operator. In particular, the wrist or finger joints or a combination of both can be used for this purpose.
  • a first reference gesture can be grasping an object, for example.
  • a second reference gesture can be, for example, a loosely outstretched hand in which essentially only a few muscles are contracted.
  • a third reference gesture can be, for example, a hand position stretched essentially sideways in the direction of the elbow joint.
  • a fourth reference gesture can be a press of fingers on a flat surface.
  • the reference gestures can in particular be user-independent. This means that the system can be used by different users without having to redefine or retrain the reference gestures.
  • the robot can be put into four different operating modes.
  • the first mode of operation is the training mode, in which the gravity compensation mechanism is activated.
  • a second operating mode can be the conventional operating mode of the robot, in which the robot carries out trained movements of the manipulator and/or the end effector.
  • the end effector can be selected in a third operating mode, for example in order to set a specific position and/or orientation of the end effector and to confirm it with the fourth operating mode.
  • the fourth mode of operation is intended to confirm a position and/or orientation of the manipulator or the end effector or both.
  • In this way, start and end points of trajectories to be carried out can be defined. This advantageously avoids the operator having to actuate a separate hardware or software switch in order to switch between the different operating modes. Switching between the operating modes is intuitive, and the operator keeps his hands free. The changeover thus takes place within a shorter time span, and the time required to configure the robot is reduced compared to known approaches.
  • the assignment variable preferably has at least four different values.
  • the first reference gesture is assigned to the first operating mode, the second reference gesture to the second operating mode, the third reference gesture to the third operating mode and the fourth reference gesture to the fourth operating mode.
  • the classification device is optionally set up to output different values of the assignment variable.
  • the value of the assignment variable can depend in particular on which predefined reference gesture the received measurement data is assigned to.
  • The value of the assignment variable can also determine which operating mode the robot is to be put into.
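  • The relation between the assignment variable and the operating modes can be pictured as a simple lookup, sketched below; the enum names, the numbering 1 to 4 and the use of 0 for "no assignment" are illustrative assumptions.

```python
from enum import Enum
from typing import Optional

class OperatingMode(Enum):
    TRAINING = 1          # first operating mode: gravity compensation active, hand-guiding
    EXECUTION = 2         # second operating mode: reproduce trained movements
    EFFECTOR_SELECT = 3   # third operating mode: select / position the end effector
    CONFIRM = 4           # fourth operating mode: confirm a position and/or orientation

def mode_for_assignment(assignment_variable: int) -> Optional[OperatingMode]:
    """Maps the classifier's assignment variable to an operating mode; 0 means no assignment."""
    return None if assignment_variable == 0 else OperatingMode(assignment_variable)
```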
  • the sensor device has at least one measuring device that is set up to determine electromyographic-based measuring signals.
  • the configuration of the robot is then based on electromyography.
  • the measuring device detects a measured variable in relation to at least one part of the operator's body, in particular a part of the body surface or an internal part of the body, for example a neuroanatomical part of the body (nerve pathway of the nervous system).
  • the measurement signals can be determined based on the recorded values of the measurement variable.
  • the sensor device is set up to derive corresponding measurement data from the measurement signals, which the sensor device provides for the classification device or the control device.
  • electromyographic-based measurement signals can be understood to mean that surface-based voltages or currents of at least one part of the operator's body are evaluated.
  • the measured variable can also be recorded based on sensory and/or nerve stimuli. These can also be detected from the body surface, for example, and are therefore also considered to be surface-based voltages or currents in the present case.
  • The measuring device preferably comprises at least one surface electrode which is arranged in contact with the body surface of the operator. A muscle contraction by the operator results in a change in surface-based voltages.
  • The surface electrode detects potential fluctuations due to muscle contraction movements and/or due to sensory and/or nerve stimuli that are intended to trigger the corresponding movement, in the form of action currents.
  • the surface electrode can be set up to provide test voltages that are varied by muscle contraction or stimuli.
  • the sensor device includes a number of measuring devices.
  • the multiple measuring devices can be arranged circumferentially around a body part of the operator.
  • the sensor device can comprise a carrier component on which a number of surface sensors are arranged in such a way that they are each in contact with at least part of the body surface of the operator.
  • the sensor device includes a bracelet that the operator can wear on a lower or upper arm, in particular a forearm.
  • Several measuring devices can be arranged on the inside of the bracelet.
  • the measuring devices can detect action currents caused by movements of the wrist and/or finger joints of the operator's arm.
  • the bracelet preferably comprises at least four measuring devices, in particular eight, more particularly twelve measuring devices, or up to 24 measuring devices.
  • the sensor device preferably has at least one analog/digital converter, by means of which the specific measurement signals are converted into digital measurement data and can be provided in this way.
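  • A minimal sketch of how such a bracelet-style sensor device might turn digitized action-current samples into the measurement data handed to the classification device; the channel count, the window length and the RMS feature are assumptions made only for illustration.

```python
import numpy as np

N_CHANNELS = 8    # e.g. eight measuring devices arranged around the forearm
WINDOW = 200      # number of digitized samples per analysis window (assumed)

def features_from_window(adc_window: np.ndarray) -> np.ndarray:
    """adc_window: (WINDOW, N_CHANNELS) array of A/D-converted surface-EMG samples.

    Returns one root-mean-square value per channel as a simple feature vector."""
    adc_window = adc_window - adc_window.mean(axis=0)   # remove the per-channel DC offset
    return np.sqrt((adc_window ** 2).mean(axis=0))
```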
  • the robot itself or the manipulator is not instrumented or changed in any other way when the robot is put into the first operating mode.
  • the robot or the manipulator is not changed in any other way than is described here.
  • the system thus provides a hands-free configuration of the robot in terms of its mode of operation.
  • The determined measurement data are not themselves intended to train the robot for a specific activity. Training is essentially made possible, independently of the configuration mechanism described here, as soon as the robot is switched to the first operating mode.
  • The effort involved in data processing with regard to the measurement data is advantageously reduced: the measurement data are only evaluated to determine whether they can be assigned to predefined reference gestures. The demands on the data processing units can thus be reduced.
  • a movement and/or position and orientation of the at least one manipulator during the first operating mode of the robot can preferably be reproduced by the robot in an operating mode that differs from the first operating mode.
  • actions to be performed by the robot can be trained directly by the operator while the robot is in the first operating mode and the gravitational compensation mechanism is activated.
  • an indirect configuration of the robot with regard to activities to be carried out is dispensed with.
  • this can increase the precision of the training process, since it is carried out directly. Since the gravitational forces are advantageously compensated, a movement and/or position and orientation of the manipulator can be carried out smoothly and precisely.
  • the training can also be carried out by non-specialist personnel, since only the desired activity to be carried out by the robot has to be demonstrated. For example, programming effort can consequently be avoided.
  • the robot is then set up to reproduce the activity to be performed that is presented in a different operating mode than the first operating mode.
  • the robot can include sensors that detect the activity being performed.
  • the sensors can be assigned to actuators of the manipulator and in this respect can detect movements, positions and orientations of the manipulator that are initiated by the operator in the first operating mode.
  • the robot is switched to the other operating mode using an analogous configuration mechanism. This means that corresponding measurement data are received by the classification device based on a gesture by the operator. If the measurement data are assigned to a corresponding predefined reference gesture, the robot is switched to the other operating mode in an analogous manner.
  • the robot has a control device that provides the movement and/or position and orientation of the at least one manipulator during the first operating mode as program code at an interface.
  • the control device can be set up to generate program code based on the movement and/or position and orientation of the manipulator during the first operating mode, which program code reflects the movement and/or position and orientation.
  • This program code can be provided at the interface.
  • the program code is provided in such a way that it can be reused at the interface.
  • an interface is understood to mean an electrical or electronic interface, for example a man-machine interface.
  • the program code can also be output at the man-machine interface, for example a display.
  • the program code can also be further processed by downstream data processing units. This allows the automation to be optimized.
  • the interface can optionally be such that the program code can be read out and manipulated at the interface and a manipulated program code can in turn be provided to the control device.
  • a movement and/or position and orientation to be carried out by the manipulator of the robot can be modified and adapted if this should be necessary.
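  • How a demonstrated movement could be provided "as program code" at the interface is sketched below; the textual command format and the waypoint structure are purely illustrative assumptions.

```python
def trajectory_to_program(waypoints):
    """waypoints: list of (x, y, z, roll, pitch, yaw) poses recorded during the first operating mode.

    Emits a simple, human-readable motion program that can be displayed, edited and fed back."""
    lines = ["# auto-generated from a hand-guided demonstration"]
    for i, (x, y, z, roll, pitch, yaw) in enumerate(waypoints):
        lines.append(
            f"MOVE_LINEAR wp{i} POS=({x:.3f}, {y:.3f}, {z:.3f}) "
            f"ORI=({roll:.3f}, {pitch:.3f}, {yaw:.3f})"
        )
    lines.append("END")
    return "\n".join(lines)
```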
  • the classification device and/or the control device is also coupled to an interface, for example the same interface to which the robot's control device is coupled. Then the interface can allow the reference gestures to be redefined, changed or to include additional reference gestures.
  • a reference gesture can also be provided in order to delete a waypoint of a trained movement and/or position and orientation of the manipulator or the end effector.
  • a user can adapt the reference gestures as required, for example because certain gestures are unsuitable for a particular application.
  • It can also be influenced which specific operating mode the robot is to be put into when the measurement data are assigned to a reference gesture.
  • In other words, the processing of the assignment variable by the control device can be influenced.
  • training of the reference gestures can be based on the fact that a specific gesture is repeated and the measurement data then determined are used as a reference value (reference measurement data) for defining the reference gesture.
  • the currently received measurement data can then be compared with the reference measurement data that correspond to the respective reference gestures.
  • The classification device can then determine the reference gesture that best reflects the measurement data currently received in operation. The assignment can depend on a specific confidence level being reached, that is to say on the degree of agreement exceeding a threshold value.
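  • Training a reference gesture by repetition, as described above, could look roughly like the following sketch; it reuses the hypothetical GestureClassifier and features_from_window from the earlier sketches and shows only one possible realization.

```python
def train_reference_gesture(classifier, gesture_id, repeated_windows):
    """Stores the measurement data of several repetitions of one gesture as reference measurement data."""
    for window in repeated_windows:          # raw A/D samples of one repetition each
        classifier.add_reference(features_from_window(window), gesture_id)
```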
  • the control device of the robot is preferably set up to communicate with the control device of the system.
  • The control device of the robot can receive the assignment variable or corresponding commands from the control device of the system. After receiving the assignment variable or the commands, the robot's control device can put the robot into a corresponding operating mode.
  • the control device can be coupled to locking mechanisms and/or actuators of the robot, for example.
  • the robot can be put into the first operating mode by the control device and the gravitational compensation mechanism can be activated.
  • The actuators can be made so compliant that gravitational forces are essentially compensated.
  • the manipulator of the robot can be moved particularly easily.
  • The robot's control device can also be set up to control the manipulator in such a way that it performs certain movements.
  • actuators of the robot can be controlled accordingly by the control device, for example in order to reproduce trained movements, positions or orientations.
  • The control device of the robot can be separate from the control device of the system.
  • the robot's control device can also comprise the system's control device.
  • the control device can also include the classification device. The control device is then also set up to receive measurement data from the sensor device.
  • the sensor device includes a communication interface.
  • the sensor device can communicate with the classification device via the communication interface.
  • the classification device can therefore also have a communication interface.
  • the communication device of the sensor device and/or the classification device can also be set up to communicate wirelessly. This then advantageously avoids a wired connection from the sensor device on the body surface of the operator to the classification device. Thus, the comfort for the operator is significantly increased.
  • The control device and/or the robot's control device can also have communication interfaces that optionally enable wireless communication.
  • the devices mentioned can thus advantageously communicate wirelessly with one another. Overall, the convenience of the system is therefore high.
  • A Bluetooth protocol, an NFC (near field communication) protocol, a Wi-Fi protocol or a ZigBee protocol, for example, can be used as the communication protocol for wireless data communication.
  • the classification device is optionally set up to assign the received measurement data to a predefined reference gesture based on a regression algorithm.
  • For example, a linear regression method, a non-linear regression method, an interpolation and/or extrapolation method or the like can be used to compare the received measurement data with a predefined reference gesture or its corresponding reference measurement data and to enable an assignment.
  • Regression analysis procedures are tools for evaluating the degree of agreement between different data sets when there are more equations than unknowns, so that the system cannot be solved exactly. In other words, the regression algorithm provides a way of determining the degree of agreement without all variables having to be known.
  • In particular, a nearest-neighbor classification, i.e. an estimate of the probability density function, can be used as part of the regression algorithm.
  • the received measurement data can also be normalized in order to increase the accuracy of the mapping mechanism.
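  • Normalizing the received measurement data, as mentioned above, could for example be a per-channel scaling of the feature vector; the min/max scheme shown below is an assumption about one possible realization.

```python
import numpy as np

def normalize(features, channel_min, channel_max):
    """Scales each channel of a feature vector into [0, 1] using per-channel calibration values."""
    features = np.asarray(features, dtype=float)
    lo = np.asarray(channel_min, dtype=float)
    hi = np.asarray(channel_max, dtype=float)
    span = np.maximum(hi - lo, 1e-9)          # avoid division by zero on flat channels
    return np.clip((features - lo) / span, 0.0, 1.0)
```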
  • the classification device includes an artificial neural network.
  • the artificial neural network can be set up to execute a classification algorithm.
  • the classification algorithm can enable nearest-neighbor classification.
  • An artificial neural network is understood to mean a structure with a large number of connected neurons, which are organized in layers. Neural networks make it possible to automatically learn features from training examples. The state of each node is computed from the weighted inputs from multiple nodes in the previous layer. In other words, during the learning process the underlying algorithm starts from a randomly initialized strategy that describes the mechanism of the neural network.
  • The weights of all neurons of the neural network can be viewed as representing a specific mapping rule set from an input variable, here the received measurement data, to an output variable, here the reference gesture (alternatively the corresponding reference measurement data) to which the received measurement data are assigned.
  • The mapping rule set can be modified by adjusting the weights of the neurons relative to one another.
  • the neural network can be trained by providing it with measurement data as input variables, which are known to be associated with a specific predefined reference gesture of the operator. This information can be compared with the reference gesture determined by the neural network, to which the neural network has assigned the measurement data received.
  • the feedback variable is then used to indicate to the neural network whether the assignment has been made correctly.
  • the neural network can modify the mapping rule set to improve the confidence level of the mapping mechanism.
  • The neural network can optionally be set up to also vary the mapping rule set during operation in order to evaluate whether the confidence level of the assignment mechanism can be improved.
  • the neural network is preferably multi-layered. This means that the neural network comprises at least one additional layer of neurons between the input node of the first layer and the output node of a last layer of neurons.
  • The mapping rule set can thereby be made finer, which improves the confidence level of the results that can be achieved.
  • the training mechanism of the neural network allows the classification device to learn the reference gestures independently of the user and still achieve a reliable assignment of the respective measurement data to a specific reference gesture with a high level of confidence. This advantageously increases the precision of the determination of the reference gestures.
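  • A multi-layer neural network of the kind described above could, for instance, be realized as a small fully connected classifier; the layer sizes, the choice of PyTorch and the training step below are illustrative assumptions rather than details from this publication.

```python
import torch
import torch.nn as nn

N_FEATURES, N_GESTURES = 8, 4

model = nn.Sequential(              # multi-layered: a hidden layer between input and output nodes
    nn.Linear(N_FEATURES, 32),
    nn.ReLU(),
    nn.Linear(32, N_GESTURES),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def train_step(features, gesture_ids):
    """features: (batch, N_FEATURES) float tensor; gesture_ids: (batch,) long tensor of labels 0..3."""
    optimizer.zero_grad()
    loss = loss_fn(model(features), gesture_ids)   # feedback: was the assignment made correctly?
    loss.backward()                                # adjust the weights, i.e. the mapping rule set
    optimizer.step()
    return loss.item()
```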
  • The classification device, the control device and the robot's control device can each have typical data processing components, for example a data processing unit which is coupled to a storage device.
  • A method for configuring a robot uses a system which has at least one control device coupled to at least one classification device and to the robot.
  • The method includes at least the following steps:
  • Measurement data is received by the classification device from a sensor device carried by an operator.
  • a gesture of the operator is then assigned to a predefined first reference gesture by the classification device based on the received measurement data.
  • the robot is then put into at least a first operating mode by the control device, in which at least one manipulator of the robot can be moved directly by the operator if the classification device assigns the measurement data received as a result of a gesture by the operator to a predefined first reference gesture.
  • a gravitational compensation mechanism of the robot can be activated when the robot is put into the first operating mode.
  • At least the last three steps described above, beginning with the assignment of the measurement data to a reference gesture, or alternatively all of the steps, can be carried out in a computer-implemented manner.
  • a computer program can be provided, which includes instructions whose execution causes a data processing device to carry out the steps mentioned.
  • a computer-readable storage device can also be provided, which comprises instructions, the execution of which causes a data processing device to carry out the steps mentioned.
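  • Taken together, the computer-implemented steps of the method could be arranged roughly as in the following loop; the device objects and their method names are hypothetical placeholders that reuse the sketches above, not an API defined in this publication.

```python
def configuration_loop(sensor_device, classification_device, control_device, robot):
    """Receives measurement data, classifies the gesture and switches the robot's operating mode."""
    while True:
        measurement_data = sensor_device.read()                         # receive measurement data
        assignment = classification_device.classify(measurement_data)   # assign to a reference gesture
        if assignment == 0:
            continue                                                    # no reference gesture recognized
        mode = mode_for_assignment(assignment)                          # see the mapping sketch above
        control_device.set_operating_mode(robot, mode)                  # put the robot into that mode
        if mode is OperatingMode.TRAINING:
            robot.enable_gravity_compensation()                         # first operating mode
```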
  • The method advantageously enables the robot to be put into a specific operating mode without the operator's hands being occupied as a result. Manual actuation of hardware or software switches is not necessary.
  • The method is preferably further developed in that the classification device is set up to assign the measurement data received from the sensor device to a number of different predefined reference gestures.
  • the control device then puts the robot into a number of different operating modes depending on the reference gesture to which the respective measurement data are assigned.
  • FIG. 1 shows a simplified schematic representation of a system for configuring a robot
  • the phrase "at least one of A, B and C” means, for example, (A), (B), (C), (A and B), (A and C), (B and C) or (A, B and C), including all other possible combinations, when more than three elements are listed.
  • the term “at least one of A and B” generally means “A and/or B", namely "A” alone, “B” alone or "A and B”.
  • FIG. 1 shows a simplified schematic representation of a system 10 for configuring a robot 12.
  • the system 10 is operated and used by an operator 14.
  • a sensor device 16 is coupled to the operator 14 in such a way that the sensor device 16 can record measured values of a measured variable by means of a sensor depending on a movement and/or a position and/or an orientation and/or an irritation state of at least one part of the body of the operator 14.
  • the movement and/or position and/or orientation of the body part of the operator 14 can be stimulated by a sensory and/or neural stimulus that can be detected by the sensor. It is also conceivable that the stimulus does not trigger any real movement of an extremity due to an impairment.
  • The sensory and/or neural stimulus itself can be detected by the sensor, so that the system can be used by an operator 14 even in this case, although the operator 14 does not carry out the actual movement or change of position or orientation that should have been triggered by the stimulus.
  • the measuring technology of the sensor device 16 is therefore based on the acquisition of electromyographic measured values. From this, the sensor device 16 determines electromyography-based measurement signals, which are converted into measurement data, for example by means of an analog-to-digital converter.
  • the sensor device 16 is set up to be in contact with at least part of the body surface of the operator 14 .
  • The sensor device 16 can be integrated into a bracelet that the operator 14 wears in the area of the forearm.
  • A movement of the operator 14 causes movement- and gesture-dependent action currents, which the sensor of the sensor device 16 can detect.
  • The action currents can be based on the actual movement or the actual contraction state of the muscle and/or on a sensory and/or neural stimulus that defines the contraction state of the muscle.
  • the sensor device 16 can apply test voltages.
  • the sensor device 16 comprises a plurality of sensors which are arranged in the circumferential direction around a part of the body, in particular the forearm of the operator 14.
  • the system 10 also includes a classification device 18 which receives the measurement data from the sensor device 16 .
  • the classification device 18 is therefore coupled to the sensor device 16 .
  • the classification device 18 is set up to compare the received measurement data with reference gestures.
  • The reference gestures can be stored in a memory device of the classification device 18, for example. Essentially, the reference gestures are associated with reference measurement data that were acquired in a training phase in order to provide a basis for the comparison.
  • the classification device 18 then evaluates which reference measurement data correspond to the currently received measurement data. In this respect, the degree of agreement is checked.
  • The currently received measurement data, which are based on a movement and/or position and/or orientation of at least one part of the body of the operator 14, are assigned to the reference gesture whose reference measurement data show the greatest degree of agreement, optionally with the highest confidence level.
  • the information of the reference gesture to which the classification device 18 has assigned the current measurement signals is transmitted to the control device 20 in the form of an assignment variable.
  • the control device 20 is set up to determine an operating mode into which the robot 12 is to be placed based on the reference gesture.
  • the control device then issues corresponding commands to the robot 12, which cause a switchover to the reference-gesture-dependent operating mode.
  • For example, the robot 12 is put into a first operating mode. In the first operating mode, a gravitational compensation mechanism of the robot 12 is activated.
  • the robot 12 comprises at least one manipulator 22, which is then freely movable when the gravitational compensation mechanism is activated.
  • The actuators on which the manipulator 22 is based are then compliant. The locking mechanisms of the actuators are essentially deactivated.
  • At least the manipulator 22 can then be moved freely by the operator 14 in the first operating mode in order to teach the robot 12 movement patterns and/or positions and/or orientations of the manipulator 22, which the latter can later reproduce in another operating mode.
  • the operator 14 advantageously has both hands free to move the manipulator 22.
  • The demonstration of specific trajectories or positions of the manipulator 22 can thus take place much more intuitively.
  • A hand of the operator 14 is not occupied by having to operate a dead man's switch in order to activate the gravitational compensation mechanism, as is required in known approaches.
  • the configuration of the robot 12 and the teaching of trajectories of the manipulator 22 is presently possible in a shorter period of time, in a more intuitive manner and with increased precision.
  • no complex sensor devices with inertial sensors are necessary, which ensures a reduction in costs.
  • the robot 12 comprises an end effector 24 which represents the last link in the kinematic chain of the manipulator 22 .
  • the end effector 24 can be used to perform an activity on a work table 26 .
  • The end effector 24 can also be adjusted depending on the operating mode. It can thus also be made compliant when the gravitational compensation mechanism is activated.
  • the end effector 24 can also be moved by the operator 14 (in the first operating mode) in order to learn specific trajectories and/or positions and/or orientations.
  • the classification device 18 can be set up to execute a classification algorithm.
  • the classification device 18 can be set up to execute a regression algorithm 27, in particular a nearest-neighbor classification algorithm.
  • the classification device 18 can also include a neural network 28 whose neurons are used to determine the assignment. This can increase the confidence level of the assignment.
  • the neural network 28 can be trained in a training phase in which the operator 14 makes defined reference gestures. Their corresponding measurement data then form reference measurement data in the classification device 18 for the later evaluation of actual measurement data during normal operation of the system 10.
  • the data communication between the sensor device 16, the classification device 18, the control device 20 and the robot 12 does not have to be wired. Communication can also be wireless.
  • For this purpose, the devices have communication devices 30, 32, 34, 38 that are set up for a wireless communication protocol, for example Bluetooth or Wi-Fi.
  • the operating mode of the robot 12 is switched on the basis of the commands which the control device 20 outputs to the robot 12 as a function of the specific reference gesture.
  • the robot 12 has a control device 36 in the present case.
  • the control device 36 is also set up to control the actuators of the manipulator 22 and optionally of the end effector 24 in such a way that the gravitational compensation mechanism is activated for them, so that they are flexible and can be moved by the operator 14 particularly easily.
  • the control device 36 is set up to record movements and/or positions and/or orientations of at least the manipulator 22 during the first operating mode and to reproduce them in another operating mode of the robot 12 .
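  • Recording and later reproducing a hand-guided motion by the control device 36 might, in the simplest case, look like the following sketch; the sampling of joint positions and the replay calls are assumptions about one possible realization.

```python
import time

def record_demonstration(robot, duration_s=10.0, rate_hz=50.0):
    """Samples joint positions while the operator guides the manipulator in the first operating mode."""
    trajectory, dt = [], 1.0 / rate_hz
    t_end = time.time() + duration_s
    while time.time() < t_end:
        trajectory.append(robot.read_joint_positions())   # hypothetical accessor on the robot
        time.sleep(dt)
    return trajectory

def reproduce(robot, trajectory, rate_hz=50.0):
    """Replays the recorded joint positions in another operating mode of the robot."""
    for q in trajectory:
        robot.move_to_joint_positions(q)                  # hypothetical motion command
        time.sleep(1.0 / rate_hz)
```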
  • the classification device 18 and/or the control device 20 and/or the control device 36 can be implemented at least partially in a single device.
  • The classification device 18 and/or the control device 20 can be realized in the control device 36 of the robot 12. The system 10 then again has a reduced complexity.
  • the classification device 18 and/or the control device 20 and/or the control device 36 are optionally coupled to an interface 40 .
  • the interface 40 allows interaction with other components or with a human.
  • The reference gestures, or the reference measurement data on which they are based and which are present in the classification device 18, can be varied via the interface 40.
  • the control device 20 can be influenced in order to assign a specific reference gesture to a modified or additional operating mode of the robot 12 into which the robot is to be placed.
  • the control device 36 can be set up to convert movements and/or positions and/or orientations of at least the manipulator 22 during the first operating mode into program code, which can be provided at the interface 40 .
  • the interface 40 can be set up to transmit modified program code to the control device 36 of the robot 12, as a result of which an operator 14 can influence the learned trajectories in a targeted manner.
  • FIG. 2 shows a simplified schematic illustration of a method 42 for configuring a robot 12. Optional steps are shown in dashed lines.
  • In step 44, measurement data are received by the classification device 18 from a sensor device 16 carried by an operator 14.
  • In step 46A, a gesture of the operator 14 is assigned to a predefined first reference gesture by the classification device 18 based on the received measurement data.
  • In step 48A, the robot 12 is put into at least a first operating mode by the control device 20, in which at least one manipulator 22 of the robot 12 can be moved directly by the operator 14, if the classification device 18 assigns the measurement data received as a result of a gesture by the operator 14 to a predefined first reference gesture.
  • In step 50, a gravity compensation mechanism of the robot 12 is activated when the robot 12 is placed in the first operating mode.
  • The method 42 provides an intuitive configuration mechanism for the robot 12, which can thus advantageously also be configured by non-specialist personnel.
  • the gesture of the operator 14 can be assigned to different reference gestures in steps 46B to 46D based on the received measurement data.
  • the robot 12 can optionally be put into mutually different operating modes depending on the respective associated reference gesture.
  • Method 42 can also be developed in that, in step 52, measurement signals are determined by sensor device 16, in particular measurement signals based on electromyographic measurement values of a measurement variable that are detected by a sensor.
  • Amounts and numbers may be referred to in the present application. Unless expressly stated, such amounts and numbers are not to be considered as limiting, but as examples of the possible amounts or numbers within the context of the present application.
  • The term "plurality" can also be used in the present application to refer to a quantity or number. In this context, the term "plurality" means any number greater than one, e.g., two, three, four, five, etc.
  • the terms "about”, “about”, “near”, etc. mean plus or minus 5% of the stated value.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to a system and method for configuring a robot. The system comprises at least one sensor device worn by an operator, a classification device, a control device and a robot with at least one manipulator. The classification device receives at least measurement data from the sensor device. The control device is coupled at least to the classification device and to the robot. The control device puts the robot into at least a first operating mode, in which the at least one manipulator can be moved directly by the operator, if the classification device assigns the measurement data received as a result of a gesture by the operator to a predefined first reference gesture. In the first operating mode, a gravity compensation mechanism of the robot is activated.
PCT/EP2022/051389 2022-01-21 2022-01-21 Système et procédé de configuration d'un robot WO2023138784A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2022/051389 WO2023138784A1 (fr) 2022-01-21 2022-01-21 Système et procédé de configuration d'un robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2022/051389 WO2023138784A1 (fr) 2022-01-21 2022-01-21 Système et procédé de configuration d'un robot

Publications (1)

Publication Number Publication Date
WO2023138784A1 (fr)

Family

ID=80736099

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2022/051389 WO2023138784A1 (fr) 2022-01-21 2022-01-21 Système et procédé de configuration d'un robot

Country Status (1)

Country Link
WO (1) WO2023138784A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120221177A1 (en) 2010-12-10 2012-08-30 Foundation Of Soongsil University-Industry Cooperation Method of controlling navigation of robot using electromyography sensor and acceleration sensor and apparatus therefor
US9821457B1 (en) 2013-05-31 2017-11-21 Brain Corporation Adaptive robotic interface apparatus and methods
CN107553499A (zh) 2017-10-23 2018-01-09 上海交通大学 一种多轴机械臂的自然手势运动控制系统和方法
CN109778932A (zh) * 2019-03-22 2019-05-21 江苏徐工工程机械研究院有限公司 一种基于手臂姿态的非接触式挖机臂操控系统及操控方法
US20200097081A1 (en) 2018-09-20 2020-03-26 Jasmine Stone Neuromuscular control of an augmented reality system
CN111399640A (zh) * 2020-03-05 2020-07-10 南开大学 一种面向柔性臂的多模态人机交互控制方法
DE102017010678B4 (de) * 2017-11-17 2021-07-01 Kuka Deutschland Gmbh Verfahren und System zum Vorgeben eines Beaufschlagungsmuster-Befehls-Lexikons zur Eingabe wenigstens eines Roboterbefehls


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
DU ET AL.: "Robot manipulator using a vision-based human-manipulator interface", J THEOR APPL INF TECHNOL, vol. 50, no. 1, 10 April 2013 (2013-04-10)


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22709581

Country of ref document: EP

Kind code of ref document: A1