EP3934858A1 - Verfahren, System sowie nichtflüchtiges Speichermedium (Method, system and non-volatile storage medium) - Google Patents

Verfahren, System sowie nichtflüchtiges Speichermedium (Method, system and non-volatile storage medium)

Info

Publication number
EP3934858A1
EP3934858A1 (application EP20710474.6A)
Authority
EP
European Patent Office
Prior art keywords
machine
model
sub
sensor
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP20710474.6A
Other languages
German (de)
English (en)
French (fr)
Inventor
Christian PIECHNICK
Klaus Wagner
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wandelbots GmbH
Original Assignee
Wandelbots GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wandelbots GmbH filed Critical Wandelbots GmbH
Publication of EP3934858A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1671Programme controls characterised by programming, planning systems for manipulators characterised by simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/42Recording and playback systems, i.e. in which the programme is recorded from a cycle of operations, e.g. the cycle of operations being manually controlled, after which this record is played back on the same machine
    • G05B19/423Teaching successive positions by walk-through, i.e. the tool head or end effector being grasped and guided directly, with or without servo-assistance, to follow a path
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/42Recording and playback systems, i.e. in which the programme is recorded from a cycle of operations, e.g. the cycle of operations being manually controlled, after which this record is played back on the same machine
    • G05B19/427Teaching successive positions by tracking the position of a joystick or handle to control the positioning servo of the tool head, master-slave control
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/36Nc in input of data, input key till input tape
    • G05B2219/36436Arm follows movement of handheld device, camera detects, analyses motion
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/36Nc in input of data, input key till input tape
    • G05B2219/36479Record position on trigger of touch probe
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40116Learn by operator observation, symbiosis, show, watch
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40391Human to robot skill transfer

Definitions

  • Various exemplary embodiments relate to a method, a system and a non-volatile storage medium.
  • Programming is usually done in the form of program code by one or more specially trained staff (experts); this cost makes automation expensive for many applications.
  • An application typically integrates further components besides the robot, e.g. an end effector such as a glue gun, a sensor system (e.g. a camera) and a control system (e.g. a programmable logic controller), as well as motion planning and motion control.
  • Conventionally, a programmer manually writes the program code that allows the robot to carry out the application autonomously at execution time.
  • The programming of an industrial robot can alternatively or additionally be carried out by the expert using CAD-based code generation on a virtual representation of the industrial robot.
  • a learning process (also referred to as teach-in) is conventionally used.
  • In the learning process, the robot, e.g. a sensitive robot (also referred to as a co-bot), can be controlled or guided manually in order to record the trajectory, i.e. the path on which the robot should move.
  • The process tasks that robots are supposed to perform, however, remain complex and are therefore conventionally not taken into account by the learning process.
  • The complexity consists, for example, in the integration of the various components of the robot (such as the end effector, the sensors and the control system) into the process to be carried out, which is therefore done manually.
  • The learning process can alternatively or additionally take place via an interactive input device, as described in more detail below.
  • the learning process can alternatively or additionally take place by means of sensor data processing.
  • Sensor data processing is based on sensors that are attached directly to the robot.
  • the field of vision is often restricted by the end effector and robot.
  • changing light conditions or air particles affect the sensors on the robot.
  • A method can include: determining a machine-independent process model based on data, the data representing a handling of an implement when performing a process flow, wherein the process flow has a plurality of sub-processes and the process model links, for each sub-process of the plurality of sub-processes, a process activity with spatial information of the sub-process; and mapping the machine-independent process model to a machine-specific control model of a machine using a model of the machine, the machine-specific control model defining, for each sub-process of the plurality of sub-processes, an operating point of the machine that corresponds to the process activity and the spatial information of the sub-process.
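The two operations above (determining a machine-independent process model, then mapping it to a machine-specific control model) can be illustrated with a minimal sketch. All names below (`SubProcess`, `ProcessModel`, `map_to_control_model`) and the dictionary-based machine model are illustrative assumptions, not terminology from the patent:

```python
from dataclasses import dataclass

@dataclass
class SubProcess:
    activity: str    # process activity, e.g. "spray"
    position: tuple  # spatial information (x, y, z)

@dataclass
class ProcessModel:
    # machine-independent: one entry per sub-process
    sub_processes: list

def map_to_control_model(process_model, machine_model):
    """Map the machine-independent process model onto a machine-specific
    control model: one operating point of the machine per sub-process."""
    control_points = []
    for sp in process_model.sub_processes:
        # the machine model translates an abstract activity into a
        # machine-specific operating point (e.g. actuator set-points)
        operating_point = machine_model[sp.activity]
        control_points.append({"pose": sp.position, "point": operating_point})
    return control_points

# Example: a painting process mapped onto a hypothetical machine
model = ProcessModel([SubProcess("spray", (0.0, 0.1, 0.5)),
                      SubProcess("idle", (0.0, 0.2, 0.5))])
machine = {"spray": {"nozzle_flow": 1.2}, "idle": {"nozzle_flow": 0.0}}
control = map_to_control_model(model, machine)
```

Swapping in a different `machine` dictionary yields a different control model from the same process model, which is the point of keeping the process model machine-independent.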
  • The data can be acquired during the handling of the implement.
  • Illustratively, a machine-unspecific process model is generated which describes the overall process of the process task as performed by a human.
  • The process model can be created, for example, by tracking how the process task is done by a person.
  • The machine-unspecific process model is then transferred to a machine-specific control model, which is tailored to the hardware platform (generally also referred to as a machine) that is to perform the process task automatically.
  • From the control model, a control program can be formed which the hardware platform, e.g. its PLC (programmable logic controller), can execute.
  • Figures 1 and 3 each show a method according to different embodiments in a schematic side view.
  • Figures 2 and 4A each show a method according to different embodiments in a schematic flow diagram.
  • FIG. 4B shows a system for performing a method according to various embodiments.
  • FIG. 5 shows the method according to various embodiments in a schematic model diagram
  • FIG. 6 shows the method according to various embodiments in a schematic flowchart
  • FIG. 7 shows a machine in accordance with various embodiments in a schematic structural diagram.
  • "Connection" means both a direct and an indirect connection, a direct or indirect electrical connection and a direct or indirect coupling, e.g. a coupling set up to provide a mechanical interaction.
  • A processor can be understood as any type of entity that allows the processing of data or signals.
  • The data or signals can be any type of signals.
  • A processor can be an analog circuit, a digital circuit, a microprocessor, a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processor (DSP) or a field-programmable gate array (FPGA).
  • A "system" can be understood as a set of interacting entities; the entities can include at least one mechanical component, at least one electromechanical transducer (or other types of actuators), at least one electrical component, at least one instruction (e.g. stored in a storage medium) and/or at least one processor.
  • The term "actuator" (also referred to as an actuating element) can be understood as a component that is set up to influence a mechanism or a process in response to a control signal.
  • The actuator can convert instructions issued by the control device (the so-called activation) into mechanical movements.
  • The actuator, e.g. an electromechanical converter, can for example be set up to convert electrical energy into mechanical energy (e.g. by movement) in response to an activation.
  • A control device can be understood as any type of logic-implementing entity, which can have an interconnection and/or a processor that can execute software stored in a storage medium, in firmware or in a combination thereof, and can issue instructions based thereon.
  • The control device can, for example, be configured by means of code segments (e.g. software) to control the operation of a system (e.g. its operating point), e.g. of a machine or a plant, e.g. at least its kinematic chain.
  • Controlling can be understood as an intended influencing of a system.
  • The control can have a forward control path and thus illustratively implement a sequence control that converts an input variable into an output variable.
  • The control path can, however, also be part of a control loop, so that regulation is implemented.
  • In contrast to pure (feed-forward) control, regulation has a continuous influence of the output variable on the input variable, which is effected by the control loop (feedback).
  • A process flow can be understood as the sum of all processes (e.g. a chronological sequence of controlled events) that fulfill a predefined process task.
  • the sub-processes of the process flow can each fulfill a sub-task (ie part of the process task).
  • The individual sub-processes can, depending on the type of process flow, be conditioned on one another; a sub-process can, for example, be carried out, started or ended precisely when a process situation assigned to it is present, e.g. a threshold value for a measured variable is not reached or a pattern recognition recognizes the workpiece to be processed.
  • a process activity and at least one vector of the process activity can be assigned to each sub-process.
  • The vector can have at least one position, its change, a spatial distribution and/or at least one direction of the process activity, and can also be more complex or detailed (referred to more generally herein as spatial information).
  • The spatial information can be assigned a time indication about the process activity, which e.g. defines the duration, the beginning, the end and/or a cycle of the process activity.
  • The process activity can specify which effect is to be provided in total, and the corresponding spatial information can describe where and/or with what distribution the effect is to be provided and/or in what spatial position (i.e. position and/or orientation) the implement is located for this purpose.
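The linkage described above, i.e. a process activity combined with spatial information (position, orientation, optional direction) and a time indication, could be represented by a simple record structure. This is a hypothetical sketch; all field names are assumptions, not terms from the patent:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SpatialInfo:
    position: tuple                       # location in space (x, y, z)
    orientation: tuple = (0.0, 0.0, 0.0)  # e.g. Euler angles
    direction: Optional[tuple] = None     # optional direction of the activity

@dataclass
class TimeInfo:
    start: float     # beginning of the process activity (seconds)
    duration: float  # duration of the process activity (seconds)

@dataclass
class ProcessActivity:
    name: str        # e.g. "press", "glue", "weld"
    spatial: SpatialInfo
    timing: TimeInfo

# Example: gluing downwards for half a second, starting at t = 2 s
act = ProcessActivity("glue",
                      SpatialInfo((0.1, 0.0, 0.3), direction=(0, 0, -1)),
                      TimeInfo(start=2.0, duration=0.5))
```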
  • The handling of the work device can describe the way in which the work device is guided and/or actuated when a process sequence is carried out, for example how it is held, how strongly it is pressed, and/or how long it is actuated.
  • A model can be understood as a data-based (e.g. digital and/or virtual) representation of an original, e.g. of a physical object (e.g. a machine) or of a process (e.g. a control process or a process flow), formed by modeling, i.e. the mapping of the original onto the model.
  • The model can contain, for example: physical information (e.g. length, distance, weight, volume, composition, etc.), movement-related information (e.g. position, alignment, direction of movement, acceleration, speed of movement, etc.), logical information (e.g. links, sequence, couplings, etc.), time-related information (e.g. point in time, total duration, frequency, period, etc.) and/or functional information (e.g. current intensity, effect, map or characteristic curve, etc.).
  • The control model can accordingly be a formal representation of the control of the machine.
  • The control model can contain a variety of instructions for control, e.g. to bring the machine to an operating point, as well as criteria the fulfillment of which triggers or ends the instruction assigned to them.
  • The control model can have a control logic which logically links several criteria and/or several instructions with one another, and/or which implements a process (e.g. a process plan) according to which the control takes place.
  • The process model can be a formal representation of the process flow; it can have a large number of links between a process activity and the corresponding spatial information, and can optionally assign corresponding process situations to the process activities.
  • The process model can have a process logic that logically links several process situations and/or several process activities with one another, and/or which has a sequence (e.g. a flow chart) according to which the process takes place; the process situation can then cause a process activity assigned to it in accordance with the spatial information (which represents the conditioned sub-process).
  • a flow chart can have at least branches, jumps and / or loops.
  • the presence or absence of a process situation can generally be represented by means of at least one criterion, which is fulfilled, for example, in the presence or absence of the process situation.
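Such a criterion can be pictured as a predicate over measured variables: a sub-process is selected precisely when its process situation is present. The following is a hypothetical sketch; the function names and measured variables are illustrative, not from the patent:

```python
def below_threshold(variable, limit):
    """Criterion: fulfilled when the measured variable falls below a limit."""
    return lambda measurement: measurement[variable] < limit

def next_sub_process(criteria, measurement):
    """Return the first sub-process whose process situation is present,
    i.e. whose criterion is fulfilled for the current measurement."""
    for name, criterion in criteria:
        if criterion(measurement):
            return name
    return None

criteria = [
    # "approach" runs precisely when the distance falls below 50 mm
    ("approach", below_threshold("distance_mm", 50.0)),
    # fallback sub-process when no other process situation is present
    ("wait", lambda m: True),
]
state = next_sub_process(criteria, {"distance_mm": 30.0})
```

A pattern-recognition criterion (e.g. "workpiece recognized") would fit the same shape: any callable returning a boolean for the current sensor data.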
  • The mapping can be understood as the transfer of elements of an initial set (the original) into a target set (the image).
  • The mapping can assign at least one element of the image to each element of the original.
  • The mapping can, for example, apply operators, transformations and/or functions to the elements of the initial set.
  • the elements can in general have: logical relationships, links, information, properties, coordinates or the associated coordinate system, mathematical objects (such as formulas or numbers), processes, activities, etc.
  • A code generator can be understood as a computer program which is set up to translate a model, e.g. one available in a modeling language such as UML (unified modeling language), into a programming language.
  • The model can also be present in a markup language, a structogram, a decision table or another formal representation.
  • the code generator generates code segments (also referred to as code generation) which can be combined with other optional program parts to form a program.
  • The indication of the orientation and/or position of an object can be understood here as its spatial position (also referred to as position information or, in simplified form, as position).
  • Illustratively, the position describes the location (e.g. a point) in space and the orientation describes the respective alignment (e.g. a direction) of an object relative to the space.
  • a series of spatial positional information that is taken up one after the other by an object can be understood as a trajectory.
  • The position information can optionally be time-dependent (i.e. movement-related, then also referred to as movement), e.g. detected according to a clocking or continuously.
  • FIG. 1 illustrates a method 100 according to various embodiments in a schematic side view.
  • the machine 114 to be programmed may be a robot, e.g. an industrial robot for handling, assembling or processing a workpiece.
  • The method 100 enables, for example, the end-user programming of the complete automation application (including its components).
  • the machine 114 to be programmed can generally have a manipulator 114p, 114e and a frame 114u on which the manipulator 114p, 114e is supported.
  • The manipulator 114p, 114e comprises the entirety of the movable members 114v, 114g, 114e of the machine 114, the control of which enables a physical interaction with the environment, e.g. in order to carry out a process flow.
  • the machine 114 can have a control device 712 which is set up to implement the interaction with the environment in accordance with a control program.
  • The last link 114e of the manipulator 114p, 114e (also referred to as the end effector 114e) may have one or more than one tool, such as a welding torch, a gripping instrument, a painting device or the like.
  • The manipulator 114p, 114e can have at least one positioning device 114p, for example a robotic arm 114p (also referred to more generally as an articulated arm), to which the end effector 114e is attached.
  • the robotic arm 114p illustratively provides a mechanical arm which can provide functions similar to those of a human arm.
  • The links of the positioning device 114p can be, for example, connecting links 114v and joint links 114g, the connecting links 114v being interconnected by means of the joint links 114g.
  • A joint link 114g may have one or more joints, each of which can provide the connecting links 114v connected thereto with a rotary movement (i.e. rotation) and/or a translatory movement (i.e. displacement) relative to one another.
  • the movement of the joint members 114g can be set in motion by means of actuators which are controlled by the control device 702.
  • a sensor arrangement 102 (having at least one tracker) can be or will be mounted on a work device 104.
  • A person 106 carries out an activity for completing the process task by means of the working device 104 to which the sensor arrangement 102 is attached (e.g. painting).
  • The work device 104 can be, for example, any hand-held work device 104 that a human worker 106 can use, relocate, hold, lift and/or manipulate in the course of his job (e.g. a handheld screwdriver, a spray gun, etc.).
  • The sensor arrangement 102 transmits data, which are detected by an integrated sensor system of the sensor arrangement 102, to an external receiver.
  • The data can for example represent a position and/or a movement of the sensor arrangement 102 and/or the state of an actuation sensor (for example a button or switch, more generally also referred to as a trigger).
  • The external receiver thus receives the time-dependent position of the sensor arrangement 102 in space 701, 703, 705. Based on this, the time-dependent position of the work device 104 can be determined.
  • Data from an additional external sensor system 112 (also referred to as an additional sensor arrangement 112) and/or PLC data can also be obtained by the receiver.
  • Time-based data covering the entire process sequence can be acquired and recorded by means of the sensor arrangements 102, 112, for example at a high frequency.
  • This data can optionally be supplemented by activity-specific process parameters, which can include the parameters of the respective function and/or of the operating point of the work device 104 (e.g. a volume flow of the paint spray gun).
  • From the data, a platform-independent model 104m (also referred to as process model 104m) of the process task can be determined in 103.
  • Illustratively, this process model 104m describes the overall process of the process task as performed by the human.
  • the process model 104m can optionally be examined and adapted by a person 106.
  • The incoming data are time-based movement data of the human-held work device 104, data of the trigger(s) and data of further external sensors.
  • The output is an instance of a platform-independent process model 104m, described e.g. by a metamodel.
  • The metamodel describes the data types of the model instance and their possible relationships.
  • A model is, for example, a directed graph with typed nodes; each node has a data type (a node of the metamodel) that describes the parameters of the model and their value ranges.
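A directed graph with typed nodes, where the metamodel constrains the parameters and their value ranges, might look as follows. This is an illustrative sketch, not the patent's actual data model; the node type "move" and its `speed` parameter are assumptions:

```python
class NodeType:
    """Metamodel node: declares parameters and their admissible value ranges."""
    def __init__(self, name, ranges):
        self.name = name
        self.ranges = ranges  # parameter -> (min, max)

    def validate(self, params):
        """Check that every parameter lies within its declared value range."""
        return all(lo <= params[p] <= hi for p, (lo, hi) in self.ranges.items())

class Node:
    """Model-instance node, typed by a metamodel NodeType."""
    def __init__(self, node_type, params):
        if not node_type.validate(params):
            raise ValueError("parameter outside declared value range")
        self.type = node_type
        self.params = params
        self.edges = []  # directed edges to successor nodes

    def connect(self, other):
        self.edges.append(other)

# Metamodel: a "move" node type with a bounded speed parameter
move_t = NodeType("move", {"speed": (0.0, 1.0)})
a = Node(move_t, {"speed": 0.4})
b = Node(move_t, {"speed": 0.8})
a.connect(b)  # directed graph: a -> b
```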
  • The generation of the model instance on the basis of the input data takes place, for example, with the help of artificial neural networks (kNN).
  • The training vectors are set up according to the input parameters required in each case (for example, spatial coordinates of a sub-object of an implement, the associated time, and operating points/control points of the implement).
  • a specific hardware platform 114 (also referred to more generally as a machine 114) can be selected (e.g. a specific robot type or end effector, etc.).
  • The machine specifics (e.g. structure) of the machine 114 can be taken into account by means of a model 114m of the machine 114.
  • From the process model 104m, a platform-specific model 116m (also referred to as control model 116m) for a robot controller 702 can be generated by software in 105.
  • Machine-specific process parameters (e.g. a volume flow at the paint end effector and/or movement sequences) can be determined which correspond to the activity-specific process parameters.
  • The incoming data is an instance of the platform-independent metamodel (more generally, of the process model 104m).
  • The platform-dependent model 116m is likewise defined via a metamodel, which describes the data types and relationships of the platform-dependent models 116m.
  • The model transformation describes a mapping function specifying how nodes or groups of nodes of the platform-independent model 104m are mapped onto nodes or node groups of the platform-dependent model 116m; it also describes how these generated nodes are related to each other. The mapping takes place, for example, taking into account the respective machine specifics.
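Such a model transformation can be pictured as a set of mapping rules, one per source node type, that turn platform-independent nodes into platform-dependent ones. The rule set, node layout and command names ("PTP", "TOOL") below are assumptions for illustration only:

```python
def transform(source_nodes, mapping_rules):
    """Map nodes of the platform-independent model onto nodes of the
    platform-dependent model, applying one rule per source node type."""
    target = []
    for node in source_nodes:
        rule = mapping_rules[node["type"]]  # look up rule by node type
        target.append(rule(node))
    return target

# Hypothetical rules: an abstract "move" becomes a robot-specific point-to-
# point command, an abstract "spray" becomes a tool (end-effector) command.
rules = {
    "move":  lambda n: {"cmd": "PTP",  "pose": n["pose"]},
    "spray": lambda n: {"cmd": "TOOL", "flow": n["flow"]},
}

source = [{"type": "move", "pose": (0, 0, 1)},
          {"type": "spray", "flow": 1.5}]
target = transform(source, rules)
```

Machine specifics enter through the rule set: a different target platform would supply different rules while the platform-independent `source` stays unchanged.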
  • A model of the machine can contain, for example, and thus take into account: capabilities of the machine, e.g. a maximum gripping force of the end effector and/or degrees of freedom or freedom of movement of the manipulator; control variables or controlled variables, i.e. input variables of the individual actuators or output variables of the individual sensors; and characteristics of the specifically used platform, e.g. of the machine type.
  • A code generator 412 can be used to generate a program code 116 (e.g. source code) from the control model 116m.
  • The program code 116 can denote the respective code in which the control program 116 is written.
  • Depending on the process task, the program code 116 can be generated for a communicating overall system (for example the robot controller and the PLC controller).
  • the program code 116 can optionally have predefined parts to which the program code 116 can be adapted by a developer.
  • The code is generated by means of templates that exist for each target language. These templates have instances of the platform-dependent model as input and describe, at the metamodel level, how text fragments are generated from them.
  • Besides pure text output, these templates also have control structures, e.g. branches.
  • A template engine in turn takes a template and an instance of the platform-dependent model as input and generates one or more text files from it.
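A minimal stand-in for such a template-based generator can be built with Python's `string.Template`. The command syntax below (`MOVE TO …`) is a made-up target language for illustration; a real generator would have one template set per target language, plus control structures such as branches:

```python
from string import Template

# One template per node kind of the target language; placeholders are
# filled from the platform-dependent model instance.
MOVE_TEMPLATE = Template("MOVE TO $x, $y, $z SPEED $speed\n")

def generate(model_nodes):
    """Generate program text fragments from model nodes via the template,
    concatenating them into one output text (one text file's contents)."""
    return "".join(MOVE_TEMPLATE.substitute(**n) for n in model_nodes)

code = generate([{"x": 0, "y": 10, "z": 5, "speed": 100},
                 {"x": 0, "y": 20, "z": 5, "speed": 50}])
```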
  • Fig. 2 illustrates the method 100 according to various embodiments in a schematic flow diagram 200.
  • The method 100 may include in 101: attaching 201 a mobile sensor arrangement 102 to the work device 104 of the manual process flow that is to be reproduced by the machine 114.
  • The process flow is, for example, path-based (e.g. gluing, welding, painting or milling).
  • The sensor arrangement 102 can be fastened to the work device 104 magnetically, with clamping screws, with a clip or Velcro tape, and optionally sequentially on several work devices 104.
  • The sensor arrangement 102 can have one or more than one sensor.
  • The sensor arrangement 102 can optionally have one or more than one mobile unit, each mobile unit having at least one sensor of the sensor arrangement 102, e.g. a mobile unit with a trajectory sensor (also referred to as a tracker).
  • The or each mobile unit can be self-sufficient; for example, its own energy supply and/or means for wireless communication can be accommodated together with the at least one sensor in a housing of the mobile unit (also referred to as internal sensors).
  • As a sensor (also referred to as a detector), a transducer can be understood that is set up to record, according to its sensor type, a corresponding property of its environment qualitatively or quantitatively as a measured variable, e.g. a physical or chemical property and/or a material nature.
  • The measured variable is the physical variable to which the measurement using the sensor applies.
  • A sensor can be of a certain sensor type, for example an operating point sensor or a trajectory sensor.
  • the operating point sensor can, for example, detect the operating point of the implement 104.
  • The trajectory sensor can, for example, detect a movement and/or the position (i.e. orientation and/or location) of the implement 104.
  • The sensor arrangement 102 can have at least one optoelectronic sensor (e.g. a camera), at least one inertial sensor and/or at least one actuation sensor.
  • The inertial sensor can, for example, have a motion sensor (e.g. an acceleration sensor and/or a speed sensor) and/or a position sensor (e.g. an orientation sensor and/or a location sensor).
  • The sensor arrangement 102 and/or the additional sensor arrangement 112 can have at least one sensor for detecting: an electrodynamic property (e.g. current, voltage, magnetic field or power); a location-related property (e.g. alignment and/or position); a movement-related property (e.g. speed and/or acceleration); a thermal property (e.g. temperature or temperature difference); a geometric property (e.g. distance, solid angle, volume); a photometric property (e.g. light intensity, brightness, color, energy, or power); and/or a mechanical property (e.g. force, pressure, mass, energy, power, torque, actuation, etc.).
  • At least one sensor of the sensor arrangement 102 can optionally be arranged separately from the mobile unit.
  • The at least one sensor can, for example, have an actuation sensor and/or an operating point sensor.
  • The at least one sensor can be attached to the manual implement 104, e.g. to a supply device of the manual work device 104, e.g. for measuring a volume flow, a temperature or a current strength.
  • the manual implement 104 can be a handheld device that is powered by a stationary supply device, e.g. with a fluid and / or with energy.
  • The operating point can describe the point in the characteristic map or on the characteristic curve of a technical device that is assumed due to the system properties, external influences and parameters of the device.
  • Illustratively, the operating point can describe the operating status of the device.
  • The method 100 can optionally further comprise in 101: calibrating the sensor arrangement 102.
  • The calibration can include calibrating the position of the sensor arrangement 102 in relation to the coordinate system of the implement 104, the additional sensor arrangement 112 and/or a global coordinate system.
  • The global coordinate system can be, for example, a coordinate system fixed in space.
  • the calibration can include, for example: detecting the position of the sensor arrangement 102 with respect to the working device 104.
  • the calibration can alternatively or additionally include: detecting the position of the sensor arrangement 102 in space, e.g. in terms of the global coordinate system. Using the two position information items that describe the sensor arrangement 102 relatively in space and relative to the work device 104, the trajectory 111 of the work device 104 can be determined on the basis of the trajectory of the sensor arrangement 102 (see FIG. 1).
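Determining the trajectory 111 of the work device from the tracked trajectory of the sensor arrangement amounts to composing the two pieces of position information. The sketch below simplifies the calibration to a pure translation offset; a real implementation would also handle orientation. Function names and the example numbers are assumptions:

```python
def calibrate_offset(tracker_pos, tool_pos):
    """Determine the fixed offset of the sensor arrangement relative to the
    tool tip (translation only, for simplicity): tool = tracker + offset."""
    return tuple(t - s for s, t in zip(tracker_pos, tool_pos))

def tool_trajectory(tracker_trajectory, offset):
    """Apply the calibrated offset to every tracked position to recover
    the trajectory of the work device itself."""
    return [tuple(p + o for p, o in zip(pos, offset))
            for pos in tracker_trajectory]

# Hypothetical calibration: the tool tip sits 12 cm below the mounted tracker
offset = calibrate_offset((0.0, 0.0, 0.5), (0.0, 0.0, 0.38))
traj = tool_trajectory([(0.1, 0.2, 0.5), (0.2, 0.2, 0.5)], offset)
```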
  • The method 100 may include at 101: manually carrying out the process flow, wherein the work device 104 is carried and/or operated manually (i.e. by a person).
  • the work device 104 can be a work device carried or at least moved by muscle power.
  • The sensor arrangement 102 can send its data to the external receiver, and/or the external receiver can determine the position and movement of the sensor arrangement 102 in space.
  • The signal transmission to the external receiver can take place via radio, e.g. Bluetooth.
  • The sensor arrangement 102 can have at least one trigger, which is coupled with a user interface of the implement 104.
  • the trigger can, for example, record the manual control of the work device 104 at the user interface.
  • The trajectory 111 (e.g. position and/or movement) of the sensor arrangement 102 can be detected by means of at least one camera 112 or another type of sensor 112, e.g. by means of a laser scanner 112, a distance sensor 112, a sonar sensor 112 and/or a radar sensor 112.
  • The method 100 can further optionally include in 103: processing the acquired data.
  • The movement data or the trajectory 111 can also be smoothed (e.g. so that a path results that is as straight as possible).
  • The stored parameters of the process sequence can represent boundary conditions, such as optimal values and/or limit values.
  • The stored parameters can characterize the process flow, such as the working speed, the holding time, the pressure force, etc.
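Smoothing the recorded movement data can be illustrated with a simple moving average over the trajectory points. The window size and averaging scheme below are assumptions for illustration, not the patent's method:

```python
def smooth(trajectory, window=3):
    """Moving-average smoothing of a recorded trajectory, so that e.g. a
    shaky hand-guided path becomes as straight as possible. Each point is
    replaced by the average of its neighbours within the window."""
    half = window // 2
    out = []
    for i in range(len(trajectory)):
        lo, hi = max(0, i - half), min(len(trajectory), i + half + 1)
        pts = trajectory[lo:hi]
        # average each coordinate over the window (shrunk at the ends)
        out.append(tuple(sum(c) / len(pts) for c in zip(*pts)))
    return out

# A noisy, roughly straight hand-guided path in 2D
raw = [(0.0, 0.0), (1.0, 0.2), (2.0, -0.1), (3.0, 0.1), (4.0, 0.0)]
smoothed = smooth(raw)
```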
  • The determined process model 104m can be set up platform-independently, i.e. not tailored to a specific machine.
  • the method 100 can have in 105: transferring 209 the process model 104m into the control model 116m.
  • The process model 104m can be processed with stored machine-specific information 114m from one or more different machines 114 to form the control model 116m (also referred to as a process and machine model).
  • The sum of the machine-specific information 114m may also be referred to as a model 114m of the or each machine 114.
  • The machine-specific information 114m can describe, for example, at least one tool 114e of the machine 114 (also referred to as machine tool 114e) and/or the positioning device 114p of the machine 114, to which the machine tool 114e is attached.
  • The method 100 can include in 103: post-processing 207 of the process model 104m, and can optionally have: post-processing 211 of the control model 116m that has been formed.
  • The post-processing 207 can, for example, be carried out by a user with the help of a user interface 406, e.g. application software executed on a PC, tablet or smartphone.
  • the method 100 can optionally include in 207 or 211: visualization and simulation of the control model 116m or process model 104m in a virtual environment and / or editing the trajectory 111, 113, the process logic and the process parameters.
  • a spatial sub-model 502 of the process model 104m can, for example, be edited together with the representation of the corresponding trajectory 111.
  • the method 100 can have in 107: code generation for one or more than one machine 114, optionally of different types. A control program 116, which can be executed by the corresponding machine 114, can be formed by means of the code generation 107.
  • the code generation 107 takes place, for example, by means of templates that exist for each target language. These templates take instances of the platform-dependent model as input and describe, at the metamodel level, how text fragments are generated from them. In addition to pure text output, these templates also have control structures, e.g. branches. A template engine in turn takes a template and an instance of the platform-dependent model as input and generates one or more text files from it.
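The template mechanism described above can be sketched as follows. This is a minimal illustration only: the use of Python's `string.Template` as the template engine, the KRL-like `LIN` move instruction and all function names are assumptions for the sketch, not details taken from the patent.

```python
from string import Template

# One template per target language; here a KRL-like linear-move fragment.
# $x, $y, $z are placeholders filled from the platform-dependent model.
KRL_MOVE = Template("LIN {X $x, Y $y, Z $z} C_DIS\n")

def generate_moves(trajectory, template):
    """Instantiate the template once per trajectory point and
    concatenate the resulting text fragments into a program body."""
    return "".join(template.substitute(x=p[0], y=p[1], z=p[2])
                   for p in trajectory)

# Hypothetical trajectory taken from a control model instance.
body = generate_moves([(0.0, 0.0, 100.0), (50.0, 0.0, 100.0)], KRL_MOVE)
```

A real template engine would additionally support branches and loops inside the template, as the passage above notes; this sketch only shows the pure text-output case.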
  • the code generation 107 can, for example, be carried out for a specific target platform.
  • the source code can be generated for different target languages, e.g. depending on which target language is suitable for the corresponding machine.
  • the source code can, e.g., be subsequently adapted and edited by a developer, for example by means of a suitable development environment.
  • Fig. 3 illustrates the method 100 according to various embodiments in a schematic side view 300.
  • the sensor arrangement 102 can, for example, enable a software-based method 100 for teaching an industrial robot which is also accessible to a technical layman 106.
  • a non-programmer 106, e.g. at least one task expert 106 such as a mechanic or a welder, may thus be enabled to teach the robot 114; the control software of the robot 114 can be generated completely automatically.
  • in the method 100, the process model 104m is independent of specific machine types and can therefore also be reused for any machine type, e.g. from another manufacturer. Conversion of the machine 114 can thus be facilitated.
  • the method 100 can include in 101: acquiring the data by means of the sensor arrangement 102 (clearly a physical sensor component) and / or an additional (clearly external) sensor arrangement 112.
  • the or each mobile unit can be attached to a work device 104 and, by means of at least one sensor integrated therein, detect the position and / or the acceleration of the implement 104.
  • attaching the or each mobile unit of the sensor arrangement 102 to the work device 104 can be carried out in 101 by means of a fastening device, e.g. magnetic, clip, or Velcro.
  • the sensor arrangement 102 can have at least one trigger 102t (for example on a trigger of a glue gun) which detects manual control of the implement 104.
  • the additional sensor arrangement 112 can detect the position of the sensor arrangement 102 in space 301. Both the data from the sensor arrangement 102 (e.g. its mobile unit) and the data from the external sensor system 112 can be transmitted in a time-synchronized manner to a computing unit 302 (e.g. PC, laptop, etc.) which has the external receiver or is communicatively connected to it.
  • the sensor arrangement 102 can be calibrated 203 in its relative position (i.e. position and / or orientation) to the work device 104, so that the coordinate system of the implement 104 can be determined. Furthermore, a calibration can be carried out in a global coordinate system.
  • Fig. 4A illustrates the method 100 according to various embodiments in a schematic flow diagram 400a.
  • a person can exemplarily run the process once or several times with the actual work device 104
  • the process sequence can include screwing a screw using an electric screwdriver 104.
  • the sensor arrangement 102 and / or the additional sensor arrangement 112 can detect at least one measured variable which defines a spatial position (and / or its change); in this way, the trajectory 111 of the work device 104 can be recorded.
  • the trajectory 111 of the implement 104 can, for example, have information about a speed, a position, an acceleration and / or an orientation of the implement 104.
  • a provided function (and / or its change) can be recorded in a parameterized manner, e.g. a rotational speed of a shaft of the implement, a temperature of a heating device of the implement, or a current through a welding tip of the implement.
  • the trigger 102t can be configured to detect manual control of the implement 104, e.g. at a user interface of the work device 104.
  • the data 402 collected in 101 can be sent to a computing unit 302 (or another processing unit) and there be enriched 403 with activity-specific process parameters 404 (e.g. volume flow of a spray gun 104). Enriching with activity-specific process parameters 404 can, for example, be done automatically or manually (e.g. via user input).
  • the user can choose from a range of tools (e.g. spray guns) in the graphical user interface of the computing unit.
  • the process-specific parameters (e.g. volume flow) are then queried via an input form.
  • the computing unit 302 can use the acquired data 402 to create 103 the process model 104m (also referred to as a process-specific model), which describes the process task without reference to a specific automation.
  • the processing unit 302 can optionally be set up to change the data 402, for example to optimize and / or abstract it. For example, changing can include thinning out and / or smoothing the trajectory 111 (for example the movement data), identifying and / or optimizing sub-processes, and closing logic connections.
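The thinning and smoothing mentioned above can be sketched as follows. This is a minimal 1-D illustration under assumptions of my own (moving-average smoothing, a fixed distance tolerance, and the function names `smooth`/`thin`); the patent does not prescribe a particular algorithm.

```python
def smooth(points, window=3):
    """Moving-average smoothing of a recorded trajectory
    (reduced to 1-D scalars here for brevity)."""
    half = window // 2
    out = []
    for i in range(len(points)):
        lo, hi = max(0, i - half), min(len(points), i + half + 1)
        out.append(sum(points[lo:hi]) / (hi - lo))
    return out

def thin(points, min_delta=0.5):
    """Thin out the trajectory: drop points closer than
    min_delta to the previously kept point."""
    kept = [points[0]]
    for p in points[1:]:
        if abs(p - kept[-1]) >= min_delta:
            kept.append(p)
    return kept
```

In practice both steps would operate on 3-D poses rather than scalars, and the tolerance would depend on the sensor noise of the arrangement 102.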
  • the process model 104m can optionally be post-processed by means of a user interface 406 (e.g. on a PC, tablet or smartphone); components (e.g. sections or individual points) of the trajectory 111 of the implement 104 can be adapted and / or finely adjusted.
  • an actuation with the at least one trigger 102t can be tracked, changed and / or linked to the trajectory 111.
  • the process model 104m can further be provided with machine-specific information 114m (e.g. of the concrete robot platform 114 or at least the end effector 114e).
  • the processing unit 302 can automatically determine the control model 116m from the process model 104m and the provided machine-specific information 114m.
  • the control model 116m represents a platform-dependent model. The incoming data is an instance of the platform-independent metamodel (more generally, of the platform-independent model 104m); the platform-dependent model 116m is likewise available via a metamodel, which describes its data types and relationships.
  • the model transformation describes a mapping function by which nodes or groups of nodes from the platform-independent model 104m are mapped onto nodes or node groups of the platform-dependent model 116m. It also describes how these generated nodes are related to each other. The mapping takes place, for example, taking into account the respective machine model 114m.
  • a model of the machine can contain, for example, and thus take into account: characteristics of the machine, e.g. the maximum gripping force of the end effector and / or degrees of freedom or freedom of movement of the machine; control variables or controlled variables (input variables of the individual actuators or output variables of the individual sensors); end positions of the partial coordinate systems; permissible operating parameters of the machine, such as the maximum operating temperature; and characteristics of a specifically used platform, e.g. the machine type.
  • the machine-specific control model 116m can optionally be post-processed 211 using a user interface 406 (e.g. provided on a PC, tablet or smartphone).
  • for example, the trajectory 113 of the machine 114 (e.g. of the end effector 114e) can be changed, hold times or the like can be specified, and error compensation and / or a communication path with the system control can be defined.
  • a program code 116 can be generated 107 from the control model 116m, which program code can be stored on the control device 702 (also referred to as controller 702) of the machine 114.
  • the code is generated, for example, by means of templates that exist for each target language. These templates take instances of the platform-dependent model as input and describe, at the metamodel level, how text fragments are generated from them. In addition to pure text output, these templates also have control structures, e.g. branches. A template engine in turn takes a template and an instance of the platform-dependent model as input and generates one or more text files from it.
  • an external system, for example a PC, a PLC or the like, can optionally be used to control the end effector 114e. Program code 116 for an internal control device 116a of the machine (also referred to as robot code) and / or program code 116 for the external system 116b and / or for the communication interface 116b to this can be generated.
  • the machine 114 can have an internal 116a and / or external 116b control device 702, which is set up to control the end effector 114e and / or the positioning device 114p of the machine 114.
  • the program code 116 can be executed by the control device 116a.
  • the program code 116 can denote the respective code in which the control program 116 is written. Depending on the process task, the information technology infrastructure and the specific requirements, various target platforms on which the program code 116 is to be executed can be served.
  • the program code 116 can be generated for a communicating overall system (e.g. the machine 114 and an external system). The program code 116 can optionally have predefined parts at which it can be adapted by a developer.
  • the program code 116 can be changed subsequently, e.g. adjusted, for example by means of a development environment.
  • Fig. 4B illustrates a system 400b for performing the method 100 according to various embodiments in a schematic system diagram.
  • the system 400b can have at least one sensor arrangement 452, for example the sensor arrangement 102 that can be attached to the work device 104, or the stationary sensor arrangement 112.
  • the system 400b can furthermore have at least one processing unit 302, which has one or more than one processor.
  • the system 400b can optionally have a code generator 412 which is set up to convert the machine-specific control model 116m into the control program 116.
  • the system 400b can optionally have a machine 114 which has a control device 702 programmed with the control program 116 and a programming interface 702i, by means of which the control program 116 can be provided to the control device 702.
  • Fig. 5 illustrates the method 100 according to various embodiments in a schematic model diagram 500.
  • the process model 104m (process-specific model) clearly describes a process task without reference to an underlying automation solution.
  • the process model 104m can have several sub-models 502, 504, 506 (e.g. per sub-process).
  • a first partial model 502 (also referred to as spatial model 502) can describe physical, geometric, position-related and / or movement-related properties, e.g. objects, their shape or position in Cartesian space 701, 703, 705, and / or at least one trajectory 111 in this space.
  • the spatial model 502 can describe process activities annotated in Cartesian space (e.g. gripping an object) and / or the activity-related process parameters.
  • the Cartesian space can be spanned by directions 701, 703, 705 which, for example, are stationary.
  • the first partial model 502 can describe alternative trajectories 111 or sections thereof, which can be referenced by means of a third partial model 506 (also referred to as logic model 506).
  • the spatial model 502 clearly describes how the process task is carried out in the physical space 701, 703, 705 and which task-specific process activities (for example activating the spray function of a spray gun) are linked to it.
  • the process activities can be represented by at least one activity-related process parameter, e.g. at least one position-related and / or movement-related process parameter (e.g. a speed) and / or at least one functional process parameter (e.g. a volume flow of a spray gun).
  • the spatial model 502 describes, for example, the trajectory 111 together with the annotated process activities.
  • a second partial model 504 (also referred to as a machine-unspecific adaptation model) can describe which different process situations 514 can occur (e.g. painting of component A or component B) and / or how these process situations 514 can be distinguished (e.g. based on a shape of component A and / or of component B), e.g. based on criteria.
  • the second partial model 504 defines a criterion for recognizing a process situation 514, e.g. by means of a sensor.
  • the second partial model 504 describes reference data for the criteria that were recorded, for example, during the teaching 101 (e.g. example images of the components).
  • environment variables are defined for this (e.g. shape of the component, temperature, etc.). Based on the environment variables, situations for certain process sequences can be defined.
  • a third partial model 506 (also referred to as a machine-unspecific logic model) describes the underlying process logic 516.
  • the process logic 516 can include at least one control structure 516s (e.g. loop, branch, jump) and link it with the adaptation model 504. In this way a complex schedule can be represented, e.g. "if component A has been recognized, a jump is made to path 111 with the process parameters annotated there".
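The three machine-unspecific sub-models 502, 504, 506 and their interplay can be sketched as plain data structures. All class and attribute names here are illustrative assumptions; the sketch only mirrors the described structure: spatial trajectories with annotated activities, situation criteria, and logic rules that jump between them.

```python
from dataclasses import dataclass

@dataclass
class SpatialModel:          # sub-model 502
    trajectories: dict       # name -> list of (x, y, z) points
    activities: dict         # trajectory name -> annotated process activity

@dataclass
class AdaptationModel:       # sub-model 504
    situations: dict         # situation name -> criterion callable

@dataclass
class LogicModel:            # sub-model 506
    rules: list              # (situation name, trajectory name) pairs

@dataclass
class ProcessModel:          # process model 104m
    spatial: SpatialModel
    adaptation: AdaptationModel
    logic: LogicModel

    def select_trajectory(self, observation):
        """'If component A has been recognized, jump to path 111':
        return the first trajectory whose situation criterion matches."""
        for situation, trajectory in self.logic.rules:
            if self.adaptation.situations[situation](observation):
                return trajectory
        return None
```

Note that nothing in these structures references a machine: the model transformation later maps them onto the machine-specific sub-models 552, 554, 556.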
  • the control model 116m can represent a fully integrated automation script for a specific hardware selection (positioning device 114p, control device 702 and / or end effector 114e).
  • the control model 116m can have multiple sub-models 552, 554, 556, 558.
  • an additional first partial model 552 (also referred to as a physical model 552) clearly describes the spatial information analogously to the spatial model 502, however, in relation to the machine 114. For example, process-specific information can be transferred to the coordinate system 705, 713 of the machine 114. The attachment and alignment of the end effector 114e of the machine 114 can be taken into account in order to map the coordinate system 701, 703, 705, in which the implement 104 was detected during teaching, onto the machine. Process parameters of the first partial model 502 are mapped 105a to the specific tool of the end effector 114e (i.e. machine-specific).
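Transferring recorded points into the machine's coordinate system, as described above, amounts to a rigid coordinate transform. The sketch below assumes a planar case (rotation about the vertical axis plus a translation of the machine base); the calibration values and the function name are illustrative, not taken from the patent.

```python
import math

def world_to_machine(point, base_xyz, yaw):
    """Map a point recorded in the world frame 701/703/705 into the
    machine's base frame: subtract the base position, then rotate by
    the negative base yaw about the vertical axis."""
    px, py, pz = (point[i] - base_xyz[i] for i in range(3))
    c, s = math.cos(-yaw), math.sin(-yaw)
    return (c * px - s * py, s * px + c * py, pz)
```

In a full implementation the calibration 203 of the sensor arrangement 102 relative to the implement 104 would supply a further transform, composed with this one.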
  • An additional second partial model 554 (also referred to as a machine-specific adaptation model 554) can describe a large number of parameters and situations.
  • the parameters can have a name and a data type
  • one or more than one criterion can be used to define how a process situation can be recognized using specific parameter values (e.g. using a sensor of the machine 114).
  • the dependency for determining the parameter values can be specified on the basis of available sensors 114s of the machine 114.
  • the machine-specific adaptation model 554 can be derived from the machine-unspecific adaptation model 504 and optionally further refined 211 by the user.
  • an additional third partial model 556 (also referred to as a machine-specific logic model) can describe the process logic by means of control structures 516s (e.g. loops, conditions, etc.).
  • the machine-specific logic model 556 can clearly link the other sub-models, e.g. the physical model 552.
  • the machine-specific logic model 556 can be generated from the machine-unspecific logic model 506 by means of mapping 105c.
  • a fourth partial model 558 can describe the integration of the automation script with possible third-party systems 518.
  • the external system 518 can have at least one sensor (for example at least one camera), a control device or an enterprise resource planning system (ERP system).
  • at least one integration-specific protocol can be defined, which the machine 114 communicatively embeds 520 in the system landscape.
  • the interaction model 558 can, for example, be determined on the basis of the machine-unspecific logic model 504 and can optionally be refined by the user.
  • the physical model 552 can clearly define movement processes and movement sequences in combination with an activity of the end effector 114e.
  • the interaction model 558 describes, for example, how the machine 114 communicates in the system landscape (e.g. transmits data to and / or receives data from the external system 518); the machine-specific logic model 556 describes under which conditions which procedure is required.
  • Fig. 6 illustrates the method 100 according to various embodiments in a schematic flow diagram 600.
  • a control program 116 can be formed which can be executed by the machine 114.
  • the control program 116 can have source code files which are set up in the respective target format of the machine 114.
  • each machine 114 (e.g. robot platform) can execute a programming language, e.g. KRL, Rapid, Visual Basic, etc.
  • the transformation of control models 116m into text can be done using templates.
  • the templates can be instantiated with process- and machine-specific models 114m, 104m and generate the respective program code.
  • Fig. 7 illustrates a machine 114 in accordance with various embodiments in a schematic layout diagram 700.
  • the machine 114 may herein be a machine programmable by means of a control program 116. Once programmed, the machine 114 can be configured to carry out a process flow autonomously and optionally to vary the process sequence (i.e. the execution of the task) within limits depending on sensor information.
  • the machine 114 can have a control device 702 which is set up to control at least one actuator 704 (also referred to as an actuator) of the machine 114 in accordance with the control program 116.
  • the machine 114 may have a kinematic chain 706 along which an action of the at least one actuator 704 is transmitted, e.g. along the coupling of the links of the kinematic chain 706 with one another.
  • the controller 702 may, for example, be a programmable logic controller (PLC).
  • the kinematic chain 706 can have an end effector 114e positionable by means of a positioning device 114p.
  • the last link of the kinematic chain 706 of the machine 114, which is set up to directly act on a workpiece (e.g. to process it, i.e. machine it), can be understood as the end effector 114e. Activities such as acting on the workpiece, a preparatory step for it, or a follow-up step thereto can be referred to as process activities.
  • the process activity can, for example, be primary shaping, joining (e.g. welding, coating, screwing, plugging, contacting, gluing or otherwise assembling), separating (e.g. grinding, milling, sawing or otherwise machining, punching or dismantling), forming, heating, relocating (e.g. gripping, fitting, rotating or moving), or the like.
  • the process flow can be path-based, i.e. mapped by moving the end effector 114e along a trajectory 113.
  • the positioning device 114p may include at least one actuator that can relocate the end effector 114e to a position (also referred to as positioning).
  • the end effector 114e can have at least one actuator 704 which is arranged to carry out the process activity, e.g. by means of a tool of the end effector 114e.
  • the tool can generally provide a function appropriate to the process activity, by means of which the workpiece is acted upon.
  • the tool can, for example, be a primary-shaping tool or a joining tool (e.g. screwdriver, glue gun or the like). The joining tool can, for example, be a coating tool (e.g. a paint spray gun or a powder coating gun) or be formed from it.
  • the machine 114 can have at least one internal sensor 114i which is configured to detect a property of the machine 114 itself.
  • the machine 114 can have at least one external sensor 114s (e.g. a camera) which is set up to detect a property of the environment of the machine 114.
  • the external sensor 114s can be used, for example, to recognize whether a predefined process situation is present (ie a situation-related criterion is met). Using the property detected by means of the at least one sensor 114s, 114i, it can be determined, for example, whether a criterion is met. If the criterion is parameterized according to the sensor type (ie mapped onto a property that can be detected by the sensor type), the property detected by the sensor can be compared with the parameterized criterion in order to determine whether the criterion is met.
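The criterion check described above can be sketched as follows. The dictionary layout of the parameterized criterion (`property` name plus tolerated `range`) and the example situation are illustrative assumptions; the sketch only shows comparing a sensor-detected property with a criterion parameterized according to the sensor type.

```python
def criterion_met(detected, criterion):
    """Compare a property detected by a sensor with a criterion
    that has been parameterized onto that sensor type: met if the
    detected value falls within the tolerated range."""
    value = detected.get(criterion["property"])
    if value is None:
        return False  # the sensor did not detect the relevant property
    lo, hi = criterion["range"]
    return lo <= value <= hi

# Hypothetical situation criterion: component A is ~100 mm wide.
situation = {"property": "component_width_mm", "range": (98.0, 102.0)}
```

Used this way, the external sensor 114s delivers `detected`, and a match means the predefined process situation (e.g. component A present) is recognized.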
  • the working point can, for example, describe the actuators 704 of the machine 114.
  • the storage medium can be provided as part of the control device 702 and / or separately therefrom.
  • the storage medium can, for example, be an electronic semiconductor storage medium, e.g. a read-only memory (ROM), a random access memory (RAM), a USB stick, a solid-state drive (SSD), a hard disk drive (HDD) or a storage disk (MD).
  • the storage medium can also be an optical storage medium, e.g. a compact disc, a digital versatile disc (DVD), or a magneto-optical disc.
  • the control model has been described with reference to one machine; what has been described can, however, also apply to a plurality of separate, e.g. mutually communicating machines (e.g. a process line), as well as, for one machine, to several control models.
  • Example 1 is a method comprising: determining a machine-independent process model on the basis of data, the data representing handling of a work device when carrying out a process flow, the process flow having a plurality of sub-processes, the process model linking, for each sub-process of the plurality of sub-processes, a process activity with spatial information of the sub-process; and mapping the process model, by means of a model of a machine, onto a machine-specific control model.
  • Example 2 is the method according to Example 1, wherein the mapping comprises mapping the spatial information onto an operating point of a positioning device of the machine; and / or wherein the mapping comprises mapping the process activity onto an operating point of an end effector of the machine.
  • Example 3 is the method according to Example 1 or 2, wherein the machine has at least one actuator to which the operating point is related; for example, the actuator is a motor; for example, the actuator is part of the positioning device and / or the end effector.
  • Example 4 is the method according to one of Examples 1 to 3, the spatial information having exactly three spatial coordinates (e.g. being related to a Cartesian coordinate system and / or having exactly three directional coordinates) and / or wherein the spatial information is related to a rectilinear coordinate system.
  • Example 5 is the method according to one of Examples 1 to 4, wherein one or more than one sub-process of the plurality of sub-processes is conditional according to a criterion (also referred to as a conditional sub-process), the model of the machine represents a sensor type of the machine, the mapping comprises parameterizing the criterion according to the sensor type (e.g. mapping it onto a sensor detection area), and the determination of the process model comprises determining (e.g. on the basis of the data) the criterion and / or the conditional sub-process.
  • Example 6 is the method according to one of Examples 1 to 5, wherein the mapping comprises mapping the spatial information onto a coordinate system of the machine (which e.g. is related to the at least one actuator of the machine).
  • Example 7 is the method according to one of Examples 1 to 6, wherein the determination of the process model comprises assigning one or more than one process logic (e.g. a rule) to the process flow, the process logic optionally being set up to determine whether the criterion is met.
  • Example 8 is the method according to one of Examples 1 to 7, the machine-independent process model (e.g. its process logic) having, for at least one sub-process of the plurality of sub-processes, at least two links between process activity and spatial information of the sub-process, whereby the two links are alternatives between which a decision is made based on a criterion (e.g. using the process logic).
  • Example 9 is the method according to one of Examples 1 to 8, wherein the spatial information represents a position and / or orientation of the implement; and / or wherein the process activity represents an actuation and / or a function of the implement.
  • Example 10 is a method according to any of Examples 1 to 9, further comprising: presenting a user interface which is set up to change the machine-independent process model or the machine-specific control model on the basis of a user input.
  • Example 11 is a method according to one of Examples 1 to 10, further comprising: acquiring the data at least in part by means of a sensor arrangement which is attached to the working device, optionally the working device (e.g. directly and / or muscle-powered) being handled and / or carried by a person (e.g. their hand), with, for example, the sensor arrangement being removable (i.e. detachable).
  • Example 12 is a method comprising: detecting data (for example the data according to one of Examples 1 to 11) at least in part by means of a sensor arrangement which is detachably attached to a (for example manually movable) working device, the data representing a handling of the working device while a process sequence is carried out, wherein the working device is moved by a person; and determining a machine-independent process model based on the data, which represents the process flow.
  • Example 13 is the method according to Example 11 or 12, the sensor arrangement having a fastening device by means of which the sensor arrangement is detachably fastened to the working device.
  • Example 14 is the method according to any one of Examples 11 to 13, wherein the sensor arrangement (e.g. its fastening device) has one or more than one magnet; and / or a clamping device (e.g. comprising a clip and / or clamping screws); and / or a Velcro strip.
  • Example 15 is the method according to any one of Examples 11 to 14, wherein the sensor arrangement is supplied with energy separately from the working device; and / or wherein the sensor arrangement is galvanically separated from the working device.
  • Example 16 is the method according to one of Examples 11 to 15, wherein the sensor arrangement has one or more than one trajectory sensor (e.g. position sensor and / or movement sensor) by means of which the data are recorded.
  • Example 17 is the method according to one of Examples 11 to 16, the sensor arrangement having an operating point sensor by means of which an operating point of the working device is detected, the operating point sensor for example having a flow sensor, a temperature sensor and / or a current sensor (e.g. power sensor).
  • Example 18 is the method according to one of Examples 11 to 17, further comprising: wireless transmission of the data from the sensor arrangement to a base station, the base station having, for example, a computing unit, wherein the computing unit is used to determine the process model.
  • Example 19 is the method according to any one of Examples 1 to 18, wherein the data (e.g. detected by means of the sensor arrangement) represent a trajectory (e.g. a position-related and / or movement-related property) of the implement, and / or a property according to the work point of the implement.
  • Example 20 is the method according to Example 19, wherein the determination of the machine-independent process model comprises determining a trajectory of the process sequence, e.g. along which the implement is guided and / or along which the sub-processes are arranged.
  • Example 21 is the method according to one of Examples 1 to 20, wherein the determination of the machine-independent process model further comprises taking into account at least one boundary condition for the process activity and / or the spatial information, wherein, for example, the determination of the machine-independent process model further comprises a model of at least one sub-process of the plurality of sub-processes, the boundary condition being stored, for example.
  • Example 22 is the method according to Example 21, wherein the at least one boundary condition is related to temporal information and / or mechanical information.
  • Example 23 is the method according to one of Examples 1 to 22, the data being recorded in a time-resolved manner.
  • Example 24 is the method according to any one of Examples 1 to 23, wherein the model of the machine represents a sensor type of the machine, an end effector of the machine and / or a positioning device of the machine.
  • Example 25 is the method according to one of Examples 1 to 24, the machine-independent process model also representing a result of the process sequence.
  • Example 26 is the method according to one of Examples 1 to 25, further comprising: mapping the machine-specific control model to a control program which can be executed by the machine, e.g. whose code segments are set up according to a programming interface of the machine.
  • Example 27 is the method according to Example 26, wherein the mapping of the machine-specific control model to the control program comprises using one or more than one template.
  • Example 28 is the method according to Example 27, the template being instantiated using the model of the machine, the process model and / or a model of at least one sub-process.
  • Example 29 is the method according to one of Examples 1 to 28, further comprising: acquiring the data at least in part by means of an additional sensor arrangement which is arranged in a stationary manner, for example the acquisition by the additional sensor arrangement and by the sensor arrangement, and / or the corresponding parts of the data, being synchronized.
  • Example 30 is the method according to Example 29, wherein the additional sensor arrangement has an optoelectronic sensor, a distance sensor, a sonar sensor and / or a radar sensor, by means of which a trajectory (e.g. a position-related and / or movement-related property) is detected.
  • Example 31 is a system comprising: one or more than one processor which is set up to carry out the method according to one of Examples 1 to 30, optionally further comprising a wireless communication device for wireless communication with the sensor arrangement and / or the additional sensor arrangement.
  • Example 32 is the system according to Example 31, furthermore comprising: a memory in which the machine-independent process model, the machine-specific control model and / or the model of the machine are or will be stored; and / or one or more than one sensor arrangement for acquiring the data.
  • Example 33 is a non-volatile memory comprising code segments which, when executed by a processor, carry out the method according to any one of Examples 1 to 30.
  • Example 34 is a machine-independent process model, e.g. the process model from the method according to one of Examples 1 to 30, the process model linking, for each sub-process of a plurality of sub-processes, a process activity with spatial information of the sub-process, the several sub-processes being part of a process flow which is carried out by means of a working device, wherein, for example, the spatial information has exactly three position coordinates (e.g. related to a coordinate system) and/or exactly three directional coordinates, and/or wherein the spatial information is related to a rectilinear coordinate system.
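The machine-independent process model of Example 34 (sub-processes each linking a process activity with spatial information) can be sketched as a plain data structure. The class and field names below are illustrative assumptions, not terms from the patent:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class SpatialInfo:
    position: Tuple[float, float, float]   # three position coordinates in a rectilinear frame
    direction: Tuple[float, float, float]  # three directional coordinates

@dataclass
class SubProcess:
    activity: str        # process activity, e.g. "move", "grip" (illustrative names)
    spatial: SpatialInfo

@dataclass
class ProcessModel:
    """Machine-independent: no reference to any concrete machine or its API."""
    sub_processes: List[SubProcess] = field(default_factory=list)

    def add(self, activity: str, position, direction) -> None:
        # Each sub-process of the process flow links an activity with its spatial information.
        self.sub_processes.append(
            SubProcess(activity, SpatialInfo(tuple(position), tuple(direction))))

model = ProcessModel()
model.add("move", (0.0, 0.1, 0.5), (0.0, 0.0, 1.0))
model.add("grip", (0.0, 0.1, 0.3), (0.0, 0.0, 1.0))
print(len(model.sub_processes))  # 2
```

Keeping the model free of machine-specific detail is what allows the later mapping step to target different machines from the same recorded process flow.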
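Examples 26 to 28 describe mapping the machine-specific control model to an executable control program via one or more templates. A minimal sketch of such template-based code generation, where the `MOVE` instruction syntax is an assumed stand-in for a real machine's programming interface:

```python
from string import Template

# Assumed per-machine template: each waypoint of the control model is
# rendered as one code segment of the target machine's program syntax.
MACHINE_TEMPLATE = Template("MOVE ${x} ${y} ${z}\n")

def to_control_program(waypoints):
    """Render each (x, y, z) waypoint as one code segment via the template."""
    return "".join(
        MACHINE_TEMPLATE.substitute(x=x, y=y, z=z) for (x, y, z) in waypoints)

program = to_control_program([(0.0, 0.1, 0.5), (0.0, 0.1, 0.3)])
print(program)
```

Swapping the template is then sufficient to retarget the same control model to a machine with a different programming interface.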

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Numerical Control (AREA)
  • Manipulator (AREA)
EP20710474.6A 2019-03-07 2020-03-06 Verfahren, system sowie nichtflüchtiges speichermedium Pending EP3934858A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102019105820.1A DE102019105820A1 (de) 2019-03-07 2019-03-07 Verfahren, System sowie nichtflüchtiges Speichermedium
PCT/EP2020/056052 WO2020178435A1 (de) 2019-03-07 2020-03-06 Verfahren, system sowie nichtflüchtiges speichermedium

Publications (1)

Publication Number Publication Date
EP3934858A1 true EP3934858A1 (de) 2022-01-12

Family

ID=69784434

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20710474.6A Pending EP3934858A1 (de) 2019-03-07 2020-03-06 Verfahren, system sowie nichtflüchtiges speichermedium

Country Status (7)

Country Link
US (1) US20220143830A1 (zh)
EP (1) EP3934858A1 (zh)
JP (1) JP2022524385A (zh)
KR (1) KR20220002279A (zh)
CN (1) CN113710430A (zh)
DE (1) DE102019105820A1 (zh)
WO (1) WO2020178435A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11654566B2 (en) * 2020-08-12 2023-05-23 General Electric Company Robotic activity decomposition

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2776477B2 (ja) * 1996-02-13 1998-07-16 川崎重工業株式会社 ロボット3次元位置姿勢教示システム
SE531104C2 (sv) * 2002-12-30 2008-12-16 Abb Research Ltd Metod och system för programmering av en industrirobot
EP1842631B1 (en) * 2006-04-03 2008-11-19 ABB Research Ltd Apparatus and method for automatic path generation for an industrial robot
US7865285B2 (en) * 2006-12-27 2011-01-04 Caterpillar Inc Machine control system and method
US20090132088A1 (en) * 2007-04-24 2009-05-21 Tairob Ltd. Transfer of knowledge from a human skilled worker to an expert machine - the learning process
US9731419B2 (en) * 2010-08-03 2017-08-15 Praxair S.T. Technology, Inc. System and method for programming robots
US8578346B2 (en) * 2010-09-10 2013-11-05 International Business Machines Corporation System and method to validate and repair process flow drawings
AT12208U3 (de) * 2011-09-06 2013-07-15 Keba Ag Verfahren, steuerungssystem und bewegungsvorgabemittel zum programmieren oder vorgeben von bewegungen oder abläufen eines industrieroboters
JP5549749B1 (ja) * 2013-01-16 2014-07-16 株式会社安川電機 ロボット教示システム、ロボット教示プログラムの生成方法および教示ツール
US9186795B1 (en) * 2013-06-24 2015-11-17 Redwood Robotics, Inc. Programming and execution of force-based tasks with torque-controlled robot arms
JP2015111338A (ja) * 2013-12-06 2015-06-18 株式会社ツガミ 加工プログラム生成装置、加工システム、及び、加工プログラム生成用のプログラム
ES2759082T3 (es) * 2014-04-04 2020-05-07 Abb Schweiz Ag Aparato portátil para controlar un robot y método del mismo
US20150024345A1 (en) * 2014-10-09 2015-01-22 Reza Eftekhar Ashtiani A milling blank and a method for fabricating dental bridgework using milling blank
TWI805545B (zh) * 2016-04-12 2023-06-21 丹麥商環球機器人公司 用於藉由示範來程式化機器人之方法和電腦程式產品
WO2018022718A1 (en) * 2016-07-26 2018-02-01 University Of Connecticut Skill transfer from a person to a robot
JP6469069B2 (ja) * 2016-12-13 2019-02-13 ファナック株式会社 学習を容易化する機能を備えたロボット制御装置、及びロボット制御方法
JP6392905B2 (ja) * 2017-01-10 2018-09-19 ファナック株式会社 教示装置への衝撃を学習する機械学習装置、教示装置の衝撃抑制システムおよび機械学習方法
US11273553B2 (en) * 2017-06-05 2022-03-15 Autodesk, Inc. Adapting simulation data to real-world conditions encountered by physical processes
JP6514278B2 (ja) * 2017-07-04 2019-05-15 ファナック株式会社 レーザ加工ロボットシステム
CN107738256A (zh) * 2017-10-17 2018-02-27 佛山市南方数据科学研究院 一种手把手仿人示教机器人编程系统
CN108161904B (zh) * 2018-01-09 2019-12-03 青岛理工大学 基于增强现实的机器人在线示教装置、系统、方法、设备
US11281936B2 (en) * 2018-12-31 2022-03-22 Kofax, Inc. Systems and methods for identifying processes for robotic automation and building models therefor

Also Published As

Publication number Publication date
WO2020178435A1 (de) 2020-09-10
KR20220002279A (ko) 2022-01-06
US20220143830A1 (en) 2022-05-12
DE102019105820A1 (de) 2020-09-10
CN113710430A (zh) 2021-11-26
JP2022524385A (ja) 2022-05-02

Similar Documents

Publication Publication Date Title
EP3013537B1 (de) Verfahren und system zur programmierung eines roboters
EP2539116B1 (de) Prozessmodulbibliothek und programmierumgebung zur programmierung eines manipulatorprozesses
DE102013113370B4 (de) Roboteraufgabensteuerungskomponente mit erweiterbarer programmierumgebung
EP2285537B1 (de) Vorrichtung und verfahren zur rechnergestützten generierung einer manipulatorbahn
DE102010045529B4 (de) Interaktives Robotersteuerungssystem und Verwendungsverfahren
DE102011079117B4 (de) Verfahren zum Programmieren eines Roboters
DE102012218297B4 (de) Verfahren zur dynamischen Optimierung einer Robotersteuerschnittstelle
DE112018002565B4 (de) System und Verfahren zum direkten Anlernen eines Roboters
Nagata et al. Development of CAM system based on industrial robotic servo controller without using robot language
Andersen et al. Definition and initial case-based evaluation of hardware-independent robot skills for industrial robotic co-workers
DE102017120221B4 (de) Steuereinheit
DE102019134794B4 (de) Handgerät zum Trainieren mindestens einer Bewegung und mindestens einer Tätigkeit einer Maschine, System und Verfahren.
EP4078311A1 (de) Handgerät zum trainieren mindestens einer bewegung und mindestens einer tätigkeit einer maschine, system und verfahren
WO2020178435A1 (de) Verfahren, system sowie nichtflüchtiges speichermedium
DE102020200165B4 (de) Robotersteuereinrichtung und Verfahren zum Steuern eines Roboters
Froschauer et al. Workflow-based programming of human-robot interaction for collaborative assembly stations
DE202019005591U1 (de) System aufweisend ein Handgerät zum Trainieren mindestens einer Bewegung und mindestens einer Tätigkeit einer Maschine
Syrjänen Task level robot programming: Background, methods and current state
DE112022000487T5 (de) Befehlserzeugungsvorrichtung und Computerprogramm
Karlsson et al. Remote Programming and Configuration of a Robotic System: A Workplace Oriented Case Study
DE102016204137A1 (de) Programmierbares Manipulatorsystem mit einer Funktionsschaltervorrichtung

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20210713

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
TPAC Observations filed by third parties

Free format text: ORIGINAL CODE: EPIDOSNTIPA

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20240327