EP4149727A1 - Method for controlling the operation of an industrial robot - Google Patents
Method for controlling the operation of an industrial robot
- Publication number
- EP4149727A1 (application EP21725132.1A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- orientation
- objects
- industrial robot
- information
- handling
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65G—TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
- B65G47/00—Article or material-handling devices associated with conveyors; Methods employing such devices
- B65G47/74—Feeding, transfer, or discharging devices of particular kinds or types
- B65G47/90—Devices for picking-up and depositing articles or materials
- B65G47/905—Control arrangements
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40053—Pick 3-D object from pile of objects
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40532—Ann for vision processing
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40607—Fixed camera to observe workspace, object, workpiece, global
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/45—Nc applications
- G05B2219/45063—Pick and place manipulator
Definitions
- the invention relates to a method for controlling the operation of an industrial robot configured in particular to carry out pick-and-place or isolation tasks, the industrial robot comprising a handling device which has at least one handling element for handling an object to be transferred from a first orientation and/or position into a second orientation and/or position.
- control of the operation of corresponding industrial robots has hitherto typically been carried out via control devices that can be assigned or are assigned to the respective industrial robots. These control devices are set up to process data fed to them - this can be, for example, various acquisition or sensor data - in order to generate control information.
- the control information generated in this way is used as a basis for controlling the operation of the respective industrial robot or robots.
- the present invention is based on the object of specifying an improved method for controlling the operation of an industrial robot, in particular with regard to the data processing resources required for generating the corresponding control information, i.e. in particular the required memory and computing resources.
- the object is achieved by a method for controlling the operation of an industrial robot according to claim 1.
- the dependent claims relate to possible embodiments of the method.
- a first aspect of the invention relates to a method for controlling the operation of at least one industrial robot.
- a corresponding industrial robot typically comprises a handling device which has at least one handling element, i.e., for example, a gripping element, suction element, etc., for handling an object to be transferred from a first spatial orientation and/or position into a second spatial orientation and/or position.
- a corresponding industrial robot is therefore in particular an industrial robot configured to carry out pick-and-place or transfer or separation tasks.
- the method is, in particular, a method for controlling the operation of at least one industrial robot configured to carry out pick-and-place or transfer or separation tasks.
- a corresponding handling device of a corresponding industrial robot can optionally also be referred to or regarded as an end effector device.
- a corresponding handling element can therefore optionally also be referred to or regarded as an end effector element.
- a corresponding industrial robot can be a collaborative industrial robot (“cobot”).
- the method can therefore be implemented for controlling a collaborative industrial robot (“cobot”).
- in a first step of the method, a plurality of objects located in a first orientation and/or position is detected, and detection data describing the detected plurality of objects located in the first orientation and/or position is generated.
- acquisition data are generated which contain or describe a plurality of acquired objects located in a first orientation and / or position.
- the objects in the first alignment and / or position are typically detected via one or more detection devices; consequently, one or more detection devices is or are typically used to carry out the first step of the method.
- the objects located in the first alignment and / or position are typically located in a detection area of the respective detection device (s).
- a corresponding detection area can be, for example, a defined section or area of a, for example belt-like or chain-like, conveying element of a conveying device conveying the respective objects located in the first orientation and/or position.
- the objects detected in the first step of the method can in principle be aligned and / or positioned at least partially, possibly completely, in an ordered or disordered manner.
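- purely as an illustration of how corresponding detection data might be represented in software, the following Python sketch models each detected object with a position, orientation and dimensions; all class and field names (DetectedObject, DetectionData, etc.) are hypothetical assumptions and are not taken from the disclosure.

```python
# Hedged sketch: a minimal representation of the detection data generated in the
# first method step. All names and fields are illustrative assumptions.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class DetectedObject:
    object_id: int
    position_xy: Tuple[float, float]   # position in the detection area (e.g. world coordinates)
    orientation_deg: float             # absolute in-plane orientation of the object
    size: Tuple[float, float]          # geometric dimensions (length, width)

@dataclass
class DetectionData:
    """Describes the plurality of objects detected in the first orientation/position."""
    objects: List[DetectedObject]

# Example: three objects detected in arbitrary (disordered) orientations/positions.
detection_data = DetectionData(objects=[
    DetectedObject(0, (0.12, 0.40), 15.0, (0.03, 0.02)),
    DetectedObject(1, (0.25, 0.35), 97.5, (0.03, 0.02)),
    DetectedObject(2, (0.31, 0.52), -42.0, (0.03, 0.02)),
])
```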
- in a second step of the method, a first data processing measure is applied to process the acquisition data, the application of the first data processing measure supplying a selection of exactly one object from the plurality of objects located in the first orientation and/or position described by the acquisition data, and generating selection information describing the exactly one object selected from the plurality of objects located in the first orientation and/or position described by the acquisition data.
- a first data processing measure is applied in order to process the acquisition data generated in the first step of the method.
- the acquisition data generated in the first step of the method are accordingly processed in the second step of the method on the basis of a first data processing measure.
- the application of the first data processing measure, carried out in particular by a data processing device implemented in hardware and/or software, provides as a result a selection of precisely one object from the plurality of objects located in the first orientation and/or position described by the detection data.
- as a result of the application of the first data processing measure, selection information is therefore generated which contains or describes exactly one object selected from the plurality of objects located in the first orientation and/or position described by the detection data.
- the selection information generated therefore contains or describes (precisely) the object which was selected from the plurality of objects in the first orientation and / or position described by the detection data by using the first data processing measure.
- one or more selection criteria can be taken into account when applying the first data processing measure; consequently, exactly one object can be selected from the plurality of objects located in the first alignment and / or position described by the detection data on the basis of or taking into account at least one selection criterion.
- the application of the first data processing measure can involve the application of at least one selection algorithm, possibly forming a component of a selection software; this can, for example, involve an image processing algorithm.
- a corresponding selection or image processing algorithm can be set up to select precisely one object from the plurality of objects in the first alignment and / or position described by the detection data.
- one or more selection criteria can be taken into account when applying the first data processing measure.
- a corresponding selection or image processing algorithm can be set up accordingly to take into account one or more corresponding selection criteria when selecting precisely one object from the plurality of objects in the first orientation and / or position described by the detection data.
- the selection algorithm can be or have been implemented as part of a machine learning process, i.e. via a method of machine learning.
- one or more artificial neural networks with one or more intermediate layers implemented between an input and an output layer can be or have been used.
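- as a minimal, non-authoritative sketch of such a first data processing measure, the following example assumes a small feed-forward network with one intermediate layer that assigns a score to a per-object feature vector, after which exactly one object (the highest-scoring one) is selected; the architecture, the feature encoding and the use of PyTorch are illustrative assumptions, not the disclosed implementation.

```python
# Hedged sketch of the first data processing measure: a small artificial neural
# network with one intermediate (hidden) layer scores each detected object and
# exactly one object - the highest-scoring one - is selected.
import torch
import torch.nn as nn

class SelectionNetwork(nn.Module):
    def __init__(self, num_features: int = 6, hidden: int = 32):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Linear(num_features, hidden),  # input layer -> intermediate layer
            nn.ReLU(),
            nn.Linear(hidden, 1),             # intermediate layer -> scalar score
        )

    def forward(self, object_features: torch.Tensor) -> torch.Tensor:
        # object_features: (num_objects, num_features), one row per detected object
        return self.layers(object_features).squeeze(-1)

net = SelectionNetwork()
features = torch.rand(5, 6)                 # five detected objects, six features each
scores = net(features)                      # one score per object
selected_index = int(torch.argmax(scores))  # selection of exactly one object
print("selected object:", selected_index)
```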
- in a third step of the method, a second data processing measure is applied to process the selection information, the application of the second data processing measure providing at least one coordinate for handling the precisely one object described by the selection information by means of at least one handling element of the handling device of the industrial robot, and generating coordinate information describing the at least one coordinate.
- a second data processing measure is applied in order to process the selection information generated in the second step of the method.
- the selection information generated in the second step of the method is accordingly processed in the third step of the method on the basis of a second data processing measure.
- the application of the second data processing measure carried out in particular by the or another data processing device implemented in hardware and / or software provides as a result at least one coordinate for handling the precisely one object described by the selection information by means of at least one handling element of the handling device of the industrial robot.
- as a result of the application of the second data processing measure, coordinate information is therefore generated which contains or describes at least one coordinate for handling the precisely one object described by the selection information by means of at least one handling element of the handling device of the industrial robot.
- the generated coordinate information therefore contains or describes one or more coordinates - these are typically coordinates related to the respective object - for handling the precisely one object described by the selection information by means of at least one handling element of the handling device of the industrial robot.
- the coordinates described by the coordinate information can be, for example, world coordinates.
- one or more determination criteria can be taken into account when using the second data processing measure; consequently, one or more coordinates for handling the precisely one object described by the selection information can be determined by means of at least one handling element of the handling device of the industrial robot based on or taking into account at least one determination criterion.
- the application of the second data processing measure can include the application of at least one determination algorithm, possibly forming a component of a determination software; this can, for example, involve an image processing algorithm.
- a corresponding determination or image processing algorithm can be set up to determine suitable coordinates on the object described by the selection information, at which the object can be handled, ie, gripped, by means of one or more handling elements of the handling device of the industrial robot.
- one or more determination criteria can be taken into account when applying the second data processing measure.
- a corresponding determination or image processing algorithm can be set up accordingly to take into account one or more corresponding determination criteria when determining corresponding coordinates.
- the determination algorithm can be or have been implemented as part of a machine learning process, i.e. via a method of machine learning.
- one or more artificial neural networks with one or more intermediate layers implemented between an input and an output layer can be or have been used.
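- analogously, a minimal sketch of the second data processing measure could map a feature description of the exactly one selected object to coordinate information; the output layout (x, y, z plus a gripper angle), the network size and the use of PyTorch below are assumptions for illustration only.

```python
# Hedged sketch of the second data processing measure: a second neural network
# maps a feature description of the selected object to a handling coordinate.
import torch
import torch.nn as nn

class CoordinateNetwork(nn.Module):
    def __init__(self, num_features: int = 6, hidden: int = 32, out_dim: int = 4):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Linear(num_features, hidden),  # input layer -> intermediate layer
            nn.ReLU(),
            nn.Linear(hidden, out_dim),       # e.g. (x, y, z, gripper angle)
        )

    def forward(self, selected_features: torch.Tensor) -> torch.Tensor:
        return self.layers(selected_features)

coord_net = CoordinateNetwork()
selected_features = torch.rand(1, 6)            # features of the one selected object
coordinate_info = coord_net(selected_features)  # coordinate information for handling
print("handling coordinates:", coordinate_info.detach())
```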
- in a fourth step of the method, the operation of the industrial robot for performing a task, in particular a pick-and-place or isolation task, is controlled on the basis of the coordinate information.
- the coordinate information is therefore included in corresponding control information for controlling the operation of the industrial robot or corresponding control information is generated on the basis of or taking into account the coordinate information.
- the coordinate information generated in the third step of the method is used as a basis for controlling the operation of the robot, so that the industrial robot moves the respective handling element or elements of the handling device to the coordinate or coordinates described by the coordinate information in order to transfer the respective object from the first orientation and/or position into a second orientation and/or position.
- the second orientation and/or position can be or have been specified, for example, by a user or programmer of the industrial robot.
- the control of the operation of the industrial robot therefore typically includes or results in handling precisely the one object described by the selection information at the coordinate or coordinates described by the coordinate information.
- the main advantage of the method described here is that already in the second step of the method exactly one object is selected from the plurality of detected objects described by the detection data.
- the amount of data to be processed further to determine the respective coordinates can thus be significantly reduced, namely limited to a single object - the respectively selected object - which also significantly reduces the data processing resources required to generate the corresponding control information, i.e. in particular the memory and computing resources.
- the method described here thus makes it possible, already in the second step of the method and thus comparatively early, to significantly reduce the amount of data to be processed to generate corresponding coordinate information, by using the first data processing measure to select exactly one object from the plurality of objects located in the first orientation and/or position described by the detection data.
- the amount of data to be processed further is reduced to precisely the selected object; the data to be processed further, as defined by the selection information, are thus restricted or concentrated to the respective precisely one selected object, so that further processing of the data is limited or concentrated only to this object.
- the method can therefore be carried out significantly faster, and the required data processing resources, i.e. in particular the required memory and computing resources, are significantly reduced.
- less powerful data processing devices can be used to carry out the method, in particular with regard to storage and computing power.
- Controlling the operation of the industrial robot on the basis of the coordinate information can in all embodiments include at least one interaction of the at least one handling element of the handling device of the industrial robot with the object described by the respective selection information and thus selected at or in the area of the at least one coordinate described by the coordinate information.
- a corresponding interaction can be realized, for example, by gripping (“picking”) the object described by the respective selection information and thus selected at the coordinate(s) described by the respective coordinate information.
- controlling the operation of the industrial robot on the basis of the coordinate information can include transferring the object described by the respective selection information and thus selected from the first orientation and/or position into a second orientation and/or position by means of the at least one handling element of the handling device of the industrial robot.
- the detection of the plurality of objects located in the first orientation and/or position in the first step of the method can be carried out by means of at least one optical detection device.
- Corresponding detection devices can accordingly be designed as optical detection devices, ie, for example, as (digital) camera devices, or comprise such devices.
- Corresponding detection devices can comprise one or more detection elements, i.e., for example, optical sensor elements such as CCD and/or CMOS sensors.
- the objects located in the first alignment and / or position can therefore be detected optically in all embodiments.
- At least one optical detection device can be used which is connected to the industrial robot that can be or is to be controlled according to the method, i.e. is arranged or formed in particular on an immovable section of the industrial robot, in particular on a section of a housing device of the industrial robot which is set up to accommodate functional and/or supply components of the industrial robot.
- at least one optical detection device arranged or embodied in a fixed position or stationary on the industrial robot can be used, which is advantageous because the fixed position or fixed arrangement enables a defined optical detection area to be implemented in a simple manner.
- a corresponding positional or stationary arrangement of a corresponding optical detection device is expediently selected in an area that is elevated relative to the objects to be detected, so that the detection area resulting from the arrangement of the optical detection device relative to the objects to be detected provides a kind of overview of at least a part of the objects to be detected, possibly across all objects to be detected.
- an arrangement of the optical detection device on a housing device of the industrial robot has proven to be expedient.
- a corresponding housing device can, for example, be arranged or formed on a vertically extending base support of the industrial robot.
- Functional and/or supply components of the industrial robot, possibly implemented in hardware and/or software, can be arranged or formed on or in a corresponding housing device, which is sometimes also referred to or regarded as a “head”.
- At least one optical detection device can be used which has a defined optical detection area within which objects can be or are detected by means of the detection device.
- a corresponding optical detection area can be, for example, a defined section or area of a, for example belt-like or chain-like, conveying element of a conveying device conveying the respective objects located in the first orientation and/or position.
- a corresponding optical detection area can also be a defined section or area of a, for example belt-like or chain-like, feed element of a feed device feeding the respective objects located in the first orientation and/or position into an action area of a handling element of the handling device of the industrial robot.
- an optical detection area can be selected such that the amount of data contained in the detection data does not exceed a specific limit value. It is of course also conceivable that an optical detection device with a variable optical detection area is used; consequently, an optical detection device can be used which has a first optical detection area and at least one further optical detection area differing from the first, at least with regard to the size of the detection area.
- the objects to be recorded can be recorded dynamically or statically; the objects can therefore be detected when they are in motion or when they are not in motion.
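- as a hedged illustration of detection with a stationary optical detection device and a defined optical detection area, the following sketch captures a camera frame and crops it to a fixed region of interest; the camera index, the OpenCV-based capture and the region values are assumptions, not part of the disclosure.

```python
# Hedged sketch of optical detection with a stationary camera and a defined
# optical detection area: a frame is captured and cropped to a fixed region of
# interest so that only objects inside that area enter the detection data.
import cv2

DETECTION_AREA = (100, 80, 400, 300)  # x, y, width, height of the detection area in pixels

def capture_detection_frame(camera_index: int = 0):
    cap = cv2.VideoCapture(camera_index)   # fixed camera observing the workspace
    try:
        ok, frame = cap.read()
        if not ok:
            raise RuntimeError("no frame captured")
        x, y, w, h = DETECTION_AREA
        return frame[y:y + h, x:x + w]     # restrict detection to the defined area
    finally:
        cap.release()
```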
- the first data processing measure can be implemented by at least one single or multi-layered first artificial neural network.
- a corresponding first artificial neural network has at least one intermediate layer located between an input and an output layer.
- the degree of complexity and the associated performance of the respective first artificial neural network (s) can in particular be defined by the number of respective intermediate layers.
- artificial neural networks configured in a (comparatively) simple or (comparatively) complex manner can be used to implement the first data processing measure.
- the use of corresponding artificial neural networks showed special advantages in connection with the processing of the respective acquisition data to generate the respective selection information.
- the second data processing measure can be implemented by at least one single or multi-layered second artificial neural network.
- a corresponding second artificial neural network has at least one intermediate layer located between an input and an output layer.
- the degree of complexity and the associated performance of the respective second artificial neural network (s) can in particular be defined by the number of respective intermediate layers.
- artificial neural networks configured in a (comparatively) simple or (comparatively) complex manner can be used to implement the second data processing measure.
- the use of corresponding artificial neural networks showed special advantages in connection with the processing of the respective selection information to generate the respective coordinate information.
- At least one data processing device, implemented in hardware and/or software and set up to apply the first and second data processing measures, can be used to apply the first and second data processing measures.
- a respective data processing device can be set up to implement at least one first artificial neural network and / or at least one second artificial neural network or be or will be implemented by corresponding first and / or second artificial neural networks.
- a respective data processing device can form part of a control device for controlling the operation of the industrial robot.
- one or more selection criteria can be taken into account when applying the first data processing measure.
- the selection made by means of the first data processing measure can therefore be carried out on the basis of at least one selection criterion from the plurality of objects in the first orientation and / or position described by the detection data.
- a corresponding selection criterion can represent a boundary condition to be taken into account when selecting a respective precisely one object.
- as a corresponding selection criterion, absolute orientation information and/or absolute position information describing an absolute orientation and/or position of at least one object of the objects located in the first orientation and/or position can be used.
- absolute orientation information and/or absolute position information can, for example, be specified in angles and/or in world coordinates or contain such.
- an absolute alignment and / or position of at least one object of the objects located in the first alignment and / or position can be recorded and used as a selection criterion and taken into account or taken as a basis for the selection of precisely one object.
- as a corresponding selection criterion, relative orientation information and/or relative position information describing a relative orientation and/or position of at least one object of the objects located in the first orientation and/or position to at least one further object of the objects located in the first orientation and/or position can also be used.
- relative orientation information and/or relative position information can, for example, be specified in angles and/or in world coordinates or contain such.
- a relative alignment and / or position of at least one object of the objects in the first alignment and / or position to at least one further object of the objects in the first alignment and / or position can be detected and used as a selection criterion and accordingly when selecting the exactly one object can be taken into account or used as a basis.
- as a corresponding selection criterion, approach information describing an approach movement or approach vector of a handling element of the handling device of the industrial robot for approaching at least one object of the objects located in the first orientation and/or position can be used.
- an approach movement or approach vector of a handling element of the handling device of the industrial robot, in particular the one required from an ACTUAL position and/or ACTUAL orientation, can therefore be recorded and used as a selection criterion and accordingly taken into account or taken as a basis for the selection of exactly one object.
- dimensional information describing at least one geometrical-constructive dimension of at least one object of the objects located in the first orientation and / or position can be used as a corresponding selection criterion. Consequently, at least one dimension of at least one object of the objects located in the first orientation and / or position can be detected and used as a selection criterion and accordingly taken into account or taken as a basis for the selection of precisely one object.
- form information describing at least one geometrical-constructive shape (spatial shape) of at least one object of the objects located in the first orientation and / or position can be used as a corresponding selection criterion. Consequently, at least one shape of at least one object of the objects located in the first orientation and / or position can be detected and used as a selection criterion and accordingly taken into account or taken as a basis for the selection of precisely one object.
- color information describing the coloring of at least one object of the objects located in the first alignment and / or position can be used as a corresponding selection criterion.
- the coloring of at least one object of the objects located in the first alignment and / or position can therefore be recorded and used as a selection criterion and accordingly taken into account or taken as a basis for the selection of precisely one object.
- as a corresponding selection criterion, surface information describing the surface, in particular the surface texture, i.e. in particular the mechanical and/or optical surface properties, of at least one object of the objects located in the first orientation and/or position can be used.
- the surface, in particular the surface quality, of at least one object of the objects located in the first orientation and / or position can therefore be recorded and used as a selection criterion and taken into account accordingly or taken as a basis for the selection of precisely one object.
- mass information describing the mass, in particular a center of mass, of at least one object of the objects located in the first orientation and/or position can be used as a corresponding selection criterion. Consequently, the mass, in particular a center of mass, of at least one object of the objects located in the first orientation and/or position can be detected and used as a selection criterion and accordingly taken into account or taken as a basis for the selection of precisely one object.
- as a corresponding selection criterion, type information describing the type (genus) of at least one object of the objects located in the first orientation and/or position, sometimes also referred to as the “format” in pick-and-place applications, can be used.
- the type of at least one object of the objects located in the first orientation and / or position can therefore be recorded and used as a selection criterion and accordingly taken into account or taken as a basis for the selection of precisely one object.
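- purely to illustrate how several of the selection criteria listed above could be combined, the following sketch scores each detected object from its absolute orientation, its isolation from neighbouring objects and the approach distance of the handling element, and then selects exactly one object; the concrete criteria, weights and field names are illustrative assumptions.

```python
# Hedged sketch: combine several selection criteria into one score and pick
# exactly one object (the one with the highest score).
import math

def selection_score(obj, others, gripper_xy, weights=(1.0, 1.0, 1.0)):
    # criterion 1: prefer objects close to a reference orientation (0 degrees)
    orientation_penalty = abs(obj["orientation_deg"]) / 180.0
    # criterion 2: prefer objects far from their nearest neighbour (easy to isolate)
    nearest = min(math.dist(obj["xy"], o["xy"]) for o in others) if others else 1.0
    isolation_bonus = min(nearest, 1.0)
    # criterion 3: prefer objects requiring a short approach movement of the handling element
    approach_penalty = math.dist(obj["xy"], gripper_xy)
    w1, w2, w3 = weights
    return -w1 * orientation_penalty + w2 * isolation_bonus - w3 * approach_penalty

objects = [
    {"id": 0, "xy": (0.12, 0.40), "orientation_deg": 15.0},
    {"id": 1, "xy": (0.25, 0.35), "orientation_deg": 97.5},
    {"id": 2, "xy": (0.31, 0.52), "orientation_deg": -42.0},
]
gripper_xy = (0.10, 0.45)
selected = max(objects, key=lambda o: selection_score(o, [x for x in objects if x is not o], gripper_xy))
print("selected object id:", selected["id"])
```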
- one or more determination criteria can be taken into account when applying the second data processing measure.
- the determination of the at least one coordinate for handling the precisely one object described by the selection information using the second data processing measure can therefore be carried out on the basis of at least one determination criterion.
- a corresponding determination criterion can represent a boundary condition to be taken into account when determining the respective coordinate (s).
- as a corresponding determination criterion, absolute orientation information and/or absolute position information describing an absolute orientation and/or position of the respectively selected object can be used.
- absolute orientation information and/or absolute position information can, for example, be specified in angles and/or in world coordinates or contain such.
- An absolute alignment and / or position of the respectively selected object can therefore be recorded and used as a determination criterion and taken into account accordingly when determining the coordinate (s) or taken as a basis.
- as a corresponding determination criterion, relative orientation information and/or relative position information describing a relative orientation and/or position of the respectively selected object to at least one further object of the objects located in the first orientation and/or position can also be used.
- relative orientation information and/or relative position information can, for example, be specified in angles and/or in world coordinates or contain such.
- a relative orientation and/or position of the respectively selected object to at least one further object of the objects located in the first orientation and/or position can therefore be recorded and used as a determination criterion and accordingly taken into account or taken as a basis when determining the coordinate(s).
- as a corresponding determination criterion, approach information describing an approach movement or approach vector of a handling element of the handling device of the industrial robot for approaching the respectively selected object, in particular the one required from an ACTUAL position and/or an ACTUAL orientation, can be used.
- An approach movement or approach vector of a handling element of the handling device of the industrial robot for approaching the respectively selected object can therefore be recorded and used as a determination criterion and accordingly taken into account or taken as a basis when determining the coordinate(s).
- dimensional information describing at least one geometrical-constructive dimension of the respectively selected object can be used as a corresponding determination criterion. Consequently, at least one dimension of the respectively selected object can be recorded and used as a determination criterion and accordingly taken into account when determining the coordinate (s) or used as a basis for this.
- shape information describing at least one geometric-constructive shape (spatial shape) of the respective selected object can be used as a corresponding determination criterion.
- a shape of the respectively selected object can therefore be recorded and used as a determination criterion and taken into account accordingly when determining the coordinate (s) or used as a basis for this.
- color information describing the coloring of the respective selected object can be used as a corresponding determination criterion.
- the coloring of the respective selected object can therefore be recorded and used as a determination criterion and taken into account accordingly when determining the coordinate (s) or used as a basis for this.
- as a corresponding determination criterion, surface information describing the surface, in particular the surface quality, i.e. in particular the mechanical and/or optical surface properties, of the respectively selected object can be used.
- the surface, in particular the surface quality, of the respectively selected object can therefore be recorded and used as a determination criterion and taken into account accordingly when determining the coordinate (s) or used as a basis.
- mass information describing the mass, in particular a center of gravity, of the respective selected object can be used as a corresponding determination criterion.
- the mass, in particular a center of mass, of the respectively selected object can therefore be recorded and used as a determination criterion and taken into account accordingly when determining the coordinate (s) or taken as a basis.
- as a corresponding determination criterion, handling information describing the type of handling of the respectively selected object by at least one handling element of the handling device of the industrial robot can be used.
- the type of handling of the respectively selected object can therefore be detected by at least one handling element of the handling device of the industrial robot and used as a determination criterion and taken into account accordingly when determining the coordinate (s) or taken as a basis.
- handling information describing the handling surface of the respectively selected object by at least one handling element of the handling device of the industrial robot can be used as a corresponding determination criterion.
- the handling surface of the respectively selected object can therefore be detected by at least one handling element of the handling device of the industrial robot and used as a determination criterion and taken into account accordingly when determining the coordinate (s) or used as a basis.
- type information describing the type (genus) of the respectively selected object can be used as a corresponding determination criterion.
- the genus of the respectively selected object can therefore be recorded and used as a determination criterion and taken into account accordingly when determining the coordinate (s) or used as a basis for this.
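- as a hedged illustration of how such determination criteria could yield a handling coordinate, the following sketch places the grip coordinate at the selected object's centre of mass and derives an approach vector from the ACTUAL position of the handling element; the simple geometric rule and all names are assumptions for illustration only.

```python
# Hedged sketch of the coordinate determination for the selected object:
# grip at the centre of mass and derive an approach vector of the handling
# element from its current (ACTUAL) position.
import math

def determine_grip(selected_obj, gripper_xyz):
    # determination criterion: centre of mass of the selected object
    cx, cy = selected_obj["center_of_mass_xy"]
    grip_coordinate = (cx, cy, selected_obj["top_height"])  # grip from above at the object's top surface

    # determination criterion: approach movement from the ACTUAL gripper position
    dx = grip_coordinate[0] - gripper_xyz[0]
    dy = grip_coordinate[1] - gripper_xyz[1]
    dz = grip_coordinate[2] - gripper_xyz[2]
    length = math.sqrt(dx * dx + dy * dy + dz * dz)
    approach_vector = (dx / length, dy / length, dz / length)
    return grip_coordinate, approach_vector

grip, approach = determine_grip(
    {"center_of_mass_xy": (0.12, 0.40), "top_height": 0.05},
    gripper_xyz=(0.00, 0.30, 0.25),
)
print("grip coordinate:", grip, "approach vector:", approach)
```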
- the method typically does not end after the steps described have been performed for a first selected object which, within the framework of or after performing the steps of the method described above, has, for example, been selected from the plurality of objects located in the first orientation and/or position and, as the selected object, has been removed from the plurality of objects located in the first orientation and/or position and transferred into a second orientation and/or position in the course of controlling the industrial robot.
- after corresponding selection information and coordinate information has been generated for a first object selected from the plurality of objects located in the first orientation and/or position, corresponding selection information and coordinate information is typically generated for at least one further object from the plurality of objects remaining in the first orientation and/or position.
- the method is typically carried out until appropriate selection information and coordinate information has been successively generated for each object from the large number of objects located in the first orientation and / or position.
- the process can be interrupted or stopped when a termination condition is fulfilled or present.
- a corresponding termination condition can, for example, be fulfilled or present when only a certain number of objects remain in the first orientation and/or position or when no object remains in the first orientation and/or position.
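- the iterative character of the method, including such a termination condition, could be sketched as follows; the helper functions stand in for the four method steps described above and are assumptions, not the disclosed implementation.

```python
# Hedged sketch of the overall control flow: repeat the four method steps for
# one object at a time until a termination condition is met (e.g. no object
# remains in the first orientation/position).
def run_pick_and_place(detect, select_one, determine_coordinates, move_robot,
                       min_remaining: int = 0):
    while True:
        detection_data = detect()                       # step 1: detect plurality of objects
        if len(detection_data) <= min_remaining:        # termination condition
            break
        selected = select_one(detection_data)           # step 2: first data processing measure
        coordinates = determine_coordinates(selected)   # step 3: second data processing measure
        move_robot(coordinates)                         # step 4: control the industrial robot
```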
- the method may, prior to the step of detecting the plurality of objects located in a first orientation and/or position and generating acquisition data describing the acquired plurality of objects located in the first orientation and/or position, comprise a step of providing a plurality of objects in a first orientation and/or position on a stationary or moving surface, in particular a base of a conveyor device, preferably a feed conveyor device for feeding objects.
- the provision, which can include feeding the respective objects into an action area of a handling element of the handling device of the industrial robot, can therefore optionally also represent a step of the method.
- the corresponding objects can be objects to be packaged in a package.
- the type of packaging results from the type of objects to be packed.
- foods that can be separated, such as, for example, candy, can be referenced as possible objects.
- the method can in principle be carried out with all objects to be separated from a plurality of objects located in a first orientation and / or position.
- a second aspect of the invention described herein relates to an industrial robot, in particular a collaborative industrial robot (“cobot”).
- the industrial robot comprises a handling device, which has at least one handling element movable in at least one degree of freedom of movement for handling an object to be transferred from a first orientation and/or position into a second orientation and/or position, and a control device, implemented in particular in hardware and/or software, for controlling the operation of the industrial robot.
- the control device is set up to carry out the method according to the first aspect of the invention.
- the control device therefore typically has machine-readable instructions for performing the steps of the method according to the first aspect of the invention. All statements in connection with the method according to the first aspect of the invention apply analogously to the industrial robot according to the second aspect of the invention and vice versa.
- a third aspect of the invention described herein relates to an arrangement for converting objects from a first orientation and / or position into a second orientation and / or position.
- the arrangement, which may also be designated or regarded as a machine, comprises at least one industrial robot according to the second aspect of the invention.
- the arrangement comprises at least one peripheral device.
- a corresponding peripheral device can be or comprise a feed device for feeding objects, in particular objects located in a first orientation and / or position, into an action area of at least one handling element of the handling device of the industrial robot.
- a corresponding peripheral device can be or comprise a discharge device for discharging objects, in particular objects that have been transferred into the second orientation and/or position by means of the industrial robot. All statements in connection with the method according to the first aspect of the invention apply analogously to the arrangement according to the third aspect of the invention and vice versa.
- a corresponding arrangement can be a packaging machine for packaging objects or form part of such a machine.
- a corresponding packaging machine can, for example, be set up to transfer objects such as, for example, food, cosmetic articles, pharmaceutical articles, or technical articles from a first orientation and/or position into a second orientation and/or position, i.e., for example, into a carrier-like receiving device.
- FIG. 1 shows a basic illustration of an arrangement for converting objects from a first orientation and / or position into a second orientation and / or position according to an exemplary embodiment
- FIG. 2 shows a basic illustration of an industrial robot according to an exemplary embodiment
- FIG. 3 shows a basic illustration of a plurality of objects located in a first orientation and/or position, detected by a detection device, according to an exemplary embodiment;
- FIG. 4 shows an object selected from the plurality shown in FIG. 3; and
- FIG. 5 shows a block diagram to illustrate a method for controlling the operation of an industrial robot according to an exemplary embodiment.
- FIG. 1 shows a schematic diagram of an arrangement 1 for converting objects 2 from a first orientation and / or position into a second orientation and / or position according to an exemplary embodiment in a top view.
- the arrangement 1 can also be referred to or viewed as a machine.
- the arrangement 1 comprises an industrial robot 3, designed, for example, as a collaborative industrial robot (“cobot”), and several peripheral devices.
- in the exemplary embodiment, these are a feed device 4, designed, for example, as a feed belt, for feeding objects 2, in particular objects 2 located in a first orientation and/or position, into an action area 5 of an end effector or handling element 6, designed, for example, as a gripping or suction element, of an end effector or handling device 7 of the industrial robot 3, and a discharge device 9, designed, for example, as a discharge belt, for discharging objects 2, in particular objects 2 which have been transferred into a second orientation and/or position by means of the industrial robot 3.
- the end effector or handling element 6 of the end effector or handling device 7 is, as indicated purely schematically by the double arrow, mounted movably in one or more degrees of freedom of movement.
- the dashed illustration indicates that the arrangement 1 can also include several corresponding peripheral devices and several corresponding end effector or handling devices 7 together with the associated end effector or handling element 6.
- FIG. 2 shows a basic illustration of an industrial robot 3 according to an exemplary embodiment in a side view. In FIG. 2, the end effector or handling device 7, designed, for example, as a robot arm or comprising such, is again shown together with the associated end effector or handling element 6.
- the industrial robot 3 shown in FIG. 2 can correspond to the industrial robot 3 shown in FIG. 1.
- FIG. 2 also shows a housing device 3.1 arranged or formed on a vertically extending base support 3.2 of the industrial robot 3.
- Functional and / or supply components (not shown) of the industrial robot 3, possibly implemented in hardware and / or software, can be arranged or formed on or in the housing device 3.1, which is also to be referred to or to be regarded as a “head”.
- In Fig. 2, an optical detection device 10, designed, for example, as a camera device or comprising such, can also be seen, which is arranged or formed on the housing device 3.1.
- the optical detection device 10 is accordingly arranged or formed on a section of the housing device 3.1 and thus on an immovable or immovable section of the industrial robot 3.
- a corresponding fixed-position or stationary arrangement of the optical detection device 10 is therefore selected in the exemplary embodiment in an area that is elevated relative to the objects 2 to be detected, so that the optical detection area 12 resulting from the arrangement of the optical detection device 10 relative to the objects 2 to be detected enables a kind of overview of at least some of the objects 2 to be detected, possibly of all objects 2 to be detected.
- the dashed illustration again indicates that the industrial robot 3 can also include several corresponding end effector or handling devices 7 together with associated end effector or handling elements 6 as well as several corresponding optical detection devices 10.
- furthermore, a control device 11 implemented in hardware and/or software can be seen, which is set up to control the operation of the industrial robot 3.
- the control device 11, shown purely by way of example in FIG. 1 as a structural component of the industrial robot 3, is therefore set up to generate control information on the basis of which the operation of the industrial robot 3 is controlled to carry out certain tasks. Corresponding tasks can be, for example, pick-and-place or separation tasks involving objects 2.
- the control device 11 is set up accordingly to carry out a method for controlling an industrial robot 3, which is explained in more detail below using an exemplary embodiment, also with reference to FIGS. 3-5.
- in a first step S1 of the method, detection data are generated which contain or describe a plurality of detected objects 2 located in a first orientation and/or position.
- for this purpose, the objects 2 located, for example, on a surface of the feed device 4 in the first orientation and/or position are detected via one or more optical detection devices 10; consequently, one or more optical detection devices 10 are used to carry out the first step of the method.
- the objects 2 located in the first orientation and/or position are located in a detection area 12, shown in dashed lines in FIG. 1 and shown separately in FIG. 3, of the at least one optical detection device 10.
- the objects 2 detected in the first step of the method can in principle be aligned and / or positioned at least partially, possibly completely, in an ordered or disordered manner.
- in a second step S2 of the method, a first data processing measure is applied in order to process the detection data generated in the first step S1.
- the acquisition data generated in the first step of the method S1 are therefore processed in the second step of the method S2 on the basis of a first data processing measure.
- the application of the first data processing measure, carried out in particular by a data processing device implemented in hardware and/or software, provides as a result a selection of exactly one object 2.1 from the plurality of objects 2 located in the first orientation and/or position described by the detection data (cf. FIG. 3).
- as a result of the application of the first data processing measure, selection information is therefore generated which contains or describes exactly one object 2.1 selected from the plurality of objects 2 located in the first orientation and/or position described by the detection data.
- the selection information generated therefore contains or describes (precisely) the object 2.1 which has been selected, by applying the first data processing measure, from the plurality of objects 2 located in the first orientation and/or position described by the detection data.
- the application of the first data processing measure can involve the application of at least one selection algorithm, possibly forming a component of a selection software; this can, for example, involve an image processing algorithm.
- a corresponding selection or image processing algorithm can be set up to select precisely one object 2.1 from the plurality of objects 2 in the first alignment and / or position described by the detection data.
- the selection algorithm can be or have been implemented as part of a machine learning process, i.e. via a method of machine learning.
- one or more artificial neural networks with one or more intermediate layers implemented between an input and an output layer can be or have been used.
- in a third step S3 of the method, a second data processing measure is applied in order to process the selection information generated in the second step S2.
- the selection information generated in the second step of the method S2 is accordingly processed in the third step of the method S3 on the basis of a second data processing measure.
- the application of the second data processing measure, carried out in particular by the or another data processing device implemented in hardware and/or software, provides as a result at least one coordinate K for handling the precisely one object 2.1 described by the selection information by means of a handling element 6 of the handling device 7 of the industrial robot 3 (cf. FIG. 4).
- as a result of the application of the second data processing measure, coordinate information is therefore generated which contains or describes at least one coordinate K for handling the precisely one object 2.1 described by the selection information by means of a handling element 6 of the handling device 7 of the industrial robot 3.
- the coordinate information generated therefore contains or describes one or more coordinates - these are, for example, world coordinates related to the respective object 2.1 - for handling the precisely one object 2.1 described by the selection information by means of a handling element 6 of the handling device 7 of the industrial robot 3.
- the application of the second data processing measure can include the application of at least one determination algorithm, possibly forming a component of a determination software; this can, for example, involve an image processing algorithm.
- a corresponding determination or image processing algorithm can be set up to determine suitable coordinates on the object 2.1 described by the selection information, at which the object 2.1 can be handled, i.e., for example, gripped, by means of a handling element 6 of the handling device 7 of the industrial robot 3.
- the determination algorithm can be or have been implemented as part of a machine learning process, i.e. via a method of machine learning.
- one or more artificial neural networks with one or more intermediate layers implemented between an input and an output layer can be or have been used.
- in a fourth step S4 of the method, the coordinate information generated in the third step S3 is used as a basis for controlling the operation of the industrial robot, so that the industrial robot 3 moves the respective handling element 6 of the handling device 7 to the coordinate or coordinates K described by the coordinate information in order to transfer the object 2.1 from the first orientation and/or position into a second orientation and/or position.
- the second orientation and/or position can be or have been specified, for example, by a user or programmer of the industrial robot.
- the control of the operation of the industrial robot 3 accordingly typically includes handling precisely the one object 2.1, which is described by the selection information, at the coordinate or coordinates K which are described by the coordinate information, or results in such.
- the control of the operation of the industrial robot 3 on the basis of the coordinate information can include at least one interaction of a handling element 6 of the handling device 7 of the industrial robot 3 with the object 2.1 described by the respective selection information and thus selected, at or in the area of the coordinate(s) K described by the coordinate information.
- a corresponding interaction can be realized, for example, by gripping (“picking”) the object 2.1 described by the respective selection information and thus selected at the coordinate(s) K described by the respective coordinate information.
- controlling the operation of the industrial robot 3 on the basis of the coordinate information can include transferring the object 2.1 described by the respective selection information and thus selected from the first orientation and/or position into a second orientation and/or position by means of a handling element 6 of the handling device 7 of the industrial robot 3.
- the main advantage of the method is that already in the second step S2 of the method exactly one object 2.1 is selected from the plurality of detected objects 2 described by the detection data.
- the amount of data to be processed further to determine the respective coordinates K can thus be significantly reduced, namely limited to a single object - the respectively selected object 2.1 - which also significantly reduces the data processing resources required to generate the corresponding control information, i.e. in particular the memory and computing resources.
- the method therefore makes it possible in the second step S2 of the method, and thus comparatively early, to considerably reduce the amount of data to be processed to generate the corresponding coordinate information, by applying the first data processing measure to select exactly one object 2.1 from the plurality of objects 2 located in the first orientation and/or position described by the acquisition data.
- the amount of data to be processed subsequently is reduced to precisely the selected object 2.1; the data to be processed further, as defined by the selection information, are thus limited or concentrated to the respective precisely one selected object 2.1, so that further processing of the data is limited or concentrated only to this object.
- the method can therefore be carried out significantly faster, and the required data processing resources, i.e. in particular the required memory and computing resources, are significantly reduced.
- less powerful data processing devices can be used to carry out the method, in particular with regard to storage and computing power.
- the first data processing measure can be implemented by a single or multi-layered first artificial neural network.
- a corresponding first artificial neural network has at least one intermediate layer located between an input and an output layer.
- artificial neural networks configured in a (comparatively) simple or (comparatively) complex manner can be used to implement the first data processing measure.
- the second data processing measure can be implemented by a single or multi-layered second artificial neural network.
- a corresponding second artificial neural network has at least one intermediate layer located between an input and an output layer.
- artificial neural networks configured in a (comparatively) simple or (comparatively) complex manner can be used to implement the second data processing measure.
- Respective data processing devices for applying the first and second data processing measures can therefore be set up to implement at least one first artificial neural network and / or at least one second artificial neural network or be or will be implemented by corresponding first and / or second artificial neural networks.
- the respective data processing device can form part of the control device 11 for controlling the operation of the industrial robot 3.
- when applying the first data processing measure, at least one selection criterion can be taken into account. The selection of exactly one object 2.1 from the plurality of objects 2 located in the first orientation and/or position described by the detection data using the first data processing measure can therefore be carried out on the basis of at least one selection criterion.
- as a corresponding selection criterion, absolute orientation information and/or absolute position information describing an absolute orientation and/or position of at least one object 2 of the objects 2 located in the first orientation and/or position can be used.
- as a corresponding selection criterion, relative orientation information and/or relative position information describing a relative orientation and/or position of at least one object 2 of the objects 2 located in the first orientation and/or position with respect to at least one further object 2 of these objects can be used.
- as a corresponding selection criterion, approach information describing an approach movement or approach vector of a handling element 6 of the handling device 7 of the industrial robot 3 for approaching at least one object 2 of the objects 2 located in the first orientation and/or position can be used.
- dimensional information describing at least one geometrical-constructive dimension of at least one object 2 of the objects 2 located in the first orientation and / or position can be used as a corresponding selection criterion.
- shape information describing at least one geometric-constructive shape (spatial shape) of at least one object 2 of the objects 2 located in the first orientation and / or position can be used as a corresponding selection criterion.
- color information describing the coloring of at least one object 2 of the objects 2 located in the first alignment and / or position can be used as a corresponding selection criterion.
- as a corresponding selection criterion, surface information describing the surface, in particular the surface properties, i.e. in particular the mechanical and/or optical surface properties, of at least one object 2 of the objects 2 located in the first orientation and/or position can be used.
- mass information describing the mass, in particular a center of gravity, of at least one object 2 of the objects 2 located in the first orientation and / or position can be used as a corresponding selection criterion.
- as a corresponding selection criterion, genre information describing the genre of at least one object 2 of the objects 2 located in the first orientation and/or position, sometimes also referred to as the “format” in pick-and-place applications, can be used.
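As an illustration of how such selection criteria could be combined, the following sketch scores each detected object using a relative-position (isolation) criterion, an approach-movement criterion and a colouring criterion, and then selects exactly one object. The data structure, field names and weights are hypothetical assumptions, not taken from the disclosure.

```python
# Illustrative, hand-crafted combination of selection criteria (not from the
# disclosure): relative position (isolation), approach movement and colouring.
# The data structure, field names and weights are hypothetical assumptions.
import math
from dataclasses import dataclass

@dataclass
class DetectedObject:
    x: float                   # absolute position information
    y: float
    z: float
    colour_match: float        # 0..1, agreement of the colouring with the expected format
    neighbour_distance: float  # distance to the nearest other object 2 (relative position information)

def selection_score(obj: DetectedObject, gripper_xy=(0.0, 0.0)) -> float:
    # Criterion 1: prefer well-isolated objects (relative position information).
    isolation = min(obj.neighbour_distance, 0.1)
    # Criterion 2: prefer objects requiring a short approach movement of the handling element 6.
    approach = -math.hypot(obj.x - gripper_xy[0], obj.y - gripper_xy[1])
    # Criterion 3: prefer objects whose colouring matches the expected format.
    colour = obj.colour_match
    return 5.0 * isolation + 1.0 * approach + 2.0 * colour

def select_exactly_one(objects: list[DetectedObject]) -> int:
    """Returns the index of the precisely one selected object 2.1."""
    return max(range(len(objects)), key=lambda i: selection_score(objects[i]))
```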
- At least one determination criterion can be taken into account.
- the determination of the at least one coordinate for handling the precisely one object 2.1 described by the selection information using the second data processing measure can therefore be carried out on the basis of at least one determination criterion (an illustrative sketch follows the list of criteria below).
- as a corresponding determination criterion, absolute orientation information and/or absolute position information describing an absolute orientation and/or position of the respectively selected object 2.1 can be used.
- for example, relative orientation information and/or relative position information describing a relative orientation and/or position of the respectively selected object 2.1 with respect to at least one further object 2 of the objects 2 located in the first orientation and/or position can be used.
- approach information describing an approach movement or approach vector of a handling element 6 of the handling device 7 of the industrial robot 3 for approaching the respectively selected object 2.1, in particular starting from an ACTUAL position and/or ACTUAL orientation, can be used as a corresponding determination criterion.
- dimensional information describing at least one geometrical-constructive dimension of the respectively selected object 2.1 can be used as a corresponding determination criterion.
- form information describing at least one geometrical-constructive shape (spatial shape) of the respectively selected object 2.1 can be used as a corresponding determination criterion.
- color information describing the coloring of the respectively selected object 2.1 can be used as a corresponding determination criterion.
- surface information describing the surface, in particular the surface quality, i.e. in particular the mechanical and/or optical surface properties, of the respectively selected object 2.1 can be used as a corresponding determination criterion.
- mass information describing the mass, in particular a center of gravity, of the respective selected object 2.1 can be used as a corresponding determination criterion.
- handling information describing the type of handling of the respectively selected object 2.1 by a handling element 6 of the handling device 7 of the industrial robot 3 can be used as a corresponding determination criterion.
- handling surface information describing the surface of the respectively selected object 2.1 via which it is handled by a handling element 6 of the handling device 7 of the industrial robot 3 can be used as a corresponding determination criterion.
- genre information describing the genre of the respectively selected object 2.1 can be used as a corresponding determination criterion.
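The following sketch illustrates, under simplifying assumptions, how coordinates for handling the selected object 2.1 could be derived from determination criteria such as its absolute position, orientation, dimensions and upper handling surface; the function name, the vertical approach strategy and the simple geometry are assumptions for illustration only.

```python
# Illustrative sketch (not from the disclosure) of deriving handling coordinates
# for the selected object 2.1 from determination criteria such as absolute
# position, orientation, dimensions and the upper handling surface.
# The function name and the simple vertical approach geometry are assumptions.
import numpy as np

def handling_coordinates(position, orientation_deg, dimensions, approach_height=0.05):
    """position: (x, y, z) of the selected object; orientation_deg: rotation about the vertical axis;
    dimensions: (length, width, height) in metres."""
    x, y, z = position
    length, width, height = dimensions
    # Grasp on the upper handling surface, i.e. at the top centre of the object.
    grasp_point = np.array([x, y, z + height])
    # Approach vertically from above, starting approach_height above the grasp point.
    approach_start = grasp_point + np.array([0.0, 0.0, approach_height])
    approach_vector = grasp_point - approach_start
    # Align the handling element 6 with the object's orientation about the vertical axis.
    yaw = np.deg2rad(orientation_deg)
    return {"grasp_point": grasp_point,
            "approach_start": approach_start,
            "approach_vector": approach_vector,
            "yaw": yaw}
```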
- the method typically does not end after the described steps S1-S4 have been carried out for a first selected object 2.1 which, within the framework of or after carrying out the above-described method steps, has been selected from the plurality of objects 2 located in the first orientation and/or position and, in the course of controlling the industrial robot, has been removed from this plurality as the selected object and transferred into a second orientation and/or position.
- after corresponding selection information and coordinate information have been generated for a first selected object 2.1, corresponding selection information and coordinate information are typically generated for at least one further object 2 from the plurality of objects 2 remaining in the first orientation and/or position.
- the method is typically carried out until appropriate selection information and coordinate information has been successively generated for each object 2 from the large number of objects 2 located in the first orientation and / or position.
- the process can be interrupted or stopped when a termination condition is fulfilled or present.
- a corresponding termination condition can, for example, be fulfilled or present when only a certain number of objects 2 remain in the first orientation and/or position or when no object 2 remains in the first orientation and/or position.
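A possible overall control loop with such a termination condition is sketched below; detect_objects, select_one, determine_coordinates and transfer_object are hypothetical placeholders for the method steps described above, and the chosen termination condition (no object 2 remaining) is only one of the variants mentioned.

```python
# Illustrative sketch (not from the disclosure) of repeating steps S1-S4 until a
# termination condition is fulfilled. detect_objects, select_one,
# determine_coordinates and transfer_object are hypothetical placeholders for
# the steps described above.
MIN_REMAINING = 0  # assumed termination condition: stop when no object 2 is left

def run_pick_and_place(detect_objects, select_one, determine_coordinates, transfer_object):
    while True:
        objects = detect_objects()                # detection data for objects in the first orientation/position
        if len(objects) <= MIN_REMAINING:         # termination condition fulfilled or present
            break
        selected = select_one(objects)            # selection information: exactly one object 2.1
        coords = determine_coordinates(selected)  # coordinate information for handling object 2.1
        transfer_object(selected, coords)         # control the industrial robot 3 to transfer the object
```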
- the method can include a step of providing a plurality of objects 2 in a first orientation and/or position on a stationary or moving base, in particular a feed conveyor device 4.
- the provision, which can include feeding the respective objects 2 into the action area 5 of a handling element 6 of the handling device 7 of the industrial robot 3, can therefore also optionally represent a step of the method.
- the corresponding objects 2 can be objects to be packaged in a package.
- the type of packaging results from the type of objects to be packed.
- foods that can be separated, such as confectionery, may be mentioned as possible objects 2.
Landscapes
- Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Robotics (AREA)
- Manipulator (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102020113278.6A DE102020113278B4 (de) | 2020-05-15 | 2020-05-15 | Verfahren zur Steuerung des Betriebs eines Industrieroboters, Industrieroboter und Anordnung zur Umsetzung von Objekten |
PCT/EP2021/062346 WO2021228773A1 (de) | 2020-05-15 | 2021-05-10 | Verfahren zur steuerung des betriebs eines industrieroboters |
Publications (1)
Publication Number | Publication Date |
---|---|
EP4149727A1 true EP4149727A1 (de) | 2023-03-22 |
Family
ID=75904933
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP21725132.1A Withdrawn EP4149727A1 (de) | 2020-05-15 | 2021-05-10 | Verfahren zur steuerung des betriebs eines industrieroboters |
Country Status (4)
Country | Link |
---|---|
US (1) | US20230192421A1 (de) |
EP (1) | EP4149727A1 (de) |
DE (1) | DE102020113278B4 (de) |
WO (1) | WO2021228773A1 (de) |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4087874B2 (ja) * | 2006-02-01 | 2008-05-21 | ファナック株式会社 | ワーク取り出し装置 |
US8805585B2 (en) | 2008-06-05 | 2014-08-12 | Toshiba Kikai Kabushiki Kaisha | Handling apparatus, control device, control method, and program |
JP4565023B2 (ja) * | 2008-07-04 | 2010-10-20 | ファナック株式会社 | 物品取り出し装置 |
FI20105732A0 (fi) * | 2010-06-24 | 2010-06-24 | Zenrobotics Oy | Menetelmä fyysisten kappaleiden valitsemiseksi robottijärjestelmässä |
JP5911299B2 (ja) | 2011-12-27 | 2016-04-27 | キヤノン株式会社 | 情報処理装置、情報処理装置の制御方法およびプログラム |
JP5975685B2 (ja) * | 2012-03-09 | 2016-08-23 | キヤノン株式会社 | 情報処理装置、情報処理方法 |
US9875427B2 (en) * | 2015-07-28 | 2018-01-23 | GM Global Technology Operations LLC | Method for object localization and pose estimation for an object of interest |
DE102016009030B4 (de) | 2015-07-31 | 2019-05-09 | Fanuc Corporation | Vorrichtung für maschinelles Lernen, Robotersystem und maschinelles Lernsystem zum Lernen eines Werkstückaufnahmevorgangs |
ITUA20163608A1 (it) * | 2016-05-19 | 2017-11-19 | Milano Politecnico | Procedimento e dispositivo per il controllo della movimentazione di uno o più robot collaborativi |
DE202017106506U1 (de) | 2016-11-15 | 2018-04-03 | Google Llc | Einrichtung für tiefes Maschinenlernen zum Robotergreifen |
JP6587761B2 (ja) | 2017-02-09 | 2019-10-09 | 三菱電機株式会社 | 位置制御装置及び位置制御方法 |
WO2018236753A1 (en) | 2017-06-19 | 2018-12-27 | Google Llc | PREDICTION OF ROBOTIC SEIZURE USING NEURAL NETWORKS AND A GEOMETRY-SENSITIVE REPRESENTATION OF OBJECT |
US11312012B2 (en) * | 2019-01-01 | 2022-04-26 | Giant Ai, Inc. | Software compensated robotics |
US10870204B2 (en) * | 2019-01-25 | 2020-12-22 | Mujin, Inc. | Robotic system control method and controller |
- 2020-05-15 DE DE102020113278.6A patent/DE102020113278B4/de active Active
- 2021-05-10 EP EP21725132.1A patent/EP4149727A1/de not_active Withdrawn
- 2021-05-10 US US17/998,757 patent/US20230192421A1/en active Pending
- 2021-05-10 WO PCT/EP2021/062346 patent/WO2021228773A1/de unknown
Also Published As
Publication number | Publication date |
---|---|
DE102020113278B4 (de) | 2024-07-25 |
DE102020113278A1 (de) | 2021-11-18 |
WO2021228773A1 (de) | 2021-11-18 |
US20230192421A1 (en) | 2023-06-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
DE102011113590B4 (de) | Planen simultaner Pfade mit einem oder mehreren humanoiden Robotern | |
DE102015102740B4 (de) | Vorrichtung und Verfahren zum Anordnen von Gegenständen mittels Roboter und Gegenstandübertragungssystem | |
DE102010018759B4 (de) | Spannungsverteilung in einem sehnengetriebenen Roboterfinger | |
DE112012002677T9 (de) | Zuführvorrichtung für Bauelemente | |
DE69032185T2 (de) | Verfahren und Vorrichtung zur Kontrolle der Bearbeitungsspur eines Industrieroboters | |
DE102015005213B4 (de) | Steuervorrichtung für eine flexible Robotersteuerung | |
DE102018212531B4 (de) | Artikeltransfervorrichtung | |
DE102019121889B3 (de) | Automatisierungssystem und Verfahren zur Handhabung von Produkten | |
WO2010034044A2 (de) | Verfahren und anlage zum aufnehmen und/oder bearbeiten von objekten | |
DE102014102943A1 (de) | Robotersystem mit Funktionalität zur Ortsbestimmung einer 3D- Kiste | |
WO2019224004A1 (de) | Verfahren zum hantieren eines werkstücks mit hilfe eines entnahmewerkzeugs und maschine zur durchführung des verfahrens | |
DE112020001886T5 (de) | Mehrfachproduktpalettenvorrichtung, Steuersystem für Mehrfachproduktpalettenvorrichtung, Verschiebungsbegrenzungsmechanismus und Angleichungsmechanismus | |
DE102016117855A1 (de) | Verfahren zur Werkstückbearbeitung durch Zusammenwirken von Werkzeugmaschine und Roboter | |
DE102018112370B4 (de) | Richtungsabhängige Kollisionsdetektion für einen Robotermanipulator | |
EP3993958B1 (de) | Verfahren zur positionierung eines biegeschlaffen flächenwerkstücks sowie positionierungsvorrichtung | |
EP4149727A1 (de) | Verfahren zur steuerung des betriebs eines industrieroboters | |
EP3702108A1 (de) | Verfahren zum ermitteln einer greifposition zum ergreifen eines werkstücks | |
EP3615275A1 (de) | Schraubvorrichtung | |
DE69913268T2 (de) | Vorrichtung und Verfahren zum automatischen Zuführen von Wulstkernen mit Wulstfüllern | |
DE102012023916A1 (de) | Metall- und/oder Keramikpulver-Presskörper-Pressenanordnung mit zumindest zwei Pressen und einer Transferanordnung und Steuerverfahren dafür | |
WO1997015494A1 (de) | Befüllung von pharmazeutischen mehrkammerverpackungen | |
DE102018122081A1 (de) | Verfahren und Vorrichtung zur Handhabung von Stückgütern, Artikeln und/oder Gebinden | |
DE102020113277B4 (de) | Verfahren zum Erzeugen eines Trainingsdatensatzes zum Trainieren eines Industrieroboters, Verfahren zur Steuerung des Betriebs eines Industrieroboters und Industrieroboter | |
EP3486182B1 (de) | Rotationskopf mit längsbeweglichem faltkanal | |
EP3548975B1 (de) | Steuerung eines technischen prozesses auf einer mehr-rechenkern-anlage |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: UNKNOWN |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
| 17P | Request for examination filed | Effective date: 20221213 |
| AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| DAV | Request for validation of the european patent (deleted) | |
| DAX | Request for extension of the european patent (deleted) | |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
| 18D | Application deemed to be withdrawn | Effective date: 20230714 |