EP4164841A1 - Sensor device for a gripping system, method for generating optimal gripping poses for controlling a gripping device and associated gripping system - Google Patents
Sensor device for a gripping system, method for generating optimal gripping poses for controlling a gripping device and associated gripping system
- Publication number
- EP4164841A1 (application EP21733750.0A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- gripping
- sensor device
- robot
- segmentation
- sensor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1612—Programme controls characterised by the hand, wrist, grip control
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39484—Locate, reach and grasp, visual guided grasping
- G05B2219/39543—Recognize object and plan hand shapes in grasping movements
Definitions
- The invention relates to a sensor device for a gripping system, the gripping system comprising a robot, i.e. a manipulator with at least one degree of freedom such as an industrial robot, with a gripping device for handling objects, and a robot or machine controller for controlling the robot and the gripping device.
- The invention also relates to a method for generating gripping poses for a machine or robot controller for controlling the robot and the gripping device for gripping objects, and to an associated gripping system.
- US Pat. No. 9,002,098 B1 describes a robot-assisted visual perception system for determining a position and pose of a three-dimensional object. The system receives an external input for selecting an object to be gripped.
- The system also receives visual input from a sensor on a robot controller that scans the object of interest.
- Rotationally invariant shape and appearance features are extracted from the captured object and from a set of object templates.
- A match between the scanned object and an object template is identified on the basis of the shape features.
- The correspondence between the scanned object and the object template is confirmed on the basis of the appearance features.
- The scanned object is then identified and a three-dimensional pose of the scanned object of interest is determined.
- The robot controller is used to grasp and manipulate the scanned object.
- The system thus works on the basis of templates or rotation-invariant features to compare the sensor data with the model.
- The object of the present invention is to enable the generation of optimal gripping poses. From these gripping poses, instruction sets for controlling the gripping device for gripping objects can then advantageously be generated on the robot or machine side. Both the gripping of known objects and the gripping of unknown objects should be possible.
- This object is achieved by a sensor device with the features of claim 1.
- Such a sensor device in particular enables handling tasks such as pick & place to be started up quickly without intervening in the robot or machine control and without expert knowledge in the field of image processing and robotics.
- The sensor device represents a largely self-sufficient unit with which suitable gripping poses can be generated. From these gripping poses, application-independent command sets for the robot or machine controller can be generated on the robot or machine side.
- Segmentation is a branch of digital image processing and machine vision.
- Segmentation refers to the creation of content-related regions by combining neighboring pixels or voxels in accordance with a certain homogeneity criterion.
- The control interface is provided for communication with the robot or machine controller.
- A system with such a sensor device consequently allows both the gripping of known objects and the gripping of unknown objects on the basis of the generalized segmentation and gripping planning algorithm.
- The gripping planning parameters for the gripping planning module and/or the control parameters for the control interface can be specified, such as the parameterization of
- Figure 2: Prototype structure of the sensor device.
- Figure 3: Sensor device hardware architecture.
- Ever smaller batch sizes and rising wage costs pose major challenges for production technology in high-wage countries. To address these challenges, today's automation systems must be quickly adaptable to new environmental conditions. In the following, a sensor device is presented that allows handling tasks such as pick & place to be commissioned quickly without programming.
- The sensor device represents, in particular, a computing unit which allows a suitable gripping pose for an object to be determined on the basis of gray value data, color data or 3D point cloud data (for example using mono or stereo camera systems). Suitable here means that the resulting grip both meets certain quality criteria and does not lead to any collisions between the gripper, the robot and other objects.
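- For illustration, a minimal Python sketch of this suitability criterion; the GraspCandidate type, the threshold value and the collision predicate are assumptions for the sketch, not part of the disclosure:

```python
from dataclasses import dataclass
from typing import Callable, List, Sequence

@dataclass
class GraspCandidate:
    pose: Sequence[float]   # 6-DoF pose, e.g. (x, y, z, rx, ry, rz)
    quality: float          # grasp quality metric in [0, 1]

def select_suitable_grasps(candidates: List[GraspCandidate],
                           collides: Callable[[Sequence[float]], bool],
                           quality_threshold: float = 0.8) -> List[GraspCandidate]:
    """Keep only grasps that meet the quality criterion and are collision-free
    with respect to gripper, robot and other objects; the collision check is
    delegated to the supplied 'collides' predicate."""
    suitable = [c for c in candidates
                if c.quality >= quality_threshold and not collides(c.pose)]
    return sorted(suitable, key=lambda c: c.quality, reverse=True)

# Example with a trivial stand-in collision checker:
grasps = [GraspCandidate((0.1, 0.2, 0.3, 0, 0, 0), 0.92),
          GraspCandidate((0.4, 0.1, 0.3, 0, 0, 0), 0.75)]
print(select_suitable_grasps(grasps, collides=lambda pose: False))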
- The camera system can be external or structurally integrated directly into the sensor device, as is clear from the hardware architecture according to Figure 3.
- The gripping pose is passed on to a control system with gripping device and manipulator (e.g. robot) connected to the sensor device, which then executes the grip.
- Any number of imaging sensors or camera systems and manipulator systems can be connected via, in particular, a physical Ethernet interface.
- The software-related peculiarities of the respective subsystems are abstracted via a meta description and integrated function drivers.
- The software architecture is called a pipeline because the result of process i represents the input variable for process i + 1.
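- This pipeline property can be sketched as follows; the stage functions are placeholder stubs standing in for the modules described below, and all names are hypothetical:

```python
from functools import reduce

# Placeholder stages standing in for the modules described in the text
# (vision runtime 2, instance segmentation 3, feature extraction,
# gripping planning 6, control interface 7); all bodies are stubs.
def vision_runtime(img):   return {"image": img}
def segmentation(d):       return {**d, "objects": ["obj_0"]}
def grasp_features(d):     return {**d, "features": ["surface", "edge"]}
def grasp_planning(d):     return {**d, "grasp_pose": (0.1, 0.2, 0.3, 0, 0, 0)}
def control_interface(d):  return d["grasp_pose"]

STAGES = [vision_runtime, segmentation, grasp_features,
          grasp_planning, control_interface]

def run_pipeline(sensor_data, stages=STAGES):
    # the result of process i is the input variable for process i + 1
    return reduce(lambda data, stage: stage(data), stages, sensor_data)

print(run_pipeline("raw_sensor_frame"))  # -> (0.1, 0.2, 0.3, 0, 0, 0)
```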
- The individual objects are detected from the image information made available by the sensor system using an instance segmentation method. If further image processing functions are required, they can be made available to the overall system via the Vision Runtime. Users can develop their own functions and integrate ready-made runtime systems.
- The segmented objects represent the input variable for the gripping planning.
- The gripping planner then determines a suitable or sought-after grip, which is made available to the control interface for execution.
- On the basis of CAD and real scene data, the virtual environment engine generates training data. These are synthetic image data (2D or 3D) of the objects to be grasped in the overall scene, together with their annotation (ground truth of the objects) in a given data format. A segmentation model is trained on the basis of the synthetic training data; it can then be downloaded via the user interface (web interface) of the sensor device and made available to the segmentation method. The training takes place in particular outside the sensor device on a powerful server, but it can also be carried out on the sensor device. Owing to the autonomy of the individual steps, the time-consuming programming of image processing and robot programs is no longer necessary. In particular, only the following processes need to be parameterized or executed manually by end users: - Upload of CAD models (and/or real image data with
- The sensor device maps the entire engineering process for automating a pick & place application. Both 2D and 3D imaging sensors are considered, so that a suitable hardware solution results depending on the application. Moreover, no known system combines the various possibilities of gripping planning
- The pipeline with the sequence from sensor data acquisition to communication with the controller by the sensor device 14 is shown in Figure 3.
- The sensor data are obtained via, in particular, different imaging sensors 1.1 (2D) and 1.2 (3D) (see Figure 2).
- Any sensors can be integrated.
- The data are processed by the vision runtime module 2, which is used by the instance segmentation module 3 in normal operation (gripping planning).
- The output is the object envelopes, including the class affiliation, of the objects contained in the sensor data.
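- A possible data structure for this segmentation output, as a non-authoritative sketch (the field names are assumptions):

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class SegmentedObject:
    class_name: str                                # class affiliation
    confidence: float                              # detection confidence
    envelope: Tuple[float, float, float, float]    # e.g. 2D box (x, y, w, h)

# One entry per detected instance; this list is the input for the
# downstream gripping feature extraction and gripping planning.
detections: List[SegmentedObject] = [
    SegmentedObject("bolt", 0.97, (120.0, 80.0, 60.0, 40.0)),
    SegmentedObject("nut", 0.88, (210.0, 140.0, 30.0, 30.0)),
]
```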
- For this purpose, a segmentation model must first be trained using a data-driven method, see Figure 1(b); it is integrated via the user interface 9.
- The additional functions module 4 covers, for example, quality inspection, barcode reading, etc.
- The relevant gripping features are determined from the object segmentation. These then form the basis for the gripping planning in the gripping planning module 6.
- Various methods for gripping planning can be freely selected by the user:
- Model-based methods: one or more grips are specified by the user and the system searches for them on the scene object.
- Model-free methods: the system determines the best possible grip in terms of grip stability and quality.
- Different gripping systems: number of fingers, operating principle (clamping gripping as well as vacuum gripping).
- This is configured via the user interface 9 by means of the gripping planning parameters.
- The planner provides a gripping pose in SE(3) and the gripping finger configuration via the control interface, as sketched below.
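- A hedged sketch of how the gripping planning parameters and the planner output might look; the parameter keys are assumptions, only the method distinction, the gripper properties and the SE(3) pose come from the text:

```python
import numpy as np

# Assumed parameter names; the actual parameter set is defined by the
# gripping planning parameters configured in user interface 9.
grasp_planning_params = {
    "method": "model_free",        # or "model_based" with user-taught grips
    "gripper": {"fingers": 2, "principle": "clamping"},   # or "vacuum"
    "quality_threshold": 0.8,
}

# A pose in SE(3) as a 4x4 homogeneous matrix (rotation + translation),
# together with a finger configuration (opening width in metres).
grasp_pose = np.array([[1.0, 0.0, 0.0, 0.25],
                       [0.0, 1.0, 0.0, 0.10],
                       [0.0, 0.0, 1.0, 0.05],
                       [0.0, 0.0, 0.0, 1.00]])
finger_config = {"opening_width": 0.04}
```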
- A list of all recognized objects, including their class and object envelopes, can also be made available.
- The control interface 7 is used for communication with the robot or machine controller 8. It is designed as a client-server interface, with the sensor device representing the server and the controller 8 the client.
- The interface 7 is based on a generally applicable protocol so that it can be used with various proprietary controllers and their specific command sets. Communication takes place via TCP/IP or a fieldbus protocol.
- On the controller side, a specific function block is integrated which generates control-specific command sets.
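- A minimal sketch of such a client request over TCP/IP; the newline-delimited JSON message format is an assumption (the actual wire protocol is not specified here), and the service name getGraspPose() is taken from the service list later in the text:

```python
import json
import socket

def request_grasp_pose(host: str, port: int = 5000) -> dict:
    """Controller side (client): request a grasp pose from the sensor
    device (server) over TCP/IP using a newline-delimited JSON message."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(json.dumps({"service": "getGraspPose"}).encode() + b"\n")
        reply = sock.makefile("r").readline()
    return json.loads(reply)  # e.g. {"pose": [...], "fingers": {...}}

# Usage (hypothetical host name):
# pose = request_grasp_pose("sensor-device.local")
```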
- The entire parameterization and configuration of the sensor device takes place via the user interface 9, which is provided via a web server running locally on the sensor device 14.
- The teaching-in of segmentation models may be carried out on an external training server; the upload of training data and the download of the finished model take place via the user interface 9.
- A training server 11 is available for teaching in the segmentation model. This service can be performed outside of the sensor device 14.
- The user 10 can provide the objects to be gripped as CAD data and as real scene data. On the basis of these data, various object scenes are generated in the virtual environment module 12 and made available to the training module 13 as photorealistic synthetic data. The time required for annotating the training data can thus be largely minimized.
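- A simplified sketch of this scene generation step; the rendering itself is stubbed out, and the file names and annotation format are assumptions:

```python
import json
import random

def generate_synthetic_samples(cad_models, n_scenes=1000, seed=0):
    """Compose random object scenes from CAD models and emit image/annotation
    pairs (ground truth per object). The actual rendering is stubbed out;
    a real rendering and physics engine would rasterize each scene."""
    rng = random.Random(seed)
    samples = []
    for i in range(n_scenes):
        annotations = [{"model": rng.choice(cad_models),
                        "pose": [round(rng.uniform(-0.5, 0.5), 3) for _ in range(3)],
                        "mask_id": k}
                       for k in range(rng.randint(1, 8))]
        samples.append({"image": f"scene_{i:06d}.png",  # rendered image file
                        "annotations": annotations})     # ground truth
    return samples

with open("train_annotations.json", "w") as f:
    json.dump(generate_synthetic_samples(["bolt.stl", "nut.stl"], n_scenes=10), f)
```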
- The data-driven segmentation algorithm is trained in module 13.
- The output is a segmentation model, which the user 10 integrates on the sensor device 14 via the user interface 9.
- The hardware architecture and the embedding of the sensor device 14 in the overall automation system are shown in Figure 3.
- The sensor device 14 is supplied with electrical energy via the energy supply module 18.
- The sensor device, which functions as a server in relation to the controller 8, represents the slave in the communication topology of the overall automation system via the fieldbus system provided.
- The gripping device 22 can also be integrated via a system control 21 if the architecture of the overall system requires this.
- The sensor device 14 is connected to a terminal device (for example a PC) by the user 10 via the physical user interface 15 (for example Ethernet).
- The software configuration is then carried out via interface 9 (web server).
- The communication between the sensor device 14 and the controller 8 also takes place via an optionally physically separate or shared interface 15 (for example Ethernet, fieldbus). Communication takes place as already described.
- The sensor is connected via the Ethernet interface 16; GigE, for example, can be used here.
- An additional lighting module 19 can also be activated via the sensor device interface 17 (digital output).
- The system boundary of the sensor device 14 can also be extended by integrating the sensor 1 and the lighting module 19, with the interfaces remaining the same.
- An image processing sensor can be connected to the sensor device via a uniform interface (Ethernet); the sensor device uses its sensor data.
- A manipulator system can be connected to the sensor device via a uniform interface (e.g. Ethernet).
- The sensor device then provides the control system, as a client, with various services such as gripping poses, object positions, etc.
- The user can connect to the sensor device via the user interface (Ethernet) and perform all necessary configuration and parameterization via a web server.
- The sensor device represents a computer system which is either designed as a separate computing box or can be integrated into a subcomponent (e.g. gripping system, flange). It can also be ported as a software solution to appropriate external hardware.
- The sensor device can be seamlessly integrated into today's automation architectures thanks to the open interfaces to the control system and the imaging sensor.
- The sensor device uses the visual sensor data and carries out instance segmentation of the previously defined gripping objects. In the Vision Runtime, additional image processing functions are integrated so that, for example, special tasks such as quality checks or the like can be carried out.
- The gripping planner can automatically determine a predefined or a suitable grip for the objects on the basis of the results of the segmentation.
- The gripping pose is transformed directly into the selected robot coordinate system and transferred to the controller.
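- As a worked example, this transformation into the robot coordinate system can be expressed as a chain of homogeneous transforms; the matrix values below are purely illustrative, not disclosed data:

```python
import numpy as np

def to_robot_frame(T_cam_grasp: np.ndarray, T_robot_cam: np.ndarray) -> np.ndarray:
    """Chain homogeneous transforms: T_robot_grasp = T_robot_cam @ T_cam_grasp.
    T_robot_cam is obtained from the geometric registration of the image
    processing sensor with the manipulator system."""
    return T_robot_cam @ T_cam_grasp

# Example: camera 1 m above the robot base, looking straight down.
T_robot_cam = np.array([[1.0,  0.0,  0.0, 0.0],
                        [0.0, -1.0,  0.0, 0.0],
                        [0.0,  0.0, -1.0, 1.0],
                        [0.0,  0.0,  0.0, 1.0]])
T_cam_grasp = np.eye(4)
T_cam_grasp[:3, 3] = [0.10, 0.05, 0.70]
print(to_robot_frame(T_cam_grasp, T_robot_cam)[:3, 3])  # -> [0.1 -0.05 0.3]
```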
- The image processing sensor can be calibrated and simply registered geometrically with the manipulator system.
- The individual gripping tasks are specified by the user in a task-oriented manner (pick(Object_X)) and are mapped in the respective controller.
- The individual software modules are provided for this purpose.
- The sensor device can also simply return the detected objects (without a planned grip), since the interface to the controller is structured flexibly.
- The following services can be offered: getGraspPose(), getObjects(), getBestObject(), hasObject(x), etc.
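- A possible server-side realization of these services, sketched in Python; only the service names come from the text, while all signatures and return types are assumptions (the SegmentedObject fields follow the earlier sketch):

```python
class SensorDeviceServices:
    """Hypothetical realization of the listed services; signatures and
    return types are assumptions, only the names come from the text."""

    def __init__(self, detections, planner):
        self.detections = detections   # output of instance segmentation
        self.planner = planner         # gripping planning module

    def getObjects(self):
        return self.detections

    def hasObject(self, class_name):
        return any(d.class_name == class_name for d in self.detections)

    def getBestObject(self):
        return max(self.detections, key=lambda d: d.confidence, default=None)

    def getGraspPose(self):
        best = self.getBestObject()
        return self.planner.plan(best) if best is not None else None
```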
- The sensor device can be based on CAD data
- Imaging sensor: used to record 2D or 3D data.
- Instance segmentation: software algorithm that performs instance segmentation (segmentation between individual different and identical object classes, i.e. separation between all objects).
- Gripping planning module: software algorithm that uses features to calculate a suitable grip.
- Control interface: interface of the sensor device for communication with the controller 8.
- Robot/machine controller: controller of the robot.
- Training server: generation of photorealistic synthetic data or augmentation of the data, and training of the instance segmentation 3.
- Virtual environment engine: virtual rendering and physics engine for generating photorealistic synthetic data and for augmenting real image data.
- Training: segmentation training algorithm for 3.
- Sensor device: overall system comprising 1, 2, 3, 4, 5, 6, 7, 8 and 9.
- Interface control/user: physical Ethernet interface.
- Energy supply module: external electrical energy supply.
- System control interface: hardware and software interface for the system.
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Orthopedic Medicine & Surgery (AREA)
- Manipulator (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102020115628.6A DE102020115628A1 (de) | 2020-06-12 | 2020-06-12 | Sensoreinrichtung für ein Greifsystem, Verfahren zur Erzeugung von optimalen Greifposen zur Ansteuerung einer Greifvorrichtung und zugehöriges Greifsystem |
PCT/EP2021/065736 WO2021250227A1 (de) | 2020-06-12 | 2021-06-11 | Sensoreinrichtung für ein greifsystem, verfahren zur erzeugung von optimalen greifposen zur ansteuerung einer greifvorrichtung und zugehöriges greifsystem |
Publications (1)
Publication Number | Publication Date |
---|---|
EP4164841A1 true EP4164841A1 (de) | 2023-04-19 |
Family
ID=76532183
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP21733750.0A Pending EP4164841A1 (de) | 2020-06-12 | 2021-06-11 | Sensoreinrichtung für ein greifsystem, verfahren zur erzeugung von optimalen greifposen zur ansteuerung einer greifvorrichtung und zugehöriges greifsystem |
Country Status (4)
Country | Link |
---|---|
US (1) | US20230278199A1 (de) |
EP (1) | EP4164841A1 (de) |
DE (1) | DE102020115628A1 (de) |
WO (1) | WO2021250227A1 (de) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7204513B2 (ja) * | 2019-02-13 | 2023-01-16 | 株式会社東芝 | 制御装置及びプログラム |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB8720331D0 (en) * | 1987-08-28 | 1987-10-07 | Caplin Cybernetics Corp | Control system |
AT11337U1 (de) * | 2009-02-26 | 2010-08-15 | Ih Tech Sondermaschb U Instand | Verfahren und vorrichtung zum robotergesteuerten greifen und bewegen von objekten |
US9089966B2 (en) | 2010-11-17 | 2015-07-28 | Mitsubishi Electric Corporation | Workpiece pick-up apparatus |
US9002098B1 (en) | 2012-01-25 | 2015-04-07 | Hrl Laboratories, Llc | Robotic visual perception system |
JP6695843B2 (ja) | 2017-09-25 | 2020-05-20 | ファナック株式会社 | 装置、及びロボットシステム |
US10535155B2 (en) | 2017-10-24 | 2020-01-14 | Toyota Motor Engineering & Manufacturing North America, Inc. | Systems and methods for articulated pose estimation |
DE102018209220A1 (de) * | 2018-06-11 | 2019-12-12 | Kuka Deutschland Gmbh | Verfahren und System zum Handhaben von Objekten mithilfe eines Roboters |
DE102019122790B4 (de) | 2018-08-24 | 2021-03-25 | Nvidia Corp. | Robotersteuerungssystem |
DE102018126310B3 (de) | 2018-10-23 | 2019-11-07 | Roboception Gmbh | Verfahren zum Erstellen eines Objektmodells zum Greifen eines Objekts, computerlesbares Speichermedium und Robotersystem |
EP3767521A1 (de) | 2019-07-15 | 2021-01-20 | Promaton Holding B.V. | Objektdetektion und instanzsegmentierung von 3d-punktwolken auf basis von tiefenlernen |
- 2020-06-12: DE application DE102020115628.6A filed (published as DE102020115628A1, active, pending)
- 2021-06-11: EP application EP21733750.0A filed (published as EP4164841A1, active, pending)
- 2021-06-11: US application US18/009,929 filed (published as US20230278199A1, active, pending)
- 2021-06-11: WO application PCT/EP2021/065736 filed (published as WO2021250227A1)
Also Published As
Publication number | Publication date |
---|---|
US20230278199A1 (en) | 2023-09-07 |
DE102020115628A1 (de) | 2021-12-16 |
WO2021250227A1 (de) | 2021-12-16 |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | STAA | Information on the status of an ep patent application or granted ep patent | STATUS: UNKNOWN
 | STAA | Information on the status of an ep patent application or granted ep patent | STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE
 | PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | ORIGINAL CODE: 0009012
 | STAA | Information on the status of an ep patent application or granted ep patent | STATUS: REQUEST FOR EXAMINATION WAS MADE
2022-12-16 | 17P | Request for examination filed |
 | AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
 | DAV | Request for validation of the european patent (deleted) |
 | DAX | Request for extension of the european patent (deleted) |
 | RAP3 | Party data changed (applicant data changed or rights of an application transferred) | Owner name: SCHUNK SE & CO. KG SPANNTECHNIK GREIFTECHNIK AUTOMATISIERUNGSTECHNIK
 | STAA | Information on the status of an ep patent application or granted ep patent | STATUS: EXAMINATION IS IN PROGRESS
2024-08-28 | 17Q | First examination report despatched |