WO2023165746A1 - Automation system and method for determining a pose of a sensor device of an automation system - Google Patents

Automation system and method for determining a pose of a sensor device of an automation system

Info

Publication number
WO2023165746A1
WO2023165746A1 (PCT/EP2023/050431)
Authority
WO
WIPO (PCT)
Prior art keywords
calibration
pose
sensor device
poses
automation system
Prior art date
Application number
PCT/EP2023/050431
Other languages
German (de)
English (en)
Inventor
Christian Hanel
Simon Jessen
Original Assignee
Robert Bosch GmbH
Priority date
Filing date
Publication date
Application filed by Robert Bosch GmbH
Publication of WO2023165746A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1679 Programme controls characterised by the tasks executed
    • B25J9/1692 Calibration of manipulator
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 Programme controls characterised by motion, path, trajectory planning
    • B25J9/1666 Avoiding collision or forbidden zones
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/39 Robotics, robotics to robotics hand
    • G05B2219/39008 Fixed camera detects reference pattern held by end effector

Definitions

  • The present invention relates to a method for determining a pose of a sensor device of an automation system, and to a corresponding automation system.
  • DE 10 2016 204 174 A1 discloses an automation system which can automate parts of a production process for workpieces.
  • The automation system includes a functional module with a sensor unit for recording environmental data and an evaluation unit for determining an absolute position of the functional module in the system area on the basis of the environmental data.
  • Constant innovation means that the workflows in production processes often have to be changed. Furthermore, production processes involve dynamic factors, which must be detected by sensors and responded to by appropriate adaptation.
  • Robots with complex kinematics are often used, for example to transport or assemble parts.
  • These are typically Cartesian robots, SCARA robots or articulated robots.
  • The movements and tasks that the robot is to carry out are typically programmed by an application engineer, i.e. a technician or engineer. All positions that the robot is to move to in the application must be specified by the programmer.
  • A distinction is made between static points (i.e. fixed three-dimensional locations), which do not change during the application's runtime, and dynamic points, which can change during the application's runtime.
  • Static points can be approached precisely by the application engineer using the robot's manual control device and then stored in the program.
  • Dynamic points can be determined, for example, using 3D sensors, in particular optical sensors. Dynamic points are, for example, gripping points on moving objects.
  • The 3D sensor records the environment, for example as a point cloud; a software component evaluates this data and then localizes the object in three-dimensional space with its position and orientation.
  • 3D sensors can also be used to pick parts from boxes or pallets or to lift them from conveyor sections.
  • This technology greatly simplifies the system mechanically, since there is no need for precise mechanical fixturing and pre-separation of objects. It also makes the system cheaper, since standardized 3D sensors and software components are used. Adaptation to new products and the reuse of components are possible with little effort.
  • Systems with 3D sensors are calibrated manually or semi-automatically, for example when the system is commissioned.
  • The relationship between the robot and the 3D sensor is established using hand-eye calibration. This relationship is required in order to convert positions detected by the 3D sensor into the robot's coordinate system.
  • DE 10 2015 226 741 A1 discloses a detection device which can have a calibration module that calibrates a component pick-up with a locating device.
  • The calibration can take place by comparing a resulting position of the component pick-up with a marker position of the marker determined by a measuring system.
  • The hand-eye calibration is based on algorithms that process a vector of data pairs as input parameters.
  • A data pair consists of sensor data, such as a camera image, an image of the surroundings, a topology of the surroundings, a point cloud or the like, which have been determined using the data from the 3D sensor and which relate to a detection area in which a calibration mark or calibration plate mounted on a robot flange is positioned, together with the associated Cartesian pose (translation and orientation) of the robot flange.
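  • To make the structure of these data pairs concrete, the following is a minimal sketch of a hand-eye calibration on synthetic data. It uses OpenCV's cv2.calibrateHandEye as one established implementation of such algorithms (the patent does not prescribe a particular one); the fixed-camera (eye-to-hand) arrangement, the pose-inversion convention and all variable names are assumptions.

```python
import cv2
import numpy as np

rng = np.random.default_rng(0)

def random_rotation():
    # Random rotation matrix from a random axis-angle vector (Rodrigues).
    axis = rng.normal(size=3)
    return cv2.Rodrigues(axis / np.linalg.norm(axis) * rng.uniform(0.2, 1.5))[0]

# Synthetic ground truth: pose of the fixed camera in robot base coordinates
# (the unknown), and pose of the calibration plate on the robot flange.
R_cam, t_cam = random_rotation(), rng.normal(size=(3, 1))
R_plate, t_plate = random_rotation(), rng.normal(size=(3, 1)) * 0.05

R_base2flange, t_base2flange, R_target2cam, t_target2cam = [], [], [], []
for _ in range(10):  # ten data pairs, one per calibration pose
    R_f, t_f = random_rotation(), rng.normal(size=(3, 1))  # flange in base frame
    # Eye-to-hand convention: pass the inverted flange poses so that the
    # solver returns the camera pose in robot base coordinates.
    R_base2flange.append(R_f.T)
    t_base2flange.append(-R_f.T @ t_f)
    # Plate pose as the fixed camera observes it (in practice taken from
    # the sensor data, e.g. via pattern detection and cv2.solvePnP).
    R_target2cam.append(R_cam.T @ R_f @ R_plate)
    t_target2cam.append(R_cam.T @ (R_f @ t_plate + t_f - t_cam))

R_est, t_est = cv2.calibrateHandEye(R_base2flange, t_base2flange,
                                    R_target2cam, t_target2cam,
                                    method=cv2.CALIB_HAND_EYE_TSAI)
print(np.linalg.norm(R_est - R_cam), np.linalg.norm(t_est - t_cam))  # both ~0
```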
  • The 3D sensor must also be configured correctly; for cameras, for example, the exposure time and the aperture must be set.
  • The calibration engineer must ensure at all times that the kinematics travel without collision. If necessary, the calibration must be aborted and the calibration points redetermined to prevent a collision.
  • The calibration therefore requires well-founded specialist knowledge and can typically only be carried out successfully by an expert.
  • The invention described below addresses these issues.
  • The invention provides a method for determining a pose of a sensor device of an automation system, and an automation system, with the features of the independent patent claims.
  • The invention accordingly relates to a method for determining a pose of a sensor device of an automation system, the automation system also having a processing device with at least one movable component for processing workpieces.
  • A large number of calibration poses are specified, into which a calibration object is to be moved.
  • Unsuitable calibration poses are recognized and discarded.
  • The detection of unsuitable calibration poses includes determining whether the calibration object in the calibration pose is at least partially outside a field of view of the sensor device and/or is at least partially covered by components of the automation system and/or is at least partially unrecognizable by the sensor device.
  • A collision-free path is calculated in order to move the calibration object into the calibration poses using the at least one movable component of the processing device.
  • The calibration object is moved along the calculated path into the calibration poses by means of the at least one movable component of the processing device, with the sensor device generating sensor data for each calibration pose.
  • The pose of the sensor device is determined based on the generated sensor data.
  • The pose of the sensor device represents the position and orientation of the sensor device in three-dimensional space.
  • The pose of the sensor device is described either in absolute coordinates, for example in world coordinates, and/or in relative coordinates, in particular in relation to the processing device.
  • The invention further relates to an automation system with a sensor device, a processing device and a control device.
  • The sensor device is designed to generate sensor data.
  • The processing device has at least one movable component, which is designed for processing workpieces.
  • The control device is designed to specify a large number of calibration poses into which a calibration object is to be moved, and to recognize and reject unsuitable calibration poses.
  • The detection of unsuitable calibration poses includes determining whether the calibration object in the calibration pose is at least partially outside a field of view of the sensor device and/or is at least partially covered by components of the automation system and/or is at least partially unrecognizable by the sensor device.
  • The control device is also designed to calculate a collision-free path in order to move the calibration object into the calibration poses using the at least one movable component of the processing device.
  • The control device is further designed to control the processing device in order to move the calibration object along the calculated path into the calibration poses using the at least one movable component of the processing device, and to control the sensor device to generate sensor data for each calibration pose.
  • The control device is designed to determine a pose of the sensor device based on the generated sensor data.
  • The processing device, parts optionally installed on it (for example a valve terminal) or other parts of the automation system may at least partially cover the calibration object in the direct field of view of the sensor device.
  • In this case the calibration object cannot be fully recognized.
  • The calibration poses are therefore validated by a check, which can in particular be carried out geometrically.
  • This geometric field-of-view check includes determining whether the calibration object in the calibration pose is at least partially outside a field of view of the sensor device and/or is at least partially covered by components of the automation system and/or is at least partially unrecognizable by the sensor device.
  • An algorithm thus checks whether the line of sight between the sensor device and the calibration object is free of obstructions. Unlike manual hand-eye calibration, the user does not have to check for an overlap between the sensor's field of view and the calibration object in the calibration poses approached, or to be approached, by the processing device. Especially in the case of automatically generated calibration poses, this check contributes in a particularly advantageous manner to the automation of the entire calibration process, since invalid calibration poses can be recognized and sorted out quickly. It is particularly advantageous to carry out the geometric check using a 3D simulation with collision detection.
  • The geometry of each calibration pose is determined dynamically in relation to the calibration object and the sensor device, and is checked for collision against the processing device (in particular its kinematics), the automation system and/or optional interference contours such as valve terminals. If there is no collision, the field of view of the sensor device is free and the calibration pose can be used to determine the pose of the sensor device.
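  • One possible realization of such a geometric check, sketched below under the assumption of a mesh-based scene model, casts rays from the sensor origin to sample points on the calibration object and tests them against the kinematics geometry; the trimesh library and all dimensions are illustrative choices, not part of the patent.

```python
import numpy as np
import trimesh

def pose_is_occluded(sensor_origin, object_points, obstacle_mesh, margin=1e-6):
    """True if any line of sight from the sensor to a sample point on the
    calibration object is interrupted by the obstacle mesh."""
    origins = np.tile(sensor_origin, (len(object_points), 1))
    directions = object_points - origins
    dists = np.linalg.norm(directions, axis=1)
    directions = directions / dists[:, None]
    hits, ray_idx, _ = obstacle_mesh.ray.intersects_location(
        origins, directions, multiple_hits=True)
    for hit, i in zip(hits, ray_idx):
        # A hit closer than the object point means the view is blocked.
        if np.linalg.norm(hit - sensor_origin) < dists[i] - margin:
            return True
    return False

# Hypothetical scene: a box standing in for a robot link between the sensor
# (at the origin) and the four corners of a calibration plate.
link = trimesh.creation.box(extents=(0.1, 0.1, 0.5))
link.apply_translation((0.0, 0.0, 0.5))
corners = np.array([[-0.1, -0.1, 1.0], [0.1, -0.1, 1.0],
                    [0.1, 0.1, 1.0], [-0.1, 0.1, 1.0]])
print(pose_is_occluded(np.zeros(3), corners, link))  # True: view is blocked
```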
  • The invention enables the pose of the sensor device (the hand-eye calibration pose) to be determined semi-automatically or fully automatically, without special expert knowledge being required for the operation.
  • In this way, the sensor device and the processing device can be calibrated.
  • The calibration process can be largely automated.
  • The necessary calibration poses are preferably generated and adjusted automatically. Unsuitable calibration poses are automatically discarded.
  • The invention thus provides an integrated path planner that makes it possible to determine collision-free movements between the calibration poses.
  • The invention enables a preferably fully automatic calibration and an optimization of the calibration with minimal user interaction.
  • The sensor device is a 3D sensor, which thus generates three-dimensional sensor data.
  • The simple automatic calibration can help 3D sensors become better established in plant construction.
  • The at least one processing device is a robot.
  • The robot can be a Cartesian robot, a SCARA robot and/or an articulated robot.
  • A user can control the control device via a user interface.
  • The user interface is preferably kept simple and manages with just a few easily understandable parameter specifications.
  • The processing device can be a machine tool, a special machine, an industrial robot or a universal programmable machine for handling, assembling or processing workpieces and/or for handling tasks in logistics and intralogistics.
  • The movable component of the processing device may be a manipulator (such as a robotic arm) or an effector (such as a tool or gripper).
  • The initially specified calibration poses are at least partially specified by experts and determined sensor-specifically.
  • Alternatively or additionally, the initially predetermined calibration poses are generated at least partially automatically from the sensor-specific field of view of the sensor.
  • The field-of-view coverage check, i.e. determining whether the calibration object in the calibration pose is at least partially covered by components of the automation system, is carried out using a three-dimensional geometric model of at least part of the processing device.
  • In the method for determining the pose of the sensor device of the automation system, it is determined whether the calibration object in the calibration pose is at least partially unrecognizable by the sensor device, taking into account an alignment of the calibration object relative to the sensor device and/or taking into account lighting conditions.
  • The movable component arranges the calibration object within the field of view of the sensor device. It is then determined whether the calibration object can actually be recognized there.
  • The collision-free path is calculated using a three-dimensional geometric model of at least part of the processing device. This means that collisions can be reliably avoided without complex tests. In particular, possible self-collisions of the at least one movable component and collisions of the at least one movable component with other components of the automation system can be taken into account.
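  • The following minimal sketch illustrates the kind of collision test such a path calculation rests on: straight joint-space segments are densely sampled and each sample is checked against the geometric model. The collision query and all values are stand-ins.

```python
import numpy as np

def segment_is_free(q_start, q_goal, in_collision, step=0.02):
    """Sample a straight joint-space segment and reject it as soon as one
    configuration collides; in_collision stands in for a query against the
    three-dimensional geometric model."""
    n = max(2, int(np.linalg.norm(q_goal - q_start) / step) + 1)
    return not any(in_collision((1 - s) * q_start + s * q_goal)
                   for s in np.linspace(0.0, 1.0, n))

# Hypothetical usage: chain calibration poses with collision-checked
# segments, skipping poses that cannot be connected from the current one.
blocked = lambda q: np.linalg.norm(q - np.array([0.5, 0.0, 0.0])) < 0.1
waypoints = [np.zeros(3), np.array([1.0, 0.0, 0.0]), np.array([1.0, 1.0, 1.0])]
path = [waypoints[0]]
for q in waypoints[1:]:
    if segment_is_free(path[-1], q, blocked):
        path.append(q)
print(len(path))  # 2: the second waypoint is skipped, its segment collides
```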
  • To generate the sensor data, the sensor device records at least one sensor data record for each calibration pose in a multiplicity of at least partially different configurations. This allows the determination of the pose of the sensor device to be further improved.
  • The pose of the sensor device is determined for each configuration, and a quality of the determination (i.e. a quality of the calibration, corresponding to an error) is determined for each configuration; the pose of the sensor device for which the quality of the calibration is highest is selected.
  • The quality of the sensor data and/or of the environmental conditions and/or statistical fluctuations can be calculated and taken into account.
  • An initial approximate pose of the sensor device is specified, with unsuitable calibration poses being recognized and rejected and the collision-free path being calculated as a function of this initial approximate pose.
  • The pose of the sensor device is determined iteratively, with unsuitable calibration poses being detected and rejected in each iteration step, and the collision-free path being calculated as a function of the pose of the sensor device determined in the previous iteration step.
  • The sensor device is preferably controlled multiple times with different sets of parameters in order to achieve the best possible calibration.
  • The pose of the sensor device is determined using a hand-eye calibration method.
  • The automation system is a versatile manufacturing system.
  • FIG. 1 shows a schematic block diagram of an automation system according to an embodiment of the invention;
  • FIG. 2 shows a schematic oblique view of an automation system according to an embodiment of the invention;
  • FIG. 3 shows a schematic plan view of the automation system illustrated in FIG. 2 for explaining the detection of an obstruction of the field of view;
  • FIG. 4 shows a schematic oblique view of the automation system illustrated in FIG. 3 for explaining the detection of an obstruction of the field of view;
  • FIG. 5 shows a schematic oblique view of the automation system illustrated in FIG. 2 with a large number of calibration poses;
  • FIG. 6 shows a schematic view of the sensor device of the automation system illustrated in FIG. 2 for explaining the automatic determination of the calibration poses;
  • FIG. 7 shows schematic views of automatically generated calibration poses;
  • FIG. 8 shows a schematic view of the sensor device of the automation system illustrated in FIG. 2 for explaining the determination of whether the calibration object in a calibration pose is at least partially outside the field of view of the sensor device;
  • FIG. 9 shows a schematic oblique view of the automation system illustrated in FIG. 2 with a calculated collision-free path;
  • FIG. 10 shows a flow chart of a method for determining a pose of a sensor device of an automation system according to an embodiment of the invention.
  • FIG. 1 shows a schematic block diagram of an automation system 1 with a sensor device 2, a processing device 3 and a control device 4.
  • The sensor device 2 is, for example, a 3D camera which generates sensor data.
  • The processing device 3 can be a robot, for example, and has at least one movable component that is designed to process workpieces.
  • The movable component can be a gripper arm of the robot, for example.
  • The control device 4 is designed to specify a large number of calibration poses into which a calibration object is to be moved.
  • The calibration object can be, for example, a calibration plate with a pattern (such as a grid) on it.
  • A calibration pose is understood to mean the combination of position and orientation of the calibration object.
  • The orientation can be specified by means of angles, such as Cardan angles, Euler angles or roll-pitch-yaw angles.
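  • For illustration, such an angle convention can be converted into a rotation matrix, for example with SciPy; the convention strings and angle values below are arbitrary assumptions.

```python
from scipy.spatial.transform import Rotation

# Roll-pitch-yaw angles (XYZ convention) for one calibration pose.
R = Rotation.from_euler("xyz", [10.0, -20.0, 95.0], degrees=True).as_matrix()
print(R.round(3))
# The same orientation re-expressed as ZYX Euler angles.
print(Rotation.from_matrix(R).as_euler("zyx", degrees=True))
```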
  • The initially specified calibration poses can be specified relative to the coordinate system of the sensor device 2 and thus independently of the processing device 3.
  • The calibration poses are converted into a coordinate system of the processing device 3 in order to approach them with the movable component.
  • An initial approximate pose of the sensor device 2 can be specified, for example by the user, or derived from a CAD model of the automation system 1.
  • The specified calibration poses can be arranged in a uniformly distributed manner within the field of view (scanning volume) of the sensor device 2.
  • The control device 4 can also identify and discard unsuitable calibration poses.
  • The detection of unsuitable calibration poses includes three separate checks, which are carried out in a predetermined order or at least partially in parallel.
  • In a first check, the control device 4 determines whether the calibration object is at least partially outside a field of view of the sensor device in the calibration pose.
  • The control device 4 thus discards all calibration poses that are not located geometrically completely within the field of view of the sensor device.
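  • A minimal sketch of such a containment test under a pinhole camera model is shown below: every sample point of the calibration object must project inside the image and lie within the working depth range. All intrinsics and points are hypothetical.

```python
import numpy as np

def fully_in_view(points_cam, fx, fy, cx, cy, width, height, z_near, z_far):
    """True if all object points (in camera coordinates) project inside the
    image and lie within the sensor's working depth range."""
    z = points_cam[:, 2]
    if np.any(z < z_near) or np.any(z > z_far):
        return False
    u = fx * points_cam[:, 0] / z + cx
    v = fy * points_cam[:, 1] / z + cy
    return bool(np.all((u >= 0) & (u < width) & (v >= 0) & (v < height)))

# Hypothetical plate corners 0.8 m in front of a 640x480 camera.
corners = np.array([[-0.1, -0.1, 0.8], [0.1, -0.1, 0.8],
                    [0.1, 0.1, 0.8], [-0.1, 0.1, 0.8]])
print(fully_in_view(corners, fx=600, fy=600, cx=320, cy=240,
                    width=640, height=480, z_near=0.3, z_far=1.5))  # True
```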
  • In a second check, it is determined whether the calibration object is at least partially covered by components of the automation system 1; the components can be parts of the processing device 3, for example.
  • A computer-aided design (CAD) model of at least part of the processing device 3 and/or of other components of the automation system 1 can be used here.
  • In a third check, it is determined whether the calibration object is at least partially unrecognizable by the sensor device. This detection can be based on environmental conditions, such as lighting conditions. Additionally or alternatively, it can be carried out using the calibration pose, with the orientation of the calibration object relative to the sensor device being taken into account. For example, it can be determined whether a pattern located on the calibration object can be seen by the sensor device in the calibration pose. If the calibration object is a calibration plate, for example, calibration poses are discarded in which the plate is oriented essentially parallel to the optical axis of the sensor device 2, so that a pattern on the calibration plate cannot be recognized by the sensor device 2.
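  • The orientation criterion from this example reduces to a simple angle test between the plate normal and the viewing direction, as sketched below; the tilt threshold is an assumed value.

```python
import numpy as np

def pattern_visible(plate_normal_cam, max_tilt_deg=75.0):
    """Discard calibration poses in which the plate faces away from the
    camera or is seen at a grazing angle; the optical axis is +z."""
    n = np.asarray(plate_normal_cam, dtype=float)
    n = n / np.linalg.norm(n)
    # Angle between the plate normal and the reversed viewing direction.
    cos_tilt = -n[2]
    return cos_tilt > np.cos(np.radians(max_tilt_deg))

print(pattern_visible([0.0, 0.0, -1.0]))  # plate faces the camera: True
print(pattern_visible([1.0, 0.0, 0.0]))   # plate parallel to optical axis: False
```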
  • The control device 4 is designed to calculate a collision-free path in order to move the calibration object into the calibration poses using the at least one movable component of the processing device 3.
  • For this purpose, the control device 4 can use a CAD model of at least part of the processing device 3 and/or of other components of the automation system 1.
  • The control device 4 is further designed to control the processing device 3 in order to move the calibration object by means of the at least one movable component of the processing device 3 along the calculated path into the calibration poses.
  • The control device 4 also controls the sensor device 2 so that it generates sensor data for each calibration pose.
  • The sensor data can be present as a data pair list, with the calibration poses being assigned the associated 3D sensor data.
  • Several data sets with different parameters, i.e. settings of the sensor device 2, can be recorded.
  • The control device 4 is designed to determine a pose of the sensor device 2 based on the generated sensor data. Based on the determined pose of the sensor device 2, the control device 4 calibrates the sensor device 2 of the automation system 1.
  • The settings of the sensor device (such as the exposure time) can optionally differ at least in part.
  • The sensor data are thus generated for at least partially different configurations of the sensor device 2.
  • For each configuration, the control device can perform a calibration and calculate an error (e.g. a statistical error or an error calculated on the basis of environmental conditions), the sensor data set with the smallest error being selected for determining the pose of the sensor device 2.
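  • This selection amounts to an argmin over the per-configuration calibration errors; in the following sketch both routines passed in are stand-ins for the actual calibration and error computation.

```python
import numpy as np

def pick_best_configuration(datasets, calibrate, residual_error):
    """Calibrate once per sensor configuration (e.g. per exposure time)
    and keep the pose estimate with the smallest residual error."""
    best_pose, best_err = None, np.inf
    for data in datasets:
        pose = calibrate(data)
        err = residual_error(pose, data)
        if err < best_err:
            best_pose, best_err = pose, err
    return best_pose, best_err

# Hypothetical usage with dummy routines: three capture runs, second is best.
runs = [(1, 0.9), (2, 0.2), (3, 0.5)]
pose, err = pick_best_configuration(runs, calibrate=lambda d: d[0],
                                    residual_error=lambda p, d: d[1])
print(pose, err)  # 2 0.2
```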
  • The method can be iterated, with unsuitable calibration poses being recognized and discarded in each iteration step and the collision-free path 10 being calculated as a function of the pose of the sensor device 2 determined in the previous iteration step.
  • FIG. 2 shows a schematic oblique view of an automation system 1.
  • The sensor device 2 has a field of view 5 in which the calibration object 6 can be moved by the movable component 31 of the processing device 3.
  • FIG. 3 shows a schematic plan view of the automation system 1 illustrated in FIG. 2, and FIG. 4 shows a schematic oblique view of the automation system 1, to explain the detection of an obstruction of the field of view.
  • The field of view 5 of the sensor device 2 and a calibration object 6 located within the field of view 5 are shown.
  • A three-dimensional partial area 61 of the field of view 5 is assigned to the calibration object 6; it corresponds to the spatial area which must not be covered so that the calibration object 6 can be recognized completely.
  • A region 7 of the calibration object 6 is covered by a component of the processing device 3 in the illustrated calibration pose. The component of the processing device 3 is thus located in the partial area 61 of the field of view 5, so that the calibration object 6 cannot be fully recognized.
  • FIG. 5 shows a schematic oblique view of the automation system 1 with a large number of calibration poses p1 to p4. While four calibration poses p1 to p4 are illustrated in FIG. 5, the invention is not limited to a specific number of calibration poses.
  • Several poses are illustrated, including a pose u, whose associated vector runs from a predetermined reference point of the processing device 3 to a reference point of the sensor device 2. This is the pose of the sensor device 2 that is to be determined.
  • Poses Wi are also illustrated, with the associated vectors running in each case from the reference point of the processing device 3 to the respective calibration poses p1 to p4.
  • Further poses Vi are drawn in, with the associated vectors pointing from the reference point of the sensor device 2 to the respective calibration poses p1 to p4.
  • The poses Wi can be calculated from the poses Vi and u.
  • The poses can be described by 4x4 matrices describing translation and orientation.
  • Here the poses are given with respect to a coordinate system of the sensor device 2 with axes x, y and z. Once the pose of the sensor device has been determined, the poses can also be specified with respect to a coordinate system of the processing device 3.
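  • Written out in homogeneous coordinates, the relationships illustrated in FIG. 5 take the following form; the block notation with rotation R and translation t is the standard convention and is assumed here rather than quoted from the patent.

```latex
T = \begin{pmatrix} R & t \\ 0^\top & 1 \end{pmatrix} \in \mathbb{R}^{4\times 4},
\qquad
W_i = u\,V_i =
\begin{pmatrix} R_u & t_u \\ 0^\top & 1 \end{pmatrix}
\begin{pmatrix} R_{V_i} & t_{V_i} \\ 0^\top & 1 \end{pmatrix}
=
\begin{pmatrix} R_u R_{V_i} & R_u t_{V_i} + t_u \\ 0^\top & 1 \end{pmatrix}
```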
  • FIG. 6 shows a schematic view of the sensor device 2 of the automation system 1 illustrated in FIG. 2 to explain the automatic determination of the calibration poses.
  • Raster planes 8 can be provided relative to the coordinate system of the sensor device 2; in the illustrated embodiment these comprise a first raster plane 81, a second raster plane 82 and a third raster plane 83.
  • The invention is not limited to a specific number of raster planes 8.
  • The raster planes 8 run parallel to one another at a distance h.
  • The raster planes consist of raster elements with width w and length l.
  • The invention is not limited to the use of such rasters; other sampling methods can also be used to generate calibration poses.
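  • A minimal sketch of such raster-based sampling follows: one plane per viewing distance h, each subdivided into raster elements of width w and length l, with the plane extent growing with distance so as to follow the field of view. All dimensions are hypothetical.

```python
import numpy as np

def raster_positions(planes, nw=3, nl=3):
    """Candidate calibration positions on parallel raster planes in the
    sensor coordinate system; each plane is (width w, length l, distance h)."""
    positions = []
    for w, l, h in planes:
        for x in np.linspace(-w / 2, w / 2, nw):
            for y in np.linspace(-l / 2, l / 2, nl):
                positions.append((x, y, h))
    return np.array(positions)

# Hypothetical frustum: three planes whose extent grows with the distance.
grid = raster_positions([(0.3, 0.2, 0.5), (0.5, 0.35, 0.8), (0.7, 0.5, 1.1)])
print(grid.shape)  # (27, 3)
```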
  • FIG. 7 shows schematic views of automatically generated calibration poses, which are characterized by different positions and orientations x'-y'-z' relative to the coordinate system of the sensor device 2.
  • FIG. 8 shows a schematic view of the sensor device of the automation system 1 illustrated in FIG. 2 to explain the determination of whether the calibration object is at least partially outside the field of view of the sensor device in a calibration pose.
  • The calibration object 6 shown is located in the edge area, outside the field of view 5 of the sensor device 2.
  • The calibration pose shown is therefore discarded.
  • FIG. 9 shows a schematic oblique view of the automation system 1 illustrated in FIG. 2 with a calculated collision-free path 10 on which the calibration poses p1 to p4 lie.
  • FIG. 10 shows a flow chart of a method for determining a pose of a sensor device 2 of an automation system 1.
  • The method can be carried out with an automation system 1 as described above.
  • the automation system 1 includes a processing device 3 with at least one movable component 31 for processing workpieces.
  • In a first method step S1, an initial approximate pose of the sensor device 2 is specified.
  • In a method step S2, a large number of calibration poses p1 to p4 are specified, into which a calibration object 6 is to be moved.
  • In a method step S3, unsuitable calibration poses p1 to p4 are identified and discarded.
  • The detection of unsuitable calibration poses includes determining whether the calibration object 6 in the calibration pose p1 to p4 is at least partially outside a field of view of the sensor device 2 and/or is at least partially covered by components of the automation system 1 and/or is at least partially unrecognizable by the sensor device 2.
  • A three-dimensional geometric model of at least part of the processing device can be used to determine whether the calibration object 6 is at least partially covered by components of the automation system 1 in the calibration pose p1 to p4.
  • The orientation of the calibration object 6 and/or lighting conditions can be taken into account when determining whether the calibration object 6 is at least partially unrecognizable by the sensor device in the calibration pose p1 to p4.
  • In a method step S4, a collision-free path 10 is calculated in order to move the calibration object 6 into the calibration poses p1 to p4 by means of the at least one movable component 31 of the processing device 3.
  • The collision-free path 10 can be calculated using a three-dimensional geometric model of at least part of the processing device 3.
  • In a method step S5, the calibration object 6 is moved along the calculated path into the calibration poses p1 to p4 using the at least one movable component 31 of the processing device 3, with the sensor device 2 generating sensor data for each calibration pose p1 to p4.
  • In a method step S6, the pose of the sensor device 2 is determined based on the generated sensor data, in order to thereby calibrate the sensor device 2.
  • The pose of the sensor device 2 can be determined using a known hand-eye calibration method.
  • The determination of the pose of the sensor device 2 (steps S2 to S6) can be carried out iteratively.
  • In the first pass, the detection and rejection of unsuitable calibration poses and the calculation of the collision-free path take place as a function of the initial approximate pose of the sensor device.
  • In each subsequent iteration step, unsuitable calibration poses p1 to p4 are recognized and discarded, and the collision-free path 10 is calculated as a function of the pose of the sensor device 2 determined in the previous iteration step.
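  • Taken together, steps S1 to S6 can be summarized as the following loop; this is a structural sketch only, and every callable passed in is a stand-in for the corresponding step described above.

```python
def determine_sensor_pose(initial_pose, generate_poses, pose_is_valid,
                          plan_path, move_and_capture, calibrate,
                          iterations=3):
    """Iterative determination of the sensor pose (steps S2 to S6),
    starting from the initial approximate pose of step S1."""
    sensor_pose = initial_pose                                          # S1
    for _ in range(iterations):
        candidates = generate_poses(sensor_pose)                        # S2
        valid = [p for p in candidates if pose_is_valid(p, sensor_pose)]  # S3
        path = plan_path(valid, sensor_pose)                            # S4
        data = move_and_capture(path, valid)                            # S5
        sensor_pose = calibrate(data)                                   # S6
    return sensor_pose
```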

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Length Measuring Devices With Unspecified Measuring Means (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to a method for determining a pose of a sensor device of an automation system, the automation system further comprising a processing device having at least one movable component for processing workpieces. A plurality of calibration poses into which a calibration object is to be moved is specified. In addition, unsuitable calibration poses are detected and rejected. The detection of unsuitable calibration poses comprises determining whether the calibration object in the calibration pose is at least partially outside a field of view of the sensor device and/or is at least partially covered by components of the automation system and/or is at least partially unrecognizable by the sensor device. A collision-free path is calculated in order to move the calibration object into the calibration poses by means of said movable component of the processing device. The calibration object is moved along the calculated path into the calibration poses by means of said movable component of the processing device, the sensor device generating sensor data for each calibration pose. The pose of the sensor device is determined on the basis of the generated sensor data.
PCT/EP2023/050431 2022-03-03 2023-01-10 Automation system and method for determining a pose of a sensor device of an automation system WO2023165746A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102022202152.5 2022-03-03
DE102022202152.5A DE102022202152A1 (de) 2022-03-03 2022-03-03 Automatisierungssystem und Verfahren zum Ermitteln einer Pose einer Sensorvorrichtung eines Automatisierungssystems

Publications (1)

Publication Number Publication Date
WO2023165746A1 (fr)

Family

ID=84981013

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2023/050431 WO2023165746A1 (fr) Automation system and method for determining a pose of a sensor device of an automation system

Country Status (2)

Country Link
DE (1) DE102022202152A1 (de)
WO (1) WO2023165746A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102015226741A1 (de) 2015-12-28 2017-06-29 Robert Bosch Gmbh Erkennungsvorrichtung und Zuführvorrichtung
DE102016204174A1 (de) 2016-03-14 2017-09-14 Robert Bosch Gmbh Automatisierungsanlage mit mindestens einem Funktionsmodul
WO2021211420A1 (fr) * 2020-04-14 2021-10-21 Realtime Robotics, Inc. Configuration d'un environnement fonctionnel de robot comprenant l'agencement de capteurs

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Justinas Miseikis et al., "Automatic Calibration of a Robot Manipulator and Multi 3D Camera System", arXiv, 7 January 2016, XP081152121, DOI: 10.1109/SII.2016.7844087 *

Also Published As

Publication number Publication date
DE102022202152A1 (de) 2023-09-07

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23700192

Country of ref document: EP

Kind code of ref document: A1