EP3426444A1 - Industrial robot having at least two image acquisition devices - Google Patents
Industrial robot having at least two image acquisition devices
- Publication number
- EP3426444A1 (application EP17707515.7A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- robot
- image
- arm
- robot controller
- industrial robot
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
- B25J9/1692—Calibration of manipulator
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1674—Programme controls characterised by safety, monitoring, diagnostic
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/37—Measurements
- G05B2219/37555—Camera detects orientation, position workpiece, points of workpiece
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39057—Hand eye calibration, eye, camera on hand, end effector
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39082—Collision, real time collision avoidance
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40201—Detect contact, collision with human
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40557—Tracking a tool, compute 3-D position relative to camera
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40591—At least three cameras, for tracking, general overview and underview
Definitions
- the invention relates to an industrial robot comprising a robot arm having a plurality of links which are connected by joints, and having a robot controller which is configured to adjust the links of the robot arm relative to one another automatically in accordance with a robot program or in a manual drive mode.
- a robot which has a device, attached to the robot arm or integrated in the robot arm, for taking an image of an image projected onto the surface of an object, wherein a data processing device is adapted to analyze an image data set associated with an image of the projected image taken with that device.
- the recording takes place by means of a camera attached to the robot arm or integrated in the robot arm, wherein the projected image represents virtual input means.
- the object of the invention is to provide an industrial robot which can be operated reliably and safely by means of cost-effective and/or simple sensor devices.
- an industrial robot comprising a robot arm having a plurality of links which are connected by joints, and a robot controller which is designed to adjust the links of the robot arm relative to one another automatically in accordance with a robot program or in a manual drive mode, wherein at least one of the plurality of links carries at least two image capture devices, each of which is calibrated with respect to its spatial position and orientation relative to the link to which it is attached.
- Industrial robots are working machines that can be equipped with tools for the automatic handling and / or machining of objects, such as workpieces, and that are programmable by means of their joints in a plurality of axes of movement, for example with regard to orientation, position and workflow, in order to handle the tool.
- the industrial robot comprises the robot arm and a programmable robot controller (control device) which controls or regulates the movements of the industrial robot during operation: one or more automatically or manually adjustable joints (robot axes) are moved by drives, in particular electric drives or motors, and the robot controller commands the drives automatically in accordance with a robot program or in a manual drive mode.
- the robot controller thus serves to move the links of the robot arm by driving the drives of the robot arm.
- the drives in turn move the axes, i.e. the joints of the robot arm.
- two adjacent links of the robot arm can be adjustably connected to each other by a single joint, which represents the respective axis and is moved by an associated drive.
- Robot arms may comprise, among other things, a frame and a carousel rotatably mounted relative to the frame by means of a joint, on which a rocker is pivotally mounted by means of a further joint.
- an arm extension can be pivotally mounted on the rocker by means of a further joint.
- the arm extension carries a robot hand, and the arm extension and/or the robot hand may have one or more additional joints.
- One, several or all joints of the robot arm can be designed as rotary joints.
- the robot arm, whose links are connected via a plurality of joints, may be configured as an articulated robot having a plurality of serially arranged links and joints; in particular, the robot arm may be designed as a six-axis articulated robot.
- One, several or all joints of the articulated robot can be designed as rotary joints.
- the robot controller may be configured to adjust the links of the robot arm relative to one another, automatically in accordance with the robot program or in a manual drive mode, with a predetermined positioning accuracy of the adjustable joints, wherein the at least two image capture devices are measured, with respect to their spatial position and orientation relative to the link to which they are fixed, at least with an accuracy corresponding to the positioning accuracy of the joints.
- a positioning accuracy of the adjustable joints is understood in this context in particular as a measure of how exactly a desired, in particular programmed, setpoint position and/or orientation of a particular reference point of the robot arm, for example a tool reference point (also called the TCP), can actually be achieved.
- after adjustment of the joints, the robot arm comes to rest in an actual pose and there has an actual position and an actual orientation in space, which may differ from the desired target position and target orientation.
- the maximum deviation of this actual position and actual orientation from the desired position and desired orientation provides the measure of positioning accuracy.
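The measure described above, the maximum deviation between a commanded pose and the actually achieved pose over repeated approaches, can be sketched numerically. The helper names and values below are illustrative and consider positions only, not orientations:

```python
import math

def position_error(target_xyz, actual_xyz):
    """Euclidean distance between a commanded and an achieved TCP position."""
    return math.dist(target_xyz, actual_xyz)

def positioning_accuracy(pose_pairs):
    """Maximum position deviation over repeated approaches of the same
    programmed pose; this maximum is the measure of positioning accuracy."""
    return max(position_error(t, a) for t, a in pose_pairs)

# Hypothetical repeated approaches of one programmed TCP position (metres):
pairs = [((0.5, 0.0, 0.8), (0.5002, 0.0001, 0.7999)),
         ((0.5, 0.0, 0.8), (0.4999, -0.0003, 0.8001))]
acc = positioning_accuracy(pairs)   # worst-case deviation, about 0.33 mm
```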
- the image capture device is generally designed to capture an image of the surroundings, for example of the working space of the industrial robot or of a protection space of the industrial robot. By means of the image capture device, the image is captured, forwarded and/or stored in the form of image information.
- the image capture device can for this purpose have at least one image sensor.
- the image capture device may in particular be a digital camera.
- the image capture device may include a camera chip with an associated optical lens. Such camera chips are generally known and are used, for example, in mobile telephones, in particular in smartphones.
- a calibration of the image capture device means that the image capture device is rigidly attached to the associated link of the robot arm and the exact position and orientation of the image capture device relative to that link are then determined, in particular measured and stored. Such a measurement takes place at least with an accuracy corresponding to the positioning accuracy of the joints. This means that the relative position and orientation of the image capture device on the associated link is measured and stored at least as accurately as the industrial robot can position its robot arm, i.e. can assume its pose or the pose of a tool reference point.
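A minimal sketch of what measuring and storing such a calibration can amount to, under the simplifying assumption that only the translational offset matters and all orientations are identity (all names and values below are hypothetical, not from the patent):

```python
def calibrate(base_pos_camera, base_pos_link):
    """Determine the camera's offset relative to its link from an external
    measurement of the camera position and the encoder-derived link
    position, both expressed in the robot base frame (translations only,
    identity orientations, for brevity)."""
    return tuple(c - l for c, l in zip(base_pos_camera, base_pos_link))

# Hypothetical measurement: camera at (0.05, 0.0, 0.62) in the base frame,
# link origin at (0.0, 0.0, 0.5); the difference is stored once and reused.
link_offset_camera = calibrate((0.05, 0.0, 0.62), (0.0, 0.0, 0.5))
```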
- Because at least one of the plurality of links carries at least two image capture devices, each of which is calibrated with respect to its spatial position and orientation relative to the link to which it is fixed, at least with an accuracy corresponding to the positioning accuracy of the joints,
- the actual spatial position and orientation of the relevant image capture device can be determined solely from the known joint positions of the robot arm. If the position and orientation of the image capture device with respect to the associated link is known, its position and orientation in space can be determined from the position and orientation of the respective link in space, which is known from the joint positions of the robot arm, namely with an accuracy corresponding to the positioning accuracy of the joints.
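The composition described above, the link pose from the joint positions multiplied by the stored camera-to-link calibration, can be sketched with homogeneous transforms. A single revolute joint stands in for the full kinematic chain; all values are illustrative:

```python
import math

def mat_mul(A, B):
    """Multiply two 4x4 homogeneous transforms given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def transform_z(angle, tx, ty, tz):
    """Homogeneous transform: rotation about z by `angle` plus translation."""
    c, s = math.cos(angle), math.sin(angle)
    return [[c, -s, 0.0, tx],
            [s,  c, 0.0, ty],
            [0.0, 0.0, 1.0, tz],
            [0.0, 0.0, 0.0, 1.0]]

# base_T_link: pose of the camera-carrying link in space, known from the
# joint positions (one revolute joint here for brevity).
base_T_link = transform_z(math.pi / 2, 0.0, 0.0, 0.4)
# link_T_cam: the stored calibration of the camera relative to its link.
link_T_cam = transform_z(0.0, 0.1, 0.0, 0.0)
# The camera pose in space is the composition of the two transforms:
base_T_cam = mat_mul(base_T_link, link_T_cam)
```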
- Each link that is equipped with image capture devices may have two or more image capture devices. These multiple image capture devices of a single link may be attached to the link so that they each cover different spatial sectors.
- Image capture devices need not be attached to only a single link of the robot arm; rather, two or more links of the robot arm, in particular all links of the robot arm, may each have two or more image capture devices.
- the plurality of image acquisition devices, such as cameras, should be integrated directly into the robot structure at strategically important locations and thus made available to a user, application developer or robot programmer. The cameras are already measured and adjusted on the robot, so that the user can use them directly, i.e. without further measurement.
- if at least two camera chips capture the same object, such as a workpiece or the tool of the industrial robot, from different points, stereo functions in particular can then be realized by means of the image acquisition devices.
- the placement of the image capture devices may preferably be close to the last, i.e. distal, axis in order to benefit, with respect to the possible movements of the cameras, from the degrees of freedom of the axes located upstream in the kinematic chain.
- the image capture devices should be located far enough away from a tool or the flange of the robotic arm so as not to be obscured by the tool on the flange.
- An accommodation on the robot hand, for example of a six-axis articulated robot, may be appropriate.
- the missing degrees of freedom, for example of the last one or two axes, can then be compensated for.
- an application program can also allow the development of own applications using the image information obtained by the image acquisition devices. Possible applications can be, for example:
- the link may include a link housing surface defining the outer shape of the link and a receiving space extending inward from the link housing surface, within which the image capture device is fully received.
- the link housing surface may be the outer jacket wall of a structural component, in particular a hollow one, which forms the respective link of the robot arm.
- the structural component can be designed, in particular, to transmit all forces and moments necessary for carrying the robot arm itself and for carrying and/or moving the tool to be handled by the robot arm. Reaction forces and reaction moments which occur through interaction of tool and workpiece are also transmitted via the respective structural component.
- the receiving space can be formed by a recess or an opening in the particular hollow structural component.
- a separate receiving space may be provided. In the receiving space, the respective image capture device is attached.
- the image capture device can be screwed, clamped or glued to the link within the receiving space.
- the image capture device may include an entrance window through which light rays enter the image capture device from outside in order to capture an image of the environment of the industrial robot within the image capture device, the entrance window having a window surface which either terminates flush with the link housing surface of the link on which the image capture device is attached, or is set back relative to the link housing surface.
- the entrance window can be formed for example by an optical lens of the image capture device.
- the entrance window may also be a light-permeable protective cover which covers the image acquisition device, in particular its optical lens.
- An outer surface of the entrance window can be flush with the link housing surface or set back relative to the link housing surface.
- the link may comprise at least three image capture devices arranged distributed over a circumference of the link.
- the environment can thus be optically detected completely over an angle of 360 degrees around the circumference of the link, without the link carrying the image capture devices having to be adjusted.
- the robot arm may be designed as an articulated robot in which the plurality of links are arranged one after the other in a kinematic chain, wherein
- a proximal end member of the links forms a base frame, which is designed for fastening the articulated robot to a foundation
- a distal end member of the links forms a hand flange which is configured for attachment of a tool to be handled by the articulated robot
- the image capture devices are located on a link which is immediately upstream of the distal end link in the kinematic chain.
- If the image capture devices are arranged on a link which is immediately upstream of the distal end link in the kinematic chain, all joints upstream in the kinematic chain can be used to align the positions and orientations of the image capture devices in space by adjusting those joints. In this way the image capture devices can occupy the largest number of different poses in space.
- the image capture device may have an image sensor connected to the robot controller, such that image information captured by the image sensor is transmitted to the robot controller, and the robot controller has a processing device configured to detect the joint position values of the links of the robot arm assumed at the particular time of acquisition of the image information and to assign these joint position values to the image information. Accordingly, the processing device can store the image information acquired by the image capture devices and assign to each item the joint position values of the links of the robot arm assumed at that moment. For this purpose, the processing device can use the joint position values assumed at the time of creation of the image information.
- the stored joint position values allow the image information to be assigned to an individual view.
- image information and related joint position values can thus be compared with one another and evaluated.
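One way to realize the assignment of image information to the joint position values current at acquisition time is a simple tagged record, as sketched below (the sensor stubs and field names are hypothetical, not from the patent):

```python
import time
from dataclasses import dataclass, field

@dataclass
class TaggedImage:
    """Image information together with the joint position values that the
    links of the robot arm held at the moment of acquisition."""
    pixels: bytes
    joint_values: tuple      # one angle per joint, in radians
    timestamp: float = field(default_factory=time.time)

log = []

def capture(read_image_sensor, read_joint_encoders):
    """Read the image sensor and the joint encoders at the same instant
    and store them together, so each image can later be evaluated against
    the pose in which it was taken."""
    log.append(TaggedImage(read_image_sensor(), tuple(read_joint_encoders())))

# Hypothetical sensor stubs standing in for the real interfaces:
capture(lambda: b"\x00" * 16, lambda: [0.0, -1.2, 1.0, 0.0, 0.5, 0.0])
```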
- the image capture device may have an image sensor connected to the robot controller such that image information captured by the image sensor is transmitted to the robot controller, and the robot controller has a processing device configured to determine the joint position values of the links of the robot arm assumed at the particular time of image acquisition, to determine from these joint position values the position and orientation of the image capture device with respect to the working space of the industrial robot, and to associate this position and orientation of the image capture device in the working space with the associated image information.
- the robot controller or the processing device may be configured to execute a safety-related function of the industrial robot on the basis of image information obtained from the image capture devices and based on spatial locations of the image capture devices associated with the image information.
- the processing device may optionally be part of the robot controller or be designed as a separate processing device separate from the robot controller.
- the processing device may be partially or completely implemented as hardware and / or software.
- If the image acquisition devices, in particular cameras, integrated into the robot structure are used for safety functions, basically at least two cameras with at least one overlapping region are to be used.
- safety-related functions are possible in principle, since the object, for example a reference marker, is detected by two independent image capture devices or cameras that record the object under different viewing angles.
- markers can be any clearly identifiable pattern or object. However, to ensure high availability, the pattern should include redundancy that allows troubleshooting.
- 2D codes, such as QR codes, are available for this purpose. Colored codes are also possible.
- the robot controller or the processing device can be configured in a first embodiment to move the robot arm into an arm position in which at least one of the image capture devices detects an optical marker, separate from the robot arm, at a previously known, stored position and orientation in space, and generates image data from this optical marker; based on an evaluation of the generated image data and the previously known, stored position and orientation of the optical marker, a current joint position of at least one of the joints of the robot arm is calculated.
- a reference marker mounted within the working area is located in the two-channel detection range of the cameras.
- the position of the marker is known to two cameras in a failsafe manner. Since the position of the cameras on the robot is known, the position of the robot up to the axis on which the cameras are attached can be verified using the safely configured position of the marker. If, in addition, the position of the remaining axes is also to be verified, a marker, in particular one with a different identifier, can also be attached at the tool reference point (TCP) and moved into the field of view of the cameras.
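The two-channel verification described above can be sketched as follows: each camera independently estimates the position of the fixed reference marker, and both estimates must agree with the safely stored position within a tolerance (function names, tolerance and coordinates are illustrative assumptions):

```python
def verify_axes(marker_stored_xyz, camera_estimates, tolerance=0.005):
    """Two-channel check: every camera's independent estimate of the fixed
    reference marker must agree with the stored marker position; a
    disagreement indicates a fault in the axes up to the camera link."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return all(dist(est, marker_stored_xyz) <= tolerance
               for est in camera_estimates)

# Hypothetical estimates from camera 1 and camera 2 (metres):
ok = verify_axes((1.0, 0.2, 0.0), [(1.001, 0.199, 0.0), (0.999, 0.201, 0.001)])
# A faulty axis would shift one camera's estimate beyond the tolerance:
bad = verify_axes((1.0, 0.2, 0.0), [(1.001, 0.199, 0.0), (1.05, 0.2, 0.0)])
```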
- TCP tool reference point
- the robot controller or the processing device may be designed, in an alternative or supplementary second embodiment, to move the robot arm into an arm position in which at least one of the image capture devices optically detects a tool handled by the robot, generates image data from this tool and sends it to the robot controller or the processing device for further evaluation. If, for example, the work spaces of a robot with a tool changer are to be monitored safely, the problem arises that it must also be prevented that the received tool violates the work space. With a tool changer, however, this is only possible if the tool is detected securely.
- analogously to a marker on the flange of the robot arm, the tool can carry a marker, or several markers can be placed such that the tool attached to the flange is compared with the expected tool and the geometry of the tool can thus be taken into account. Several markers can ensure that the tool is detected regardless of the position of the flange of the robot arm. If a wrong or missing tool is detected, the robot can be stopped, in particular safely switched off.
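The tool check can be sketched as a comparison of the marker identifiers decoded by the cameras against the expected tool; the identifiers and return values below are illustrative, not from the patent:

```python
def check_tool(decoded_marker_ids, expected_tool):
    """Compare the tool markers read by the cameras with the expected tool;
    a wrong or missing tool leads to a stop, as described above."""
    return "RUN" if expected_tool in decoded_marker_ids else "SAFETY_STOP"

# Hypothetical marker reads from the cameras:
state_ok = check_tool({"gripper-A", "flange"}, "gripper-A")
state_bad = check_tool({"welder-B"}, "gripper-A")
```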
- the robot controller or the processing device may be designed, in an alternative or supplementary third embodiment, to move the robot arm into an arm position in which at least one of the image capture devices optically detects a workpiece to be processed by the robot arm, generates image data from this workpiece and transmits it to the robot controller or the processing device for further evaluation.
- the workpiece can also be detected safely, provided that a marker is attached within the field of view of the camera or the workpiece itself can be identified by the camera. If geometrically different workpieces can be processed by the robot, this can also be used to monitor a violation of the work space by the workpiece.
- the robot controller or the processing device can be configured in an alternative or supplementary fourth embodiment to move the robot arm into at least one arm position in which at least one of the image acquisition devices at least partially or completely optically detects the working space of the industrial robot, generates image data from this work space and transmits it to the robot controller or the processing device for further evaluation. If working spaces in which the robot is allowed to operate, or protection spaces into which the robot is not allowed to enter, are to be programmed on a robot system, this can be done by an optical teach-in. With an object to which one or more markers are applied, one first shows the robot a corner of a working space or a protection space. Subsequently, one shows the robot a second, opposite corner of the working space or the protection space.
- the robot system can detect the position and the angle of the markers both times by means of the image capture devices and thus establish the work space or the protection space.
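In the simplest case, the optical teach-in of two diagonally opposite corners reduces to spanning an axis-aligned box between the two taught marker positions. This is a simplification for illustration; the patent does not restrict the taught space to boxes:

```python
def box_from_corners(corner_a, corner_b):
    """Axis-aligned working or protection space spanned by two taught,
    diagonally opposite corner points."""
    low = tuple(min(a, b) for a, b in zip(corner_a, corner_b))
    high = tuple(max(a, b) for a, b in zip(corner_a, corner_b))
    return low, high

def inside(point, box):
    """True if the point lies within the taught space."""
    low, high = box
    return all(l <= p <= h for p, l, h in zip(point, low, high))

# Hypothetical taught corners (metres, robot base frame):
workspace = box_from_corners((0.2, -0.5, 0.0), (1.0, 0.5, 0.8))
```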
- convex or concave structures may be suitable for teaching.
- the robot system recognizes the area defined by the teach-in as virtual objects and therefore at least knows their fundamental geometric shape.
- the robot controller or the processing device can be configured in an alternative or supplementary fifth embodiment to move the robot arm into at least one arm position in which at least one of the image capture devices at least partially or completely optically captures a protection space separate from the working space of the industrial robot, generates image data from this protection space and transmits it to the robot controller or the processing device for further evaluation.
- the image capture devices can also recognize a human hand gesture, for example. As long as the hand is held in this position, the consent is considered granted. If the hand disappears from the field of view of the image acquisition devices, in particular cameras, or the hand signal changes, the consent becomes invalid.
- a special glove with applied markers can be used.
- other hand gestures or gestures, including with the second hand, may of course be used to operate other safety or non-safety functions of the robot, such as a start button, jog mode, or direct direction specifications for movements.
- for example, a travel gesture may instruct the robot to drive slowly to the left.
- the travel speed can also be adjusted depending on the position of the hand.
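A position-dependent jog speed derived from the detected hand can be sketched as a simple mapping; the normalisation, dead band and speed limit below are assumptions, not values from the patent:

```python
def jog_speed(hand_x, dead_band=0.05, max_speed=0.25):
    """Map the detected hand position (normalised to -1..1 relative to the
    centre of the cameras' field of view) to a signed jog speed in m/s.
    A lost hand (None) or a hand near the centre commands standstill."""
    if hand_x is None:          # hand left the field of view: stop
        return 0.0
    if abs(hand_x) < dead_band:
        return 0.0
    return max(-max_speed, min(max_speed, hand_x * max_speed))

speed = jog_speed(-0.8)         # hand to the left: slow leftward travel
```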
- the robot controller or the processing device can be configured in an alternative or supplementary sixth embodiment to move the robot arm into at least one arm position in which at least one of the image capture devices optically detects a person or an identification device associated with a particular person and generates image data therefrom, and the image data is compared by the robot controller or the processing device with stored identification features.
- FIG. 1 is a perspective view of an industrial robot having a robot arm, a robot controller, and a plurality of image capturing devices,
- FIG. 2 shows a schematic sectional view in the axial direction on an arrangement of eight image acquisition devices of an exemplary member of the robot arm
- FIG. 3 is a schematic sectional view in the axial direction of an arrangement of four image sensing devices of an exemplary member of the robot arm,
- FIG. 4 is a schematic sectional view in the axial direction of an arrangement of two image sensing devices of an exemplary member of the robot arm
- FIG. 5 is a schematic sectional view in the axial direction of an arrangement of an image sensing device of an exemplary member of the robot arm
- FIG. 6 is a schematic sectional view in the axial direction of an arrangement of two image sensing devices of an exemplary member of the robot arm with an overlap region
- Fig. 9 is a schematic representation of a robot arm with image capture devices that capture a reference mark,
- Fig. 10 is a schematic representation of the robot arm with image capture devices that detect a body having three orthogonally arranged reference marks, in order to detect a first boundary of a protection space, and
- Fig. 11 is a schematic representation of the robot arm with image capture devices which detect the body with three orthogonally arranged reference marks, in order to detect a second boundary of the protection space.
- FIG. 1 shows a robot 1 having a robot arm 2 and a robot controller 13.
- the robot arm 2 comprises a plurality of links 12 arranged one after the other and connected by joints 11.
- the links 12 comprise, in particular, a frame 3 and a carousel 4 rotatably mounted relative to the frame 3 about a vertically extending axis A1.
- Further links of the robot arm 2 are, in the present embodiment, a rocker 5, an arm extension 6 and a preferably multi-axis robot hand 7 with a fastening device 15 designed as a flange 8 for attaching an end effector (tool), not shown.
- the rocker 5 is pivotally mounted at its lower end, for example on a rocker bearing head (not shown) on the carousel 4, about a preferably horizontal axis of rotation A2.
- the arm extension 6 is pivotably mounted about a likewise preferably horizontal axis A3.
- this carries the robot hand 7 with its preferably three axes of rotation A4, A5, A6.
- the arm extension 6 has a first housing component 9 mounted pivotably on the rocker 5. On the first housing part 9, a second housing part 10 of the arm extension 6 is rotatably mounted about the axis A4.
- the robot 1 has a robot controller 13 and a robot arm 2 with a plurality of links 12 connected by joints 11, which are drive-controlled by drive motors of the robot 1 coupled to the joints 11, automatically in accordance with a robot program executed by the robot controller 13 or in a manual drive mode of the robot 1, in order to change the configuration of the robot arm 2. Among the links 12 there is, for example, an intermediate link 12.3 which is rotatably connected via a first pivot joint 11.1 to a first link 12.1 upstream of the intermediate link 12.3 in the kinematic chain of the robot arm 2, and which is rotatably connected via a second pivot joint 11.2 to a second link 12.2 downstream of the intermediate link 12.3 in the kinematic chain of the robot arm 2, the axis of rotation of the first pivot joint 11.1 being aligned orthogonally to the axis of rotation of the second pivot joint 11.2.
- Fig. 1 accordingly shows an industrial robot 1 comprising a robot arm 2 with a plurality of links 12 connected by joints 11, and a robot controller 13 which is designed to adjust the links 12 of the robot arm 2 relative to one another in accordance with a robot program automatically or in a manual drive mode, in particular with a predetermined positioning accuracy of the adjustable joints 11, wherein at least one of the plurality of links 12 has at least two image capture devices K1, K2 and/or K3, K4, each of which is calibrated with respect to its spatial position and orientation relative to the link 12 to which it is attached, in particular measured at least with an accuracy corresponding to the positioning accuracy of the joints 11.
- the exemplary link 12.1 has a link housing surface 20 defining the outer shape of the link 12.1 and a receiving space extending inward from the link housing surface 20, within which the image capture devices K1, K2 are fully received.
- the image capture devices K1, K2 each have, as shown in Fig. 7 and Fig. 8, an entrance window 19 through which light rays can enter the image capture device K1 from outside in order to capture an image of the environment of the industrial robot 1 within the image capture device K1, the entrance window 19 having a window surface 19a which, as shown in Fig. 7, closes flush with the link housing surface 20 of the link 12.1 on which the image capture device K1 is attached.
- the link 12.1 has eight image capture devices K1 to K8 distributed over a circumference of the link 12.1.
- FIGS. 3 and 4 show that, outside a certain minimum distance (outer circle 21), complete coverage already exists with four image capture devices K1, K2, K3 and K4.
- With the additional four image capture devices K2, K4, K6 and K8 according to Fig. 2, a double coverage of each point is achieved in an overlap region 23, so that requirements of safety technology can be met. If double coverage is only required over a certain angle range, for example in order to detect an adjustment reference marker, then the four image acquisition devices K1, K2, K3 and K4 are sufficient, since they too already have sufficient overlap.
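Whether n evenly distributed cameras reach single or double coverage of the circumference can be estimated with a simple planar model, valid outside the minimum distance mentioned above. The field-of-view value below is an assumption for illustration:

```python
def coverage_factor(num_cameras, fov_deg):
    """How many times over n evenly spaced cameras cover the full 360
    degrees in a simplified planar model (>= 1: full single coverage,
    >= 2: double coverage everywhere)."""
    return num_cameras * fov_deg / 360.0

# With an assumed 95-degree lens, four cameras give single coverage and
# eight cameras give double coverage, matching the arrangement of Figs. 2-3:
single_ok = coverage_factor(4, 95) >= 1.0
double_ok = coverage_factor(8, 95) >= 2.0
double_with_four = coverage_factor(4, 95) >= 2.0
```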
- In the variant according to Fig. 3, the link 12.1 has only four image capture devices distributed over a circumference of the link 12.1.
- the member 12.1 has two image detection devices K1 and K2 distributed over a circumference of the member 12.1.
- at least one further image acquisition device K2 can be arranged, which is not shown in detail.
- In FIGS. 4 and 5, a cover at the top is dispensed with.
- the number of at least necessary image capture devices K1 and K2 can be reduced to two.
- Fig. 6 indicates how two image capture devices K1 and K2 already form a common overlap region 23 in which, owing to the redundant optical detection by the two image capture devices K1 and K2, monitoring in safe (fail-safe) technology can be performed.
- As shown in Figs. 7 and 8, the representative image capture device K1 is associated with an entrance window 19 through which light rays can enter the image capture device K1 from outside in order to capture an image of the environment of the industrial robot 1 within the image capture device K1.
- In the variant according to Fig. 7, the entrance window 19 has a window surface 19a which closes flush with the link housing surface 20 of the link 12 or of the hand member 7 to which the image capture device K1 is attached.
- In the variant according to Fig. 8, the entrance window 19 has a window surface 19a which is set back from the link housing surface 20 of the link 12 or of the hand member 7.
- the entrance window 19 can be formed, for example, by an optical lens of the image capture device K1.
- the entrance window 19 may also be a translucent protective cover which covers the image capturing device K1, in particular its optical lens.
- According to Fig. 1 and Figs. 9 to 11, the robot arm 2 is formed as an articulated robot 2a, wherein the plurality of links 12 are arranged successively in a kinematic chain and each two adjacent links 12 are adjustably connected to one another by one of the joints 11. A proximal end member G1 of the links 12 forms a base frame, which is designed to fasten the articulated robot 2a to a foundation, and a distal end member G7 of the links 12 forms a hand flange for attaching a tool to be manipulated by the articulated robot 2a. The image capture devices K1, K2 are arranged on a member G6 which immediately precedes the distal end member G7 in the kinematic chain.
- The image capture device K1, K2 may have an image sensor connected to the robot controller 13, such that image information acquired by the image sensor is transmitted to the robot controller 13, and the robot controller 13 has a processing device 16 which is configured to record the joint position values of the links 12 of the robot arm 2 assumed at the capture time of the image information acquired by the image sensor, and to assign these joint position values of the links 12 of the robot arm 2 to the image information.
- The image capture device K1, K2 may have an image sensor connected to the robot controller 13, such that image information captured by the image sensor is transmitted to the robot controller 13, and the robot controller 13 has a processing device 16 which is configured to determine, at a particular time, from the joint position values of the links 12 of the robot arm 2, the position and orientation of the image capture device K1, K2 with respect to the working space of the industrial robot, and to assign this position and orientation of the image capture device K1, K2 in the working space to the associated image information.
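The pose assignment described above amounts to chaining the joint transforms of the kinematic chain and then applying the calibrated link-to-camera transform. The patent does not specify an implementation; the following is a minimal planar (2-D) sketch with hypothetical names, using 3x3 homogeneous transforms.

```python
import math

def mat_mul(a, b):
    """Multiply two 3x3 homogeneous planar transforms (nested lists)."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def joint_transform(theta, link_length):
    """Rotation by the joint angle, then translation along the rotated link."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, c * link_length],
            [s,  c, s * link_length],
            [0,  0, 1]]

IDENTITY = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]

def camera_pose(joint_angles, link_lengths, link_to_camera):
    """Chain the joint transforms of the kinematic chain, then apply the
    calibrated link-to-camera transform (cf. the calibration of K1, K2
    relative to their link)."""
    pose = IDENTITY
    for theta, length in zip(joint_angles, link_lengths):
        pose = mat_mul(pose, joint_transform(theta, length))
    return mat_mul(pose, link_to_camera)

def tag_image(image_info, joint_angles, link_lengths, link_to_camera):
    """Associate image data with the camera pose valid at capture time."""
    return {"image": image_info,
            "joint_values": list(joint_angles),
            "camera_pose": camera_pose(joint_angles, link_lengths, link_to_camera)}
```

With both joints at zero and unit link lengths, the camera sits two units along the x-axis; the same chaining extends directly to 4x4 transforms for the spatial case.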
- The robot controller 13 or the processing device 16 can be designed to perform a safety-related function of the industrial robot on the basis of image information obtained from the image capture devices K1, K2 and on the basis of the spatial positions of the image capture devices K1, K2 associated with that image information.
- As shown in Fig. 9, the robot controller 13 or the processing device 16 may be configured to move the robot arm 2 into an arm position in which at least one of the image capture devices K1, K2 detects an optical marker 24 arranged in a previously known, stored position and orientation in the room and generates image data from this optical marker 24, whereupon, on the basis of an evaluation of the generated image data and the previously known, stored position and orientation of the optical marker 24, a momentary joint position of at least one of the joints of the robot arm 2 is calculated.
- The robot controller 13 or the processing device 16 can be configured to move the robot arm 2 into an arm position in which at least one of the image capture devices K1, K2 optically detects a tool handled by the robot arm 2, generates image data from this tool, and transmits them to the robot controller 13 or the processing device 16 for further evaluation.
- The robot controller 13 or the processing device 16 can be configured to move the robot arm 2 into an arm position in which at least one of the image capture devices K1, K2 optically detects a workpiece to be processed by the robot arm 2, generates image data from this workpiece, and transmits them to the robot controller 13 or the processing device 16 for further evaluation.
- The robot controller 13 or the processing device 16 can be designed to move the robot arm 2 into at least one arm position in which at least one of the image capture devices K1, K2 at least partially or completely optically captures the working space 17 of the industrial robot 1, generates image data from this working space 17, and transmits them to the robot controller 13 or the processing device 16 for further evaluation.
- The robot controller 13 or the processing device 16 may be designed to move the robot arm 2 into at least one arm position in which at least one of the image capture devices K1, K2 at least partially or completely optically detects a protection space 18 separate from the working space 17 of the industrial robot 1, by means of a reference marker cube 25 with three orthogonal marks which define two diagonally opposite corners of the protection space 18, generates image data from this protection space 18, and transmits them to the robot controller 13 or the processing device 16 for further evaluation.
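Two diagonally opposite corners suffice to define an axis-aligned cuboid, which is why the reference marker cube 25 marking those corners fully determines the protection space 18. A minimal sketch with hypothetical names (axis-aligned geometry assumed; the patent does not prescribe a representation):

```python
def protection_space(corner_a, corner_b):
    """Axis-aligned cuboid spanned by two diagonally opposite corners,
    returned as (lower_corner, upper_corner) with per-axis min/max."""
    lo = tuple(min(a, b) for a, b in zip(corner_a, corner_b))
    hi = tuple(max(a, b) for a, b in zip(corner_a, corner_b))
    return lo, hi

def inside(point, space):
    """True if the point lies within the protection space (boundary inclusive)."""
    lo, hi = space
    return all(l <= p <= h for p, l, h in zip(point, lo, hi))
```

Such a point-in-box test is the elementary building block for checking whether, e.g., a detected object or a tool point lies within the monitored protection space.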
- The robot controller 13 or the processing device 16 may be designed to move the robot arm 2 into at least one arm position in which at least one of the image capture devices K1, K2 optically detects a person or an identification device associated with a particular person, image data are generated therefrom, and the image data are compared by the robot controller 13 or the processing device 16 with stored identification features.
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Manipulator (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102016203701.3A DE102016203701A1 (en) | 2016-03-07 | 2016-03-07 | Industrial robot with at least two image capture devices |
PCT/EP2017/054240 WO2017153180A1 (en) | 2016-03-07 | 2017-02-23 | Industrial robot having at least two image acquisition devices |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3426444A1 true EP3426444A1 (en) | 2019-01-16 |
Family
ID=58185512
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP17707515.7A Withdrawn EP3426444A1 (en) | 2016-03-07 | 2017-02-23 | Industrial robot having at least two image acquisition devices |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP3426444A1 (en) |
DE (1) | DE102016203701A1 (en) |
WO (1) | WO2017153180A1 (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6496335B2 (en) * | 2017-03-03 | 2019-04-03 | ファナック株式会社 | Robot system |
DE102017111885B4 (en) * | 2017-05-31 | 2019-06-27 | Sick Ag | Method and system for monitoring a machine |
JP6680752B2 (en) | 2017-11-28 | 2020-04-15 | ファナック株式会社 | Control device that limits the speed of the robot |
TWI711910B (en) * | 2018-03-19 | 2020-12-01 | 達明機器人股份有限公司 | Method for calibrating eye-to-hand camera of robot arm |
DE102018109329B4 (en) * | 2018-04-19 | 2019-12-05 | Gottfried Wilhelm Leibniz Universität Hannover | Multi-unit actuated kinematics, preferably robots, particularly preferably articulated robots |
DE102018208028A1 (en) * | 2018-05-23 | 2019-11-28 | Bayerische Motoren Werke Aktiengesellschaft | Robot device, tool for a robotic device and hand tool |
DE102019106423B3 (en) * | 2019-03-13 | 2020-06-25 | Sick Ag | Safety device for a robot |
DE102019217957A1 (en) * | 2019-11-21 | 2021-05-27 | Kuka Deutschland Gmbh | Method for braking control of at least one servo motor, robot and computer program product |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB9803364D0 (en) * | 1998-02-18 | 1998-04-15 | Armstrong Healthcare Ltd | Improvements in or relating to a method of an apparatus for registering a robot |
JP3782679B2 (en) * | 2001-05-09 | 2006-06-07 | ファナック株式会社 | Interference avoidance device |
DE102005046838A1 (en) * | 2005-09-29 | 2007-04-05 | Atec Pharmatechnik Gmbh | Device for lifting and turning containers in a clean room environment |
DE102007055204B4 (en) | 2007-11-19 | 2010-04-08 | Deutsches Zentrum für Luft- und Raumfahrt e.V. | Robot, medical workstation, and method of projecting an image onto the surface of an object |
DE102008042261B4 (en) * | 2008-09-22 | 2018-11-15 | Robert Bosch Gmbh | Method for the flexible handling of objects with a handling device and an arrangement for a handling device |
DE102008063081B4 (en) * | 2008-12-24 | 2014-10-23 | Gottfried Wilhelm Leibniz Universität Hannover | Securing device and method for operating a multi-unit machine |
JP2010152664A (en) * | 2008-12-25 | 2010-07-08 | Nissei Corp | Sensorless motor-driven robot using image |
US9393694B2 (en) * | 2010-05-14 | 2016-07-19 | Cognex Corporation | System and method for robust calibration between a machine vision system and a robot |
DE102010025601A1 (en) * | 2010-06-30 | 2012-01-05 | Deutsches Zentrum für Luft- und Raumfahrt e.V. | Method and device for determining structural parameters of a robot |
JP2013078825A (en) * | 2011-10-04 | 2013-05-02 | Yaskawa Electric Corp | Robot apparatus, robot system, and method for manufacturing workpiece |
CN111240269B (en) * | 2014-08-08 | 2023-12-08 | 机器人视觉科技股份有限公司 | Method and system for implementing sensor-based safety features for robotic units |
-
2016
- 2016-03-07 DE DE102016203701.3A patent/DE102016203701A1/en not_active Ceased
-
2017
- 2017-02-23 EP EP17707515.7A patent/EP3426444A1/en not_active Withdrawn
- 2017-02-23 WO PCT/EP2017/054240 patent/WO2017153180A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
WO2017153180A1 (en) | 2017-09-14 |
DE102016203701A1 (en) | 2017-09-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3426444A1 (en) | Industrial robot having at least two image acquisition devices | |
DE102019009313B4 (en) | Robot control, method and computer program using augmented reality and mixed reality | |
DE102010045752B4 (en) | Visual perception system and method for a humanoid robot | |
DE112011101730B4 (en) | System and method for robust calibration between an image processing system and a robot | |
EP2227356B1 (en) | Method and system for extremely precise positioning of at least one object in the end position of a space | |
EP1702727B1 (en) | Production assembly with a bending press, a manipulation device and a calibration device | |
DE102010023736B4 (en) | Robot system with problem detection function | |
DE102017128543B4 (en) | NOISE ZONE ADJUSTMENT DEVICE FOR A MOBILE ROBOT | |
DE102017209178B4 (en) | Method for determining the spatial position of a moving coordinate system, a measuring point of its sensor or an operating point of a tool in a robot | |
EP3324362B1 (en) | Method and device for commissioning a multi-axis system | |
EP2012208B1 (en) | Programmable hand tool | |
DE102013017007B4 (en) | Robot with a end manipulator arm with end effector and method for determining a force and torque input to an end effector of a robot | |
EP3650740B1 (en) | Safety system and method for monitoring a machine | |
EP2199036A2 (en) | Method and device for compensating a kinematic deviation | |
DE102010007025A1 (en) | Method for monitoring manipulator area, particularly for robot, involves presetting geometric limit of area to be monitored and monitoring object within predetermined limit by environment sensing unit | |
DE102019212452A1 (en) | Interference avoidance device and robot system | |
DE102018117829A1 (en) | Control unit for articulated robots | |
EP2566667A1 (en) | Handheld device and method for controlling and/or programming a manipulator | |
DE102015104582A1 (en) | Method for calibrating a robot at a work area and system for carrying out the method | |
DE102018124595B4 (en) | Device for detecting a position and attitude of an end effector of a robot | |
EP1468792A2 (en) | Method for robot calibration | |
EP3323565A1 (en) | Method and device for commissioning a multiple axis system | |
DE102018109329B4 (en) | Multi-unit actuated kinematics, preferably robots, particularly preferably articulated robots | |
DE10048952B4 (en) | Method and device for recording unknown spatial points in a work cell of a robot | |
DE102015104587A1 (en) | Method for calibrating a robot at a work area and system for carrying out the method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: UNKNOWN |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20180921 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20210607 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
18D | Application deemed to be withdrawn |
Effective date: 20211019 |
|
P01 | Opt-out of the competence of the unified patent court (upc) registered |
Effective date: 20230528 |