DE102009001894B4 - Robot system with 3D camera - Google Patents

Robot system with 3D camera

Info

Publication number
DE102009001894B4
Authority
DE
Germany
Prior art keywords
3d camera
camera
3d
articulated arm
robot system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
DE102009001894.8A
Other languages
German (de)
Other versions
DE102009001894A1 (en)
Inventor
Björn Biehler
Florian Forster
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pmdtechnologies AG
Original Assignee
Pmdtechnologies AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pmdtechnologies AG filed Critical Pmdtechnologies AG
Priority to DE102009001894.8A priority Critical patent/DE102009001894B4/en
Publication of DE102009001894A1 publication Critical patent/DE102009001894A1/en
Application granted granted Critical
Publication of DE102009001894B4 publication Critical patent/DE102009001894B4/en

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00: Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02: Sensing devices
    • B25J19/021: Optical sensing devices
    • B25J19/023: Optical sensing devices including video camera means
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1694: Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697: Vision controlled systems

Abstract

Robot system with at least one articulated arm (32), a manipulator (34) and a 3D camera (40), wherein the 3D camera determines distances on the basis of a light transit time, the light transit time information being obtained from the phase shift between an emitted and a received radiation, and wherein the 3D camera is arranged on an articulated arm (32) of the robot system; a second 3D camera is provided in addition to the 3D camera (40) arranged on the articulated arm (32), the second 3D camera covering at least part of the working area, characterized in that the second 3D camera is associated with a cooperating robot.

Description

  • The invention relates to a robot system and to a method for operating a robot system, in which a working area of the robot system is monitored by means of a movably arranged 3D camera.
  • EP 0 263 952 B1 discloses a robot system with movable manipulator arms to which an image recording and processing system comprising 2D and 3D sensor systems is assigned. The 2D sensor system is designed as a CCD camera and the 3D sensor system as a laser rangefinder with a deflection unit, operating as a laser radar. Both sensors are arranged on a central platform and monitor the working area of the robot. The objects to be manipulated are first measured for size using the laser radar; this information is used to set the distance and focal length of the CCD camera so that a meaningful 2D image is obtained. Object recognition and robot control are based on a joint evaluation of the 2D and 3D information.
  • US 2008/0082213 A1 discloses a robot system for picking workpieces, in which the position of the workpieces is first detected in an overview image and the three-dimensional position and orientation of the objects is then determined in more detail in close-up images. For the close-ups, a camera is arranged on a robot arm; for the overview shots, the robot arm is either moved to an overview position or a second, stationary camera is provided.
  • Time-of-flight cameras operating according to the PMD principle, as described in detail in DE 197 04 496 A1, are known in particular as 3D camera systems.
  • A disadvantage of central observation of the working area is that the CCD camera must be equipped with a zoom lens, for example, to improve the resolution. The resolution of the laser radar is limited essentially by the mechanical precision of the deflection unit and can be improved only with considerable effort. In addition, a central observation position has the disadvantage that structures behind occluded areas cannot be detected.
  • The object of the invention is to further develop image capture with regard to more reliable object detection and more precise robot control.
  • This object is advantageously achieved by the device according to the invention and the method according to the invention of the independent claims.
  • Advantageously, a robot system with at least one articulated arm, a manipulator and a 3D camera is provided, in which the 3D camera determines distances on the basis of a light transit time, the light transit time information being obtained from the phase shift between an emitted and a received radiation. The 3D camera is arranged on an articulated arm of the robot system. This has the advantage that the 3D camera arranged on the articulated arm can assume different observation positions in space, making it possible to capture the spatial environment or the working area of the robot system completely, even in detail.
  • The object is further achieved by the method according to the invention for said robot system, in which a global observation position is approached in a first step. The articulated arms are controlled in such a way that the 3D camera can capture a working area of the robot system essentially completely from the global observation position. The data acquired in this position are evaluated and a first 3D model of the working area is created. On the basis of the acquired and evaluated 3D data, it is checked whether the working area contains hidden areas, i.e. areas that cannot be seen from the global observation position. In particular, the 3D data can also be examined for unclear areas, preferably areas in which a structure and/or an object is recognized only ambiguously or with low probability. If hidden and/or unclear areas are present, local observation positions are determined and approached from which a view into the hidden areas appears possible and/or a better resolution of the unclear areas can be expected. The first 3D model created from the global observation position is then supplemented by the 3D data from the local observation positions. This procedure has the advantage that, in contrast to a fixed overview position, hidden or unclear areas can be analyzed in detail by approaching local observation positions, and the local observation data can be used to determine more precise path and control strategies for the robot system; a minimal sketch of this loop follows.
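  • The observe-evaluate-refine loop just described can be sketched in Python as follows; the robot and camera interfaces are injected as callables, and all names here are illustrative assumptions, not part of the patent or any specific robot API:

        def build_workspace_model(move_to, capture_cloud, find_gaps, plan_view,
                                  merge, global_pose, max_refinements=5):
            """Acquire a global view, then refine hidden or unclear regions locally."""
            move_to(global_pose)              # step 1: global observation position
            model = capture_cloud()           # first 3D model of the working area
            for _ in range(max_refinements):
                gaps = find_gaps(model)       # occluded or low-confidence regions
                if not gaps:
                    break                     # model is complete
                for region in gaps:
                    pose = plan_view(region, model)  # collision-free local view pose
                    move_to(pose)
                    model = merge(model, capture_cloud())
            return model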
  • The measures listed in the dependent claims permit advantageous refinements and improvements of the method and the device given in the independent claims.
  • Furthermore, it is advantageous to arrange the 3D camera such that the manipulator of the robot system lies in the field of view of the 3D camera. Such an arrangement not only allows the surrounding space to be captured, but also allows the operation of the manipulator to be monitored. In particular, it is possible, for example with a gripper, not only to monitor the approach to an object but also to control it, since the movement of the gripper in space is captured virtually continuously and the gripper can be returned to a desired position if the actual position deviates.
  • In addition, the 3D camera can be movably arranged on the articulated arm, so that further observation directions of the 3D camera can be set without changing the position of the articulated arm. This is particularly advantageous when the articulated arm is in an area with limited freedom of movement. The movable arrangement of the 3D camera thus allows the three-dimensional spatial situation to be captured, particularly in narrow and possibly difficult-to-see areas.
  • In a further embodiment, it is provided to arrange the 3D camera within the articulated arm or the manipulator. Typically, an articulated arm is not solid but constructed, for example, as a hollow tube or pipe structure. The 3D camera can then be arranged within such a structure. Such an arrangement has the advantage that the 3D camera is protected against external influences by the structure of the articulated arm. Since the manipulator is usually arranged at the end of the last articulated arm, the manipulator can also be captured, in addition to the working area, when the 3D camera is installed inside the articulated arm. Depending on the configuration of the manipulator, the 3D camera can also be part of the manipulator.
  • Furthermore, it is advantageous to provide a second 3D camera in addition to the 3D camera arranged on the articulated arm. This second 3D camera is arranged such that it covers at least a part of the working area of the robot system. The second 3D camera is preferably arranged in a stationary manner, for example on a stand which is either connected to the robot system or set up separately.
  • Furthermore, it is also possible to arrange this 3D camera movably in order to be able to cover further solid-angle ranges even from the stationary position. The data acquired partially redundantly via the second 3D camera advantageously also permit the two systems to be checked for data consistency, with suitable error responses being initiated in the event of excessive deviations.
  • Furthermore, in cooperating robot systems it is advantageous to provide the cooperating robot with a 3D camera as well. For example, the cooperating robots can monitor, and if necessary also control or regulate, each other's movements.
  • The method can also be advantageously configured by determining a first path, in particular a first unloading strategy, on the basis of the first 3D model obtained in the global observation position. Furthermore, the first 3D model is supplemented or re-determined on the basis of the local 3D data.
  • The figures show:
    • Fig. 1: a robot system according to the invention in a global observation position,
    • Fig. 2: a robot system in a local observation position,
    • Fig. 3: a robot system with separate gripper and camera arms,
    • Fig. 4: a robot system with a 3D camera arranged within the articulated arm,
    • Fig. 5: a 3D camera movably arranged on an articulated arm.
  • Fig. 1 shows a possible embodiment of a robot system according to the invention. The robot has several articulated arms 32 and, on the last articulated arm 32, a manipulator 34, in particular a gripper 34, for manipulating objects 21, 22, 23. The articulated arms 32 and the gripper 34 are typically freely movable about the available spatial axes. In the present example, the objects 21, 22, 23 lie in a transport container 10, for example a wooden crate or a box that is partially open on one front side. The direct view of the objects 21, 22, 23 is restricted or blocked by the unopened areas 15.
  • A 3D camera 40 is furthermore arranged on the last articulated arm 32. In the example shown, the last articulated arm 32 thus assumes both the function of a manipulator or gripper arm 33 and the function of a camera arm 35.
  • The 3D camera is preferably designed as a time-of-flight (TOF) camera. The term TOF camera is intended to cover in particular all 3D cameras that obtain transit time information from the phase shift between an emitted and a received radiation. A camera according to the invention is in particular a so-called photonic mixer device (PMD) camera, as described inter alia in DE 196 37 822 C1, EP 1 777 747 A1, US 6,587,186 B2 and DE 197 04 496 A1, and available, for example, from ifm electronic gmbh as the frame grabber O3D101/M01594. The PMD camera has the advantage that the light source and the detector are arranged in one housing and, owing to this compactness, can easily be attached to an articulated arm.
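  • The distance measurement of such a camera follows directly from the phase relation mentioned above; the following sketch shows the conversion, assuming a typical PMD modulation frequency of 20 MHz (the frequency is an assumption, not a value from the patent):

        import numpy as np

        C = 299_792_458.0  # speed of light in m/s

        def phase_to_distance(delta_phi, f_mod=20e6):
            # d = c * delta_phi / (4 * pi * f_mod); the factor 4*pi covers the
            # round trip of the modulated light to the object and back
            return C * np.asarray(delta_phi) / (4 * np.pi * f_mod)

        # a full 2*pi phase wrap limits the unambiguous range to c / (2 * f_mod),
        # i.e. about 7.5 m at 20 MHz
        print(phase_to_distance(np.pi / 2))  # ~1.87 m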
  • In the situation shown in Fig. 1, the robot system 30 is in a global observation position. The 3D camera 40 arranged on the gripper arm 33 is preferably positioned and aligned in this situation so that the working area of the robot system can be captured completely. On the basis of the acquired 3D data, a 3D model of the working area can be determined, for example with the aid of an image processing and/or evaluation unit, and/or the objects 21, 22, 23 arranged in the working area can be recognized. Further actions of the robot system can then be initiated on the basis of these data.
  • In the example shown, the robot system could be used, for example, for sorting luggage. In the global observation position, the system captures the scene three-dimensionally and classifies the individual pieces of luggage via object recognition. On the basis of the acquired data, an unloading strategy and a corresponding path planning for the gripper are determined, for example. The gripper is then guided to suitable gripping points of the objects 21, 22, 23, i.e. the pieces of luggage, and sorts the luggage in a predetermined manner.
  • In the case shown, only one gripping point 20, that of object 22, can be detected from the global observation position. The possible gripping points 20 of the remaining objects 21, 23 cannot be recognized from the global observation position, so no corresponding path planning can be carried out for them.
  • For an efficient unloading strategy and path planning, it would be helpful to include the hidden objects in the planning as well. According to the invention, it is therefore provided that local observations are carried out after the working area visible to the 3D camera has been captured from the global observation position. It is provided to determine, from the data of the global observation position, suitable local observation positions at which information about the initially hidden areas can be obtained with high probability and which can be reached without collision.
  • Fig. 2 shows an example of such a local observation position. This position allows a three-dimensional capture of the originally hidden space area 15 and of the objects 23 located in it. Since the robot system is typically equipped with sensors for detecting the spatial position of its manipulator, in particular the gripper or tool, the actual position of the 3D camera in the local observation position is also known to the robot system, so that the locally observed 3D data can easily be added to the already known 3D model, as sketched below. By approaching further local observation positions, the original 3D model can be successively supplemented and completed.
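  • Merging the local data relies only on the camera pose known from the joint sensors; a minimal sketch, assuming the 4x4 pose matrix T_base_cam comes from the robot's forward kinematics:

        import numpy as np

        def to_base_frame(points_cam, T_base_cam):
            # transform an Nx3 point cloud from camera coordinates into the
            # robot base frame using the homogeneous 4x4 pose T_base_cam
            pts_h = np.hstack([points_cam, np.ones((len(points_cam), 1))])
            return (T_base_cam @ pts_h.T).T[:, :3]

        # example: a camera displaced 0.5 m along the base x-axis
        T = np.eye(4)
        T[0, 3] = 0.5
        print(to_base_frame(np.array([[0.0, 0.0, 1.0]]), T))  # [[0.5, 0.0, 1.0]]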
  • In particular, it may also be provided, in a further step, to approach further sublocal observation positions, for example in order to examine grip positions more accurately and/or to determine suitable gripping points. For example, a first unloading strategy and preliminary gripping points may be determined on the basis of the global and local 3D data. Instead of approaching the preliminary gripping point directly, however, a sublocal observation position shortly before the preliminary gripping point is approached. At this position, the gripping point is captured again in detail, the gripping strategy is set or corrected, and finally the gripping process is initiated.
  • In addition, unclear areas in particular can be examined more closely by approaching local or sublocal observation positions. Here too, corresponding local or sublocal observation positions are determined on the basis of the global observation data, positions which promise an improved view of, or information about, the unclear area.
  • Fig. 3 shows a further embodiment in which the 3D camera 40 is arranged not on the manipulator arm 33 but on a preceding articulated arm 32. Camera arm 35 and manipulator arm 33 are thus movable essentially independently of each other, albeit within limits. Such an arrangement has the advantage that the 3D camera 40 captures not only the objects 21, 22, 23 to be gripped but also the manipulator or gripper 34, so that the approach path of the gripper 34 to the gripping point 20 can be completely monitored and controlled.
  • On the one hand, the object 22 to be gripped and a suitable gripping point 20 can be detected via the 3D camera; on the other hand, there is also the possibility of controlling the gripper 34 relative to the gripping point. The relative measurement has the advantage that absolute errors cancel out, so that the gripper can be controlled more precisely; a minimal sketch follows. Of course, the coordinates of the objects in the 3D model can first be used for path planning or an unloading strategy.
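  • One way to realize such relative control is a simple proportional visual-servoing step; the gain and all names are illustrative assumptions:

        import numpy as np

        def servo_step(p_grip_point, p_gripper, gain=0.5):
            # command a motion proportional to the camera-measured offset between
            # gripper and gripping point; since both are measured by the same
            # camera, absolute calibration errors largely cancel in the difference
            return gain * (np.asarray(p_grip_point) - np.asarray(p_gripper))

        print(servo_step([0.40, 0.10, 0.05], [0.35, 0.12, 0.20]))  # motion command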
  • Also in the embodiment shown in Fig. 3, it is advantageous to first bring the 3D camera 40 into a global observation position in order to capture the entire working area. After that, as already described, various local or sublocal positions can be approached in order to complete the 3D model.
  • Fig. 4 corresponds essentially to the arrangement according to Fig. 2. In Fig. 4, however, it is provided to position the 3D camera not on a camera or gripper arm but within the manipulator arm 34 itself. This arrangement has the advantage that the position of the 3D camera substantially coincides with the position of the manipulator, so that the path of the manipulator or gripper 34 and the gripping process itself can be observed directly.
  • Of course, in all embodiments the manipulator can be configured as a gripper or as another tool. For example, and without limitation, the robot could be equipped with a drilling, welding, milling, suction or filling device. Here too, the working area can be captured completely via global and local observation positions of the 3D camera. In the global observation position, for example, the objects to be processed can be detected, and in the local observation positions the processing or engagement points of the respective objects can be detected and determined.
  • Furthermore, the 3D camera can be movably arranged on the articulated arm, the manipulator arm or the camera arm, so that further observation directions of the 3D camera can be set without changing the position of the articulated arm. This allows the three-dimensional spatial situation to be captured even in narrow and possibly difficult-to-see areas in which the articulated or camera arm cannot be moved freely about all spatial axes. Preferably, one or more axes are provided for pivoting movements of the camera in order to allow the camera to be oriented toward different spatial areas. In particular, the camera can also be arranged on a rotation element. The rotation element is designed so that the 3D camera can be moved about the longitudinal axis of the articulated arm.
  • Such an arrangement is shown schematically in Fig. 5. The robot system corresponds essentially to the system shown in Fig. 2, with the 3D camera on a rotation element 46 arranged on the last articulated arm 32, by means of which the 3D camera can be brought into further camera positions 40', 40'', 40''', shown dashed. The rotation element 46 allows a rotational movement of the 3D camera about the longitudinal axis of the articulated arm, so that the 3D camera can, for example, assume a position 40' below the articulated arm. Furthermore, it is provided to connect the 3D camera to the rotation element via a rotary axis 44 and a pivot axis 42. Via the rotation axis 44 and the pivot axis 42, the 3D camera can be moved into further spatial directions 40'', 40'''.
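  • The camera orientation resulting from the rotation element 46, the rotary axis 44 and the pivot axis 42 can be described as a composition of elementary rotations; the axis assignment below (z for the arm's longitudinal axis, then y and x) is an illustrative assumption, not taken from the patent:

        import numpy as np

        def rot_z(t):
            c, s = np.cos(t), np.sin(t)
            return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

        def rot_y(t):
            c, s = np.cos(t), np.sin(t)
            return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

        def rot_x(t):
            c, s = np.cos(t), np.sin(t)
            return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

        def camera_orientation(theta_46, theta_44, theta_42):
            # rotation element about the arm's longitudinal axis, then the
            # rotary axis 44, then the pivot axis 42
            return rot_z(theta_46) @ rot_y(theta_44) @ rot_x(theta_42)

        # e.g. half a turn about the arm axis brings the camera below the arm (40')
        print(camera_orientation(np.pi, 0.0, 0.0).round(3))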
  • In a further embodiment, it is provided to equip two cooperating robots each with a 3D camera. The 3D model of the working area can then advantageously be obtained, partially redundantly, from the information of both 3D cameras. In addition, the approaching of the local observation positions can advantageously be distributed between both robots. Since the 3D data are partially redundant, conclusions about the accuracy and reliability of the overall system can be drawn from the deviations. If the deviations exceed a predetermined limit value, suitable error responses can be initiated; a simple consistency check is sketched after this paragraph. For example, it may be provided that the robot systems first perform a self-calibration on the basis of suitable calibration marks and/or objects. If such a calibration is unsuccessful, further error responses, such as signaling, are initiated.
  • Furthermore, it may be provided to supplement the robot-arm 3D camera described above with a stationary 3D camera. While the stationary 3D camera captures the working area globally, the robot-arm 3D camera probes the working area locally. In particular, it may be provided either to first capture the working area with the robot-arm 3D camera in a global observation position and to use the redundant data of both camera systems to check their error-free operation, or to shorten the cycle time by omitting the approach to the global observation position. In the latter case, the data of the stationary 3D camera are supplemented by the local data of the robot-arm camera.
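  • A simple consistency check over the partially redundant data of two cameras could look as follows; the 1 cm limit is an assumed threshold, not a value from the patent:

        import numpy as np

        def clouds_consistent(cloud_a, cloud_b, limit=0.01):
            # mean nearest-neighbour deviation of cloud_a (Nx3) against
            # cloud_b (Mx3); deviations above `limit` would trigger an error
            # response such as self-calibration or signaling
            diff = cloud_a[:, None, :] - cloud_b[None, :, :]
            d = np.sqrt((diff ** 2).sum(axis=-1)).min(axis=1)
            return float(d.mean()) <= limit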

Claims (4)

  1. Robot system with at least one articulated arm (32), a manipulator (34) and a 3D camera (40), wherein the 3D camera determines distances on the basis of a light transit time and the light transit time information is obtained from the phase shift between an emitted and a received radiation, wherein the 3D camera is arranged on an articulated arm (32) of the robot system, wherein a second 3D camera is provided in addition to the 3D camera (40) arranged on an articulated arm (32), and wherein the second 3D camera covers at least a part of the working area, characterized in that the second 3D camera is associated with a cooperating robot.
  2. Robot system according to Claim 1, in which the 3D camera (40) is arranged such that the manipulator (34) lies in the field of view of the 3D camera (40).
  3. Robot system according to one of the preceding claims, in which the 3D camera (40) is movably arranged on the articulated arm (32).
  4. Robot system according to one of the preceding claims, in which the 3D camera (40) is arranged within the articulated arm (32) or the manipulator (34).
DE102009001894.8A 2009-03-26 2009-03-26 Robot system with 3D camera Active DE102009001894B4 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DE102009001894.8A DE102009001894B4 (en) 2009-03-26 2009-03-26 Robot system with 3D camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
DE102009001894.8A DE102009001894B4 (en) 2009-03-26 2009-03-26 Robot system with 3D camera

Publications (2)

Publication Number Publication Date
DE102009001894A1 DE102009001894A1 (en) 2010-09-30
DE102009001894B4 2018-06-28

Family

ID=42663844

Family Applications (1)

Application Number Title Priority Date Filing Date
DE102009001894.8A Active DE102009001894B4 (en) 2009-03-26 2009-03-26 Robot system with 3D camera

Country Status (1)

Country Link
DE (1) DE102009001894B4 (en)

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US777747A (en) * 1904-07-23 1904-12-20 Heyl & Patterson Car-haul.
US5608847A (en) * 1981-05-11 1997-03-04 Sensor Adaptive Machines, Inc. Vision target based assembly
EP0263952B1 (en) 1986-10-15 1992-12-23 Mercedes-Benz Ag Robot unit with moving manipulators
DE19704496A1 (en) 1996-09-05 1998-03-12 Rudolf Prof Dr Ing Schwarte Method and apparatus for determining the phase and / or amplitude information of an electromagnetic wave
DE19637822C1 (en) * 1996-09-17 1998-03-26 Deutsch Zentr Luft & Raumfahrt The micromechanical tool
US6587186B2 (en) 2000-06-06 2003-07-01 Canesta, Inc. CMOS-compatible three-dimensional image sensing using reduced peak energy
EP1442848A2 (en) * 2003-01-30 2004-08-04 Fanuc Ltd Robot hand for taking out objects with means for changing the orientation of the hand
EP1777747A1 (en) 2005-10-19 2007-04-25 CSEM Centre Suisse d'Electronique et de Microtechnique SA Device and method for the demodulation of modulated electromagnetic wave fields
EP1905548A2 (en) * 2006-09-29 2008-04-02 Fanuc Ltd Workpiece picking apparatus
US20080082213A1 (en) * 2006-09-29 2008-04-03 Fanuc Ltd Workpiece picking apparatus
DE102007026956A1 (en) * 2007-06-12 2008-12-18 Kuka Innotec Gmbh Method and system for robot-guided depalletizing of tires

Also Published As

Publication number Publication date
DE102009001894A1 (en) 2010-09-30

Legal Events

Date Code Title Description
OM8 Search report available as to paragraph 43 lit. 1 sentence 1 patent law
R012 Request for examination validly filed
R081 Change of applicant/patentee

Owner name: PMDTECHNOLOGIES AG, DE

Free format text: FORMER OWNER: IFM ELECTRONIC GMBH, 45128 ESSEN, DE

R016 Response to examination communication
R018 Grant decision by examination section/examining division
R020 Patent grant now final
R081 Change of applicant/patentee

Owner name: PMDTECHNOLOGIES AG, DE

Free format text: FORMER OWNER: PMDTECHNOLOGIES AG, 57076 SIEGEN, DE

R082 Change of representative

Representative's name: SCHUHMANN, JOERG, DIPL.-PHYS. DR. RER. NAT., DE