WO2009128104A1 - Manipulator and related operation - Google Patents

Manipulator and related operation

Info

Publication number
WO2009128104A1
WO2009128104A1 (PCT/IT2008/000264)
Authority
WO
WIPO (PCT)
Prior art keywords
manipulator
laser light
emitter
plane
emission plane
Prior art date
Application number
PCT/IT2008/000264
Other languages
French (fr)
Inventor
Luigi Carrioli
Original Assignee
Sea Vision S.R.L.
Priority date
Filing date
Publication date
Application filed by Sea Vision S.R.L. filed Critical Sea Vision S.R.L.
Priority to PCT/IT2008/000264 priority Critical patent/WO2009128104A1/en
Publication of WO2009128104A1 publication Critical patent/WO2009128104A1/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/37Measurements
    • G05B2219/37571Camera detecting reflected light from laser
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40613Camera, laser scanner on end effector, hand eye manipulator, local

Definitions

  • the present invention regards a manipulator and its operation, in particular a manipulator comprising an articulated mechanical arm and the related spatial positioning device.
  • manipulator it is intended to identify a set comprising an articulated and robotised mechanical arm having the function of moving objects from a first spatial position to a second spatial position.
  • the mechanical arm is formed by a plurality of elements, each of which is articulated to the preceding element; the first element is coupled to a base and the final element comprises a gripping element which directly interacts with the object to be manipulated.
  • each element corresponds with an axis of the arm. For this reason, one normally refers to the arm elements as "axes" of the manipulator. In the most common solutions, the manipulator has two, three or four axes; in the most complex solutions, for applications which require many degrees of freedom, the manipulator axes can be six.
  • manipulators are driven by electric motors based on the programs stored in the corresponding control units.
  • Manipulators are widely used in various industrial fields, and mainly in the mechanical sector, for example for moving pieces in the assembly line, and in the packaging industry, for example for loading/unloading objects from belt distributors.
  • the gripping point and the releasing point of the objects are fixed over time.
  • a manipulator draws an object of known form and orientation in a pre-established point of space and moves it to a second pre-established point of space, with the desired orientation.
  • the manipulator carries out repeated actuation cycles without the aid of a positioning device. If it is necessary to modify at least one operative parameter selected from the gripping point of the object, the releasing point or the initial and final orientation of the object, the manipulator is not capable of working with parameters different from those previously set in the related actuation program. In other words, the program running in the control unit must be modified in order to take into account the new operating parameters. It is evident that manipulators of this type are not very flexible and are not suitable for operating in conditions which differ from those pre-established.
  • the manipulators are guided by positioning devices which calculate the distance of the gripping element from the object which must be moved.
  • the positioning device sends a signal to the manipulator control unit indicative of the distance between the gripping element and the object to be manipulated.
  • the control unit feedback operates the manipulator in space and over time, i.e. it controls the movement of one or more axes, so to obtain an effective gripping of the object, at the same time avoiding accidental collisions against the object.
  • the positioning devices are prearranged on the penultimate axis, i.e. on the element of the arm which directly sustains the final element provided with the gripping device.
  • a conventional positioning device comprises a pointer which emits a laser light beam, directed to hit the object to be manipulated, and a receiver intended to receive the light diffused by the object illuminated with laser light.
  • the laser light beam can be a spot beam, or it can spread fanwise in the emission plane, i.e. it can be a sheet of laser light.
  • the distance of the gripping element from the object is calculated based on trigonometric calculations, or on the time of flight, i.e. based on the time elapsing between the laser beam emission and the reception of the light diffused by the object.
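The two ranging principles named above can be sketched as follows. This is an illustrative model only, not the patent's actual implementation; the function names and the right-triangle geometry of the triangulation case are assumptions.

```python
import math

# Hypothetical sketch of the two ranging principles: triangulation
# (assumed right-triangle geometry) and time of flight.

def distance_by_triangulation(baseline_m: float, angle_rad: float) -> float:
    """Distance from a simple right-triangle model: the emitter sits at a
    known baseline from the receiver, and the light diffused by the object
    is observed at the given angle."""
    return baseline_m / math.tan(angle_rad)

def distance_by_time_of_flight(round_trip_s: float) -> float:
    """Distance from the time elapsing between emission and reception of
    the diffused light; the light travels the path twice."""
    c = 299_792_458.0  # speed of light, m/s
    return c * round_trip_s / 2.0
```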
  • the positioning device does not allow for verifying the correct orientation of the object in space.
  • the control unit feedback-regulates the movement of the articulated elements, i.e. the axes, based on the calculated value of the aforesaid distance.
  • the manipulator is not capable of adapting its functioning if the spatial orientation of the object to be manipulated is different from that pre-established. If the object is misaligned with respect to the pre-defined initial position, the manipulator may not function correctly, for example it may collide against the object, or it may not be able to grasp it, etc. In these cases, it is often necessary to stop the manipulator and intervene manually in order to restore the normal working conditions, with evident negative effects on the productivity.
  • a further drawback of the conventional solutions relates to the difficult management of the references which have been calculated and/or stored by the control unit.
  • the control unit regulates the movements of the axes and each time stores the references related to the position taken by each axis in space. Rotations of the final axis greater
  • the technical problem underlying the present invention is to provide a versatile manipulator that is capable of autonomously adapting its operation to the changing working conditions .
  • the present invention regards a manipulator, characterised according to claim 1.
  • the manipulator comprises a plurality of articulated and motorised elements, and a positioning device which in turn comprises a first emitter, intended to emit a laser light sheet in a first emission plane in order to illuminate an object, and a receiver, intended to receive the light diffused by the illuminated object, and is characterised in that it comprises a second emitter intended to emit a laser light sheet in a second emission plane, incident to said first emission plane.
  • the manipulator according to the present invention is provided with a second laser emitter which makes it possible to illuminate, with a second laser light sheet, the object to be manipulated.
  • the two laser light sheets are emitted in different, but incident (non-parallel) planes.
  • the images of the light diffused by the object, recorded by the receiver, are processed in order to carry out a virtual, two- or three-dimensional modelling of the object.
  • the manipulator operates in feedback, on the basis of this processing, in order to orient the final axis, or a gripping element of the manipulator.
  • the correct positioning of the final axis of the manipulator, or a gripping element thereof, orthogonal to the gripping surface of the object permits an optimal gripping of the object.
  • the manipulator is autonomously oriented so to bring itself into the correct gripping position, independent of the object orientation.
  • the laser light sheets of the first emitter and the second emitter are focused on the same focusing plane.
  • the focusing plane is provided next to the gripping element of the manipulator.
  • the focusing plane of the laser light beam can intercept the final axis of the manipulator, or it can intercept the gripping element or it may be situated far from the latter.
  • the receiver comprises an image acquisition device, for example a video-camera or a photographic machine.
  • the receiver comprises a digital camera. More preferably, the digital camera is of CCD type with resolution of about 1 Megapixel.
  • the laser light sheet emitted by the first emitter intercepts the focusing plane common to the two emitters at a line oriented vertically with respect to the receiver, i.e. vertical with respect to a reference system integral with the receiver.
  • the laser light sheet emitted by the second emitter intercepts the focusing plane at a line that is horizontal with respect to the receiver.
  • the images of the laser light diffused by the object and acquired by the receiver permit computing, in the reference system integral with the observer, the spatial coordinates x, y and z of the illuminated points of the object, i.e. the coordinates with respect to the reference system of the video-camera, which is in turn integral with the final axis of the manipulator.
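As an illustration of that computation, a minimal pinhole-camera sketch is given below. The intrinsics and the plane parametrisation are hypothetical: the patent does not disclose the algorithm, only that the coordinates are recovered in the receiver's reference system.

```python
import numpy as np

# Hypothetical sketch: back-project a pixel of the diffused laser line and
# intersect the viewing ray with the known emission plane, all expressed
# in the reference system integral with the receiver.

def pixel_to_xyz(u, v, fx, fy, cx, cy, plane_n, plane_d):
    """Return (x, y, z) of the illuminated point, assuming pinhole
    intrinsics (fx, fy, cx, cy) and a laser emission plane n . p = d."""
    ray = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])  # viewing ray
    n = np.asarray(plane_n, dtype=float)
    t = plane_d / n.dot(ray)   # ray parameter where it meets the plane
    return t * ray
```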
  • the first emitter, the second emitter and the receiver are all pre-arranged integral with the final articulated element of the manipulator, i.e. the final axis, or to a gripping element thereof. If the final articulated element of the manipulator or gripping element rotates around a longitudinal axis, the reference system also rotates in an integral manner and the processing of the acquired images is not affected by the rotation.
  • the optical axis of the image acquisition device is parallel to the longitudinal axis of the final articulated element of the arm, or related gripping element.
  • the digital video-camera is oriented with optical axis parallel to the final axis of the manipulator, in order to frame the area around the gripping element and at the focusing plane of the laser emitters.
  • the optical axis of the video-camera is parallel to the longitudinal axis of the final articulated element of the manipulator, or the related gripping element, even when this rotates.
  • the angle between the optical axis and the emission plane of the first emitter is selected within the range of 15° - 60°.
  • the angle between the same optical axis and the emission plane of the second emitter is selected within the range of 15° - 60°. More preferably, both of the above angles are equal to about 30° and the optical axis of the video-camera is orthogonal to both the lines intercepted by the emitters on the common focusing plane.
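The trade-off behind that angle choice can be illustrated with one line of trigonometry (an assumed first-order model, not stated in the patent): the lateral shift of the laser line in the image grows with the tangent of the angle between the emission plane and the optical axis, so a larger angle resolves depth better but shrinks the usable depth range.

```python
import math

# Assumed first-order sensitivity model for a triangulation light sheet.

def line_shift_per_depth(alpha_deg: float, dz: float) -> float:
    """Lateral displacement of the laser line for a depth change dz, when
    the emission plane meets the optical axis at angle alpha."""
    return dz * math.tan(math.radians(alpha_deg))
```

At the preferred 30° angle, the line shifts by about 0.58 mm for each millimetre of depth change.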
  • the emitters are stroboscopic laser light generators.
  • the laser light is not emitted continuously but in an intermittent manner.
  • the gripping element of the manipulator is of mechanical type, for example pliers, or pneumatic, for example a suction cup, or magnetic.
  • the focusing plane of the emitters is orthogonal to the longitudinal axis of the final arm of the manipulator and coincides with the focusing plane of the receiver.
  • the manipulator can also comprise one or more sources of white light. Such sources have the function of illuminating the object to be manipulated in order to permit the acquisition of object details in poor lighting conditions, for example to permit the reading of labels, barcodes, colour codes etc., which are present on the surface of the object.
  • the manipulator comprises a control unit having the function of processing the images acquired by the video-camera and, based on this processing, managing the movements of one or more manipulator axes.
  • the present invention also regards a method for determining the correct positioning of the manipulator with respect to the object to be manipulated, characterised according to claim 13.
  • the method comprises the steps of projecting, on said object, a first laser light sheet emitted in a first emission plane and acquiring at least one image of the laser light diffused by said object, and is characterised by the further step of projecting, on said object, a second laser light sheet emitted in a second emission plane, incident on said first emission plane.
  • the laser light images diffused by the object are acquired in conditions of predominance of the laser light over the light present in the environment where the object is found, or in the dark.
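Under that predominance condition the diffused line is simply the brightest structure in the frame, so a plain threshold-and-centroid pass is enough to extract it. The sketch below assumes an 8-bit grayscale image and is an illustration, not the patent's algorithm.

```python
import numpy as np

# Hypothetical laser-line extraction, assuming the laser light dominates
# the ambient light (or the scene is dark).

def extract_laser_line(gray: np.ndarray, threshold: int = 128):
    """For each image row, return the intensity-weighted centroid column
    of the pixels above threshold, as (row, sub-pixel column) pairs."""
    points = []
    for r, row in enumerate(gray):
        mask = row >= threshold
        if mask.any():
            cols = np.flatnonzero(mask)
            w = row[cols].astype(float)
            points.append((r, float((cols * w).sum() / w.sum())))
    return points
```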
  • the laser light emitted in the first emission plane intersects the object at a first profile of its outer surface and the laser light emitted in the second emission plane intersects the object at a second profile of its outer surface.
  • the first and second profiles intersect each other.
  • the lines of light diffused by the object at the first and second profile are incident on each other with an angle included in the range of 45° - 90°.
  • the lines of light diffused by the object at the first profile and second profile are orthogonal to each other and both are orthogonal to the optical recording axis of the images acquired by the video-camera.
  • a first image is acquired of the line of light diffused at the first profile and a second image is acquired of the light diffused at the second profile.
  • the control unit of the manipulator processes the first and the second image in order to extrapolate the spatial coordinates of at least part of the object.
  • the control unit uses the digital or digitalised images of the lines diffused by the object - lines which are always vertical and horizontal with respect to the video-camera, even if the final arm of the manipulator rotates - in order to extrapolate the spatial coordinates of different points of the object.
  • the processing leads in practice to determining the spatial orientation of the object in a reference system integral with the video-camera, and thus integral with the final axis of the manipulator.
  • the processing is carried out with numeric algorithms and preferably comprises the virtual two- or three-dimensional modelling of at least part of the object.
  • the control unit feedback operates one or more articulated elements of the manipulator in order to orthogonally orient the gripping element with respect to a gripping surface of the object.
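One standard way to realise that orthogonal orientation, sketched here under the assumption that the control unit works with rotation matrices (the patent does not specify the representation), is the Rodrigues formula aligning the tool axis with the estimated normal of the gripping surface:

```python
import numpy as np

# Hypothetical sketch: rotation bringing the tool axis z = (0, 0, 1) onto
# the gripping-surface normal (Rodrigues rotation formula).

def align_tool_to_normal(surface_normal):
    z = np.array([0.0, 0.0, 1.0])
    n = np.asarray(surface_normal, dtype=float)
    n = n / np.linalg.norm(n)
    v = np.cross(z, n)                 # rotation axis (unnormalised)
    c = float(z.dot(n))                # cosine of the rotation angle
    if np.allclose(v, 0.0):            # already aligned, or exactly opposite
        return np.eye(3) if c > 0 else np.diag([1.0, -1.0, -1.0])
    K = np.array([[0.0, -v[2], v[1]],
                  [v[2], 0.0, -v[0]],
                  [-v[1], v[0], 0.0]])
    return np.eye(3) + K + K @ K * ((1.0 - c) / v.dot(v))
```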
  • the manipulator according to the present invention is particularly flexible, since it autonomously adapts to functioning correctly when the working conditions change, i.e. when at least one operating parameter changes, selected from among the gripping point of the object, the releasing point or the initial and final object orientation.
  • FIG. 1 is a schematic, plan view of a manipulator according to the present invention.
  • FIG. 2 is a schematic, side view of the manipulator shown in figure 1;
  • FIG. 3 is a perspective view of a detail of a manipulator according to the present invention.
  • FIG. 4 is a bottom view of the detail shown in figure 3, in a first operating step of the manipulator;
  • FIG. 5 is a perspective view of the detail shown in figure 4.
  • FIG. 6 is a bottom view of the detail shown in figure 3, in a second operating step of the manipulator;
  • FIG. 7 is a perspective view of the detail shown in figure 6.
  • a manipulator 1 is shown according to the present invention comprising a mechanical arm 2 with several axes and the related positioning device 3.
  • each element 22 - 26 is articulated to the preceding element; the first element 21 is linked to a base 4, inside of which a motor is housed which can be actuated for the rotation of the same element 21 on the plane, as shown in figure 1.
  • each element 21 - 26 of the arm 2 is also indicated.
  • the articulated element 21 is rotatable by an angle of 320° in a horizontal plane around the axis JT1;
  • the articulated element 22 is rotatable by an angle of 210° in a vertical plane around the axis JT2;
  • the articulated element 23 is rotatable by an angle of 270° in the already mentioned vertical plane around the axis JT3;
  • the articulated element 24 is rotatable by an angle of 720° around its own longitudinal axis JT4;
  • the articulated element 25 is rotatable by an angle of 270° around the vertical axis JT5;
  • the final articulated element 26 is rotatable by an angle equal to 720° around its own longitudinal axis JT6.
  • the articulated elements 21 - 26 of the arm 2 are motorised, i.e. each can be driven in rotation around the corresponding axis JT1 - JT6.
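The joint ranges listed above can be collected in a small table and checked before commanding a rotation. The dictionary and helper below are illustrative, not part of the patent.

```python
# Joint ranges of the six axes as listed in the description (degrees of
# total rotation); illustrative helper, not part of the patent.
JOINT_RANGE_DEG = {"JT1": 320, "JT2": 210, "JT3": 270,
                   "JT4": 720, "JT5": 270, "JT6": 720}

def within_range(joint: str, sweep_deg: float) -> bool:
    """True if the requested total rotation fits inside the joint's range."""
    return abs(sweep_deg) <= JOINT_RANGE_DEG[joint]
```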
  • the manipulator 1 has the function of drawing an object from a point in space and releasing it at another point in space.
  • the initial orientation of the object may be known, i.e. pre-established, or it may be random.
  • the grip of the object is carried out by a gripping element 5 mounted on the final axis JT6, i.e. on the final articulated element 26 of the arm 2.
  • the gripping element 5 can be of various types according to the tasks and the (geometric and weight) characteristics of the object to be manipulated.
  • the gripping element 5 can be a pneumatically-driven, extendable suction cup, or mechanical pliers, or an electromagnet. Since it is integral with the final articulated element 26, the gripping element 5 rotates around the final axis JT6.
  • the positioning device 3 of the manipulator is mounted on the final axis JT6, i.e. it is integral with the corresponding articulated element 26.
  • the positioning device 3 generates signals that can be processed by the control unit of the manipulator 1 (not shown) for determining the spatial coordinates of at least part of the object to be manipulated.
  • the processed coordinates refer to the reference system x, y, z of the positioning device 3 and, therefore, refer to a reference system integral with the final axis JT6. With respect to the conventional solutions, this feature permits considerably simplifying the mathematical management of the processed coordinates.
  • Figure 3 is a perspective view of a detail of the manipulator 1 schematised in figures 1 and 2.
  • the articulated elements 24 - 26 and the positioning device 3 are shown in detail.
  • the gripping element is a pneumatically-driven suction cup 5, extendable along the axis JT6 and rotatable around the same axis.
  • the positioning device 3 is integrally anchored to the final articulated element 26 by means of an anchoring bracket 261. Alternatively, the device 3 is integrally anchored to the gripping element 5.
  • the positioning element 3 comprises a first emitter 31, intended to emit a laser light sheet in a first emission plane in order to illuminate an object, and a receiver 33, intended to receive the light diffused by the illuminated object.
  • the positioning device 3 comprises a second emitter 32 intended to emit a laser light sheet in a second emission plane, incident on the first emission plane (i.e. the planes are not parallel to each other).
  • the first and the second emitter 31, 32 are stroboscopic laser light sources, i.e. they emit intermittent laser light sheets.
  • a generator adapted for use as emitter 31 or 32 emits laser light with wavelength equal to 675 nm and power equal to about 3 mW.
  • the lens of the laser generator is preferably of cylindrical type and permits broadening the laser light beam in an emission plane by an opening angle of about 105°.
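That fan angle fixes how wide the laser line is at a given working distance; a one-line trigonometric sketch (an assumption about the beam geometry, not stated in the patent) is:

```python
import math

# Width of the laser line on a plane at the given distance, for a
# cylindrical lens opening the beam by the stated fan angle.

def sheet_width(distance_m: float, fan_angle_deg: float = 105.0) -> float:
    return 2.0 * distance_m * math.tan(math.radians(fan_angle_deg) / 2.0)
```

At the preferred focusing distance of about 40 mm from the suction cup, a 105° fan spans roughly 104 mm.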
  • the receiver 33 can be any device adapted to record images of the light diffused by the object hit by the laser light sheets.
  • the receiver 33 is a camera or a video-camera, adapted to record single photograms or films of the area around the gripping element 5.
  • the receiver is a digital video-camera.
  • the resolution of the video-camera is selected based on the final applications of the manipulator. As an example, if the manipulator is used for moving medicine vials, the video-camera 33 has a resolution of about 1 Megapixel. Video-cameras with different resolutions can also be used, keeping in mind that, all other conditions being equal, increasing the resolution of the digital video-camera 33 increases the time necessary for processing the images acquired by it.
  • the first emitter 31 is oriented so to emit a laser light sheet in an emission plane parallel to the axis y indicated in figure 3, that is a vertical axis with respect to the video-camera 33.
  • the second emitter 32 is oriented so to emit a laser light sheet in an emission plane parallel to the axis x, that is a horizontal axis with respect to the video-camera 33.
  • the reference system x, y, z of the video-camera 33 rotates and translates integrally with the final articulated element 26 of the arm 2 and the axis z is parallel to the axis JT6.
  • the laser light sheets emitted by the two emitters 31, 32 are focused on the same focusing plane.
  • the focusing plane can be provided far from the final articulated element 26, or close to the video-camera 33, but preferably it is provided at the gripping element 5 and more preferably at about 40 mm from the suction cup 5.
  • the laser light sheet emitted by the first emitter 31 intercepts the focusing plane at a line 310 that is vertical with respect to the video-camera 33, i.e. parallel to the axis y, and the laser light sheet emitted by the second emitter 32 intercepts the focusing plane at a line 320 that is horizontal with respect to the video-camera 33, i.e. parallel to the axis x.
  • Figures 4 and 5 show a detail of the manipulator 1 in a first step during its operation.
  • the arm 2 is driven so to bring the gripping element 5 near the object V to be manipulated.
  • the object to be moved is a vial V of a medicine, but alternatively it could be different, for example a metal or plastic piece to feed to a work station, an enveloped letter, a package, etc.
  • the initial orientation of the vial V is random.
  • Figure 4 is a bottom view of the articulated elements 24 - 26 and figure 5 is a perspective view in the same functioning step of the manipulator 1.
  • the positioning device 3 is shown without cover elements.
  • the emitters 31, 32 and the optics of the video-camera 33 are visible.
  • the emitter 32 generates a laser light sheet 321 in the related emission plane.
  • the sheet 321 intercepts the vial V at a first profile 322 of its outer surface, i.e. it illuminates the surface of the vial V along a first line 322.
  • the emission of the laser light sheet 321 occurs with predominance of the laser light over the ambient light, preferably it occurs in the dark.
  • the video-camera 33 records at least one photogram of the vial V, acquiring one or more images of the light diffused by its surface, which has been hit with the light sheet 321 aligned with the axis x.
  • the acquired images are processed by a control unit (not shown) of the manipulator 1.
  • the processing of an acquired image permits determining the coordinates of the surface of the vial V hit by the light sheet 321, along the axis x.
  • Figures 6 and 7 show a detail of the manipulator 1 in a second step during its functioning.
  • the emitter 31 generates a laser light sheet 311 in the related emission plane.
  • the sheet 311 intercepts the vial V at a second profile 312 of its outer surface, i.e. it illuminates the surface of the vial V along a second line 312.
  • the emission of the laser light sheet 311 occurs with a predominance of the laser light over the ambient light, preferably in the dark.
  • the video-camera 33 records at least one photogram of the vial V, acquiring one or more images of the light diffused by its surface, which has been hit with the light sheet 311 aligned with the axis y.
  • the acquired images are processed by a control unit (not shown) of the manipulator 1.
  • the processing of an acquired image permits determining the coordinates of the surface of the vial V hit by the light sheet 311, along the axis y.
  • the profiles 312 and 322 illuminated by the two laser light sheets 311, 321 intersect each other with an angle in the range of 45° - 90°.
  • the profiles 312 and 322 are recorded and processed by a control unit in order to extrapolate a virtual and three-dimensional modelling of the vial V.
  • the control unit processes the images in order to complete a virtual two-dimensional (2D), but preferably three-dimensional (3D), modelling of the vial V, i.e. a model of which the coordinates of its points in the reference system x, y, z are known or can be calculated.
  • the processing of the images acquired by the video-camera 33 is carried out by the control unit based on stored numeric algorithms.
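One common numeric algorithm for this step, given here only as a plausible sketch (the patent does not disclose its algorithms), is a least-squares plane fit through the 3D points recovered from the two profiles; the singular vector of smallest singular value of the centred point cloud is the surface normal used for orienting the gripping element.

```python
import numpy as np

# Hypothetical modelling step: fit a plane to the 3D points of the two
# laser profiles and recover its unit normal by SVD.

def fit_surface_normal(points_xyz) -> np.ndarray:
    pts = np.asarray(points_xyz, dtype=float)
    centred = pts - pts.mean(axis=0)
    _, _, vt = np.linalg.svd(centred)
    return vt[-1]   # direction of least variance = plane normal
```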
  • the laser light sheets 311, 321 are simultaneously generated and the images acquired by the video-camera 33 are related to both profiles of the vial V.
  • the manipulator 1 comprises means for lighting the vial V with white light, for example LED 6 that can be supplied so as to hit the vial V with white light.
  • the LED 6 are fixed to the support bracket of the video-camera 33 (figure 3).
  • the control unit processes at least one image of the vial V acquired by the video-camera 33 when the LED 6 are activated.
  • the white light image permits determining the coordinates of the points of the surface of the same vial V along the axis z, and thus permits completing the 3D modelling of the vial V.
  • the manipulator 1 functions via feedback so to adapt the orientation of its gripping element 5 to the orientation of the vial V, as seen from the virtual modelling.
  • the control unit drives one or more articulated elements 21 - 26 of the arm 2 so to bring the gripping element 5, initially drawn near the vial V, into an operative position orthogonal to a grip surface of the vial V, for example orthogonal to the closure cap.
  • the white light images of the vial V can be used by the control unit for the recognition or reading of labels, barcodes, colour codes, etc., possibly present on the vial V, or for the quality control of the surface of the vial V.
  • the movements which one or more articulated elements 21 - 26 of the arm 2 must undergo in order to bring the gripping element 5 into the aforesaid operating position affect the orientation of the reference system x, y, z of the video-camera 33.
  • the functionality of the manipulator 1 is not affected by the movements of such reference system, since the video-camera 33 and the positioning device 3 are both integral with the final axis JT6.
  • the laser light sheets 311, 321 are emitted by the sources 31, 32 integral with the final articulated element 26 of the arm 2 of the manipulator 1 and the images of the light diffused by the vial V are acquired with the video-camera 33, it too integral with the final articulated element 26. Since it does not have to provide for rotations of the reference system x, y, z with respect to the laser light sheets 311, 321, the mathematical management of the point coordinates of the vial V determined by the control unit is simplified to the utmost degree.
  • the angle α between the optical axis O of the video-camera 33 (figures 4 and 6) and the emission plane of the first emitter 31 is selected in the range of 15° - 60°; more preferably such angle is equal to 30°.
  • the angle ⁇ between the same optical axis O and the emission plane of the second emitter 32 is selected in the range of 15° - 60° and is preferably equal to 30°.
  • the optical axis of the video-camera 33 is orthogonal to both the lines 310, 320 on the common focusing plane.
  • before entering into functioning mode, the manipulator 1 is preferably subjected to a calibration step. This procedure permits minimising or cancelling all of the positioning errors due to an imprecise assembly of the mechanical parts of the arm 2.
  • the calibration step provides for the acquisition of several images of a black dot matrix (of known geometry) printed on a sheet; two images are acquired with the activated emitters 31 and 32; a third image is acquired with the activated LED 6.
  • the images are sequentially acquired and at different distances of the positioning device 3 from the sheet, along the axis z.
  • the acquired images are processed by the control unit, which determines the correction factors to be applied to the movements of the articulated elements of the arm 2 for the manipulation (approaching and orientation with respect to the piece).
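A minimal version of such a correction could fit the apparent dot pitch (in pixels) against the distance z along which the images were taken, and invert it into a mm-per-pixel factor. The linear model and all names below are assumptions for illustration, not the patent's procedure.

```python
import numpy as np

# Hypothetical calibration: a dot matrix of known pitch imaged at several
# distances z; fit apparent pitch (pixels) vs z linearly and return a
# function giving the mm-per-pixel correction factor at any distance.

def fit_pixel_scale(z_mm, pitch_px, dot_pitch_mm):
    slope, intercept = np.polyfit(np.asarray(z_mm, dtype=float),
                                  np.asarray(pitch_px, dtype=float), 1)
    def mm_per_pixel(z):
        return dot_pitch_mm / (slope * z + intercept)
    return mm_per_pixel
```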

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The present invention regards a manipulator (1) comprising a plurality of articulated and motorised elements (21-26), and a positioning device (3) in turn comprising a first emitter (31), intended to emit a laser light sheet (311) in a first emission plane in order to illuminate an object (V), and a receiver (33), intended to receive the laser light diffused by the illuminated object (V). Advantageously, the manipulator (1) comprises a second emitter (32) intended to emit a laser light sheet (321) in a second emission plane, incident on said first emission plane. The profiles of the object (V) illuminated by the two laser light sheets (311, 321) are recorded and processed in order to extrapolate a virtual and three-dimensional modelling of the object (V). The manipulator (1) operates in feedback in order to adapt the orientation of its gripping element (5) to the orientation of the object (V), as seen in the 3D virtual modelling.

Description

Manipulator and related operation
DESCRIPTION
The present invention regards a manipulator and its operation, in particular a manipulator comprising an articulated mechanical arm and the related spatial positioning device.
For the purpose of the present description, with the term "manipulator" it is intended to identify a set comprising an articulated and robotised mechanical arm having the function of moving objects from a first spatial position to a second spatial position. The mechanical arm is formed by a plurality of elements, each of which is articulated to the preceding element; the first element is coupled to a base and the final element comprises a gripping element which directly interacts with the object to be manipulated.
Each element corresponds with an axis of the arm. For this reason, one normally refers to the arm elements as "axes" of the manipulator. In the most common solutions, the manipulator has two, three or four axes; in the most complex solutions, for applications which require many degrees of freedom, the manipulator axes can be six.
In line with the typical terminology of the technical sector of the invention, in the following present description, the articulated elements of the mechanical arm will also be identified with the expression "manipulator axes".
Traditionally, manipulators are driven by electric motors based on the programs stored in the corresponding control units.
Manipulators are widely used in various industrial fields, and mainly in the mechanical sector, for example for moving pieces in the assembly line, and in the packaging industry, for example for loading/unloading objects from belt distributors.
In the most common applications, the gripping point and the releasing point of the objects are fixed over time. In other words, a manipulator draws an object of known form and orientation in a pre-established point of space and moves it to a second pre-established point of space, with the desired orientation. In this type of application, the manipulator carries out repeated actuation cycles without the aid of a positioning device. If it is necessary to modify at least one operative parameter selected from the gripping point of the object, the releasing point or the initial and final orientation of the object, the manipulator is not capable of working with parameters different from those previously set in the related actuation program. In other words, the program running in the control unit must be modified in order to take into account the new operating parameters. It is evident that manipulators of this type are not very flexible and are not suitable for operating in conditions which differ from those pre-established.
In other applications, the manipulators are guided by positioning devices which calculate the distance of the gripping element from the object which must be moved. The positioning device sends a signal to the manipulator control unit indicative of the distance between the gripping element and the object to be manipulated. The control unit feedback operates the manipulator in space and over time, i.e. it controls the movement of one or more axes, so to obtain an effective gripping of the object, at the same time avoiding accidental collisions against the object.
Usually, the positioning devices are prearranged on the penultimate axis, i.e. on the element of the arm which directly sustains the final element provided with the gripping device. A conventional positioning device comprises a pointer which emits a laser light beam, directed to hit the object to be manipulated, and a receiver intended to receive the light diffused by the object illuminated with laser light. The laser light beam can be a spot beam, or it can spread fanwise in the emission plane, i.e. it can be a sheet of laser light. The distance of the gripping element from the object is calculated based on trigonometric calculations, or on the flight time, i.e. based on the time intervening between the laser beam emission and the reception of the light diffused by the object. Disadvantageously, the positioning device does not allow for verifying the correct orientation of the object in space. In other words, the control unit feedback-regulates the movement of the articulated elements, i.e. the axes, based on the calculated value of the aforesaid distance. The manipulator is not capable of adapting its functioning if the spatial orientation of the object to be manipulated is different from that pre-established. If the object is misaligned with respect to the pre-defined initial position, the manipulator may not function correctly, for example it may collide against the object, or it may not be able to grasp it, etc. In these cases, it is often necessary to stop the manipulator and intervene manually in order to restore the normal working conditions, with evident negative effects on the productivity.
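The two distance-calculation principles mentioned here — trigonometric triangulation and flight time — can be sketched as follows. This is a minimal illustration under a pinhole-camera assumption with the laser offset parallel to the optical axis; the function names and the use of Python are illustrative assumptions, not the patent's implementation:

```python
C_MM_PER_S = 299_792_458_000.0  # speed of light, in millimetres per second

def spot_distance_mm(u_px, f_px, baseline_mm):
    """Triangulation: a laser parallel to the optical axis, offset by a
    baseline b, images at pixel offset u = f*b/z, hence z = f*b/u."""
    if u_px == 0:
        raise ValueError("zero pixel offset: spot at infinity")
    return f_px * baseline_mm / u_px

def tof_distance_mm(round_trip_s):
    """Flight time: distance = c * t / 2, t being the interval between
    laser emission and reception of the diffused light."""
    return C_MM_PER_S * round_trip_s / 2.0
```

Either principle yields only a scalar distance, which is exactly the limitation the description points out: no information about the object's orientation is recovered.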
A further drawback of the conventional solutions relates to the difficult management of the references which have been calculated and/or stored by the control unit. In addition to the distance of the gripping element from the object, the control unit regulates the movements of the axes and each time stores the references related to the position taken by each axis in space. Rotations of the final axis greater than 360° are difficult to manage by the control unit in mathematical terms and often cause positioning errors.
The technical problem underlying the present invention is to provide a versatile manipulator that is capable of autonomously adapting its operation to the changing working conditions.
In a first aspect thereof, the present invention regards a manipulator, characterised according to claim 1.
In particular, the manipulator comprises a plurality of articulated and motorised elements, and a positioning device which in turn comprises a first emitter, intended to emit a laser light sheet in a first emission plane in order to illuminate an object, and a receiver, intended to receive the light diffused by the illuminated object, and is characterised in that it comprises a second emitter intended to emit a laser light sheet in a second emission plane, incident to said first emission plane.
Advantageously, the manipulator according to the present invention is provided with a second laser emitter which permits to illuminate, with a second laser light sheet, the object to be manipulated. The two laser light sheets are emitted in different, but incident (non-parallel) planes. The images of the light diffused by the object, recorded by the receiver, are processed in order to carry out a virtual, two- or three-dimensional modelling of the object. The manipulator operates in feedback, on the basis of this processing, in order to orient the final axis, or a gripping element of the manipulator. The correct positioning of the final axis of the manipulator, or a gripping element thereof, orthogonal to the gripping surface of the object, permits an optimal gripping of the object. Advantageously, the manipulator is autonomously oriented so to bring itself into the correct gripping position, independent of the object orientation.
According to a preferred embodiment of the invention, the laser light sheets of the first emitter and the second emitter are focused on the same focusing plane. The focusing plane is provided next to the gripping element of the manipulator. As an example, the focusing plane of the laser light beam can intercept the final axis of the manipulator, or it can intercept the gripping element, or it may be situated far from the latter.
The receiver comprises an image acquisition device, for example a video-camera or a photographic camera. Preferably, the receiver comprises a digital camera. More preferably, the digital camera is of CCD type with a resolution of about 1 Megapixel.
The laser light sheet emitted by the first emitter intercepts the focusing plane common to the two emitters at a line oriented vertically with respect to the receiver, i.e. vertical with respect to a reference system integral with the receiver. The laser light sheet emitted by the second emitter intercepts the focusing plane at a line that is horizontal with respect to the receiver. In this manner, the images of the laser light diffused by the object and acquired by the receiver permit to compute, in the reference system integral with the observer, the spatial coordinates x, y and z of the illuminated points of the object, i.e. the coordinates with respect to the reference system of the video-camera, which is in turn integral with the final axis of the manipulator.
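The computation of the spatial coordinates x, y and z described above can be pictured as intersecting the camera ray through an illuminated pixel with the known plane of the laser sheet, both expressed in the reference system integral with the receiver. The following is a sketch under a pinhole-camera assumption; the helper name `pixel_to_point` and the plane parametrisation n·p = d are illustrative, and the plane parameters would in practice come from calibration of the emitters:

```python
import numpy as np

def pixel_to_point(u_px, v_px, f_px, plane_n, plane_d):
    """Intersect the camera ray through pixel (u, v) with the laser plane
    n . p = d, everything expressed in the camera frame x, y, z
    (z along the optical axis, pinhole model with focal length f in pixels)."""
    ray = np.array([u_px / f_px, v_px / f_px, 1.0])
    n = np.asarray(plane_n, dtype=float)
    denom = n @ ray
    if abs(denom) < 1e-12:
        raise ValueError("ray parallel to the laser plane")
    return (plane_d / denom) * ray  # point (x, y, z) in camera coordinates
```

Because the camera and both emitters are integral with the final axis, the plane parameters stay fixed in this reference system even while the arm moves.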
Preferably, the first emitter, the second emitter and the receiver are all pre-arranged integral with the final articulated element of the manipulator, i.e. the final axis, or to a gripping element thereof. If the final articulated element of the manipulator or gripping element rotates around a longitudinal axis, the reference system also rotates in an integral manner and the processing of the acquired images is not affected by the rotation.
Preferably, the optical axis of the image acquisition device is parallel to the longitudinal axis of the final articulated element of the arm, or related gripping element. In practice, the digital video-camera is oriented with optical axis parallel to the final axis of the manipulator, in order to frame the area around the gripping element and at the focusing plane of the laser emitters. The optical axis of the video-camera is parallel to the longitudinal axis of the final articulated element of the manipulator, or the related gripping element, even when this rotates.
According to a preferred embodiment of the present invention, the angle between the optical axis and the emission plane of the first emitter is selected within the range of 15° - 60°, and the angle between the same optical axis and the emission plane of the second emitter is selected within the range of 15° - 60°. More preferably, both of the above angles are equal to about 30° and the optical axis of the video-camera is orthogonal to both the lines intercepted by the emitters on the common focusing plane .
Preferably, the emitters are stroboscopic laser light generators. In other words, the laser light is not emitted continuously but in an intermittent manner.
According to a preferred embodiment of the present invention, the gripping element of the manipulator is of mechanical type, for example pliers, or pneumatic, for example a suction cup, or magnetic. The focusing plane of the emitters is orthogonal to the longitudinal axis of the final arm of the manipulator and coincides with the focusing plane of the receiver. In addition to the laser generators, the manipulator can also comprise one or more sources of white light. Such sources have the function of illuminating the object to be manipulated in order to permit the acquisition of object details in poor lighting conditions, for example to permit the reading of labels, barcodes, colour codes etc., which are present on the surface of the object.
As will be described in the following description, the manipulator comprises a control unit having the function of processing the images acquired by the video-camera and, based on this processing, managing the movements of one or more manipulator axes.
In a second aspect thereof, the present invention also regards a method for determining the correct positioning of the manipulator with respect to the object to be manipulated, characterised according to claim 13.
In particular, the method comprises the steps of projecting, on said object, a first laser light sheet emitted in a first emission plane and acquiring at least one image of the laser light diffused by said object, and is characterised by the further step of projecting, on said object, a second laser light sheet emitted in a second emission plane, incident on said first emission plane.
Preferably, the laser light images diffused by the object are acquired in conditions of predominance of the laser light over the light present in the environment where the object is found, or in the dark.
In a positioning cycle of the manipulator, the laser light emitted in the first emission plane intersects the object at a first profile of its outer surface and the laser light emitted in the second emission plane intersects the object at a second profile of its outer surface. The first and second profiles intersect each other.
Preferably, the lines of light diffused by the object at the first and second profile are incident on each other with an angle included in the range of 45° - 90°.
More preferably, the lines of light diffused by the object at the first profile and second profile are orthogonal to each other and both are orthogonal to the optical recording axis of the images acquired by the video-camera. In particular, a first image is acquired of the line of light diffused at the first profile and a second image is acquired of the light diffused at the second profile. The control unit of the manipulator processes the first and the second image in order to extrapolate the spatial coordinates of at least part of the object. In other words, the control unit uses the digital or digitalised images of the lines diffused by the object - lines which are always vertical and horizontal with respect to the video-camera, even if the final arm of the manipulator rotates - in order to extrapolate the spatial coordinates of different points of the object. The processing leads in practice to determining the spatial orientation of the object in a reference system integral with the video-camera, and thus integral with the final axis of the manipulator.
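Locating the diffused lines of light within the acquired images is the first processing step before any coordinates can be extrapolated. A minimal sketch, assuming dark-frame acquisition (laser light predominant over ambient light) and a hypothetical `stripe_columns` helper for the vertical line; the horizontal line would be handled symmetrically along the other axis:

```python
import numpy as np

def stripe_columns(frame, threshold=64):
    """Per-row peak-brightness column: a crude locator for a vertical laser
    stripe in a frame acquired in the dark. Rows whose peak brightness stays
    below the threshold were not hit by the stripe."""
    frame = np.asarray(frame)
    cols = frame.argmax(axis=1)           # brightest column in each row
    valid = frame.max(axis=1) >= threshold  # rows actually illuminated
    return cols, valid
```

A production system would refine this with sub-pixel centroiding, but the principle — one stripe coordinate per image row — is what turns each image into a profile of the object's outer surface.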
The processing is carried out with numeric algorithms and preferably comprises the virtual two- or three-dimensional modelling of at least part of the object. Once the spatial orientation of the object is determined, the control unit feedback operates one or more articulated elements of the manipulator in order to orthogonally orient the gripping element with respect to a gripping surface of the object. The manipulator according to the present invention is particularly flexible, since it autonomously adapts to functioning correctly with the change of working conditions, i.e. with the change of at least one operating parameter selected from among the object gripping point, the releasing point or the initial and final object orientation.
Further characteristics and advantages of the invention will now be illustrated with reference to the schematic drawings attached as a non-limiting example, wherein:
- Figure 1 is a schematic, plan view of a manipulator according to the present invention;
- Figure 2 is a schematic, side view of the manipulator shown in figure 1;
- Figure 3 is a perspective view of a detail of a manipulator according to the present invention;
- Figure 4 is a bottom view of the detail shown in figure 3, in a first operating step of the manipulator;
- Figure 5 is a perspective view of the detail shown in figure 4;
- Figure 6 is a bottom view of the detail shown in figure 3, in a second operating step of the manipulator;
- Figure 7 is a perspective view of the detail shown in figure 6.
With reference to figures 1 and 2, a manipulator 1 according to the present invention is shown, comprising a mechanical arm 2 with several axes and the related positioning device 3.
The mechanical arm 2 can have more than two axes, but preferably there are six, conventionally indicated with JT1, JT2, JT3, JT4, JT5 and JT6. The corresponding articulated elements of the mechanical arm 2 are indicated with the reference numbers 21 - 26. In general, the articulated elements of the arm 2 can have shapes different from those shown in figures 1 and 2 and can all be motorised or in any case drivable separately from the other articulated elements. Each element 22 - 26 is articulated to the preceding element; the first element 21 is linked to a base 4, inside of which a motor is housed which can be actuated for the rotation of the same element 21 on the plane, as shown in figure 1.
In figures 1 and 2, the degrees of rotation and dimensions for each element 21 - 26 of the arm 2 are also indicated. In particular, the articulated element 21 is rotatable by an angle of 320° in a horizontal plane around the axis JT1; the articulated element 22 is rotatable by an angle of 210° in a vertical plane around the axis JT2, the articulated element 23 is rotatable by an angle of 270° in the already mentioned vertical plane around the axis JT3, the articulated element 24 is rotatable by an angle of 720° around its own longitudinal axis JT4, the articulated element 25 is rotatable by an angle of 270° around the vertical axis JT5 and the final articulated element 26 is rotatable by an angle equal to 720° around its own longitudinal axis JT6.
The articulated elements 21 - 26 of the arm 2 are motorised, or each drivable in rotation around the corresponding axis JTl - JT6.
The manipulator 1 has the function of drawing an object from a point in space and releasing it at another point in space. The initial orientation of the object may be known, i.e. pre-established, or it may be random. The grip of the object is carried out by a gripping element 5 mounted on the final axis JT6, i.e. on the final articulated element 26 of the arm 2.
The gripping element 5 can be of various types according to the tasks and the (geometric and weight) characteristics of the object to be manipulated. For example, the gripping element 5 can be a pneumatically-driven, extendable suction cup, or mechanical pliers, or an electromagnet. Since it is integral with the final articulated element 26, the gripping element 5 rotates around the final axis JT6.
Advantageously, also the positioning device 3 of the manipulator is mounted on the final axis JT6, i.e. it is integral with the corresponding articulated element 26. The positioning device 3 generates signals that can be processed by the control unit of the manipulator 1 (not shown) for determining the spatial coordinates of at least part of the object to be manipulated. The processed coordinates refer to the reference system x, y, z of the positioning device 3 and, therefore, refer to a reference system integral with the final axis JT6. With respect to the conventional solutions, this feature permits considerably simplifying the mathematical management of the processed coordinates.
Figure 3 is a perspective view of a detail of the manipulator 1 schematised in figures 1 and 2. In particular, the articulated elements 24 - 26 and the positioning device 3 are shown in detail. In the shown embodiment, the gripping element is a pneumatically-driven suction cup 5, extendable along the axis JT6 and rotatable around the same axis.
The positioning device 3 is integrally anchored to the final articulated element 26 by means of an anchoring bracket 261. Alternatively, the device 3 is integrally anchored to the gripping element 5. The positioning device 3 comprises a first emitter 31, intended to emit a laser light sheet in a first emission plane in order to illuminate an object, and a receiver 33, intended to receive the light diffused by the illuminated object. Advantageously, the positioning device 3 comprises a second emitter 32 intended to emit a laser light sheet in a second emission plane, incident on the first emission plane (i.e. the planes are not parallel to each other).
Preferably, the first and the second emitter 31, 32 are stroboscopic laser light sources, i.e. they emit intermittent laser light sheets. As an example, a generator adapted for use as emitter 31 or 32 emits laser light with wavelength equal to 675 nm and power equal to about 3 mW. The lens of the laser generator is preferably of cylindrical type and permits broadening the laser light beam in an emission plane by an opening angle of about 105°.
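The opening angle of the cylindrical lens fixes how wide the fanned sheet becomes at a given distance. As a worked example of the geometry only — w = 2·d·tan(θ/2) is a plain trigonometric identity, and the helper name is hypothetical:

```python
import math

def sheet_width_mm(distance_mm, opening_deg=105.0):
    """Width of the fanned laser sheet at a given distance from the
    cylindrical lens, for a fan of the stated opening angle:
    w = 2 * d * tan(opening / 2)."""
    return 2.0 * distance_mm * math.tan(math.radians(opening_deg / 2.0))
```

With the cited ~105° fan, the sheet is already wider than its distance from the lens, which is what allows a short-baseline device mounted on the final axis to cover the whole area around the gripping element.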
The receiver 33 is any type of device adapted to record images of the light diffused by the object hit with the laser light sheets. Preferably, the receiver 33 is a camera or a video-camera, adapted to record single photograms or films of the area around the gripping element 5. According to a preferred embodiment of the present invention, the receiver is a digital video-camera. The resolution of the video-camera is selected based on the final applications of the manipulator. As an example, if the manipulator is used for moving medicine vials, the video-camera 33 has a resolution of about 1 Megapixel. Video-cameras with different resolutions can also be used, keeping in mind that, all other conditions being equal, an increase in the resolution of the digital video-camera 33 corresponds with an increase in the time necessary for processing the images acquired therefrom.
According to a preferred embodiment of the present invention, the first emitter 31 is oriented so to emit a laser light sheet in an emission plane parallel to the axis y indicated in figure 3, that is a vertical axis with respect to the video-camera 33. The second emitter 32 is oriented so to emit a laser light sheet in an emission plane parallel to the axis x, that is a horizontal axis with respect to the video-camera 33. The reference system x, y, z of the video-camera 33 rotates and translates integrally with the final articulated element 26 of the arm 2 and the axis z is parallel to the axis JT6.
Preferably, the laser light sheets emitted by the two emitters 31, 32 are focused on the same focusing plane. The focusing plane can be provided for far from the final articulated element 26, or close to the video-camera 33, but preferably it is provided at the gripping element 5 and more preferably at about 40 mm from the suction cup 5.
The laser light sheet emitted by the first emitter 31 intercepts the focusing plane at a line 310 that is vertical with respect to the video-camera 33, i.e. parallel to the axis y, and the laser light sheet emitted by the second emitter 32 intercepts the focusing plane at a line 320 that is horizontal with respect to the video-camera 33, i.e. parallel to the axis x.
Figures 4 and 5 show a detail of the manipulator 1 in a first step during its operation. The arm 2 is driven so to bring the gripping element 5 near the object V to be manipulated. In the example shown in figures 4 and 5, the object to be moved is a vial V of a medicine, but alternatively it could be different, for example a metal or plastic piece to feed to a work station, an enveloped letter, a package, etc. The initial orientation of the vial V is random. Figure 4 is a bottom view of the articulated elements 24 - 26 and figure 5 is a perspective view in the same functioning step of the manipulator 1. The positioning device 3 is shown without cover elements. The emitters 31, 32 and the optics of the video-camera 33 are visible.
The emitter 32 generates a laser light sheet 321 in the related emission plane. The sheet 321 intercepts the vial V at a first profile 322 of its outer surface, i.e. it illuminates the surface of the vial V along a first line 322. The emission of the laser light sheet 321 occurs with predominance of the laser light over the ambient light, preferably it occurs in the dark. The video-camera 33 records at least one photogram of the vial V, acquiring one or more images of the light diffused by its surface, which has been hit with the light sheet 321 aligned with the axis x. The acquired images are processed by a control unit (not shown) of the manipulator 1. The processing of an acquired image permits determining the coordinates of the surface of the vial V hit by the light sheet 321, along the axis x.
Figures 6 and 7 show a detail of the manipulator 1 in a second step during its functioning. With respect to the first step described above, the arm 2 has not been moved. The emitter 31 generates a laser light sheet 311 in the related emission plane. The sheet 311 intercepts the vial V at a second profile 312 of its outer surface, i.e. it illuminates the surface of the vial V along a second line 312. The emission of the laser light sheet 311 occurs with a predominance of the laser light over the ambient light, preferably in the dark. The video-camera 33 records at least one photogram of the vial V, acquiring one or more images of the light diffused by its surface, which has been hit with the light sheet 311 aligned with the axis y. The acquired images are processed by a control unit (not shown) of the manipulator 1. The processing of an acquired image permits determining the coordinates of the surface of the vial V hit by the light sheet 311, along the axis y.
Preferably, the profiles 312 and 322 illuminated by the two laser light sheets 311, 321 intersect each other with an angle in the range of 45° - 90°. The profiles 312 and 322 are recorded and processed by a control unit in order to extrapolate a virtual and three-dimensional modelling of the vial V. In other words, the control unit processes the images in order to complete a virtual two-dimensional 2D but preferably three-dimensional 3D modelling of the vial V, i.e. a model of which the coordinates of its points in the reference system x, y, z are known or can be calculated.
The processing of the images acquired by the video-camera 33 is carried out by the control unit based on stored numeric algorithms.
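One plausible numeric step in such a modelling is fitting a plane to the 3-D points recovered from the two profiles, for example to locate the flat closure cap of the vial V. A least-squares sketch — the SVD approach and the function name are assumptions for illustration, not the patent's stored algorithms:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through 3-D points (one point per row): returns
    the centroid and the unit normal, taken as the singular vector of the
    smallest singular value of the centred point cloud."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[-1]
```

The resulting normal, expressed in the reference system x, y, z of the video-camera 33, is precisely the kind of orientation information that a distance-only positioning device cannot provide.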
Alternatively, the laser light sheets 311, 321 are simultaneously generated and the images acquired by the video-camera 33 are related to both profiles of the vial V.
Preferably, the manipulator 1 comprises means for lighting the vial V with white light, for example LEDs 6 that can be powered so as to hit the vial V with white light. The LEDs 6 are fixed to the support bracket of the video-camera 33 (figure 3). The control unit processes at least one image of the vial V acquired by the video-camera 33 when the LEDs 6 are activated. The white light image permits determining the coordinates of the points of the surface of the same vial V along the axis z, and thus permits completing the 3D modelling of the vial V.
The manipulator 1 functions via feedback so to adapt the orientation of its gripping element 5 to the orientation of the vial V, as seen from the virtual modelling. Once the modelling of the vial V is completed in the reference system x, y, z of the video-camera 33, the control unit drives one or more articulated elements 21 - 26 of the arm 2 so to bring the gripping element 5, initially drawn near the vial V, into an operative position orthogonal to a grip surface of the vial V, for example orthogonal to the closure cap. Moreover, the white light images of the vial V can be used by the control unit for the recognition or reading of labels, barcodes, colour codes, etc., possibly present on the vial V, or for the quality control of the surface of the vial V.
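Bringing the gripping element 5 orthogonal to the grip surface amounts to rotating the tool axis onto the surface normal estimated from the virtual modelling. A sketch of one standard way to build such a rotation — Rodrigues' formula; the patent does not specify the mathematics used, so this is an illustrative assumption:

```python
import numpy as np

def align_to_normal(tool_axis, surface_normal):
    """Rotation matrix (Rodrigues form) turning the gripper's approach axis
    onto the surface normal, so the approach becomes orthogonal to the
    gripping surface. The antiparallel case needs a separate choice of axis."""
    a = np.asarray(tool_axis, dtype=float)
    a = a / np.linalg.norm(a)
    b = np.asarray(surface_normal, dtype=float)
    b = b / np.linalg.norm(b)
    v = np.cross(a, b)   # rotation axis (unnormalised)
    c = a @ b            # cosine of the rotation angle
    if np.isclose(c, -1.0):
        raise ValueError("axes antiparallel: pick any perpendicular axis")
    K = np.array([[0.0, -v[2], v[1]],
                  [v[2], 0.0, -v[0]],
                  [-v[1], v[0], 0.0]])
    return np.eye(3) + K + (K @ K) / (1.0 + c)
```

Since both the laser sheets and the video-camera are integral with the final axis JT6, this rotation is computed once in the tool frame; no extra bookkeeping of the moving reference system is required, which is the simplification the next paragraph describes.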
The movements which one or more articulated elements 21 - 26 of the arm 2 must undergo in order to bring the gripping element 5 into the aforesaid operating position affect the orientation of the reference system x, y, z of the video-camera 33. Advantageously, the functionality of the manipulator 1 is not affected by the movements of such reference system, since the video-camera 33 and the positioning device 3 are both integral with the final axis JT6. In other words, the laser light sheets 311, 321 are emitted by the sources 31, 32 integral with the final articulated element 26 of the arm 2 of the manipulator 1 and the images of the light diffused by the vial V are acquired with the video-camera 33, likewise integral with the final articulated element 26. Since it does not have to provide for rotations of the reference system x, y, z with respect to the laser light sheets 311, 321, the mathematical management of the point coordinates of the vial V determined by the control unit is simplified to the utmost degree.
Preferably, the angle α between the optical axis O of the video-camera 33 (figures 4 and 6) and the emission plane of the first emitter 31 is selected in the range of 15° - 60°; more preferably such angle is equal to 30°. The angle β between the same optical axis O and the emission plane of the second emitter 32 is selected in the range of 15° - 60° and is preferably equal to 30°.
According to a preferred embodiment of the present invention, the optical axis of the video-camera 33 is orthogonal to both the lines 310, 320 on the common focusing plane.
Before entering into operation, the manipulator 1 is preferably subjected to a calibration step. This procedure permits minimising or cancelling all of the positioning errors due to an imprecise assembly of the mechanical parts of the arm 2. The calibration step provides for the acquisition of several images of a black dot matrix (of known geometry) printed on a sheet; two images are acquired with the emitters 31 and 32 activated; a third image is acquired with the LEDs 6 activated. The images are acquired sequentially and at different distances of the positioning device 3 from the sheet, along the axis z. The acquired images are processed by the control unit, which determines the correction factors to be applied to the movements of the articulated elements of the arm 2 for the manipulation (approaching and orientation with respect to the piece).

Claims

1. Manipulator (1) comprising a plurality of articulated and motorised elements (21 - 26), and a positioning device (3) which in turn comprises a first emitter (31), intended to emit a laser light sheet (311) in a first emission plane in order to illuminate an object (V), and a receiver (33), intended to receive the light diffused by the illuminated object (V), characterised in that it comprises a second emitter (32) intended to emit a laser light sheet (321) in a second emission plane, incident to said first emission plane.
2. Manipulator (1) according to claim 1, characterised in that the laser light sheets (311, 321) of said first emitter (31) and said second emitter (32) are focused on the same focusing plane.
3. Manipulator (1) according to claim 2, characterised in that the laser light sheet (311) emitted by said first emitter (31) intercepts said focusing plane at a line (310) that is vertical with respect to said receiver (33), and the laser light sheet (321) emitted by said second emitter (32) intercepts said focusing plane at a line (320) that is horizontal with respect to said receiver (33).
4. Manipulator (1) according to any one of the preceding claims 1-3, characterised in that said first emitter (31), said second emitter (32) and said receiver (33) are arranged integral with the final articulated element (26) of the manipulator (1) or with a gripping element (5) thereof.
5. Manipulator (1) according to any one of the preceding claims 1-4, characterised in that said receiver (33) comprises an image acquisition device.
6. Manipulator (1) according to claim 5, characterised in that the optical axis (O) of said image acquisition device is parallel to the longitudinal axis of the final element (26) of the manipulator (1) or to the longitudinal axis of a gripping element (5) thereof.
7. Manipulator (1) according to claim 6, characterised in that the angle (α) between said optical axis (O) and the emission plane of said first emitter (31) is selected in the range of 15° - 60°, and the angle (β) between the same optical axis (O) and the emission plane of said second emitter (32) is selected in the range of 15° - 60°.
8. Manipulator (1) according to any one of the preceding claims 3-7, characterised in that said optical axis (O) is orthogonal to both the vertical and horizontal lines (310, 320) intercepted by said laser light sheets (311, 321) on said focusing plane.
9. Manipulator (1) according to any one of the preceding claims 1-8, characterised in that said first emitter (31) and said second emitter (32) are generators of stroboscopic laser light.
10. Manipulator (1) according to any one of the preceding claims 1-9, characterised in that said gripping element (5) comprises a pneumatic, mechanical or magnetic device, and in that said focusing plane is orthogonal to the longitudinal axis (JT6) of the final articulated element (26) and coincides with the focusing plane of the receiver (33) .
11. Manipulator (1) according to any one of the preceding claims 1-10, characterised in that it also comprises a white light source (6) for illuminating said object (V).
12. Manipulator (1) according to any one of the preceding claims 1-11, characterised in that it also comprises a control unit prearranged to receive and process the images acquired by said receiver (33) , and to feedback operate one or more of said articulated elements (21 - 26) when necessary on the basis of said processing.
13. Method for determining the correct positioning of the manipulator (1) according to claim 1 with respect to an object (V) to be manipulated, comprising the steps of projecting, on said object (V), a first laser light sheet (311) emitted in a first emission plane and acquiring at least one image of the light diffused by said object (V), characterised by the further step of projecting, on said object (V), a second laser light sheet (321) emitted in a second emission plane, incident to said first emission plane.
14. Method according to claim 13, characterised in that the step of acquiring at least one image of the light diffused by said object (V) is implemented in conditions of predominance of the laser light over the light present in the environment where the object (V) is situated, or in the dark.
15. Method according to claim 13 or claim 14, characterised in that the laser light sheet (311) emitted in said first emission plane intersects said object (V) at a first profile (322) of its outer surface and the laser light sheet (321) emitted in said second emission plane intersects said object (V) at a second profile (312) of its outer surface, said first profile (322) and said second profile (312) intersecting each other.
16. Method according to claim 15, characterised in that the lines of light diffused by the object (V) at said first profile (322) and said second profile (312) are incident on each other at an angle in the range of 45° to 90°.
17. Method according to any one of the preceding claims 15-16, characterised by the further steps of acquiring a first image of the line of light diffused at said first profile (322) and a second image of the light diffused at said second profile (312), and processing said first and second images in order to extrapolate the spatial coordinates (x, y, z) of at least part of the object (V).
18. Method according to claim 17, characterised in that said processing is carried out with numeric algorithms and comprises the virtual and three-dimensional modelling of at least part of said object (V) .
19. Method according to claim 18, characterised by the further step of feedback operating one or more of said articulated elements (21 - 26) in order to orthogonally orient a gripping element (5) of the manipulator with respect to a gripping surface of said object (V) .
20. Method according to any one of the preceding claims 13-19, characterised in that said laser light sheets (311, 321) are emitted by laser sources (31, 32) integral with the final articulated element (26) of the manipulator (1) and the images of the light diffused by said object (V) are acquired with means (33) integral with the final articulated element (26) of the manipulator (1).
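The coordinate extrapolation described in claims 13 and 17 amounts to classical laser-plane triangulation: each pixel lying on the line of diffused light defines a ray through the camera's optical centre, and intersecting that ray with the known emission plane of the laser sheet yields the (x, y, z) coordinates of the illuminated surface point. A minimal sketch of this computation follows; the camera intrinsics and plane coefficients used here are illustrative assumptions, not values taken from the patent:

```python
import numpy as np

def triangulate_sheet_point(pixel, K, plane_n, plane_d):
    """Intersect the camera ray through `pixel` with the laser emission
    plane {p : plane_n . p = plane_d}, both expressed in the camera frame.
    Returns the (x, y, z) coordinates of the illuminated surface point."""
    u, v = pixel
    # Back-project the pixel into a ray direction using the intrinsics K.
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])
    # Ray: p = t * ray; plane: n . p = d  =>  t = d / (n . ray)
    t = plane_d / float(plane_n @ ray)
    return t * ray

# Illustrative pinhole intrinsics (focal length 800 px, principal point
# at (320, 240)) and a vertical emission plane x = 0.1 m.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
n = np.array([1.0, 0.0, 0.0])   # emission-plane normal (camera frame)
d = 0.1                          # emission-plane offset (metres)

point = triangulate_sheet_point((400.0, 240.0), K, n, d)
# point is (0.1, 0.0, 1.0): 0.1 m right of the optical axis, 1 m ahead.
```

Repeating this over every lit pixel of the two profile lines gives the point cloud from which the three-dimensional model of claim 18, and hence the gripper orientation of claim 19, can be derived.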
PCT/IT2008/000264 2008-04-18 2008-04-18 Manipulator and related operation WO2009128104A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/IT2008/000264 WO2009128104A1 (en) 2008-04-18 2008-04-18 Manipulator and related operation

Publications (1)

Publication Number Publication Date
WO2009128104A1 true WO2009128104A1 (en) 2009-10-22

Family

ID=40352201

Country Status (1)

Country Link
WO (1) WO2009128104A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114367987A (A) * 2014-05-28 2022-04-19 X Development LLC Robot device with environmental indication of joint status

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4731853A (en) * 1984-03-26 1988-03-15 Hitachi, Ltd. Three-dimensional vision system
FR2630957A1 (en) * 1988-05-05 1989-11-10 Peugeot Automatic control device for a manipulator robot
US5280436A (en) * 1990-04-18 1994-01-18 Matsushita Electric Industrial Co., Ltd. Method for measuring three-dimensional position of object to be captured and method for capturing the object
EP1277542A1 (en) * 2001-07-19 2003-01-22 Fanuc Ltd Workpiece unloading apparatus and method
US6628322B1 (en) * 1998-08-07 2003-09-30 Brown & Sharpe Dea, S.P.A. Device and method for positioning a measuring head on a noncontact three-dimensional measuring machine

Similar Documents

Publication Publication Date Title
CN106994696B (en) Orientation system and coordinate system transformation method for end effector
US8346392B2 (en) Method and system for the high-precision positioning of at least one object in a final location in space
US8989897B2 (en) Robot-cell calibration
US10744645B2 (en) Measurement system
US20170016712A1 (en) Position measurement system
US11964396B2 (en) Device and method for acquiring deviation amount of working position of tool
EP2570241A2 (en) Robot system and imaging method
US10611032B2 (en) Measurement system
JP5729219B2 (en) Method for coupling camera coordinate system and robot coordinate system of robot control system, image processing apparatus, program, and storage medium
JP6900290B2 (en) Robot system
EP1841570A1 (en) Device and method for calibrating the center point of tool mounted on a robot by means of a camera
CN112334760A (en) Method and device for locating points on complex surfaces in space
US10935968B2 (en) Robot, robot system, and method for setting coordinate system of robot
US10864643B2 (en) Substrate conveying apparatus
CN211890843U (en) Visual guidance robot
WO2009128104A1 (en) Manipulator and related operation
JPH0545117A (en) Optical method for measuring three-dimensional position
WO2021256464A1 (en) Image capturing system and robot system
US20200376593A1 (en) Measurement device and recording medium encoding a program
KR100214675B1 Calibration apparatus and the method of calibrating original position and orientation for industrial robot
EP4186806A1 (en) Improved apparatus for labelling food products
CN112008717A (en) Camera and robot system
JP2023007886A (en) Correction system and correction method for teaching data
CN116952160A (en) Line structured light rotary scanning measurement system and rotary error compensation method
KR19980084212A (en) Positioner of the mounter and its method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08763817

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 08763817

Country of ref document: EP

Kind code of ref document: A1