WO2013002099A1 - Parts supply device - Google Patents
Parts supply device
- Publication number: WO2013002099A1 (application PCT/JP2012/065766)
- Authority: WIPO (PCT)
- Prior art keywords: robot, parts, vision sensor, dimensional vision, orientation
Classifications
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J19/021—Optical sensing devices
- B25J19/04—Viewing devices
- B25J9/0018—Bases fixed on ceiling, i.e. upside down manipulators
- B25J9/0084—Programme-controlled manipulators comprising a plurality of manipulators
- B25J9/1697—Vision controlled systems
- G05B2219/40012—Pick and place by chain of three manipulators, handling part to each other
Definitions
- The present invention relates to a parts supply device that aligns parts supplied in bulk to an automatic assembly machine, an automatic assembly robot, or the like, and in particular to a device that aligns such parts using a plurality of vertical articulated robots (hereinafter simply referred to as "robots").
- Parts delivered from a supplier or from the preceding process in a product assembly line often arrive in a so-called bulk (unpacked) state, in order to reduce the space occupied by the vehicles or parts boxes required for transporting and stocking them. Therefore, in order to promote automation of product assembly, the position and orientation of the parts supplied to the assembly apparatus must be aligned by some means.
- Conventionally, a dedicated parts aligner called a parts feeder has been widely used as a means for automatically aligning parts supplied in bulk.
- However, a parts feeder is designed for each individual part and is therefore not versatile: the design period is long, the price is high, vibration and noise are generated during operation, and it occupies a large amount of factory floor space.
- As an alternative, a technique is known in which a vision sensor recognizes, among the bulk parts, only those that happen to lie in a preferable posture, and corrects their position and orientation (for example, see Patent Document 1).
- There is also known a technique for recognizing the position and orientation of a part in the bulk state, changing its orientation and temporarily placing it if necessary, and further changing its orientation as needed before aligning it (for example, see Patent Document 2).
- Since the technique described in Patent Document 1 relies on the probabilistic event that a part in a preferable posture is found after a sufficient number of parts has been supplied, the probability that no part in a preferable posture exists increases as the number of parts decreases, and operator calls occur more frequently even though parts remain. In addition, when parts that have a low probability of coming to rest in a preferred posture are to be supplied, even more parts must be put in, so the stock on hand must be increased and factory volume (floor area and height) is wasted.
- The technique described in Patent Document 2 is intended for cube-shaped parts against which a suction pad can be pressed without air leakage so that the intended suction is achieved. It therefore cannot handle parts that offer no surface on which a suction pad can work, such as flat parts, parts with complicated shapes, parts with irregular surfaces, parts with many small holes, and thin parts. If a suction pad cannot be used, suction with a blower is conceivable, but in that case noise is high and power consumption is large.
- As described above, the conventional parts supply device of Patent Document 1 depends on a probabilistic event, so a large number of parts must be put in, the stock on hand must be increased, and factory space is wasted. The technique of Patent Document 2 cannot cope with parts that have no surface to which a suction pad can be applied. Furthermore, since conventional parts supply apparatuses use a specialized hand designed for each part, they require enormous design cost, hand switching time, and hand storage space.
- The present invention has been made to solve the above problems, and an object thereof is to obtain a parts supply device that can align parts from the bulk state, without designing dedicated jigs or hands for a wide variety of parts, by handling the parts with a vision sensor and a plurality of robots equipped with parallel chuck hands.
- A parts supply device according to the present invention includes: a bulk parts box for storing parts in bulk; distance image measuring means for measuring a distance image of the parts in the bulk parts box; isolating means for picking individual parts out of the bulk parts box based on the distance image; and position and orientation changing means for changing the position and orientation of a part isolated by the isolating means to a position and orientation whose error with respect to a final position and orientation specified in advance is within a fixed tolerance.
- According to the present invention, parts in the bulk state can be aligned at high speed by handling them in pipeline fashion while handing them over between a plurality of robots. Even for parts with complicated shapes, the cycle time of the alignment process is not extended. In addition, production models can be switched quickly merely by changing software, so no dedicated hand is needed for each part, which reduces hand cost, shortens hand design time, and reduces the space needed for temporarily storing hands.
- FIG. 8 is a perspective view specifically showing another configuration example of the robot hand in FIG. 1.
- FIG. 1 shows the overall operation, and FIG. 2 shows a specific example of the parameters during robot operation.
- [Drawings for Embodiment 9 of this invention: a perspective view showing the three-dimensional shape of the part to be gripped; explanatory drawings showing the problem to be solved, with side and front views of the part and with a side view of the bulk-stacked parts; and a side view showing a plurality of three-dimensional vision sensors.]
- FIG. 1 is a side view schematically showing the overall configuration of a parts supply apparatus according to Embodiment 1 of the present invention, in which a plurality of (here, four) robots 3, 6a, 6b, and 6c are arranged to supply parts from the bulk state.
- In FIG. 1, the parts supply apparatus includes: a three-dimensional vision sensor 1; a bulk parts box 2 in which a large number of parts (for example, L-shaped parts) are stored; a first robot 3 (hereinafter simply referred to as "robot 3") disposed close to the bulk parts box 2; a temporary placement table 4 for parts; a two-dimensional vision sensor 5 for imaging the temporary placement table 4; a robot group 6 (second robots 6a to 6c); a control device 7 for controlling the robot 3 and the robot group 6 based on the detection results of the three-dimensional vision sensor 1 and the two-dimensional vision sensor 5; and a pallet 8 on which the aligned parts are placed.
- The three-dimensional vision sensor 1, together with the control device 7, functions as distance image measuring means: it images the parts stacked in the bulk parts box 2, generally from above, and measures a large number of distance data from the sensor to the randomly oriented top surfaces of the individual parts.
- For the distance measurement, a known method such as the stereo method, the light-section method, the space encoding method, the random coding method, or the time-of-flight method can be applied.
- The two broken lines attached to the three-dimensional vision sensor 1 in FIG. 1 represent the triangulation of distance performed by each of these methods.
- The distance data obtained by the three-dimensional vision sensor 1 is subjected to coordinate transformation and similar calculations, in the three-dimensional vision sensor 1 itself or in the control device 7, and contributes to the computation of the distance image.
- A distance image is a map, for each pixel of an image of the captured scene, of the coordinate value along a specific coordinate axis as seen from a certain coordinate system; for example, it is a map of the height of the bulk-stacked parts as seen from the base coordinate system of the robot 3.
- FIG. 2 is an explanatory diagram showing a specific example of a distance image.
- FIG. 2 plots, on the XY plane of the robot coordinate system, the height (distance) distribution of the parts stacked in the bulk parts box 2: at each point the measured value of the highest Z coordinate is mapped, and the magnitude of each Z value is indicated by the length of a bar.
- The part extending in a protruding shape in the middle of the plot is a location where a portion of a part sticks out, and the protrusion adjacent to the depression on the near side is likewise a protruding (grippable) portion of a part.
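- As a concrete illustration of the distance image described above, the following is a minimal sketch (not taken from the patent) of gridding measured 3D points, already expressed in the robot base coordinate system, into a Z-height map; the grid pitch, units, and array layout are assumptions made for illustration.

```python
import numpy as np

def distance_image(points_xyz, x_range, y_range, cell=2.0):
    """Grid 3D points (in the robot base frame) into a height map:
    each cell keeps the highest Z value observed, i.e. the topmost
    part surface. `cell` is the grid pitch (e.g. in mm)."""
    nx = int((x_range[1] - x_range[0]) / cell)
    ny = int((y_range[1] - y_range[0]) / cell)
    zmap = np.full((ny, nx), -np.inf)          # -inf marks "no measurement"
    for x, y, z in points_xyz:
        j = int((x - x_range[0]) / cell)
        i = int((y - y_range[0]) / cell)
        if 0 <= i < ny and 0 <= j < nx:
            zmap[i, j] = max(zmap[i, j], z)    # keep the topmost surface
    return zmap
```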
- The bulk parts box 2 is a simple box with no special function and an open top, basically sized to hold as many parts as the on-hand stock requires. The bulk parts box 2 may be partitioned inside so that plural types of parts can be stored; in that case the partitioned sections need not be of uniform size.
- Shock resistance may be improved by providing a cushioning material such as sponge on the inner and outer bottom surfaces of the bulk parts box 2, or by supporting the box with springs or the like.
- A structure such as plate-like protrusions may be provided inside the bulk parts box 2 so that the postures of the parts tend to fall within a specific range. Further, by making the bulk parts box 2 replaceable by a pallet changer, a belt conveyor, or the like, operation can continue without the supply of parts running out.
- The robot 3 and the robot group 6 are vertical articulated robots of a type in general widespread use.
- The robot group 6, together with the two-dimensional vision sensor 5 and the control device 7, functions as part position and orientation changing means, and each of the robots 6a to 6c in the robot group 6 has a parallel chuck hand.
- The three-dimensional vision sensor 1 analyzes the distance image to compute candidate locations where the robot 3 could grip a part from among the parts stacked in the bulk parts box 2 (hereinafter simply "candidates"), and then narrows the grippable candidates down to one by an optimization calculation. For the calculation of grippable candidates, the shape and size of the claws 3t of the hand 3h of the robot 3 are quantified in advance.
- Specifically, a numerical model is formed as the two smallest cylinders or prisms enclosing the respective claws 3t, separated by a distance corresponding to the opening width W of the claws 3t: the thickness of each cylinder or prism approximates the thickness of a claw 3t, and its length approximates the depth to which the claw 3t engages the part when gripping.
- Assuming the latest distance image has been obtained, the three-dimensional vision sensor 1 searches that image for locations where the two modeled cylinders or prisms can be inserted with a portion of a part between them. In the case of FIG. 2, two such locations are found: the location extending in a protruding shape in the middle of FIG. 2, and the location adjacent to the depression on the near side of FIG. 2.
- Alternatively, template matching may be performed on the distance image against the shape of a small protruding portion that the hand 3h of the robot 3 can grip (for example, a prism, cylinder, flat plate, or disk that fits between the claws 3t of the opened hand 3h), and multiple candidates extracted.
- Next, the three-dimensional vision sensor 1 performs an optimization calculation by assigning an evaluation value to each extracted candidate and selecting the single candidate with the highest evaluation value.
- For example, the highest Z value of the protrusion sandwiched between the two modeled cylinders (or of the minute prism discovered) is adopted as the evaluation value, and the candidate with the largest evaluation value is selected.
- This optimization is equivalent to selecting the part lying at the top of the bulk stack; that is, selecting the maximum evaluation value optimizes the choice of candidate. In the case of FIG. 2, the location of the central protrusion is the optimized candidate.
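- The search and optimization just described can be sketched as follows, under simplifying assumptions that are not in the patent: the two claw volumes are reduced to rectangular footprints on the height map, a candidate is any place where both footprints can descend into free space on either side of a raised ridge of part material, and the ridge height is the evaluation value, so the topmost candidate wins.

```python
import numpy as np

def grasp_candidates(zmap, claw_w=3, gap_w=10, depth=15.0):
    """Scan a height map for locations where two claw volumes
    (claw_w cells wide, separated by gap_w cells ~ opening width W)
    can descend at least `depth` below the ridge they would grip."""
    ny, nx = zmap.shape
    found = []
    for i in range(ny):
        for j in range(claw_w, nx - gap_w - claw_w):
            left  = zmap[i, j - claw_w:j]                  # left claw footprint
            ridge = zmap[i, j:j + gap_w]                   # part between the claws
            right = zmap[i, j + gap_w:j + gap_w + claw_w]  # right claw footprint
            top = ridge.max()
            # both claws must reach `depth` below the ridge top without collision
            if left.max() < top - depth and right.max() < top - depth:
                found.append((i, j, top))                  # score = ridge height Z
    return found

def best_candidate(found):
    # optimization step: highest Z wins, i.e. the part on top of the pile
    return max(found, key=lambda c: c[2]) if found else None
```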
- The position and orientation at which the robot 3 can grip the part are calculated by adding a relative position and orientation to the position and orientation of the grippable candidate, expressed as XYZ values in the robot coordinate system and rotation angles around each axis.
- Alternatively, candidate points may be extracted from the distance image by template matching a minute prismatic region of a size the hand 3h can grip, and optimized by the Z-axis height of the candidate points; the grip position and orientation can be calculated in the same way, with the same effect.
- Subsequently, the robot 3 carries the gripped part to the temporary placement table 4 and releases it above the table. At this time, it is desirable that the robot 3 not place the part carefully on the table but rather toss it on. As a result, entangled parts come apart, and the probability that parts come to rest on the temporary placement table 4 in a separated state increases.
- Here, the tool coordinate system of the robot 3 takes the direction of tool advance as the Z axis, defines the X, Y, and Z axes in a right-handed system, and defines the A, B, and C axes as rotations around the X, Y, and Z axes respectively.
- The claws 3t of the hand 3h of the robot 3 are assumed to keep the A-axis and B-axis values of the tool coordinate system constant, to be rotated about the C axis so that the claws 3t align with the gap around a part, and to advance in the Z-axis direction in that posture.
- In other words, the robot 3 lowers the hand 3h vertically downward: the Z axis of the tool coordinate system is set to coincide with the vertically downward direction of the world coordinate system, i.e. the direction of gravitational acceleration. Note that "isolation" refers to picking only one part out of the bulk parts box 2.
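- A sketch of how the grip target might then be expressed as a six-valued pose under the tool coordinate convention above; the fixed A/B values and the helper itself are illustrative assumptions, not the patent's controller interface.

```python
def grip_pose(candidate_xyz, gap_angle_deg, approach_clearance=30.0):
    """Build (X, Y, Z, A, B, C) poses for a vertical-descent grip:
    A and B are held constant so that tool Z points straight down
    (the actual values depend on the robot's angle convention), and
    only C is rotated to align the claws with the gap around the part."""
    x, y, z = candidate_xyz
    a, b = 180.0, 0.0                    # assumed "tool Z down" posture
    pose_above = (x, y, z + approach_clearance, a, b, gap_angle_deg)
    pose_grip  = (x, y, z, a, b, gap_angle_deg)
    return pose_above, pose_grip         # descend from pose_above to pose_grip
```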
- The temporary placement table 4 is a simple table with no special equipment, but a mechanism may be added that, when an article to be excluded (foreign matter) lies on the table, lets it slide off under its own weight by tipping the table surface, or that flips it off. When the temporary placement table 4 has such an exclusion mechanism, error recovery is quick and the takt time is not easily extended. The height of the top surface of the temporary placement table 4 (in the robot coordinate Z-axis direction) is assumed to be measured in advance and stored in the storage area of the control device 7.
- Based on the robot coordinate calculation described above, the robot 3 approaches the parts stacked in the bulk parts box 2 with the claws 3t of the hand 3h opened, and closes the hand 3h at the grip position obtained.
- The robot 3 then lifts the hand 3h along the robot coordinate Z axis to pull the grippable part up out of the bulk parts box 2, and rolls the gripped part onto the temporary placement table 4.
- Thus, the robot 3 has an isolation function of taking out only one part from the many parts contained in the bulk parts box 2.
- However, the robot 3 may fail to take out a part and grip nothing; a plurality of entangled parts may roll onto the temporary placement table 4 in a lump; or a plurality of parts may roll onto the table separately without being entangled.
- Which of these states has occurred can easily be determined by imaging the temporary placement table 4 with the two-dimensional vision sensor 5 immediately after the robot 3 executes the above operation. For example, if taking a part out of the bulk parts box 2 has failed, the picking operation by the robot 3 is performed again.
- If parts that cannot be handled as they are lie on the table, they are lined out using exclusion means (not shown), such as a mechanism that tips the top plate of the temporary placement table 4. The line-out can easily be realized, for example, by preparing a parts disposal box (not shown) and discarding the parts into it.
- Alternatively, one robot of the downstream robot group 6 may handle the parts one at a time, or handle only a single part, and the remaining parts may be lined out by means such as the exclusion mechanism that tips the top plate of the temporary placement table 4.
- The two-dimensional vision sensor 5 functions as part outline measuring means and is a sensor in general widespread use.
- The two-dimensional vision sensor 5 images a part that has been rolled onto the temporary placement table 4 and acquires its outline; the measured outline contributes to the calculation of the position and orientation of the part.
- The position and orientation of the part are calculated, in the two-dimensional vision sensor 5 or in the control device 7, by, for example, a template matching method, for which template images are registered in advance.
- The number of template images registered per handled part is one for parts whose front and back are indistinguishable, two for parts whose front and back matter, and five for parts that come to rest in five stable postures.
- When the resting angle of a part is indeterminate, a dedicated jig is provided on the temporary placement table 4 to fix the angle, or the angle is settled in the course of the delivery operations performed by the robot group 6.
- The robot 6a picks the part up from the temporary placement table 4 with its parallel chuck hand. Thereafter, the part is handed over in sequence from robot 6a to 6b to 6c, with its front and back reversed as needed, and the robot 6c places it in alignment on the part alignment pallet 8.
- Next, the part handling procedure of the parts supply operation will be described.
- First, part recognition by the three-dimensional vision sensor 1 is performed, and the positions and orientations of grippable portions of the recognized parts (for example, a portion protruding like an ear, or a portion estimated to protrude) are narrowed down to one.
- The control device 7 operates the robot 3 so that the claws 3t of the hand 3h match the narrowed-down position and orientation, and then closes the claws 3t to grip the part.
- The robot 3 then takes the part out of the bulk parts box 2, opens the claws 3t above the temporary placement table 4, and rolls the part onto the table.
- the part rests on the temporary table 4 in one of several stable states.
- Here, the case where parts are placed on the temporary placement table 4 without entanglement or overlap will be described; this state is referred to as the "isolated state".
- Subsequently, the two-dimensional vision sensor 5 recognizes the position and orientation of the part on the temporary placement table 4 using pre-registered template images and a pattern matching method. When the part has a plurality of stable states, the two-dimensional vision sensor 5 runs the recognition program for every stable state and adopts the stable-state result with the highest recognition reliability as the overall result. As noted above, a part whose front and back are indistinguishable has only one stable posture.
- Whatever the stable posture, the in-plane displacement and rotation of the part can be measured by the two-dimensional vision sensor 5.
- As its recognition result, the two-dimensional vision sensor 5 outputs "pattern identification information" indicating which template image the position and orientation coordinates of the part matched. When there is no part on the temporary placement table 4, when the part is outside the sensor's field of view, or when the coordinates match no template image, the sensor outputs "non-identification information".
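- A sketch of this recognition step, assuming OpenCV-style normalized cross-correlation and hypothetical template images, one per stable posture; returning the best-matching template id models the pattern identification information, and returning None models the non-identification information.

```python
import cv2  # OpenCV

def recognize(table_image, templates, threshold=0.8):
    """Match every stable-posture template against the table image and
    adopt the most reliable result. `templates` maps a posture id to a
    template image. Returns (posture_id, (x, y), score) or None."""
    best = None
    for tid, tmpl in templates.items():
        scores = cv2.matchTemplate(table_image, tmpl, cv2.TM_CCOEFF_NORMED)
        _, score, _, loc = cv2.minMaxLoc(scores)   # best match for this template
        if best is None or score > best[2]:
            best = (tid, loc, score)
    if best is None or best[2] < threshold:
        return None        # no part, part out of view, or no template matched
    return best
```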
- One robot 6a in the robot group 6 grips the part by a predetermined operation that depends on its stable state, and performs a part delivery operation with the robot 6b.
- Since the robot 6b knows, from the stable state of the part measured by the two-dimensional vision sensor 5, what operation the robot 6a will perform, the part is handed over between the two robots by operations determined in advance for each stable state.
- Similarly, the robot 6c knows, from the stable state recognized by the two-dimensional vision sensor 5, what operation the robot 6b will perform, performs a part delivery operation with the robot 6b by operations determined in advance for each stable state, and supplies the part in alignment to the alignment pallet 8.
- After delivering the part to the robot 6b, the robot 6a proceeds to grip the next part on the temporary placement table 4.
- After delivering the part to the robot 6c, the robot 6b prepares to receive the next part from the robot 6a.
- After aligning the part, the robot 6c moves in preparation for the next delivery from the robot 6b.
- The above procedure realizes pipeline processing in which the robots 3 and 6a to 6c are always in motion. Even when the posture of a part is changed several times, the cycle time for aligning one part is determined by the longest of the actions performed by any single robot. It has been observed experimentally that the operating times of the individual robots are approximately equal.
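- The throughput consequence of this pipeline can be illustrated with made-up stage times: the steady-state cycle per aligned part is governed by the slowest robot, not by the sum of all operations.

```python
# Hypothetical per-part operation times in seconds (illustrative only).
stage_times = {"robot3_isolate": 2.1, "robot6a": 1.9, "robot6b": 2.0, "robot6c": 2.2}

serial_time    = sum(stage_times.values())  # one robot doing every step in series
pipelined_takt = max(stage_times.values())  # all robots busy: slowest stage rules
print(f"serial {serial_time:.1f}s vs pipelined {pipelined_takt:.1f}s per part")
```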
- FIG. 5 is a flowchart showing an overall operation sequence according to the first embodiment of the present invention.
- In FIG. 5, the operation procedures of the three-dimensional vision sensor 1, the robot 3, the two-dimensional vision sensor 5, and the robots 6a to 6c are shown in parallel, with their interactions indicated.
- The sequence in FIG. 5 is assumed to be implemented in software as a control program and stored in the control device 7.
- First, the three-dimensional vision sensor 1 starts its operation in response to a request from the robot 3 (step S12 described later), as indicated by the dotted arrow, and measures the distance image of the parts in the bulk parts box 2 (step S1).
- Subsequently, the three-dimensional vision sensor 1 optimizes the gripping candidates (step S2) and sends the coordinates of the gripping candidate to the robot 3 via the control device 7, as indicated by the dotted arrow (step S3). Upon completion of steps S1 to S3, the process returns to step S1.
- the robot 3 first moves to the retracted coordinates so as not to obstruct the field of view of the three-dimensional vision sensor 1 (step S11), and requests the above-described measurement by the three-dimensional vision sensor 1 (step S12).
- When the robot 3 acquires the coordinates of the gripping candidate from the measurement by the three-dimensional vision sensor 1, it moves to the gripping coordinates (step S13) and closes the hand 3h to grip the candidate part (step S14).
- The robot 3 then moves to the coordinates of the temporary placement table 4 (step S15), opens the hand 3h, and rolls the gripped part onto the table (step S16). Upon completion of steps S11 to S16, the process returns to step S11.
- The two-dimensional vision sensor 5 starts its operation in response to a request from the robot 6a (step S32 described later), as indicated by the dotted arrow, and first measures an image of the temporary placement table 4 (step S21).
- Subsequently, the two-dimensional vision sensor 5 performs pattern matching between the measured image and the template images (step S22), and sends the pattern identification information and grip coordinates to the robot 6a via the control device 7, as indicated by the dotted arrow (step S23). Upon completion of steps S21 to S23, the process returns to step S21.
- the robot 6a moves to the retracted coordinates so as not to obstruct the field of view of the two-dimensional vision sensor 5 (step S31), and requests the above-described measurement by the two-dimensional vision sensor 5 (step S32).
- When the robot 6a acquires the pattern identification information and grip coordinates from the measurement by the two-dimensional vision sensor 5, it performs a branch determination toward steps S31, S32, or S34 according to the measurement result (the part information held in the control device 7), and branches accordingly (step S33).
- If the part information is normal, the robot 6a moves to the grip coordinates on the temporary placement table 4 (step S34) and closes its hand to grip the part on the table (step S35).
- Subsequently, the robot 6a moves to the part delivery posture toward the adjacent robot 6b (step S36) and shifts to a state of waiting for the part to be picked up (step S37).
- Step S37 is linked, as indicated by the dotted arrow, to the part waiting state of the robot 6b (step S42).
- When the robot 6a confirms the hand closing operation of the robot 6b (step S44), as indicated by the broken-line arrow, it opens its hand and passes the part it was gripping to the robot 6b (step S38). Upon completion of steps S31 to S38, the process returns to step S31.
- The robot 6b moves to retracted coordinates so as not to block the operation space of the robot 6a (step S41), and, in response to the robot 6a waiting for pickup (step S37), shifts to a state of waiting for a part from the robot 6a (step S42).
- the robot 6b moves to the parts feed coordinates of the robot 6a (step S43), closes the hand, and grips the parts gripped by the robot 6a (step S44).
- The robot 6b then changes the part posture (step S45), moves to the part delivery posture toward the adjacent robot 6c (step S46), and shifts to a state of waiting for the part to be picked up (step S47).
- Step S47 is linked to the parts waiting state of the robot 6c (step S52) as indicated by the dotted arrow.
- When the robot 6b confirms the hand closing operation of the robot 6c (step S54), as indicated by the broken-line arrow, it opens its hand and passes the part it was gripping to the robot 6c (step S48). Upon completion of steps S41 to S48, the process returns to step S41.
- The robot 6c moves to retracted coordinates so as not to block the operation space of the robot 6b (step S51), and, in response to the robot 6b waiting for pickup (step S47), shifts to a state of waiting for a part from the robot 6b (step S52).
- the robot 6c moves to the parts feed coordinates of the robot 6b (step S53), closes the hand, and grips the parts gripped by the robot 6b (step S54).
- The robot 6c then changes the part posture (step S55), moves to the part insertion coordinates on the pallet 8 (step S56), opens its hand, and places the part it was holding onto the pallet 8 (step S57). Upon completion of steps S51 to S57, the process returns to step S51.
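- The waiting and confirmation links between steps S37/S42/S44 (and likewise S47/S52/S54) amount to a rendezvous: the upstream robot holds its delivery posture until it sees the downstream hand close. A minimal sketch of that handshake with threading events; the class and its methods are illustrative, not the patent's control program.

```python
import threading

class Handover:
    """Rendezvous between an upstream and a downstream robot."""
    def __init__(self):
        self.part_offered = threading.Event()  # upstream at delivery posture (S37/S47)
        self.hand_closed  = threading.Event()  # downstream has gripped (S44/S54)

    def offer(self):                  # upstream robot (6a or 6b)
        self.part_offered.set()       # signal: waiting for pickup
        self.hand_closed.wait()       # confirm the downstream hand closed,
        self.hand_closed.clear()      # then open own hand and release (S38/S48)
        self.part_offered.clear()

    def take(self):                   # downstream robot (6b or 6c)
        self.part_offered.wait()      # part waiting state (S42/S52)
        # ...move to the feed coordinates and close the hand here (S43/S44)...
        self.hand_closed.set()
```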
- Here, the branching operation of the robot 6a (step S33) based on the measurement result (part information) of the two-dimensional vision sensor 5 will be described concretely.
- That is, the robot 6a performs the following determinations on the measurement result and branches according to the outcome.
- (A) When the measurement result indicates that there is no part on the temporary placement table 4, the robot 6a returns to step S31 and moves to its own standby coordinates. At the same time, the control device 7, judging that the operation of the robot 3 was not performed properly, generates an operation command for the robot 3 and causes it to execute the series of operations (steps S11 to S16) again.
- (B) When the measurement result indicates that the parts on the temporary placement table 4 cannot be handled as they are (for example, a plurality of parts lie entangled), the robot 6a performs an operation of removing the parts from the table: for example, the robot 6a flips the parts off the temporary placement table 4, or an exclusion mechanism that tips the top plate of the table is provided and operated according to an instruction from the control device 7. After removing the parts from the temporary placement table 4, the robot 6a returns to step S31 and moves to the standby posture. At the same time, the control device 7 generates an operation command for the robot 3 and causes it to execute the series of operations (steps S11 to S16) again.
- (C) When the part lies in a posture that would require many posture reversals, the robot 6a performs an operation to reduce the number of reversals, for example by touching and toppling the part on the temporary placement table 4. After this reversal-reducing operation is completed, the robot 6a returns to step S32 and causes the measurement operation of the two-dimensional vision sensor 5 to be executed again.
- (D) When the measurement result (part information) of the two-dimensional vision sensor 5 indicates that, among the plural types of parts the apparatus handles as supply targets, the part on the table is not of an appropriate type or posture, the robot 6a performs the operation of removing the part from the temporary placement table 4 and returns to step S31 to move to the standby posture. At the same time, the control device 7 generates an operation command for the robot 3 and causes it to execute the series of operations (steps S11 to S16) again.
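- The four cases (A) to (D) of step S33 can be summarized as a dispatch on the sensor result. A schematic sketch follows; the result fields and action names paraphrase the text above and are assumptions, since the patent does not specify an encoding.

```python
def branch_s33(result):
    """Choose robot 6a's next actions from the 2D sensor's part information.
    `result` is None for non-identification, else a dict describing the match."""
    if result is None:                        # (A) no part on the table
        return ["retract_to_standby", "retry_robot3_pick"]
    if result.get("entangled"):               # (B) parts cannot be handled as-is
        return ["sweep_table", "retract_to_standby", "retry_robot3_pick"]
    if result.get("needs_many_flips"):        # (C) reduce posture reversals first
        return ["topple_part", "request_remeasure"]
    if result.get("wrong_type_or_posture"):   # (D) unsuitable among plural part types
        return ["sweep_table", "retract_to_standby", "retry_robot3_pick"]
    return ["grip_at", result["grip_coords"]]  # normal case: proceed to steps S34, S35
```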
- As described above, the parts supply apparatus according to Embodiment 1 of the present invention includes: the bulk parts box 2 for storing bulk parts; the three-dimensional vision sensor 1 (distance image measuring means) for measuring a distance image of the parts in the bulk parts box 2; the robot 3 (isolating means) for picking parts out of the bulk parts box 2 based on the distance image; and the robot group 6 (position and orientation changing means) that changes the position and orientation of a part isolated by the robot 3 to a position and orientation whose error with respect to the final position and orientation specified in advance is within a fixed tolerance.
- The position and orientation changing means includes the robot group 6 (second robots) that receive the part from the robot 3 and change its position and orientation. The isolating means includes the temporary placement table 4 on which one or more parts released (rolled) after being gripped by the robot 3 are placed.
- The position and orientation changing means further includes the two-dimensional vision sensor 5 (part outline measuring means) for measuring the outline of a part on the temporary placement table 4, and the robot group 6 (second robots) picks up the part placed on the temporary placement table 4 and changes its position and orientation.
- The parts supply apparatus also includes the control device 7, which controls the operations of the robot 3 (first robot) and the robot group 6 (second robots) as well as the operation timing of the three-dimensional vision sensor 1 and the two-dimensional vision sensor 5.
- The robot group 6 includes the plurality of robots 6a to 6c, and changes the position and orientation of a part picked up from the temporary placement table 4 while handing it over between the robots 6a to 6c.
- In this way, parts in the bulk state are placed on the temporary placement table 4 by the three-dimensional vision sensor 1 and the robot 3, their positions and orientations are recognized by the two-dimensional vision sensor 5, and they are handed over between the plurality of robots 6a to 6c and handled by pipeline processing to be aligned on the pallet 8, so parts in the bulk state can be aligned at high speed.
- Moreover, since the total processing time is kept to a time comparable to the operating time of the robots 3 and 6a to 6c, an increase in the cycle time of the alignment process can be avoided even for parts with complicated shapes.
- In addition, production models can be switched quickly merely by changing software, so even when plural types of parts are to be supplied, no dedicated hand is needed for each part, which reduces hand cost, shortens hand design time, and reduces hand storage space.
- In Embodiment 1, the three-dimensional vision sensor 1 and the two-dimensional vision sensor 5 are configured separately from the robots; however, the three-dimensional vision sensor 1 may be attached to the side of the hand of the robot 3 and the two-dimensional vision sensor 5 to the side of the hand of the robot 6a, each in a hand-eye configuration.
- In that case, the size of the bulk parts box 2 can be made larger than the field of view of the three-dimensional vision sensor 1, and likewise the size of the temporary placement table 4 can be made larger than the field of view of the two-dimensional vision sensor 5.
- Embodiment 2. Although not specifically mentioned in Embodiment 1 (FIGS. 1 to 5), the operation of the robot 3 in cooperation with the three-dimensional vision sensor 1 and the control device 7 may be provided with parameter optimization means that selects optimal parameters so as to maximize the success rate of picking parts out of the bulk parts box 2.
- When gripping a part, numerical parameters must be adjusted: the opening width W of the claws 3t applied to the ear of the part, the posture in which the claws 3t are applied to the ear, the relative position and posture at which the claws 3t are moved and stopped with respect to the ear, and the parameters determining the shape of the trajectory along which the hand 3h is lifted after the claws 3t close and grip the part.
- Since the isolation success probability P is inversely related to the occurrence probability of redo operations, it affects the takt time: if P is low, the takt time increases. It is therefore important to select parameters that shorten the takt time.
- Hence, in Embodiment 2, the robot 3 or the control device 7 is provided with parameter optimization means, which is activated by the control device 7.
- The control device 7 controls, for example, the opening width W of the claws 3t of the hand 3h when operating the robot 3, and the opening width W is stored as one of the parameters.
- The initial values of the parameters are given in advance.
- When the parameter optimization means is activated, the robot 3 repeatedly performs experimental trials in which the isolation success probability P is observed for a given combination of parameter values while each parameter is varied according to a predetermined method. Map data consisting of the given parameter values and the isolation success probability P obtained for each combination is recorded in the control device 7.
- the relationship between the combination of each parameter and the isolation success probability P is modeled using, for example, a regression equation.
- parameters are optimized using a model. That is, the parameter value vector having the highest isolation success probability P is read out.
- The combinations of parameters to be tried may themselves be chosen so that the mathematical model is accurate once the trials are done. If the combinations are fixed before the trials start, a method such as an orthogonal table or D-optimal design may be used; dynamic optimization during the trials is also possible, for which the method of automatically generating experimental conditions disclosed in a known document (for example, JP 2008-36812 A) can be used.
- FIG. 6(a) shows an extraction operation with a low isolation success probability P, and FIG. 6(b) shows an extraction operation with a high isolation success probability P according to Embodiment 2 of the present invention. FIG. 6(c) shows the operating parameters of the hand 3h in FIG. 6(b), and FIG. 6(d) shows the set of parameters (horizontal distance d, angle θ) that maximizes the isolation success probability P.
- In FIG. 6(a), the hand 3h grips one part in the bulk parts box 2 and pulls it straight up vertically, as indicated by the thick arrow.
- In FIG. 6(b), the hand 3h gripping the part first moves by the horizontal distance d in the direction of angle θ, and then pulls the part up vertically, as indicated by the thick arrow.
- That is, when one part has been gripped, rather than immediately pulling it straight up as in the trajectory (thick arrow) of FIG. 6(a), it is effective to pull it up vertically after moving by the horizontal distance d in the direction of angle θ, as in the trajectory of FIG. 6(b).
- The optimal value of each parameter depends on the shape and size of the part.
- Therefore, for example, combinations of the angle θ and the horizontal distance d are arranged in an orthogonal table, n trials are performed for each parameter combination, and map data of the isolation success probability P for each combination of θ and d is acquired.
- For calculating the regression coefficients A, B, C, D, and E, for example, the least squares method is used.
- Once the regression equation is obtained, the angle θ and the horizontal distance d that maximize the isolation success probability P are read out.
- In this way, the trials and the calculation of the regression coefficients A, B, C, D, and E are performed, the parameters (angle θ, horizontal distance d) that maximize the isolation success probability P are read from the obtained regression equation, and the selected values of θ and d are determined.
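- The patent names coefficients A through E but does not write out the regression equation; with two factors, a natural reading is a quadratic response surface P ≈ A + B·d + C·θ + D·d² + E·θ². Under that assumption, a least-squares fit and argmax readout might look like the following (the trial numbers are made-up map data for illustration).

```python
import numpy as np

# Trial records: (horizontal distance d, angle theta, observed success rate P)
trials = np.array([
    (0.0,  0.0, 0.55), (10.0,  0.0, 0.70), (20.0,  0.0, 0.62),
    (0.0, 30.0, 0.60), (10.0, 30.0, 0.82), (20.0, 30.0, 0.71),
    (0.0, 60.0, 0.52), (10.0, 60.0, 0.68), (20.0, 60.0, 0.58),
])

d, th, P = trials.T
X = np.column_stack([np.ones_like(d), d, th, d**2, th**2])  # columns for A..E
coef, *_ = np.linalg.lstsq(X, P, rcond=None)                # least squares fit

# Read out the (d, theta) that maximizes the fitted P over a fine grid
dg, tg = np.meshgrid(np.linspace(0, 20, 201), np.linspace(0, 60, 201))
Pg = coef[0] + coef[1]*dg + coef[2]*tg + coef[3]*dg**2 + coef[4]*tg**2
k = int(np.argmax(Pg))
print("best d =", dg.flat[k], "best theta =", tg.flat[k])
```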
- As described above, according to Embodiment 2, the robot 3 or the control device 7 constituting the isolating means includes the parameter optimization means, which, while automatically varying the parameters that define the operation of picking parts out of the bulk parts box 2, observes and records the picking success rate and selects the parameters giving the best success rate.
- Embodiment 3. In Embodiments 1 and 2, the functions and features of the individual parts were not discussed in general terms, so they are described here.
- The effect of aligning bulk parts according to the present invention is realized by the following first to fifth features (functions).
- the first feature is that a distance image measuring unit including the three-dimensional vision sensor 1 and the control device 7 is provided.
- In a conventional apparatus, the position and orientation for gripping a part are determined in advance for each part type; in the present invention, instead of being fixed to one, measurement by the three-dimensional vision sensor 1 is performed immediately before the gripping operation, and the gripping position and the subsequent operations are changed flexibly according to the obtained measurement data.
- As a result, the parts can be transported from the isolated state to the aligned state on the pallet 8.
- The second feature is that the three-dimensional vision sensor 1, the robot 3, and the control device 7 (isolating means) are provided. This isolation function is effective when taking parts out of the bulk parts box 2.
- the position / orientation at which the target part can be gripped is acquired based on the sensor measurement value.
- Around the part to be gripped, other parts are present at the same time, so when the part is gripped, interference may occur between those other parts and the hand 3h, and surrounding parts may be flipped away.
- In such cases, a conventional apparatus often fails to grip, because the positions and orientations of the parts change.
- Also, when the target part is found at the edge of the bulk parts box 2, interference often occurs between the hand 3h and a partition plate or wall of the box, raising the possibility of an unsuccessful grip.
- In contrast, in the present invention, the three-dimensional vision sensor 1 and the robot 3 search for a place where the claws 3t can be hooked, taking into account interference with other parts and the like, so the probability that the target part can be reliably gripped increases.
- The isolation function of the robot 3 is also effective in the process of aligning parts onto the pallet 8 after they have been rolled onto the temporary placement table 4. That is, for each posture in which a part may come to rest, a limited number of operation sequences is prepared indicating which portion of the part the hands of the following robot group 6 should grip, and which sequence to select is determined from the measurement values of the two-dimensional vision sensor 5. By flexibly switching the sequence of gripping positions and postures in this way, the success probability of parts alignment can be reliably improved.
- the third feature is in how to use the sensor measurement values obtained by the three-dimensional vision sensor 1 and the control device 7 (distance image measurement means).
- Rather than matching a shape model of the entire part against the measurement data, a grippable portion of the part (for example, an ear of the part) is found, the robot 3 is operated so that this portion comes to lie between the claws 3t, and the claws 3t are then closed. This increases the probability that the part can be reliably gripped.
- the fourth feature is the temporary placement table 4, the two-dimensional vision sensor 5, the robot group 6, and the control device 7 (position and orientation changing means).
- In a conventional apparatus, only one position and orientation of the part (or relative position and orientation between the part and the claws) is set in advance, and once the part is gripped, it continues to be held to the end with that relative position fixed and the set position and orientation unchanged.
- In contrast, in the present invention, the temporary placement table 4, the two-dimensional vision sensor 5, the robot group 6, and the control device 7, combined with the three-dimensional vision sensor 1 and the control device 7 (distance image measuring means), keep the probability of failure of the work low.
- The success probability of the parts alignment work is, for example, 90% or more, although it depends on the type of part. Even when an attempt fails, repeating the measurement and gripping operations makes the probability of repeated failure extremely low at such a success probability.
- The fifth feature is that, as an additional function of the temporary placement table 4, the two-dimensional vision sensor 5, the robot group 6, and the control device 7 (position and orientation changing means), instead of keeping the initially gripped posture throughout, there is a grip position changing function for releasing the grip and regripping in the middle of the work.
- Since parts are isolated from the bulk parts box 2 by this grip position changing function, even if the position and orientation deviation phenomenon described above occurs, it does not affect the success of the overall operation from taking the part out of the bulk parts box 2 to aligning it on the pallet 8.
- Furthermore, with the grip position changing function, each regripping from another direction brings the accuracy of the part's position and orientation closer to the accuracy required at the final stage of the work, so the necessary accuracy is obtained at the final stage.
- the position / orientation changing means has the following effects.
- A part posture reversal operation (for example, turning the front and back of a part upside down) is realized, as the position and orientation changing means, by handing parts over in mid-air between the robots of the robot group 6 (the plurality of robots 6a to 6c), that is, by regripping.
- Therefore, when a production system is started up or a model changeover is made, the operation of the robot group 6 can be changed and the parts reversed merely by changing the software in the control device 7.
- FIG. 7 is an explanatory view showing the principle of the posture changing operation according to the third embodiment of the present invention.
- FIG. 7 shows an example in which the grip position changing function is realized using the robot group 6 (three robots 6a to 6c), drawn in the six-dimensional spatial coordinates of position and orientation.
- Each of the robots 6a to 6c can change the position and orientation of the gripped part by gripping it at a certain position (a connection point between a broken line and a solid line); under the physical constraints of the robots 6a to 6c, a manifold is formed in the six-degree-of-freedom space of position and orientation.
- The trajectory drawn as a thick solid line corresponds to the movement path of the part by the robot group 6, and each fan-shaped figure (broken line) formally represents the manifold of part positions and orientations realizable at that stage.
- The space occupied by each fan-shaped figure (broken line) corresponds to the movable range of each robot 6a to 6c, and each occupied space is restricted; this expresses that a single robot is limited with respect to the reversal operation of parts.
- By handing the part over from robot 6a to robot 6b and from robot 6b to robot 6c, the realizable positions and orientations (manifolds) of the individual robots 6a to 6c are chained together, so the operation of changing the part's position and orientation as required for alignment, from the temporary placement table 4 to the pallet 8, can be realized beyond the restricted range of any single robot.
- FIG. 8 is an explanatory diagram showing three posture change trajectories M1 to M3 according to Embodiment 3 of the present invention, drawn in six-dimensional spatial coordinates together with the fan-shaped figures (broken lines) corresponding to FIG. 7.
- As an example, for a part that takes three stable postures when isolated and rolled onto the temporary placement table 4, the three gripping positions and postures L1 to L3 of the part on the table (connection points between broken and solid lines) gripped by the robot 6a are shown in correspondence with the three posture change trajectories M1 to M3 of the robot 6b.
- The posture of a rolled part can be measured by the two-dimensional vision sensor 5 to confirm that the measurement data fall into three classes; it has been observed experimentally that the part comes to rest in the three postures shown in FIG. 8.
- Accordingly, for each stable posture, an operation sequence of gripping positions and postures can be designed in advance, from the position and orientation of the part on the temporary placement table 4, through the robot group 6, to the position and posture finally aligned on the pallet 8. The position and orientation of the part on the temporary placement table 4 are measured by the two-dimensional vision sensor 5, the operation sequence for that case is selected, the position and orientation for gripping the part are calculated, and the sequence is executed under the control of the control device 7, as sketched below.
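- Selecting the pre-designed sequence for each recognized stable posture amounts to a table lookup; the names L1 to L3 and M1 to M3 follow FIG. 8, while the step strings are placeholders, not the patent's actual motion commands.

```python
# One pre-designed grip/handover sequence per stable posture on the table.
SEQUENCES = {
    "L1": ["6a_grip_1", "handover_6a_6b", "trajectory_M1", "handover_6b_6c", "place_on_pallet"],
    "L2": ["6a_grip_2", "handover_6a_6b", "trajectory_M2", "handover_6b_6c", "place_on_pallet"],
    "L3": ["6a_grip_3", "handover_6a_6b", "trajectory_M3", "handover_6b_6c", "place_on_pallet"],
}

def plan(stable_posture, xy, angle):
    """Select the sequence for the measured stable posture and parameterize
    its first grip with the in-plane pose measured by the 2D vision sensor."""
    return {"steps": SEQUENCES[stable_posture], "grip_xy": xy, "grip_angle": angle}
```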
- Any or all of the states in which the parts are gripped by the robots 6a to 6c may also be measured by the two-dimensional vision sensor 5 (or the three-dimensional vision sensor 1), and the subsequent robot operations changed accordingly.
- FIG. 9 is a side view schematically showing the overall configuration of the component supply apparatus according to Embodiment 4 of the present invention.
- In FIG. 9, the same components as those described above (see FIG. 1) are denoted by the same reference numerals, or by the same numerals followed by "A", and detailed description thereof is omitted.
- the three-dimensional vision sensor 1 is attached to the side of the hand 3h of the robot 3 and realizes a hand-eye configuration together with the hand 3h. Accordingly, the three-dimensional vision sensor 1 is configured to be able to change the imaging position and orientation in accordance with the movement of the robot 3.
- FIG. 10 is a flowchart showing the operations of the three-dimensional vision sensor 1 and the robot 3 according to the fourth embodiment of the present invention.
- the same processing steps as those described above (see FIG. 5) are denoted by the same reference numerals as those described above. The description is omitted.
- FIG. 11 is an explanatory view showing the operation of the robot 3 according to the fourth embodiment of the present invention.
- In FIG. 10, steps S71 to S73 correspond to the measurement process by the three-dimensional vision sensor 1 (step S12).
- the robot 3 departs from its own standby position (step S11) and starts to move along the imaging trajectory (step S70).
- The imaging trajectory is, for example, a trajectory that moves from left to right above the bulk parts box 2 while keeping the imaging direction of the image pickup element of the three-dimensional vision sensor 1 pointed at the box.
- Alternatively, the imaging trajectory may be one in which the three-dimensional vision sensor 1 gradually approaches the bulk parts box 2 while keeping the box within its field of view, or one that gradually approaches the box while tracing a geometric curve such as a spiral or an arc.
- Subsequently, imaging instructions F1 and F2 are issued to the three-dimensional vision sensor 1 a plurality of times (here, twice) while the robot 3 moves along the imaging trajectory (steps S71 and S72). In response, the three-dimensional vision sensor 1 performs image captures G1 and G2 a plurality of times (here, twice) (steps S61 and S62).
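- A sketch of driving the captures from the motion program: the robot traverses the imaging trajectory and triggers the sensor at predefined waypoints (two here, matching F1 and F2). The `move_to`, `capture`, and `pose` calls are hypothetical interfaces, not a real robot API.

```python
def scan_and_capture(robot, sensor, waypoints):
    """Move along the imaging trajectory and issue an imaging instruction
    at each waypoint (steps S70 to S72, captures G1 and G2)."""
    images, poses = [], []
    for wp in waypoints:
        robot.move_to(wp)                 # hypothetical motion command
        images.append(sensor.capture())   # hypothetical capture returning an image
        poses.append(robot.pose())        # sensor pose, needed to fuse the views
    return images, poses                  # fused into one distance image (step S1)
```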
- Subsequently, the three-dimensional vision sensor 1, in cooperation with the control device 7A, performs the same arithmetic processing as described above and measures the distance image based on the plural image data obtained in steps S61 and S62 (step S1).
- The three-dimensional vision sensor 1 and the control device 7A then optimize the gripping candidate coordinates (step S2), determine the grip coordinates, send them to the robot 3 (step S3), and return to step S61.
- When the robot 3 receives the gripping candidate coordinates (grip coordinates) from the three-dimensional vision sensor 1 (step S73), it moves to the obtained grip coordinates (step S13), closes the hand (step S14), moves to the coordinates of the temporary placement table 4 (step S15), rolls the part onto the table by the hand opening operation (step S16), and returns to step S11, as described above.
- the two-dimensional vision sensor 5, the control device 7A, and the robot group 6 perform the same processing operation as described above (FIG. 5).
- FIG. 11 illustrates the operations of steps S61, S62, S70 to S73, and S13 in FIG. 10.
- FIG. 11 shows the postures of the hand 3h and the three-dimensional vision sensor 1 at imaging instruction F1, their postures at imaging instruction F2, and the posture of the hand 3h after moving to the grip coordinates.
- The broken-line arrows indicate the movement sequence of the robot 3 during grip coordinate measurement, and the two-dot chain line arrows indicate the image captures G1 and G2 by the three-dimensional vision sensor 1 at the respective positions F1 and F2.
- the three-dimensional vision sensor 1 is provided integrally with the robot 3 to realize the hand-eye configuration.
- Thereby, the imageable range of the three-dimensional vision sensor 1 and the operable range of the robot 3 become almost the same.
- In the fourth embodiment (FIGS. 9 to 11), the robot group 6 (robots 6a to 6c) is used as the position/orientation changing means, as in the first to third embodiments; however, as shown in FIG. 12, a single robot 6B may be used instead.
- FIG. 12 is a side view schematically showing an overall configuration of a component supply apparatus according to Embodiment 5 of the present invention.
- In FIG. 12, components similar to those described above are denoted by the same reference numerals, or by a “B” appended to the reference numeral, and their detailed description is omitted.
- In this case, the position/orientation changing means for the parts from the temporary placement table 4 to the pallet 8 differs from the above (FIG. 9) in that it is constituted by a single robot 6B.
- the above-described two-dimensional vision sensor 5 is omitted, and a part of the control sequence program in the control device 7B is different from the above.
- The robots 3 and 6B are, for example, general vertical articulated robots, horizontal articulated robots, Cartesian robots, or the like.
- The robot 3 includes a hand 3h having tweezer-like or forceps-like thin claws 3t (see FIGS. 3 and 4), and the robot 6B includes a parallel chuck hand.
- The three-dimensional vision sensor 1 (distance image measuring means) integrally attached to the hand 3h of the robot 3 measures a distance image of the target parts in the bulk component box 2. As described above, by analyzing the distance image, candidates for sites that the robot 3 can pick from among the piled parts (the site protruding in the middle of FIG. 2 and the site where a depression occurs on the near side) are calculated, and the candidates are narrowed down to one by optimization.
- The candidates are optimized by an operation that assigns evaluation values to the plurality of candidates and selects the one candidate with the highest evaluation value. After that, the robot 3 carries the part to the temporary placement table 4 and, rather than placing it carefully, releases it above the temporary placement table 4 so that it is tossed onto the table and rolls.
- As a result, entangled parts are loosened, and the probability that the parts roll onto the temporary placement table 4 separated from one another and come to rest increases. Further, as described above, the isolation success probability can be improved by using a region in which the posture error is small among the calibration errors between the coordinate system of the three-dimensional vision sensor 1 and the coordinate system of the robot 3.
- Next, the three-dimensional vision sensor 1 (distance image measuring means) functions as component outline measuring means that images the part rolled onto the temporary placement table 4 and acquires its outer shape, and the position and orientation of the part are calculated from the measured outer shape.
- the position / orientation calculation is performed in the three-dimensional vision sensor 1 or in the control device 7B as described above.
- the calculation operation is performed by, for example, a template matching method, and the template image is registered in advance.
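Template matching of this kind is standard image processing. As one hedged example (assuming OpenCV; the sensor interface and the template image are not specified by the patent), the sketch below recovers an (x, y, θ) pose for a part lying on the table by sweeping a pre-registered template over a set of rotations:

```python
import cv2

def match_part_pose(image, template, angle_step=5):
    """Return (x, y, theta, score) of the best rotated-template match."""
    best = (0.0, 0.0, 0.0, -1.0)
    h, w = template.shape[:2]
    for theta in range(0, 360, angle_step):
        rot = cv2.getRotationMatrix2D((w / 2, h / 2), theta, 1.0)
        rotated = cv2.warpAffine(template, rot, (w, h))  # corners crop; fine for a sketch
        scores = cv2.matchTemplate(image, rotated, cv2.TM_CCOEFF_NORMED)
        _, score, _, loc = cv2.minMaxLoc(scores)
        if score > best[3]:
            # Center of the matched window, plus the rotation that produced it.
            best = (loc[0] + w / 2, loc[1] + h / 2, float(theta), score)
    return best
```

In practice one template per stable resting posture of the part would be registered, and the posture whose template matches with the highest score would be taken as the recognition result.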
- the robot 6B equipped with the parallel chuck hand picks up the parts from the temporary placing table 4 and arranges the parts on the pallet 8 in an aligned manner.
- First, the three-dimensional vision sensor 1 performs part recognition and narrows down to one the position and orientation of a graspable portion of the recognized part (a portion protruding like an ear) or a portion that can be estimated to have a graspable shape.
- the robot 3 operates so that the position and orientation narrowed down by the distance image coincides with the position and orientation of the claw 3t of the hand 3h. After the claw 3t is closed and the part is gripped, the part is removed from the bulk stacking part box 2. The claw 3t is opened above the temporary placement table 4, and the parts are rolled and placed on the temporary placement table 4.
- Here, for simplicity of explanation, the case will be described in which the part comes to rest on the temporary placement table 4 in one position and posture among its several stable states, in an isolated state (a state in which the part is placed on the temporary placement table 4 without entanglement or overlap).
- Next, the 3D vision sensor 1 images the part on the temporary placement table 4 and recognizes the position and orientation of the part placed there by pattern matching between a template image registered in advance and the distance image.
- Subsequently, the robot 6B grips the part on the temporary placement table 4. At this time, if the orientation of the part needs to be changed, the robot 6B once releases the part on the temporary placement table 4 and grips it again from a different direction.
- the control device 7B outputs a sequence control command to the three-dimensional vision sensor 1 (distance image measuring means), the robot 3 (isolation means), and the robot 6B (position and orientation changing means), and repeatedly executes the above series of operations.
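The repeated series of operations that the control device 7B sequences can be pictured as a small supervisory loop. The following Python sketch is a hypothetical rendering only; the device objects and all method names are invented for illustration:

```python
def supply_loop(sensor3d, robot3, robot6b, pallet_slots):
    """Embodiment-5 style cycle: isolate, recognize, regrip, place."""
    for slot in pallet_slots:
        while True:
            grip = sensor3d.find_grip_candidate()   # distance image + optimization
            robot3.pick_and_roll(grip)              # toss the part onto the table
            pose = sensor3d.measure_table_part()    # outline -> position/orientation
            if pose is not None:                    # isolated and recognized
                break                               # otherwise line out and retry
        if pose.needs_reorientation:                # regrip from another direction
            robot6b.release_and_regrip(pose)
        robot6b.place_on_pallet(slot)               # aligned placement on pallet 8
```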
- the three-dimensional vision sensor 1 is configured to also function as a component outer shape measuring unit, but the above-described two-dimensional vision sensor 5 may be provided as the component outer shape measuring unit.
- As described above, the component supply apparatus according to Embodiment 5 (FIG. 12) of the present invention includes: the bulk component box 2 that stores bulk-stacked parts; the three-dimensional vision sensor 1 (distance image measuring means) that measures a distance image of the parts in the bulk component box 2; the robot 3 (isolation means) that picks parts out of the bulk component box 2; the temporary placement table 4 onto which one or more picked parts are rolled; the three-dimensional vision sensor 1 (component outline measuring means) that measures the outer shape of the parts on the temporary placement table 4; and the robot 6B (position/orientation changing means) that picks up the rolled parts from the temporary placement table 4 and changes their position and orientation to a position and orientation within a fixed error of a position and orientation specified in advance.
- That is, the robot 3 (first robot) grips and picks parts out of the bulk component box 2, and the robot 6B (second robot) picks up the parts that the robot 3 has rolled onto the temporary placement table 4 and changes the position and orientation of the parts.
- the three-dimensional vision sensor 1 integrated with the robot 3 also functions as a component outer shape measuring unit that measures the outer shape of a component on the temporary table 4 and also includes a partial function of a position / orientation changing unit.
- a dedicated hand is not required for each part, and hand cost reduction, hand design time reduction, and hand temporary storage place reduction can be realized.
- In addition, production models can be switched merely by changing software, so model changeover can be carried out quickly.
- Note that, since the robot 3 has a hand-eye configuration in which the three-dimensional vision sensor 1 is integrally provided, the tact time becomes longer, but the bulk component box 2 and the temporary placement table 4 can be made larger than the field of view of the three-dimensional vision sensor 1.
- Embodiment 6.
- In the fifth embodiment (FIG. 12) described above, the three-dimensional vision sensor 1 (distance image measuring means) is attached only to the hand 3h of the robot 3; however, as shown in FIG. 13, a three-dimensional vision sensor 1C (distance image measuring means) may also be attached to the hand 6h of the robot 6B, and the temporary placement table 4 may be omitted.
- FIG. 13 is a side view schematically showing the overall configuration of the component supply apparatus according to Embodiment 6 of the present invention.
- Components similar to those described above are denoted by the same reference numerals, or by a “C” appended to the reference numeral, and their detailed description is omitted.
- In this case, the removal of the temporary placement table 4 and the addition of the three-dimensional vision sensor 1C to the robot 6B differ from the above.
- The 3D vision sensors 1 and 1C compute the distance image (FIG. 2) of the object in the same manner as described above. That is, the three-dimensional vision sensor 1 attached to the robot 3 calculates, from the analysis result of the distance image, candidates for sites that the robot 3 can pick from among the piled parts, and narrows the candidates down to one by optimization.
- Specifically, the claws are numerically modeled as two cylinders (or prisms) of minimum size enclosing each claw, separated by a distance corresponding to the opening width W of the claws 3t.
- Then, the latest distance image is searched for locations where the two modeled cylinders fit into empty space and a part exists between them.
- In the case of FIG. 2, two locations are found: the location protruding in the middle, and the location adjacent to the depression on the near side.
- Alternatively, a small protrusion shape that the robot hand can grip (for example, a prism, cylinder, flat plate, or disk that fits between the claws 3t of the opened hand 3h) may be template-matched against the distance image to obtain a plurality of candidates.
- an optimization operation is performed in which evaluation values are assigned to a plurality of candidates and one candidate having the highest evaluation value is selected.
- As a method of the optimization operation, the highest Z-axis value among the protrusions sandwiched between the two cylinders (or the minute prisms discovered) is adopted as the evaluation value, and the candidate having the maximum evaluation value is selected. This is equivalent to selecting the part stacked at the uppermost position among the bulk-stacked parts; in other words, the part candidate is optimized by selecting the maximum evaluation value.
- In the case of FIG. 2, the protruding location at the center becomes the optimized candidate.
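In NumPy terms, the claw-fit test and max-Z optimization described above can be sketched as follows. The footprint sizes, gap width, and clearance threshold are illustrative assumptions, not values from the patent:

```python
import numpy as np

def best_grip_candidate(height_map, claw_px=3, gap_px=8, clearance=0.01):
    """Scan a height map (meters; NaN where no distance data) for pinch points.

    A pixel qualifies when two claw-sized footprints, placed gap_px to its
    left and right (where the opened claws 3t would descend), both lie below
    the pixel's height by at least `clearance`. Among qualifying pixels the
    one with the largest Z wins, i.e. the topmost protrusion of the pile.
    """
    best_z, best_xy = -np.inf, None
    h, w = height_map.shape
    for y in range(claw_px, h - claw_px):
        for x in range(gap_px + claw_px, w - gap_px - claw_px):
            z = height_map[y, x]
            if np.isnan(z):
                continue
            left = height_map[y - claw_px:y + claw_px, x - gap_px - claw_px:x - gap_px]
            right = height_map[y - claw_px:y + claw_px, x + gap_px:x + gap_px + claw_px]
            # All-NaN footprints compare False here, i.e. unknown (occluded)
            # space is treated as unsafe rather than as free space.
            if np.nanmax(left) < z - clearance and np.nanmax(right) < z - clearance:
                if z > best_z:                    # evaluation value = Z height
                    best_z, best_xy = z, (x, y)
    return best_xy, best_z
```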
- The position and orientation of the robot 3 at which the claws 3t of the hand 3h can be applied to the optimized candidate to grip the part can be obtained by a simple calculation: the position and orientation of the robot 3 are expressed as XYZ values in the robot coordinate system and rotation angles about each axis, and the relative position and orientation with respect to the position and orientation of the candidate are added.
- Alternatively, the gripping position and posture can be calculated by template-matching, against the distance image, a minute prismatic shape that the hand 3h can pick, extracting candidate points, and optimizing on Z-axis height; the same effect is obtained.
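The "simple calculation" mentioned above is a composition of rigid transforms. A minimal sketch with homogeneous 4×4 matrices follows; the Euler-angle convention and the numeric values are assumptions for illustration:

```python
import numpy as np

def pose_to_matrix(x, y, z, rx, ry, rz):
    """4x4 transform from XYZ [m] and rotations about X, Y, Z [rad] (Z*Y*X order)."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = (x, y, z)
    return T

# Grip pose = candidate pose (robot coordinates) composed with a fixed,
# pre-taught claw-to-candidate offset (values below are made up).
T_candidate = pose_to_matrix(0.82, 0.05, 0.31, 0.0, 0.0, 0.6)
T_offset = pose_to_matrix(0.0, 0.0, 0.02, 0.0, 0.0, 0.0)
T_grip = T_candidate @ T_offset
```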
- By the robot coordinate calculation described above, the robot 3, with the claws 3t of the hand 3h opened, moves to the gripping position obtained at that time for the parts stacked in the bulk component box 2, and closes the hand.
- the robot 3 moves the hand 3h in the Z-axis direction, pulls up the parts that can be gripped from the bulk stacking part box 2, and then passes the picked-up parts to the robot 6B.
- At this time, the robot 6B measures the part gripped by the robot 3 with the three-dimensional vision sensor 1C (distance image measuring means) provided on the hand 6h, recognizes the position and orientation of the part, determines the gripping position, approaches the part, and grips it.
- However, the robot 3 may fail in the operation of taking a part out of the bulk component box 2, or may pick up several parts tangled together in a clump.
- the isolation failure state as described above can be determined based on the imaging result of the three-dimensional vision sensor 1C attached to the robot 6B during the parts delivery operation between the robot 3 and the robot 6B.
- When the isolated state (preferably a state in which only one part, free of entanglement, is held) cannot be achieved and removal of the part has failed, the robot 3 lines out the part currently being gripped and performs the picking operation again.
- Finally, the robot 6B grips the part delivered from the robot 3 with the hand 6h, which consists of a parallel chuck hand, and arranges it on the pallet 8, completing the series of operations.
- the operation of the entire apparatus according to the sixth embodiment (FIG. 13) of the present invention will be described in the order in which the parts are handled.
- First, the three-dimensional vision sensor 1 (distance image measuring means) of the robot 3 recognizes a part in the bulk component box 2 and narrows down to one the position and orientation of a graspable portion of the recognized part (for example, a portion protruding like an ear, or a portion that can be estimated to have such a shape).
- Next, the robot 3 is operated so that the position and orientation of the graspable portion and the position and orientation of the claws 3t of the hand 3h coincide. After that, the robot 3 closes the claws 3t to grip the part, takes the part out of the bulk component box 2, and presents it near the robot 6B.
- Next, the three-dimensional vision sensor 1C (distance image measuring means) of the robot 6B recognizes the position and orientation of the presented part. At this time, if the orientation of the part needs to be changed, the robot 6B grips the part while rotating the hand 6h; if the posture must be changed in a more complex way, the part is handed back to the robot 3 once, and after the robot 3 has taken the part back, it is gripped again from a different direction.
- the control device 7C outputs a sequence control command to the robot 3 (isolation means), the robot 6B (position and orientation change means), and the three-dimensional vision sensors 1 and 1C (distance image measurement means), and performs the above series of operations. Let it run repeatedly.
- As described above, the component supply apparatus according to Embodiment 6 (FIG. 13) of the present invention includes: the bulk component box 2 that stores bulk-stacked parts; the three-dimensional vision sensor 1 (distance image measuring means) that measures a distance image of the parts in the bulk component box 2; the robot 3 (isolation means); the three-dimensional vision sensor 1C (distance image measuring means) attached to the robot 6B; and the robot 6B (position/orientation changing means) that changes the position and orientation of the parts to a position and orientation within a fixed error of a position and orientation specified in advance.
- a dedicated hand is not required for each part, and hand cost reduction, hand design time reduction, and hand temporary storage place reduction can be realized.
- In addition, production models can be switched merely by changing software, so model changeover can be carried out quickly.
- Note that, since the robots 3 and 6B are each integrally provided with the three-dimensional vision sensors 1 and 1C in a hand-eye configuration, the tact time becomes longer, but the bulk component box 2 and the pallet 8 can be made larger than the fields of view of the three-dimensional vision sensors 1 and 1C.
- Embodiment 7.
- In the fifth and sixth embodiments (FIGS. 12 and 13) described above, the robot 6B that delivers parts to and from the robot 3 is provided; however, as shown in FIG. 14, the robot 3D having the three-dimensional vision sensor 1 may also serve as the position/orientation changing means, and the robot 6B described above may be omitted.
- FIG. 14 is a side view schematically showing the overall configuration of a component supply apparatus according to Embodiment 7 of the present invention.
- In FIG. 14, the same parts as those described above (see FIG. 12) are denoted by the same reference numerals, or by a “D” appended to the reference numeral, and their detailed description is omitted. In this case, only the removal of the robot 6B differs from the above, and the robot 3D has a hand 3h as shown in FIG. 3 or FIG. 4.
- the three-dimensional vision sensor 1 cooperating with the control device 7D obtains a distance image (FIG. 2) of the bulk stacking component box 2 and performs template matching to obtain a grippable part as described above.
- the robot 3D is operated so that the claw 3t of the hand 3h of the robot 3D can be applied to the optimization candidates.
- the robot 3D (isolation means) takes out only one part from among the parts contained in the bulk parts box 2. At this time, in order to improve the isolation success probability, an area where the posture error is small is used among the calibration errors between the coordinate system of the three-dimensional vision sensor 1 and the coordinate system of the robot 3D.
- Subsequently, the robot 3D carries the part gripped from the bulk component box 2 to the temporary placement table 4 and releases it above the temporary placement table 4 so that it is tossed onto the table.
- However, the robot 3D may fail to take a part out of the bulk component box 2; a plurality of parts may become entangled and roll onto the temporary placement table 4 in a clump, or a plurality of parts may roll onto the temporary placement table 4 without being entangled.
- the state as described above can be determined by imaging the temporary placing table 4 with the three-dimensional vision sensor 1 immediately after releasing the parts gripped by the robot 3D.
- For example, when removal of a part from the bulk component box 2 has failed, the picking operation is performed again. When a plurality of parts are entangled and have rolled onto the temporary placement table 4 in a clump, the parts are lined out by means such as inverting the top plate of the temporary placement table 4. Furthermore, when a plurality of parts roll onto the temporary placement table 4 without being entangled, the robot 3D handles the parts one by one, or handles only one and then lines out the remaining parts by means such as inverting the top plate of the temporary placement table 4.
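The branching of this recovery logic is easier to see as code than as prose. A hypothetical sketch (helper names invented for illustration) of the decision taken right after the part is released:

```python
def recover_after_roll(sensor3d, robot3d, table):
    """Embodiment-7 style error recovery, immediately after releasing a part."""
    parts = sensor3d.count_parts_on_table()   # image the table right after release
    if parts == 0:
        robot3d.retry_pick()                  # removal failed: pick again
    elif sensor3d.parts_entangled():
        table.flip_top_plate()                # tangled clump: line everything out
    elif parts > 1:
        robot3d.handle_one_part()             # separated extras: handle one part,
        table.flip_top_plate()                # then line out the remainder
```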
- the three-dimensional vision sensor 1 (distance image measuring unit) cooperating with the control device 7D functions as a component outer shape measuring unit that captures an image of the part rolled on the temporary placement table 4 and acquires the outer shape.
- the position and orientation of the component on the temporary table 4 are calculated from the measured outer shape by the template matching method.
- the robot 3D picks up the parts from the temporary placement table 4 and arranges them on the pallet 8 in an aligned manner.
- the operation of the entire apparatus according to the seventh embodiment (FIG. 14) of the present invention will be described according to the order in which the parts are handled.
- First, the three-dimensional vision sensor 1 (distance image measuring means) recognizes the parts in the bulk component box 2 and narrows the graspable portions down to one, and the robot 3D is operated so that the position and orientation of the graspable portion and the position and orientation of the claws 3t (see FIGS. 3 and 4) of the hand 3h coincide.
- Next, the robot 3D closes the claws 3t to grip the part, takes the part out of the bulk component box 2, opens the claws above the temporary placement table 4, and rolls the part onto the temporary placement table 4.
- the part rests on the temporary table 4 in one of several stable states.
- Here, for simplicity of explanation, the case will be described in which the part is placed on the temporary placement table 4 in an isolated state, without entanglement or overlap.
- the three-dimensional vision sensor 1 recognizes the position and orientation of a component placed on the temporary placement table 4 by a pattern matching method between a captured image on the temporary placement table 4 and a template image registered in advance. As a result, since the three-dimensional position and orientation of the part that has been rolled onto the upper surface of the temporary placement table 4 can be measured, the robot 3D grips the component on the temporary placement table 4. At this time, if it is necessary to change the orientation of the component, the component is once released on the temporary table 4 and is gripped again from a different direction.
- the control device 7D outputs a sequence control command to the robot 3D (isolation unit and position and orientation change unit) and the three-dimensional vision sensor 1 (distance image measurement unit), and repeatedly executes the above series of operations.
- As described above, the component supply apparatus according to Embodiment 7 (FIG. 14) of the present invention includes: the bulk component box 2 that stores bulk-stacked parts; the three-dimensional vision sensor 1 (distance image measuring means) that measures a distance image of the parts in the bulk component box 2; the robot 3D (isolation means); and position/orientation changing means (the temporary placement table 4, the robot 3D, the three-dimensional vision sensor 1, and the control device 7D) that changes the position and orientation of the parts to a position and orientation within a fixed error of a final position and orientation specified in advance.
- The isolation means and the position/orientation changing means share one robot 3D that functions as both, and the three-dimensional vision sensor 1 (distance image measuring means) is provided integrally on the robot 3D.
- Thereby, as before, a dedicated hand is not required for each part, realizing hand cost reduction, shorter hand design time, and less hand storage space.
- In addition, production models can be switched merely by changing software, so model changeover can be carried out quickly.
- Embodiment 8.
- In the first embodiment described above, in the operation in which the robot 3 approaches the bulk component box 2 to pick a part, the Z axis of the coordinate frame representing the posture of the hand 3h of the robot 3 is fixed vertically downward in the world coordinate system, that is, in the direction of gravitational acceleration; the orientation of the hand 3h then only needs to rotate about the Z axis.
- Accordingly, although a vertical articulated robot is used as the robot 3 in the first embodiment, a SCARA robot (horizontal articulated robot) or a Cartesian robot may be used as the robot 3 instead.
- Using a SCARA robot or a Cartesian robot reduces the number of motors compared with a vertical articulated robot, so the system cost can be reduced.
- Embodiment 9.
- Although not specifically mentioned in the first to eighth embodiments, when the grippable parts among the bulk-stacked parts are first measured with a three-dimensional vision sensor, a plurality of three-dimensional vision sensors 1D may be used, as shown in FIG. 18. A ninth embodiment of the present invention using a plurality of three-dimensional vision sensors 1D will be described below with reference to FIGS. 15 to 18.
- FIG. 15 is a perspective view showing a three-dimensional shape of a part 10 to be gripped in the ninth embodiment of the present invention
- FIGS. 16 and 17 are explanatory diagrams showing a problem to be solved by the ninth embodiment of the present invention.
- FIG. 16A is a side view of the parts 10 stacked in bulk, and FIG. 16B is a front view of the same parts 10.
- FIG. 17A is a side view of the overlapping parts 10a and 10b, and FIG. 17B is a side view of the parts 10c and 10d in an occluded (shadowed) state.
- FIG. 18 is a side view showing a plurality of three-dimensional vision sensors 1D according to Embodiment 9 of the present invention, corresponding to the components 10a to 10d in the same state as FIG.
- If the claws 3t are closed on the part 10 from a direction in which its side faces are inclined, as in FIG. 16A, the part 10 tends to slip away while the claws 3t are closing, so the probability of gripping the part 10 is low.
- In contrast, closing the claws 3t on the part 10 from the frontal direction of FIG. 16B raises the probability that the part 10 can be gripped. Accordingly, the recognition algorithm may be configured to search for and select, not a direction in which the side faces of the protrusion (corner portion) of the part 10 are inclined as in FIG. 16A, but a direction in which the side faces rise vertically as in FIG. 16B.
- On the other hand, as shown in FIG. 17A, when parts overlap, a blank area from which no distance data is obtained may be taken for an upright part 10a and judged to be a grippable part, and the claws 3t of the hand 3h are lowered toward it.
- In that case, the claws 3t may collide with the hidden part 10b, and in the worst case the claws 3t may be damaged.
- Even when outright damage is avoided, wear and metal fatigue progress, eventually leading to breakage and a decrease in the gripping probability.
- Therefore, in the ninth embodiment, a plurality of (for example, three) three-dimensional vision sensors 1D are used, as shown in FIG. 18.
- the distance data measured by each three-dimensional vision sensor 1D is transferred to the control device 7 and synthesized as data in one space.
- the distance data measured by the other 3D vision sensor 1D is transferred and synthesized with the distance data measured by one 3D vision sensor 1D.
- Thereby, the blank areas of the respective three-dimensional vision sensors 1D complement one another, and synthesized distance data as shown in FIG. 18 is obtained.
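Synthesizing the sensors' distance data "as data in one space" amounts to transforming each sensor's measurements into a common coordinate frame and taking their union, so that one sensor's blank (occluded) areas are filled by another's. A NumPy sketch under assumed calibration transforms:

```python
import numpy as np

def merge_clouds(clouds, sensor_to_common):
    """Merge point clouds from several 3D vision sensors 1D into one frame.

    clouds           : list of (N_i, 3) point arrays, one per sensor frame.
    sensor_to_common : list of 4x4 calibration transforms, one per sensor.
    """
    merged = []
    for pts, T in zip(clouds, sensor_to_common):
        homo = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coordinates
        merged.append((homo @ T.T)[:, :3])               # into the common frame
    return np.vstack(merged)                             # union fills blank areas
```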
- Thereafter, as in the first embodiment, the operation of searching for a protruding portion is followed by, for example, the gripping operation of the part 10d. The same applies to Embodiments 2, 3, 5, and 7.
- Alternatively, instead of the plurality of fixed three-dimensional vision sensors 1D, a single sensor may be attached to the robot 3 as in the fourth and sixth embodiments described above and moved to a plurality of locations, coming to rest at each, so that distance data is measured at each stationary position. In that case, however, the overall measurement time is extended.
- As described above, according to the ninth embodiment (FIG. 18) of the present invention, by acquiring a plurality of sets of distance data, parts 10 in the bulk component box 2 that the robot 3 has a high probability of gripping can be distinguished from parts with a low probability. The success rate of gripping the part 10d therefore rises and the number of gripping retries falls, which improves the tact time of the entire system. In addition, system failures due to collisions between the part 10b and the claws 3t are avoided.
- As described above, according to the embodiments of the present invention, an apparatus for aligning (or assembling) parts supplied in bulk can be obtained, and the parts supply processes required by automatic assembly apparatuses and automatic assembly robots can be fundamentally improved.
- Because of the versatility of robots, different parts can be handled by changing the software in the control device with little change to the robot hardware.
- As for hardware changes, essentially nothing more than adjustments on the order of resizing the robot hand is required, and the same hardware can then handle a variety of parts; this contributes to cost reductions when switching products and production models.
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Human Computer Interaction (AREA)
- Manipulator (AREA)
Claims (11)
- A component supply apparatus comprising: a bulk component box that stores bulk-stacked parts; distance image measuring means for measuring a distance image of the parts in the bulk component box; isolation means for picking parts out of the bulk component box based on the distance image; and position/orientation changing means for changing the position and orientation of a part isolated by the isolation means to a position and orientation within a fixed error of a final position and orientation specified in advance.
- The component supply apparatus according to claim 1, wherein the isolation means and the position/orientation changing means comprise one robot that functions as both.
- The component supply apparatus according to claim 2, wherein the distance image measuring means comprises a three-dimensional vision sensor, and the three-dimensional vision sensor is provided integrally on the one robot.
- The component supply apparatus according to claim 1, wherein the isolation means comprises a first robot that grips and picks parts out of the bulk component box, and the position/orientation changing means comprises a second robot that receives a part from the first robot and changes the position and orientation of the part.
- The component supply apparatus according to claim 4, wherein the first and second robots each comprise a three-dimensional vision sensor provided integrally thereon.
- The component supply apparatus according to claim 4, wherein the isolation means comprises a temporary placement table on which one or more parts gripped and rolled by the first robot are placed, the position/orientation changing means comprises component outline measuring means for measuring the outer shape of the parts on the temporary placement table, and the second robot picks up a part rolled onto the temporary placement table and changes the position and orientation of the part.
- The component supply apparatus according to claim 6, wherein the distance image measuring means comprises a three-dimensional vision sensor, the component outline measuring means comprises a two-dimensional vision sensor, the second robot consists of a plurality of robots that change the position and orientation of a part picked up from the temporary placement table while passing it among themselves, and the apparatus further comprises a control device that controls the operations and operation timing of the first and second robots, the three-dimensional vision sensor, and the two-dimensional vision sensor.
- The component supply apparatus according to claim 7, wherein the three-dimensional vision sensor is provided integrally on the first robot.
- The component supply apparatus according to claim 7, wherein the distance image measuring means comprises a plurality of three-dimensional vision sensors, and the control device controls the operations and operation timing of the first and second robots, the three-dimensional vision sensors, and the two-dimensional vision sensor using data obtained by synthesizing a plurality of distance images measured by the plurality of three-dimensional vision sensors.
- The component supply apparatus according to claim 8, wherein the first robot acquires a plurality of distance images with the three-dimensional vision sensor at a plurality of stationary positions, and the control device controls the operations and operation timing of the first and second robots, the three-dimensional vision sensor, and the two-dimensional vision sensor using data obtained by synthesizing the plurality of distance images.
- The component supply apparatus according to any one of claims 1 to 10, wherein the isolation means comprises parameter optimization means, and the parameter optimization means observes and records a pick-up success rate while automatically varying parameters that define the operation of picking parts out of the bulk component box, and selects the parameters giving the best pick-up success rate.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013522797A JP5837065B2 (ja) | 2011-06-29 | 2012-06-20 | 部品供給装置 |
KR1020147002263A KR101634463B1 (ko) | 2011-06-29 | 2012-06-20 | 부품 공급 장치 |
US14/125,712 US9469035B2 (en) | 2011-06-29 | 2012-06-20 | Component supply apparatus |
CN201280032794.9A CN103687702B (zh) | 2011-06-29 | 2012-06-20 | 部件供给装置 |
DE112012002677.2T DE112012002677B4 (de) | 2011-06-29 | 2012-06-20 | Zuführvorrichtung für Bauelemente |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011-144073 | 2011-06-29 | ||
JP2011144073 | 2011-06-29 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013002099A1 true WO2013002099A1 (ja) | 2013-01-03 |
Family
ID=47423996
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/065766 WO2013002099A1 (ja) | 2011-06-29 | 2012-06-20 | 部品供給装置 |
Country Status (6)
Country | Link |
---|---|
US (1) | US9469035B2 (ja) |
JP (1) | JP5837065B2 (ja) |
KR (1) | KR101634463B1 (ja) |
CN (1) | CN103687702B (ja) |
DE (1) | DE112012002677B4 (ja) |
WO (1) | WO2013002099A1 (ja) |
Families Citing this family (49)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9605952B2 (en) | 2012-03-08 | 2017-03-28 | Quality Manufacturing Inc. | Touch sensitive robotic gripper |
JP5929854B2 (ja) * | 2013-07-31 | 2016-06-08 | 株式会社安川電機 | ロボットシステムおよび被加工物の製造方法 |
WO2015186188A1 (ja) * | 2014-06-03 | 2015-12-10 | 富士機械製造株式会社 | ばら部品供給装置および部品実装装置 |
DE102014008444A1 (de) * | 2014-06-06 | 2015-12-17 | Liebherr-Verzahntechnik Gmbh | Vorrichtung zum automatisierten Entnehmen von in einem Behälter angeordneten Werkstücken |
WO2016122840A1 (en) | 2015-01-26 | 2016-08-04 | Duke University | Specialized robot motion planning hardware and methods of making and using same |
WO2016129069A1 (ja) * | 2015-02-12 | 2016-08-18 | 富士機械製造株式会社 | 部品供給装置 |
JP6117853B2 (ja) * | 2015-05-13 | 2017-04-19 | ファナック株式会社 | バラ積みされた物品を取り出すための物品取出システム、および方法 |
KR102484200B1 (ko) * | 2015-07-13 | 2023-01-03 | 현대모비스 주식회사 | 부품 공급 장치 및 그 제어방법 |
JP6219901B2 (ja) * | 2015-10-28 | 2017-10-25 | ファナック株式会社 | ワークの向きを調整可能な仮置き装置 |
DE102016000611A1 (de) * | 2016-01-23 | 2017-07-27 | Sk-Technologies Ug (Haftungsbeschränkt) | Anordnung und Verfahren für die automatisierte Erfassung und Entnahme von Werkstücken aus ungeordneter Ansammlung |
JP2017151011A (ja) * | 2016-02-26 | 2017-08-31 | セイコーエプソン株式会社 | 電子部品搬送装置および電子部品検査装置 |
CN105668206B (zh) * | 2016-03-23 | 2019-02-12 | 深圳市华星光电技术有限公司 | 玻璃基板的分流方法、分流装置及加工系统 |
JP6548816B2 (ja) * | 2016-04-22 | 2019-07-24 | 三菱電機株式会社 | 物体操作装置及び物体操作方法 |
US11429105B2 (en) | 2016-06-10 | 2022-08-30 | Duke University | Motion planning for autonomous vehicles and reconfigurable motion planning processors |
US11122721B2 (en) | 2016-09-22 | 2021-09-14 | Fuji Corporation | Component supply system |
DE102017000527A1 (de) | 2017-01-20 | 2018-07-26 | Liebherr-Verzahntechnik Gmbh | Vorrichtung zum automatisierten Entnehmen von in einem Behälter angeordneten Werkstücken |
DE102017000524A1 (de) * | 2017-01-20 | 2018-07-26 | Liebherr-Verzahntechnik Gmbh | Vorrichtung zum automatisierten Entnehmen von in einem Behälter angeordneten Werkstücken |
CA3051434A1 (en) * | 2017-02-02 | 2018-08-09 | Walmart Apollo, Llc | Conveyor and logic systems to return, balance, and buffer processed or empty totes |
JP6453922B2 (ja) | 2017-02-06 | 2019-01-16 | ファナック株式会社 | ワークの取り出し動作を改善するワーク取り出し装置およびワーク取り出し方法 |
CN110382173B (zh) * | 2017-03-10 | 2023-05-09 | Abb瑞士股份有限公司 | 用于标识物体的方法和设备 |
DE202017101643U1 (de) * | 2017-03-21 | 2018-05-08 | Kuka Systems Gmbh | Fertigungsstation |
US10537990B2 (en) * | 2017-03-30 | 2020-01-21 | Dematic Corp. | Split robotic article pick and put system |
JP6880457B2 (ja) | 2017-11-14 | 2021-06-02 | オムロン株式会社 | 把持方法、把持システム及びプログラム |
JP6676030B2 (ja) * | 2017-11-20 | 2020-04-08 | 株式会社安川電機 | 把持システム、学習装置、把持方法、及び、モデルの製造方法 |
US11072074B2 (en) * | 2017-12-13 | 2021-07-27 | Cognex Corporation | Calibration and operation of vision-based manipulation systems |
WO2019139815A1 (en) | 2018-01-12 | 2019-07-18 | Duke University | Apparatus, method and article to facilitate motion planning of an autonomous vehicle in an environment having dynamic objects |
TWI822729B (zh) | 2018-02-06 | 2023-11-21 | 美商即時機器人股份有限公司 | 用於儲存一離散環境於一或多個處理器之一機器人之運動規劃及其改良操作之方法及設備 |
US11738457B2 (en) | 2018-03-21 | 2023-08-29 | Realtime Robotics, Inc. | Motion planning of a robot for various environments and tasks and improved operation of same |
WO2019180953A1 (ja) * | 2018-03-23 | 2019-09-26 | 株式会社Fuji | 部品装着装置 |
CN112218748B (zh) | 2018-06-14 | 2023-09-05 | 雅马哈发动机株式会社 | 机器人系统 |
CN109086736A (zh) * | 2018-08-17 | 2018-12-25 | 深圳蓝胖子机器人有限公司 | 目标获取方法、设备和计算机可读存储介质 |
JP7031540B2 (ja) * | 2018-09-07 | 2022-03-08 | オムロン株式会社 | 対象物認識装置、マニピュレータ、および移動ロボット |
KR102561103B1 (ko) * | 2018-11-16 | 2023-07-31 | 삼성전자주식회사 | 로봇 보정 시스템 및 그것의 보정 방법 |
JP7247572B2 (ja) * | 2018-12-17 | 2023-03-29 | 京セラドキュメントソリューションズ株式会社 | 制御装置 |
CN113905855B (zh) | 2019-04-17 | 2023-08-25 | 实时机器人有限公司 | 运动规划图生成用户界面、系统、方法和规则 |
CN114206698B (zh) | 2019-06-03 | 2024-07-02 | 实时机器人有限公司 | 在具有动态障碍物的环境中促进运动规划的装置、方法和物品 |
US11014295B2 (en) * | 2019-07-02 | 2021-05-25 | Saudi Arabian Oil Company | Fabrication of composite parts by additive manufacturing and microstructure topology optimization |
US11472122B2 (en) | 2019-07-02 | 2022-10-18 | Saudi Arabian Oil Company | Fabrication of composite parts by additive manufacturing and microstructure topology customization |
WO2021041223A1 (en) | 2019-08-23 | 2021-03-04 | Realtime Robotics, Inc. | Motion planning for robots to optimize velocity while maintaining limits on acceleration and jerk |
US20220241982A1 (en) * | 2019-09-18 | 2022-08-04 | Fuji Corporation | Work robot and work system |
DE102019129417B4 (de) * | 2019-10-31 | 2022-03-24 | Sick Ag | Verfahren zum automatischen Handhaben von Objekten |
CN114845844B (zh) * | 2019-12-17 | 2023-05-30 | 三菱电机株式会社 | 信息处理装置、工件识别装置及工件取出装置 |
TW202146189A (zh) | 2020-01-22 | 2021-12-16 | 美商即時機器人股份有限公司 | 於多機器人操作環境中之機器人之建置 |
KR102350638B1 (ko) * | 2020-08-26 | 2022-01-17 | 주식회사 이엠에스 | 인공지능형 로봇 시스템 |
DE102020212768B4 (de) | 2020-10-09 | 2022-05-05 | Ifc Intelligent Feeding Components Gmbh | Schüttgut-Zuführsystem |
CN112620908A (zh) * | 2020-12-28 | 2021-04-09 | 武汉智艾德科技有限公司 | 一种基于机器人与视觉匹配的自动凸焊系统及方法 |
CN112847375B (zh) * | 2021-01-22 | 2022-04-26 | 熵智科技(深圳)有限公司 | 一种工件抓取方法、装置、计算机设备及存储介质 |
US12064886B1 (en) * | 2021-03-23 | 2024-08-20 | Amazon Technologies, Inc. | Systems and methods for scalable perception and purposeful robotic picking of items from a collection |
CN117381802B (zh) * | 2023-12-12 | 2024-03-05 | 吉林省吉邦自动化科技有限公司 | 一种分布式多机器人协同控制方法 |
Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06127698A (ja) * | 1992-10-20 | 1994-05-10 | Omron Corp | 部品供給装置 |
JPH08112788A (ja) * | 1994-10-14 | 1996-05-07 | Canon Inc | 部品挿入装置及び部品挿入方法 |
JP2004230513A (ja) * | 2003-01-30 | 2004-08-19 | Fanuc Ltd | ワーク取出し装置 |
JP2006035346A (ja) * | 2004-07-23 | 2006-02-09 | Toyota Motor Corp | 部品組付け方法 |
JP2007245283A (ja) * | 2006-03-15 | 2007-09-27 | Nissan Motor Co Ltd | ワーク姿勢検知装置、ワーク姿勢検知方法、ピッキングシステム、およびピッキング方法 |
JP2008087074A (ja) * | 2006-09-29 | 2008-04-17 | Fanuc Ltd | ワーク取り出し装置 |
JP2008178930A (ja) * | 2007-01-23 | 2008-08-07 | Fuji Electric Holdings Co Ltd | ロボットのワーク把持方法及びロボット |
JP2010089238A (ja) * | 2008-10-10 | 2010-04-22 | Honda Motor Co Ltd | ワーク取り出し方法 |
JP2010105105A (ja) * | 2008-10-29 | 2010-05-13 | Olympus Corp | 自動生産装置 |
JP2010120141A (ja) * | 2008-11-21 | 2010-06-03 | Ihi Corp | バラ積みピッキング装置とその制御方法 |
JP2011000685A (ja) * | 2009-06-19 | 2011-01-06 | Denso Wave Inc | ビンピッキングシステム |
JP2011000669A (ja) * | 2009-06-18 | 2011-01-06 | Yaskawa Electric Corp | ロボットシステム及び物品並置方法 |
JP2011093058A (ja) * | 2009-10-30 | 2011-05-12 | Fuji Electric Holdings Co Ltd | 対象物把持領域抽出装置および対象物把持領域抽出装置を用いたロボットシステム |
JP2011183537A (ja) * | 2010-03-11 | 2011-09-22 | Yaskawa Electric Corp | ロボットシステム及びロボット装置並びにワーク取り出し方法 |
Family Cites Families (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4011437A (en) * | 1975-09-12 | 1977-03-08 | Cincinnati Milacron, Inc. | Method and apparatus for compensating for unprogrammed changes in relative position between a machine and workpiece |
US4402053A (en) | 1980-09-25 | 1983-08-30 | Board Of Regents For Education For The State Of Rhode Island | Estimating workpiece pose using the feature points method |
JPH09239682A (ja) | 1996-03-06 | 1997-09-16 | Nissan Motor Co Ltd | ワーク供給方法およびワーク供給装置 |
WO2003004222A2 (en) * | 2001-07-02 | 2003-01-16 | Microbotic A/S | Apparatus comprising a robot arm adapted to move object handling hexapods |
JP2005335010A (ja) * | 2004-05-26 | 2005-12-08 | Toyota Motor Corp | 把持制御装置 |
CN101522377B (zh) * | 2006-10-20 | 2011-09-14 | 株式会社日立制作所 | 机械手 |
US7844105B2 (en) * | 2007-04-23 | 2010-11-30 | Mitsubishi Electric Research Laboratories, Inc. | Method and system for determining objects poses from range images |
US8070409B2 (en) * | 2007-11-05 | 2011-12-06 | Ajax Tocco Magnethermic Corp. | Method and apparatus for transporting steel billets |
DE102008052436A1 (de) | 2008-10-21 | 2010-04-22 | Daimler Ag | Verfahren und Vorrichtung zum Vereinzeln von Bauteilen aus einem Behältnis |
CN201852793U (zh) * | 2009-09-21 | 2011-06-01 | Abb技术有限公司 | 用于生产制造部件的系统 |
JP2011115877A (ja) * | 2009-12-02 | 2011-06-16 | Canon Inc | 双腕ロボット |
CN101706968B (zh) * | 2009-12-10 | 2012-11-07 | 江苏大学 | 基于图像的果树枝干三维模型重建方法 |
US9089966B2 (en) | 2010-11-17 | 2015-07-28 | Mitsubishi Electric Corporation | Workpiece pick-up apparatus |
JP5767464B2 (ja) * | 2010-12-15 | 2015-08-19 | キヤノン株式会社 | 情報処理装置、情報処理装置の制御方法、およびプログラム |
JP5618908B2 (ja) | 2011-05-31 | 2014-11-05 | 三菱電機株式会社 | 部品供給装置 |
JP5434991B2 (ja) | 2011-09-01 | 2014-03-05 | 株式会社安川電機 | ロボット |
JP5623358B2 (ja) | 2011-09-06 | 2014-11-12 | 三菱電機株式会社 | ワーク取り出し装置 |
JP5494597B2 (ja) * | 2011-09-16 | 2014-05-14 | 株式会社安川電機 | ロボットシステム |
JP6004809B2 (ja) * | 2012-03-13 | 2016-10-12 | キヤノン株式会社 | 位置姿勢推定装置、情報処理装置、情報処理方法 |
JP5670397B2 (ja) * | 2012-08-29 | 2015-02-18 | ファナック株式会社 | バラ積みされた物品をロボットで取出す装置及び方法 |
KR102056664B1 (ko) * | 2012-10-04 | 2019-12-17 | 한국전자통신연구원 | 센서를 이용한 작업 방법 및 이를 수행하는 작업 시스템 |
JP5642759B2 (ja) * | 2012-10-31 | 2014-12-17 | ファナック株式会社 | 物品取出装置及び物品取出方法 |
JP6108860B2 (ja) * | 2013-02-14 | 2017-04-05 | キヤノン株式会社 | ロボットシステム及びロボットシステムの制御方法 |
- 2012
- 2012-06-20 US US14/125,712 patent/US9469035B2/en active Active
- 2012-06-20 DE DE112012002677.2T patent/DE112012002677B4/de active Active
- 2012-06-20 WO PCT/JP2012/065766 patent/WO2013002099A1/ja active Application Filing
- 2012-06-20 CN CN201280032794.9A patent/CN103687702B/zh active Active
- 2012-06-20 KR KR1020147002263A patent/KR101634463B1/ko active IP Right Grant
- 2012-06-20 JP JP2013522797A patent/JP5837065B2/ja active Active
Cited By (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2679353A1 (de) * | 2012-06-29 | 2014-01-01 | LIEBHERR-VERZAHNTECHNIK GmbH | Vorrichtung zur automatisierten Handhabung von Werkstücken |
EP2679352A1 (de) * | 2012-06-29 | 2014-01-01 | LIEBHERR-VERZAHNTECHNIK GmbH | Vorrichtung zum automatisierten Entnehmen von in einem Behälter angeordneten Werkstücken |
JP2015044274A (ja) * | 2013-08-29 | 2015-03-12 | 三菱電機株式会社 | 部品供給装置および部品供給装置のプログラム生成方法 |
JP2015089589A (ja) * | 2013-11-05 | 2015-05-11 | ファナック株式会社 | バラ積みされた物品をロボットで取出す装置及び方法 |
US9415511B2 (en) | 2013-11-05 | 2016-08-16 | Fanuc Corporation | Apparatus and method for picking up article randomly piled using robot |
WO2015097904A1 (ja) * | 2013-12-27 | 2015-07-02 | 富士機械製造株式会社 | 部品供給システム |
JPWO2015097904A1 (ja) * | 2013-12-27 | 2017-03-23 | 富士機械製造株式会社 | 部品供給システム |
US9949417B2 (en) | 2013-12-27 | 2018-04-17 | Fuji Machine Mfg. Co., Ltd. | Component supply system |
JP2017074651A (ja) * | 2015-10-16 | 2017-04-20 | 株式会社特電 | ワークの自動供給方法 |
JP2017100142A (ja) * | 2015-11-30 | 2017-06-08 | 株式会社特電 | 自動ナット溶接装置 |
JP2017107432A (ja) * | 2015-12-10 | 2017-06-15 | 学校法人立命館 | 機械システムの生産性能評価装置及び機械システムの生産性能評価方法 |
JP5965561B1 (ja) * | 2016-04-15 | 2016-08-10 | 宮川工機株式会社 | 金物供給装置 |
WO2018173318A1 (ja) * | 2017-03-24 | 2018-09-27 | 三菱電機株式会社 | ロボットプログラムの生成装置及び生成方法 |
KR20190070387A (ko) * | 2017-12-12 | 2019-06-21 | 한국로봇융합연구원 | 재파지를 이용하여 테스크를 수행하는 로봇 핸드 및 그 제어방법 |
KR20190070386A (ko) * | 2017-12-12 | 2019-06-21 | 한국로봇융합연구원 | 시각 정보와 촉각 정보를 함께 이용하여 객체를 파지하는 로봇 핸드 및 그 제어방법 |
KR20190070385A (ko) * | 2017-12-12 | 2019-06-21 | 한국로봇융합연구원 | 정보가 없는 객체를 파지하는 로봇 핸드 및 그 제어방법 |
KR102067878B1 (ko) | 2017-12-12 | 2020-01-17 | 한국로봇융합연구원 | 재파지를 이용하여 테스크를 수행하는 로봇 핸드 및 그 제어방법 |
KR102109697B1 (ko) | 2017-12-12 | 2020-05-13 | 한국로봇융합연구원 | 시각 정보와 촉각 정보를 함께 이용하여 객체를 파지하는 로봇 핸드 및 그 제어방법 |
KR102109696B1 (ko) | 2017-12-12 | 2020-05-13 | 한국로봇융합연구원 | 정보가 없는 객체를 파지하는 로봇 핸드 및 그 제어방법 |
JP2020190551A (ja) * | 2019-05-15 | 2020-11-26 | オムロン株式会社 | 計測システム、計測装置、計測方法、及び計測プログラム |
JP7448884B2 (ja) | 2019-05-15 | 2024-03-13 | オムロン株式会社 | 計測システム、計測装置、計測方法、及び計測プログラム |
WO2021149429A1 (ja) * | 2020-01-23 | 2021-07-29 | オムロン株式会社 | ロボットシステムの制御装置、ロボットシステムの制御方法、コンピュータ制御プログラム、及びロボットシステム |
JP2021115693A (ja) * | 2020-01-23 | 2021-08-10 | オムロン株式会社 | ロボットシステムの制御装置、ロボットシステムの制御方法、コンピュータ制御プログラム、及びロボットシステム |
JP7454132B2 (ja) | 2020-01-23 | 2024-03-22 | オムロン株式会社 | ロボットシステムの制御装置、ロボットシステムの制御方法、コンピュータ制御プログラム、及びロボットシステム |
US12097627B2 (en) | 2020-01-23 | 2024-09-24 | Omron Corporation | Control apparatus for robotic system, control method for robotic system, computer-readable storage medium storing a computer control program, and robotic system |
Also Published As
Publication number | Publication date |
---|---|
CN103687702B (zh) | 2016-08-24 |
DE112012002677B4 (de) | 2018-12-06 |
KR101634463B1 (ko) | 2016-06-28 |
US9469035B2 (en) | 2016-10-18 |
DE112012002677T5 (de) | 2014-04-17 |
JPWO2013002099A1 (ja) | 2015-02-23 |
KR20140037943A (ko) | 2014-03-27 |
DE112012002677T9 (de) | 2014-07-03 |
CN103687702A (zh) | 2014-03-26 |
JP5837065B2 (ja) | 2015-12-24 |
US20140147240A1 (en) | 2014-05-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5837065B2 (ja) | 部品供給装置 | |
JP6057862B2 (ja) | 部品供給装置および部品供給装置のプログラム生成方法 | |
US11383380B2 (en) | Object pickup strategies for a robotic device | |
CN111730603B (zh) | 机器人系统的控制装置以及控制方法 | |
US7657346B2 (en) | Object picking system | |
JP4565023B2 (ja) | 物品取り出し装置 | |
JP5685027B2 (ja) | 情報処理装置、物体把持システム、ロボットシステム、情報処理方法、物体把持方法およびプログラム | |
EP1621296A1 (en) | Transfer robot system comprising a manipulator and a temporary container depository moving synchronously with the manipulator | |
JP2017520417A (ja) | 複数の吸着カップの制御 | |
JP2007313624A (ja) | ワーク取り出し装置及び方法 | |
JP2012024903A (ja) | ワーク取出し装置およびワーク取出し方法 | |
JP7163506B2 (ja) | 作業ロボットおよび作業システム | |
CN110167723A (zh) | 作业机及拾取位置选择方法 | |
JP2012030320A (ja) | 作業システム、作業ロボット制御装置および作業プログラム | |
JP5659640B2 (ja) | ロボット制御装置、物品取り出しシステム、プログラムおよびロボットの制御方法 | |
JP5879704B2 (ja) | ロボット制御装置、物品取り出しシステム、プログラムおよびロボットの制御方法 | |
JP5458807B2 (ja) | 対象物把持領域抽出装置および対象物把持領域抽出装置を用いたロボットシステム | |
Weng et al. | A framework for robotic bin packing with a dual-arm configuration | |
JP2019501033A (ja) | 異なる保管領域に置かれる部品から部品のバッチを構成するための方法および設備 | |
JP5544957B2 (ja) | カメラ脱着ロボット装置、ワーク把持システム、およびワーク把持方法 | |
WO2024105783A1 (ja) | ロボット制御装置、ロボットシステムおよびロボット制御プログラム | |
WO2019102575A1 (ja) | 作業機および把持位置探索方法 | |
TW202432319A (zh) | 機器人控制裝置、機器人系統及機器人控制程式 | |
US20240157567A1 (en) | Picking system | |
CN116604594A (zh) | 基于区域的抓取生成 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12803812 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2013522797 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14125712 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 112012002677 Country of ref document: DE Ref document number: 1120120026772 Country of ref document: DE |
|
ENP | Entry into the national phase |
Ref document number: 20147002263 Country of ref document: KR Kind code of ref document: A |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 12803812 Country of ref document: EP Kind code of ref document: A1 |