CN110799311A - Workpiece recognition device and workpiece recognition method - Google Patents


Info

Publication number
CN110799311A
Authority
CN
China
Prior art keywords
workpiece
predetermined
installation space
highest point
recognizing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201880038077.4A
Other languages
Chinese (zh)
Other versions
CN110799311B (en)
Inventor
宫崎利彦
大野诚太
徐天奋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kawasaki Motors Ltd
Original Assignee
Kawasaki Jukogyo KK
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kawasaki Jukogyo KK
Publication of CN110799311A
Application granted
Publication of CN110799311B
Legal status: Active

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/08Controls for manipulators by means of sensing devices, e.g. viewing or touching devices

Abstract

The invention provides a workpiece recognition device and a workpiece recognition method. The control section executes: a first step (S1) of analyzing the image information to detect the point having the highest height position within a predetermined range of the installation space (T1); a second step (S2) of detecting the area of a portion, within a predetermined distance from the highest point, whose difference in height position from the highest point is smaller than a predetermined first threshold value; and a third step (S3) of recognizing that a workpiece (W1) is present at the highest point when the area satisfies a predetermined condition, the predetermined condition including a determination that the area is larger than a predetermined second threshold value.

Description

Workpiece recognition device and workpiece recognition method
Technical Field
The present invention relates to a workpiece recognition apparatus and a method of recognizing a workpiece.
Background
Conventionally, workpiece recognition is performed in an installation space in which a plurality of workpieces are installed. Such a process is performed, for example, in the object extraction device described in patent document 1.
The object extraction device of patent document 1 includes a gripping order determining device. The gripping order determining device sequentially selects, starting from the object located uppermost, a plurality of objects whose positions have been recognized, and determines the degree of interference of the objects in the selected order. In this determination, a first degree of interference that the selected object exerts on another object horizontally adjacent to it when the selected object is extracted, and a second degree of interference that the other object exerts on the selected object when the other object is extracted, are obtained, and whether or not the first degree is equal to or less than the second degree is determined. The gripping order determining device then determines the picking-up order such that objects for which this determination is affirmative are ranked earlier.
Patent document 1: japanese patent laid-open publication No. 2013-119121
However, patent document 1 and conventional workpiece recognition devices performing similar processing generally handle workpieces having a predetermined shape (for example, components used for assembling a machine), and have difficulty handling workpieces that do not have a predetermined shape (for example, food such as dry-fried food, or rocks). In addition, a conventional workpiece recognition device needs to acquire three-dimensional data by imaging the installation space in which a plurality of workpieces are installed, and to compare it with pre-stored data relating to the shape of the workpiece, such as Computer-Aided Design (CAD) data. That is, the conventional workpiece recognition device needs to acquire a large amount of data and perform complicated processing, which causes the problem of a slow processing speed.
Disclosure of Invention
Accordingly, an object of the present invention is to provide a workpiece recognition device and a workpiece recognition method that can quickly recognize a workpiece even in an installation space where a plurality of workpieces are installed, even when the workpiece does not have a predetermined shape.
In order to solve the above problem, a workpiece recognition device according to the present invention recognizes a workpiece in an installation space where a plurality of workpieces are installed, and includes: an imaging device for obtaining image information by imaging within a predetermined range of the installation space; and a control unit configured to analyze the image information obtained from the imaging device to recognize the workpiece. The control unit executes: a first step of detecting the point having the highest height position within the predetermined range of the installation space by analyzing the image information; a second step of detecting the area of a portion, within a predetermined distance from the highest point, whose difference in height position from the highest point is smaller than a predetermined first threshold; and a third step of recognizing that a workpiece is present at the highest point when the area satisfies a predetermined condition, the predetermined condition including a determination that the area is larger than a predetermined second threshold.
According to this configuration, since it is not necessary to recognize the shape of the workpiece as is generally done by a conventional workpiece recognition device, it is not necessary to process a large amount of data such as CAD data. In addition, it is not necessary to store data relating to the shape of the workpiece in advance, nor to compare the acquired data with such stored data. That is, no complicated processing is required. As a result, the workpiece recognition device according to the present invention can quickly recognize a workpiece in an installation space where a plurality of workpieces are installed, even when the workpiece does not have a predetermined shape.
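The three steps described above can be sketched on a height map (a 2D array of height values derived from the image information). The following Python sketch is illustrative only; the array representation, function name, and parameter units are assumptions, not taken from the patent:

```python
import numpy as np

def recognize_workpiece(height_map, cell_mm, max_dist_mm, t1_height_mm, t2_area_mm2):
    """Illustrative first-to-third steps on a height map (cell_mm: mm per cell)."""
    # First step: detect the point with the highest height position.
    peak = np.unravel_index(np.argmax(height_map), height_map.shape)
    peak_h = height_map[peak]
    # Second step: within max_dist_mm of the peak, measure the area whose
    # height differs from the peak by less than the first threshold.
    ys, xs = np.indices(height_map.shape)
    near = np.hypot((ys - peak[0]) * cell_mm, (xs - peak[1]) * cell_mm) <= max_dist_mm
    area = (near & (peak_h - height_map < t1_height_mm)).sum() * cell_mm ** 2
    # Third step: recognize a workpiece at the peak when the area exceeds
    # the second threshold (the predetermined condition).
    return area > t2_area_mm2, peak, area
```

A sufficiently large near-peak plateau suggests the top surface of a workpiece rather than an isolated noise spike, which is why the area test can replace shape matching against CAD data.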
The predetermined condition may further include determining that the area is smaller than a predetermined third threshold value larger than the second threshold value.
According to this configuration, the workpiece can be reliably recognized without erroneous recognition.
The imaging device may image the predetermined range of the installation space from obliquely above.
According to this configuration, since the angle of view of the imaging device is wide, a wide range can be imaged.
The imaging device may image a predetermined range of the installation space from two points, and the control unit may execute the first step and the second step by analyzing parallax obtained by the image capturing from the two points and detecting a height position within the predetermined range of the installation space.
According to this configuration, since the image information can be easily analyzed, the height position can be efficiently detected.
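The relation between the parallax obtained from the two imaging points and the height position can be sketched with the standard stereo triangulation formula: depth shrinks as disparity grows, and the height position follows by subtracting the depth from the camera height. The function name and units below are illustrative assumptions, not from the patent:

```python
def height_from_disparity(focal_px, baseline_mm, disparity_px, camera_height_mm):
    """Pinhole stereo relation: depth = focal length * baseline / disparity.
    The height position is the camera height minus that depth."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    depth_mm = focal_px * baseline_mm / disparity_px
    return camera_height_mm - depth_mm
```

Points higher in the installation space lie closer to the camera, produce larger disparity, and therefore yield a larger computed height position.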
The image pickup apparatus may further include a light projector that irradiates a dot pattern within a predetermined range of the installation space, the image pickup device may pick up an image of the predetermined range of the installation space irradiated with the dot pattern, and the controller may analyze image information obtained by picking up an image of the predetermined range of the installation space to execute the first step and the second step.
According to this configuration, since the image information can be easily analyzed, the height position can be efficiently detected.
The dot pattern irradiated by the light projector may be blue.
The blue light is short-wavelength and strong light. Therefore, according to this configuration, the dot pattern which is less susceptible to the influence of ambient light can be irradiated. In addition, for example, when food is processed as a workpiece, since there are few blue workpieces, the possibility that the workpiece and the dot pattern become similar colors can be reduced. This makes it possible to clearly irradiate the workpiece with the dot pattern, and therefore, the image information can be more easily analyzed.
The plurality of workpieces may be loaded in a bulk manner within a predetermined range of the installation space.
With this configuration, the workpiece recognition device according to the present invention can be effectively used.
The control unit may further control a robot for holding and taking out a workpiece from the predetermined range of the installation space, the robot including: a robot arm; and an end effector attached to a distal end portion of the robot arm and configured to hold the workpiece. In this case, the control unit further executes: a fourth step of, when the presence of the workpiece at the highest point is recognized in the third step, moving the end effector by the robot arm to the highest point or to a portion whose difference in height position from the highest point, within the predetermined distance from the highest point, is smaller than the predetermined first threshold; a fifth step of holding the workpiece at the highest point by the end effector; and a sixth step of moving the end effector and the workpiece held by the end effector out of the predetermined range of the installation space by the robot arm.
With this configuration, the workpiece recognition device according to the present invention can hold and pick out workpieces in order from the workpiece that is positioned at the uppermost portion within the predetermined range of the installation space and is easy to pick out. This prevents the workpiece from being damaged by the end effector touching the workpiece or from being scattered from the installation space.
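The fourth to sixth steps can be sketched as a small controller driving the robot arm and end effector. The `arm` and `effector` interfaces below are hypothetical, not an API described in the patent:

```python
class PickController:
    """Illustrative fourth-to-sixth steps; `arm` and `effector` are assumed
    to expose move_to() and suction_on() methods (hypothetical interfaces)."""

    def __init__(self, arm, effector):
        self.arm = arm
        self.effector = effector

    def pick(self, peak_xyz, retreat_xyz):
        self.arm.move_to(peak_xyz)       # fourth step: move to the highest point
        self.effector.suction_on()       # fifth step: hold by negative pressure
        self.arm.move_to(retreat_xyz)    # sixth step: carry the workpiece out
```

Running recognition again after each pick naturally realizes the order described above: the uppermost, easiest-to-take workpiece is always removed first.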
The end effector may hold the workpiece by suction using a negative pressure.
According to this structure, the workpiece can be held without being damaged. Further, this effect is particularly effective when the workpiece is a food such as a dry-fried food which is easily damaged.
The end effector may be provided on a central axis extending in a height direction of the robot, and the imaging device may be configured to image the predetermined range of the installation space from obliquely above by being provided on the robot so as to be offset from the central axis.
According to this configuration, the field angle of the imaging device is widened, and the interference with the end effector can be suppressed, so that a wide range can be efficiently imaged.
In order to solve the above problem, a workpiece recognition method according to the present invention recognizes a workpiece in an installation space where a plurality of workpieces are installed, the method including: a first step of detecting the point having the highest height position within a predetermined range of the installation space by analyzing image information obtained by imaging within the predetermined range of the installation space; a second step of detecting the area of a portion, within a predetermined distance from the highest point, whose difference in height position from the highest point is smaller than a predetermined first threshold; and a third step of recognizing that a workpiece is present at the highest point when the area satisfies a predetermined condition, the predetermined condition including a determination that the area is larger than a predetermined second threshold.
According to this configuration, since it is not necessary to recognize the shape of the workpiece as is generally done by a conventional workpiece recognition device, it is not necessary to process a large amount of data such as CAD data. In addition, it is not necessary to store data relating to the shape of the workpiece in advance, nor to compare the acquired data with such stored data. That is, no complicated steps are required. As a result, the method of recognizing a workpiece according to the present invention can quickly recognize a workpiece in an installation space where a plurality of workpieces are installed, even when the workpiece does not have a predetermined shape.
Advantageous Effects of Invention
According to the present invention, it is possible to provide a workpiece recognition device and a workpiece recognition method that can quickly recognize a workpiece in an installation space where a plurality of workpieces are installed, even when the workpiece does not have a predetermined shape.
Drawings
Fig. 1 is a diagram showing a case where a robot controlled by a workpiece recognition device according to an embodiment of the present invention is applied to a food production site and food storage work is performed.
Fig. 2 is a front view of a robot controlled by a workpiece recognition device according to an embodiment of the present invention.
Fig. 3 is a diagram showing a configuration example of each arm mechanism of a robot controlled by a workpiece recognition device according to an embodiment of the present invention.
Fig. 4 is a diagram showing an example of the configuration of the base end joint of each arm mechanism of the robot controlled by the workpiece recognition device according to the embodiment of the present invention.
Fig. 5 is a diagram showing an example of the configuration of the end joint of each arm mechanism of the robot controlled by the workpiece recognition device according to the embodiment of the present invention.
Fig. 6 is a bottom view schematically showing a configuration example of a carriage of a robot controlled by a workpiece recognition device according to an embodiment of the present invention.
Fig. 7 is a diagram showing a configuration example of each arm mechanism of a robot controlled by a workpiece recognition device according to an embodiment of the present invention, and is a diagram showing a state in which a parallel link unit is swung.
Fig. 8 is a diagram showing a configuration example of each arm mechanism of a robot controlled by a workpiece recognition device according to an embodiment of the present invention, and is a diagram showing a state in which a parallel link unit is swung.
Fig. 9 is a perspective view showing an end effector of a robot controlled by a workpiece recognition device according to an embodiment of the present invention.
Fig. 10 is a flowchart showing a process executed by the control unit of the robot controlled by the workpiece recognition device according to the embodiment of the present invention.
Detailed Description
Hereinafter, a workpiece recognition device according to an embodiment of the present invention will be described with reference to the drawings. The present invention is not limited to the embodiment. In the following, the same or corresponding elements are denoted by the same reference numerals throughout the drawings, and redundant description thereof will be omitted.
Fig. 1 is a diagram showing a case where a robot controlled by a workpiece recognition device according to an embodiment of the present invention is applied to a food production site and performs food storing work. The workpiece recognition device according to the present embodiment includes: an imaging device 90 for obtaining image information by imaging within a predetermined range (installation space) of a tray T1 disposed on the front left side in fig. 1; and a control unit 100 for analyzing the image information obtained from the imaging device 90 to recognize the dry-fried food W1 (workpiece) in the tray T1. As shown in fig. 1, the workpiece recognition device according to the present embodiment is applied to a food manufacturing site, and controls a robot 10 that performs a food storing operation on a lunch box L. Fig. 1 shows only the food manufacturing site and the distal end portion of an end effector 80, which is a part of the robot 10 provided therein.
In fig. 1, a tray T1 (installation space) is provided on the front left side, in which a plurality of dry-fried foods W1 (a plurality of workpieces) are loaded in bulk; on a tray T2 disposed on the front right side, a plurality of shaomai W2 (a plurality of workpieces) are placed independently of one another. In fig. 1, a conveyor B is provided so as to extend in the left-right direction on the back side, and a plurality of lunch boxes L are conveyed one after another from the right side toward the left side in the drawing.
The robot 10 controlled by the workpiece recognition device according to the present embodiment is installed at the food manufacturing site where the tray T1, the tray T2, and the conveyor B are provided. The robot 10 holds and takes out a dry-fried food W1 from the tray T1 provided on the front side in the figure, and stores it in a lunch box L conveyed on the conveyor B provided on the back side in the figure. The robot 10 then holds and takes out a shaomai W2 from the tray T2 and stores it in the lunch box L. The robot 10 then again holds and takes out a dry-fried food W1 from the tray T1 and stores it in the next lunch box L conveyed on the conveyor B. The robot 10 repeats the above-described operations to store food in the plurality of lunch boxes L.
Fig. 2 is a front view of a robot controlled by a workpiece recognition device according to an embodiment of the present invention. The robot 10 has a base body 2 provided with a bucket 15. The bucket 15 has an upper edge portion 15a surrounding the opening, and a plurality of ribs 17 projecting horizontally are provided on the upper edge portion 15 a. The rib 17 is supported by a peripheral portion 191b of the mounting opening 191a provided in the mount 191, and the base 2 is fixed to the mount 191 by screwing the rib 17 to the peripheral portion 191b, for example.
Fig. 2 illustrates a mode in which the lower surface of the rib 17 abuts against the upper surface of the peripheral portion 191b and is held by the peripheral portion 191b screwed with the rib 17. However, instead of this, the upper surface of the rib 17 may be held by the peripheral portion 191b screwed to the rib 17 in contact with the lower surface of the peripheral portion 191 b.
When base 2 is fixed to mount 191, the portion of robot 10 below upper edge 15a is disposed in the working space so as to hang from mount 191. The lower portion of the bucket 15 is connected to the single carriage 4 via a plurality of sets of arm mechanisms 3 (robot arms). The base end portions of the plurality of sets of arm mechanisms 3 are arranged at equal intervals in the circumferential direction in plan view. More specifically, in the reference posture of the robot 10, the plurality of arm mechanisms 3 are arranged in rotational symmetry at equal intervals about a predetermined central axis 101 (a central axis extending in the height direction). In the mode exemplified in the present embodiment, three sets of the arm mechanisms 3 are provided in rotational symmetry at equal intervals of 120 degrees about the central axis 101. In the present embodiment, the robot 10 is provided such that the central axis 101 extends in the vertical direction. However, the robot 10 is not limited to this, and the central axis 101 may be inclined with respect to the vertical direction.
Fig. 3 is a diagram showing a configuration example of each arm mechanism 3 of the robot 10.
As shown in fig. 2 and 3, each arm mechanism 3 includes a parallel link unit 19, and the parallel link unit 19 includes an arm 18, a first link 21, and a second link 22. The arm 18 is connected at its base end portion to a lower portion of the bucket 15 so as to be swingable. The pivot axes of the arms 18 are located in the same plane as each other orthogonal to the central axis 101. That is, in the present embodiment, the pivot shafts of the arms 18 are located in the same horizontal plane. The first link 21 is pivotably connected at a base end portion to a distal end portion of the arm 18, and pivotably connected at a distal end portion to the bracket 4. The second link 22 and the first link 21 extend in parallel and are arranged at the same position in the extending direction of the predetermined central axis 101. The second link 22 is pivotably connected at a proximal end portion to a distal end portion of the arm 18, and pivotably connected at a distal end portion to the bracket 4. The first link 21 and the second link 22 constitute a pair of links.
The base end side link clamping portion 20 and the base end portion of the parallel link unit 19 constitute a base end joint 8 that couples the parallel link unit 19 to the arm 18 so as to be swingable in any direction. The distal end side link clamping portion 29 and the distal end portion of the parallel link unit 19 constitute a distal end joint 9 that couples the parallel link unit 19 to the bracket 4 so as to be swingable in any direction.
Fig. 4 is a diagram showing a configuration example of the base end joint 8.
As shown in fig. 2, 3, and 4, the arm 18 has a base end side link clamping portion 20 having a pair of joint portions 23 at a tip end portion. In the pair of joint portions 23, the axis 102 extends from the distal end portion of the arm 18 in a tangential direction of a circle centered on the central axis 101, and extends from the distal end portion of the arm 18 outward in opposite directions from each other. The tip end of each joint 23 forms a spherical free end. The spherical free end of the joint portion 23 constitutes a ball portion 24.
A concave portion 25 that is concave in a direction orthogonal to the extending direction of the first link 21 and the second link 22 is formed at the base end portion of the first link 21 and the second link 22. A bowl-shaped socket 26 is detachably attached to the concave portion 25, and the inner circumferential surface of the socket 26 forms a substantially hemispherical surface. The inner peripheral surface of the socket 26 and the outer peripheral surface of the ball 24 are formed substantially in the same shape. The socket 26 and the ball 24 are fitted to each other so that the ball 24 of the pair of joint portions 23 is sandwiched by the sockets 26 of the first link 21 and the second link 22. Accordingly, the outer peripheral surface of the ball 24 and the inner peripheral surface of the socket 26 form a spherical pair, and the first link 21 and the second link 22 are coupled to the arm 18 so as to be capable of swinging with at least two degrees of freedom.
A biasing unit 27 is provided between the base end of the first link 21 and the base end of the second link 22. The biasing means 27 biases the first link 21 and the second link 22 so as to approach each other, so that the links 21 and 22 keep a state of sandwiching the base-end-side link sandwiching portion 20, and the first and second links 21 and 22 are prevented from falling off the arm 18.
Fig. 6 is a bottom view schematically showing a configuration example of the bracket 4.
As shown in fig. 2, 3, and 6, the bracket 4 is formed in a flat plate shape and is disposed in a horizontal posture extending on a plane orthogonal to the central axis 101. The bracket 4 has a plurality of tip-side link clamping portions 29. Each distal-side link clamping portion 29 is formed, for example, at the outer peripheral edge portion of the bracket 4. The plurality of distal end side link clamping portions 29 are provided in rotational symmetry at equal intervals with an axis extending parallel to the central axis 101 as a center. According to the mode exemplified in the present embodiment, the distal-end-side link clamping portion 29 is provided with three, which are provided in rotational symmetry at equal intervals of 120 degrees with an axis extending parallel to the central axis 101 as a center.
Fig. 5 is a diagram showing a configuration example of the end joint 9.
As shown in fig. 5, each of the distal end side link clamping portions 29 has a pair of joint portions 30. In the pair of joint portions 30, the axis 104 extends from the outer peripheral edge portion of the bracket 4 in the tangential direction of a circle having the central axis 101 as the center, and extends from the outer peripheral edge portion of the bracket 4 outward in opposite directions from each other. That is, the axis 104 is configured to be parallel to the axis 102 in the assembled state. The tip end of each joint 30 forms a spherical free end. The spherical free end of the joint portion 30 constitutes a ball portion 31.
The distal end portions of the first link 21 and the second link 22 are configured to be the same as the base end portions thereof. The socket 26 and the ball 31 are fitted to each other so that the ball 31 of the pair of joint portions 30 is sandwiched by the sockets 26 of the first link 21 and the second link 22. Thus, the ball 31 and the socket 26 constitute a spherical pair, and the first link 21 and the second link 22 are coupled to the bracket 4 so as to be capable of swinging with at least two degrees of freedom.
In this way, the proximal joint 8 and the distal joint 9 constitute a ball joint. The first link 21 and the second link 22 are disposed such that the base end portions thereof rotatably sandwich the base end side link sandwiching portions 20, and the distal end portions thereof rotatably sandwich corresponding distal end side link sandwiching portions 29 provided in the bracket 4.
Further, the same biasing means 27 as described above is also provided between the distal end portion of the first link 21 and the distal end portion of the second link 22. This structure allows the links 21 and 22 to hold the distal-end-side link holding portion 29, thereby preventing the links 21 and 22 from falling off the bracket 4. In this way, the parallel link unit 19 is configured as a parallel link.
With the above configuration, the bracket 4 is coupled to the base 2 via a plurality of sets of parallel links including the first and second links 21 and 22. As shown in fig. 2, an arm actuator 13 for swinging the base end of each arm 18 with respect to the base 2 is provided at the lower portion of the base 2. When the arms 18 swing in accordance with the operation of the arm actuators 13, the carriage 4 moves to a predetermined position defined by the plurality of arm mechanisms 3 within the movable range. At this time, the posture of the bracket 4 with respect to the base 2 is restrained (defined) by the plurality of sets of parallel links, and the posture in the plane direction orthogonal to the predetermined central axis 101 is maintained.
When the carriage 4 moves to a predetermined position defined by the plurality of arm mechanisms 3 within the movable range, the inclination angle of the parallel link unit 19 with respect to the center axis 101 changes. As shown in fig. 7 and 8, the first link 21 and the second link 22 swing on a plane passing through the first link 21 and the second link 22. Thus, the interval between the first link 21 and the second link 22 changes in the direction orthogonal to the extending direction of the first link 21 and the second link 22 as viewed from the normal direction passing through the plane of the first link 21 and the second link 22. Specifically, in a state where the first link 21 and the second link 22 are in an attitude orthogonal to the pair of joint portions 23 and the pair of joint portions 30, the distance D1 between the first link 21 and the second link 22 in the direction orthogonal to the first link 21 and the second link 22 is the largest. Then, as the swing from this posture is performed, the interval between the first link 21 and the second link 22 in the direction orthogonal to the first link 21 and the second link 22 becomes narrower. That is, the intervals D2 and D3 shown in fig. 7 and 8 are smaller than the interval D1.
As shown in fig. 2, an end effector 80 for holding the dry-fried food W1 and the shaomai W2 is mounted on the lower surface side of the carriage 4 (the distal end portion of the robot arm).
The workpiece recognition device according to the present embodiment includes an imaging device 90 for obtaining image information by imaging within a predetermined range (installation space) of the tray T1 and the tray T2. Here, the predetermined range of the installation space may mean, for example, one of two regions into which the tray T1 (or the tray T2) is divided in a plan view, one of three or more regions into which it is divided, or a region selected arbitrarily without such division. The processing may also be repeated for each of the regions into which the tray T1 (or the tray T2) is divided in a plan view. Alternatively, the predetermined range of the installation space may be the entire tray T1 (or tray T2) in a plan view.
As shown in fig. 2, the imaging device 90 is attached between two of the arm mechanisms 3 at the same or substantially the same height position as the upper end portions of the arm mechanisms 3 of the robot 10. The imaging device 90 is provided on the robot 10 so as to be offset from the central axis 101, and thereby images the predetermined ranges within the tray T1 and the tray T2 from obliquely above. In addition, the imaging device 90 images the interiors of the tray T1 and the tray T2 from two points within the device.
The workpiece recognition device according to the present embodiment further includes a projector, not shown in the drawings. The projector projects a dot pattern into the tray T1 and the tray T2. The dot pattern is blue. The projector may be attached directly to the robot 10, or may be provided separately from the robot 10.
Fig. 9 is a perspective view showing the end effector of the robot controlled by the workpiece recognition device according to the embodiment of the present invention. The end effector 80 according to the present embodiment is provided on the central axis 101 extending in the height direction of the robot 10, and can hold one dry-fried food W1 or one shaomai W2 by suction using negative pressure. The end effector 80 includes: a mounting portion 81 mounted on the lower surface side of the carriage 4 (the distal end portion of the robot arm); three shaft members 82 extending downward in the vertical direction from the bottom surface of the mounting portion 81; a negative pressure forming portion 83 attached to the lower end portions of the three shaft members 82; and a holding portion 84 that is flexible, is attached to the bottom surface of the negative pressure forming portion 83, and is formed in a hollow, substantially cylindrical shape. With the holding portion 84 configured as described above, even an uneven workpiece such as the dry-fried food W1 can be held without damage, by suction so as to cover it.
(an example of processing performed by the control unit 100)
An example of processing executed by the control unit 100 of the workpiece recognition apparatus according to the embodiment of the present invention will be described mainly with reference to fig. 10. Fig. 10 is a flowchart illustrating a process executed by the control unit 100 of the robot according to the embodiment of the present invention.
First, when the control section 100 detects, by a sensor or the like, that the lunch box L being conveyed by the conveyor B has passed a predetermined position, it analyzes the image information obtained by the imaging device 90 to detect the point having the highest height position within a predetermined range (the installation space) of the tray T1 in which a plurality of dry-fried foods W1 are stacked in bulk. The first step S1 is thus performed.
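If the analysis yields a two-dimensional height map over the predetermined range, the first step S1 reduces to locating the maximum. A minimal sketch (the array layout and height values are illustrative assumptions, not taken from the patent):

```python
import numpy as np

def find_highest_point(height_map):
    """Return (row, col) of the highest point in a 2-D height map (step S1)."""
    idx = np.unravel_index(np.argmax(height_map), height_map.shape)
    return tuple(int(i) for i in idx)

# Illustrative 5x5 height map (values in mm); the peak sits at (2, 3).
h = np.array([
    [10, 11, 12, 11, 10],
    [11, 14, 18, 20, 12],
    [12, 15, 22, 30, 13],
    [11, 14, 19, 21, 12],
    [10, 11, 12, 11, 10],
], dtype=float)
print(find_highest_point(h))  # (2, 3)
```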
Next, the control unit 100 detects the area of the portion, within a predetermined distance from the highest point detected in the first step S1, where the difference between its height position and the height position of the highest point is smaller than a predetermined first threshold. This performs the second step S2.
The control unit 100 may perform the first step S1 and the second step S2 by analyzing the parallax between images of the tray T1 captured from two points by the imaging device 90 to detect height positions within the tray T1. Alternatively, the dot pattern may be projected into the tray T1 by the projector (not shown), and the first step S1 and the second step S2 may be performed by analyzing image information obtained by imaging the tray T1 onto which the dot pattern is projected.
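The patent does not specify how the parallax is converted into height positions; the standard two-view triangulation relation Z = f·B/d is one common choice (the focal length, baseline, and mounting height below are illustrative assumptions):

```python
def height_from_disparity(disparity_px, focal_px, baseline_m, camera_height_m):
    """Classic stereo relation: depth Z = f * B / d. For a downward-looking
    camera pair, surface height above the floor is camera_height - Z.
    All parameter values used below are illustrative, not from the patent."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    depth = focal_px * baseline_m / disparity_px
    return camera_height_m - depth

# A point seen with larger disparity is closer to the cameras, i.e. higher.
low = height_from_disparity(40.0, focal_px=800.0, baseline_m=0.05, camera_height_m=1.2)
high = height_from_disparity(50.0, focal_px=800.0, baseline_m=0.05, camera_height_m=1.2)
print(round(low, 3), round(high, 3))  # 0.2 0.4
```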
When the area detected in the second step S2 satisfies a predetermined condition, the control unit 100 recognizes that a dry-fried food W1 (workpiece) is present at the highest point. Here, the predetermined condition includes determining that the area is larger than a predetermined second threshold. Thus, the third step S3 is executed.
The predetermined condition may further include determining that the area is smaller than a predetermined third threshold value larger than the second threshold value.
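Under the same illustrative height-map assumption, the second step S2 and the third step S3 can be sketched as follows; the distance metric, cell area, and threshold values are assumptions for illustration:

```python
import numpy as np

def recognize_at_peak(height_map, peak, max_dist, t1, t2, t3=None, cell_area=1.0):
    """Steps S2/S3: measure the area near the peak whose height lies within
    the first threshold t1 of the peak height, then test it against the
    second threshold t2 (and, optionally, the larger third threshold t3)."""
    rows, cols = np.indices(height_map.shape)
    dist = np.hypot(rows - peak[0], cols - peak[1])       # distance from the peak
    near = dist <= max_dist                               # within the predetermined distance
    level = (height_map[peak] - height_map) < t1          # height difference < first threshold
    area = np.count_nonzero(near & level) * cell_area     # detected area (step S2)
    ok = area > t2 and (t3 is None or area < t3)          # predetermined condition (step S3)
    return area, ok

# Flat tray floor at 10 mm with one cross-shaped workpiece-like bump at its center.
h = np.full((5, 5), 10.0)
h[2, 2] = 30.0
h[1, 2] = h[3, 2] = h[2, 1] = h[2, 3] = 29.0
area, ok = recognize_at_peak(h, (2, 2), max_dist=2.0, t1=3.0, t2=3.0, t3=10.0)
print(area, ok)  # 5.0 True
```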
Further, when the control section 100 recognizes in the third step S3 that a dry-fried food W1 is present at the highest point, it moves the end effector 80, by means of the three sets of arm mechanisms 3 (robot arm), to the highest point detected in the first step S1, or to a portion, within a predetermined distance from the highest point, where the difference between its height position and the height position of the highest point is smaller than the predetermined first threshold. This executes the fourth step S4. Here, the end effector may be moved, for example, to the center of gravity of the portion, within the predetermined distance from the highest point, whose height difference from the highest point is smaller than the predetermined first threshold. This makes it easy to hold the dry-fried food W1 (workpiece).
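The center-of-gravity target mentioned above can be computed from the same near-height mask used in the second step S2; a sketch under the same illustrative height-map assumption:

```python
import numpy as np

def grasp_point(height_map, peak, max_dist, t1):
    """Centroid (row, col) of the near-peak, near-height region — one
    candidate move target for step S4."""
    rows, cols = np.indices(height_map.shape)
    near = np.hypot(rows - peak[0], cols - peak[1]) <= max_dist
    level = (height_map[peak] - height_map) < t1
    mask = near & level
    return float(rows[mask].mean()), float(cols[mask].mean())

# A small level plateau of two cells, (2, 2) and (2, 3), on a flat floor.
h = np.full((5, 5), 10.0)
h[2, 2:4] = 30.0
print(grasp_point(h, (2, 2), max_dist=2.0, t1=2.0))  # (2.0, 2.5)
```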
Next, the dry-fried food W1 present at the highest point detected in the first step S1 is held by the end effector 80. At this time, the end effector 80 may hold the dry-fried food W1 by suction using negative pressure. This executes the fifth step S5.
Finally, the end effector 80 and the dry-fried food W1 held by it are moved out of the predetermined range of the tray T1 (the installation space) by the three sets of arm mechanisms 3 (robot arm). This executes the sixth step S6. Further, the three arm mechanisms 3 place the taken-out dry-fried food W1 at a predetermined position in the lunch box L conveyed by the conveyor B, thereby performing the packing operation for the lunch box L.
As described above, the control unit 100 holds and takes out a dry-fried food W1 from the tray T1 in which a plurality of dry-fried foods W1 are stacked in bulk, and packs it into the lunch box L.
Further, after packing the dry-fried food W1 as described above, the robot 10 holds and takes out a shaomai W2 from among the plurality of shaomai W2 placed in the tray T2, and performs the packing operation for the same lunch box L. Here, unlike the dry-fried foods W1 stacked in bulk in the tray T1, the shaomai W2 are placed in the tray T2 so as not to overlap one another. However, since the individual shaomai W2 differ slightly in shape, their height positions within the tray T2 also vary. Therefore, in the same manner as when holding and taking out the bulk-stacked dry-fried foods W1 in the tray T1, the control unit 100 also detects the height positions within the tray T2 and executes the first step S1 to the sixth step S6.
Further, by repeating the first step S1 to the sixth step S6, the robot 10 can perform the same packing operation for the plurality of lunch boxes L conveyed sequentially on the conveyor B. In addition, data on the tray T1 and on the smallest dry-fried food W1 permitted to be packed into the lunch box L may be stored in advance; based on this pre-stored data, when all the dry-fried foods W1 stacked in bulk in the tray T1 have been used up, this situation is recognized and the take-out operation of the dry-fried foods W1 is ended. The same applies to the shaomai W2, and the description is therefore not repeated here.
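The end-of-tray check described above is not spelled out in the patent; one plausible realization (assuming the stored data are the empty-tray floor height and the height of the smallest permitted workpiece — both hypothetical names) is:

```python
def tray_empty(highest_point_mm, tray_floor_mm, min_workpiece_mm):
    """True when the highest detected point is too low to be even the
    smallest permitted workpiece resting on the tray floor."""
    return highest_point_mm < tray_floor_mm + min_workpiece_mm

print(tray_empty(12.0, tray_floor_mm=10.0, min_workpiece_mm=15.0))  # True
print(tray_empty(32.0, tray_floor_mm=10.0, min_workpiece_mm=15.0))  # False
```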
(Effect)
The workpiece recognition device according to the present embodiment analyzes image information obtained by imaging, from above, a predetermined range of the tray T1 in which dry-fried foods W1 are stacked in bulk, and detects the point having the highest height position within the predetermined range of the tray T1. When it determines that a predetermined condition is satisfied within a predetermined distance from the highest point, it recognizes that a dry-fried food W1 is present at the highest point. Here, the predetermined condition includes determining that the area of the portion, within the predetermined distance from the highest point, where the difference between its height position and the height position of the highest point is smaller than a predetermined first threshold is larger than a predetermined second threshold. Thus, in the present embodiment, since it is not necessary to recognize the shape of the workpiece, as a conventional workpiece recognition device generally does, there is no need to process large amounts of data such as CAD data. Nor is it necessary to store data on the shape of the workpiece in advance, or to perform matching and comparison against such data. That is, no complicated processing is required. As a result, the workpiece recognition device according to the present embodiment can quickly recognize a workpiece in an installation space in which a plurality of workpieces are placed, even when the workpieces do not have a predetermined shape.
Further, by having the predetermined condition additionally include the determination that the area detected in the second step S2 is smaller than a predetermined third threshold larger than the second threshold, the workpiece can be recognized reliably, without erroneous recognition.
The imaging device 90 according to the present embodiment images the predetermined range (installation space) of the tray T1 from obliquely above, which widens the angle of view and thus allows a wider range to be imaged.
In the present embodiment, the first step S1 and the second step S2 are performed by analyzing the parallax between images of the interior of the tray T1 captured from two points to detect the height positions within the tray T1. Accordingly, the workpiece recognition device according to the present embodiment can easily analyze the image information, and can therefore detect height positions efficiently.
In the present embodiment, the first step S1 and the second step S2 are performed by analyzing image information obtained by imaging the tray T1 onto which the dot pattern is projected by the projector. Accordingly, the workpiece recognition device according to the present embodiment can easily analyze the image information, and can therefore detect height positions efficiently.
In this embodiment, the dot pattern projected by the projector is blue. Blue light has a short wavelength and is intense; therefore, using blue makes it possible to project a dot pattern that is less susceptible to ambient light. In addition, when the workpiece is a food such as the dry-fried food W1, as in the present embodiment, workpieces colored blue are very rare, so the possibility that the dot pattern and the workpiece are similar in color can be reduced. This allows the dot pattern to be projected clearly onto the workpiece, making the image information even easier to analyze.
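The patent does not describe how the blue dots are extracted from the images; one common approach (an assumption for illustration) is to keep only pixels whose blue channel clearly dominates the red and green channels:

```python
import numpy as np

def blue_dot_mask(rgb, margin=50):
    """Boolean mask of pixels where blue exceeds both red and green by `margin`.
    `margin` is an illustrative tuning parameter, not a value from the patent."""
    r = rgb[..., 0].astype(int)   # cast to int to avoid uint8 underflow
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (b - r > margin) & (b - g > margin)

# 2x2 toy image: two saturated blue dots, one brownish food pixel, one gray pixel.
img = np.array([[[10, 20, 200], [150, 100, 60]],
                [[120, 120, 120], [30, 40, 220]]], dtype=np.uint8)
print(blue_dot_mask(img))
```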
In the present embodiment, the above processing is performed on the tray T1 (installation space) in which a plurality of dry-fried foods W1 (workpieces) are stacked in bulk. This allows the workpiece recognition device according to the present embodiment to be used effectively.
In the present embodiment, the control unit 100 of the workpiece recognition device also controls the robot 10, which holds and takes out the dry-fried food W1 (workpiece) from the predetermined range (installation space) of the tray T1. Thus, by using the workpiece recognition device according to the present embodiment, the dry-fried foods W1 can be held and taken out starting from the one located at the uppermost position within the predetermined range of the tray T1, which is the easiest to take out. As a result, it is possible to prevent the end effector 80 from touching and damaging a dry-fried food W1, or from knocking a dry-fried food W1 out of the tray T1.
The end effector 80 according to the present embodiment holds the dry-fried food W1 by suction using negative pressure. This makes it possible to hold the workpiece without damaging it, which is particularly effective when the workpiece is easily damaged, as with the dry-fried food W1 of the present embodiment.
The end effector 80 according to the present embodiment is provided on the central axis 101 extending in the height direction of the robot 10, and the imaging device 90 is provided on the robot 10 so as to be offset from the central axis 101, thereby imaging the predetermined range of the tray T1 from obliquely above. This widens the angle of view of the imaging device 90 and prevents interference with the end effector 80, enabling a wide range to be imaged efficiently.
(modification example)
In the above embodiment, the case was described in which the control unit 100 of the workpiece recognition device controls the robot 10 to hold and take out the workpiece from the predetermined range of the installation space by performing the fourth step S4 to the sixth step S6, but the present invention is not limited to this. That is, the control unit 100 of the workpiece recognition device may perform only the first step S1 to the third step S3, after which the robot 10 may perform any other processing on the workpiece present at the highest point.
In the above embodiment, the case was described in which the workpieces are foods that do not have a predetermined shape (the dry-fried food W1 and the shaomai W2), but the invention is not limited to this. For example, the workpieces may be rocks having no predetermined shape. The workpiece recognition device and the workpiece recognition method according to the present invention are particularly effective for work on workpieces that do not have a predetermined shape, but can equally be applied to work on workpieces having a predetermined shape (for example, parts used in machine assembly).
In the above-described embodiment, the case was described in which the end effector 80 holds the workpiece (the dry-fried food W1 and the shaomai W2) by suction using negative pressure, but the invention is not limited to this. For example, the end effector 80 may hold the workpiece by piercing it with a needle, or may hold (grip) it by clamping it from both sides.
In the above embodiment, the case was described in which the robot 10 includes the three sets of suspended arm mechanisms 3 as the robot arm, but the invention is not limited to this. For example, the robot 10 may have a single articulated robot arm, with the end effector 80 attached to its distal end. Further, the robot 10 may be installed at the side of the installation space.
Many modifications and other embodiments of the invention will come to mind to one skilled in the art to which this invention pertains having the benefit of the teachings presented in the foregoing descriptions. Accordingly, the foregoing description is to be construed as illustrative only and is for the purpose of teaching those skilled in the art the best mode of carrying out the invention. The configuration and/or detailed functions thereof can be substantially changed without departing from the spirit of the present invention.
Description of reference numerals:
2 … base body; 3 … arm mechanism; 4 … carriage; 8 … base end fitting; 9 … end joint; 10 … robot; 13 … arm actuator; 15 … bucket; 15a … upper edge portion; 17 … rib; 18 … arm; 19 … parallel link unit; 20 … base-end-side link clamping portion; 21 … first link; 22 … second link; 23 … joint portion; 24 … ball portion; 25 … concave portion; 26 … socket; 27 … force application unit; 29 … tip-side link clamping portion; 30 … joint portion; 31 … ball portion; 80 … end effector; 81 … mounting portion; 82 … shaft member; 83 … negative pressure forming portion; 84 … holding portion; 90 … imaging device; 100 … control unit; 101 … central axis; 102 … axis; 104 … axis; 191 … stand; 191a … mounting port; 191b … peripheral portion; B … conveyor belt; L … lunch box; T1, T2 … tray; W1 … dry-fried food; W2 … shaomai.

Claims (11)

1. A workpiece recognition apparatus for recognizing a workpiece in an installation space in which a plurality of workpieces are placed,
characterized in that
the workpiece recognition device is provided with:
a photographing device for photographing within a prescribed range of the setting space to obtain image information; and
a control unit for analyzing image information obtained from the imaging device to recognize the workpiece,
the control section executes:
a first step of detecting a point having a highest height position within a predetermined range of the installation space by analyzing the image information,
a second step of detecting an area of a portion, within a predetermined distance from the highest point, where a difference between a height position thereof and a height position of the highest point is smaller than a predetermined first threshold value, and
a third step of recognizing that the workpiece is present at the highest point when the area satisfies a predetermined condition,
the predetermined condition includes determining that the area is larger than a predetermined second threshold value.
2. The apparatus for recognizing a workpiece according to claim 1,
the predetermined condition further includes determining that the area is smaller than a predetermined third threshold value that is larger than the second threshold value.
3. The apparatus for identifying a workpiece according to claim 1 or 2,
the imaging device images the predetermined range of the installation space from obliquely above.
4. The apparatus for recognizing a workpiece according to any one of claims 1 to 3,
the imaging device images the predetermined range of the installation space from two points, and
the control unit executes the first step and the second step by detecting height positions within the predetermined range of the installation space through analysis of parallax between the images captured from the two points.
5. The apparatus for recognizing a workpiece according to any one of claims 1 to 4,
the workpiece recognition apparatus further comprises a light projector,
the light projector irradiates a dot pattern within a predetermined range of the installation space,
the imaging device images a predetermined range of the installation space to which the dot pattern is irradiated,
the control unit executes the first step and the second step by analyzing image information obtained by imaging within a predetermined range of the installation space.
6. The apparatus for recognizing a workpiece according to claim 5,
the dot pattern illuminated by the light projector is blue.
7. The apparatus for recognizing a workpiece according to any one of claims 1 to 6,
the plurality of workpieces are stacked in bulk within a predetermined range of the installation space.
8. The apparatus for recognizing a workpiece according to any one of claims 1 to 7,
the control unit further controls a robot for holding and taking out the workpiece from a predetermined range of the installation space,
the robot is provided with:
a robot arm; and
an end effector mounted to a distal end portion of the robot arm for holding the workpiece,
the control section further performs:
a fourth step of, when it is recognized in the third step that the workpiece is present at the highest point, moving the end effector by the robot arm to the highest point or to a portion, within a predetermined distance from the highest point, where a difference between a height position thereof and a height position of the highest point is smaller than the predetermined first threshold value,
a fifth step of holding the workpiece present at the highest point by the end effector, and
a sixth step of moving the end effector and the workpiece held by the end effector out of the predetermined range of the installation space by the robot arm.
9. The apparatus for recognizing a workpiece according to claim 8,
the end effector holds the workpiece by performing suction with negative pressure.
10. The apparatus for recognizing a workpiece according to claim 8 or 9,
the end effector is provided on a central axis extending in a height direction of the robot,
the imaging device is provided on the robot so as to be offset from the central axis, and thereby images the predetermined range of the installation space from obliquely above.
11. A workpiece recognition method for recognizing a workpiece in an installation space in which a plurality of workpieces are placed,
characterized in that
the method for identifying a workpiece includes:
a first step of analyzing image information obtained by imaging a predetermined range of the installation space to detect a point having a highest height position within the predetermined range of the installation space,
a second step of detecting an area of a portion, within a predetermined distance from the highest point, where a difference between a height position thereof and a height position of the highest point is smaller than a predetermined first threshold value, and
a third step of recognizing that the workpiece exists at the highest point when the area satisfies a predetermined condition,
the predetermined condition includes determining that the area is larger than a predetermined second threshold value.
CN201880038077.4A 2017-06-09 2018-06-07 Workpiece recognition device and workpiece recognition method Active CN110799311B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2017-114569 2017-06-09
JP2017114569A JP6860432B2 (en) 2017-06-09 2017-06-09 Work recognition device and work recognition method
PCT/JP2018/021889 WO2018225827A1 (en) 2017-06-09 2018-06-07 Workpiece recognition device, and method for recognizing workpiece

Publications (2)

Publication Number Publication Date
CN110799311A true CN110799311A (en) 2020-02-14
CN110799311B CN110799311B (en) 2022-06-24

Family

ID=64567408

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880038077.4A Active CN110799311B (en) 2017-06-09 2018-06-07 Workpiece recognition device and workpiece recognition method

Country Status (3)

Country Link
JP (1) JP6860432B2 (en)
CN (1) CN110799311B (en)
WO (1) WO2018225827A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7120153B2 (en) 2019-05-21 2022-08-17 トヨタ自動車株式会社 Work identification method
JP2022116606A (en) * 2021-01-29 2022-08-10 セイコーエプソン株式会社 Robot control method and robot system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0430991A (en) * 1990-05-25 1992-02-03 Toyoda Mach Works Ltd Robot with visual device
JP2012135820A (en) * 2010-12-24 2012-07-19 Ihi Corp Automatic picking device and automatic picking method
JP2014024142A (en) * 2012-07-26 2014-02-06 Fanuc Ltd Apparatus and method for taking out bulk articles by robot
JP2015089590A (en) * 2013-11-05 2015-05-11 ファナック株式会社 Method and apparatus for taking out bulked article by using robot
CN105034024A (en) * 2014-04-15 2015-11-11 株式会社安川电机 Robot control system, information communication module, a robot controller and a robot control method
JP2017030115A (en) * 2015-08-04 2017-02-09 株式会社リコー Picking device, picking method and picking program


Also Published As

Publication number Publication date
CN110799311B (en) 2022-06-24
JP2018202593A (en) 2018-12-27
WO2018225827A1 (en) 2018-12-13
JP6860432B2 (en) 2021-04-14

Similar Documents

Publication Publication Date Title
JP4309439B2 (en) Object take-out device
US10124489B2 (en) Locating, separating, and picking boxes with a sensor-guided robot
US10894324B2 (en) Information processing apparatus, measuring apparatus, system, interference determination method, and article manufacturing method
US11046530B2 (en) Article transfer apparatus, robot system, and article transfer method
US9415511B2 (en) Apparatus and method for picking up article randomly piled using robot
US9126337B2 (en) Robot system having a robot for conveying a workpiece
JP2018027581A (en) Picking system
CN112368212B (en) Manipulator system for pattern recognition based inspection of drug containers
JP2018161692A (en) Information processing system, information processing method and program
Nerakae et al. Using machine vision for flexible automatic assembly system
CN113825598A (en) Object grasping system and method
US20120165986A1 (en) Robotic picking of parts from a parts holding bin
CN110799311B (en) Workpiece recognition device and workpiece recognition method
US10434652B2 (en) Workpiece picking system
US9604360B2 (en) Robot system for preventing accidental dropping of conveyed objects
JP6909609B2 (en) Inspection system and controllers and programs for controlling the system
CN112292235A (en) Robot control device, robot control method, and robot control program
US20180311824A1 (en) Article retrieval system
JP2012076216A (en) Method for combining camera coordinate system and robot coordinate system in robot control system, image processing device, program, and storage medium
US11926061B2 (en) Logistic device
CN110587592B (en) Robot control device, robot control method, and computer-readable recording medium
JP7143410B2 (en) robot system
KR102109698B1 (en) Object auto sorting, classifying system using image processing algorithm
JP2024052998A (en) Handling Equipment
JP6167760B2 (en) Article position recognition device

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant