WO2023013743A1 - Holding parameter estimation device and holding parameter estimation method - Google Patents
Holding parameter estimation device and holding parameter estimation method
- Publication number
- WO2023013743A1 (PCT/JP2022/030016)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- holding
- end effector
- held
- area
- control unit
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
Definitions
- the present disclosure relates to a retention parameter estimation device and a retention parameter estimation method.
- Patent Document 1: Japanese patent document
- a retention parameter estimation device comprises: an acquisition unit that acquires information about an end effector having a holding part that holds an object to be held with an arbitrary opening width, holding object information indicating the holding object, and depth data related to the holding object; and a control unit that acquires an end effector model indicating a region where the end effector may exist based on the information, and estimates the opening width of the holding part based on the end effector model, the holding object information, and the depth data.
- the retention parameter estimation method includes: acquiring information about an end effector having a holding portion that holds an object to be held with an arbitrary opening width, holding object information indicating the object to be held, and depth data about the object to be held; acquiring an end effector model indicating a region where the end effector may exist based on the information; and estimating, based on the end effector model, the holding object information, and the depth data, an opening width of the holding portion for holding the object.
- FIG. 1 is a schematic diagram showing a configuration example of a robot control system according to an embodiment.
- FIG. 2 is a side view showing an example of holding an object to be held with an end effector.
- FIG. 3 is a block diagram showing a configuration example of a robot control system according to an embodiment.
- FIG. 4 is a diagram showing an example of a mask image representing the outer shape of a holding object on the plane at which the holding object is held.
- FIG. 5A is a diagram showing a model representing the finger opening width of an end effector model in which the holding portion has two fingers.
- FIG. 5B is a diagram showing a model representing the finger stroke of an end effector model in which the holding portion has two fingers.
- FIG. 5C is a diagram showing an overall model of an end effector model in which the holding portion has two fingers, combining the finger opening width and the stroke.
- FIG. 6A is a diagram showing a model representing the finger opening width of an end effector model in which the holding portion has three fingers.
- FIG. 6B is a diagram showing a first model representing the finger stroke of an end effector model in which the holding portion has three fingers.
- FIG. 6C is a diagram showing a second model representing the finger stroke of an end effector model in which the holding portion has three fingers.
- FIG. 7 is a diagram showing an example of an approach map.
- FIG. 8 is a diagram showing a technique for generating a second area of the approach map.
- FIG. 9 is a diagram in which an object area and an obstacle area are arranged in the second area while the approach map is being created.
- FIG. 10 is a diagram obtained by subjecting the approach map of FIG. 7 to dilation processing.
- FIG. 11 is a diagram showing an example of positions at which an end effector model is projected onto an approach map.
- FIG. 12 is a diagram showing positions where an end effector model can be projected onto an approach map and positions where it cannot.
- FIG. 13 is a diagram for explaining a method of estimating a possible existence area based on projection of an end effector model onto an approach map.
- FIG. 14 is a diagram showing the center position of the possible existence area corresponding to each holding part.
- FIG. 15 is an external view of a holding object having a special shape.
- FIG. 16 is a diagram for explaining that the center position of the possible existence area, based on projection of the end effector model onto the approach map created for the holding object of FIG. 15, is off the movable straight line.
- FIG. 17 is a diagram illustrating a method of creating an opening width model based on estimated opening widths.
- FIG. 18 is a diagram showing an example of a surrounding environment map.
- FIG. 19 is a diagram showing an object map representing the center of a held object.
- FIG. 20 is a diagram showing an object map representing the priority of holding positions specified by a user.
- FIG. 21 is a diagram showing an example of a contact map.
- a robot control system 100 includes a robot 2, a camera 4, a robot control device 110, and a holding parameter estimation device 10.
- the robot 2 holds the object 80 to be held by the end effector 2B and performs the work.
- a robot controller 110 controls the robot 2.
- the holding parameter estimating device 10 estimates, as holding parameters, the opening width of the end effector 2B described later and the contact position when holding the object 80, and outputs them to the robot control device 110.
- the robot 2 holds the object 80 positioned on the work start table 6. That is, the robot control device 110 controls the robot 2 to hold the object 80 on the work start table 6.
- the robot 2 may move the object 80 to be held from the work start table 6 to the work target table 7.
- the holding target 80 is also called a work target.
- the robot 2 operates inside the operating range 5 .
- the robot 2 has an arm 2A and an end effector 2B.
- the arm 2A may be configured as, for example, a 6-axis or 7-axis vertical articulated robot.
- the arm 2A may be configured as a 3-axis or 4-axis horizontal articulated robot or SCARA robot.
- the arm 2A may be configured as a 2-axis or 3-axis Cartesian robot.
- Arm 2A may be configured as a parallel link robot or the like.
- the number of shafts forming the arm 2A is not limited to the illustrated one.
- the robot 2 has an arm 2A connected by a plurality of joints and operates by driving the joints.
- the end effector 2B may include, for example, a gripper configured to hold the object 80 with an arbitrary opening width.
- the gripper has at least one holding part. Each holding part may be configured to be movable along a predetermined direction.
- the holding section may contact the holding object 80 when holding the holding object 80 .
- the retainer may have one or more joints.
- the holding part may be, for example, two or more fingers that sandwich and hold the object 80 to be held.
- the fingers may be formed by members movable relative to each other.
- the holding section may be at least one suction section that holds the holding target 80 by suction, or a scooping hand configured to scoop the holding target 80 .
- the end effector 2B is not limited to these examples, and may be configured to perform various other operations. In the configuration illustrated in FIG. 1, the end effector 2B is assumed to include a gripper.
- the robot 2 can control the position of the end effector 2B by operating the arm 2A.
- the end effector 2B may have an axis that serves as a reference for the direction in which it acts on the object to be held 80 . If the end effector 2B has an axis, the robot 2 can control the direction of the axis of the end effector 2B by operating the arm 2A.
- the robot 2 controls the start and end of the action of the end effector 2B acting on the object 80 to be held.
- the robot 2 can move or process the holding object 80 by controlling the position of the end effector 2B or the direction of its axis and controlling the operation of the end effector 2B.
- in the configuration illustrated in FIG. 1, the robot 2 causes the end effector 2B to hold the object 80 on the work start table 6 and moves the end effector 2B to the work target table 7.
- the robot 2 causes the end effector 2B to release the held object 80 on the work target table 7. By doing so, the robot 2 can move the object 80 to be held from the work start table 6 to the work target table 7.
- the robot control system 100 is assumed to have a camera 4 attached to the end effector 2B of the robot 2 .
- the camera 4 photographs the held object 80 .
- the camera 4 may photograph the held object 80 from the direction in which the end effector 2B holds the held object 80, for example.
- An image obtained by capturing the holding object 80 is also referred to as a holding target image.
- the camera 4 also includes a depth sensor and is configured to be able to acquire depth data of the held object 80 .
- Depth data is data about the distance in each direction within the angle of view range of the depth sensor. More specifically, the depth data can also be said to be information about the distance from the camera 4 to the measurement point.
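As a concrete illustration of depth data as per-direction distances, the sketch below back-projects a depth image into camera-frame 3-D points with a pinhole camera model. The function name and the intrinsics `fx`, `fy`, `cx`, `cy` are illustrative assumptions, not values from this disclosure.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image to camera-frame 3-D points.

    Pinhole-camera sketch; fx, fy (focal lengths) and cx, cy
    (principal point) are illustrative intrinsics.
    """
    v, u = np.indices(depth.shape)           # pixel row/column grids
    x = (u - cx) * depth / fx                # lateral offset per pixel
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1)  # (H, W, 3) point map
```

Each pixel of the result gives the measurement point's position relative to the camera, consistent with treating depth data as "information about the distance from the camera 4 to the measurement point".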
- An image captured by the camera 4 may include monochrome luminance information, or may include luminance information of each color represented by RGB (Red, Green, Blue) or the like.
- the number of cameras 4 is not limited to one, and may be two or more.
- the camera 4 may also photograph other objects located within a predetermined range from the object to be held 80 as obstacles, and acquire depth data of the obstacles.
- the camera 4 is not limited to being attached to the end effector 2B, and may be provided at any position where the held object 80 can be photographed.
- when the camera 4 is attached to a structure other than the end effector 2B, the holding target image may be synthesized based on the image captured by that camera.
- the holding target image may be synthesized by image conversion based on the relative position and orientation of the end effector 2B with respect to the mounting position and orientation of the camera 4.
- the holding target image may also be generated from CAD or drawing data.
- the retention parameter estimation device 10 includes a control section 12 and an interface (acquisition section) 14 .
- the control unit 12 may include at least one processor to provide control and processing power to perform various functions.
- the processor may execute programs that implement various functions of the controller 12 .
- a processor may be implemented as a single integrated circuit.
- An integrated circuit is also called an IC (Integrated Circuit).
- a processor may be implemented as a plurality of communicatively coupled integrated and discrete circuits. Processors may be implemented based on various other known technologies.
- the control unit 12 may include a storage unit.
- the storage unit may include an electromagnetic storage medium such as a magnetic disk, or may include a memory such as a semiconductor memory or a magnetic memory.
- the storage unit stores various information.
- the storage unit stores programs and the like executed by the control unit 12 .
- the storage unit may be configured as a non-transitory readable medium.
- the storage section may function as a work memory for the control section 12 . At least part of the storage unit may be configured separately from the control unit 12 .
- the control unit 12 estimates the opening width when part of the end effector 2B is displaced in order to cause the robot 2 to hold the object 80.
- the opening width is the displacement, in a predetermined direction from a reference position, of the portion of the holding section that contacts the holding object 80 when holding it.
- the reference position is the most closed position in the predetermined movable direction.
- in a configuration having at least one holding portion, the most closed position is, for example, the displaceable end on the side from which the object to be held can be sucked in the predetermined direction.
- in a configuration with a plurality of holding portions, the most closed position is, for example, the displaceable end toward the other holding portion.
- the estimation of the opening width is based on the end effector model, the holding target information, and the depth data, which are described later.
- the holding target information is information indicating the position of the holding target at the time of imaging by the camera, and is, for example, a holding target image.
- the holding target image in the following description is a concrete instance of the holding target information; the information is not limited to an image.
- the control unit 12 may further estimate the holding position at which the robot 2 holds the object 80 based on the information or data acquired by the interface 14 .
- the interface 14 acquires information or data regarding the held object 80 and the like from an external device.
- the interface 14 may include an input device that receives input of information, data, or the like from the user.
- the input device may include, for example, a touch panel or touch sensor, or a pointing device such as a mouse.
- the input device may be configured including physical keys.
- the input device may include an audio input device such as a microphone.
- the interface 14 acquires, for example, a holding target image obtained by capturing the holding target 80 from the camera 4 and depth data associated with the holding target image.
- the interface 14 obtains information about the end effector 2B from the robot 2 or an input device, for example.
- the interface 14 may also output information or data to an external device.
- the interface 14 may output the opening width of the end effector 2B estimated by the control section 12 .
- the interface 14 may output the holding position to hold the holding object 80 estimated by the control unit 12 .
- the interface 14 may output information or data so that the user recognizes it.
- the interface 14 may include an output device that outputs information, data, or the like to the user.
- the interface 14 may present an estimation result to the user and receive an instruction from the user regarding whether or not to execute the control when the robot is controlled based on the estimated opening width, for example, using the output device.
- the user's instruction may be acquired by the above input device.
- the holding parameter estimation device 10 may output the estimated opening width and the like to the robot control device 110 without requesting the user's instruction based on the estimation result of the estimated opening width.
- the output device may include, for example, a display device that outputs visual information such as images or characters or graphics.
- the display device may include, for example, an LCD (Liquid Crystal Display), an organic EL (Electro-Luminescence) display or an inorganic EL display, a PDP (Plasma Display Panel), and the like.
- the display device is not limited to these displays, and may be configured to include other various types of displays.
- the display device may include light emitting devices such as LEDs (Light Emission Diodes) and LDs (Laser Diodes).
- the display device may be configured including other various devices.
- the output device may include, for example, an audio output device such as a speaker that outputs auditory information such as sound. Output devices are not limited to these examples, and may include other various devices.
- the interface 14 may be configured including a communication device configured to be capable of wired or wireless communication.
- a communication device may be configured to be able to communicate with communication schemes based on various communication standards.
- a communication device may be configured according to known communication technologies.
- the robot control device 110 may acquire information specifying the opening width from the retention parameter estimation device 10 .
- the robot control device 110 may control the robot 2 to open the end effector 2B to the estimated opening width when a part of the end effector 2B is displaced to an approach area described later.
- the robot control device 110 may further acquire information specifying the holding position from the holding parameter estimation device 10 .
- the robot control device 110 may control the robot 2 so that the robot 2 holds the holding object 80 at the estimated holding position.
- the robot controller 110 may be configured with at least one processor to provide control and processing power to perform various functions.
- Each component of the robot control device 110 may be configured including at least one processor.
- a plurality of components among the components of the robot control device 110 may be realized by one processor.
- the entire robot controller 110 may be implemented with one processor.
- the processor may execute programs that implement various functions of the robot controller 110 .
- the processor may be configured identically or similarly to the processor used in retention parameter estimator 10 .
- the robot control device 110 may include a storage unit.
- the storage unit may be configured identically or similarly to the storage unit used in the retention parameter estimation device 10 .
- the robot control device 110 may include the holding parameter estimation device 10 .
- the robot control device 110 and the holding parameter estimation device 10 may be configured separately.
- the robot control system 100 controls the robot 2 by the robot control device 110 to cause the robot 2 to perform work.
- the work to be executed by the robot 2 includes an action for holding the object 80 to be held.
- the work to be performed by the robot 2 may include the action of holding the object 80 to be held.
- the holding parameter estimation device 10 estimates the opening width when the end effector 2B is displaced to the holding object 80.
- the robot controller 110 may control the robot 2 to open the end effector 2B to the estimated opening width.
- the holding parameter estimation device 10 may further estimate the holding position of the holding target 80 by the robot 2 .
- the robot control device 110 may control the robot 2 so that the robot 2 holds the holding object 80 at the holding position.
- in a configuration in which the end effector 2B has a gripper, the control unit 12 may estimate, as the holding position, a combination of positions at which the holding portions contact the holding object 80 when gripping it. In a configuration in which the end effector 2B has a suction portion as a holding portion, the control unit 12 may estimate, as the holding position, the position at which the suction portion contacts the holding object 80 when the end effector 2B sucks it.
- the retention parameter estimation device 10 acquires a retention target image obtained by capturing the retention target 80 from the camera 4 and the depth data of the retention target 80 via the interface 14 .
- the control unit 12 recognizes the outer shape and position of the object to be held 80 based on the holding target image and the depth data. As illustrated in FIG. 4, the control unit 12 generates a mask image 20 representing the recognition result of the held object 80 as viewed from the camera 4 attached to the end effector 2B.
- the mask image 20 includes a window 22 representing the area where the object to be held 80 exists when viewed from the camera 4, and a mask 24 representing the other area. In FIG. 4, the window 22 is represented as a white area.
- in FIG. 4, the mask 24 is represented as a region hatched with upward-slanting lines.
- although the mask 24 in the mask image 20 is shown as a hatched area for convenience of illustration, it may be shown as a black area in an actual embodiment.
- the coloring of the window 22 and the mask 24 may be reversed.
- the coloring and hatching of the window 22 and the mask 24, respectively, may be reversed.
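The mask-image construction above can be sketched as a simple thresholding of the depth data at the holding plane. The function name, the convention that depth is the camera-to-surface distance per pixel, and the known camera-to-table distance are illustrative assumptions.

```python
import numpy as np

def make_mask_image(depth, table_distance, holding_height):
    """Sketch of mask image 20: white window 22 where the held object
    rises to at least the holding height, black mask 24 elsewhere.

    Assumes depth is the camera-to-surface distance per pixel and
    table_distance is the camera-to-table distance (illustrative setup).
    """
    height_above_table = table_distance - depth      # surface height per pixel
    window = height_above_table >= holding_height    # object cross-section
    return np.where(window, 255, 0).astype(np.uint8)
```

The white/black assignment follows the convention described for FIG. 4; as noted above, the coloring may equally be reversed.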
- the control unit 12 acquires information about the end effector 2B via the interface 14.
- the information about the end effector 2B includes, for example, information specifying the maximum spacing between the holding portions of the gripper, the thickness of the holding portion, the width of the holding portion, and the like.
- the thickness of the holding portion is the length in the opening/closing direction of the holding portion.
- the width of the holding portion is the length in the direction perpendicular to the opening/closing direction of the holding portion.
- the control unit 12 may generate the end effector model 30 based on information regarding the end effector 2B.
- the end effector model 30 shows the area where the end effector 2B can exist.
- the control unit 12 may acquire the end effector model 30 as information on the end effector 2B via the interface 14 .
- as shown in FIG. 5A, the end effector model 30 in which the holding portion has two fingers may include a holding part model that specifies holding part positions 32, representing the ranges in which the holding portions of the gripper are positioned at a predetermined interval, and an out-of-operation range 38, representing the area other than the holding part positions 32.
- as shown in FIG. 6A, the end effector model 30 in which the holding portion has three fingers may likewise include such a holding part model. That is, the holding part model represents the opening width of the holding portions of the gripper.
- the holding part positions 32 may represent the range over which the holding portions are positioned at their maximum spacing.
- in this case, the holding part model represents the maximum opening width of the holding portions. The holding part models shown in FIGS. 5A and 6A are assumed to represent the maximum opening width.
- the holding part positions 32 are represented as white areas.
- the out-of-operation range 38 is represented as an area hatched with upward-slanting lines. Note that the out-of-operation range 38 is shown hatched for convenience of illustration, but may be shown as a black area in an actual embodiment.
- the coloring of the holding part positions 32 and the out-of-operation range 38 may be reversed.
- the coloring and hatching of the holding part positions 32 and the out-of-operation range 38, respectively, may be reversed.
- as shown in FIG. 5B, the end effector model 30 in which the holding portion has two fingers may include a stroke model that specifies a stroke range 34 representing the range over which the holding portions of the gripper operate.
- as shown in FIG. 6B or 6C, the end effector model 30 in which the holding portion has three fingers may likewise include a stroke model specifying a stroke range 34.
- a stroke range 34 is represented as a white area.
- the out-of-operation range 38 is represented as a hatched area with upward slanting lines.
- out-of-operation range 38 is represented as a hatched area for convenience of description in the drawing, but may be represented as a black area in an actual embodiment.
- the coloring of portions of stroke range 34 and out-of-motion range 38 may be reversed.
- the coloring and hatching of portions of the stroke range 34 and out-of-motion range 38, respectively, may be reversed.
- the end effector model 30, as shown in FIG. 5C, includes an overall model combining the holding part model of FIG. 5A and the stroke model of FIG. 5B.
- the global model specifies the operating range 36 of the retainer.
- the retainer operating range 36 includes the retainer position 32 .
- in FIG. 5C, the holding part positions within the operating range 36 are shown by dashed lines, but they need not be distinguished in an actual embodiment.
- the end effector model 30 represents a range in which the holding portions are positioned with the maximum spacing (maximum opening width), but the present invention is not limited to this.
- the end effector model 30 may represent a range in which the holding portions of the gripper are positioned at arbitrary intervals (predetermined intervals). For example, the end effector model 30 may represent the interval between the holding portions of the gripper according to the size of the object to be gripped by the end effector 2B.
- the control unit 12 may generate only the overall model as the end effector model 30.
- the holding portion position 32 may be identified by associating information specifying the holding portion position 32 with the overall model.
- the information specifying the holding portion position 32 may include numerical values representing feature points of the holding portion.
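The holding part model, stroke model, and overall model described for FIGS. 5A to 5C can be rasterized as boolean images. The pixel-based parameterization below (opening width, finger thickness, and finger width in pixels) and the function name are an illustrative sketch, not the disclosed implementation.

```python
import numpy as np

def two_finger_model(max_open_px, thickness_px, width_px):
    """Rasterize a two-finger end effector model (cf. FIGS. 5A-5C).

    Returns (holding part model, stroke model, overall model) as boolean
    images; True marks where the end effector may exist.
    """
    h, w = width_px, max_open_px + 2 * thickness_px
    part = np.zeros((h, w), dtype=bool)
    part[:, :thickness_px] = True                    # left finger at max opening
    part[:, w - thickness_px:] = True                # right finger at max opening
    stroke = np.zeros((h, w), dtype=bool)
    stroke[:, thickness_px:w - thickness_px] = True  # range swept while closing
    overall = part | stroke                          # operating range 36
    return part, stroke, overall
```

For a parallel two-finger gripper the overall model fills the whole bounding rectangle; with three fingers or non-rectangular strokes, the three arrays would differ, which is why the patent keeps them as separate models.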
- in a configuration in which the end effector 2B includes a suction portion, the end effector model 30 may be configured as a model that defines the range of interference with other objects when the suction portion sucks the holding target 80.
- the control unit 12 sets the height at which the holding object 80 is held based on the depth data of the holding object 80. Specifically, as illustrated in FIG. 2, the control unit 12 sets a height from the work start table 6 as the position for holding the object 80 placed on the work start table 6.
- a holding point 82 represents a position where the object 80 is sandwiched and held by the holding portions of the end effector 2B.
- H represents the height of the holding point 82 from the work start table 6.
- the control unit 12 sets H as the height for holding the object 80. Based on the depth data of the holding object 80, the control unit 12 sets this holding height to a value smaller than the distance from the work start table 6 to the highest point of the holding object 80.
- the control unit 12 may set the height at which the object 80 is held to a value approximately half the height of the object 80.
- the control unit 12 may generate the mask image 20 based on the height at which the object to be held 80 is held and the depth data of the object to be held 80 . Specifically, the control unit 12 may generate the mask image 20 with the cross-sectional shape of the holding target 80 as the plane of the height at which the holding target 80 is held as the window 22 .
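The height-setting rule above (below the object's highest point, roughly half its height) can be sketched directly from the depth data. Treating depth as the camera-to-surface distance and knowing the camera-to-table distance are illustrative assumptions, as are the names.

```python
import numpy as np

def holding_height(depth, table_distance, object_mask):
    """Sketch: set the holding height H to about half the object's height.

    depth: camera-to-surface distance per pixel (illustrative convention);
    table_distance: camera-to-table distance; object_mask: pixels belonging
    to the holding object. H is guaranteed below the object's highest point.
    """
    top = table_distance - depth[object_mask].min()  # object height above table
    return 0.5 * top
```

Re-thresholding the depth data at this H then yields the cross-sectional window 22 of the mask image 20, as described above.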
- the control unit 12 may create an approach map for specifying the opening width when part of the end effector 2B is displaced to hold the object 80 to be held.
- the control unit 12 may create an approach map based on the image to be held and the depth data.
- the approach map 90 may indicate at least the approach area 91.
- the approach map 90 may further show an object area 92 , a non-approach area 93 and a first area 94 .
- the approach region 91 is a region in which the end effector 2B can be opened without interfering with objects other than the held object 80 at the height at which the held object 80 is held.
- the target object area 92 is the existence area of the holding target object 80 .
- the non-approach area 93 is an area outside the second area described later, and is the entire area of the approach map 90 other than the approach area 91 , the object area 92 and the first area 94 .
- the first area 94 is, within the existence area of an object other than the holding object 80 (in other words, an obstacle), the area facing the side opposite to the holding object 80 from the outer edge that faces the holding object 80.
- the approach area 91 is represented as a white area.
- An object area 92, a non-approach area 93, and a first area 94 are represented as areas hatched with upward slanting lines.
- the object area 92, the non-approach area 93, and the first area 94 are represented as hatched areas for convenience of drawing description, but may be represented as black areas in an actual embodiment. .
- the coloring of portions of the approach region 91, object region 92, non-approach region 93, and first region 94 may be reversed.
- the coloring and hatching of the approach area 91 and portions of the object area 92, non-approach area 93, and first area 94, respectively may be reversed.
- the control unit 12 may generate a second region based on the mask image 20 in order to create the approach map 90.
- the second area is an area in which the end effector 2B can be opened around the object 80 to be held without interfering with the object 80 by focusing only on the object 80 to be held.
- the second region may be the region having the maximum opening width of the holding portion of the end effector 2B from the outer edge of the object 80 to be held.
- the control unit 12 may generate the second region by performing a convolution of the end effector model 30 on the mask image 20. More specifically, the control unit 12 may move the movement range 36 of the holding part, specified by the end effector model 30, so that at least part of the movement range 36 overlaps the window 22 included in the mask image 20.
- the rectangle representing the retainer range of motion 36 may be rotated and positioned at various angles, for example, such that at least a portion of the rectangle overlaps the upper left corner point of the window 22 .
- the control unit 12 may generate, as the second area, an area through which the movement range 36 of the holding section passes when the movement range 36 of the holding section is moved.
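The sweep described above can be sketched numerically: on a binary grid, the union of all cells covered by a kh × kw motion-range rectangle that overlaps the mask window equals a box dilation of the window. This is an illustrative sketch only; the grid, sizes, and function names are assumptions, rotation of the rectangle is ignored, and the actual embodiment operates on the mask image 20 and the end effector model 30.

```python
import numpy as np

def second_region(mask: np.ndarray, kh: int, kw: int) -> np.ndarray:
    """Union of all cells swept by a kh x kw motion-range rectangle placed
    so that at least one of its cells overlaps the True cells of `mask`.
    For axis-aligned placements this equals dilating the mask by a
    (2*kh - 1) x (2*kw - 1) box."""
    h, w = mask.shape
    out = np.zeros_like(mask, dtype=bool)
    for y, x in zip(*np.nonzero(mask)):
        out[max(0, y - kh + 1):min(h, y + kh),
            max(0, x - kw + 1):min(w, x + kw)] = True
    return out

mask = np.zeros((7, 7), dtype=bool)
mask[3, 3] = True                   # a one-cell stand-in for the window 22
region = second_region(mask, 2, 2)  # 3 x 3 block centred on the window
```

Under this equivalence, sweeping a 2 × 2 rectangle around a single-cell window covers a 3 × 3 neighborhood, which matches the "area through which the movement range passes" described above.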
- the trajectory drawn by the point farthest from window 22 as movement range 36 of the holding part moves is represented as boundary 95 of the second region.
- a boundary 95 is represented by a dashed line.
- the control unit 12 generates an object area 92 in the generated second area 96 based on the depth data of the holding object 80 .
- the control unit 12 generates a range in which the holding target 80 exists at a position higher than the height at which the holding target 80 is held as the target object region 92 .
- the control unit 12 arranges an obstacle area 97 in the generated second area 96 based on the depth data of an object other than the object to be held 80 .
- the control unit 12 generates, as an obstacle area 97 , a range in which an object other than the holding object 80 exists at a position higher than the holding object 80 .
- the control unit 12 generates a first area 94 based on the target object area 92 and the obstacle area 97 .
- the first area 94 is an area that includes the obstacle area 97 and extends outside the obstacle area 97. As described above, the first area 94 is the region that extends from the outer edge of the obstacle area 97 facing the object to be held 80 toward the side opposite the object 80.
- the control unit 12 calculates a straight line that passes through the center C92 of the object area 92 and overlaps the obstacle area 97 .
- the control unit 12 rotates the straight line about the center C92 to generate a first area 94 as a trajectory through which line segments outside the obstacle area 97 of the straight line pass.
- control unit 12 draws two straight lines SL that pass through the center C92 of the object area 92 and intersect the outer edge of the obstacle area 97 at one point. As shown in FIGS. 7 and 9 , the control unit 12 sets the area surrounded by the two straight lines SL, the outer edge of the second area 96 , and the outer edge of the obstacle area 97 on the side of the object area 92 as the first area 94 .
- the control unit 12 generates the approach area 91 by excluding the object area 92 and the first area 94 from the second area 96. As shown in FIG. 10, the control unit 12 may generate the approach area 91 by excluding from the second area 96 at least one of a region 92ex obtained by dilating the object region 92 and a region 94ex obtained by dilating the first region 94.
- the object region 92 and the first region 94 excluded from the second region 96 may each be dilated by at least half the width of the holding portion in the width direction of the holding portion, and by at least half the thickness of the holding portion in the thickness direction of the holding portion.
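A minimal sketch of this exclusion step, assuming a toy grid and a single dilation radius standing in for half the holding-portion width (all names and sizes are illustrative, not from the embodiment):

```python
import numpy as np

def dilate(mask: np.ndarray, r: int) -> np.ndarray:
    """Binary dilation by a (2r + 1) x (2r + 1) square structuring element."""
    h, w = mask.shape
    out = np.zeros_like(mask, dtype=bool)
    for y, x in zip(*np.nonzero(mask)):
        out[max(0, y - r):min(h, y + r + 1),
            max(0, x - r):min(w, x + r + 1)] = True
    return out

second = np.ones((8, 8), dtype=bool)   # toy second region 96
obj = np.zeros((8, 8), dtype=bool)
obj[4, 4] = True                       # toy object region 92
half_width = 1                         # stand-in for half the holder width
approach = second & ~dilate(obj, half_width)   # toy approach area 91
```

The dilated object cell removes a 3 × 3 block from the second region, leaving the remaining cells as the approach area.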
- the object region 92 or the dilated region 92ex may also be referred to as “object regions 92, 92ex”.
- the first region 94 or the expanded region 94ex may be referred to as “first regions 94, 94ex”.
- the control unit 12 estimates the opening width at any point within the window 22 of the mask image 20 .
- the arbitrary point is, for example, a reference position of the end effector 2B when holding the object 80 to be held by the end effector 2B. Any point selected within the window 22 corresponds to the approach position 70 contained in the object regions 92, 92ex on the approach map 90, as shown in FIG. That is, the control unit 12 sets an arbitrary point within the window 22 of the mask image 20 as the approach position 70 .
- the control unit 12 aligns the center of the end effector model 30 with the approach position 70 and projects it. Projected end effector model 30 is represented in FIG. 11 as projected models 72a and 72b.
- the projection model 72a corresponds to a model obtained by rotating the end effector model 30 along the short side direction of the object regions 92 and 92ex.
- the projection model 72b corresponds to a model obtained by rotating the end effector model 30 clockwise by 45 degrees from the direction of the short sides of the object regions 92 and 92ex.
- the control unit 12 determines that the end effector model 30 cannot be projected when the holding portion position 32 included in the end effector model 30 overlaps the object regions 92 and 92ex. As shown in FIG. 12, in the projection model 72a, the projection position 74a of the holding part included in the projection model 72a does not overlap the object regions 92, 92ex. Therefore, the control unit 12 can project the projection model 72a onto the approach map 90. On the other hand, in the projection model 72c, the projection position 74c of the holding portion included in the projection model 72c overlaps the object regions 92 and 92ex. Therefore, the control unit 12 cannot project the projection model 72c onto the approach map 90.
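The overlap test for one candidate pose can be sketched as follows, with hypothetical cell coordinates standing in for the holding-portion positions (the actual check runs against the approach map 90):

```python
import numpy as np

def can_project(holding_cells, object_region: np.ndarray) -> bool:
    """A model pose is projectable only if none of its holding-portion
    cells fall inside the (possibly dilated) object region."""
    h, w = object_region.shape
    for y, x in holding_cells:
        if 0 <= y < h and 0 <= x < w and object_region[y, x]:
            return False
    return True

obj = np.zeros((5, 5), dtype=bool)
obj[2, 2] = True                          # toy object region 92ex
ok = can_project([(0, 2), (4, 2)], obj)   # fingers clear of the object
bad = can_project([(2, 2), (4, 2)], obj)  # one finger inside the object
```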
- in this way, the approach position 70 at which the end effector 2B would collide with the held object 80 is determined, and thereby the position to be excluded from the arbitrary points selected for estimation is determined.
- the control unit 12 determines that the end effector model 30 cannot be projected when it can be determined that the distance between the target object region 92 and the first region 94 is shorter than the thickness of the holding portion. In the approach map 90 in which either the object region 92 or the first region 94 is dilated, it can be determined that the distance is shorter than the thickness of the holding portion when the remaining distance is less than 1/2 of the thickness of the holding portion. Alternatively, in the approach map 90 in which both the object region 92 and the first region 94 are dilated, it can be determined that the distance is shorter than the thickness of the holding portion when the dilated regions 92ex and 94ex overlap even slightly.
- in this way, the approach position 70 at which the end effector 2B would collide with an obstacle when a part of the end effector 2B is displaced toward the holding object 80 to hold the holding object 80 is determined.
- the position to be excluded as an arbitrary point selected for estimating the opening width is determined.
- in a configuration in which the holding portion position 32 is not specified in the end effector model 30, the control unit 12 determines whether the end effector model 30 can be projected based on the characteristic points of the holding portion associated with the end effector model 30.
- the position of the end effector 2B corresponding to the position and rotation angle of the end effector model 30 that can be projected onto the approach map 90 is a position suitable for the approach map 90. It can also be said that the control unit 12 estimates the opening width for each position that conforms to the approach map 90.
- the control unit 12 may estimate the opening width based on the approach area 91 on the approach map 90 and the area in which the end effector 2B may exist in the end effector model 30. As shown in FIG. 13, the control unit 12 estimates a possible existence area 99 for each holding unit in order to estimate the opening width.
- the possible existence area 99 is an area where the approach area 91 and the motion range 36 of the holding part overlap when the end effector model 30 is superimposed on the approach map 90 .
- the control section 12 may estimate an arbitrary point within the holding section possible area 99 as the position of the holding section corresponding to the opening width.
- the control unit 12 estimates the position of each holding part, in other words, the position of the holding part in each possible area 99 corresponding to each holding part.
- the control section 12 may estimate the position of the holding section corresponding to the opening width by the following method. As shown in FIG. 14, the control unit 12 calculates a first center position CP1 that is the center position of the possible existence area 99 corresponding to each holding unit. The control unit 12 determines whether or not the holding unit interferes with the object to be held 80 and objects other than the object to be held 80 when the holding unit is displaced along each predetermined direction starting from the first center position CP1. The control unit 12 displaces the holding unit from the first center position CP1 until it reaches the end of the corresponding possible area 99.
- the control unit 12 estimates, as the position of the holding portion corresponding to the opening width in the possible existence area 99, an arbitrary position that does not interfere with the holding object 80 or objects other than the holding object 80 within the area through which the holding unit passes until it reaches the end of the possible existence area 99. In other words, the control unit 12 estimates, as the position of the holding part corresponding to the opening width in the possible area 99, an arbitrary position in the area where the approach area 91 overlaps with the area through which the holding unit passes before reaching the end of the possible area 99. More specifically, the control unit 12 displaces the first center position CP1 along a predetermined direction, and estimates as the position of the holding unit an arbitrary point, displaced from the first center position CP1, that can be located within the approach area 91.
- the position of each holding portion corresponding to the opening width is estimated to be the same for all holding portions.
- the first center position CP1 of the possible area 99 corresponding to one holding portion may deviate from the center of that holding portion in the width direction. In other words, the first center position CP1 may deviate from the movable straight line AL that is parallel to the opening/closing direction of the holding section and passes through the center position (approach position 70) of the area where the end effector 2B may exist.
- the control unit 12 may set the intersection of the movable straight line AL and the perpendicular drawn from the first center position CP1 to the movable straight line AL as the first center position CP1, and then estimate the position of the holding portion corresponding to the opening width as described above.
- the control section 12 may first displace the holding section from the first center position CP1 toward the area where the end effector 2B can exist in the end effector model 30, in other words, toward the center position of the operating range 36 of the holding section.
- the control section 12 may search for the approach area 91 outside the first center position CP1 in the possible area 99 .
- the control unit 12 may estimate, as the position of the holding part, the position at which the approach area 91 is first reached while displacing a point along a predetermined direction starting from the first center position CP1.
- alternatively, the first center position CP1 may be estimated as the position of the holding portion.
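The outward search from the first center position can be sketched as a walk along a predetermined direction that stops at the first cell inside the approach area; the grid, direction, and step limit below are illustrative assumptions:

```python
import numpy as np

def find_holder_position(start, direction, approach: np.ndarray, limit: int):
    """Walk from the first center position CP1 along `direction` and return
    the first cell lying inside the approach area, or None if none is found."""
    (y, x), (dy, dx) = start, direction
    for _ in range(limit):
        if 0 <= y < approach.shape[0] and 0 <= x < approach.shape[1] \
                and approach[y, x]:
            return (y, x)
        y, x = y + dy, x + dx
    return None

approach = np.zeros((6, 6), dtype=bool)
approach[0:2, :] = True                      # free space above a toy object
pos = find_holder_position((4, 3), (-1, 0), approach, limit=6)
```

Starting from the toy CP1 at (4, 3) and walking upward, the first approach-area cell reached is returned as the holder position.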
- the control unit 12 moves the approach position 70 so as to scan within the range of the window 22 of the mask image 20 , rotates the end effector model 30 at each position, and projects it onto the approach map 90 .
- the control unit 12 extracts a combination of the approach position 70, the rotation angle of the end effector model 30, and the opening width of the end effector 2B that can project the end effector model 30 onto the approach map 90.
- the control unit 12 may create an opening width model based on the estimated opening width.
- the opening width model identifies a region where the end effector 2B can exist within the range of the opening width estimated as the position of the holding portion.
- the opening width model 31 is generated by drawing, on the end effector model 30, a region having the width of the holding portion and extending from the center of the operating range 36 of the holding portion to the position WP of the holding portion corresponding to the opening width.
- the control unit 12 may estimate the holding position where the end effector 2B contacts the holding object 80 based on the opening width model 31 and a rule map to be described later.
- the control unit 12 may generate a rule map for estimating the holding position.
- a rule map may be generated for each height at which the object to be held 80 is held.
- the rule map may specify the rule by the display form of the image expressed in two dimensions.
- the control unit 12 may generate a rule map based on at least one of the image to be held and the depth data.
- the control unit 12 may acquire the rule map from an external device via the interface 14 .
- the rule map may include a map that defines the position at which the end effector 2B should hold the holding object 80, in other words, the holding position.
- the rule map may include a map generated based on the height at which the holding object 80 is held.
- the rule map may be classified into, for example, a surrounding environment map 40 (see FIG. 18), an object map 50 (see FIGS. 19A and 19B), or a contact map 60 (see FIG. 20).
- the control unit 12 may acquire a rule map based on at least one of the shape data of the hold target object 80 and the depth data associated with the hold target image.
- the surrounding environment map 40 may specify an approach area 41, an object area 42, a non-approach area 43, and a first area 44, as shown in FIG.
- the surrounding environment map 40 may be the same as the approach map 90 without dilation processing. Therefore, the approach area 41, the object area 42, the non-approach area 43, and the first area 44 in the surrounding environment map 40 may be the same as the approach area 91, the object area 92, the non-approach area 93, and the first area 94 of the approach map 90, respectively.
- the surrounding environment map 40 may be generated by the same method as the approach map 90 .
- the control unit 12 may perform a blurring process to obscure the boundaries of the generated surrounding environment map 40 .
- Each area included in the surrounding environment map 40 generated by the procedure described above can be distinguished as a numerical value at each coordinate in the map.
- the control unit 12 may set 1 as the numerical value for the coordinates included in the approach area 41, which indicates that the point specified by the coordinates is included in the motion range of the end effector 2B.
- the control unit 12 determines that the points specified by the coordinates are not included in the movement range of the end effector 2B as the numerical values of the coordinates included in the target object area 42, the non-approach area 43, and the first area 44. 0 (zero) may be set as a numerical value representing .
- the control unit 12 sets the numerical value of the coordinates specifying a point within a predetermined range from the boundary between the area set to 1 and the area set to 0 to a value greater than 0 and less than 1, such as 0.5. The control unit 12 executes the process of making the boundary of the area ambiguous in this way as the blurring process.
- the surrounding environment map 40 generated by the procedure described above can be distinguished by the color at each coordinate in the map.
- the control unit 12 may represent points included in the approach area 41 in white and points included in other areas in black.
- the control unit 12 may represent the color of points within a predetermined range from the boundary between the area represented by white and the area represented by black in grayscale.
- the control unit 12 may execute the process of making the boundary of the area ambiguous in this way as the blurring process. Representing the color of each area in black, white, and gray corresponds to representing the numerical value set in each area as a luminance value.
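The blurring step described above (1 inside the approach area, 0 outside, and an intermediate value such as 0.5 near the boundary) can be sketched as follows; the grid, band width, and function name are illustrative assumptions:

```python
import numpy as np

def blur_boundaries(binary_map: np.ndarray, band: int = 1, soft: float = 0.5):
    """Set every cell within `band` cells of a 0/1 boundary to `soft`,
    making region edges ambiguous (a simple stand-in for the blurring)."""
    h, w = binary_map.shape
    out = binary_map.astype(float)
    for y in range(h):
        for x in range(w):
            patch = binary_map[max(0, y - band):min(h, y + band + 1),
                               max(0, x - band):min(w, x + band + 1)]
            if patch.min() != patch.max():   # both 0 and 1 occur nearby
                out[y, x] = soft
    return out

m = np.zeros((5, 5), dtype=int)
m[:, :2] = 1                # left columns form a toy approach area 41
soft_map = blur_boundaries(m)
```

Cells adjacent to the 1/0 boundary receive the intermediate value, which corresponds to the grayscale band between the white and black areas described above.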
- the control unit 12 can estimate the position of the end effector 2B with respect to the object to be held 80 by taking various margins into consideration by the blurring process.
- a blurring process may be performed on the peripheral portion of each region included in the generated surrounding environment map 40 . Each region is enlarged by blurring.
- the object map 50 represents information to be referred to in order to determine at which position of the holding object 80 the holding object 80 should be held when the worker holds the holding object 80 .
- the object map 50 represents information such as the shape, material, or density distribution of the holding object 80, for example.
- the object map 50 regarding the holding object 80 assumed in this way represents the cross section obtained by cutting the holding object 80 by a plane at the height at which the holding object 80 is held. As shown in FIG. 19A, the object map 50 may be represented in grayscale in which positions closer to the center, corresponding to more correct holding positions under the rules, are represented by colors closer to white. As shown in FIG. 19B, the object map 50 may be expressed in grayscale that specifies a rule arbitrarily set by the user as a rule for estimating the holding position of the object to be held 80.
- the object map 50 in FIG. 19B is represented in the same color along the height direction of the cross section 52 .
- the solid black line surrounding the cross section 52 of the object map 50 is simply a line representing the contour of the cross section 52 and does not indicate a rule. It does not mean that the coordinates drawn with a solid black line are improper holding positions.
- the object map 50 assumes a position to hold the holding object 80, and when evaluating the adequacy of holding the holding object 80 at the assumed position, by holding the vicinity of the area represented by a color close to white. It may be configured such that the aptitude value representing the evaluation of suitability increases. Also, the object map 50 may be represented by associating a numerical value with the color of each coordinate, like the surrounding environment map 40 . For example, the control unit 12 may set 1 to coordinates represented in white and 0 to coordinates represented in black.
- the control unit 12 is not limited to the above example, and may generate an object map 50 in which colors or numerical values are set for each coordinate so as to specify various rules.
- the control unit 12 may generate the object map 50 in which colors or numerical values are set for each coordinate according to the distance from the center of gravity of the object 80 to be held.
- the control unit 12 may generate the object map 50 in which a color or a numerical value is set for each coordinate so as to specify positions that should be avoided as holding positions or positions that should be prohibited as holding positions.
- the control unit 12 may generate one object map 50 specifying a plurality of rules by mixing object maps 50 specifying one rule.
- the control unit 12 may set a weighting factor for each object map 50 and mix them. For example, when the center-of-gravity position of the held object 80 is important, the control unit 12 may set a large weighting factor for the object map 50 that specifies the center-of-gravity position.
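The weighted mixing of single-rule maps into one multi-rule map can be sketched as a normalised weighted sum; the maps and weights below are invented for illustration:

```python
import numpy as np

def mix_maps(maps, weights):
    """Blend single-rule maps into one map using per-map weighting factors,
    normalised so the result stays within the 0..1 value range."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return sum(wi * m for wi, m in zip(w, maps))

centre_rule = np.array([[0.0, 0.5, 0.0],
                        [0.5, 1.0, 0.5],
                        [0.0, 0.5, 0.0]])      # prefer the centre
avoid_left = np.array([[0.0, 1.0, 1.0],
                       [0.0, 1.0, 1.0],
                       [0.0, 1.0, 1.0]])       # forbid the left column
mixed = mix_maps([centre_rule, avoid_left], [3.0, 1.0])  # centre dominates
```

A larger weighting factor makes the corresponding rule dominate the mixed map, as in the centre-of-gravity example above.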
- the object map 50 may be defined based on the properties of the holding object 80 itself.
- the object map 50 may be defined based on the shape, material, texture, weight, or coefficient of friction of the object 80 to be held.
- the object map 50 may be based on rules arbitrarily defined by the user regarding the holding position of the holding object 80. For example, parts of the object to be held 80 that should not serve as the holding position for various reasons, such as parts that are likely to be damaged or deformed by contact, parts to which grease or the like adheres, and parts that are slippery and unsuitable for holding, can be defined in the object map 50 as rules.
- conversely, parts that are unlikely to break or deform, parts not coated with grease or the like, parts that are not slippery, and other parts that should serve as the holding position based on empirical rules can also be defined in the object map 50 as rules.
- the object map 50 may be generated for each type of holding object 80 .
- the contact map 60 represents rules determined based on the relationship between the finger of the end effector 2B and the state of the surface of the held object 80. Like the object map 50, the contact map 60 represents information to be referred to for determining at which position of the holding object 80 the holding object 80 should be held when the operator holds the holding object 80.
- the contact map 60 is defined based on the shape of the contact portion of the end effector 2 ⁇ /b>B with the object to be held 80 and the shape of the object to be held 80 .
- the contact map 60 represents the adequacy of the position where the end effector 2B contacts the object 80 to be held.
- the contact map 60 may define, as rules, portions that should serve as the holding position and portions that should not. For example, a portion that is difficult for the end effector 2B to hold can be defined in the contact map 60 as a rule representing a portion that should not be held.
- conversely, a portion where the contact area between the end effector 2B and the object to be held 80 is large, a portion where the coefficient of friction between the end effector 2B and the object to be held 80 is larger than a predetermined value, or another portion that, based on empirical rules, can easily be held by the end effector 2B can be defined in the contact map 60 as a rule representing a portion that should be held.
- the contact map 60 is, for example, a contact area between the surface of the object to be held 80 and the finger of the end effector 2B when the object to be held 80 is held by the end effector 2B, or a contact area between the surface of the object to be held 80 and the end effector 2B. Represents the frictional force acting between the finger and the like. If the surface of the object to be held 80 has irregularities, even a slight shift in the position of the finger of the end effector 2B can greatly change the contact area.
- in the contact map 60, positions closer to the center of each side of the outer circumference 62 of the cross section, obtained by cutting the object to be held 80 along the plane at the height at which it is held, are represented by colors closer to white.
- the contact area between the surface of the holding object 80 and the finger of the end effector 2B increases when holding positions near the center of each side, and holding positions near the corners. This means that the contact area becomes smaller when
- the black solid line surrounding the perimeter 62 of the contact map 60 is simply a line representing the contour of the perimeter 62. It does not mean that the coordinates drawn with a solid black line are improper holding positions.
- the control unit 12 may generate one contact map 60 specifying a plurality of rules by mixing contact maps 60 specifying one rule.
- the control unit 12 may set a weighting factor for each contact map 60 and mix them. For example, when the contact area between the surface of the object to be held 80 and the finger of the end effector 2B is important, the control section 12 may set a large weighting factor for the contact map 60 that specifies the contact area.
- the control unit 12 estimates the holding position of the holding object 80 based on the generated rule map. Specifically, the control unit 12 projects the opening width model 31 onto the rule map as a temporary holding position and calculates the degree of matching at the projected position, thereby evaluating the appropriateness of actually holding the object 80 at the temporary holding position.
- the control unit 12 calculates the degree of matching in the surrounding environment map 40 for each combination of the approach position 70 and the rotation angle of the end effector 2B that was extracted together with the opening width of the end effector 2B. Specifically, the control unit 12 projects the opening width model 31 onto the surrounding environment map 40 at the extracted approach position 70 and rotation angle. The control unit 12 calculates, as the degree of matching, the average of the numerical values or color luminance values set for each coordinate in the range in which the approach area 41 and the opening width model 31 overlap in the surrounding environment map 40.
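The matching-degree computation for one projected pose can be sketched as the mean map value over the overlap; the arrays below are toy stand-ins for the surrounding environment map 40 and the projected opening width model 31:

```python
import numpy as np

def matching_degree(rule_map: np.ndarray, model_mask: np.ndarray) -> float:
    """Average rule-map value over the cells covered by the projected model;
    higher values indicate a better fit at this pose."""
    vals = rule_map[model_mask]
    return float(vals.mean()) if vals.size else 0.0

rule = np.array([[1.0, 1.0, 0.0],
                 [1.0, 0.5, 0.0],
                 [0.0, 0.0, 0.0]])
model = np.zeros((3, 3), dtype=bool)
model[:2, :2] = True                 # model projected onto the top-left
score = matching_degree(rule, model)
```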
- the position of the end effector 2B corresponding to the position of the opening width model 31 and the rotation angle that can be projected onto the surrounding environment map 40 is a position suitable for the surrounding environment map 40. It can also be said that the control unit 12 estimates the holding position from among the positions that match the surrounding environment map 40 .
- the control unit 12 calculates the matching degree in the object map 50 for each combination of the extracted approach position 70 and the rotation angle of the end effector 2B together with the opening width of the end effector 2B. Specifically, the control unit 12 projects the opening width model 31 onto the object map 50 at the extracted approach position 70 and rotation angle. The control unit 12 calculates the numerical value or the average value of the brightness values of the colors set for each coordinate in the range where the cross section 52 of the held object 80 in the object map 50 and the opening width model 31 overlap as the matching degree.
- the object map 50 represents the adequacy of the holding object 80 as a position. It can also be said that the position of the end effector 2B corresponding to the position of the opening width model 31 projected onto the object map 50 and the rotation angle is a position suitable for the object map 50 . It can be said that the control unit 12 estimates the holding position from among the positions that match the object map 50 .
- the control unit 12 calculates the matching degree in the contact map 60 for each combination of the approach position 70 and the rotation angle of the end effector 2B extracted together with the opening width of the end effector 2B. Specifically, the control unit 12 projects the opening width model 31 onto the contact map 60 at the extracted approach position 70 and rotation angle. The control unit 12 calculates the numerical value or the average value of the brightness values of the colors set for each coordinate in the range overlapping the outer circumference 62 and the opening width model 31 in the contact map 60 as the matching degree.
- the contact map 60 represents the adequacy of the contact portion of the end effector 2B with the holding object 80 as the position of contact with the holding object 80 . It can also be said that the control unit 12 estimates the holding position from among a plurality of positions of the end effector 2B corresponding to the positions and rotation angles of the plurality of opening width models 31 projected onto the contact map 60 .
- the control unit 12 may calculate the angle at which each holding portion position 32 of the opening width model 31 is incident on the outer circumference 62 along the direction of the stroke range 34 . In other words, the control unit 12 may calculate the angle of intersection at each of the two points of intersection between the line along the direction of the stroke range 34 of the opening width model 31 and the outer circumference 62 . In the present embodiment, the controller 12 calculates the incident angle when the holder position 32 is vertically incident on the outer circumference 62 as 0 degrees. The control unit 12 may reflect the calculated angle in the value of the degree of matching in the contact map 60 . The control unit 12 may calculate the degree of matching as a larger value as the angle is closer to 0 degrees.
- the control unit 12 may calculate, as the matching degree, the product of the average of the numerical values or color luminance values set at each coordinate in the range of the outer circumference 62 of the contact map 60 that overlaps the projected opening width model 31 and the cosine of the calculated angle (cosine value).
- when the outer circumference 62 has irregularities, the control unit 12 may generate a flattened model of the outer circumference 62 based on the thickness, width, or length of the holding portion of the end effector 2B, and calculate the angle at which the holding portion position 32 is incident on the flattened model.
- for each combination of the approach position 70 and the rotation angle of the opening width model 31, the control unit 12 adds up the degrees of matching calculated in each rule map to calculate the overall degree of matching.
- the control unit 12 may weight the degrees of matching calculated in each rule map and add them.
- the control unit 12 assigns the same weight to the degree of matching calculated in each rule map for all combinations.
- a weighting coefficient applied to the degree of matching calculated in each rule map is also referred to as a map coefficient. Map coefficients may be defined for each map.
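The overall matching degree, a weighted sum of per-map scores with the same map coefficients applied to every candidate pose, can be sketched as follows; the scores and coefficients are invented for illustration:

```python
def overall_matching(scores, map_coeffs):
    """Weighted sum of the matching degrees calculated in each rule map;
    the same map coefficients are applied to every candidate pose."""
    return sum(c * s for c, s in zip(map_coeffs, scores))

# hypothetical per-map scores: surrounding environment, object, contact
candidates = {
    "pose_a": [0.9, 0.6, 0.7],
    "pose_b": [0.8, 0.9, 0.9],
}
coeffs = [1.0, 2.0, 1.0]   # object map weighted most heavily
best = max(candidates, key=lambda p: overall_matching(candidates[p], coeffs))
```

Selecting the pose with the highest overall matching degree corresponds to the comparison and selection described in the following paragraphs.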
- control unit 12 calculates the overall matching degree for each holding position based on the opening width model 31 and the rule map.
- the overall matching degree corresponds to a proper value representing the properness of each holding position.
- the rule map indicates the appropriateness of the position for holding the holding object 80 by a numerical value or a brightness value of color assigned to each position (each coordinate) of the rule map.
- the control unit 12 can calculate the appropriate value by calculating the value assigned to each position when the end effector model 30 is superimposed on the rule map.
- the control unit 12 compares the total matching degree calculated for each combination of the approach position 70 and the rotation angle of the end effector model 30, and selects a combination with a high total matching degree.
- the control unit 12 determines the position at which the finger of the end effector 2B is incident on the object to be held 80 when it moves along the stroke direction, which is determined based on the selected combination of the approach position 70 and the rotation angle of the end effector model 30. It is estimated as the position at which the holding object 80 is held. That is, the control unit 12 estimates the holding position based on the end effector model 30 and the rule map. It can also be said that the control unit 12 estimates the holding position based on the proper value.
- the control unit 12 outputs the estimated holding position to the robot control device 110 via the interface 14 .
- the control unit 12 of the retention parameter estimation device 10 may execute the retention parameter estimation method including the procedure of the flowchart illustrated in FIG. 21 .
- the retention parameter estimation method may be realized as a retention position estimation program that is executed by a processor that constitutes the control unit 12 of the retention parameter estimation device 10 .
- the retention parameter estimation program may be stored on a non-transitory computer readable medium.
- the control unit 12 acquires data including information about the end effector 2B, a holding target image obtained by photographing the holding target 80, and depth data of the holding target 80 (step S1).
- the control unit 12 generates the end effector model 30 (step S2).
- the control unit 12 estimates the height at which the object to be held 80 is held (step S3).
- the control unit 12 generates the mask image 20 (step S4).
- the control unit 12 creates an approach map 90 and rule maps such as the surrounding environment map 40, the object map 50, and the contact map 60 (step S5).
- the control unit 12 projects the end effector model 30 onto the approach map 90 (step S6).
- the control unit 12 estimates the opening width of the end effector 2B by projecting the end effector model 30 onto the approach map 90 (step S7).
- the control unit 12 creates the opening width model 31 based on the estimated opening width (step S8).
- the control unit 12 projects the opening width model 31 onto each rule map (step S9).
- the control unit 12 calculates the degree of matching between each rule map and the opening width model 31 projected onto each rule map (step S10).
- the control unit 12 weights the degree of matching calculated for each rule map in order to calculate the overall degree of matching (step S11).
- the control unit 12 selects the projection position of the opening width model 31 with the highest overall matching degree, and estimates the position at which the holding portion of the end effector 2B comes into contact with the holding object 80 at the selected position as the holding position (step S12). After executing the procedure of step S12, the control unit 12 ends the execution of the procedure of the flowchart of FIG. 21.
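The scoring in steps S10 through S12 amounts to a weighted sum of per-rule-map matching degrees followed by a selection over candidates. The Python sketch below is illustrative only; the rule-map names, weight values, and matching degrees are hypothetical, not taken from the embodiment:

```python
def total_matching_degree(scores, weights):
    """Weight the per-rule-map matching degrees (step S11) and sum them."""
    return sum(weights[name] * score for name, score in scores.items())

# Hypothetical matching degrees for two candidate combinations of approach
# position and rotation angle, one entry per rule map.
candidates = {
    ("pos_a", 0):  {"environment": 0.9, "object": 0.4, "contact": 0.7},
    ("pos_b", 90): {"environment": 0.6, "object": 0.9, "contact": 0.8},
}
weights = {"environment": 0.5, "object": 0.3, "contact": 0.2}

# Step S12: select the candidate with the highest overall matching degree.
best = max(candidates, key=lambda c: total_matching_degree(candidates[c], weights))
```

The selected candidate then determines the projection position of the opening width model and, from it, the holding position.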
- the control unit 12 may execute the procedure of step S4 for generating the mask image 20 before step S2 or S3.
- according to the holding parameter estimation device 10 of this embodiment, the opening width when part of the end effector 2B is displaced to hold the holding object 80 is estimated based on the end effector model 30, the holding target image, and the depth data. If the opening width of the end effector 2B is close to the width of the holding object 80 when part of the end effector 2B is displaced to the holding object 80 for holding, the end effector 2B may collide with the holding object during the displacement. On the other hand, if the opening width is too wide, it takes a long time to hold the holding object 80 after part of the end effector 2B has been displaced to the holding object 80.
- the holding parameter estimation device 10 estimates the opening width at the time of displacement, so it can estimate an appropriate state when the end effector 2B is brought closer to the holding object 80. Therefore, the holding parameter estimation device 10 can provide parameters for operating the robot 2 so as to obtain an appropriate opening width.
- the approach area 91 in the approach map 90 is a region that excludes the area where the holding object 80 exists and the first area 94, each dilated by at least half the width of the holding portion of the end effector 2B in the width direction of the holding portion and by at least half the thickness of the holding portion in its thickness direction. With such a configuration, the holding parameter estimation device 10 can estimate the opening width in a region where interference with the holding object 80 and objects other than the holding object 80 is unlikely.
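As a rough illustration of how such an approach area could be computed on a 2-D occupancy grid, the sketch below dilates the blocked areas by half the holding portion's width and thickness and subtracts them from the openable area. The grid representation and helper names are assumptions for illustration, not the patent's implementation:

```python
import numpy as np

def dilate(mask, half_h, half_w):
    """Binary dilation of a 2-D boolean mask by a (2*half_h+1) x (2*half_w+1) rectangle."""
    padded = np.pad(mask, ((half_h, half_h), (half_w, half_w)))
    out = np.zeros_like(mask)
    for dy in range(2 * half_h + 1):
        for dx in range(2 * half_w + 1):
            out |= padded[dy:dy + mask.shape[0], dx:dx + mask.shape[1]]
    return out

def approach_area(openable, object_area, first_area, half_width, half_thickness):
    """Exclude the object area and the first area, each dilated by half the
    holding portion's width (columns) and half its thickness (rows), from the
    area where the end effector can open without interfering."""
    blocked = dilate(object_area | first_area, half_thickness, half_width)
    return openable & ~blocked
```

Any cell left True is at least half a gripper-finger away from the object and the first area, so a fingertip centered there cannot overlap them.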
- the control unit 12 calculates the first center position CP1, which is the center position of the possible existence area 99 for each holding portion of the end effector 2B, and, when each holding portion is displaced from the first center position CP1 toward the holding object 80 along a predetermined direction, estimates, among the areas through which the holding portion passes before reaching the end of the possible existence area 99, an arbitrary position that does not interfere with the holding object 80 or objects other than the holding object 80 as the position of the holding portion corresponding to the opening width in the possible existence area 99.
- with such a configuration, the holding parameter estimation device 10 can check for interference at only a single point, without checking for interference over the entire area of the holding portion at each candidate position. Therefore, the holding parameter estimation device 10 can estimate the opening width at high speed.
- when the first center position CP1 deviates from the movable straight line AL, which passes through the center position of the region in which the end effector can exist in the end effector model 30 and is parallel to the opening/closing direction of the holding portion, the control unit 12 regards the intersection of the perpendicular dropped from the first center position CP1 to the movable straight line AL with the line AL as the first center position CP1, and estimates the position of the holding portion corresponding to the opening width.
- with such a configuration, even when the first center position CP1 lies off the movable straight line AL because of the way part of the holding object 80 or the like is positioned, the holding parameter estimation device 10 can estimate the position of the holding portion corresponding to the opening width on the movable straight line AL.
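The perpendicular-foot construction described above is ordinary vector projection of CP1 onto the line AL. A minimal sketch (the function name and the point-plus-direction line parameterization are illustrative assumptions):

```python
import numpy as np

def project_onto_movable_line(cp1, line_point, line_dir):
    """Foot of the perpendicular dropped from CP1 onto the movable straight
    line AL, given as a point on the line and the line's direction vector."""
    d = np.asarray(line_dir, dtype=float)
    d = d / np.linalg.norm(d)  # unit vector along AL (opening/closing direction)
    v = np.asarray(cp1, dtype=float) - np.asarray(line_point, dtype=float)
    return np.asarray(line_point, dtype=float) + np.dot(v, d) * d
```

The returned point is then treated as the first center position CP1 when estimating the holding portion position.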
- the holding position when holding the holding object 80 by the end effector 2B of the robot 2 is estimated based on the rule map.
- the operator's experience and the like can be reflected in the rule map. For example, when a worker holds various holding objects 80, the worker considers at which position to hold each holding object 80.
- the worker determines the holding position by considering, for example, the center of gravity of the holding object 80, obstacles existing around the holding object 80, or a wide portion at which the holding object 80 can be gripped.
- when the worker's reasoning is reflected in the rule map, the matching degree in each rule map is calculated, the matching degrees are weighted to calculate the overall matching degree, and the holding position is estimated accordingly.
- as a result, the robot 2 can hold the holding object 80 at a holding position the worker would choose. In other words, the holding position of the object can easily be estimated as a position that matches human intention.
- workers can use a rule map that encodes rules for holding positions. As a result, no machine learning is required. Also, by adding a rule map when a new rule arises, it becomes easier to cope with changes in the environment.
- work that includes holding the holding object 80 may involve various factors depending on the worker or the environment of the work site. The holding position of the holding object 80 therefore needs to be estimated with each of those factors taken into account.
- by generating rule maps that reflect such factors and adding them to the targets of the matching degree calculation, it becomes easier to deal with special rules and the like for each worker or each site.
- the holding parameter estimation device 10 may acquire the holding position of the holding object 80 estimated based on another method.
- the holding parameter estimation device 10 may estimate the opening width for each acquired holding position by executing the holding parameter estimation method according to the present embodiment for the acquired holding position.
- the control unit 12 of the holding parameter estimation device 10 acquires the holding position of the holding object 80 estimated based on another method via the interface 14 .
- the control unit 12 calculates a combination of the approach position 70 and the rotation angle of the end effector model 30 corresponding to the acquired holding position.
- the control unit 12 projects the end effector model 30 onto the approach map 90 in a combination corresponding to the acquired holding positions, and estimates the opening width.
- the control unit 12 may create the opening width model 31 based on the estimated opening width.
- the control unit 12 may project the opening width model 31 onto each rule map and calculate the matching degree in each rule map.
- the control unit 12 may weight the degree of coincidence in each rule map, sum them up, and calculate the overall degree of coincidence.
- the control unit 12 may select the opening width and the holding position corresponding to the combination with the highest overall matching degree and output them to the robot control device 110.
- the holding parameter estimating device 10 may evaluate the appropriateness of the acquired holding position by calculating the overall degree of matching at the acquired holding position.
- the control unit 12 evaluates whether the acquired holding position is appropriate based on the calculated overall matching degree. For example, the control unit 12 may determine that the acquired holding position is appropriate when the calculated overall matching degree is equal to or greater than a predetermined value.
- the holding parameter estimation device 10 may acquire the center position and rotation angle of the end effector 2B estimated based on other methods.
- the holding parameter estimating device 10 can calculate a combination of the approach position 70 and the rotation angle by regarding the obtained center position of the end effector 2B as the approach position 70, and execute the holding parameter estimating method according to the present embodiment.
- the holding parameter estimation device 10 can reduce the number of combinations of the approach position 70 and the rotation angle for which the matching degree is to be calculated by acquiring the holding position of the holding object 80 estimated based on another method. As a result, the computational load can be reduced.
- the holding parameter estimation device 10 calculates an overall matching degree by weighting the matching degrees calculated for each rule map and summing them.
- the holding parameter estimating device 10 may update the weighting coefficients by learning based on the annotation information for the estimated holding position.
- the holding parameter estimation device 10 can improve the holding position estimation accuracy by updating the weighting coefficients.
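One simple form such a weighting update could take, assuming annotations in which the user prefers one holding position candidate over another, is a perceptron-style adjustment of the per-rule-map coefficients. This is an illustrative sketch, not the learning rule specified by the embodiment:

```python
def update_weights(weights, scores_chosen, scores_rejected, lr=0.1):
    """Nudge each rule-map weighting coefficient so that the user-approved
    holding position outscores the rejected candidate (perceptron-style).

    scores_chosen / scores_rejected hold the per-rule-map matching degrees
    of the approved and the rejected candidate, respectively.
    """
    return {
        name: w + lr * (scores_chosen[name] - scores_rejected[name])
        for name, w in weights.items()
    }
```

Rule maps that rated the approved position higher gain weight; maps that preferred the rejected one lose weight, so repeated annotations shift the overall matching degree toward the user's choices.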
- control unit 12 of the holding parameter estimation device 10 may notify the user of the estimated holding position via the interface 14 .
- the control unit 12 receives an input for correcting the holding position as an annotation for the estimated holding position from the user through the interface 14 .
- the control unit 12 may output the corrected holding position to the robot control device 110 based on the correction information by the user.
- the control unit 12 may update the weighting coefficients by learning based on the correction information by the user, and re-estimate the holding position.
- the control unit 12 may estimate a plurality of holding position candidates and notify the user of them via the interface 14 .
- the control unit 12 may estimate, as a candidate, a holding position for which the overall matching score is equal to or greater than a predetermined value.
- the control unit 12 receives an input from the user to select from the holding position candidates as an annotation for the estimated holding position through the interface 14 .
- the control unit 12 may output the holding position selected by the user to the robot control device 110 .
- the control unit 12 may update the weighting coefficients by learning based on information selected by the user.
- the control unit 12 may extract holding positions where the overall matching score is equal to or higher than a predetermined value as candidate positions where the holding object 80 can be held, and notify the user of them.
- the control unit 12 receives an input for correcting a candidate position or an input for selecting a candidate position as an annotation for the candidate position from the user via the interface 14 .
- the control unit 12 evaluates the suitability of each candidate position based on the input of modification or selection of the candidate position.
- the control unit 12 may update the weighting coefficient so that the value of the overall matching degree increases for the candidate positions evaluated as being highly suitable as holding positions based on the user's input.
- the control unit 12 may output the selected candidate position to the robot control device 110 as the holding position.
- the control unit 12 may output the corrected candidate position to the robot control device 110 as the holding position.
- the control unit 12 of the holding parameter estimation device 10 may execute the procedure of the flowchart shown in FIG.
- the control unit 12 selects holding position candidates by, for example, executing the procedure of the flowchart of FIG. 22 (step S21).
- the control unit 12 outputs the holding position candidates to the interface 14 (step S22).
- the control unit 12 corrects the holding position based on the correction input by the user (step S23).
- the control unit 12 learns the contents of correction (step S24).
- the control unit 12 updates the weighting based on the learning result (step S25). After executing the procedure of step S25, the control unit 12 ends the execution of the procedure of the flowchart of FIG.
- the holding parameter estimation device 10 can update the weighting based on the content of the user's annotations. As a result, robustness to various work environments can be improved.
- the holding parameter estimation device 10 may generate a rule map by weighting and mixing multiple maps. For example, the holding parameter estimation device 10 may generate one object map 50 by mixing multiple object maps 50. When weighting and blending multiple maps, the holding parameter estimation device 10 may update the weighting coefficient for each map through annotation-based learning.
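The weighted mixing of several maps into one rule map can be sketched as a normalised weighted sum over map arrays. Representing the maps as NumPy arrays is an assumption for illustration:

```python
import numpy as np

def blend_maps(maps, coefficients):
    """Mix several maps into one by a weighted sum, normalised so the
    mixing coefficients effectively sum to 1."""
    total = float(sum(coefficients))
    return sum(c * m for c, m in zip(coefficients, maps)) / total
```

The coefficients here play the same role as the per-map weighting factors that annotation-based learning would update.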
- the holding parameter estimation device 10 may acquire annotation information for a plurality of holding positions, and correct the appropriate value based on the annotation information.
- the holding parameter estimation device 10 may generate a rule map based on the holding target image captured by the camera 4.
- the control unit 12 of the holding parameter estimation device 10 may, for example, estimate the center of gravity of the holding target 80 based on the holding target image and generate the object map 50 specifying the position of the center of gravity.
- the control unit 12 may estimate the material of the holding object 80 based on the holding target image and generate a contact map 60 that specifies the frictional force acting between the surface of the holding object 80 and the holding portion of the end effector 2B.
- the control unit 12 may estimate the material of the holding object 80 based on information about its color, pattern, or surface unevenness.
- the holding parameter estimation device 10 first sets the height at which the holding object 80 is held, and estimates the opening width and holding position at the set height.
- the holding parameter estimation device 10 may change the height at which the holding object 80 is held, execute the holding parameter estimation method at each height, and estimate the holding height and the combination of the approach position 70 and rotation angle that give the largest overall matching degree, together with the opening width, as the holding position. Doing so can improve the stability of holding.
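The height sweep described above amounts to repeating the estimation at each candidate height and keeping the best-scoring combination. A minimal sketch, with `estimate` standing in for the per-height procedure (steps S1 to S12) as an assumed callback:

```python
def best_holding_parameters(heights, estimate):
    """Run the per-height estimation and keep the (height, approach position,
    rotation angle, opening width) combination with the highest overall
    matching degree. `estimate(h)` is a stand-in returning
    (overall_matching_degree, (approach_position, rotation, opening_width))."""
    best_score, best_params = float("-inf"), None
    for h in heights:
        score, params = estimate(h)
        if score > best_score:
            best_score, best_params = score, (h, *params)
    return best_params
```

The returned tuple bundles the holding height with the approach position, rotation angle, and opening width, matching the combined output described in the text.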
- examples of the storage medium on which the program is recorded include an optical disk, a magneto-optical disk, a CD-ROM, a CD-R, a CD-RW, a magnetic tape, a hard disk, and a memory card.
- the implementation form of the program is not limited to an application program such as object code compiled by a compiler or program code executed by an interpreter; other implementation forms are also possible.
- the program may or may not be configured so that all processing is performed only in the CPU on the control board.
- the program may be configured to be partially or wholly executed by another processing unit mounted on an expansion board or expansion unit added to the board as required.
- Embodiments according to the present disclosure are not limited to any specific configuration of the embodiments described above. Embodiments of the present disclosure can extend to any novel feature described in the present disclosure, or any combination thereof, and to any novel method or process step described, or any combination thereof.
- Descriptions such as "first" and "second" in this disclosure are identifiers for distinguishing the configurations. Configurations distinguished by descriptions such as "first" and "second" in this disclosure may have their numbers exchanged. For example, the first region can exchange the identifiers "first" and "second" with the second region. The exchange of identifiers is performed simultaneously. The configurations remain distinct after the exchange of identifiers. Identifiers may be deleted. Configurations whose identifiers have been deleted are distinguished by reference signs. The identifiers such as "first" and "second" in this disclosure alone should not be used as a basis for interpreting the order of the configurations or the existence of identifiers with smaller numbers.
Landscapes
- Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Manipulator (AREA)
Abstract
Description
An acquisition unit that acquires information about an end effector having a holding portion that holds a holding object at an arbitrary opening width, holding target information indicating the holding object, and depth data about the holding object; and
a control unit that acquires, based on the information, an end effector model indicating a region in which the end effector can exist, and estimates, based on the end effector model, the holding target information, and the depth data, an opening width of the holding portion for holding the holding object.
Acquire information about an end effector having a holding portion that holds a holding object, holding target information indicating the holding object, and depth data about the holding object;
acquire, based on the information, an end effector model indicating a region in which the end effector can exist; and
estimate, based on the end effector model, the holding target information, and the depth data, an opening width of the holding portion for holding the holding object.
As shown in FIGS. 1, 2, and 3, a robot control system 100 according to an embodiment of the present disclosure includes a robot 2, a camera 4, a robot control device 110, and a holding parameter estimation device 10. The robot 2 performs work by holding a holding object 80 with an end effector 2B. The robot control device 110 controls the robot 2. The holding parameter estimation device 10 estimates, so that the robot 2 can hold the holding object 80, the opening width of the end effector 2B described later and the position contacted when holding as the holding position, and outputs them to the robot control device 110.
The robot 2 includes an arm 2A and the end effector 2B. The arm 2A may be configured, for example, as a 6-axis or 7-axis vertical articulated robot. The arm 2A may be configured as a 3-axis or 4-axis horizontal articulated robot or SCARA robot. The arm 2A may be configured as a 2-axis or 3-axis Cartesian robot. The arm 2A may be configured as a parallel link robot or the like. The number of axes constituting the arm 2A is not limited to these examples. In other words, the robot 2 has an arm 2A connected by a plurality of joints and operates by driving the joints.
In the configuration example shown in FIG. 1, the robot control system 100 includes a camera 4 attached to the end effector 2B of the robot 2. The camera 4 photographs the holding object 80. The camera 4 may photograph the holding object 80, for example, from the direction in which the end effector 2B holds the holding object 80. An image obtained by photographing the holding object 80 is also referred to as a holding target image. The camera 4 includes a depth sensor and is configured to be able to acquire depth data of the holding object 80. Depth data is data on the distance in each direction within the angle of view of the depth sensor. More specifically, the depth data can also be said to be information on the distance from the camera 4 to a measurement point. The image captured by the camera 4 may include monochrome luminance information, or luminance information for each color represented by RGB (Red, Green, Blue) or the like. The number of cameras 4 is not limited to one and may be two or more. The camera 4 may also photograph, as obstacles, other objects located within a predetermined range from the holding object 80, and may acquire depth data of the obstacles as well. The camera 4 is not limited to being attached to the end effector 2B and may be provided at any position from which the holding object 80 can be photographed. In a configuration in which the camera 4 is attached to a structure other than the end effector 2B, the holding target image may be synthesized based on images captured by the camera 4 attached to that structure. The holding target image may be synthesized by image transformation based on the relative position and orientation of the end effector 2B with respect to the mounting position and orientation of the camera 4. Alternatively, the holding target image may be generated from CAD and drawing data.
As shown in FIG. 3, the holding parameter estimation device 10 includes a control unit 12 and an interface (acquisition unit) 14.
The robot control device 110 may acquire information specifying the opening width from the holding parameter estimation device 10. The robot control device 110 may control the robot 2 to open the end effector 2B to the estimated opening width when part of the end effector 2B is displaced to the approach area described later. The robot control device 110 may further acquire information specifying the holding position from the holding parameter estimation device 10. The robot control device 110 may control the robot 2 so that the robot 2 holds the holding object 80 at the estimated holding position.
The robot control system 100 controls the robot 2 with the robot control device 110 to cause the robot 2 to perform work. In this embodiment, the work performed by the robot 2 includes an operation for holding the holding object 80. The work may further include an operation of holding the holding object 80. In the robot control system 100, the holding parameter estimation device 10 estimates the opening width when the end effector 2B is displaced to the holding object 80. The robot control device 110 may control the robot 2 to open the end effector 2B to the estimated opening width. In the robot control system 100, the holding parameter estimation device 10 may further estimate the holding position of the holding object 80 by the robot 2. The robot control device 110 may control the robot 2 so that the robot 2 holds the holding object 80 at the holding position.
The control unit 12 of the holding parameter estimation device 10 may execute a holding parameter estimation method including the procedure of the flowchart illustrated in FIG. 21. The holding parameter estimation method may be realized as a holding position estimation program executed by a processor constituting the control unit 12 of the holding parameter estimation device 10. The holding parameter estimation program may be stored on a non-transitory computer-readable medium.
As described above, according to the holding parameter estimation device 10 of this embodiment, the opening width when part of the end effector 2B is displaced to hold the holding object 80 is estimated based on the end effector model 30, the holding target image, and the depth data. If the opening width of the end effector 2B is close to the width of the holding object 80 when part of the end effector 2B is displaced to the holding object 80 for holding, the end effector may collide with the holding object during the displacement. On the other hand, if the opening width is too wide, it takes a long time to hold the holding object 80 after part of the end effector 2B has been displaced to the holding object 80. Furthermore, if the opening width is too wide, the end effector 2B may collide with obstacles around the holding object 80 in the closing operation for holding after the displacement. Against such events, the holding parameter estimation device 10 having the above configuration estimates the opening width at the time of displacement, and can therefore estimate an appropriate state when bringing the end effector 2B closer to the holding object 80. Therefore, the holding parameter estimation device 10 can provide parameters for operating the robot 2 so as to obtain an appropriate opening width.
Other embodiments are described below.
The holding parameter estimation device 10 may acquire a holding position of the holding object 80 estimated based on another method. The holding parameter estimation device 10 may estimate the opening width for each acquired holding position by executing the holding parameter estimation method according to this embodiment on the acquired holding position.
The holding parameter estimation device 10 calculates an overall matching degree by weighting and summing the matching degrees calculated for each rule map. The holding parameter estimation device 10 may update the weighting coefficients by learning based on annotation information for the estimated holding position. By updating the weighting coefficients, the holding parameter estimation device 10 can improve the estimation accuracy of the holding position.
The holding parameter estimation device 10 may generate a rule map based on the holding target image captured by the camera 4. The control unit 12 of the holding parameter estimation device 10 may, for example, estimate the center of gravity of the holding object 80 based on the holding target image and generate an object map 50 specifying the position of the center of gravity. The control unit 12 may, for example, estimate the material of the holding object 80 based on the holding target image and generate a contact map 60 specifying the frictional force acting between the surface of the holding object 80 and the holding portion of the end effector 2B. The control unit 12 may estimate the material of the holding object 80 based on information about its color, pattern, or surface unevenness.
The holding parameter estimation device 10 first sets the height at which the holding object 80 is held, and estimates the opening width and holding position at the set height. The holding parameter estimation device 10 may change the height at which the holding object 80 is held, execute the holding parameter estimation method at each height, and estimate the holding height and the combination of the approach position 70 and rotation angle that give the largest overall matching degree, together with the opening width, as the holding position. Doing so can improve the stability of holding.
2A arm
2B end effector
4 camera
5 operating range of the robot
6 work start table
7 work target table
10 holding parameter estimation device
12 control unit
14 interface
20 mask image
22 window
24 mask
30 end effector model
31 opening width model
32 holding portion position
34 stroke range
36 operating range of the holding portion
38 out-of-operation range
40 surrounding environment map
41 approach area
42 object area
43 non-approach area
44 first area
50 object map
52 cross section
60 contact map
62 outer periphery
70 approach position
72a, 72b, 72c projection models
74a, 74c projected positions of the holding portion
80 holding object
81 main body
82 holding point
83 hook-bracket-shaped member
90 approach map
91 approach area
92 object area
93 non-approach area
94 first area
94ex area obtained by dilating the first area
95 boundary
96 second area
96ex area obtained by dilating the object area
97 obstacle area
98 external area
99 possible existence area
100 robot control system
110 robot control device
AL movable straight line
CP1 first center position
C92 center of the object area
SL straight line passing through the center of the object area and intersecting the outer edge of the obstacle area at one point
WP position of the holding portion corresponding to the opening width
Claims (14)
- 1. A holding parameter estimation device comprising:
an acquisition unit that acquires information about an end effector having a holding portion that holds a holding object at an arbitrary opening width, holding target information indicating the holding object, and depth data about the holding object; and
a control unit that acquires, based on the information, an end effector model indicating a region in which the end effector can exist, and estimates, based on the end effector model, the holding target information, and the depth data, an opening width of the holding portion for holding the holding object.
- 2. The holding parameter estimation device according to claim 1, wherein the control unit
creates, based on the holding target information and the depth data, an approach map indicating an approach area in which the end effector can be opened, at the height at which the holding object is held, without interfering with objects other than the holding object, and
estimates the opening width based on the approach area and the region in which the end effector can exist.
- 3. The holding parameter estimation device according to claim 2, wherein the approach area is an area obtained by excluding, from the area around the holding object in which the end effector can be opened without interfering with the holding object, the existence area of the holding object and a first area extending, from the outer edge of the existence area of objects other than the holding object that faces the holding object, toward the side opposite the holding object.
- 4. The holding parameter estimation device according to claim 3, wherein
the end effector has a holding portion that is brought into contact with the holding object when holding the holding object, and
the approach area is an area obtained by further excluding an area formed by dilating the existence area of the holding object and the first area by at least half the width of the holding portion in the width direction of the holding portion and by at least half the thickness of the holding portion in the thickness direction of the holding portion.
- 5. The holding parameter estimation device according to any one of claims 2 to 4, wherein the control unit
estimates, for each holding portion of the end effector, a possible existence area where the approach area and the end effector model overlap when the end effector model is superimposed on the approach map, and
estimates an arbitrary point of the possible existence area for each holding portion as the position of the holding portion corresponding to the opening width in that possible existence area.
- 6. The holding parameter estimation device according to claim 5, wherein
the holding portion is configured to be movable along a predetermined direction, and
the control unit
calculates a first center position that is the center position of the possible existence area for each holding portion of the end effector, and,
when each holding portion is displaced from the first center position toward the holding object along the predetermined direction, estimates, among the areas through which the holding portion passes before reaching the end of the possible existence area, an arbitrary position that does not interfere with the holding object or objects other than the holding object as the position of the holding portion corresponding to the opening width in the possible existence area.
- 7. The holding parameter estimation device according to claim 6, wherein, when the first center position deviates from a movable straight line that passes through the center position of the region in which the end effector can exist in the end effector model and is parallel to the opening/closing direction of the holding portion, the control unit regards the intersection of the perpendicular dropped from the first center position to the movable straight line with the movable straight line as the first center position, and estimates the position of the holding portion corresponding to the opening width.
- 8. The holding parameter estimation device according to claim 6 or 7, wherein the control unit displaces from the first center position toward the center position of the region in which the end effector can exist in the end effector model, and, when no approach area exists there, searches for the approach area in the possible existence area outside the first center position.
- 9. The holding parameter estimation device according to any one of claims 2 to 8, wherein the holding portion of the end effector is formed by at least two members movable so as to face each other.
- 10. The holding parameter estimation device according to any one of claims 1 to 9, wherein the control unit
creates an opening width model specifying a region in which the end effector can exist within the range of the opening width,
generates, based on the holding target information, a rule map including a map defining the position of the holding object that the end effector should use for holding, and
estimates, based on the opening width model and the rule map, the position at which the end effector contacts the holding object as the holding position.
- 11. The holding parameter estimation device according to claim 10, wherein the control unit
calculates, based on the opening width model and the rule map, a proper value representing the appropriateness of the holding position for a plurality of the holding positions, and
estimates the holding position based on the proper value.
- 12. The holding parameter estimation device according to claim 11, wherein
the opening width model and the rule map are each represented by values, assigned to each position, indicating the appropriateness of that position as a position for holding the holding object, and
the control unit calculates the proper value by computing the values assigned to each position when the opening width model is superimposed on the rule map.
- 13. The holding parameter estimation device according to claim 11 or 12, wherein the control unit calculates the proper value based on the rule map from the proper values based on the plurality of maps included in the rule map and a map coefficient defined for each of the maps.
- 14. A holding parameter estimation method comprising:
acquiring information about an end effector having a holding portion that holds a holding object, holding target information indicating the holding object, and depth data about the holding object;
acquiring, based on the information, an end effector model indicating a region in which the end effector can exist; and
estimating, based on the end effector model, the holding target information, and the depth data, an opening width of the holding portion for holding the holding object.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2023540418A JPWO2023013743A1 (ja) | 2021-08-04 | 2022-08-04 | |
CN202280054041.1A CN117794709A (zh) | 2021-08-04 | 2022-08-04 | 握持参数估计设备和握持参数估计方法 |
EP22853161.2A EP4382266A1 (en) | 2021-08-04 | 2022-08-04 | Holding parameter estimation device and holding parameter estimation method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021128476 | 2021-08-04 | ||
JP2021-128476 | 2021-08-04 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023013743A1 true WO2023013743A1 (ja) | 2023-02-09 |
Family
ID=85155801
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/030016 WO2023013743A1 (ja) | 2021-08-04 | 2022-08-04 | 保持パラメータ推定装置及び保持パラメータ推定方法 |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP4382266A1 (ja) |
JP (1) | JPWO2023013743A1 (ja) |
CN (1) | CN117794709A (ja) |
WO (1) | WO2023013743A1 (ja) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012066819A1 (ja) * | 2010-11-17 | 2012-05-24 | 三菱電機株式会社 | ワーク取り出し装置 |
JP2018205929A (ja) | 2017-05-31 | 2018-12-27 | 株式会社Preferred Networks | 学習装置、学習方法、学習モデル、検出装置及び把持システム |
JP2019089157A (ja) * | 2017-11-14 | 2019-06-13 | オムロン株式会社 | 把持方法、把持システム及びプログラム |
WO2021029064A1 (ja) * | 2019-08-15 | 2021-02-18 | オムロン株式会社 | 情報処理装置及び情報処理方法 |
-
2022
- 2022-08-04 WO PCT/JP2022/030016 patent/WO2023013743A1/ja active Application Filing
- 2022-08-04 JP JP2023540418A patent/JPWO2023013743A1/ja active Pending
- 2022-08-04 EP EP22853161.2A patent/EP4382266A1/en active Pending
- 2022-08-04 CN CN202280054041.1A patent/CN117794709A/zh active Pending
Also Published As
Publication number | Publication date |
---|---|
EP4382266A1 (en) | 2024-06-12 |
CN117794709A (zh) | 2024-03-29 |
JPWO2023013743A1 (ja) | 2023-02-09 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22853161 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023540418 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202280054041.1 Country of ref document: CN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2022853161 Country of ref document: EP Effective date: 20240304 |