JP2013052490A - Workpiece takeout device

Workpiece takeout device

Info

Publication number
JP2013052490A
Authority
JP
Japan
Prior art keywords
gripping
workpiece
measurement
opening
hand
Prior art date
Legal status
Granted
Application number
JP2011193918A
Other languages
Japanese (ja)
Other versions
JP5623358B2 (en)
Inventor
Yukiyasu Domae
Tatsuya Nagatani
Tetsuo Noda
Makito Seki
Current Assignee
Mitsubishi Electric Corp
Original Assignee
Mitsubishi Electric Corp
Priority date
Filing date
Publication date
Application filed by Mitsubishi Electric Corp
Priority to JP2011193918A
Publication of JP2013052490A
Application granted
Publication of JP5623358B2
Status: Active
Anticipated expiration

Landscapes

  • Engineering & Computer Science (AREA)
  • Manipulator (AREA)
  • Human Computer Interaction (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)

Abstract

PROBLEM TO BE SOLVED: To provide a workpiece takeout device that reduces failures when taking a workpiece out of a pile of workpieces and also reduces retry operations.
SOLUTION: A measurement feature identification unit 52 estimates how easy a workpiece part measured by a sensor 2 is to grip, based on past takeout success/failure information stored in a DB 1. A gripping operation calculation unit 53 and an opening/closing operation calculation unit 54 do not preferentially grip workpieces that appear hard to grip; however, when no easy-to-grip workpiece is available, they adjust at least one of the opening/closing amount, operation speed, and gripping force of the hand so that the takeout operation is performed more carefully.

Description

The present invention relates to a technique for taking a workpiece out of a large number of workpieces stacked in bulk.

To reduce takeout failures when picking from a large number of workpieces, there is a known workpiece takeout device that detects measurement failures and takeout failures, registers them as NG workpieces, and avoids picking any workpiece at a coarse position registered as NG (see, for example, Patent Document 1 below).

Patent Document 1: Japanese Patent No. 4226623
Patent Document 2: Japanese Patent No. 2555824
Patent Document 3: Japanese Patent No. 3930490
Patent Document 4: JP 2011-93058 A

However, when the coarse position of a workpiece that failed to be taken out is registered as NG so that the same failure is not repeated, as in the conventional technique, the NG registrations accumulate and the number of workpieces that can be taken out shrinks. Workpieces then become visible but impossible to take out, and the number of retries increases.

The present invention has been made to solve the above problems, and its object is to provide a workpiece takeout device that achieves both few retries and a high takeout success rate, thereby improving work efficiency and, in turn, production efficiency.

The present invention is a workpiece takeout device comprising at least: a DB that stores at least the measurement features of workpieces and the success or failure of their takeout; a sensor that measures workpieces stacked in bulk; a hand that grips the stacked workpieces; a robot that moves the hand to a gripping position and orientation; an information processing unit that computes, from the measurement data, the gripping position and orientation and at least one item of operation control information among the opening/closing amount, gripping force, and operation speed; and a control unit that, according to the operation control information of the information processing unit 5, commands the robot to perform the gripping operation and commands the hand to perform the opening/closing operation. The information processing unit includes: a measurement feature extraction unit that extracts, from the measurement data, measurement features for computing the gripping position and orientation; a measurement feature identification unit that identifies the measurement features using the information in the DB; a gripping operation calculation unit that computes, from the measurement features and the information in the DB, the gripping position and orientation of the hand that grips the workpiece; and an opening/closing operation calculation unit that computes parameters related to opening and closing of the hand from the measurement features, the gripping position and orientation, and the information in the DB. The measurement feature identification unit estimates the takeout success rate of the extracted measurement features based on the past measurement features and takeout success/failure information stored in the DB, and the opening/closing operation calculation unit adjusts at least one of the opening/closing amount, gripping force, and operation speed of the hand according to the estimated takeout success rate.

With this invention, even when the measured scene contains only workpieces that appear hard to grip, the takeout is made to succeed by improving the operation instead of retrying. Retries are therefore few and the takeout success rate is high, yielding work efficiency, and hence production efficiency, higher than before.

FIG. 1 is a block diagram showing a workpiece takeout device according to Embodiment 1 of the present invention.
FIG. 2 is a diagram for explaining the operation of the measurement feature extraction unit in the present invention.
FIG. 3 is a conceptual diagram of one method of P/N calculation in the measurement feature identification unit in the present invention.
FIG. 4 is an operation flowchart of the information processing unit 5 in the present invention.
FIG. 5 is a block diagram showing a workpiece takeout device according to Embodiment 2 of the present invention.
FIG. 6 is a block diagram showing a workpiece takeout device according to Embodiment 3 of the present invention.
FIG. 7 is a block diagram showing a workpiece takeout device according to Embodiment 4 of the present invention.

In the workpiece takeout device according to the present invention, the ease of gripping a measurement feature (feature information obtained from the appearance and shape of a workpiece) is estimated from past takeout success/failure information. Features that appear hard to grip are not gripped preferentially; but when no other measurement feature appears easier to grip, at least one of the opening/closing amount, gripping force, and operation speed of the hand is adjusted automatically to produce a more careful operation, so that even a workpiece that appears hard to grip can be taken out.

Hereinafter, the workpiece takeout device according to the present invention will be described for each embodiment with reference to the drawings. In each embodiment, identical or corresponding parts are denoted by the same reference numerals, and duplicated descriptions are omitted.

Embodiment 1.
FIG. 1 is a block diagram showing a workpiece takeout device according to Embodiment 1 of the present invention. The workpiece takeout device comprises at least a DB (database) 1, a sensor 2, a hand 3, a robot 4, an information processing unit 5, and a control unit 6. The information processing unit 5 includes at least a measurement feature extraction unit 51, a measurement feature identification unit 52, a gripping operation calculation unit 53, and an opening/closing operation calculation unit 54. The workpieces 7 are, for example, stacked in bulk in a supply box 8.

The flow of operation in the configuration of FIG. 1 is as follows. The sensor 2 measures the workpieces 7 stacked in bulk in the supply box 8 and passes the measurement data to the information processing unit 5. Inside the information processing unit 5, the measurement feature extraction unit 51 first extracts features of the appearance and shape of the workpieces (called measurement features) from the measurement data. The measurement feature identification unit 52 then estimates how easy the extracted measurement features are to grip, from the relationship between the past measurement features stored in the DB 1 and their takeout success or failure.

The gripping operation calculation unit 53 computes the gripping position and orientation by a method that depends on the estimated ease of gripping of the measurement feature. Likewise, the opening/closing operation calculation unit 54 adjusts the opening/closing parameters (meaning the adjustment parameters related to the opening/closing (gripping) of the hand 3, including one or more of the opening/closing amount, the opening/closing (gripping) force, and the opening/closing (gripping) operation speed) by a method that depends on the estimated ease of gripping.

The gripping position/orientation and opening/closing parameter information computed by the information processing unit 5 are sent to the control unit 6.
In the control unit 6, the gripping position/orientation information becomes the operation control information of the robot 4 for bringing the hand 3 into the gripping position and orientation, and is transmitted to the robot 4 as the gripping operation. The opening/closing parameters become the operation control information for the hand 3 to grip the workpiece 7, and are transmitted to the hand 3 as the opening/closing operation. The robot 4 and the hand 3 grip the workpiece according to the information from the control unit 6.

Next, the description of each component is supplemented. The DB 1 accumulates and stores at least two kinds of data: the measurement features extracted from the workpiece shape information (measurement data) measured in the past by the sensor 2, and the takeout success/failure data of those measurement features obtained from the hand 3. It may also store workpiece shape information and hand shape information expressed as CAD data, edges, point clouds, circles, ellipses, planes, curved surfaces, sets of their parameters, geometric relations, and so on.

The sensor 2 measures the workpieces 7 stacked in bulk in the supply box 8, and may be mounted on the robot 4 or on a fixed jig, a movable slider, or the like. It measures a two-dimensional image or three-dimensional data of the pile of workpieces. In the two-dimensional case a camera may be used. In the three-dimensional case, devices based on the following may be used: binocular or multi-view stereo; multi-baseline stereo; active methods such as the space-encoding method using a camera together with a light-projection unit such as a laser or projector; the time-of-flight method, which measures distance from the return time of the emitted light; the factorization method, which uses the robot motion with a monocular camera; Structure from Motion (a technique that simultaneously recovers the three-dimensional shape of a scene and the camera positions from multiple images taken while changing the camera viewpoint; hereinafter SFM, also called Structure and Motion, SAM); motion stereo; the volume intersection (visual hull) method; the focus-based methods Shape from Focus (which obtains the surface shape of an object from focus information, i.e., information indicating whether each point is in focus) and Shape from Defocus (which computes a per-block focus evaluation value from a group of multi-focus images taken while changing the focus, and estimates distance from it); and the light-section method using line light.

The three-dimensional information measured by the sensor 2 may take the form of point cloud data holding three-dimensional X, Y, Z positions, or of a range image in which each pixel holds the height of the measured object. The coordinate system at the time of measurement may be the camera, robot, or workpiece coordinate system. Positions and distances may be expressed in meters or in pixels.

The hand 3 grips and takes out the workpieces 7 stacked in bulk. As the hand type, for example, a pinching type that clamps and grips an object, a suction type that sucks onto a surface, or an expanding type that is inserted into a hole of an object and expanded to lift it may be used. The number of fingers (claws) or suction parts of the hand 3 that contact the workpiece 7 may be arbitrary.

The robot 4 moves the hand 3 into the position and orientation for gripping the workpiece 7 and moves it after takeout. The robot 4 may be of any kind, such as a SCARA, vertical articulated, or parallel link robot, and may have any number of degrees of freedom as long as it can perform the gripping operation. The takeout operation refers to the operation of grasping a piled workpiece 7 with the hand 3 and removing it from the supply box 8. After the takeout operation, the workpiece 7 may be transferred onto a belt conveyor (not shown) or the like, palletized, assembled, and so on.

The information processing unit 5 comprises the measurement feature extraction unit 51, the measurement feature identification unit 52, the gripping operation calculation unit 53, and the opening/closing operation calculation unit 54. It may be implemented on a PC (Personal Computer) or an FPGA (Field-Programmable Gate Array) board.

The measurement feature extraction unit 51 extracts features of the appearance and shape of the workpiece 7 from the measurement data obtained by the sensor 2. These are called measurement features here.
With a two-dimensional sensor, the measurement features may be, for example, the spatial arrangement of the texture on the workpiece surface, the complexity of the texture, the edge amount, the area and principal-axis direction of a region enclosed by edges, or local edge directions.
With a three-dimensional sensor, the main normal direction of a surface, the distribution of surface normal directions, a histogram thereof, flatness, and so on may be used in addition. The complexity of the surroundings may also be quantified from the texture state or the variation of the height distribution and used as a measurement feature.
At least one of these measurement features is stored in the DB 1 as a multidimensional feature vector. The measurement features may be vector quantized.

FIG. 2 is a diagram for explaining the operation of the measurement feature extraction unit 51 when a range image is used as the measurement data; (a) is a flowchart and (b) is the explanatory illustration corresponding to each step of (a). Edge detection on the range image extracts the shape edges of the objects (step ST1). Segmenting the regions separated by these edges (step ST2) extracts partial surface regions of the workpieces 7. A candidate is then selected from these (step ST3); the selection can be computed from information such as height and whether there are surfaces nearby. In FIG. 2, the surface at the highest position is extracted as the candidate. From the extracted candidate (segment S1), information such as the area of the region, the principal-axis direction, and the main normal direction and area of the surface is extracted into a multidimensional feature vector (step ST4), which is the measurement feature. With this method, measurement features such as surfaces can be extracted without having workpiece shape information.
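The following Python sketch illustrates steps ST1 to ST4 on a range image held as a NumPy array. It is a minimal illustration only: the gradient edge detector, the thresholds, and the exact set of features packed into the vector are assumptions, not the patent's prescribed implementation.

import numpy as np
from scipy import ndimage

def extract_candidate_feature(depth, edge_thresh=0.01, min_area=50):
    """ST1-ST4 of Fig. 2: edge detection, segmentation, candidate
    selection, and multidimensional feature-vector extraction."""
    # ST1: shape edges from the depth gradient magnitude
    gy, gx = np.gradient(depth)
    edges = np.hypot(gx, gy) > edge_thresh
    # ST2: label (segment) the regions separated by the edges
    labels, n = ndimage.label(~edges)
    candidates = []
    for k in range(1, n + 1):
        mask = labels == k
        area = int(mask.sum())
        if area < min_area:
            continue
        ys, xs = np.nonzero(mask)
        height = -float(depth[mask].mean())   # smaller depth = nearer = higher
        # principal-axis direction from the segment's second moments
        axis = np.linalg.eigh(np.cov(np.vstack([xs, ys])))[1][:, -1]
        # main normal from a least-squares plane fit z = a*x + b*y + c
        a, b, _ = np.linalg.lstsq(np.c_[xs, ys, np.ones(area)],
                                  depth[mask], rcond=None)[0]
        normal = np.array([-a, -b, 1.0])
        normal /= np.linalg.norm(normal)
        v = np.array([area, height, axis[0], axis[1], *normal])
        candidates.append((height, k, v))
    # ST3: pick the highest segment; ST4: return its label and feature vector
    return max(candidates)[1:] if candidates else None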

The measurement feature identification unit 52 estimates whether an extracted measurement feature is easy to grip, based on past measurement features and their takeout success or failure. This can be solved as a pattern classification problem: deciding, using the past measurement feature group in the multidimensional feature vector space, into which of the two classes (takeout success or failure) a newly extracted measurement feature falls. A neural network, an SVM (support vector machine), a k-nearest-neighbor classifier, or Bayesian classification may be used as the classifier. The ease of gripping may be expressed as the binary value easy/hard to grip, given by the class into which the feature is classified, or it may be computed from the degree Sp of closeness to class 1 (takeout success) and the degree Sn of closeness to class 2 (takeout failure) obtained from the classifier.

Sp and Sn can be computed, for example, by nearest-neighbor search in the multidimensional feature vector space, from the Euclidean distance between the extracted measurement feature and the past measurement features classified into class 1 (for Sp) and into class 2 (for Sn). These may be normalized by the maximum expected difference in Euclidean distance; since a high P/N must indicate ease of gripping, Sp and Sn are used as similarities, i.e., values that grow as the distance shrinks.
To express whether the feature is closer to the measurement feature samples whose takeout succeeded (positive samples) or to those whose takeout failed (negative samples), the following value is defined.

P/N = Sp / Sn    (1)

Here P/N means the positive/negative ratio (success/failure ratio). The ease of gripping a measurement feature can thus be quantified by P/N. For a workpiece with a high P/N, the gripping and opening/closing operations are computed with emphasis on fast processing; for a workpiece with a low P/N, they are computed with emphasis on careful operation. P/N may be obtained in other ways, as long as it is based on the measurement features.

FIG. 3 is a conceptual diagram of one method of P/N calculation. Here, the DB 1 stores multidimensional feature vectors representing takeout success and takeout failure, built from past measurement features and their success/failure: the multidimensional vectors of past measurement features are classified into the two classes by success or failure, and for each class a representative multidimensional feature vector can be formed as the mean or median of the feature vectors of that class. All feature vectors may be used, or the computation may be reduced by keeping only the features with small within-class variation and discarding the others. The computation of these representative multidimensional feature vectors may be performed by a DB management processing unit (not shown) connected to the DB 1 that manages the information in the DB 1, or by any of the units of the information processing unit 5.

In FIG. 3, the similarity between the multidimensional feature vector representing each class and the extracted multidimensional feature vector is evaluated and taken as Sp and Sn. Writing the extracted multidimensional feature vector of the measurement feature as v_i (i = 1, 2, ..., n), the multidimensional feature vector representing takeout success as p_i, and the multidimensional feature vector representing takeout failure as n_i, the similarities Sp and Sn

[Eqs. (2) and (3), given as image equations in the original: the similarity Sp between v_i and p_i, and the similarity Sn between v_i and n_i.]

can be computed. By obtaining Eq. (1) from Eqs. (2) and (3), an index for judging whether a measurement feature measured by the sensor is easy to grip can be attached to it as workpiece takeout is repeated. This can be computed without having workpiece shape information in advance.
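As a concrete illustration of Eqs. (1) to (3), the following Python sketch computes Sp, Sn, and P/N. The inverse-Euclidean-distance similarity is an assumption made for illustration, since Eqs. (2) and (3) appear only as images in the original; any similarity that decreases with distance would fit the surrounding text.

import numpy as np

def similarity(v, r):
    """Similarity between an extracted feature vector v and a representative
    vector r. Inverse-distance form assumed; the patent's exact Eqs. (2)-(3)
    may differ (e.g. a distance normalized by its expected maximum)."""
    return 1.0 / (1.0 + np.linalg.norm(v - r))

def pn_ratio(v, p_rep, n_rep):
    """Eq. (1): P/N = Sp/Sn. A high value means the feature resembles past
    successes more than past failures, i.e. it looks easy to grip."""
    return similarity(v, p_rep) / similarity(v, n_rep)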

The measurement feature identification unit 52 not only quantifies the ease of gripping of a measurement feature; by evaluating the similarity between the extracted measurement feature and measurement features whose takeout succeeded in the past and associating them, it also makes it possible to skip the subsequent processing. For example, the DB 1 stores the P/N of past measurement features together with their multidimensional feature vectors. First, the P/N of the extracted measurement feature is computed. When the P/N is high and the feature is estimated to be easy to grip, a past measurement feature with a close P/N is selected, and the similarity Sb between the multidimensional feature vector b_i of the selected past measurement feature and the multidimensional feature vector v_i of the extracted measurement feature is then computed.

[Eq. (4), given as an image equation in the original: the similarity Sb between the past feature vector b_i and the extracted feature vector v_i.]

The reason for computing P/N first, rather than starting with Eq. (4), is that computing Eq. (4) against the many accumulated past measurement features would make the processing time enormous. The cost of computing Eqs. (1), (2), and (3) does not change as the information accumulated in the DB 1 grows. By computing P/N first, the processing time does not increase much even as the past information stored in the DB 1 increases.

If the Sb computed by Eq. (4) is high and the feature is judged similar in appearance and shape to the past measurement feature, the computations of the gripping operation calculation unit 53 and the opening/closing operation calculation unit 54 are skipped, and the gripping position/orientation and opening/closing parameters that succeeded when the past measurement feature was gripped are reused. This shortens the processing time and speeds up operation. The gripping position/orientation and opening/closing parameter data from successful grips of past measurement features are stored, for example, in the DB 1.
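The two-stage lookup described above might be sketched as follows in Python, reusing the similarity helper from the previous sketch. The record layout, thresholds, and band width are illustrative assumptions.

def find_reusable_grasp(v, pn, success_records, pn_band=0.2, sb_thresh=0.9):
    """First filter past successes by closeness in P/N (cheap), then confirm
    with the similarity Sb of Eq. (4). Each record is assumed to hold
    (feature_vector b, pn_value, grasp_pose, open_close_params)."""
    for b, pn_b, pose, params in success_records:
        if abs(pn_b - pn) <= pn_band and similarity(v, b) >= sb_thresh:
            return pose, params   # reuse: skip units 53 and 54 entirely
    return None                   # no similar past success: run the full search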

The gripping operation calculation unit 53 computes the gripping position and orientation from the extracted measurement features. This may be done by recognizing the position and orientation of the workpiece through matching (searching) the measurement features against the workpiece shape information in the DB 1 and computing the gripping position and orientation corresponding to the workpiece; interference with the surroundings may additionally be considered using the hand shape in the DB 1. The matching is realized, for example, by chamfer matching of edge features for a two-dimensional image, or by the ICP (Iterative Closest Point) method for a three-dimensional point cloud.

Many methods have been proposed for matching workpiece shape information against measurement data in the context of picking from a large number of workpieces; Patent Documents 2 and 3 above give examples of such processing. Any method that can be realized in this configuration may be used.

For example, in Patent Document 2, the workpieces in a tray are imaged by an image input camera, and an object recognition device processes the video signal to obtain a line-segment image from the contour lines. This line-segment image is matched against a matching model, and the position of the workpiece, or of a specific part of the workpiece, is recognized. Matching against a specific part of the workpiece requires very little matching time.
For example, in Patent Document 3, an overall model of the workpiece and several partial models are taught together with their mutual positional relations. The robot is moved to the measurement position and the overall model is detected by the sensor. The existence region of each partial model is computed, and detection of each partial model is attempted within that range. The three-dimensional position and orientation of the workpiece corresponding to the detected overall model is acquired, and a reference gripping posture is determined.

The gripping position and orientation may also be computed without using workpiece shape information, by matching the measurement features against the hand shape information in the DB 1. Several hand matching methods have likewise been proposed in the past; Patent Document 4 above gives an example of such processing, and any method that can be realized in this configuration may be used.

For example, in Patent Document 4, three-dimensional information of the supply area is generated from three-dimensional measurement data of the supply area of the workpieces to be gripped by the robot, and, using a pre-stored gripping region consisting of a gripping mechanism region and a gripped-portion region, a region in which an object exists over the whole gripped-portion region and no object exists in the gripping mechanism region is extracted from the three-dimensional information as a grippable region.

As "another hand matching method": for example, if the hand is a suction type, places where the matching between the model and the feature point group is high are taken as candidates, and when the agreement with the model is low, for instance because the feature surface is small or has a hole in it, the matching score of the candidate is lowered. Such matching can be realized by point-cloud matching such as ICP (Iterative Closest Point) for a 3D model, and, for a 2D model, by template matching or by matching through convolution with the model regarded as a filter. For the pinching or expanding types, interference with the surrounding environment can additionally be considered at the same time; for the pinching type, for example, if measurement data falls inside the region defined by the approach depth and the height and width of the hand, the matching score is lowered. The hand matching controller either computes, for each of multiple extracted features, the gripping position and orientation with the highest matching score and takes these as gripping position/orientation candidates, or computes, for a single extracted feature, multiple gripping positions and orientations whose matching scores exceed a predetermined threshold and takes those as the candidates. Through such processing, gripping positions and orientations can be defined dynamically even when the workpiece shape is unknown. Any method that can be realized in this configuration may be used.
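The pinch-type interference check described above might look like the following Python sketch. Treating the finger sweep region as a set of rectangular windows in a height map is an assumed simplification; the window layout and thresholding are not the patent's prescription.

import numpy as np

def pinch_clearance_penalty(height_map, finger_windows, finger_tip_height):
    """If measured data falls inside the region swept by the descending
    fingers (defined by the approach depth and the hand's width), lower the
    candidate's matching score. finger_windows holds assumed (y0, y1, x0, x1)
    windows where the open fingers will descend."""
    penalty = 0
    for y0, y1, x0, x1 in finger_windows:
        region = height_map[y0:y1, x0:x1]
        penalty += int(np.count_nonzero(region > finger_tip_height))
    return penalty  # subtract from the grasp candidate's matching score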

As described above, the computation of the gripping position and orientation is realized by matching the workpiece shape information against the measurement data, or the hand shape information against the measurement data. However, matching based on the workpiece shape takes processing time and requires prior adjustment for each part, while matching based on the hand shape carries a risk of gripping failure.

Therefore, speed is normally emphasized and the computation uses only the matching between the measurement features and the hand shape information in the DB 1; but when P/N is low and the grip may be unstable, the workpiece shape information is used to search for the gripping position and orientation. This makes it possible to search for the gripping position and orientation with higher accuracy, and to grip easy-to-grip locations based on the workpiece shape.

Even without using the workpiece shape, a process that is normally omitted can be added when P/N is low: preferentially gripping where the orthogonality between the hand model rotation and the shape edges of the workpiece is high (for example, at or above a predetermined threshold). This reduces cases where the gripping operation just misses the workpiece. Also, by gripping a part close to the centroid of the extracted segment (for example, S1 in FIG. 2), the possibility that the workpiece is unstable after gripping and drops can be reduced. In this way, based on the ease-of-gripping estimate of the measurement feature identification unit 52, increasing the number of processing steps yields an operation more likely to grip successfully.

The opening/closing operation calculation unit 54 adjusts gripping parameters of the hand 3 such as the opening/closing amount, force (gripping force), and speed (operation speed). For a pinching hand, the opening width just before gripping is important when considering interference with the surroundings. Hand shape information for the hand opened at different opening amounts is stored in the DB 1; the measurement feature and the measurement data around it are matched against the hand shape data of the different opening amounts, and the opening amount with the highest evaluation value is selected. The matching of the hand shape data is realized by the method shown above as "another hand matching method". The operation speed and gripping force are basically set to their maximum to improve the takt time; but when the takeout success rate is estimated to be low, the operation speed and gripping force are lowered. The operation then becomes more careful, which prevents the workpiece from being flicked away and the takeout from failing during the opening/closing operation of the hand 3.
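Selecting the pre-grasp opening amount might be sketched as follows in Python. match_score stands in for the hand-shape matching described above and is assumed given; storing one hand model per opening amount in a dictionary is an illustrative layout.

def choose_opening(scene_patch, hand_models_by_opening):
    """Match the measured data around the feature against hand-shape models
    stored at several opening amounts; keep the best-scoring opening."""
    return max(hand_models_by_opening,
               key=lambda w: match_score(scene_patch, hand_models_by_opening[w]))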

Having described the processing inside the information processing unit 5, the whole processing is now summarized. FIG. 4 is an operation flowchart of the information processing unit 5 when a range image is used as the measurement data.

In the measurement feature extraction unit 51, edge detection on the range image extracts the shape edges of the objects (step ST1), and segmenting the regions separated by these edges (step ST2) extracts partial surface regions of the workpieces 7. Candidate selection is then performed by computation based on information such as height (distance) and whether there are surfaces nearby (step ST3); for example, the surface at the highest position (shortest distance from the sensor 2) is selected as the candidate. From the extracted candidate, information such as the area of the region, the principal-axis direction, and the main normal direction and area of the surface is extracted and a multidimensional feature vector is computed (step ST4); this is the measurement feature.

The measurement feature identification unit 52 evaluates the characteristics of the multidimensional feature vector, i.e., the measurement feature, by the P/N of Eq. (1) above (step ST5). When P/N is low (below a predetermined threshold) the feature is judged hard to grip; when it is high (at or above the threshold) it is judged easy to grip (step ST6).
When the feature is judged hard to grip because P/N is low, the gripping operation calculation unit 53 adds gripping position/orientation search processing. The gripping position/orientation search is normally realized by the hand model matching of the method shown above as "another hand matching method" (step ST8). As additional evaluation processing, places where the orthogonality between the hand model rotation and the shape edges of the workpiece is high are gripped preferentially, and a part close to the centroid of the extracted segment is gripped to reduce the possibility of dropping after gripping. If the system allows workpiece-dependent adjustment, a higher-accuracy gripping position/orientation search using the workpiece shape information may also be performed (step ST9).

When P/N is low in this way, the opening/closing operation calculation unit 54 reduces the opening/closing amount of the hand 3 (step ST12) and also lowers the opening/closing (gripping) operation speed and gripping force for optimization (step ST13). This reduces takeout failures such as flicking the workpiece away during the gripping approach, or dropping the workpiece after gripping because the gripping position and orientation were unstable.

When P/N is high in step ST6, the measurement feature identification unit 52 performs processing that emphasizes speed. It first judges whether there is a similar high-P/N measurement feature (step ST7). If there is none, it is judged that no similar measurement feature succeeded in the past. In this case, the gripping operation calculation unit 53 performs speed-oriented processing: it searches for the gripping position and orientation (step ST10) without the additional evaluation processing. The opening/closing operation calculation unit 54 likewise only computes the opening/closing amount (step ST14) and sets the opening/closing (gripping) speed and gripping force to their maximum (step ST15).

If there is a similar high-P/N measurement feature in step ST7 of the measurement feature identification unit 52, it is judged that a similar measurement feature succeeded in the past. In this case, the gripping operation calculation unit 53 reuses the gripping position and orientation of the past similar measurement feature (step ST11), and the opening/closing operation calculation unit 54 reuses its opening/closing amount and its opening/closing (gripping) operation speed and gripping force (step ST16), skipping the computations and speeding up the processing.
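The decision flow of FIG. 4 might be summarized in Python as follows, reusing pn_ratio, find_reusable_grasp, and choose_opening from the earlier sketches. careful_pose_search and fast_pose_search stand in for the searches of steps ST8-ST10, and the db attributes and threshold are assumptions.

def plan_grasp(v, db, pn_thresh=1.0):
    """Decision flow of Fig. 4 (ST5-ST16); helper names are illustrative."""
    pn = pn_ratio(v, db.p_rep, db.n_rep)                       # ST5
    if pn < pn_thresh:                                         # ST6: hard to grip
        pose = careful_pose_search(v, db)                      # ST8/ST9
        params = {"opening": "reduced", "speed": "low", "force": "low"}  # ST12/ST13
        return pose, params
    reuse = find_reusable_grasp(v, pn, db.success_records)     # ST7
    if reuse is not None:                                      # ST11/ST16: reuse
        return reuse
    pose = fast_pose_search(v, db)                             # ST10
    params = {"opening": choose_opening(db.patch_around(v), db.hand_models),  # ST14
              "speed": "max", "force": "max"}                  # ST15
    return pose, params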

Through the above processing of the information processing unit 5, the gripping position/orientation and opening/closing parameters matched to the appearance of the workpiece being gripped are output to the control unit 6. The control unit 6 converts them into operation control information for the robot and the hand, the operations of the robot 4 and the hand 3 are controlled, and the workpiece takeout operation is realized.

Whether the takeout of a workpiece succeeded or failed can be judged from the opening/closing amount of the hand 3. After the gripping, if the opening/closing amount of the hand 3 is 0 (fully closed), the hand is judged not to be holding a workpiece and a takeout failure is reported to the DB 1; if the opening/closing amount is not 0, a takeout success is reported to the DB 1. Prior information may also be stored in the DB 1 in advance, such as treating the takeout as successful when the opening/closing amount is M or more, based on the size of the workpiece to be gripped. When the takeout succeeds, the DB management processing unit mentioned above (not shown) reflects the multidimensional feature vector of the extracted measurement feature in the multidimensional feature vector representing takeout success. Taking the takeout count as the i-th trial (i = 1, 2, ..., n), this can be obtained by the following equation.

p_{i+1} = p_i + (v_i - p_i)/l    (5)

That is, the difference (v_i - p_i) between the feature vector p_i currently representing takeout success and the feature vector v_i of the successfully taken-out measurement feature, divided by the number of trials l, is added to p_i, pulling the representative toward the new success. p_{i+1} need not be given by Eq. (5) exactly; any update obtained from p_i and v_i will do.

When the takeout of the workpiece fails, the multidimensional feature vector v_i of the extracted measurement feature is reflected in the multidimensional feature vector n_i representing takeout failure. Taking the takeout count as the i-th trial (i = 1, 2, ..., n), this can be obtained by the following equation.

n_{i+1} = n_i + (v_i - n_i)/l    (6)

n_{i+1} need not be given by Eq. (6) exactly; any update obtained from n_i and v_i will do.
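Putting the success judgment and the updates of Eqs. (5) and (6) together gives a short Python sketch. The minimum-opening threshold m_min (the M of the text) is workpiece-dependent and assumed here, as is the db object holding the two representative vectors.

def record_trial(db, v, opening, trial, m_min=1e-3):
    """Judge success from the final hand opening (nonzero, or at least a
    workpiece-dependent minimum M), then fold v into the matching
    representative vector via the incremental updates of Eqs. (5)-(6)."""
    if opening >= m_min:                       # hand did not close empty: success
        db.p_rep += (v - db.p_rep) / trial     # Eq. (5): pull p toward v
    else:                                      # takeout failure
        db.n_rep += (v - db.n_rep) / trial     # Eq. (6): pull n toward v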

In the initial state when takeout has just started, the relationship between takeout success/failure and measurement features has not yet been learned sufficiently. In this initial state, the gripping position/orientation and opening/closing parameter computations for the classification "P/N is high and there is no similar high-P/N measurement feature" in FIG. 4 (ST10, ST14, ST15) are performed. As failure cases accumulate, the classification performance of the measurement feature identification unit 52 improves; until then, data may simply be accumulated in the DB 1 while the feature is always classified as "P/N high, no similar high-P/N measurement feature". For example, the measurement feature identification unit 52 may start classifying once the number of trials exceeds n.

By using the present invention according to Embodiment 1, takeout failures are reduced automatically as takeout continues, and processing can be skipped for higher speed. NG judgments (workpiece takeout success/failure) do not enlarge the region that cannot be picked, so the retries caused by such a region do not increase. When a gripping position/orientation search that does not use the workpiece shape is employed, such as the hand model matching shown above as "another hand matching method", the device can, even without workpiece shape information and with an unknown object, come to identify the extracted measurement features and take out workpieces appropriately as takeout continues.

Embodiment 2.
FIG. 5 is a block diagram showing a workpiece takeout device according to Embodiment 2 of the present invention. In this embodiment, a takeout failure classification unit 55 classifies takeout failures based on the opening/closing amount of the hand 3 and stores the result in the DB 1. By using not only the takeout success/failure information but also the takeout failure classification, the computation of the gripping position/orientation and opening/closing parameters can be skipped, or only the parameters corresponding to the cause of failure can be computed.

The takeout failure classification unit 55 takes the opening/closing amount of the hand 3 during workpiece takeout as input and classifies a takeout failure into: the hand missed the workpiece, the workpiece was flicked away by the opening/closing operation, or the workpiece was dropped after gripping.

In a miss, the opening/closing amount changes smoothly during the opening/closing operation, and the opening/closing amount becomes 0 after the operation completes. "Changes smoothly" means, for example, that if the opening/closing motion is realized by trapezoidal control, the opening/closing follows a speed profile equal to that trapezoidal control; a case where the hand begins to grip a workpiece midway and the opening/closing operation pauses does not count as a miss. This is confirmed by sampling the opening/closing amount during the operation and comparing it against a model of the opening/closing motion; for this miss determination, the model of the opening/closing motion is stored in the DB 1 in advance. The DB 1 also stores together the multidimensional feature vector of the measurement feature at that time, takeout failure, and the label "miss" (failure classification).

A miss means that the hand closed on air without touching the workpiece. Since such an empty grip arises from an error in the gripping position/orientation search, when the measurement feature identification unit 52 next extracts, by a similarity evaluation such as Eq. (4) above, a measurement feature similar to the one that caused the miss, the gripping operation calculation unit 53 increases the number of gripping position/orientation search steps and performs a more precise search. Meanwhile, the opening/closing parameters of the opening/closing operation calculation unit 54 that are unrelated to the miss are reused from the missed attempt. This raises the gripping success rate while skipping processing for higher speed.

In a flick-away, the opening/closing operation pauses during the motion, and the opening/closing amount becomes 0 after the operation ends. A flick-away means that the hand touched the workpiece but could not grip it. The DB 1 stores together the multidimensional feature vector of the measurement feature, takeout failure, and the label "flicked away" (failure classification).

A flick-away might be improved by changing the gripping position and orientation, but it may be prevented simply by lowering the opening/closing (operation) speed and the opening/closing force of the hand. Therefore, when the measurement feature identification unit 52 extracts, by a similarity evaluation such as Eq. (4) above, a measurement feature similar to one that caused a flick-away, the opening/closing operation calculation unit 54 lowers only the opening/closing (gripping) force and speed of the hand and reuses the other parameters from the flick-away attempt; this raises the gripping success rate while skipping processing for higher speed. If the takeout still fails, the DB 1 stores together the multidimensional feature vector of the measurement feature, takeout failure, flick-away, and "opening/closing parameters already adjusted". After that, when a measurement feature similar to the one that caused the flick-away is extracted, the gripping position/orientation and the opening/closing amount are searched with high accuracy using an increased number of processing steps, which raises the gripping success rate.

A drop after gripping means the case where the opening/closing amount is not 0 at the end of the opening/closing operation, but becomes 0 afterwards, during the transfer that lifts the workpiece out of the supply box 8. To measure this, the hand opening/closing amount is sampled from the start of the gripping operation to the completion of the transfer.

A drop after gripping is strongly influenced by the gripping position and orientation; it tends to occur especially when a part far from the centroid of the workpiece has been gripped. Therefore, when the measurement feature identification unit 52 extracts, by a similarity evaluation such as Eq. (4) above, a measurement feature similar to one that caused a drop after gripping, the gripping operation calculation unit 53 increases the number of gripping position/orientation search steps; in particular, if the workpiece shape is not used, the gripping position/orientation is corrected toward the centroid of the extracted segment, and if the workpiece shape is used, toward the centroid of the workpiece. The gripping position/orientation correction is realized by adding the Euclidean distance from the centroid to the evaluation value in the hand model matching. The other parameters are reused from the drop attempt. This raises the gripping success rate while skipping processing for higher speed.
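Putting the three criteria above together, the failure classification of Embodiment 2 might be sketched in Python as follows. The paused_mid_close flag is assumed precomputed by comparing the sampled opening profile with the stored opening/closing motion model (e.g., the trapezoidal profile).

def classify_failure(final_opening, opening_at_transfer_end, paused_mid_close):
    """Classify a takeout failure from the sampled hand-opening amounts:
    'miss'  - closing tracked the motion model smoothly and ended at 0;
    'flick' - closing paused (contact with a workpiece) but still ended at 0;
    'drop'  - nonzero after closing, but 0 by the end of the transfer."""
    if final_opening == 0:
        return "flick" if paused_mid_close else "miss"
    if opening_at_transfer_end == 0:
        return "drop"
    return None   # no failure observed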

As described above, according to Embodiment 2, the takeout failure classification unit 55 classifies the cause of failure based on the opening/closing amount of the hand 3, and only the parameters corresponding to that cause are readjusted, raising the gripping success rate while skipping processing and keeping the cycle fast.

Embodiment 3.
FIG. 6 is a block diagram showing a workpiece takeout device according to Embodiment 3 of the present invention. In this embodiment, takeout success/failure determination and failure classification are realized based on measurement information from a second sensor 9 installed outside the supply box 8.

With the second sensor 9, whether a workpiece is being gripped is determined by measuring whether the hand has closed completely after the opening/closing operation. This is realized as follows: when the hand 3 passes a fixed position relative to the supply box 8 after gripping, the second sensor 9 measures it there, and the result is compared by difference against data of the hand when nothing is gripped. For this purpose, measurement data of the empty hand is stored in the DB 1 in advance. Takeout success or failure can thus be determined.
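As an illustrative sketch of the empty-hand reference comparison (the depth-image interface and the threshold are assumptions):

```python
import numpy as np

def hand_holds_workpiece(measurement, empty_hand_reference, diff_threshold=500.0):
    """Compare the second sensor's measurement at the checkpoint against the
    stored empty-hand reference (e.g., a depth patch around the fingertips)."""
    diff = np.abs(measurement.astype(float) - empty_hand_reference.astype(float))
    return float(diff.sum()) > diff_threshold  # True -> something is held
```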

When a takeout fails, determining whether the workpieces stacked in bulk in the supply box 8 vibrated or moved during the opening/closing operation makes it possible to distinguish an empty grasp that never touched a workpiece from a failure caused by interference with a workpiece. This determination is realized by measuring the bulk-stacked workpieces immediately before the gripping approach and checking whether the workpieces measured during gripping have changed position; change-region detection by image differencing, edge detection combined with edge-movement determination, optical-flow analysis, and the like may be used.
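A minimal change-region check along the lines of the differencing option above, using OpenCV (the pixel and area thresholds are illustrative assumptions):

```python
import cv2

def workpieces_moved(before, during, pixel_thresh=25, area_thresh=200):
    """Detect whether the pile changed between the image taken just before
    the gripping approach and an image taken during the gripping motion.

    before, during: grayscale images of the bulk-stacked workpieces.
    """
    diff = cv2.absdiff(before, during)
    _, mask = cv2.threshold(diff, pixel_thresh, 255, cv2.THRESH_BINARY)
    return cv2.countNonZero(mask) > area_thresh  # True -> the hand disturbed the pile
```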

When the hand swings through empty air without touching a workpiece, it is highly likely that noise in the measurement data from the first sensor 2 introduced an error into the height information. Accordingly, when the measurement feature identification unit 52, through a similarity evaluation such as equation (4) above, extracts a measurement feature similar to one that previously caused an empty grasp, a measurement command is sent to the first sensor 2 to increase the number of measurements. Smoothing over the multiple measurements suppresses the noise, and using the smoothed result as the measurement data raises the gripping success rate.
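A sketch of the multi-measurement smoothing (the `sensor_read` callable stands in for a measurement command to the first sensor and is an assumed interface):

```python
import numpy as np

def smoothed_measurement(sensor_read, n_measurements=5):
    """Acquire several range images and average them pixel-wise to suppress
    measurement noise; the averaged image is then used as the measurement data."""
    stack = np.stack([sensor_read() for _ in range(n_measurements)], axis=0)
    return stack.mean(axis=0)  # a median would be a reasonable, more robust alternative
```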

Interference with a workpiece is strongly affected by the opening/closing amount of the hand. Accordingly, when the measurement feature identification unit 52, through a similarity evaluation such as equation (4) above, extracts a measurement feature similar to one that previously interfered with a workpiece, the opening/closing operation calculation unit 54 computes the hand opening/closing amount more precisely: the number of hand models with different opening amounts is increased, and hand model matching is performed by the "alternative hand matching method" described above. The other parameters are reused from when the interference occurred.
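A sketch of matching over a denser set of opening amounts (the `match_hand_model` scorer is a hypothetical stand-in for the document's hand model matching):

```python
import numpy as np

def best_opening_amount(scene, match_hand_model, lo=10.0, hi=60.0, n_models=25):
    """Evaluate hand models at many opening amounts and keep the best-scoring one.

    match_hand_model(scene, opening) is assumed to return a matching score
    (higher is better); raising n_models refines the estimate at the cost
    of more processing.
    """
    openings = np.linspace(lo, hi, n_models)
    scores = [match_hand_model(scene, o) for o in openings]
    return float(openings[int(np.argmax(scores))])
```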

As described above, according to Embodiment 3, the takeout failure classification unit 55 classifies the cause of failure based on the measurement data of the second sensor 9, and only the parameters corresponding to that cause are readjusted, raising the gripping success rate while skipping processing and keeping the cycle fast.

Embodiment 4.
FIG. 7 is a block diagram showing a workpiece takeout device according to Embodiment 4 of the present invention. In this embodiment, the takeout failure classification unit 55 classifies failures from the opening/closing amount of the hand 3 together with the measurement data of the second sensor 9. Takeout failures can thereby be classified into four types: an empty grasp, a missed grasp due to interference, flipping, and a drop after gripping.
For an empty grasp, the smoothed result of multiple measurements triggered by a measurement command is used as the measurement data, by the method described above.
For a missed grasp due to interference, the opening/closing operation calculation unit 54 adjusts the opening/closing amount of the hand with higher accuracy.
For flipping, the opening/closing operation calculation unit 54 adjusts the opening/closing amount and the opening/closing speed of the hand.
For a drop after gripping, the gripping operation calculation unit 53 adjusts the gripping position and orientation.
As described above, according to Embodiment 4, increasing the number of failure classes and adjusting only the parameters suited to each raises the gripping success rate while skipping processing and keeping the cycle fast; a minimal dispatch over these four classes is sketched below.

The present invention is not limited to the embodiments described above, and it goes without saying that it includes all possible combinations of the features of these embodiments.

1 DB, 2 (first) sensor, 3 hand, 4 robot, 5 information processing unit, 6 control unit, 7 workpiece, 8 supply box, 9 second sensor, 51 measurement feature extraction unit, 52 measurement feature identification unit, 53 gripping operation calculation unit, 54 opening/closing operation calculation unit, 55 takeout failure classification unit.

Claims (9)

1. A workpiece takeout device comprising:
a DB that stores at least measurement features of workpieces and takeout success/failure information;
a sensor that measures workpieces stacked in bulk;
a hand that grips the stacked workpieces;
a robot that moves the hand to a gripping position and orientation;
an information processing unit that calculates, from the measurement data, the gripping position and orientation and at least one item of operation control information among an opening/closing amount, a gripping force, and an operation speed; and
a control unit that commands the robot to perform a gripping operation and commands the hand to perform an opening/closing operation in accordance with the operation control information from the information processing unit,
wherein the information processing unit includes:
a measurement feature extraction unit that extracts, from the measurement data, measurement features for calculating the gripping position and orientation;
a measurement feature identification unit that identifies the measurement features against the information in the DB;
a gripping operation calculation unit that calculates, from the measurement features and the information in the DB, the gripping position and orientation of the hand gripping a workpiece; and
an opening/closing operation calculation unit that calculates, from the measurement features, the gripping position and orientation, and the information in the DB, parameters related to the opening/closing of the hand,
wherein the measurement feature identification unit estimates a takeout success rate for the extracted measurement features based on the past measurement features and takeout success/failure information stored in the DB, and
the opening/closing operation calculation unit adjusts at least one of the opening/closing amount, the gripping force, and the operation speed of the hand according to the estimated takeout success rate.
2. The workpiece takeout device according to claim 1, wherein the information processing unit further includes a takeout failure classification unit that classifies the cause of a takeout failure from the state of at least one of the hand and the workpiece and stores it in the DB as failure classification information, and
wherein, based on the past measurement features and failure classification information stored in the DB, among the gripping position and orientation calculated by the gripping operation calculation unit and the opening/closing amount, gripping force, and operation speed calculated by the opening/closing operation calculation unit, the parameters that affect the failure are adjusted.
3. The workpiece takeout device according to claim 1 or 2, wherein, when the measurement feature extracted by the measurement feature extraction unit is identified by the measurement feature identification unit as similar to a past measurement feature, and the gripping success rate of the past measurement feature is estimated to be high, the gripping operation calculation unit and the opening/closing operation calculation unit reuse, for at least one of the gripping position and orientation, the opening/closing amount, the gripping force, and the operation speed, the same values used when the similar past measurement feature was gripped.
4. The workpiece takeout device according to any one of claims 1 to 3, wherein, when the takeout success rate of the measurement feature extracted by the measurement feature extraction unit is estimated by the measurement feature identification unit to be low, the number of gripping position/orientation search processes performed by the gripping operation calculation unit is increased.
5. The workpiece takeout device according to any one of claims 1 to 4, wherein, when the takeout success rate of the measurement feature extracted by the measurement feature extraction unit is estimated by the measurement feature identification unit to be low, the gripping operation identification unit lowers at least one of the gripping force and the operation speed.
6. The workpiece takeout device according to any one of claims 1 to 5, wherein the takeout success/failure information is estimated from the opening/closing amount of the hand.
7. The workpiece takeout device according to any one of claims 2 to 6, wherein the failure classification information is estimated from the opening/closing amount of the hand.
8. The workpiece takeout device according to any one of claims 1 to 7, further comprising a second sensor that measures the motion of the hand at the moment of takeout, wherein the takeout success or failure is estimated from measurement data obtained by the second sensor.
9. The workpiece takeout device according to any one of claims 2 to 8, further comprising a second sensor that measures the motion of the hand at the moment of takeout, wherein the failure classification is estimated from measurement data obtained by the second sensor.
JP2011193918A 2011-09-06 2011-09-06 Work picking device Active JP5623358B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2011193918A JP5623358B2 (en) 2011-09-06 2011-09-06 Work picking device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2011193918A JP5623358B2 (en) 2011-09-06 2011-09-06 Work picking device

Publications (2)

Publication Number Publication Date
JP2013052490A true JP2013052490A (en) 2013-03-21
JP5623358B2 JP5623358B2 (en) 2014-11-12

Family

ID=48129943

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2011193918A Active JP5623358B2 (en) 2011-09-06 2011-09-06 Work picking device

Country Status (1)

Country Link
JP (1) JP5623358B2 (en)



Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000304509A (en) * 1999-04-21 2000-11-02 Matsushita Electric Works Ltd Method and device for specifying object
JP2004268160A (en) * 2003-03-05 2004-09-30 Sharp Corp Robot hand and its control method
JP2007245283A (en) * 2006-03-15 2007-09-27 Nissan Motor Co Ltd Workpiece attitude detecting device, workpiece attitude detecting method, picking system, and picking method

Cited By (79)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9469035B2 (en) 2011-06-29 2016-10-18 Mitsubishi Electric Corporation Component supply apparatus
JP2015089589A (en) * 2013-11-05 2015-05-11 ファナック株式会社 Method and apparatus for extracting bulked article by using robot
CN104608126A (en) * 2013-11-05 2015-05-13 发那科株式会社 Apparatus and method for picking up article randomly piled using robot
US9415511B2 (en) 2013-11-05 2016-08-16 Fanuc Corporation Apparatus and method for picking up article randomly piled using robot
CN104608126B (en) * 2013-11-05 2016-08-24 发那科株式会社 The apparatus and method of bulk product are taken out with robot
JP2015100866A (en) * 2013-11-22 2015-06-04 三菱電機株式会社 Robot simulation device, program, recording medium and method
US11904469B2 (en) 2015-07-31 2024-02-20 Fanuc Corporation Machine learning device, robot controller, robot system, and machine learning method for learning action pattern of human
US11780095B2 (en) 2015-07-31 2023-10-10 Fanuc Corporation Machine learning device, robot system, and machine learning method for learning object picking operation
JP2017064910A (en) * 2015-07-31 2017-04-06 ファナック株式会社 Machine learning device for learning taking-out operation of workpiece, robot system, and machine learning method
CN113199483A (en) * 2015-07-31 2021-08-03 发那科株式会社 Robot system, robot control method, machine learning device, and machine learning method
US10717196B2 (en) 2015-07-31 2020-07-21 Fanuc Corporation Machine learning device, robot system, and machine learning method for learning workpiece picking operation
JP2020168719A (en) * 2015-07-31 2020-10-15 ファナック株式会社 Robot system and control method of robot
JP7100426B2 (en) 2015-07-31 2022-07-13 ファナック株式会社 Machine learning device, robot system and machine learning method to learn the operation of taking out the work
US10807235B2 (en) 2015-07-31 2020-10-20 Fanuc Corporation Machine learning device, robot controller, robot system, and machine learning method for learning action pattern of human
DE102016015873B3 (en) * 2015-07-31 2020-10-29 Fanuc Corporation Machine learning apparatus, robot system, and machine learning system for learning a workpiece pick-up operation
DE102016009030B4 (en) 2015-07-31 2019-05-09 Fanuc Corporation Machine learning device, robot system and machine learning system for learning a workpiece receiving operation
JP2022145915A (en) * 2015-07-31 2022-10-04 ファナック株式会社 Inference method, inference program, inference device, learning method, learning program, learning device, and model generation method
CN106393102A (en) * 2015-07-31 2017-02-15 发那科株式会社 Machine learning device, robot system, and machine learning method
JP2017030135A (en) * 2015-07-31 2017-02-09 ファナック株式会社 Machine learning apparatus, robot system, and machine learning method for learning workpiece take-out motion
JP2018536550A (en) * 2015-12-02 2018-12-13 クゥアルコム・インコーポレイテッドQualcomm Incorporated Active camera movement determination for object position and range in 3D space
US10639792B2 (en) 2016-03-03 2020-05-05 Google Llc Deep machine learning methods and apparatus for robotic grasping
US11548145B2 (en) 2016-03-03 2023-01-10 Google Llc Deep machine learning methods and apparatus for robotic grasping
JP2019508273A (en) * 2016-03-03 2019-03-28 グーグル エルエルシー Deep-layer machine learning method and apparatus for grasping a robot
JP2019509905A (en) * 2016-03-03 2019-04-11 グーグル エルエルシー Deep machine learning method and apparatus for robot gripping
JP2019217632A (en) * 2016-03-03 2019-12-26 グーグル エルエルシー Deep machine learning method and apparatus for robotic grasping
US10946515B2 (en) 2016-03-03 2021-03-16 Google Llc Deep machine learning methods and apparatus for robotic grasping
US11045949B2 (en) 2016-03-03 2021-06-29 Google Llc Deep machine learning methods and apparatus for robotic grasping
JP2017185577A (en) * 2016-04-04 2017-10-12 ファナック株式会社 Machine learning device for performing learning by use of simulation result, mechanical system, manufacturing system and machine learning method
US10317854B2 (en) 2016-04-04 2019-06-11 Fanuc Corporation Machine learning device that performs learning using simulation result, machine system, manufacturing system, and machine learning method
CN109997108A (en) * 2016-07-18 2019-07-09 L·奥德纳 Image training robot motion arm
JP7145843B2 (en) 2016-07-18 2022-10-03 ラエル オドナー, Robot manipulator training
JP2019531908A (en) * 2016-07-18 2019-11-07 ラエル オドナー, Robotic manipulator training
JP2018034284A (en) * 2016-09-02 2018-03-08 株式会社プロドローン Robot arm and unmanned aircraft including the same
US10471596B2 (en) 2016-09-02 2019-11-12 Prodrone Co., Ltd. Robot arm and unmanned aerial vehicle equipped with the robot arm
WO2018042692A1 (en) * 2016-09-02 2018-03-08 株式会社プロドローン Robot arm and unmanned aircraft provided with same
WO2018042693A1 (en) * 2016-09-02 2018-03-08 株式会社プロドローン Robot arm and unmanned aircraft provided with same
JP2018034285A (en) * 2016-09-02 2018-03-08 株式会社プロドローン Robot arm and unmanned aircraft including the same
JP2018039059A (en) * 2016-09-05 2018-03-15 国立大学法人信州大学 Gripping device, gripping method and program
CN107866809A (en) * 2016-09-27 2018-04-03 发那科株式会社 Learn the machine learning device and machine learning method in optimal Article gripping path
US10692018B2 (en) 2016-09-27 2020-06-23 Fanuc Corporation Machine learning device and machine learning method for learning optimal object grasp route
JPWO2018092860A1 (en) * 2016-11-16 2019-04-11 三菱電機株式会社 Interference avoidance device
DE102018000730B4 (en) * 2017-02-06 2019-10-31 Fanuc Corporation Workpiece receiving device and workpiece receiving method for improving the picking operation of a workpiece
US10603790B2 (en) 2017-02-06 2020-03-31 Fanuc Corporation Workpiece picking device and workpiece picking method for improving picking operation of workpieces
US10350752B2 (en) 2017-03-13 2019-07-16 Fanuc Corporation Robot system, measurement data processing device and measurement data processing method for picking out workpiece using measurement data corrected by means of machine learning
JP2018149628A (en) * 2017-03-13 2018-09-27 ファナック株式会社 Robot system taking-up work-piece by use of measurement data corrected by machine learning, measurement data processing device and measurement data processing method
US20180281181A1 (en) * 2017-03-31 2018-10-04 Fanuc Corporation Robot controller, machine learning device and machine learning method
JP2018171684A (en) * 2017-03-31 2018-11-08 ファナック株式会社 Robot control device, machine learning device and machine learning method
US10549422B2 (en) 2017-03-31 2020-02-04 Fanuc Corporation Robot controller, machine learning device and machine learning method
US11565407B2 (en) 2017-05-31 2023-01-31 Preferred Networks, Inc. Learning device, learning method, learning model, detection device and grasping system
US11034018B2 (en) 2017-05-31 2021-06-15 Preferred Networks, Inc. Learning device, learning method, learning model, detection device and grasping system
JP7045139B2 (en) 2017-06-05 2022-03-31 株式会社日立製作所 Machine learning equipment, machine learning methods, and machine learning programs
JP2018202550A (en) * 2017-06-05 2018-12-27 株式会社日立製作所 Machine learning device, machine learning method, and machine learning program
CN109409398B (en) * 2017-08-17 2022-04-26 佳能株式会社 Image processing apparatus, image processing method, and storage medium
CN109409398A (en) * 2017-08-17 2019-03-01 佳能株式会社 Image processing apparatus, image processing method and storage medium
JP2019036167A (en) * 2017-08-17 2019-03-07 キヤノン株式会社 Image processing apparatus and image processing method
JP2020532440A (en) * 2017-09-01 2020-11-12 カリフォルニア大学The Regents of the University of California Robot systems and methods for robustly gripping and targeting objects
DE102018215826B4 (en) 2017-09-25 2023-07-13 Fanuc Corporation Robot system and workpiece gripping method
JP2020110920A (en) * 2017-09-25 2020-07-27 ファナック株式会社 Device, robot system, model generation method, and model generation program
US11845194B2 (en) 2017-09-25 2023-12-19 Fanuc Corporation Robot system and workpiece picking method
JP2019076972A (en) * 2017-10-20 2019-05-23 株式会社安川電機 Automation apparatus
JP2019093461A (en) * 2017-11-20 2019-06-20 株式会社安川電機 Holding system, learning device, holding method and model manufacturing method
US11338435B2 (en) 2017-11-20 2022-05-24 Kabushiki Kaisha Yaskawa Denki Gripping system with machine learning
DE102018009008B4 (en) 2017-11-22 2022-03-31 Fanuc Corporation Control device and machine learning device
CN109814615A (en) * 2017-11-22 2019-05-28 发那科株式会社 Control device and machine learning device
CN109814615B (en) * 2017-11-22 2021-03-02 发那科株式会社 Control device and machine learning device
CN108340367A (en) * 2017-12-13 2018-07-31 深圳市鸿益达供应链科技有限公司 Machine learning method for mechanical arm crawl
JPWO2019239562A1 (en) * 2018-06-14 2021-04-22 ヤマハ発動機株式会社 Machine learning device and robot system equipped with it
WO2019239562A1 (en) * 2018-06-14 2019-12-19 ヤマハ発動機株式会社 Machine learning device and robot system provided with same
CN112135719B (en) * 2018-06-14 2023-08-22 雅马哈发动机株式会社 Machine learning device and robot system provided with same
CN112135719A (en) * 2018-06-14 2020-12-25 雅马哈发动机株式会社 Machine learning device and robot system provided with same
US11945115B2 (en) 2018-06-14 2024-04-02 Yamaha Hatsudoki Kabushiki Kaisha Machine learning device and robot system provided with same
JP7133017B2 (en) 2018-07-24 2022-09-07 株式会社Fuji END EFFECTOR SELECTION METHOD AND SELECTION SYSTEM
JPWO2020021643A1 (en) * 2018-07-24 2021-08-02 株式会社Fuji End effector selection method and selection system
WO2020021643A1 (en) * 2018-07-24 2020-01-30 株式会社Fuji End effector selection method and selection system
JP7229115B2 (en) 2019-07-09 2023-02-27 株式会社日立製作所 Robot controller and robot
JP2021010995A (en) * 2019-07-09 2021-02-04 株式会社日立製作所 Robot control device and robot
WO2021251259A1 (en) * 2020-06-09 2021-12-16 ファナック株式会社 Workpiece unloading device
JP7464710B2 (en) 2020-06-09 2024-04-09 ファナック株式会社 Work removal device
JP7237249B1 (en) * 2022-02-25 2023-03-10 三菱電機株式会社 ROBOT CONTROL DEVICE, ROBOT CONTROL METHOD AND ROBOT CONTROL PROGRAM

Also Published As

Publication number Publication date
JP5623358B2 (en) 2014-11-12

Similar Documents

Publication Publication Date Title
JP5623358B2 (en) Work picking device
JP5558585B2 (en) Work picking device
CN112070818B (en) Robot disordered grabbing method and system based on machine vision and storage medium
US11945114B2 (en) Method and system for grasping an object
Björkman et al. Enhancing visual perception of shape through tactile glances
Goldfeder et al. Data-driven grasping
US11794343B2 (en) System and method for height-map-based grasp execution
US20130006423A1 (en) Target object gripping apparatus, method for controlling the same and storage medium
CN112819135B (en) Sorting method for guiding mechanical arm to grab materials with different poses based on ConvPoint model
US11541534B2 (en) Method and system for object grasping
CN111428731A (en) Multi-class target identification and positioning method, device and equipment based on machine vision
CN115321090B (en) Method, device, equipment, system and medium for automatically receiving and taking luggage in airport
CN116061187B (en) Method for identifying, positioning and grabbing goods on goods shelves by composite robot
CN116197893A (en) Collision processing method in grabbing generation
CN113034575A (en) Model construction method, pose estimation method and object picking device
CN114800533B (en) Sorting control method and system for industrial robot
van Vuuren et al. Towards the autonomous robotic gripping and handling of novel objects
US9418286B2 (en) Information processing device, information processing method, and program
Yasumuro et al. Japanese fingerspelling identification by using MediaPipe
CN102855279B (en) Target fingerprint fast searching method based on minutiae point carina shape
Tan A Human-in-the-Loop Robot Grasping System with Grasp Quality Refinement
CN113361651B (en) Method and computing system for generating safe space list for object detection
US20220371200A1 (en) Robotic system for object size measurement
CN115837985A (en) Disordered grabbing method based on machine vision
Zhang et al. Recognition and position estimation method for stacked untextured parts

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20130927

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20140516

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20140527

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20140715

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20140826

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20140924

R150 Certificate of patent or registration of utility model

Ref document number: 5623358

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250
