JP7438734B2 - Tip member orientation recognition method, tip member orientation method, tip member insertion method, tip member orientation recognition device, and tip member orientation system


Info

Publication number
JP7438734B2
Authority
JP
Japan
Prior art keywords
tip member
orientation
information
robot hand
around
Prior art date
Legal status
Active
Application number
JP2019220588A
Other languages
Japanese (ja)
Other versions
JP2021089672A (en)
Inventor
基善 北井
Current Assignee
Kurashiki Spinning Co Ltd
Original Assignee
Kurashiki Spinning Co Ltd
Priority date
Filing date
Publication date
Application filed by Kurashiki Spinning Co Ltd filed Critical Kurashiki Spinning Co Ltd
Priority to JP2019220588A (patent JP7438734B2)
Publication of JP2021089672A
Application granted
Publication of JP7438734B2
Legal status: Active
Anticipated expiration

Description

The present invention relates to a method for recognizing the orientation of a tip member, a method for orienting a tip member, a method for inserting a tip member, a device for recognizing the orientation of a tip member, and a system for orienting a tip member.

BACKGROUND ART: Robots that recognize an object with a three-dimensional camera or the like and autonomously grasp it are becoming widespread. For example, Japanese Patent Application Laid-Open No. 2014-176917 (hereinafter "Patent Document 1") discloses a robot apparatus capable of gripping a connector provided at the tip of a linear body and connecting the connector to a predetermined connection point.

Japanese Patent Application Publication No. 2014-176917

In a system including a robot such as that described in Patent Document 1, it is sometimes required to insert a tip member (such as a connector) provided at the tip of a linear object into a predetermined insertion portion. If the posture in which the tip member can enter the insertion portion is restricted, the tip member cannot be inserted in any other posture. The posture of the tip member must therefore be brought into an insertable posture before insertion.

One conceivable approach is to register CAD data of the tip member in advance, acquire an image of the tip member, and match the acquired image against the registered CAD data to measure and recognize the orientation of the tip member. However, depending on the material and shape of the tip member, its outline can be difficult to recognize in the acquired image, which makes matching against CAD data difficult. In addition, pattern matching between sets of three-dimensional data is computationally heavy and time-consuming.

An object of the present invention is to solve the above problems by providing a tip member orientation recognition method, a tip member orientation method, a tip member insertion method, a tip member orientation recognition device, and a tip member orientation system.

A tip member orientation recognition method according to one aspect of the present invention is a method for recognizing the orientation of a tip member around a predetermined axis while an object having the tip member is gripped by a robot hand. The method comprises: an image information acquisition step of photographing the tip member with a camera while the object is gripped by the robot hand to acquire image information; and a calculation step of calculating, based on comparison information that includes first posture information of the tip member in a first posture and second posture information of the tip member in a second posture rotated from the first posture by a predetermined angle around the predetermined axis, a rotation angle from the orientation of the tip member around the predetermined axis in the acquired image information to a desired orientation.

Preferably, the tip member can be inserted into an insertion portion that permits its insertion only when the tip member is in a regular posture; the predetermined axis is parallel to the direction in which the tip member is inserted into the insertion portion; and the desired orientation is an orientation in which the tip member can be inserted into the insertion portion.

Preferably, the first posture information and the second posture information are feature quantities that change when the tip member is imaged while being rotated around the predetermined axis.

Preferably, the feature quantity is the width, area, or outer circumference of the tip member or of a shape included in the tip member.

A tip member orientation method according to one aspect of the present invention is a method for bringing the orientation of a tip member around a predetermined axis to a regular orientation while an object having the tip member is gripped by a robot hand. The method comprises: an image information acquisition step of photographing the object with a camera while it is gripped by the robot hand to acquire image information; a calculation step of calculating, based on comparison information that includes first posture information of the tip member in a first posture and second posture information of the tip member in a second posture rotated from the first posture by a predetermined angle around the predetermined axis, a rotation angle from the orientation of the tip member around the predetermined axis in the acquired image information to the regular orientation; and a rotation step of rotating the object around the predetermined axis by the rotation angle with the robot hand.

In this tip member orientation method, the rotation angle from the orientation of the tip member around the predetermined axis in the acquired image information to the regular orientation is calculated based on the comparison information, and the robot hand rotates the object by that rotation angle, so that the tip member is brought into the regular orientation.

A tip member insertion method according to one aspect of the present invention is a method for inserting a tip member into an insertion portion that permits insertion only when the orientation of the tip member around a predetermined axis is a regular orientation, while an object having the tip member is gripped by a robot hand. The method aligns the orientation of the tip member by the above tip member orientation method, and further comprises an insertion step of inserting the tip member into the insertion portion with the robot hand after, or simultaneously with, the rotation step.

In this way, the tip member is inserted into the insertion portion.

A tip member orientation recognition device according to one aspect of the present invention recognizes the orientation of a tip member around a predetermined axis while an object having the tip member is gripped by a robot hand. The device comprises: an imaging unit that images the tip member while the object is gripped by the robot hand; a storage unit that stores comparison information including first posture information of the tip member in a first posture and second posture information of the tip member in a second posture rotated from the first posture by a predetermined angle around the predetermined axis; and a calculation unit that calculates, based on the comparison information, a rotation angle from the orientation of the tip member around the predetermined axis in the acquired image information to a desired orientation.

A tip member orientation system according to one aspect of the present invention comprises the above tip member orientation recognition device and a robot having the robot hand, and the robot rotates the object around the predetermined axis by the rotation angle with the robot hand.

As described above, the present invention provides a tip member orientation recognition method, a tip member orientation method, a tip member insertion method, a tip member orientation recognition device, and a tip member orientation system that can stably recognize the orientation of a tip member and align it to the regular angle.

FIG. 1 is a perspective view schematically showing a tip member orientation system according to one embodiment of the present invention.
FIG. 2 is a perspective view of the tip member orientation system at an angle different from that shown in FIG. 1.
FIG. 3 is a diagram schematically showing the configuration of the tip member orientation recognition device and the robot.
FIG. 4 is a projection formed when the tip member is projected in the insertion direction.
FIG. 5 is a diagram schematically showing an image captured by the imaging unit while the robot hand grips the object and the tip member directly faces the imaging unit.
FIG. 6 is a diagram schematically showing an image captured by the imaging unit with the tip member rotated 30 degrees about its central axis from the posture shown in FIG. 5.
FIG. 7 is a diagram schematically showing an image captured by the imaging unit with the tip member rotated 60 degrees about its central axis from the posture shown in FIG. 5.
FIG. 8 is a diagram schematically showing an image captured by the imaging unit with the tip member rotated 90 degrees about its central axis from the posture shown in FIG. 5.
FIG. 9 is a perspective view schematically showing the tip member inserted into the insertion portion.
FIG. 10 is a perspective view schematically showing the tip member inserted into a terminal block serving as the insertion portion.

FIG. 1 is a perspective view schematically showing a tip member orientation system according to one embodiment of the present invention. FIG. 2 is a perspective view of the tip member orientation system at an angle different from that shown in FIG. 1. FIG. 3 is a diagram schematically showing the configuration of the tip member orientation recognition device and the robot.

The tip member orientation system 1 can both align a tip member 3, provided at the tip of an object 2, to an orientation that allows insertion into an insertion portion 100, and insert the tip member 3 into the insertion portion 100. Examples of the object 2 include electric wires, optical fibers, wire harnesses, strings, threads, fibers, glass fibers, and tubes (hereinafter referred to as the "linear object 2"). The effect of the present application is greater when the object is a flexible object such as a cable. In this embodiment, an electric wire is used as the linear object 2.

The tip member 3 has a shape such that the projection formed when the tip member 3 is projected in its direction of insertion into the insertion portion 100 (FIG. 4), when rotated about the central axis A of the tip member 3 parallel to the insertion direction (see FIG. 4), has angles at which it does not coincide with the projection before rotation. In other words, the projection of the tip member 3 is non-circular. In this embodiment, a crimp terminal is used as the tip member 3. As shown in FIG. 5, the tip member 3 has a crimp portion 3a and a terminal portion 3b. A hole 3h is provided in the terminal portion 3b.

The insertion portion 100 permits insertion of the tip member 3 only when the tip member 3 is in a regular posture. Examples of the insertion portion 100 include a slit of a wiring duct and a terminal block. In this embodiment, as shown in FIGS. 1 and 9, the insertion portion 100 is formed as a square tube. However, as shown in FIG. 10, the insertion portion 100 may be a terminal block. The regular posture means a posture in which the tip member 3 does not overlap the insertion portion 100 as viewed in the insertion direction (a posture in which insertion of the tip member 3 into the insertion portion 100 is permitted); the angle of the tip member 3 around the central axis A relative to the insertion portion 100 is not strictly defined.

As shown in FIGS. 1 and 2, the tip member orientation system 1 includes a tip member orientation recognition device 10, a robot 20, a hanging portion 32, a screen 34, and a light 36.

The hanging portion 32 is a part from which the linear object 2 is suspended. Specifically, the hanging portion 32 holds the end of the linear object 2 to which the tip member 3 is not connected (the upper end in FIG. 1).

The screen 34 is installed behind the linear object 2 as viewed from the imaging unit 12 (the tip member orientation recognition device 10). The screen 34 is white.

The light 36 is preferably transmitted-light illumination. When the imaging unit 12 acquires a transmitted-light image of the tip member 3, the outline of the tip member 3 is easier to recognize in the image processing step, which improves the accuracy of the rotation angle calculation described later. Moreover, because the surface of the tip member 3 in this embodiment is glossy, a reflected-light image may fail to capture the shape of the tip member 3 well at some angles. A transmitted-light image is also preferable when the tip member 3 has surface irregularities or uneven color. In this embodiment, the light 36 illuminates the screen 34; the light reflected by the screen 34 then illuminates the tip member 3 located between the screen 34 and the imaging unit 12 described later. The tip member orientation recognition device 10 is thus arranged so that it can acquire a transmitted-light image of the tip member 3. This makes it easier to extract the image information, such as the outline of the tip member 3, that is compared against the feature quantities described later, and improves the accuracy of the rotation angle calculation. Here, transmitted light means light that reaches the camera through space not occupied by the object, without being blocked by it (also called backlight or back illumination). A transmitted-light image is also called a silhouette image, backlight image, or back-illuminated image.
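As an illustrative sketch (not part of the patent), the silhouette extraction that this backlit arrangement enables could look like the following, assuming the captured frame is available as a grayscale NumPy array; the function name and threshold value are hypothetical:

```python
import numpy as np

def silhouette_mask(gray: np.ndarray, threshold: int = 128) -> np.ndarray:
    """Boolean mask of the backlit tip member.

    With transmitted-light illumination the background (the lit screen)
    is bright and the tip member appears as a dark silhouette, so
    foreground pixels are those below the threshold.
    """
    return gray < threshold

# Toy example: a dark 2x2 "object" on a bright background.
frame = np.full((4, 4), 255, dtype=np.uint8)
frame[1:3, 1:3] = 10
mask = silhouette_mask(frame)
assert int(mask.sum()) == 4
```

A fixed threshold is only a starting point; in practice an adaptive method (e.g. Otsu's) may be more robust to lighting drift.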

The robot 20 includes a robot hand 22 and a receiving unit 24.

The robot hand 22 can grip the linear object 2. In this embodiment, the robot hand 22 can move vertically while gripping the linear object 2, and can rotate the linear object 2 around its central axis A.

The receiving unit 24 will be described later.

The tip member orientation recognition device 10 recognizes the orientation of the tip member 3 around a predetermined axis while the object 2 is gripped by the robot hand 22. The tip member orientation recognition device 10 can also instruct the robot 20 to insert the tip member 3 into the insertion portion 100 while the linear object 2 is held by the robot hand 22. As shown in FIG. 1, the tip member orientation recognition device 10 is arranged on the side of the linear object 2, suspended from the hanging portion 32, opposite to the side on which the screen 34 is arranged. As shown in FIGS. 1 to 3, the tip member orientation recognition device 10 includes an imaging unit 12, a storage unit 14, a calculation unit 16, and a transmitting unit 18.

The imaging unit 12 images the tip member 3 while the linear object 2 is gripped by the robot hand 22. More specifically, the imaging unit 12 photographs the tip member 3 from a direction intersecting the insertion direction (in this embodiment, a direction orthogonal to it) while the linear object 2 is gripped by the robot hand 22, and acquires image information.

The imaging unit 12 may be any device that can image the tip member 3, and may be a two-dimensional or three-dimensional camera. If only the rotation angle of the tip member 3 needs to be calculated, a two-dimensional camera suffices. With a two-dimensional camera, positions in the X and Y directions as seen from the camera can be recognized, so an approximate Z position measured separately in advance may be sent to the robot. However, when accurate three-dimensional coordinates of the tip member 3 are to be sent to the robot, a three-dimensional camera that can measure the position of the tip member 3 is preferable. A stereo camera is more preferable still. A TOF camera or a laser-based three-dimensional measuring instrument must perform the recognition step on three-dimensional data, which takes processing time; with a stereo camera, the processing for obtaining the image information to be compared against the comparison information can be performed on a two-dimensional image, so the processing time is short. Furthermore, when the tip member being measured has a glossy surface, such as metal, a stereo camera is less affected by the gloss than a TOF camera or laser-based instrument, and more stable three-dimensional measurement is possible. The imaging unit 12 is therefore preferably a stereo camera, and in this embodiment it is one. When calculating the rotation angle, the image information to be compared against the feature quantities described later need only be obtained from at least one of the stereo camera's two images.

The storage unit 14 stores comparison information. The comparison information includes first posture information of the tip member 3 in a first posture, and second posture information of the tip member 3 in a second posture rotated from the first posture by a predetermined angle around the central axis A parallel to the insertion direction. The first posture information is preferably first image information obtained by imaging the tip member 3 in the first posture from the intersecting direction, and the second posture information is preferably second image information obtained by imaging the tip member 3 in the second posture from the intersecting direction. Each piece of image information is a feature quantity that changes when the tip member 3 is imaged while being rotated around the predetermined axis (central axis A). Examples of feature quantities include the outer circumference and area of the tip member 3, and its width at a predetermined location (for example, at a predetermined distance from the tip). The comparison information may further include at least one other piece of image information obtained by imaging the tip member 3, from the intersecting direction, in a posture different from the first and second postures.

In this embodiment, the storage unit 14 stores, as the image information serving as comparison information, the width W1 of the terminal portion 3b (or the number of pixels corresponding to W1) and the width W2 of the crimp portion 3a (or the number of pixels corresponding to W2). The width W1 is the width of the terminal portion 3b at a position a first distance L1 above the lower end of the tip member 3. The width W2 is the width of the crimp portion 3a at a position a second distance L2, larger than L1, above the lower end of the tip member 3. Each piece of image information may also be the width, area, or outer circumference of the tip member 3 or of a shape included in the tip member 3. Here, the "width" of a shape means its length in the direction orthogonal to the axis parallel to the insertion direction of the tip member 3. A "shape included in the tip member 3" means, for example, a hole or a pattern in the tip member 3, such as the area or outer circumference of the hole 3h of the tip member 3 in FIG. 5.

For example, the first image information consists of the width W1 (or the corresponding number of pixels) and the width W2 (or the corresponding number of pixels) in the directly facing posture (the posture shown in FIG. 5), in which the tip member 3 directly faces the imaging unit 12 and its rotation angle around the central axis A parallel to the insertion direction is an orientation insertable into the insertion portion. The second image information consists of W1 (or the corresponding number of pixels) and W2 (or the corresponding number of pixels) in a posture in which the tip member 3 is rotated 90 degrees around the central axis A from the directly facing posture (the posture shown in FIG. 8). FIG. 6 schematically shows an image captured by the imaging unit 12 with the tip member 3 rotated 30 degrees around the central axis A from the directly facing posture, and FIG. 7 shows the same at 60 degrees.
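A minimal sketch of measuring widths such as W1 and W2 from a binary silhouette mask, not taken from the patent: it finds the lowest foreground row (the tip to be inserted) and counts the horizontal extent at a given distance above it. The function name and the toy geometry are hypothetical:

```python
import numpy as np

def width_at(mask: np.ndarray, dist_from_bottom: int) -> int:
    """Silhouette width (in pixels) at the row a given distance above
    the lowest foreground pixel, i.e. above the inserted tip."""
    rows = np.where(mask.any(axis=1))[0]
    row = mask[rows.max() - dist_from_bottom]
    cols = np.where(row)[0]
    return int(cols[-1] - cols[0] + 1) if cols.size else 0

# Toy silhouette: a wide crimp portion above a narrow terminal portion.
m = np.zeros((10, 9), dtype=bool)
m[2:5, 1:8] = True   # crimp portion, 7 px wide
m[5:9, 3:6] = True   # terminal portion, 3 px wide
w1 = width_at(m, 1)  # like W1: measured a small distance L1 above the tip
w2 = width_at(m, 5)  # like W2: measured a larger distance L2 above the tip
assert (w1, w2) == (3, 7)
```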

Based on the comparison information, the calculation unit 16 calculates the rotation angle from the orientation of the tip member 3 around the predetermined axis in the image information acquired by the imaging unit 12 to the desired orientation. In this embodiment, the calculation unit 16 compares the image information in the captured image (W1 or the corresponding number of pixels, and W2 or the corresponding number of pixels) with the comparison information to calculate the rotation angle around the central axis A from the current posture of the tip member 3 to the regular posture. Specifically, since the image information varies sinusoidally with the rotation angle around the central axis A, the calculation unit 16 calculates the rotation angle around the central axis A from the current posture to the regular posture based on the image information in the captured image and the sine function.
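The sinusoidal relation can be inverted directly; the following is an illustrative sketch, not the patent's implementation, assuming the projected width of a flat terminal falls off roughly as w_front·cos(theta). It covers only 0 to 90 degrees; a second feature (such as W2 in the embodiment) would be needed to disambiguate other quadrants:

```python
import math

def rotation_angle(w_measured: float, w_front: float) -> float:
    """Estimated angle (degrees) of the tip member about its central
    axis, assuming width ~ w_front * cos(theta).  Valid for 0-90
    degrees only; the ratio is clamped to guard against measurement
    noise pushing it outside [0, 1]."""
    ratio = min(1.0, max(0.0, w_measured / w_front))
    return math.degrees(math.acos(ratio))

assert rotation_angle(10.0, 10.0) == 0.0              # directly facing
assert abs(rotation_angle(5.0, 10.0) - 60.0) < 1e-9   # half width -> 60 deg
```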

Alternatively, the storage unit 14 may store a table showing the relationship between the angle of the tip member 3 around the central axis A from the directly facing posture and the image information of the tip member 3 at each angle, and the calculation unit 16 may calculate the rotation angle around the central axis A from the current posture to the regular posture based on that table and the image information in the image captured by the imaging unit 12.
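The table-based variant amounts to interpolating a calibration curve. A sketch under stated assumptions (the table values, the 30-degree calibration steps, and the monotonically decreasing widths are all hypothetical):

```python
import numpy as np

# Hypothetical calibration table: silhouette width in pixels recorded
# while rotating the tip member in known 30-degree steps from the
# directly facing posture.
ANGLES_DEG = np.array([0.0, 30.0, 60.0, 90.0])
WIDTHS_PX = np.array([100.0, 87.0, 50.0, 8.0])

def angle_from_table(width_px: float) -> float:
    """Linearly interpolate the rotation angle for a measured width.
    np.interp requires ascending x values, so both arrays are
    reversed (widths here decrease as the angle grows)."""
    return float(np.interp(width_px, WIDTHS_PX[::-1], ANGLES_DEG[::-1]))

assert angle_from_table(100.0) == 0.0
assert angle_from_table(50.0) == 60.0
```

This only works when the feature is monotonic over the angle range covered by the table; otherwise a second feature is needed to pick the right branch.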

When calculating the rotation angle, if the central axis A of the tip member 3 is inclined with respect to the vertical direction, the calculation unit 16 may first process the image information so that the central axis A becomes parallel to the vertical direction, and then calculate the rotation angle.

The calculation unit 16 may also calculate position information (coordinate information) of the tip member 3. In the present embodiment, the calculation unit 16 can obtain the three-dimensional coordinates of the tip member 3 by the principle of triangulation, based on image information captured with a stereo camera. As the position information of the tip member 3, the calculation unit 16 calculates, for example, the position of the lowest point of the tip member 3 (the tip portion to be inserted).
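The stereo triangulation mentioned here can be sketched with the standard rectified-pinhole relation Z = f·B/d; the focal length, baseline, and principal point below are illustrative values, not parameters from the patent:

```python
def triangulate(u_left, u_right, v, focal_px, baseline_m, cx, cy):
    """Back-project a matched pixel from a rectified stereo pair.
    Depth from disparity: Z = f * B / d; then X, Y from the pinhole model."""
    disparity = u_left - u_right
    if disparity <= 0:
        raise ValueError("point must have positive disparity")
    z = focal_px * baseline_m / disparity
    x = (u_left - cx) * z / focal_px
    y = (v - cy) * z / focal_px
    return x, y, z

# Example: the tip's lowest point seen at u=700 (left) / u=650 (right).
x, y, z = triangulate(700, 650, 500, focal_px=1000.0, baseline_m=0.1,
                      cx=640.0, cy=480.0)
```

With f = 1000 px, B = 0.1 m, and a 50 px disparity this places the point 2.0 m from the camera; a real system would first rectify the pair and match the lowest-point pixel between the two images.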

The transmission unit 18 transmits a rotation signal and an insertion signal to the reception unit 24 of the robot 20. The rotation signal is a signal that causes the robot hand 22 to rotate the tip member 3 around the central axis A by the rotation angle calculated by the calculation unit 16, that is, a signal indicating the rotation angle around the central axis A from the current posture of the tip member 3 to the normal posture. The insertion signal is a signal instructing the robot hand 22 to insert the tip member 3 into the insertion section 100, based on the position information of the tip member calculated by the calculation unit 16.
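The two signals could be modelled as simple message objects; the field names, the degree unit, and the trivial `transmit` stand-in below are assumptions for illustration, not the actual signal format:

```python
from dataclasses import dataclass

@dataclass
class RotationSignal:
    """Rotation angle (degrees) around the central axis A from the
    current posture of the tip member to the normal posture."""
    angle_deg: float

@dataclass
class InsertionSignal:
    """Target position (x, y, z) of the tip's lowest point, from which
    the robot hand inserts the tip member into the insertion section."""
    x: float
    y: float
    z: float

def transmit(signal):
    """Stand-in for transmission unit 18: hand the signal to the
    receiver (here it simply returns the payload unchanged)."""
    return signal

msg = transmit(RotationSignal(angle_deg=42.0))
```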

The tip member orientation recognition device 10 described above performs an image information acquisition step and a calculation step. The image information acquisition step is performed by the imaging unit 12, and the calculation step is performed by the calculation unit 16.

Here, the reception unit 24 will be described. The reception unit 24 receives the rotation signal and the insertion signal from the transmission unit 18 of the tip member orientation recognition device 10. Based on the rotation signal, the reception unit 24 instructs the robot hand 22 to rotate the linear object 2, and based on the insertion signal, it instructs the robot hand 22 to insert the tip member 3 into the insertion section 100.

That is, the robot hand 22 operates the linear object 2 so that the tip member 3 rotates around the central axis A based on the rotation signal (rotation operation), and operates the linear object 2 so that the tip member 3 is inserted into the insertion section 100 based on the insertion signal (insertion operation). In other words, the robot 20 uses the robot hand 22 to rotate the linear object 2 around the central axis A by the rotation angle indicated by the rotation signal, and to insert the tip member 3 into the insertion section 100 based on the insertion signal. The rotation operation and the insertion operation may be performed in this order or simultaneously.
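The rotate-then-insert sequencing (or its simultaneous variant) can be sketched as a small controller; the robot-hand interface (`rotate`, `insert`, `rotate_and_insert`) is hypothetical, chosen only to make the two operations explicit:

```python
def align_and_insert(hand, rotation_deg, target, simultaneous=False):
    """Drive a (hypothetical) robot hand: rotate the gripped object
    around the central axis, then insert the tip member, or do both
    at once when the hardware allows it."""
    if simultaneous:
        hand.rotate_and_insert(rotation_deg, target)
    else:
        hand.rotate(rotation_deg)   # rotation operation
        hand.insert(target)         # insertion operation

class LoggingHand:
    """Test double that records the operations it was asked to perform."""
    def __init__(self):
        self.log = []
    def rotate(self, deg):
        self.log.append(("rotate", deg))
    def insert(self, target):
        self.log.append(("insert", target))
    def rotate_and_insert(self, deg, target):
        self.log.append(("both", deg, target))

hand = LoggingHand()
align_and_insert(hand, 30.0, (0.1, 0.0, 0.2))
```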

The robot 20 described above can perform a rotation step and an insertion step. The rotation step performs the rotation operation, and the insertion step performs the insertion operation.

As described above, in the tip member orientation system 1 of the present embodiment, the rotation angle around the central axis A from the current posture of the tip member 3 to the normal posture is calculated by comparing the image information of the tip member 3 captured by the camera with the comparison information, and the robot hand 22 rotates the linear object 2 by that rotation angle, so that the tip member 3 assumes a posture in which it can be inserted into the insertion section 100 (the normal posture). The image information may be the image itself obtained by the imaging unit 12, or it may be information on geometric feature quantities in the image obtained by analyzing it (for example, parameters of a circular shape, or the aspect ratio and perimeter when the shape is approximated by a specific figure).

Further, since the tip member orientation recognition device 10 transmits the rotation signal and the insertion signal to the robot 20, the linear object 2 can be inserted into the insertion section 100 by the robot hand 22.

The embodiments disclosed herein are illustrative in all respects and should not be considered restrictive. The scope of the present invention is indicated by the claims rather than by the above description of the embodiments, and includes all modifications within the meaning and scope equivalent to the claims.

For example, the storage unit 14, the calculation unit 16, and the transmission unit 18 may be separated from the imaging unit 12. That is, the tip member orientation recognition device 10 may have a first case and a second case separate from the first case, with the imaging unit 12 housed in the first case and the storage unit 14, the calculation unit 16, and the transmission unit 18 housed in the second case.

Before calculating the rotation angle (the rotation angle around the central axis A from the current posture of the tip member 3 to the normal posture), the calculation unit 16 may calculate, from the image information, the inclination angle of the central axis A of the tip member 3 with respect to the vertical direction. In this case, the transmission unit 18 transmits to the reception unit 24 a vertical signal indicating that the central axis A of the tip member 3 should be tilted so as to become parallel to the vertical direction, and based on the instruction from the reception unit 24, the robot hand 22 operates the linear object 2 so that the central axis A of the tip member 3 becomes parallel to the vertical direction (vertical operation). In that state, that is, with the central axis A of the tip member 3 substantially parallel to the vertical direction, the calculation unit 16 calculates the rotation angle based on the image information from the imaging unit 12. In this way, the rotation angle is calculated more accurately.
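This vertical-operation refinement can be illustrated in two dimensions: estimate the inclination of the central axis from two image points lying on it, then command the hand to cancel that tilt before measuring the axis-A rotation angle. The two-point estimate and the image-coordinate convention (y grows downward) are assumptions for the sketch:

```python
import math

def axis_tilt_deg(p_top, p_bottom):
    """Inclination of the tip member's central axis from vertical,
    estimated from two image points (x, y) on the axis.
    Image coordinates: y grows downward, so a vertical axis has dx = 0."""
    dx = p_bottom[0] - p_top[0]
    dy = p_bottom[1] - p_top[1]
    return math.degrees(math.atan2(dx, dy))

# Example: an axis leaning 20 px sideways over 200 px of length.
tilt = axis_tilt_deg((100.0, 50.0), (120.0, 250.0))
# The robot would then be commanded to tilt the object by -tilt so the
# axis becomes (substantially) vertical before the rotation angle is measured.
```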

Depending on the type of tip member, the projection of the tip member in the insertion direction after rotating it 180 degrees around its central axis (the axis parallel to the direction in which the tip member is inserted into the insertion section) may have the same shape as the projection before rotation, even though the front and back sides are functionally different. In this case, a front/back determination is required in addition to the calculation of the rotation angle. If the front and back of the tip member differ in color, pattern, or the like, orientation recognition and alignment that reflect the front/back distinction can be realized by identifying the color or pattern from the acquired image information and adding a front/back determination step to the rotation angle calculation step. At this time, in order to distinguish colors, patterns, and the like, it is preferable to acquire a reflected light image in addition to the transmitted light image. The reflected light image may be acquired while the linear object is illuminated by ambient light such as indoor light or sunlight. To acquire the reflected light image more stably, it is preferable to provide a reflected light irradiation member that irradiates the linear object with light. Ordinary lighting such as a lamp or an LED can be used as the reflected light irradiation member; diffused lighting that can irradiate the tip member with uniform light from the same side as the imaging unit is more preferable.
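A hedged sketch of the front/back determination by color: compare the mean color of the imaged face with stored reference colors for the two sides. The reference colors and the squared-RGB distance metric are assumptions for illustration, not the patented method:

```python
def mean_color(pixels):
    """Average (R, G, B) of pixels sampled from the tip member region
    of the reflected light image."""
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))

def classify_side(pixels, front_ref, back_ref):
    """Return 'front' or 'back' depending on which stored reference
    color the observed mean color is closer to (squared RGB distance)."""
    obs = mean_color(pixels)
    def dist(ref):
        return sum((o - r) ** 2 for o, r in zip(obs, ref))
    return "front" if dist(front_ref) <= dist(back_ref) else "back"

# Example: a reddish face is classified against red/blue references.
side = classify_side([(250, 10, 10), (240, 20, 15)],
                     front_ref=(245, 15, 12), back_ref=(10, 10, 245))
```

The result would be combined with the rotation angle so that the commanded rotation also corrects a 180-degree front/back flip when needed.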

1 tip member orientation system, 2 object (linear object), 3 tip member, 3a crimp section, 3b connection section, 3h hole, 10 tip member orientation recognition device, 12 imaging unit, 14 storage unit, 16 calculation unit, 18 transmission unit, 20 robot, 22 robot hand, 24 reception unit, 32 hanging section, 34 screen, 36 lighting, 100 insertion section.

Claims (8)

A method in which a tip member orientation recognition device recognizes the orientation of a tip member around a predetermined axis while an object having the tip member is gripped by a robot hand, the method comprising:
an image information acquisition step of photographing the tip member with a camera while the object is gripped by the robot hand, thereby acquiring image information; and
a calculation step of calculating, based on comparison information including first posture information of the tip member in a first posture and second posture information of the tip member in a second posture rotated from the first posture by a predetermined angle around the predetermined axis, a rotation angle from the orientation of the tip member around the predetermined axis in the acquired image information to a desired orientation.
The method for recognizing the orientation of a tip member according to claim 1, wherein
the tip member can be inserted into an insertion section that allows insertion of the tip member when the tip member is in a normal posture,
the predetermined axis is an axis parallel to an insertion direction in which the tip member is inserted into the insertion section, and
the desired orientation is an orientation in which the tip member can be inserted into the insertion section.
The method for recognizing the orientation of a tip member according to claim 1 or 2, wherein
the first posture information and the second posture information are feature quantities that change when the tip member is imaged while being rotated around the predetermined axis.
The method for recognizing the orientation of a tip member according to claim 3, wherein
the feature quantity is the width, area, or perimeter of the tip member or of a shape included in the tip member.
A tip member orientation method in which, while an object having a tip member is gripped by a robot hand, a tip member orientation system including the robot hand and a tip member orientation recognition device brings the orientation of the tip member around a predetermined axis into a normal orientation, the method comprising:
an image information acquisition step of photographing the object with a camera while the object is gripped by the robot hand, thereby acquiring image information;
a calculation step of calculating, based on comparison information including first posture information of the tip member in a first posture and second posture information of the tip member in a second posture rotated from the first posture by a predetermined angle around the predetermined axis, a rotation angle from the orientation of the tip member around the predetermined axis in the acquired image information to the normal orientation; and
a rotation step of rotating the object around the predetermined axis by the rotation angle with the robot hand.
A tip member insertion method in which, while an object having a tip member is gripped by a robot hand, a tip member orientation system including the robot hand and a tip member orientation recognition device inserts the tip member into an insertion section that allows insertion of the tip member only when the orientation of the tip member around a predetermined axis is a normal orientation, the method comprising:
aligning the orientation of the tip member by the tip member orientation method according to claim 5; and
an insertion step of inserting the tip member into the insertion section with the robot hand after the rotation step or simultaneously with the rotation step.
A tip member orientation recognition device that recognizes the orientation of a tip member around a predetermined axis while an object having the tip member is gripped by a robot hand, the device comprising:
an imaging unit that images the tip member while the object is gripped by the robot hand;
a storage unit that stores comparison information including first posture information of the tip member in a first posture and second posture information of the tip member in a second posture rotated from the first posture by a predetermined angle around the predetermined axis; and
a calculation unit that calculates, based on the comparison information, a rotation angle from the orientation of the tip member around the predetermined axis in the acquired image information to a desired orientation.
A tip member orientation system comprising:
the tip member orientation recognition device according to claim 7; and
a robot having the robot hand,
wherein the robot rotates the object around the predetermined axis by the rotation angle with the robot hand.
JP2019220588A 2019-12-05 2019-12-05 Tip member orientation recognition method, tip member orientation method, tip member insertion method, tip member orientation recognition device, and tip member orientation system Active JP7438734B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2019220588A JP7438734B2 (en) 2019-12-05 2019-12-05 Tip member orientation recognition method, tip member orientation method, tip member insertion method, tip member orientation recognition device, and tip member orientation system


Publications (2)

Publication Number Publication Date
JP2021089672A JP2021089672A (en) 2021-06-10
JP7438734B2 true JP7438734B2 (en) 2024-02-27

Family

ID=76220293

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2019220588A Active JP7438734B2 (en) 2019-12-05 2019-12-05 Tip member orientation recognition method, tip member orientation method, tip member insertion method, tip member orientation recognition device, and tip member orientation system

Country Status (1)

Country Link
JP (1) JP7438734B2 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005011580A (en) 2003-06-17 2005-01-13 Fanuc Ltd Connector holding device, and connector inspection system and connector connection system equipped therewith
JP2011129082A (en) 2009-11-20 2011-06-30 3D Media Co Ltd Three-dimensional object recognition device and three-dimensional object recognition method
JP2014161950A (en) 2013-02-25 2014-09-08 Dainippon Screen Mfg Co Ltd Robot system, robot control method, and robot calibration method
JP2017087317A (en) 2015-11-04 2017-05-25 トヨタ自動車株式会社 Operation object state estimation device
WO2018207552A1 (en) 2017-05-09 2018-11-15 ソニー株式会社 Robot device and electronic device manufacturing method
JP2021028107A (en) 2019-08-09 2021-02-25 倉敷紡績株式会社 Connector direction searching method, connector connecting method, robot hand, control device, imaging device and connector connecting system




Legal Events

A621 Written request for application examination (JAPANESE INTERMEDIATE CODE: A621), effective date: 2022-09-29
A131 Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131), effective date: 2023-11-28
A521 Request for written amendment filed (JAPANESE INTERMEDIATE CODE: A523), effective date: 2024-01-23
TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model) (JAPANESE INTERMEDIATE CODE: A01), effective date: 2024-02-06
A61 First payment of annual fees (during grant procedure) (JAPANESE INTERMEDIATE CODE: A61), effective date: 2024-02-14
R150 Certificate of patent or registration of utility model (JAPANESE INTERMEDIATE CODE: R150), ref document number: 7438734, country of ref document: JP