JP2015042965A - Article position recognition device - Google Patents

Article position recognition device

Info

Publication number
JP2015042965A
JP2015042965A (application JP2013174937A)
Authority
JP
Japan
Prior art keywords
article
container
position recognition
captured image
unit
Prior art date
Legal status
Granted
Application number
JP2013174937A
Other languages
Japanese (ja)
Other versions
JP6167760B2 (en)
Inventor
Atsushi Kurayama (淳 倉山)
Atsushi Mino (淳 味生)
Tomoya Tsubota (朋也 坪田)
Current Assignee
Daifuku Co Ltd
Original Assignee
Daifuku Co Ltd
Priority date
Filing date
Publication date
Application filed by Daifuku Co Ltd
Priority to JP2013174937A
Publication of JP2015042965A
Application granted
Publication of JP6167760B2
Status: Active

Abstract

PROBLEM TO BE SOLVED: To provide an article position recognition device that can properly recognize the position of an article inside a container in a captured image, without being affected by reflected virtual images of the article appearing on the container's inner surface.

SOLUTION: The article position recognition device includes: an imaging unit that images, from above, a container C having flat inner surfaces Cn and an open top; and an article position recognition unit that executes article position recognition processing to recognize the position of an article B held in the container C, on the basis of an image captured by the imaging unit. The article position recognition unit extracts from the captured image an article presence region in which a real article B can exist, and executes the article position recognition processing on the extracted region.

Description

The present invention relates to an article position recognition device comprising: an imaging unit that images, from above, a container whose inner surfaces are flat and whose top is open; and an article position recognition unit that executes article position recognition processing to recognize, on the basis of an image captured by the imaging unit, the position of an article held in the container.

Such an article position recognition device is used, for example, in a picking facility in which articles held in a container are picked by a picking robot, for purposes such as recognizing the position of the article to be picked inside the container (see, for example, Patent Document 1).

Patent Document 1: JP 2010-12567 A

Although not described in Patent Document 1, when the container is, for example, a plastic case with flat inner surfaces, the article may be mirrored on the wall surface in the image captured by the imaging unit, producing a reflected virtual image. When such a reflected virtual image occurs, a conventional article position recognition device may mistake it for the image of a real article (a real image) and recognize a position offset from the real article's position as the article's position. If this happens in a picking facility such as that of Patent Document 1, the article-supporting part of the picking robot may interfere with the container's inner surface, and the article may not be picked properly.

An article position recognition device is therefore desired that can properly recognize the position of an article inside a container without being affected by reflected virtual images of the article appearing on the container's inner surface in the captured image.

To solve the above problem, an article position recognition device according to the present invention comprises: an imaging unit that images, from above, a container whose inner surfaces are flat and whose top is open; and an article position recognition unit that executes article position recognition processing to recognize, on the basis of an image captured by the imaging unit, the position of an article held in the container.
The article position recognition unit is characterized in that it extracts from the captured image an article presence region in which a real article can exist, and executes the article position recognition processing on the extracted region.

With this configuration, an article presence region in which a real article can exist is extracted from the image captured by the imaging unit, and the article position recognition processing is executed on that region. Even if a reflected virtual image appears on the container's inner surface in the captured image, the processing can therefore be executed with the reflected virtual image excluded from the captured image.
The position of the article in the container can thus be recognized properly, unaffected by reflected virtual images of the article on the container's inner surface.

In an embodiment of the article position recognition device according to the present invention, it is preferable that an upper surface height detection unit is provided that detects the height of the highest article upper surface in the container, and that the article position recognition unit sets in the captured image a virtual frame line, namely the line of intersection between the plane containing the article upper surface detected by the upper surface height detection unit and the container's inner surfaces, and extracts the region inside that virtual frame line as the article presence region.

For example, in the picking facility described above, it is preferable to pick articles in order starting from the highest article in the container, to avoid interference between the picking robot's extraction part and the other articles.
According to this embodiment, the region extracted as the article presence region is the region inside the virtual frame line, which is the intersection of the plane containing the highest article upper surface with the container's inner surfaces; the article presence region therefore covers every article in the container, including the highest one.
In this way, the region containing the highest article in the container can be properly made the recognition target.

FIG. 1: Overall perspective view of a picking facility equipped with the article position recognition device
FIG. 2: Control block diagram of the picking facility
FIG. 3: Captured image of a container holding articles
FIG. 4: Histogram used by the upper surface height detection unit to detect heights
FIG. 5: Flowchart of the control executed by the article position recognition unit
FIG. 6: Virtual frame lines corresponding to the upper surface height of each tier when articles are stored in multiple tiers
FIG. 7: Extraction result of the article presence region
FIG. 8: Captured image of a container holding no articles
FIG. 9: Comparative example in which the boundary between the container's bottom surface and inner surfaces is used as the virtual frame line

An embodiment in which the article position recognition device of the present invention is applied to a picking facility will be described with reference to the drawings.
FIG. 1 is a perspective view of a picking facility including a picking robot P. The picking robot P is installed beside the conveying path of a conveyor 22C. The conveyor 22C conveys containers C retrieved from an automated warehouse (not shown) in accordance with commands from a host management device Hu (see FIG. 2). A container C has an open top, so that the articles B held inside it can be taken out from above. The picking robot P takes articles B out of the container C located at a predetermined stop position (picking stop position) on the conveying path, using a suction support unit 40 described later. In this embodiment the container C corresponds to the container of the invention. A container C holds a plurality of articles of the same type.

The picking robot P comprises: a moving mechanism consisting of a pedestal 30 fixed to the floor of the picking area, a support cylinder 31 that can move up and down and rotate about a vertical axis, a first arm 32 pivotally supported at the upper end of the support cylinder 31 on a horizontal shaft 36 and swingable about that shaft, a second arm 33 swingable about a horizontal shaft 37 provided at the end of the first arm 32 opposite the shaft 36, and a rotary shaft member 34 swingable about a horizontal shaft 38 provided at the end of the second arm 33 opposite the shaft 37; and the suction support unit 40, on which a plurality of suction pads that act on the upper surface of an article B are arranged.

An imaging unit 50 is provided above the picking stop position of the conveyor 22C. It is fixed at a position from which it can image, from above, the interior of a container C resting on the conveying surface of the conveyor 22C at the picking stop position. The imaging unit 50 is a stereo camera with two imaging lenses, able to capture two images from horizontally offset positions at the same time.
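The patent states only that the stereo camera yields a camera-to-surface distance per pixel; it does not disclose the computation. As an illustrative sketch, the standard pinhole-stereo relation Z = f·B/d converts a pixel's disparity into depth, and subtracting that depth from the camera's installation height gives the surface height used later for T1/T2/T3. Function names and parameters here are assumptions, not from the patent.

```python
# Depth from stereo disparity (standard pinhole model; illustrative only --
# the patent does not disclose the imaging unit's actual computation).
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Z = f * B / d: distance from the camera for one matched pixel pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

def pixel_height(camera_height_m, disparity_px, focal_px, baseline_m):
    """Height of the imaged surface above the floor, obtained by subtracting
    the pixel's depth from the camera's installation height."""
    return camera_height_m - depth_from_disparity(disparity_px, focal_px, baseline_m)
```

For example, with an assumed 500 mm focal length in pixels scaled to 1000 px, a 0.25 m baseline, and a disparity of 100 px, the surface lies 2.5 m from the camera.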

As shown in FIG. 2, the control unit H is, for example, a computer comprising an arithmetic unit and a storage device, and performs various kinds of control by executing programs stored in the storage device.
The imaging unit 50, a picking controller Hp, and the host management device Hu are connected to the control unit H so as to communicate with it.
The imaging unit 50 captures images on command from the control unit H and can transmit the captured images to it.
The picking controller Hp controls the operation of the picking robot P's moving mechanism and suction support unit 40.

The host management device Hu manages the schedule of the picking work as a whole. When the host management device Hu designates an article B to be picked, a controller for the automated warehouse (not shown), a controller for the conveyor 22C, and so on control the warehouse's conveying equipment and the conveyor 22C so that the container C holding that article B is retrieved from the automated warehouse and conveyed to the picking stop position on the conveyor 22C.

When the container C has been conveyed to the picking stop position of the conveyor 22C in response to an article retrieval command from the host management device Hu, the control unit H issues a retrieval operation command to the picking controller Hp. The picking controller Hp then operates the picking robot P in accordance with that command, and the robot takes the article B out of the container C. When issuing the retrieval operation command, the control unit H tells the picking controller Hp where in the container C the article B to be picked is located; the control unit H therefore needs to recognize that position.
The article position recognition device is used to recognize where in the container C the article B to be picked is located; in this embodiment it consists of the control unit H and the imaging unit 50.

As shown in FIG. 2, the control unit H includes, as program modules, an article position recognition unit H1 and an upper surface height detection unit H2.
The article position recognition unit H1 executes article position recognition processing that recognizes the position of an article B held in a container C, based on the image captured by the imaging unit 50 (an image of the container C taken from above; see FIG. 3). In this processing, when the host management device Hu specifies the type of article B to be retrieved, a template image corresponding to that article B is loaded from the storage device and pattern matching of the template image against the captured image is performed, thereby recognizing the position of the article B in the captured image (for example, when the article B is rectangular in top view: the centre coordinates of the article B in a coordinate system whose origin is a specific corner of the container C, the size of the article's upper-surface rectangle, and its orientation).
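The patent names template pattern matching but not the similarity measure. A minimal sketch, assuming normalized cross-correlation as the matching score (an assumption, not the patent's disclosed method), scans the template over the image and returns the best-matching position:

```python
import numpy as np

def match_template(image, template):
    """Exhaustive normalized cross-correlation over a grayscale image.
    Returns ((row, col) of the best match's top-left corner, score in [-1, 1]).
    Illustrative stand-in for H1's pattern matching; the patent does not
    specify the metric, and real systems would use an optimized library."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t * t).sum())
    best, best_pos = -2.0, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            w = image[r:r + th, c:c + tw]
            w = w - w.mean()
            denom = np.sqrt((w * w).sum()) * t_norm
            if denom == 0:          # flat window: correlation undefined
                continue
            score = float((w * t).sum() / denom)
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos, best
```

From the matched top-left corner and the template size, the centre coordinates and upper-surface rectangle described above follow directly; orientation would require matching rotated templates, which is omitted here.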

The upper surface height detection unit H2 executes upper surface height detection processing that detects, from the image captured by the imaging unit 50, the height of the highest article upper surface appearing in that image.
Specifically, from the two images captured by the imaging unit 50 it computes the distance from the imaging unit 50 for each pixel, subtracts that distance from the installation height of the imaging unit 50 to obtain a height for each pixel, and graphs the correlation between height and pixel count (see FIG. 4). After an operation for noise removal (for example, smoothed differentiation), the peaks of the pixel count in this graph are detected (heights T1, T2, and T3 in FIG. 4). In FIG. 4, the pixels at height T1 correspond to the upper edge of the container C in the captured image; the pixels at height T3 correspond to the upper surfaces of articles B lower than the article B with the highest upper surface; and the pixels at height T2 correspond to the upper surface of the article B with the highest upper surface among the articles B in the container C. The upper surface height detection unit H2 takes the height T2 of the second-highest peak as the height of the highest article upper surface in the container C.
That is, the upper surface height detection unit H2 detects the height T2 of the upper surface of the highest article B in the container C.
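The peak-picking just described can be sketched as follows. This is a simplification: the patent smooths the histogram (for example by smoothed differentiation) before detecting peaks, whereas here a plain local-maximum rule stands in for that step, and the bin width and pixel-count threshold are assumed values.

```python
import numpy as np

def top_surface_height(heights, bin_width=0.005, min_pixels=50):
    """Pick the top-article surface height (T2 in FIG. 4) from a per-pixel
    height map: histogram the heights, find locally maximal bins, and
    return the second-highest peak -- the highest peak is the rim (T1)."""
    h = np.asarray(heights, dtype=float).ravel()
    # Pad the range so real peaks never land in a boundary bin.
    lo = h.min() - bin_width
    hi = h.max() + 2 * bin_width
    nbins = max(3, int(round((hi - lo) / bin_width)))
    counts, edges = np.histogram(h, bins=nbins, range=(lo, hi))
    peaks = []
    for i in range(1, len(counts) - 1):
        if (counts[i] >= min_pixels
                and counts[i] >= counts[i - 1]
                and counts[i] > counts[i + 1]):
            peaks.append(0.5 * (edges[i] + edges[i + 1]))  # bin centre
    peaks.sort(reverse=True)   # highest first: rim (T1), then article tops
    if len(peaks) < 2:
        raise ValueError("expected at least the rim peak and one article peak")
    return peaks[1]            # T2: upper surface of the tallest article
```

With a synthetic height map whose pixels cluster at 0.30 m (rim), 0.20 m (tallest article), and 0.10 m (lower articles), the function returns approximately 0.20 m.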

In this embodiment the container C is made of plastic, and its inner surfaces Cn are flat. Because the inner surface of a plastic container C is generally very smooth, light reflected from the articles is mirrored on it, as shown in FIG. 3, readily producing reflected virtual images Bk of the articles.
If the article position recognition unit H1 recognizes article positions while such reflected virtual images Bk are present, it may take a reflected virtual image Bk for the image of a real article B (a real image) and recognize a position offset from the real image as the position of the article B.

In this embodiment, therefore, the article position recognition unit H1 extracts from the captured image an article presence region in which a real article B can exist, and executes the article position recognition processing on the extracted region.
The processing executed by the control unit H of the article position recognition device is described below with reference to the flowchart of FIG. 5.

When the control unit H detects, with a limit switch or the like provided on the conveyor 22C, that a container C has reached the picking stop position, it commands the imaging unit 50 to capture an image of the container C and acquires the captured image (step #1).
Next, the upper surface height detection unit H2 of the control unit H detects the height of the highest article upper surface in the container C (step #2), and a virtual frame line corresponding to that height is set in the captured image (step #3).
As shown in FIG. 6, the virtual frame line set in step #3 is the line of intersection between the plane containing the article upper surface detected by the upper surface height detection unit H2 and the inner surfaces Cn of the container C (in the top view of the container C, the contour line at the height of the article's upper surface). FIG. 6 shows, as broken lines Wk1 to Wk5, the virtual frame lines set for the upper surface height of each tier that can hold the highest article B when articles B are stacked in multiple tiers in the container C.

The control unit H then extracts the region inside the virtual frame line as the article presence region, in which a real article can exist (step #4). FIG. 7 shows the article presence region extracted by this processing (the region enclosed by the broken line Wk).
The article position recognition unit H1 then executes the article position recognition processing on the article presence region extracted in step #4 (step #5).
As shown in FIG. 7, this properly excludes the parts of the captured image where reflected virtual images Bk may occur.
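The extraction of step #4 amounts to masking the captured image to the frame rectangle before matching, so that the walls (and any reflected virtual image Bk on them) contribute nothing. A minimal sketch, assuming the frame line has already been reduced to a pixel rectangle:

```python
import numpy as np

def mask_outside_frame(image, rect_px):
    """Zero out everything outside the virtual frame line so that only the
    article presence region is passed to the position recognition step.
    A rectangular frame is an assumption for this sketch; the patent's
    frame line follows the container's top-view contour."""
    u0, v0, u1, v1 = (int(round(c)) for c in rect_px)
    out = np.zeros_like(image)
    out[v0:v1, u0:u1] = image[v0:v1, u0:u1]
    return out
```

Template matching run on the masked image then cannot fire on wall reflections, which is exactly the effect FIG. 7 illustrates.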

Incidentally, the region in which a real article B can exist is extracted from the captured image most reliably when the boundary Wt between the bottom surface Ct of the container C and its inner surfaces Cn, shown in FIG. 8, is used as the virtual frame line and the region inside it is taken as the article presence region. With the virtual frame line set that way, however, as shown in FIG. 9, an article Bi near the centre of the container C in top view lies in the article presence region inside the virtual frame line, but part of an article Bo adjacent to the inner surface Cn of the container C may fall outside the virtual frame line in the captured image.

An article Bo with a part outside the article presence region, as above, is not recognized as an article B by the pattern matching of the article position recognition processing; as a result, articles Bo near the inner surface Cn of the container C may never become retrieval targets and may remain in the container C.

In this embodiment, the virtual frame line set in the captured image is the intersection of the plane containing the highest article upper surface in the container C with the container's inner surfaces, and the region inside it is extracted as the article presence region. The parts of the captured image where reflected virtual images Bk may occur are thus properly excluded, while articles B adjacent to the inner surface Cn are still recognized as articles B, properly avoiding the problem of articles B remaining unpicked in the container C.

[Other embodiments]
(1) In the embodiment above, the virtual frame line is the intersection of the container's inner surfaces Cn with the plane at the height, detected by the upper surface height detection unit H2, of the upper surface of the highest article B in the container C, and the region inside it is extracted as the article presence region. The virtual frame line may instead be set by other methods: for example, by detecting the luminance of each pixel of the captured image and linking the points where the luminance of adjacent pixels changes by at least a predetermined amount. The boundary Wt between the bottom surface Ct and the inner surfaces Cn of the container C may also always be used as the virtual frame line, regardless of the height of the articles B.
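The luminance-based alternative in (1) can be sketched as a first step: flag the pixels where adjacent luminance jumps by at least a threshold. Linking those pixels into a closed frame line is omitted, and the threshold value is an assumption:

```python
import numpy as np

def luminance_edge_mask(gray, threshold=30):
    """Mark pixels whose luminance differs from a horizontal or vertical
    neighbour by at least `threshold` -- candidate points for the virtual
    frame line of other-embodiment (1).  Illustrative only; the patent
    does not give the threshold or the linking procedure."""
    g = np.asarray(gray, dtype=np.int32)
    edges = np.zeros(g.shape, dtype=bool)
    edges[:, 1:] |= np.abs(np.diff(g, axis=1)) >= threshold  # horizontal jumps
    edges[1:, :] |= np.abs(np.diff(g, axis=0)) >= threshold  # vertical jumps
    return edges
```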

(2) In the embodiment above, the article position recognition unit H1 obtains the virtual frame line as the intersection of the plane containing the article upper surface detected by the upper surface height detection unit H2 with the inner surfaces Cn of the container C, but the configuration is not limited to this. For example, the host management device Hu may manage, on the basis of the past picking history, which tier from the bottom of the container C the topmost articles B occupy (referred to as tier-count information), and the article position recognition unit H1 may compute from the tier-count information obtained from the host management device Hu the height of the upper surface of the topmost articles B in the container C, and take as the virtual frame line the intersection of the container's inner surfaces Cn with the plane that is parallel to the conveying surface of the conveyor 22C and passes through that height.

50 Imaging unit
B Article
C Container
Cn Inner surface
H1 Article position recognition unit
H2 Upper surface height detection unit

Claims (2)

1. An article position recognition device comprising:
an imaging unit that images, from above, a container whose inner surfaces are flat and whose top is open; and
an article position recognition unit that executes article position recognition processing to recognize, on the basis of an image captured by the imaging unit, the position of an article held in the container,
wherein the article position recognition unit extracts from the captured image an article presence region in which a real article can exist, and executes the article position recognition processing on the extracted region.
2. The article position recognition device according to claim 1, further comprising an upper-surface height detection unit that detects the height of the upper surface of the article located at the highest position among the upper surfaces of the articles in the container,
wherein the article position recognition unit sets, in the captured image, a virtual frame line that is the line of intersection between a plane including the upper surface of the article detected by the upper-surface height detection unit and the inner side surface of the container, and extracts, as the article presence region, the region of the captured image inward of the virtual frame line.
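The claimed region extraction can be sketched as follows, under an assumed straight-down pinhole camera: the frame-line corners at the top-surface height are projected into the captured image, and only pixels inside the projected frame are kept as the article presence region, so reflected virtual images of articles on the inner side surfaces (which appear outside the line) are discarded. The camera parameters (`focal_px`, `cam_height_mm`, `cx`, `cy`) are invented for the example and are not taken from the publication.

```python
# Illustrative sketch (assumed camera model) of extracting the article
# presence region inside the virtual frame line.

def project(point_xyz, focal_px=800.0, cam_height_mm=1500.0,
            cx=320.0, cy=240.0):
    """Project a 3-D point (mm, camera looking straight down) to pixels."""
    x, y, z = point_xyz
    depth = cam_height_mm - z  # camera-to-point distance along the axis
    return (cx + focal_px * x / depth, cy + focal_px * y / depth)

def article_presence_mask(width, height, frame_corners_px):
    """Boolean mask over the captured image: True inside the (here
    axis-aligned) projected virtual frame line, where a real article
    can appear; wall reflections fall outside and are ignored."""
    xs = [p[0] for p in frame_corners_px]
    ys = [p[1] for p in frame_corners_px]
    x0, x1 = min(xs), max(xs)
    y0, y1 = min(ys), max(ys)
    return [[x0 <= u <= x1 and y0 <= v <= y1 for u in range(width)]
            for v in range(height)]

# Project the frame-line corners (at the top-surface height z = 360 mm,
# coordinates relative to the optical axis) and build the mask.
corners = [project((x, y, 360.0)) for (x, y) in
           [(-100, -100), (100, -100), (100, 100), (-100, 100)]]
mask = article_presence_mask(640, 480, corners)
```

Because higher points project farther from the image center, the frame line at the articles' top-surface height is the tightest boundary that still contains every real article top, which is exactly why the region inward of it excludes the wall reflections.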
JP2013174937A 2013-08-26 2013-08-26 Article position recognition device Active JP6167760B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2013174937A JP6167760B2 (en) 2013-08-26 2013-08-26 Article position recognition device


Publications (2)

Publication Number Publication Date
JP2015042965A true JP2015042965A (en) 2015-03-05
JP6167760B2 JP6167760B2 (en) 2017-07-26

Family

ID=52696542

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2013174937A Active JP6167760B2 (en) 2013-08-26 2013-08-26 Article position recognition device

Country Status (1)

Country Link
JP (1) JP6167760B2 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019147644A (en) * 2018-02-26 2019-09-05 株式会社東芝 Control device, program and system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH02256485A (en) * 1989-03-29 1990-10-17 Toyoda Mach Works Ltd Robot with visual device
JP2010071743A (en) * 2008-09-17 2010-04-02 Yaskawa Electric Corp Method of detecting object, object detection device and robot system
JP2012002683A (en) * 2010-06-17 2012-01-05 Fuji Electric Co Ltd Stereo image processing method and stereo image processing device
JP2013154457A (en) * 2012-01-31 2013-08-15 Asahi Kosan Kk Workpiece transfer system, workpiece transfer method, and program
JP2013158873A (en) * 2012-02-03 2013-08-19 Fanuc Ltd Image processing device provided with function for automatically adjusting search window

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2016113836A1 (en) * 2015-01-13 2017-06-22 株式会社日立製作所 Manipulator control method, system, and manipulator
JP2020040789A (en) * 2018-09-11 2020-03-19 株式会社東芝 Unloading device and detecting method of cargo
JP2022009289A (en) * 2018-09-11 2022-01-14 株式会社東芝 Unloading device and detecting method of cargo
JP7170818B2 (en) 2018-09-11 2022-11-14 株式会社東芝 Unloading device and cargo detection method
WO2020105295A1 (en) * 2018-11-21 2020-05-28 Thk株式会社 Image information processing device, holding system, and image information processing method
JP2020082253A (en) * 2018-11-21 2020-06-04 Thk株式会社 Image information processor, holding system, and image information processing method
CN113165187A (en) * 2018-11-21 2021-07-23 Thk株式会社 Image information processing device, gripping system, and image information processing method
US11607803B2 (en) 2018-11-21 2023-03-21 Thk Co., Ltd. Image information processing device, gripping system, and image information processing method
CN113165187B (en) * 2018-11-21 2023-07-21 Thk株式会社 Image information processing device, gripping system, and image information processing method

Also Published As

Publication number Publication date
JP6167760B2 (en) 2017-07-26

Similar Documents

Publication Publication Date Title
US11288810B2 (en) Robotic system with automated package registration mechanism and methods of operating the same
US11001441B2 (en) Cargo handling apparatus and method
US10124489B2 (en) Locating, separating, and picking boxes with a sensor-guided robot
JP6167760B2 (en) Article position recognition device
WO2021249568A1 (en) Warehouse robot control method and apparatus, device and readable storage medium
JP7206421B2 (en) Smart forklift and detection method of container position and orientation deviation
US20230260071A1 (en) Multicamera image processing
TW202221632A (en) Method and apparatus for storing material, robot, warehousing system and storage medium
JP2017100214A (en) Manipulator system, imaging system, object delivery method, and manipulator control program
JP7062406B2 (en) Information processing equipment and robot arm control system
WO2022021561A1 (en) Goods sorting system and sorting method
JP6643921B2 (en) Sorting device and article removal method
Pan et al. Manipulator package sorting and placing system based on computer vision
US11911919B2 (en) Method and computing system for performing grip region detection
WO2018225827A1 (en) Workpiece recognition device, and method for recognizing workpiece
JP7021620B2 (en) Manipulators and mobile robots
WO2022264726A1 (en) Transport system, method executed by computer for controlling transport of articles, and program for causing computer to execute this method
JP7398662B2 (en) Robot multi-sided gripper assembly and its operating method
US20230071488A1 (en) Robotic system with overlap processing mechanism and methods for operating the same
US20230364787A1 (en) Automated handling systems and methods
US20220375097A1 (en) Robotic system for object size detection
JP2021024679A (en) Cargo handling control device, cargo handling system, and program

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20151120

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20161026

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20161101

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20161228

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20170530

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20170612

R150 Certificate of patent or registration of utility model

Ref document number: 6167760

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250
