WO2023026452A1 - Three-dimensional data acquisition device - Google Patents

Three-dimensional data acquisition device Download PDF

Info

Publication number
WO2023026452A1
Authority
WO
WIPO (PCT)
Prior art keywords
dimensional
pixel
distance
image
workpiece
Prior art date
Application number
PCT/JP2021/031437
Other languages
French (fr)
Japanese (ja)
Inventor
澤源 孫
俊之 安藤
Original Assignee
ファナック株式会社 (FANUC Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ファナック株式会社 (FANUC Corporation) filed Critical ファナック株式会社
Priority to PCT/JP2021/031437 priority Critical patent/WO2023026452A1/en
Priority to TW111130663A priority patent/TW202308820A/en
Publication of WO2023026452A1 publication Critical patent/WO2023026452A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images

Definitions

  • the present disclosure relates to a three-dimensional data acquisition device.
  • a method of performing three-dimensional measurement of a workpiece by combining a monocular camera with a horizontal movement device using the monocular stereo method is known (see Patent Document 1, for example).
  • One aspect of the present disclosure includes a single two-dimensional camera that photographs a workpiece, a movement mechanism that relatively moves the workpiece and the two-dimensional camera in one direction, and a control unit. The control unit detects the arrival of the workpiece within the field of view of the two-dimensional camera by a change in the pixel value of pixels near the upstream end, in the direction of relative movement, of a two-dimensional image acquired by the two-dimensional camera. The three-dimensional data acquisition device then generates three-dimensional data of the workpiece based on a first image acquired by the two-dimensional camera at a first position to which the workpiece has moved a predetermined first distance from its arrival position, a second image acquired by the two-dimensional camera at a second position to which the workpiece has moved a predetermined second distance from the arrival position, and the difference distance between the second distance and the first distance.
  • FIG. 1 is an overall configuration diagram of a three-dimensional data acquisition device according to a first embodiment of the present disclosure
  • FIG. 2 is a block diagram showing the configuration of a control unit in FIG. 1
  • FIG. 3 is a schematic diagram showing a two-dimensional image acquired by the two-dimensional camera of FIG. 1
  • FIG. 4 is a schematic diagram showing a state in which a work has arrived in the field of view of the two-dimensional camera of FIG. 1
  • FIG. 5 is a schematic diagram showing a two-dimensional image acquired in the state of FIG. 4
  • FIG. 6 is a schematic diagram showing a state in which the work has entered further into the field of view of the two-dimensional camera than in the state of FIG. 4
  • FIG. 7 is a schematic diagram showing a two-dimensional image acquired in the state of FIG. 6
  • FIG. 8 is a schematic diagram showing a state in which a work is placed at a first position within the field of view of the two-dimensional camera of FIG. 1
  • FIG. 9 is a schematic diagram showing a two-dimensional image acquired in the state of FIG. 8
  • FIG. 10 is a schematic diagram showing a state in which a work is placed at a second position within the field of view of the two-dimensional camera of FIG. 1
  • FIG. 11 is a schematic diagram showing a two-dimensional image acquired in the state of FIG. 10
  • FIG. 12 is a flowchart explaining the operation of the three-dimensional data acquisition device of FIG. 1
  • FIG. 13 is a schematic diagram showing a two-dimensional image acquired by the two-dimensional camera of a modified example of the three-dimensional data acquisition device of FIG. 1
  • FIG. 14 is a partial configuration diagram showing a state in which a work is placed near the optical axis within the field of view of the two-dimensional camera of FIG. 13
  • FIG. 15 is an overall configuration diagram of a three-dimensional data acquisition device according to a second embodiment of the present disclosure
  • the three-dimensional data acquisition apparatus 1 includes a conveyor (movement mechanism) 2 that conveys a work W, a single two-dimensional camera 3 installed above the conveyor 2 facing downward, and a control unit 4.
  • the conveyor 2 is, for example, a belt conveyor, and includes a belt 21 for carrying the work W thereon and conveying it in one horizontal direction.
  • the belt 21 is driven at a constant speed by a motor (not shown).
  • the two-dimensional camera 3 is attached, for example, to a stand R arranged above the belt 21 of the conveyor 2.
  • the two-dimensional camera 3 has a downward field of view V with its optical axis X oriented vertically, and is arranged so that the work W conveyed by the belt 21 passes near the center of the field of view V.
  • the control unit 4 includes at least one memory that stores pre-taught programs and at least one processor that executes the programs. As shown in FIG. 2, the control unit 4 includes a camera control unit 41 that controls the two-dimensional camera 3, an image processing unit 42 that processes the two-dimensional images acquired by the two-dimensional camera 3, a calculation unit 43, a timer 44, a storage unit 45, and a three-dimensional data generation unit 46.
  • the camera control unit 41 transmits a control command to the two-dimensional camera 3 and sets the two-dimensional camera 3 to acquire a two-dimensional image at a predetermined frame rate.
  • the two-dimensional camera 3 transmits the acquired two-dimensional image to the image processing unit 42 each time it acquires a two-dimensional image.
  • the image processing unit 42 processes each two-dimensional image received from the two-dimensional camera 3 and, for example, as shown in FIG. 3, extracts the pixel values of a pixel row (first pixels) A composed of a plurality of pixels arranged perpendicular to the transport direction, along the upstream end of each two-dimensional image in the transport direction of the work W. Similarly, the image processing unit 42 extracts the pixel values of a pixel row (second pixels) B consisting of a plurality of pixels arranged parallel to the pixel row A at a distance of a predetermined number of pixels in the transport direction. The extracted pixel values are sent to the calculation unit 43.
  • the calculation unit 43 determines whether or not the pixel value of any pixel in the pixel row A has changed compared to the two-dimensional image acquired one frame before. Then, when there is a change in the pixel value, it is determined that the work W has arrived within the visual field V, and the first time obtained by the timer 44 at that time is stored. Further, the calculation unit 43 also compares the pixel value of any pixel in the pixel row B with the two-dimensional image acquired one frame before and determines whether or not it has changed. Then, when there is a change in the pixel value, it is determined that the workpiece W has reached a predetermined position within the visual field V, and the second time obtained by the timer 44 at that time is stored.
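The frame-to-frame change test described above can be sketched as follows. This is a minimal illustration with pure-Python "frames" as lists of pixel rows; the change threshold and image layout are assumptions, not taken from the patent:

```python
def column_changed(prev_frame, cur_frame, col, threshold=10):
    """Return True if any pixel in image column `col` differs by more than
    `threshold` gray levels between two consecutive frames (frames are
    row-major lists of pixel rows)."""
    return any(abs(cur[col] - prev[col]) > threshold
               for prev, cur in zip(prev_frame, cur_frame))

# A dark workpiece edge entering the column that plays the role of pixel row A.
prev = [[200] * 8 for _ in range(4)]   # empty belt, uniform background
cur = [row[:] for row in prev]
cur[1][0] = cur[2][0] = 40             # leading edge reaches image column 0
print(column_changed(prev, cur, col=0))   # True  -> work W has arrived
print(column_changed(prev, cur, col=5))   # False -> row B not yet reached
```

Comparing against the previous frame rather than a fixed reference makes the test robust to slow lighting drift, which may be one reason the patent specifies the one-frame-before comparison.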
  • the calculation unit 43 calculates the transport speed of the workpiece W based on the stored first time and second time and the actual distance corresponding to the number of pixels between the pixel array A and the pixel array B.
  • the correspondence relationship between the distance between pixels in the two-dimensional image and the actual movement distance of the work W by the conveyor 2 is stored in advance in the calculation unit 43 by calibration.
  • based on the calculated transport speed of the work W, the calculation unit 43 calculates a first time required for the work W to move the first distance D1 from the arrival position at which it entered the field of view V, as shown in FIG. 4, to a preset first position, as shown in FIG. 8. Similarly, the calculation unit 43 calculates a second time required for the work W to move the second distance D2 from the arrival position to a preset second position, as shown in FIG. 10. Then, the two-dimensional images acquired by the two-dimensional camera 3 when the first time and the second time have elapsed from the time at which the work W arrived in the field of view V are stored in the storage unit 45 as the first image P1 and the second image P2, respectively.
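The speed and timing computation described above is simple arithmetic: the calibrated distance between rows A and B divided by the elapsed time gives the belt speed, and D1 and D2 divided by that speed give the two capture delays. A sketch with hypothetical calibration values (the pixel gap, mm-per-pixel scale, and distances are all invented for illustration):

```python
def transport_speed(pixel_gap, mm_per_pixel, t1, t2):
    """Belt speed from the time the work's leading edge takes to cross the
    calibrated gap between pixel rows A and B."""
    return (pixel_gap * mm_per_pixel) / (t2 - t1)

def capture_delays(speed, d1, d2):
    """First and second times: delays after arrival at which the first and
    second images should be taken."""
    return d1 / speed, d2 / speed

# Hypothetical calibration: rows A and B are 50 px apart at 0.4 mm/px.
v = transport_speed(pixel_gap=50, mm_per_pixel=0.4, t1=0.00, t2=0.10)
print(v)                                     # 200.0 mm/s
print(capture_delays(v, d1=30.0, d2=150.0))  # (0.15, 0.75) seconds
```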
  • the first position is set near the upstream end of the field of view V, at a position where the trailing edge of the work W fits within the field of view V.
  • the second position is set near the downstream end of the field of view V, at a position where the leading edge of the work W does not leave the field of view V.
  • the three-dimensional data generation unit 46 generates the three-dimensional data of the work W by the monocular stereo method, using the first image P1 and the second image P2 stored in the storage unit 45 and the amount of movement of the work W between the first image P1 and the second image P2 (the difference distance between the second distance D2 and the first distance D1).
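The patent does not spell out the monocular stereo computation itself, but the standard relation behind it treats the two captures as a stereo pair whose baseline is the difference distance D2 − D1: a feature's depth follows from how far it shifts between the images. A hedged sketch of that textbook relation (focal length and disparity values are invented):

```python
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Textbook stereo relation Z = f * B / d: a feature that shifts by
    `disparity_px` pixels between the two images, taken `baseline_mm`
    apart along the transport direction, lies at depth Z from the camera.
    This is the general principle, not the patent's exact implementation."""
    return focal_px * baseline_mm / disparity_px

# The baseline is the difference distance D2 - D1 between the capture positions.
baseline = 150.0 - 30.0   # mm (hypothetical D2 and D1)
print(depth_from_disparity(focal_px=1200.0, baseline_mm=baseline,
                           disparity_px=300.0))   # 480.0 mm from the camera
```

Because the baseline comes from the belt motion rather than a second camera, the accuracy of the difference distance directly bounds the accuracy of the recovered depth, which is why the later embodiments refine how the two capture moments are determined.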
  • when a program pre-stored in the control unit 4 of the three-dimensional data acquisition device 1 according to the present embodiment is executed, the conveyor 2 is operated (step S1), and the workpieces W on the belt 21 are each transported at a constant speed toward the field of view V from its upstream side.
  • the timer 44 starts timing, and the two-dimensional camera 3 starts photographing within the field of view V at a constant frame rate in accordance with a control command from the camera control unit 41, so that two-dimensional images are acquired sequentially (step S2).
  • the acquired two-dimensional images are sequentially sent to the image processing unit 42 and processed.
  • in the image processing unit 42, for example, binarization processing is performed.
  • the processed two-dimensional image information is sent to the calculation unit 43, and the calculation unit 43 determines whether or not the pixel value of any pixel in the pixel row A of each two-dimensional image has changed (step S3). This determination is repeated until a change in the pixel value of any pixel in the pixel row A is detected.
  • when a change in the pixel value of any pixel in the pixel row A of the two-dimensional image is detected, the calculation unit 43 determines that the work W has arrived in the field of view V. Then, the time at which the work W arrived in the field of view V is acquired by the timer 44 and stored in the calculation unit 43 as the first time (step S4).
  • the calculation unit 43 performs the same determination as that for the pixel row A regarding the change in the pixel value of any pixel in the pixel row B, which is separated from the pixel row A by a predetermined number of pixels in the transport direction (step S5).
  • when a change in the pixel value of any pixel in the pixel row B is detected, the calculation unit 43 determines that the work W has reached the predetermined position within the field of view V. Then, the time at which the work W reached that position is acquired by the timer 44 and stored in the calculation unit 43 as the second time (step S6).
  • the transport speed of the work W is calculated by dividing the actual distance corresponding to the number of pixels between the pixel row A and the pixel row B by the difference time obtained by subtracting the first time from the second time (step S7). Based on the calculated transport speed of the work W, the calculation unit 43 calculates the first time period and the second time period required for the work W to move the first distance D1 and the second distance D2 from the position where it arrived in the field of view V, as shown in FIGS. 8 and 10 (step S8).
  • the calculation unit 43 determines whether or not the first time period has elapsed since the first time at which the work W arrived in the field of view V (step S9). When it detects that the first time period has elapsed, the two-dimensional image acquired by the two-dimensional camera 3 at that point is stored in the storage unit 45 as the first image P1 (step S10).
  • the calculation unit 43 then determines whether or not the second time period has elapsed from the first time (step S11). When it detects that the second time period has elapsed, the two-dimensional image acquired at that point is stored in the storage unit 45 as the second image P2 (step S12). That is, the storage unit 45 stores the first image P1 and the second image P2, which capture the work W moving within the field of view V from two different angles.
  • the three-dimensional data generation unit 46 generates the three-dimensional data of the work W by the monocular stereo method, using the first image P1 and the second image P2 stored in the storage unit 45 and the amount of movement of the work W between the images P1 and P2 (the difference distance between the second distance D2 and the first distance D1) (step S13). This terminates the program.
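Steps S7 through S12 amount to a small scheduling computation over frame indices. A minimal sketch under assumed numbers (the frame interval, row gap, and distances are all hypothetical, and a constant frame rate is assumed as in the embodiment):

```python
def schedule_captures(arrival_idx, rowb_idx, frame_dt, gap_mm, d1_mm, d2_mm):
    """From the frame indices at which pixel rows A and B first changed,
    work out the frame indices at which to store P1 and P2."""
    speed = gap_mm / ((rowb_idx - arrival_idx) * frame_dt)   # step S7: belt speed
    t1 = d1_mm / speed                                       # step S8: first time period
    t2 = d2_mm / speed                                       #          second time period
    p1_idx = arrival_idx + round(t1 / frame_dt)              # step S10: store P1 here
    p2_idx = arrival_idx + round(t2 / frame_dt)              # step S12: store P2 here
    return p1_idx, p2_idx

# Row A changed at frame 10, row B at frame 15; the rows are 20 mm apart (calibrated).
print(schedule_captures(arrival_idx=10, rowb_idx=15, frame_dt=0.02,
                        gap_mm=20.0, d1_mm=32.0, d2_mm=160.0))   # (18, 50)
```

Rounding to the nearest frame means the actual baseline can differ slightly from D2 − D1; a higher frame rate shrinks that quantization error.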
  • in this way, the single two-dimensional camera 3, which is required in any case for acquiring three-dimensional data by the monocular stereo method, can also be used to detect whether or not the work W has arrived in the field of view V. Therefore, since no special component such as a photoelectric sensor is needed to acquire the position information of the workpiece W being conveyed, the three-dimensional data acquisition apparatus 1 can be configured simply.
  • in the present embodiment, the transport speed of the work W is calculated based on changes in the pixel values of the pixel row A arranged along the upstream end of the two-dimensional image and of the pixel row B separated from the pixel row A by several pixels on the downstream side.
  • instead of this, as shown in FIG. 13, the transport speed may be calculated based on changes in the pixel values of a pixel row (first pixels) C and a pixel row (second pixels) D arranged near the position corresponding to the optical axis X.
  • when the leading edge of the work W coincides with the positions corresponding to the pixel rows C and D, the angle θ formed by the optical axis X and the straight line connecting the leading edge of the work W to the center of the two-dimensional camera 3 can be made small. Therefore, even when the dimension of the work W in the direction of the optical axis X, that is, its thickness, is large, the position of the leading edge of the work W can be accurately detected from the two-dimensional image. In particular, even when the distance between the work W and the two-dimensional camera 3 is short, the detection accuracy of the leading-edge position of the work W can be kept high.
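The benefit of a small viewing angle θ can be quantified with elementary geometry: a workpiece of thickness h whose top edge is viewed at angle θ off the optical axis appears shifted on the belt plane by roughly h·tan θ. A sketch with invented numbers (this formula is my geometric reading of the passage, not stated in the patent):

```python
import math

def edge_shift(thickness_mm, angle_deg):
    """Approximate apparent shift of a tall workpiece's leading edge on the
    belt plane when its top edge is viewed at `angle_deg` off the optical
    axis: shift ~= h * tan(theta)."""
    return thickness_mm * math.tan(math.radians(angle_deg))

# A 50 mm-tall workpiece detected near the edge of the field of view
# (theta = 30 deg) versus near the optical axis (theta = 5 deg):
print(round(edge_shift(50.0, 30.0), 1))   # 28.9 mm of apparent edge shift
print(round(edge_shift(50.0, 5.0), 1))    # 4.4 mm -> far better edge localization
```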
  • in this case, the positional relationship may be adjusted so that the trailing edge of the work W falls within the field of view V, and the two-dimensional image acquired at that point may be used as the first image P1.
  • FIG. 13 illustrates the case where the pixel rows C and D are both arranged on the upstream side of the optical axis X; instead, both pixel rows C and D may be arranged on the downstream side of the optical axis X, or the pixel row C and the pixel row D may be arranged on the upstream side and the downstream side, respectively, with the optical axis X interposed between them.
  • the thickness dimension of the work W has a greater effect on the detection accuracy of the position of the work W in the region on the downstream side of the optical axis X than in the region on the upstream side.
  • in the present embodiment, the times at which the first image P1 and the second image P2 are acquired are calculated based on the first time and the second time required for the work W to move the first distance D1 and the second distance D2, respectively.
  • instead of this, the time at which to acquire the second image P2 may be calculated based on a third time required for the work W to move a third distance D3 from the first position to the second position.
  • in this case, the first time required for the work W to move the first distance D1 is calculated from the transport speed of the work W, and, in the same manner as described above, the two-dimensional image acquired when the first time has elapsed from the first time at which the work W arrived in the field of view V is taken as the first image P1.
  • for the second image P2, first, the third distance D3, which is the difference between the second distance D2 from the arrival position of the work W to the second position and the first distance D1 from the arrival position to the first position, is calculated. Then, based on the transport speed of the work W, the third time required for the work W to move the third distance D3 is calculated. The two-dimensional image acquired when the third time has elapsed since the work W was placed at the first position can then be used as the second image P2.
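The modified timing reduces to a one-line computation: schedule P2 a third time t3 = D3 / v after P1, with D3 = D2 − D1. A sketch with hypothetical distances and belt speed:

```python
def p2_delay_after_p1(d1_mm, d2_mm, speed_mm_s):
    """Modified timing: the second image is scheduled a third time
    t3 = D3 / v after the first image, where D3 = D2 - D1."""
    d3 = d2_mm - d1_mm          # third distance D3
    return d3 / speed_mm_s      # third time t3

# Hypothetical D1, D2, and transport speed:
print(p2_delay_after_p1(d1_mm=30.0, d2_mm=150.0, speed_mm_s=200.0))  # 0.6 s after P1
```

Timing P2 relative to P1 means any error in detecting the arrival moment cancels out of the interval between the two captures, which is the stabilization benefit the text describes.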
  • since the acquisition time of the second image P2 is calculated based on the acquisition time of the first image P1, the relative timing of the first image P1 and the second image P2 can be stabilized, and the three-dimensional data of the work W can be generated with higher accuracy.
  • the first position is set in advance, but instead of this, the first position may be determined based on changes in pixel values in the pixel row A of the two-dimensional image.
  • that is, by detecting two changes in the pixel values in the pixel row A, the calculation unit 43 can detect both that the work W has arrived in the field of view V and that the entire work W has entered the field of view V.
  • as a result, the first image P1 can be acquired with the entire work W reliably contained within the field of view V even if the dimension of the work W in the transport direction, that is, its length, is unknown.
  • the second position may also be determined based on changes in pixel values of pixel rows arranged along the downstream end of the two-dimensional image, without being set in advance.
  • that is, a pixel row equivalent to the pixel row A is set a few pixels upstream of the downstream end of the two-dimensional image, and by detecting a change in the pixel values of that pixel row, it is detected that the leading edge of the work W is positioned just before the downstream end of the field of view V. The two-dimensional image acquired at that time is then used as the second image P2.
  • in this way, the second image P2 can be reliably acquired before the leading edge of the work W moves out of the field of view V, so the three-dimensional data of the work W can be generated more accurately, and the processing steps of the three-dimensional data acquisition device 1 can also be simplified.
  • a three-dimensional data acquisition device 1' according to a second embodiment of the present disclosure will be described below with reference to the drawings.
  • in the three-dimensional data acquisition device 1' according to this embodiment, parts common to the three-dimensional data acquisition device 1 according to the first embodiment described above are given the same reference numerals, and description thereof is omitted.
  • the three-dimensional data acquisition apparatus 1' according to the present embodiment differs from the three-dimensional data acquisition device 1 according to the first embodiment in that the conveyor 2 includes an encoder 22 that detects the amount of rotation of the belt 21.
  • a workpiece W is conveyed from the upstream side of the field of view V of the two-dimensional camera 3.
  • Two-dimensional images within the field of view V are sequentially acquired by the two-dimensional camera 3, and each acquired two-dimensional image is processed by the image processing unit 42 and then sent to the calculation unit 43.
  • the arrival of the workpiece W into the visual field V is detected based on the change in the pixel value of any pixel in the pixel array A within the two-dimensional image.
  • the calculation unit 43 receives information on the amount of rotation of the belt 21 output from the encoder 22. Then, based on the received information, the calculation unit 43 detects that the work W has moved the first distance D1 from the position where it arrived in the field of view V and has been placed at the first position, and stores the two-dimensional image acquired at that point in the storage unit 45 as the first image P1.
  • similarly, based on the information received from the encoder 22, the calculation unit 43 detects that the work W has been placed at the second position, which is the second distance D2 from the arrival position, and stores the second image P2 acquired at that point in the storage unit 45.
  • the three-dimensional data generation unit 46 generates the three-dimensional data of the work W by the monocular stereo method, using the first image P1 and the second image P2 stored in the storage unit 45 and the amount of movement of the work W between the images P1 and P2 (the difference distance between the second distance D2 and the first distance D1).
  • according to the present embodiment, the first distance D1, which is the amount of movement of the work W from the position where it arrived in the field of view V to the first position, and the second distance D2, which is the amount of movement to the second position, can be obtained directly from the output of the encoder 22.
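The second embodiment's trigger logic can be sketched as follows: with the encoder count zeroed at the moment the work arrives, P1 and P2 are stored at the first samples whose cumulative count reaches the counts corresponding to D1 and D2. The counts-per-millimeter calibration and the count stream are invented for illustration:

```python
def encoder_capture_indices(counts_per_mm, d1_mm, d2_mm, count_stream):
    """Return the indices of the first samples at which the cumulative
    encoder count (zeroed at arrival) reaches the counts for D1 and D2."""
    c1, c2 = d1_mm * counts_per_mm, d2_mm * counts_per_mm
    p1 = p2 = None
    for i, c in enumerate(count_stream):
        if p1 is None and c >= c1:
            p1 = i                  # sample index at which to store P1
        if p2 is None and c >= c2:
            p2 = i                  # sample index at which to store P2
    return p1, p2

# Cumulative encoder counts sampled once per frame (hypothetical values):
counts = [0, 40, 80, 120, 160, 200, 240, 280, 320]
print(encoder_capture_indices(counts_per_mm=2.0, d1_mm=30.0, d2_mm=150.0,
                              count_stream=counts))   # (2, 8)
```

Because the encoder measures displacement directly, this variant does not depend on the belt speed staying constant between arrival and capture, which matches the accuracy advantage the text claims.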
  • the first position and the second position of the workpiece W within the visual field V can be accurately detected. Therefore, the first image P1 and the second image P2 used to generate the three-dimensional data of the work W can be acquired more accurately, and the accuracy of the three-dimensional data of the work W generated by the three-dimensional data generation unit 46 can be improved. It has the advantage that it can be improved.

Abstract

Provided is a three-dimensional data acquisition device (1) comprising: a single two-dimensional camera (3) that images a workpiece (W); a movement mechanism (2) that moves the workpiece (W) and the two-dimensional camera (3) relatively in one direction; and a control unit (4). The control unit (4) detects the arrival of the workpiece (W) in a field of view (V) of the two-dimensional camera (3) by a change in a pixel value of a pixel near an upstream end in the relative movement direction of the workpiece (W) in a two-dimensional image acquired by the two-dimensional camera (3). The control unit also generates three-dimensional data of the workpiece (W) on the basis of a first image acquired by the two-dimensional camera (3) at a first position to which the workpiece (W) moved from the arrival position by only a prescribed first distance, a second image acquired by the two-dimensional camera (3) at a second position to which the workpiece (W) moved from the arrival position by only a prescribed second distance, and a differential distance between the second distance and the first distance.

Description

3D data acquisition device
The present disclosure relates to a three-dimensional data acquisition device.
A method of performing three-dimensional measurement of a workpiece by combining a monocular camera with a horizontal movement device using the monocular stereo method is known (see Patent Document 1, for example).
JP 2017-162133 A
In the method of Patent Document 1, in order to photograph the work at two points as far apart as possible within the field of view of the camera, marks are provided on the work, or on a holder that holds the work, on the upstream side and the downstream side in the work conveying direction. However, when the works are conveyed by a conveyor or the like, each work must be marked, or each work must be mounted on a holder with marks, which complicates the work and increases its cost.
Therefore, it is desirable to perform three-dimensional measurement of a work accurately with a monocular camera, without marking the work.
One aspect of the present disclosure is a three-dimensional data acquisition device including a single two-dimensional camera that photographs a workpiece, a movement mechanism that relatively moves the workpiece and the two-dimensional camera in one direction, and a control unit. The control unit detects the arrival of the workpiece within the field of view of the two-dimensional camera by a change in the pixel value of pixels near the upstream end, in the direction of relative movement, of a two-dimensional image acquired by the two-dimensional camera, and generates three-dimensional data of the workpiece based on a first image acquired by the two-dimensional camera at a first position to which the workpiece has moved a predetermined first distance from its arrival position, a second image acquired by the two-dimensional camera at a second position to which the workpiece has moved a predetermined second distance from the arrival position, and the difference distance between the second distance and the first distance.
FIG. 1 is an overall configuration diagram of a three-dimensional data acquisition device according to a first embodiment of the present disclosure. FIG. 2 is a block diagram showing the configuration of the control unit in FIG. 1. FIG. 3 is a schematic diagram showing a two-dimensional image acquired by the two-dimensional camera of FIG. 1. FIG. 4 is a schematic diagram showing a state in which a work has arrived in the field of view of the two-dimensional camera of FIG. 1. FIG. 5 is a schematic diagram showing a two-dimensional image acquired in the state of FIG. 4. FIG. 6 is a schematic diagram showing a state in which the work has entered further into the field of view of the two-dimensional camera than in the state of FIG. 4. FIG. 7 is a schematic diagram showing a two-dimensional image acquired in the state of FIG. 6. FIG. 8 is a schematic diagram showing a state in which a work is placed at a first position within the field of view of the two-dimensional camera of FIG. 1. FIG. 9 is a schematic diagram showing a two-dimensional image acquired in the state of FIG. 8. FIG. 10 is a schematic diagram showing a state in which a work is placed at a second position within the field of view of the two-dimensional camera of FIG. 1. FIG. 11 is a schematic diagram showing a two-dimensional image acquired in the state of FIG. 10. FIG. 12 is a flowchart explaining the operation of the three-dimensional data acquisition device of FIG. 1. FIG. 13 is a schematic diagram showing a two-dimensional image acquired by the two-dimensional camera of a modified example of the three-dimensional data acquisition device of FIG. 1. FIG. 14 is a partial configuration diagram showing a state in which a work is placed near the optical axis within the field of view of the two-dimensional camera of FIG. 13. FIG. 15 is an overall configuration diagram of a three-dimensional data acquisition device according to a second embodiment of the present disclosure.
A three-dimensional data acquisition device 1 according to a first embodiment of the present disclosure will be described below with reference to the drawings.
As shown in FIG. 1, for example, the three-dimensional data acquisition device 1 according to the present embodiment includes a conveyor (movement mechanism) 2 that conveys a work W, a single two-dimensional camera 3 installed above the conveyor 2 facing downward, and a control unit 4.
The conveyor 2 is, for example, a belt conveyor, and includes a belt 21 that carries the work W and conveys it in one horizontal direction. The belt 21 is driven at a constant speed by a motor (not shown).
The two-dimensional camera 3 is attached, for example, to a stand R arranged above the belt 21 of the conveyor 2. The two-dimensional camera 3 has a downward field of view V with its optical axis X oriented vertically, and is arranged so that the work W conveyed by the belt 21 passes near the center of the field of view V.
The control unit 4 includes at least one memory that stores pre-taught programs and at least one processor that executes the programs. As shown in FIG. 2, the control unit 4 includes a camera control unit 41 that controls the two-dimensional camera 3, an image processing unit 42 that processes the two-dimensional images acquired by the two-dimensional camera 3, a calculation unit 43, a timer 44, a storage unit 45, and a three-dimensional data generation unit 46.
The camera control unit 41 transmits a control command to the two-dimensional camera 3 to set it to acquire two-dimensional images at a predetermined frame rate. Each time the two-dimensional camera 3 acquires a two-dimensional image, it transmits the acquired image to the image processing unit 42.
The image processing unit 42 processes each two-dimensional image received from the two-dimensional camera 3 and, for example, as shown in FIG. 3, extracts the pixel values of a pixel row (first pixels) A composed of a plurality of pixels arranged perpendicular to the transport direction, along the upstream end of each two-dimensional image in the transport direction of the work W. Similarly, the image processing unit 42 extracts the pixel values of a pixel row (second pixels) B consisting of a plurality of pixels arranged parallel to the pixel row A at a distance of a predetermined number of pixels in the transport direction. The extracted pixel values are sent to the calculation unit 43.
The calculation unit 43 determines whether the pixel value of any pixel in the pixel row A has changed compared with the two-dimensional image acquired one frame earlier. When a pixel value has changed, it determines that a workpiece W has arrived within the field of view V, and stores the first time acquired by the timer 44 at that moment.
Further, the calculation unit 43 likewise determines whether the pixel value of any pixel in the pixel row B has changed compared with the two-dimensional image acquired one frame earlier. When a pixel value has changed, it determines that the workpiece W has reached a predetermined position within the field of view V, and stores the second time acquired by the timer 44 at that moment.
The calculation unit 43 calculates the conveying speed of the workpiece W based on the stored first and second times and the actual distance corresponding to the number of pixels between the pixel rows A and B. Here, the correspondence between distances in pixels in the two-dimensional image and the actual travel distance of the workpiece W on the conveyor 2 is stored in the calculation unit 43 in advance by calibration.
Based on the calculated conveying speed of the workpiece W, the calculation unit 43 also calculates the first time period required for the workpiece W to travel the first distance D1 from the arrival position at which the workpiece W entered the field of view V, shown in FIG. 4, to a preset first position, shown in FIG. 8. Similarly, based on the calculated conveying speed, the calculation unit 43 calculates the second time period required for the workpiece W to travel the second distance D2 from the arrival position to a preset second position, shown in FIG. 10. The two-dimensional images acquired by the two-dimensional camera 3 when the first time period and the second time period have respectively elapsed from the time at which the workpiece W arrived in the field of view V are then stored in the storage unit 45 as the first image P1 and the second image P2.
Here, the first position is set near the upstream end of the field of view V, at a position where the trailing edge of the workpiece W is contained within the field of view V, and the second position is set near the downstream end of the field of view V, at a position where the leading edge of the workpiece W does not leave the field of view V.
The three-dimensional data generation unit 46 generates three-dimensional data of the workpiece W by the monocular stereo method, using the first image P1 and second image P2 stored in the storage unit 45 and the travel distance of the workpiece W between the first image P1 and the second image P2 (the difference between the second distance D2 and the first distance D1).
The operation of the three-dimensional data acquisition device 1 according to this embodiment, configured as described above, will now be described with reference to the flowchart shown in FIG. 12.
When the program stored in advance in the control unit 4 of the three-dimensional data acquisition device 1 according to this embodiment is executed, the conveyor 2 is operated (step S1), and a plurality of workpieces W on the belt 21 are each conveyed toward the field of view V from the upstream side at a constant speed.
In this state, the timer 44 starts counting, and the two-dimensional camera 3 starts photographing the field of view V at a constant frame rate in accordance with the control command from the camera control unit 41, sequentially acquiring two-dimensional images of the field of view V (step S2). The acquired two-dimensional images are sequentially sent to the image processing unit 42 and processed; in the image processing unit 42, for example, binarization is performed.
Next, the processed two-dimensional image information is sent to the calculation unit 43, which determines whether the pixel value of any pixel in the pixel row A of each two-dimensional image has changed (step S3). This determination is repeated until a change in the pixel value of some pixel in the pixel row A is detected.
Meanwhile, as shown in FIGS. 4 and 5, when a workpiece W conveyed by the belt 21 arrives in the field of view V of the two-dimensional camera 3, the calculation unit 43 detects a change in the pixel value of a pixel in the pixel row A within the two-dimensional image and determines that the workpiece W has arrived in the field of view V.
The time at which the workpiece W arrived in the field of view V is then acquired by the timer 44 and stored in the calculation unit 43 as the first time (step S4).
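The arrival test of steps S3 and S4 amounts to frame-differencing a single pixel row. The sketch below illustrates this under assumed details (the row index, image size, and change threshold are not taken from the disclosure; with binarized images the threshold can simply be any nonzero difference):

```python
import numpy as np

def row_changed(prev_frame, frame, row_index, threshold=30):
    """Return True if any pixel in the given row differs from the
    previous frame by more than `threshold`."""
    prev_row = prev_frame[row_index].astype(np.int16)
    cur_row = frame[row_index].astype(np.int16)
    return bool(np.any(np.abs(cur_row - prev_row) > threshold))

# Illustrative frames: a dark belt, then a bright workpiece edge
# entering the upstream pixel row A (here, row 0 of the image).
belt = np.zeros((480, 640), dtype=np.uint8)
with_work = belt.copy()
with_work[0, 100:200] = 255  # leading edge crosses pixel row A

print(row_changed(belt, belt, row_index=0))       # no workpiece yet
print(row_changed(belt, with_work, row_index=0))  # arrival detected
```

The same helper, applied to the pixel row B, yields the second timestamp of step S6.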
Thereafter, the calculation unit 43 performs the same determination for a change in the pixel value of any pixel in the pixel row B, which is separated from the pixel row A by a predetermined number of pixels in the conveying direction (step S5).
As a result, as shown in FIGS. 6 and 7, when a change in the pixel value of some pixel in the pixel row B is detected, the calculation unit 43 determines that the workpiece W has reached a predetermined position within the field of view V. The time at which the workpiece W reached that position is then acquired by the timer 44 and stored in the calculation unit 43 as the second time (step S6).
In the calculation unit 43, the conveying speed of the workpiece W is then calculated by dividing the actual distance corresponding to the number of pixels between the pixel rows A and B by the time difference obtained by subtracting the first time from the second time (step S7). Based on the calculated conveying speed of the workpiece W, the calculation unit 43 further calculates, as shown in FIGS. 8 to 11, the first time period and the second time period required for the workpiece W to travel the first distance D1 and the second distance D2 from the position at which it arrived in the field of view V (step S8).
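Steps S7 and S8 reduce to three divisions: the speed from the calibrated row spacing and the two timestamps, then the travel times for D1 and D2. A minimal sketch, where `row_spacing_mm` stands for the calibrated pixel-to-millimetre conversion and every number is a made-up example:

```python
def travel_times(t_first, t_second, row_spacing_mm, d1_mm, d2_mm):
    """Conveying speed from the two row-crossing timestamps (step S7),
    then the times to travel the first and second distances (step S8)."""
    speed = row_spacing_mm / (t_second - t_first)  # mm per second
    return speed, d1_mm / speed, d2_mm / speed

# Example: rows A and B are 20 mm apart and crossed 0.1 s apart,
# so the belt moves at 200 mm/s.
speed, t1, t2 = travel_times(t_first=0.0, t_second=0.1,
                             row_spacing_mm=20.0,
                             d1_mm=50.0, d2_mm=300.0)
print(speed, t1, t2)  # 200.0 mm/s, 0.25 s to D1, 1.5 s to D2
```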
Next, the calculation unit 43 determines whether the first time period has elapsed since the first time, at which the workpiece W arrived in the field of view V (step S9). When it detects that the first time period has elapsed since the first time, the two-dimensional image acquired by the two-dimensional camera 3 at that moment is stored in the storage unit 45 as the first image P1 (step S10).
Similarly to steps S9 and S10, the calculation unit 43 determines whether the second time period has elapsed since the first time (step S11). When it detects that the second time period has elapsed since the first time, the two-dimensional image acquired at that moment is stored in the storage unit 45 as the second image P2 (step S12).
That is, the storage unit 45 stores the first image P1 and the second image P2, obtained by photographing the workpiece W moving within the field of view V from two different angles.
Then, the three-dimensional data generation unit 46 generates three-dimensional data of the workpiece W by the monocular stereo method, using the first image P1 and second image P2 stored in the storage unit 45 and the travel distance of the workpiece W between the two images (the difference between the second distance D2 and the first distance D1) (step S13). The program then ends.
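In the monocular stereo (motion stereo) step, the known travel D2 − D1 plays the role of the stereo baseline: for a feature matched between P1 and P2, depth follows from the pinhole relation Z = f·B/d, where d is the disparity along the conveying direction. The disclosure does not give the reconstruction formulas; the sketch below is the textbook rectified-case relation, with an assumed focal length in pixels:

```python
def depth_from_motion_stereo(x1_px, x2_px, baseline_mm, focal_px):
    """Depth of a feature matched between two images of a workpiece
    that moved `baseline_mm` between exposures (rectified pinhole
    model). x1_px, x2_px: feature column in the first/second image."""
    disparity = abs(x2_px - x1_px)
    if disparity == 0:
        raise ValueError("zero disparity: feature at infinity or mismatch")
    return focal_px * baseline_mm / disparity

# A feature that shifts 250 px while the workpiece travels 250 mm,
# under an assumed 1000 px focal length, lies 1000 mm from the camera.
z = depth_from_motion_stereo(x1_px=100, x2_px=350,
                             baseline_mm=250.0, focal_px=1000.0)
print(z)
```

Features higher on the workpiece (closer to the camera) produce larger disparities and hence smaller Z, which is what yields the height map of the workpiece.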
As described above, according to this embodiment, whether the workpiece W has arrived in the field of view V can be detected using the single two-dimensional camera 3 that is required anyway for acquiring three-dimensional data by the monocular stereo method. Therefore, no special component such as a photoelectric sensor is needed to acquire the position of the conveyed workpiece W, and the three-dimensional data acquisition device 1 can have a simple configuration.
In addition, by calculating the conveying speed of the workpiece W from changes in the pixel values of two pixel rows in the two-dimensional image acquired by the two-dimensional camera 3, the position of the workpiece W within the field of view V can be detected without applying a mark or the like to the workpiece W, as has conventionally been done. This simplifies the process of acquiring the three-dimensional data of the workpiece W and reduces operating cost.
In this embodiment, the conveying speed of the workpiece W is calculated from changes in the pixel values of the pixel row A arranged along the upstream end of the two-dimensional image and of the pixel row B several pixels downstream of the pixel row A. Alternatively, as shown in FIG. 13, the conveying speed may be calculated from changes in the pixel values of a pixel row (first pixels) C and a pixel row (second pixels) D arranged orthogonally to the conveying direction of the workpiece W in the vicinity of, and upstream of, the optical axis X in the two-dimensional image.
In this case, as shown in FIG. 14, the angle α formed between the optical axis X and the straight line connecting the center of the two-dimensional camera 3 to the leading edge of the workpiece W, at the moment that edge coincides with the positions corresponding to the pixel rows C and D, can be made small. Therefore, even when the dimension of the workpiece W along the optical axis X, that is, its thickness, is large, the position of the leading edge of the workpiece W can be detected accurately from the two-dimensional image.
In particular, even when the distance between the workpiece W and the two-dimensional camera 3 is short, the detection accuracy of the leading-edge position of the workpiece W can be kept high.
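The benefit of near-axis pixel rows can be checked with elementary geometry: a pixel row a lateral distance x from the optical axis views the scene at α = arctan(x/H) for a camera at height H above the belt, so the raised leading edge of a workpiece of thickness t appears shifted by roughly t·tan α from its true position. The following back-of-the-envelope sketch uses illustrative numbers only (none are from the disclosure):

```python
import math

def leading_edge_error_mm(x_mm, camera_height_mm, thickness_mm):
    """Apparent shift of a raised leading edge as seen by a pixel row
    offset x_mm from the optical axis (camera at camera_height_mm)."""
    alpha = math.atan2(x_mm, camera_height_mm)
    return thickness_mm * math.tan(alpha)

# A 50 mm thick workpiece with the camera 500 mm above the belt:
far = leading_edge_error_mm(x_mm=200.0, camera_height_mm=500.0,
                            thickness_mm=50.0)   # row near image edge
near = leading_edge_error_mm(x_mm=10.0, camera_height_mm=500.0,
                             thickness_mm=50.0)  # row near the axis
print(far, near)  # the near-axis row shrinks the apparent shift
```

The error scales with x/H, which is why placing the rows C and D near the optical axis keeps detection accurate even for thick workpieces or a short camera-to-workpiece distance.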
Further, for example, the positional relationship may be adjusted in advance so that the trailing edge of the workpiece W is contained within the field of view V at the moment the leading edge of the workpiece W reaches the position corresponding to the pixel row C in the two-dimensional image, and the two-dimensional image acquired at that moment may be used as the first image P1.
This makes it possible to omit the calculation of the time at which the first image P1 is acquired. Since only the time at which the second image P2 is acquired then needs to be calculated from the conveying speed of the workpiece W, the processing in the three-dimensional data acquisition device 1 can be further simplified.
FIG. 13 illustrates the case where the pixel rows C and D are arranged upstream of the optical axis X; alternatively, both pixel rows C and D may be arranged downstream of the optical axis X, or the pixel rows C and D may be arranged on the upstream and downstream sides of the optical axis X, respectively.
In this case, in the region downstream of the optical axis X the thickness of the workpiece W has a greater effect on the detection accuracy of the position of the workpiece W than in the upstream region, but by arranging both pixel rows C and D in the vicinity of the optical axis X, that effect can be minimized.
That is, even when the conveying speed of the workpiece W is calculated from changes in the pixel values of pixels arranged in the region downstream of the optical axis X, the calculation accuracy can be kept high.
In this embodiment, the times at which the first image P1 and the second image P2 are acquired are calculated from the first and second time periods required for the workpiece W to travel the first distance D1 and the second distance D2. Alternatively, as shown in FIG. 10, the time at which the second image P2 is acquired may be calculated from a third time period required for the workpiece W to travel a third distance D3 from the first position to the second position.
In this case, for the first image P1, as in the method described above, the first time period required for the workpiece W to travel the first distance D1 is calculated from the conveying speed of the workpiece W, and the two-dimensional image acquired when the first time period has elapsed from the first time, at which the workpiece W arrived in the field of view V, is used as the first image P1.
For the second image P2, on the other hand, the third distance D3 is first calculated as the difference between the second distance D2, from the position at which the workpiece W arrived in the field of view V to the second position, and the first distance D1, from the arrival position to the first position. Then, based on the conveying speed of the workpiece W, the third time period required for the workpiece W to travel the third distance D3 is calculated.
The two-dimensional image acquired when the third time period has elapsed from the time at which the workpiece W was located at the first position can thereby be used as the second image P2.
In this way, since the acquisition time of the second image P2 is calculated with the acquisition time of the first image P1 as a reference, the interval between the acquisition times of the first image P1 and the second image P2 can be stabilized, and the three-dimensional data of the workpiece W can be generated more accurately.
In this embodiment the first position is set in advance; alternatively, the first position may be determined based on changes in the pixel values in the pixel row A of the two-dimensional image.
Specifically, for example, as shown in FIGS. 4 and 5, when the leading edge of the workpiece W reaches the upstream end of the field of view V, the pixel values of the pixel row A of the two-dimensional image change. Until the workpiece W has passed the upstream end of the field of view V, the pixel values of the pixel row A remain in the changed state, as shown in FIGS. 6 and 7. Thereafter, as shown in FIGS. 8 and 9, when the trailing edge of the workpiece W passes the upstream end of the field of view V, the pixel values of the pixel row A change again.
By detecting these two changes in the pixel values of the pixel row A with the calculation unit 43, it can thus be detected both that the workpiece W has arrived in the field of view V and that the entire workpiece W has come to lie within the field of view V.
As a result, even when the dimension of the workpiece W in the conveying direction, that is, its length, is unknown, the first image P1 can be acquired with the entire workpiece W reliably contained within the field of view V.
In this embodiment, the second position may likewise be determined, without being set in advance, based on changes in the pixel values of a pixel row arranged along the downstream end of the two-dimensional image.
In this case, for example, a pixel row equivalent to the pixel row A is set several pixels upstream of the downstream end of the two-dimensional image, and by detecting a change in the pixel values of that pixel row, it is detected that the leading edge of the workpiece W is positioned just before reaching the downstream end of the field of view V. The two-dimensional image acquired at that moment is then acquired as the second image P2.
Since the second image P2 can thereby be reliably acquired before the leading edge of the workpiece W leaves the field of view V, the three-dimensional data of the workpiece W can be generated more accurately, and the processing in the three-dimensional data acquisition device 1 can also be simplified.
Next, a three-dimensional data acquisition device 1' according to a second embodiment of the present disclosure will be described below with reference to the drawings.
In the three-dimensional data acquisition device 1' according to this embodiment, parts having the same configuration as in the three-dimensional data acquisition device 1 according to the first embodiment described above are given the same reference signs, and their description is omitted.
As shown in FIG. 15, the three-dimensional data acquisition device 1' according to this embodiment differs from the three-dimensional data acquisition device 1 according to the first embodiment in that the conveyor 2 includes an encoder 22 that detects the amount of rotation of the belt 21.
In the three-dimensional data acquisition device 1' according to this embodiment, as in the three-dimensional data acquisition device 1 according to the first embodiment, when the program stored in advance in the control unit 4 is executed, a workpiece W is conveyed from the upstream side of the field of view V of the two-dimensional camera 3.
Two-dimensional images of the field of view V are sequentially acquired by the two-dimensional camera 3, and each acquired two-dimensional image is processed by the image processing unit 42 and then sent to the calculation unit 43.
Next, in the calculation unit 43, the arrival of the workpiece W in the field of view V is detected based on a change in the pixel value of some pixel in the pixel row A within the two-dimensional image.
When the arrival of the workpiece W is detected, the calculation unit 43 receives the information on the amount of rotation of the belt 21 output from the encoder 22. Based on the received rotation-amount information, the calculation unit 43 detects that the workpiece W has traveled the first distance D1 from the position at which it arrived in the field of view V and is located at the first position. The two-dimensional image acquired at that moment is stored in the storage unit 45 as the first image P1.
Similarly, the calculation unit 43 receives information from the encoder 22, detects based on that information that the workpiece W is located at the second position, reached by traveling the second distance D2 from the arrival position, and stores the second image P2 acquired at that moment in the storage unit 45.
The three-dimensional data generation unit 46 then generates the three-dimensional data of the workpiece W by the monocular stereo method, using the first image P1 and second image P2 stored in the storage unit 45 and the travel distance of the workpiece W between the two images (the difference between the second distance D2 and the first distance D1).
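With the encoder, the capture trigger becomes a comparison of accumulated belt travel against D1 and D2 instead of elapsed time. A minimal polling sketch (the encoder interface, counts-per-millimetre factor, and distances are assumptions for illustration, not details of the disclosure):

```python
def encoder_triggers(counts_per_mm, d1_mm, d2_mm, count_stream):
    """Yield ('P1'/'P2', travel_mm) capture events as the encoder
    count (zeroed at the workpiece's arrival in the field of view)
    passes the first and second distances."""
    events = []
    fired = set()
    for count in count_stream:
        travel_mm = count / counts_per_mm
        if travel_mm >= d1_mm and "P1" not in fired:
            fired.add("P1")
            events.append(("P1", travel_mm))
        if travel_mm >= d2_mm and "P2" not in fired:
            fired.add("P2")
            events.append(("P2", travel_mm))
    return events

# 10 counts per mm; D1 = 50 mm, D2 = 300 mm.
stream = [0, 200, 499, 500, 1500, 3000, 3100]
events = encoder_triggers(10, 50.0, 300.0, stream)
print(events)  # P1 fires at 50 mm of travel, P2 at 300 mm
```

Because the trigger depends on measured travel rather than an extrapolated speed, it remains correct even if the belt speed fluctuates between the two exposures.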
According to this embodiment, the first distance D1, which is the travel distance of the workpiece W from the position at which it arrived in the field of view V to the first position, and the second distance D2, which is the travel distance to the second position, can be obtained directly from the encoder 22.
The first position and second position of the workpiece W within the field of view V can thereby be detected accurately. Therefore, the first image P1 and second image P2 used to generate the three-dimensional data of the workpiece W can be acquired more accurately, and the accuracy of the three-dimensional data of the workpiece W generated by the three-dimensional data generation unit 46 can be improved.
1, 1' Three-dimensional data acquisition device
2 Conveyor (movement mechanism)
3 Two-dimensional camera
4 Control unit
22 Encoder
42 Image processing unit
A, C Pixel row (first pixels)
B, D Pixel row (second pixels)
D1 First distance
D2 Second distance
D3 Third distance
P1 First image
P2 Second image
V Field of view
W Workpiece
X Optical axis

Claims (6)

  1.  A three-dimensional data acquisition device comprising:
     a single two-dimensional camera that photographs a workpiece;
     a movement mechanism that moves the workpiece and the two-dimensional camera relative to each other in one direction; and
     a control unit,
     wherein the control unit detects arrival of the workpiece within a field of view of the two-dimensional camera from a change in a pixel value of a pixel in the vicinity of an upstream end, in the direction of the relative movement, of the workpiece in a two-dimensional image acquired by the two-dimensional camera, and
     generates three-dimensional data of the workpiece based on a first image acquired by the two-dimensional camera at a first position to which the workpiece has moved a predetermined first distance from its arrival position, a second image acquired by the two-dimensional camera at a second position to which the workpiece has moved a predetermined second distance from the arrival position, and a difference distance between the second distance and the first distance.
  2.  The three-dimensional data acquisition device according to claim 1, wherein the movement mechanism includes an encoder that detects a relative movement distance between the workpiece and the two-dimensional camera, and
     the first distance and the second distance are acquired by the encoder.
  3.  The three-dimensional data acquisition device according to claim 1, wherein the control unit calculates, based on times at which pixel values of a first pixel and a second pixel separated by a predetermined distance in the direction of the relative movement within the two-dimensional image respectively change, a first time period and a second time period required for the workpiece to travel the first distance and the second distance, and
     causes the two-dimensional camera to acquire the first image when the first time period has elapsed from the time of arrival of the workpiece within the field of view, and causes the two-dimensional camera to acquire the second image when the second time period has elapsed from the time of arrival of the workpiece.
  4.  The three-dimensional data acquisition device according to claim 3, wherein the first pixel and the second pixel are arranged in the vicinity of an optical axis of the two-dimensional camera within the two-dimensional image.
  5.  The three-dimensional data acquisition device according to claim 4, wherein the first pixel and the second pixel are both provided upstream or downstream of the optical axis.
  6.  The three-dimensional data acquisition device according to claim 1, wherein the control unit calculates, based on times at which pixel values of a first pixel and a second pixel separated by a predetermined distance in the direction of the relative movement within the two-dimensional image respectively change, a third time period required for the workpiece to travel a third distance that is a difference between the second distance and the first distance, and
     causes the two-dimensional camera to acquire the first image when the pixel value of the first pixel changes, and causes the two-dimensional camera to acquire the second image when the third time period has elapsed from acquisition of the first image.

PCT/JP2021/031437 2021-08-27 2021-08-27 Three-dimensional data acquisition device WO2023026452A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2021/031437 WO2023026452A1 (en) 2021-08-27 2021-08-27 Three-dimensional data acquisition device
TW111130663A TW202308820A (en) 2021-08-27 2022-08-15 Three-dimensional data acquisition device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/031437 WO2023026452A1 (en) 2021-08-27 2021-08-27 Three-dimensional data acquisition device

Publications (1)

Publication Number Publication Date
WO2023026452A1 2023-03-02

Family

ID=85322542

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/031437 WO2023026452A1 (en) 2021-08-27 2021-08-27 Three-dimensional data acquisition device

Country Status (2)

Country Link
TW (1) TW202308820A (en)
WO (1) WO2023026452A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016114433A (en) * 2014-12-12 2016-06-23 株式会社安川電機 Image processing device, image processing method, and component supply device
JP2017036104A (en) * 2015-08-07 2017-02-16 シンフォニアテクノロジー株式会社 Part feeder
JP2017094406A (en) * 2015-11-18 2017-06-01 オムロン株式会社 Simulation device, simulation method, and simulation program
JP2017162133A (en) * 2016-03-09 2017-09-14 キヤノン株式会社 Imaging system, measurement system, production system, imaging method, program, recording media, and measurement method


Also Published As

Publication number Publication date
TW202308820A (en) 2023-03-01

Similar Documents

Publication Publication Date Title
RU2562413C2 (en) Method and system identifying and determining geometrical, spatial and position characteristics of products transported by continuously acting conveyor, in particular unmachined, coarse profiled, coarse machined or partially machined steel products
US20030164876A1 (en) Procedure and device for measuring positions of continuous sheets
US20210362518A1 (en) Lateral adjustment of print substrate based on a camera image
CN110081816B (en) Article carrying system
JP2008296330A (en) Robot simulation device
KR20130093041A (en) Cutting line making device for plate-shaped object, cutting line making method for plate-shaped object, producing device for glass plate and producing method for glass plate
CN107683401A (en) Shape measuring apparatus and process for measuring shape
WO2023026452A1 (en) Three-dimensional data acquisition device
WO2019239351A1 (en) Method and machine for the surface decoration of a base article of the ceramic processing industry
JP7336678B2 (en) picking device
JP2019018339A (en) Robot system
JP2009115715A (en) Apparatus for measuring length of tread rubber of tire
EP3217191B1 (en) Distance measuring apparatus and method for measuring a distance
JP2013184838A (en) Apparatus and method for processing cutting line of plate-like object, and apparatus and method for producing glass plate
JP2016138761A (en) Three-dimensional measurement method by optical cutting method and three-dimensional measuring instrument
JPH02194307A (en) Curvature shape measuring instrument for plate-like body
KR20210026034A (en) System for inspecting appearance of rolled plate and method of inspecting appearance of rolled plate using the same
KR20130126631A (en) Device for detecting conveyance amount of plate-shaped object, device for cutting plate-shaped object, method for detecting conveyance amount of plate-shaped object, device for forming cutting lines on plate-shaped object, and method for forming cutting lines on plate-shaped object
CA2962809C (en) System and method for color scanning a moving article
JP2006337270A (en) Measuring method for cross-sectional shape and device therefor
WO2018207086A1 (en) Method for aligning sheets of paper for the construction of paper and cardboard boxes
KR101830514B1 (en) Apparatus and method for shearing in rolling process
KR20030053075A (en) the weight measuring system using 3-D image process
KR102385189B1 (en) Method and system for printing with regular position of drug products
US20230175833A1 (en) Method and apparatus for measuring objects

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21955064

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE