WO2023026452A1 - Three-dimensional data acquisition device - Google Patents

Three-dimensional data acquisition device

Info

Publication number
WO2023026452A1
WO2023026452A1 (PCT/JP2021/031437)
Authority
WO
WIPO (PCT)
Prior art keywords
dimensional
pixel
distance
image
workpiece
Prior art date
Application number
PCT/JP2021/031437
Other languages
English (en)
Japanese (ja)
Inventor
澤源 孫
俊之 安藤
Original Assignee
FANUC CORPORATION
Priority date
Filing date
Publication date
Application filed by FANUC CORPORATION
Priority to PCT/JP2021/031437 priority Critical patent/WO2023026452A1/fr
Priority to TW111130663A priority patent/TW202308820A/zh
Publication of WO2023026452A1 publication Critical patent/WO2023026452A1/fr

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 — Image analysis
    • G06T 7/50 — Depth or shape recovery
    • G06T 7/55 — Depth or shape recovery from multiple images

Definitions

  • The present disclosure relates to a three-dimensional data acquisition device.
  • A method of performing three-dimensional measurement of a workpiece by combining a monocular camera and a horizontal movement device using the monocular stereo method is known (see, for example, Patent Document 1).
  • One aspect of the present disclosure is a three-dimensional data acquisition device including a single two-dimensional camera that photographs a workpiece, a movement mechanism that moves the workpiece and the two-dimensional camera relative to each other in one direction, and a control unit. The control unit detects the arrival of the workpiece within the field of view of the two-dimensional camera by a change in the pixel value of pixels near the upstream end, in the direction of relative movement, of the two-dimensional image acquired by the two-dimensional camera, and generates three-dimensional data of the workpiece based on a first image acquired by the two-dimensional camera at a first position to which the workpiece has moved a predetermined first distance from the arrival position, a second image acquired by the two-dimensional camera at a second position to which the workpiece has moved a predetermined second distance from the arrival position, and the difference distance between the second distance and the first distance.
  • FIG. 1 is an overall configuration diagram of a three-dimensional data acquisition device according to a first embodiment of the present disclosure.
  • FIG. 2 is a block diagram showing the configuration of the control unit in FIG. 1.
  • FIG. 3 is a schematic diagram showing a two-dimensional image acquired by the two-dimensional camera of FIG. 1.
  • FIG. 4 is a schematic diagram showing a state in which a workpiece has arrived in the field of view of the two-dimensional camera of FIG. 1.
  • FIG. 5 is a schematic diagram showing a two-dimensional image acquired in the state of FIG. 4.
  • FIG. 6 is a schematic diagram showing a state in which the workpiece has entered the field of view of the two-dimensional camera further than in FIG. 4.
  • FIG. 7 is a schematic diagram showing a two-dimensional image acquired in the state of FIG. 6.
  • FIG. 8 is a schematic diagram showing a state in which the workpiece is placed at a first position within the field of view of the two-dimensional camera of FIG. 1.
  • FIG. 9 is a schematic diagram showing a two-dimensional image acquired in the state of FIG. 8.
  • FIG. 10 is a schematic diagram showing a state in which the workpiece is placed at a second position within the field of view of the two-dimensional camera of FIG. 1.
  • FIG. 13 is a schematic diagram showing a two-dimensional image acquired by the two-dimensional camera of a modified example of the three-dimensional data acquisition device of FIG. 1.
  • FIG. 14 is a partial configuration diagram showing a state in which the workpiece is placed near the optical axis within the field of view of the two-dimensional camera of FIG. 13.
  • FIG. 15 is an overall configuration diagram of a three-dimensional data acquisition device according to a second embodiment of the present disclosure.
  • The three-dimensional data acquisition device 1 includes a conveyor (movement mechanism) 2 that conveys a workpiece W, a single two-dimensional camera 3 installed above the conveyor 2 facing downward, and a control unit 4.
  • The conveyor 2 is, for example, a belt conveyor, and includes a belt 21 for carrying the workpiece W and conveying it in one horizontal direction.
  • The belt 21 is driven at a constant speed by a motor (not shown).
  • The two-dimensional camera 3 is attached, for example, to a pedestal R arranged above the belt 21 of the conveyor 2.
  • The two-dimensional camera 3 has a downward field of view V, with its optical axis X arranged in the vertical direction.
  • The control unit 4 includes at least one memory that stores pre-taught programs and at least one processor that executes the programs. As shown in FIG. 2, the control unit 4 includes a camera control unit 41 that controls the two-dimensional camera 3, an image processing unit 42 that processes the two-dimensional images acquired by the two-dimensional camera 3, a calculation unit 43, a timer 44, a storage unit 45, and a three-dimensional data generation unit 46.
  • The camera control unit 41 transmits a control command to the two-dimensional camera 3, setting it to acquire two-dimensional images at a predetermined frame rate.
  • Each time the two-dimensional camera 3 acquires a two-dimensional image, it transmits it to the image processing unit 42.
  • The image processing unit 42 processes each two-dimensional image received from the two-dimensional camera 3 and extracts the pixel values of a pixel row (first pixels) A composed of a plurality of pixels arranged along the upstream end of the two-dimensional image, orthogonally to the transport direction. Similarly, the image processing unit 42 extracts the pixel values of a pixel row (second pixels) B consisting of a plurality of pixels arranged parallel to the pixel row A and separated from it by a predetermined number of pixels in the transport direction. The extracted pixel values are sent to the calculation unit 43.
  • The calculation unit 43 determines whether the pixel value of any pixel in the pixel row A has changed compared with the two-dimensional image acquired one frame earlier. When a pixel value has changed, it determines that the workpiece W has arrived within the field of view V and stores the first time obtained from the timer 44 at that moment. The calculation unit 43 likewise compares the pixel values of the pixels in the pixel row B with those of the previous frame; when a change occurs, it determines that the workpiece W has reached a predetermined position within the field of view V and stores the second time obtained from the timer 44 at that moment.
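The frame-to-frame comparison described above can be sketched in a few lines. This is a minimal illustration only, not the disclosed implementation: the frames are assumed to be already binarized images stored as lists of rows, and the function name, arguments, and sample data are hypothetical.

```python
def row_changed(prev_frame, cur_frame, col):
    """True if any pixel in the given column (a pixel row such as A or B,
    running perpendicular to the transport direction) changed between
    the previous frame and the current frame."""
    return any(prev[col] != cur[col] for prev, cur in zip(prev_frame, cur_frame))

# Hypothetical 4x6 binarized frames: 0 = belt, 1 = workpiece.
prev_frame = [[0] * 6 for _ in range(4)]
cur_frame = [row[:] for row in prev_frame]
cur_frame[1][0] = cur_frame[2][0] = 1  # leading edge reaches pixel row A (column 0)

print(row_changed(prev_frame, cur_frame, col=0))  # True -> workpiece has arrived
print(row_changed(prev_frame, cur_frame, col=3))  # False -> pixel row B not yet reached
```

Comparing against the immediately preceding frame, as the disclosure describes, makes the check robust to a static but non-uniform belt background.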
  • The calculation unit 43 calculates the transport speed of the workpiece W from the stored first and second times and the actual distance corresponding to the number of pixels between the pixel row A and the pixel row B.
  • The correspondence between distances in pixels in the two-dimensional image and actual movement distances of the workpiece W on the conveyor 2 is stored in the calculation unit 43 in advance by calibration.
  • From the transport speed, the calculation unit 43 calculates a first time period required for the workpiece W to travel a first distance D1 from the arrival position, at which the workpiece W arrived in the field of view V, to a preset first position. Similarly, the calculation unit 43 calculates a second time period required for the workpiece W to travel a second distance D2 from the arrival position to a preset second position. Then, the two-dimensional images acquired by the two-dimensional camera 3 when the first time period and the second time period have elapsed from the time at which the workpiece W arrived in the field of view V are stored in the storage unit 45 as the first image P1 and the second image P2.
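Numerically, the two detection times and the calibrated row spacing give the transport speed, and the speed gives the delays at which P1 and P2 should be grabbed. A sketch with made-up numbers (the distances, times, and function names are illustrative assumptions, not values from the disclosure):

```python
def transport_speed(row_spacing_mm, first_time_s, second_time_s):
    """Speed = calibrated distance between pixel rows A and B divided by
    the interval between the two detection times."""
    return row_spacing_mm / (second_time_s - first_time_s)

def acquisition_delays(speed_mm_s, d1_mm, d2_mm):
    """Delays after arrival at which to store the first and second images."""
    return d1_mm / speed_mm_s, d2_mm / speed_mm_s

v = transport_speed(10.0, 0.00, 0.05)        # rows 10 mm apart, detected 50 ms apart
t1, t2 = acquisition_delays(v, 40.0, 120.0)  # D1 = 40 mm, D2 = 120 mm
print(v, t1, t2)  # 200.0 0.2 0.6 -> grab P1 at 0.2 s, P2 at 0.6 s after arrival
```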
  • The first position is set near the upstream end of the field of view V, at a position where the trailing edge of the workpiece W fits within the field of view V, and the second position is set near the downstream end of the field of view V, at a position where the leading edge of the workpiece W does not leave the field of view V.
  • The three-dimensional data generation unit 46 generates three-dimensional data of the workpiece W by the monocular stereo method, using the first image P1 and the second image P2 stored in the storage unit 45 and the amount of movement of the workpiece W between the two images (the difference distance between the second distance D2 and the first distance D1).
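The disclosure does not spell out the monocular stereo computation, but the standard relation behind it treats the known movement (D2 - D1) as a stereo baseline, so depth follows from triangulation, Z = f·B/d. The sketch below uses invented numbers purely for illustration:

```python
def depth_from_disparity(baseline_mm, focal_length_px, disparity_px):
    """Monocular stereo: two images of the workpiece taken at the first and
    second positions behave like a stereo pair whose baseline is the
    difference distance D2 - D1. A feature's depth follows from its
    horizontal shift (disparity) between the two images: Z = f * B / d."""
    return focal_length_px * baseline_mm / disparity_px

# Illustrative values: 80 mm baseline (D2 - D1), 1000 px focal length.
# A point on top of the workpiece shifts 250 px, a point on the belt 200 px.
top = depth_from_disparity(80.0, 1000.0, 250.0)   # 320.0 mm from the camera
belt = depth_from_disparity(80.0, 1000.0, 200.0)  # 400.0 mm from the camera
print(belt - top)  # 80.0 -> the workpiece is 80 mm thick
```

Nearer points shift more between the two images, which is why the belt and the top of the workpiece yield different disparities.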
  • When a program pre-stored in the control unit 4 of the three-dimensional data acquisition device 1 according to the present embodiment is executed, the conveyor 2 is operated (step S1), and each of the plurality of workpieces W on the belt 21 is transported toward the field of view V from its upstream side at a constant speed.
  • The timer 44 starts timing, and the two-dimensional camera 3 starts photographing the field of view V at a constant frame rate in accordance with a control command from the camera control unit 41, acquiring two-dimensional images sequentially (step S2).
  • The acquired two-dimensional images are sequentially sent to the image processing unit 42 and processed.
  • In the image processing unit 42, for example, binarization processing is performed.
  • The processed two-dimensional image information is sent to the calculation unit 43, which determines whether the pixel value of any pixel in the pixel row A of each two-dimensional image has changed (step S3). This determination is repeated until a change in the pixel value of any pixel in the pixel row A is detected.
  • When the calculation unit 43 detects a change in the pixel value of any pixel in the pixel row A of a two-dimensional image, it determines that the workpiece W has arrived in the field of view V. The time at which the workpiece W arrived in the field of view V is then acquired from the timer 44 and stored in the calculation unit 43 as the first time (step S4).
  • The calculation unit 43 performs the same determination for changes in the pixel values of the pixels in the pixel row B, which is separated from the pixel row A by a predetermined number of pixels in the transport direction (step S5).
  • When a change in the pixel value of any pixel in the pixel row B is detected, the calculation unit 43 determines that the workpiece W has reached the predetermined position within the field of view V. The time at which the workpiece W reached that position is then acquired from the timer 44 and stored in the calculation unit 43 as the second time (step S6).
  • The transport speed of the workpiece W is calculated by dividing the actual distance corresponding to the number of pixels between the pixel row A and the pixel row B by the difference obtained by subtracting the first time from the second time (step S7). Based on the calculated transport speed of the workpiece W, the calculation unit 43 calculates the first time period and the second time period required for the workpiece W to move the first distance D1 and the second distance D2 from the position at which it arrived in the field of view V (step S8).
  • The calculation unit 43 then determines whether the first time period has elapsed since the first time, at which the workpiece W arrived in the field of view V (step S9). When it detects that the first time period has elapsed, the two-dimensional image acquired by the two-dimensional camera 3 at that moment is stored in the storage unit 45 as the first image P1 (step S10).
  • Next, the calculation unit 43 determines whether the second time period has elapsed from the first time (step S11). When it detects that the second time period has elapsed, the two-dimensional image acquired at that moment is stored in the storage unit 45 as the second image P2 (step S12). That is, the storage unit 45 stores a first image P1 and a second image P2 obtained by photographing the workpiece W moving within the field of view V from two different directions.
  • The three-dimensional data generation unit 46 generates three-dimensional data of the workpiece W by the monocular stereo method, using the first image P1 and the second image P2 stored in the storage unit 45 and the amount of movement of the workpiece W between the images P1 and P2 (the difference distance between the second distance D2 and the first distance D1) (step S13). The program then terminates.
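Steps S2 through S12 can be condensed into a single sketch. Everything here is a simplified stand-in for the disclosed procedure: frames are binarized images stored as lists of rows, a pixel row is represented by a single column index, and the helper names and synthetic data are assumptions.

```python
def pick_images(frames, fps, col_a, col_b, dist_ab_mm, d1_mm, d2_mm):
    """Steps S2-S12: detect arrival at pixel row A, detect pixel row B,
    derive the transport speed, then return the frame indices to store
    as the first image P1 and the second image P2."""
    changed = lambda f0, f1, col: any(r0[col] != r1[col] for r0, r1 in zip(f0, f1))
    t_a = t_b = None
    for i in range(1, len(frames)):
        t = i / fps
        if t_a is None and changed(frames[i - 1], frames[i], col_a):
            t_a = t            # steps S3-S4: workpiece arrives in the field of view
        elif t_a is not None and changed(frames[i - 1], frames[i], col_b):
            t_b = t            # steps S5-S6: workpiece reaches pixel row B
            break
    speed = dist_ab_mm / (t_b - t_a)             # step S7
    i_p1 = round((t_a + d1_mm / speed) * fps)    # steps S9-S10: index of P1
    i_p2 = round((t_a + d2_mm / speed) * fps)    # steps S11-S12: index of P2
    return speed, i_p1, i_p2

# Synthetic sequence: the workpiece advances one column per frame at 10 fps.
W, H = 5, 2
frame = lambda filled: [[1 if c < filled else 0 for c in range(W)] for _ in range(H)]
frames = [frame(n) for n in range(16)]
speed, i_p1, i_p2 = pick_images(frames, fps=10, col_a=0, col_b=2,
                                dist_ab_mm=20.0, d1_mm=50.0, d2_mm=110.0)
print(round(speed), i_p1, i_p2)  # 100 6 12
```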
  • In this way, the single two-dimensional camera 3, which is required in any case for acquiring three-dimensional data by the monocular stereo method, can also be used to detect whether the workpiece W has arrived in the field of view V. Since no special component such as a photoelectric sensor is needed to acquire position information of the workpiece W being conveyed, the three-dimensional data acquisition device 1 can be configured simply.
  • In the present embodiment, the transport speed of the workpiece W is calculated based on changes in the pixel values of the pixel row A arranged along the upstream end of the two-dimensional image and the pixel row B separated from the pixel row A by several pixels on the downstream side. Instead, the transport speed may be calculated based on changes in the pixel values of a pixel row (first pixels) C and a pixel row (second pixels) D arranged closer to the optical axis X.
  • In this case, the angle θ formed between the optical axis X and the straight line connecting the center of the two-dimensional camera 3 to the leading edge of the workpiece W, when the leading edge coincides with the positions corresponding to the pixel rows C and D, can be made smaller. Therefore, even when the dimension of the workpiece W in the direction of the optical axis X, that is, its thickness, is large, the position of the leading edge of the workpiece W can be accurately detected from the two-dimensional image. In particular, even when the distance between the workpiece W and the two-dimensional camera 3 is short, the detection accuracy of the leading-edge position of the workpiece W can be kept high.
  • In this case, the positional relationship is adjusted so that the trailing edge of the workpiece W falls within the field of view V, and the two-dimensional image acquired in that state may be used as the first image P1.
  • FIG. 13 illustrates the case where the pixel rows C and D are both arranged on the upstream side of the optical axis X, but instead both pixel rows C and D may be arranged on the downstream side of the optical axis X, or the pixel row C and the pixel row D may be arranged on the upstream side and the downstream side, respectively, with the optical axis X between them.
  • The thickness of the workpiece W has a greater effect on the detection accuracy of the position of the workpiece W in the region on the downstream side of the optical axis X than in the region on the upstream side.
  • In the above embodiment, the times at which the first image P1 and the second image P2 are acquired are calculated based on the first time period and the second time period required for the workpiece W to move the first distance D1 and the second distance D2. Instead, the time at which to acquire the second image P2 may be calculated based on a third time period required for the workpiece W to move a third distance D3 from the first position to the second position.
  • In this case, the first time period required for the workpiece W to move the first distance D1 is calculated from the transport speed of the workpiece W in the same manner as described above, and the two-dimensional image acquired when the first time period has elapsed from the first time, at which the workpiece W arrived in the field of view V, is used as the first image P1.
  • For the second image P2, the third distance D3 is first calculated as the difference between the second distance D2 from the arrival position of the workpiece W to the second position and the first distance D1 from the arrival position to the first position. Then, based on the transport speed of the workpiece W, the third time period required for the workpiece W to move the third distance D3 is calculated. The two-dimensional image acquired when the third time period has elapsed since the workpiece W was placed at the first position is used as the second image P2.
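The arithmetic of this variant is a one-liner; the sketch below uses invented distances and an invented speed purely for illustration.

```python
def third_time_period(d1_mm, d2_mm, speed_mm_s):
    """Time to cover the third distance D3 = D2 - D1, measured from the
    moment the workpiece sits at the first position."""
    return (d2_mm - d1_mm) / speed_mm_s

# D1 = 40 mm, D2 = 120 mm, transport speed 200 mm/s:
print(third_time_period(40.0, 120.0, 200.0))  # 0.4 -> grab P2 0.4 s after P1
```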
  • Since the acquisition time of the second image P2 is calculated based on the acquisition time of the first image P1, the interval between the acquisition times of the first image P1 and the second image P2 can be stabilized, and the three-dimensional data of the workpiece W can be generated with higher accuracy.
  • In the above embodiment, the first position is set in advance; instead, the first position may be determined based on changes in the pixel values in the pixel row A of the two-dimensional image.
  • In this case, by detecting two changes in the pixel values in the pixel row A, the calculation unit 43 can detect both that the workpiece W has arrived in the field of view V and that the entire workpiece W is contained within the field of view V.
  • As a result, the first image P1 can be acquired with the entire workpiece W reliably contained within the field of view V even when the dimension of the workpiece W in the transport direction, that is, its length, is unknown.
  • The second position may also be determined based on changes in the pixel values of a pixel row arranged along the downstream end of the two-dimensional image, without being set in advance.
  • In this case, a pixel row equivalent to the pixel row A is set a few pixels upstream of the downstream end of the two-dimensional image, and by detecting a change in the pixel values of that row, it is detected that the leading edge of the workpiece W is positioned just before the downstream end of the field of view V. The two-dimensional image acquired at that time is then used as the second image P2.
  • As a result, the second image P2 can be reliably acquired before the leading edge of the workpiece W moves out of the field of view V, so the three-dimensional data of the workpiece W can be generated more accurately, and the processing steps of the three-dimensional data acquisition device 1 can also be simplified.
  • A three-dimensional data acquisition device 1' according to a second embodiment of the present disclosure will be described below with reference to the drawings.
  • In the three-dimensional data acquisition device 1' according to this embodiment, components common to the three-dimensional data acquisition device 1 according to the first embodiment described above are given the same reference numerals, and their description is omitted.
  • The three-dimensional data acquisition device 1' according to the present embodiment differs from the three-dimensional data acquisition device 1 according to the first embodiment in that the conveyor 2 includes an encoder 22 for detecting the amount of rotation of the belt 21.
  • A workpiece W is conveyed from the upstream side of the field of view V of the two-dimensional camera 3.
  • Two-dimensional images within the field of view V are sequentially acquired by the two-dimensional camera 3 , and each acquired two-dimensional image is processed by the image processing unit 42 and then sent to the calculation unit 43 .
  • The arrival of the workpiece W in the field of view V is detected based on a change in the pixel value of any pixel in the pixel row A within the two-dimensional image.
  • The calculation unit 43 receives information on the amount of rotation of the belt 21 output from the encoder 22. Based on this information, the calculation unit 43 detects that the workpiece W has moved the first distance D1 from the position at which it arrived in the field of view V and is placed at the first position, and stores the two-dimensional image acquired at that moment in the storage unit 45 as the first image P1.
  • Similarly, the calculation unit 43 receives information from the encoder 22 and, based on that information, detects that the workpiece W is placed at the second position, at the second distance D2 from the arrival position, and stores the second image P2 acquired at that moment in the storage unit 45.
  • The three-dimensional data generation unit 46 generates three-dimensional data of the workpiece W by the monocular stereo method, using the first image P1 and the second image P2 stored in the storage unit 45 and the amount of movement of the workpiece W between the images P1 and P2 (the difference distance between the second distance D2 and the first distance D1).
  • According to this embodiment, the first distance D1, the amount of movement of the workpiece W from the position at which it arrived in the field of view V to the first position, and the second distance D2, the amount of movement to the second position, can be obtained directly from the output of the encoder 22.
  • As a result, the first position and the second position of the workpiece W within the field of view V can be accurately detected. Therefore, the first image P1 and the second image P2 used to generate the three-dimensional data of the workpiece W can be acquired more accurately, which has the advantage of improving the accuracy of the three-dimensional data generated by the three-dimensional data generation unit 46.
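The encoder variant replaces timing with counting. A hedged sketch (the encoder resolution, belt travel per revolution, and count values are invented for illustration and are not part of the disclosure):

```python
def belt_travel_mm(counts, counts_per_rev, mm_per_rev):
    """Convert a difference in encoder counts into belt travel, i.e. the
    amount the workpiece has moved since it arrived in the field of view."""
    return counts * mm_per_rev / counts_per_rev

arrival_count = 1000          # encoder reading when the workpiece arrived
d1_mm, d2_mm = 40.0, 120.0    # first and second distances

# Grab P1 once travel >= D1, and P2 once travel >= D2:
travel = belt_travel_mm(1800 - arrival_count, counts_per_rev=2000, mm_per_rev=100.0)
print(travel >= d1_mm, travel >= d2_mm)  # True False -> store P1 now, P2 later
```

Because the distance comes directly from the belt's measured rotation, this check stays correct even if the conveyor speed fluctuates, which is the advantage the second embodiment claims over the timer-based approach.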

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Stereoscopic And Panoramic Photography (AREA)

Abstract

A three-dimensional data acquisition device (1) comprising: a single two-dimensional camera (3) that captures an image of a workpiece (W); a movement mechanism (2) that moves the workpiece (W) and the two-dimensional camera (3) relative to each other in one direction; and a control unit (4). The control unit (4) detects the arrival of the workpiece (W) within a field of view (V) of the two-dimensional camera (3) by a change in the pixel value of a pixel near the upstream end, in the direction of relative movement, of the workpiece (W) in a two-dimensional image acquired by the two-dimensional camera (3). The control unit also generates three-dimensional data of the workpiece (W) on the basis of a first image acquired by the two-dimensional camera (3) at a first position to which the workpiece (W) has moved a predetermined first distance from the arrival position, a second image acquired by the two-dimensional camera (3) at a second position to which the workpiece (W) has moved a predetermined second distance from the arrival position, and a differential distance between the second distance and the first distance.
PCT/JP2021/031437 2021-08-27 2021-08-27 Three-dimensional data acquisition device WO2023026452A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2021/031437 WO2023026452A1 (fr) 2021-08-27 2021-08-27 Three-dimensional data acquisition device
TW111130663A TW202308820A (zh) 2022-08-15 Three-dimensional data acquisition device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/031437 WO2023026452A1 (fr) 2021-08-27 2021-08-27 Three-dimensional data acquisition device

Publications (1)

Publication Number Publication Date
WO2023026452A1 true WO2023026452A1 (fr) 2023-03-02

Family

ID=85322542

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/031437 WO2023026452A1 (fr) 2021-08-27 2021-08-27 Three-dimensional data acquisition device

Country Status (2)

Country Link
TW (1) TW202308820A (fr)
WO (1) WO2023026452A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016114433A * 2014-12-12 2016-06-23 株式会社安川電機 Image processing device, image processing method, and component supply device
JP2017036104A * 2015-08-07 2017-02-16 シンフォニアテクノロジー株式会社 Parts feeder
JP2017094406A * 2015-11-18 2017-06-01 オムロン株式会社 Simulation device, simulation method, and simulation program
JP2017162133A * 2016-03-09 2017-09-14 キヤノン株式会社 Imaging system, measurement system, production system, imaging method, program, recording medium, and measurement method


Also Published As

Publication number Publication date
TW202308820A (zh) 2023-03-01


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21955064

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE