JP5669195B2 - Surface shape measuring device and surface shape measuring method - Google Patents

Surface shape measuring device and surface shape measuring method

Info

Publication number
JP5669195B2
Authority
JP
Japan
Prior art keywords
coordinates
distance measurement
measurement data
image
movement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2011021620A
Other languages
Japanese (ja)
Other versions
JP2012163346A (en)
Inventor
紀功仁 川末
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Miyazaki
Original Assignee
University of Miyazaki
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Miyazaki filed Critical University of Miyazaki
Priority to JP2011021620A priority Critical patent/JP5669195B2/en
Publication of JP2012163346A publication Critical patent/JP2012163346A/en
Application granted granted Critical
Publication of JP5669195B2 publication Critical patent/JP5669195B2/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Description

The present invention relates to a surface shape measuring device and a surface shape measuring method, and more particularly to a surface shape measuring device and a surface shape measuring method for three-dimensionally measuring topography and the shape of the interior of a structure.

Measurement methods using laser scanners have begun to be put into practical use for three-dimensionally measuring topography and the interior shape of buildings. In these systems, a laser is projected onto the surroundings from a three-dimensional scanner fixed on a tripod or the like, and the three-dimensional shape of the area visible from the scanner position is measured. To measure a wider area, however, the instrument must be moved, and when it is moved, connecting the three-dimensional data acquired before and after the move is a cumbersome task.
To solve this, a method of detecting the position with high-accuracy GPS is conceivable, but between tall buildings or inside a building the satellite signals cannot be received and the accuracy drops significantly.
As prior art, Patent Document 1 by the same applicant discloses a pipe inner surface shape measuring device that, in order to measure the shape inside a small-diameter pipe, irradiates light in all directions and computes the travel distance of the imaging means from the diameter of the light ring captured while the imaging means is moved and from the amount of movement of the surface pattern, thereby computing the three-dimensional shape of the pipe inner surface and measuring its condition accurately and quickly.

JP 2010-223710 A (Patent Document 1)

However, the prior art disclosed in Patent Document 1 detects the movement of the camera from the movement of patterns imaged on the inner surface of the small-diameter pipe at pairs of positions 180° apart and from the value of the light ring diameter, and calculates the camera movement as the average over the multiple pairs; consequently, some error is included in the computed movement.
The present invention has been made in view of this problem, and its object is to provide a surface shape measuring device and a surface shape measuring method in which a laser scanner and a camera are mounted on a traveling cart, cross-sectional data of the measurement object is acquired by the laser scanner while the cart is moving, the movement amount and movement direction of the cart are simultaneously computed from the image data captured by the camera, and the two sets of data are superimposed, so that the three-dimensional shape of the measurement object can be obtained easily and in real time.

To solve this problem, claim 1 of the present invention is a surface shape measuring device that three-dimensionally measures the shape of a surface while moving along the surface of a measurement object, comprising: ranging data detection means for scanning the surface of the measurement object with a laser beam and outputting ranging data for each azimuth of the laser beam irradiated onto the surface; imaging means for imaging the surface of the measurement object including a part of the projection plane of the laser beam; moving means for moving the ranging data detection means and the imaging means along the surface of the measurement object; position coordinate detection means for computing z-axis position coordinates from the ranging data obtained from the ranging data detection means while the moving means moves the ranging data detection means and the imaging means, and for computing x-axis and y-axis position coordinates from the image data obtained from the imaging means; and control means for computing the three-dimensional shape of the measurement object based on the x-, y-, and z-axis position coordinates obtained by the position coordinate detection means, wherein the control means sets up a coordinate system whose origin is the laser projection unit of the ranging data detection means, with the z-axis taken in the vertical direction and the x-axis and y-axis taken in the horizontal direction with respect to the z-axis.
To obtain a three-dimensional shape, coordinates in the x-, y-, and z-axis directions are required. In the present invention, the z coordinate is obtained by scanning the laser beam over the surface of the measurement object, measuring the distance from the round-trip time of the laser light reflected by the surface, and processing the distance data for each azimuth. The x and y coordinates are obtained by imaging the surface of the measurement object with imaging means such as a camera while moving over it, and processing the image data together with the distance data z from the laser beam. Moving means for moving the ranging data detection means and the imaging means along the surface of the measurement object is therefore required. A feature of the present invention is that the movement amount and movement direction of the moving means are obtained by computing the change in the x and y coordinates. As a result, the three-dimensional shape of the measurement object can be measured with a simple configuration and at low cost.
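
Purely as an illustration of the per-scan z-profile computation described above, the Python sketch below converts one scan of per-azimuth distances into cross-section points. The polar-to-Cartesian convention and the helper names are assumptions for illustration, not taken from the claim.

```python
import numpy as np

def scan_to_profile(ranges_mm, angles_rad):
    """Convert one ranging scan (distance per azimuth) into a cross-section.

    Assumed conventions: the z-axis points vertically downward from the laser
    projection unit, the scan plane satisfies y = 0, and each azimuth angle is
    measured from the downward vertical within that plane.
    """
    r = np.asarray(ranges_mm, dtype=float)
    th = np.asarray(angles_rad, dtype=float)
    x0 = r * np.sin(th)                   # horizontal offset within the scan plane
    z0 = r * np.cos(th)                   # depth below the laser projection unit
    return np.stack([x0, np.zeros_like(x0), z0], axis=1)  # (N, 3) points with y = 0

# Hypothetical example: a 240-degree fan with 0.36-degree steps
angles = np.deg2rad(np.arange(-120.0, 120.0, 0.36))
profile = scan_to_profile(np.full(angles.shape, 1000.0), angles)
```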

Claim 2 is characterized in that the control means computes the movement amount and movement direction of the moving means based on the movement amount and movement direction of the surface pattern of the measurement object in the images captured by the imaging means.
For example, when the measurement object is a road, the textures (surface patterns) of the road surface all differ from place to place. In the present invention, the texture of a certain region is therefore read, and for each image frame it is determined where the texture captured in the previous frame has moved to in the next frame, from which the movement amount and movement direction of the moving means are computed. This makes a separate device for detecting the movement amount and direction of the moving means unnecessary, reducing the cost of the measuring device.

Claim 3 is characterized in that the control means sets up a coordinate system whose origin is the laser projection unit of the ranging data detection means with the z-axis taken downward, calculates coefficients from the relationship between the coordinates (u, v) on the image captured by the imaging means and the real coordinates (x, y, z), and obtains the real coordinates (x, y, z) using the coefficients, the coordinates (u, v) on the image, and the z coordinate from the ranging data detection means. The relationship between the coordinates (u, v) on the captured image and the real coordinates (x, y, z) is

Figure 0005669195

from which the coefficients h11 to h33 are calculated; the real coordinates (x, y) are then obtained from the coordinates (u, v) on the image and the z coordinate from the ranging data detection means as

Figure 0005669195

so that the real coordinates (x, y) are determined from the coordinates (u, v) on the captured image and the z coordinate of the ranging data obtained by the ranging data detection means.

Claim 4 is characterized in that the control means detects, on the image captured by the imaging means, several neighborhood regions of the area detected by the ranging data detection means, and determines the movement amount and movement direction of the moving means based on the coordinates of those neighborhood regions before and after the movement.
Since the ranging data detection means detects a fan-shaped area, the downward direction is taken as the z-axis and the scan plane is taken to satisfy y0 = 0. From the x0 and z0 data detected by the ranging data detection means and y0 = 0, several neighborhood regions of the area being detected by the ranging data detection means are located on the camera image.
The coordinates (u, v) of the center of each neighborhood region on the image are obtained by substituting the x0 and z0 data and y0 = 0 into the following equation.

Figure 0005669195

The imaging means is a video camera that captures consecutive images, and the movement (Δu, Δv) of each neighborhood region between consecutive images is detected. Substituting the post-movement coordinates (u+Δu, v+Δv) and z (the coordinate z0 from the ranging data detection means; the moving means is assumed not to move vertically, so z is constant at z0 before and after the movement) into the equation determines the post-movement coordinates (x, y). Comparing these with the pre-movement coordinates (x0, y0) yields the movement amount and movement direction of the moving means.

Claim 5 is a surface shape measuring method for a surface shape measuring device comprising ranging data detection means, imaging means, moving means, position coordinate detection means, and control means, the device three-dimensionally measuring the surface shape of a measurement object, the method comprising the steps of: the ranging data detection means scanning the surface of the measurement object with a laser beam and outputting ranging data for each azimuth of the laser beam irradiated onto the surface; the imaging means imaging the surface of the measurement object including a part of the projection plane of the laser beam; the moving means moving the ranging data detection means and the imaging means along the surface of the measurement object; the position coordinate detection means computing z-axis position coordinates from the ranging data obtained from the ranging data detection means while the moving means moves the ranging data detection means and the imaging means, and computing x-axis and y-axis position coordinates from the image data obtained from the imaging means; and the control means computing the three-dimensional shape of the measurement object based on the x-, y-, and z-axis position coordinates obtained by the position coordinate detection means, wherein the control means sets up a coordinate system whose origin is the laser projection unit of the ranging data detection means, with the z-axis taken in the vertical direction and the x-axis and y-axis taken in the horizontal direction with respect to the z-axis.
This provides the same effects as claim 1.
Claim 6 is characterized in that the method includes a step in which the control means computes the movement amount and movement direction of the moving means based on the movement amount and movement direction of the surface pattern of the measurement object in the images captured by the imaging means.
This provides the same effects as claim 2.

Claim 7 is characterized in that the method includes a step in which the control means sets up a coordinate system whose origin is the laser projection unit of the ranging data detection means with the z-axis taken downward, calculates coefficients from the relationship between the coordinates (u, v) on the image captured by the imaging means and the real coordinates (x, y, z), and obtains the real coordinates (x, y, z) using the coefficients, the coordinates (u, v) on the image, and the z coordinate from the ranging data detection means. The relationship between the coordinates (u, v) on the captured image and the real coordinates (x, y, z) is

Figure 0005669195

from which the coefficients h11 to h33 are calculated; the real coordinates (x, y) are then obtained from the coordinates (u, v) on the image and the z coordinate from the ranging data detection means as

Figure 0005669195

so that the real coordinates (x, y) are determined from the coordinates (u, v) on the captured image and the z coordinate of the ranging data obtained by the ranging data detection means.
Claim 8 is characterized in that the method includes a step in which the control means detects, on the image captured by the imaging means, several neighborhood regions of the area detected by the ranging data detection means, and determines the movement amount and movement direction of the moving means based on the coordinates of those neighborhood regions before and after the movement.
This provides the same effects as claim 4.

According to the present invention, the z coordinate is obtained by scanning the laser beam over the surface of the measurement object, measuring the distance from the round-trip time of the laser light reflected by the surface, and processing the distance data for each azimuth, while the x and y coordinates are obtained by imaging the surface of the measurement object with imaging means such as a camera while moving over it and processing the image data; therefore, the three-dimensional shape of the measurement object can be measured with a simple configuration and at low cost.

FIG. 1 is an external side view of a surface shape measuring device according to an embodiment of the present invention. FIG. 2 is a perspective view showing the appearance of the surface shape measuring device according to the embodiment. FIG. 3 is a block diagram explaining the functions of the surface shape measuring device according to the embodiment. FIG. 4 illustrates the conversion from camera coordinates to real coordinates. FIG. 5 shows the relationship between the detection line of the laser scanner and the neighborhood regions. FIG. 6 shows the results of processing images obtained by actually running the surface shape measuring device of the present invention on a road.

Hereinafter, the present invention is described in detail using the embodiment shown in the drawings. However, unless otherwise specified, the components, types, combinations, shapes, relative arrangements, and the like described in this embodiment are merely illustrative examples and are not intended to limit the scope of the invention to them alone.

FIG. 1 is an external side view of a surface shape measuring device according to an embodiment of the present invention. The surface shape measuring device 50 of the present invention measures the shape of a surface three-dimensionally while moving along the surface of a road (measurement object) 8, and comprises: a range sensor (ranging data detection means) 1 that scans the surface of the road 8 with a laser beam 9 and outputs ranging data for each azimuth of the laser beam 9 irradiated onto the road surface; a camera (imaging means) 3 that images the surface of the road 8 including a part of the projection plane of the laser beam 9; a traveling cart (moving means) 5 that moves the range sensor 1 and the camera 3 along the surface of the road 8 by rotating wheels 7; position coordinate detection means 10 (see FIG. 3) that computes z-axis position coordinates from the ranging data obtained from the range sensor 1 while the traveling cart 5 moves the range sensor 1 and the camera 3, and computes x-axis and y-axis position coordinates from the image data obtained from the camera 3; and a PC (control means) 4 that computes the three-dimensional shape of the road 8 based on the x-, y-, and z-axis position coordinates obtained by the position coordinate detection means 10. In this embodiment, the position coordinate detection means 10 is provided inside the PC 4 and is implemented in software. A battery (power supply) 6 that drives the range sensor 1, the camera 3, and the PC 4 is mounted on the traveling cart 5. The cart 5 may be moved by hand or driven by a drive source such as a motor. The range sensor 1 and the camera 3 are attached to an arm 2 extending from the body of the traveling cart 5.

To obtain a three-dimensional shape, coordinates in the x-, y-, and z-axis directions are required. In this embodiment, the z coordinate is obtained by scanning the laser beam 9 over the surface of the road 8, measuring the distance from the round-trip time of the laser light reflected by the road surface, and processing the distance data for each azimuth. The x and y coordinates are obtained by imaging the surface of the road 8 with the camera 3 while moving over it and processing the image data. The traveling cart 5, which moves the range sensor 1 and the camera 3 along the surface of the road 8, is therefore required. A feature of this embodiment is that the movement amount and movement direction of the traveling cart 5 are obtained by computing the change in the x- and y-axis coordinates. As a result, the three-dimensional shape of the road 8 can be measured with a simple configuration and at low cost.

FIG. 2 is a perspective view showing the appearance of the surface shape measuring device according to the embodiment of the present invention. The same components are given the same reference numerals as in FIG. 1. The laser beam 9 from the range sensor 1 scans the surface of the road 8 in a fan shape. Because a commercially available range sensor is used in this embodiment, the laser beam 9 does not reach region B. As an example, a sensor made by Hokuyo Electric (北陽電機) can be used, with model number URG-04LX, measurement range 60-4095 mm over 240°, measurement accuracy ±10 mm for 60-1000 mm and ±1% of distance for 1000-4095 mm, angular resolution (step angle) of about 0.36°, scan time of 100 ms/scan, and mass of about 160 g.

FIG. 3 is a block diagram explaining the functions of the surface shape measuring device according to the embodiment of the present invention. The same components are given the same reference numerals as in FIG. 1. The PC 4 comprises the position coordinate detection means 10 and a CPU 11 that computes the three-dimensional shape of the road 8 based on the x-, y-, and z-axis position coordinates obtained by the position coordinate detection means 10; the result is visualized on a monitor 12.
The PC 4 computes the movement amount and movement direction of the traveling cart 5 based on the movement amount and movement direction of the surface pattern of the road 8 in the images captured by the camera 3. For example, when the measurement object is a road, the textures (surface patterns) of the road surface all differ from place to place. In this embodiment, therefore, the texture of a certain region is read, and for each image frame it is measured where the texture captured in the previous frame has moved to in the next frame, from which the movement amount and movement direction of the traveling cart 5 are computed. This makes a device for detecting the movement amount and direction of the traveling cart 5 unnecessary, reducing the cost of the device.
The PC 4 also sets up a coordinate system whose origin is the laser projection unit of the range sensor 1 with the z-axis taken downward, calculates coefficients from the relationship between the coordinates (u, v) on the image captured by the camera 3 at this time and the real coordinates (x, y, z), and obtains the real coordinates (x, y, z) using the coefficients, the coordinates (u, v) on the image, and the z coordinate from the ranging data detection means (details are described later).

Next, the process by which the PC 4 of the present invention obtains the movement amount and movement direction of the traveling cart 5 from the coordinates is described in detail.
The range sensor 1 (hereinafter called a laser scanner for convenience of explanation) projects the laser in a fan shape vertically downward (when the road surface is horizontal) and measures the shape (cross-sectional shape) on the plane onto which the laser is projected. The camera 3 images the road (sidewalk) surface 8 including a part of the laser projection plane of the laser scanner 1. The movement amount and movement direction of the traveling cart 5 are detected from the movement amount and movement direction of the texture (surface pattern) of the road surface in the images captured by the camera 3.

A coordinate system (real coordinates) is set up with the laser projection unit of the laser scanner 1 as the origin and the z-axis taken vertically downward. The relationship between the coordinates (u, v) on the image captured by the camera 3 and the real coordinates (x, y, z) is then as follows.

Figure 0005669195

Here, h11 to h33 are coefficients. Expanding this,

Figure 0005669195

eliminating s, and rearranging gives

Figure 0005669195
... (1)

As shown in FIG. 4, a scale plate 13 marked with x and y scales is placed horizontally and photographed with the camera 3. In the captured image, a point on the scale plate 13 is designated with a mouse or the like, and its position (coordinates u, v on the image) is read into the computer. The real coordinates (x, y) of that point are read from the scale plate, and the z coordinate is read from the data of the laser scanner 1. The height (z position) of the plate is changed and the photographing with the camera 3 is repeated to collect multiple (six or more) combinations of (u, v) and (x, y, z). Substituting these combinations into equation (1) allows the coefficients h11 to h33 to be calculated.
Once h11 to h33 are determined, the real coordinates (x, y) can be calculated from the camera coordinates (u, v) and the z coordinate from the laser scanner 1 by the following equations.

Rearranging equation (1),

Figure 0005669195

Figure 0005669195

from which

Figure 0005669195
... (2)

follows.

Conversely, the equations for calculating u and v are

Figure 0005669195
... (3)

which makes conversion from x, y, z to u, v possible.

As shown in FIG. 5, the laser scanner detects a fan-shaped area 14, with the vertically downward direction taken as the z-axis and the scan plane taken at y = 0. From the x0 and z0 data detected by the laser scanner, equation (3) is used to locate, on the camera image, several neighborhood regions of the area 15 being detected by the laser scanner 1. The camera 3 is a video camera that captures consecutive images, and the movement (Δu, Δv) of each neighborhood region between consecutive images is detected. For example, let a neighborhood region be centered at (u, v) with size m × n. The correlation function is then

Figure 0005669195

and the Δu and Δv that minimize the correlation function φfg are sought.
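
A minimal block-matching sketch for finding (Δu, Δv): it minimizes a sum-of-squared-differences cost over a small search window, standing in for the correlation function φfg (window and search sizes are illustrative; the patent does not specify them):

```python
import numpy as np

def find_shift(prev_img, next_img, u, v, m=21, n=21, search=15):
    """Return the (du, dv) that best matches the m x n neighborhood centered
    at (u, v) in prev_img against next_img (grayscale arrays indexed
    [row, col], i.e. [v, u])."""
    hu, hv = m // 2, n // 2
    template = prev_img[v - hv:v + hv + 1, u - hu:u + hu + 1].astype(float)
    best_cost, best_shift = np.inf, (0, 0)
    for dv in range(-search, search + 1):
        for du in range(-search, search + 1):
            cand = next_img[v + dv - hv:v + dv + hv + 1,
                            u + du - hu:u + du + hu + 1].astype(float)
            if cand.shape != template.shape:
                continue                      # window left the image
            cost = np.sum((template - cand) ** 2)
            if cost < best_cost:
                best_cost, best_shift = cost, (du, dv)
    return best_shift
```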

Substituting the post-movement coordinates (u+Δu, v+Δv) and z (the coordinate z0 from the laser scanner 1; the traveling cart 5 is assumed not to move vertically, so z is constant at z0 before and after the movement) into equation (2) determines the post-movement coordinates (x, y). Comparing these with the pre-movement coordinates (x0, y0) gives the movement amount and movement direction of the cart. That is, the movement amount and movement direction of the traveling cart 5 are determined from the coordinates of the several neighborhood regions 15 described above before and after the movement. This is explained concretely below.
The expression for the translation and rotation (direction) from (x0, y0) to (x, y) is as follows.

Figure 0005669195
... (4)

Expanding this gives

Figure 0005669195
... (5)

Substituting the pre-movement coordinates (x0i, y0i) and post-movement coordinates (xi, yi) of the several (n) neighborhood regions into equation (5) yields the following simultaneous equations.

Figure 0005669195
... (6)

Multiplying both sides by the inverse (or pseudo-inverse) of A determines the coefficients h11 to h23. Once h11 to h23 are determined, equation (4), which expresses the translation and rotation, is determined; that is, the movement amount and movement direction (rotation) of the traveling cart 5 have been obtained.
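
A sketch of this least-squares step in Python (hypothetical names; a general 2 × 3 planar transform is fitted with the pseudo-inverse, without separately enforcing that the rotation part be orthonormal):

```python
import numpy as np

def estimate_motion(before_xy, after_xy):
    """Fit the planar transform taking (x0i, y0i) to (xi, yi), in the spirit
    of equations (4)-(6): x = h11*x0 + h12*y0 + h13, y = h21*x0 + h22*y0 + h23."""
    before_xy = np.asarray(before_xy, dtype=float)
    after_xy = np.asarray(after_xy, dtype=float)
    n = len(before_xy)
    A = np.zeros((2 * n, 6))
    b = np.zeros(2 * n)
    for i, ((x0, y0), (x, y)) in enumerate(zip(before_xy, after_xy)):
        A[2 * i] = [x0, y0, 1.0, 0.0, 0.0, 0.0]
        A[2 * i + 1] = [0.0, 0.0, 0.0, x0, y0, 1.0]
        b[2 * i], b[2 * i + 1] = x, y
    h = np.linalg.pinv(A) @ b          # least-squares solution via the pseudo-inverse
    return h.reshape(2, 3)             # [[h11, h12, h13], [h21, h22, h23]]
```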

The three-dimensional shape of the road 8 is calculated by superimposing the cross-sectional data from the laser scanner 1 while taking the movement amount and movement direction of the traveling cart 5 into account. Specifically, the three-dimensional shape is calculated by accumulating the cross-sectional data from the laser scanner 1 in memory while translating and rotating it using equation (4).
Although this embodiment deals with the case where the road is horizontal, measurement on a slope can also be made possible by attaching an inclination sensor to the traveling cart and correcting the ranging data obtained from the range sensor 1 using the signal of the inclination sensor.
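
A sketch of the superposition step, assuming each scan comes with the cart's cumulative planar transform relative to the first scan (estimated as above; function and variable names are illustrative):

```python
import numpy as np

def accumulate_profiles(profiles, motions):
    """Merge successive cross-sections into one 3D point cloud.

    `profiles`: list of (N, 3) arrays of (x, y=0, z) points in the scanner frame.
    `motions`:  list of 2 x 3 matrices giving the cumulative planar motion of
                the cart at the time of each scan (identity for the first scan).
    """
    cloud = []
    for pts, M in zip(profiles, motions):
        xy = pts[:, :2] @ M[:, :2].T + M[:, 2]          # rotate/translate in the x-y plane
        cloud.append(np.column_stack([xy, pts[:, 2]]))  # z is kept as measured
    return np.vstack(cloud)

# The first scan uses the identity "motion":
identity = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
```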

FIG. 6 shows the results of processing images obtained by actually running the surface shape measuring device of the present invention on a road. In this example, the images were obtained by running the device along the road shown in the photographs. Photograph (A) is a view looking straight along the road from its center, and photograph (B) is an oblique view toward the warehouse from the edge of the road. In the photographs, 16 denotes the road, 17 the side-path area, and 18 the warehouse. FIG. 6(a) is the processed image corresponding to photograph (A), FIG. 6(b) is the processed image corresponding to photograph (B), and FIG. 6(c) is the unrolled image viewed from directly above. As can be seen from these figures, the unevenness of the road 16 and the position of the warehouse 18 are drawn clearly. On the side-path side there is grass, so the laser is not reflected well and the image is somewhat unclear.

1 range sensor, 2 arm, 3 camera, 4 PC, 5 traveling cart, 6 battery, 7 wheels, 8 road (road surface), 9 laser beam, 10 position coordinate detection means, 11 CPU, 12 monitor, 13 scale plate, 14 fan-shaped area, 15 neighborhood region, 16 road, 17 side-path area, 18 warehouse, 50 surface shape measuring device

Claims (8)

A surface shape measuring device that three-dimensionally measures the shape of a surface while moving along the surface of a measurement object, comprising:
ranging data detection means for scanning the surface of the measurement object with a laser beam and outputting ranging data for each azimuth of the laser beam irradiated onto the surface of the measurement object;
imaging means for imaging the surface of the measurement object including a part of a projection plane of the laser beam;
moving means for moving the ranging data detection means and the imaging means along the surface of the measurement object;
position coordinate detection means for computing position coordinates in the z-axis direction from the ranging data obtained from the ranging data detection means when the moving means moves the ranging data detection means and the imaging means, and for computing position coordinates in the x-axis and y-axis directions from the image data obtained from the imaging means; and
control means for computing the three-dimensional shape of the measurement object based on the position coordinates in the x-axis, y-axis, and z-axis directions obtained by the position coordinate detection means,
wherein the control means sets up a coordinate system whose origin is a laser projection unit of the ranging data detection means, with the z-axis taken in the vertical direction and the x-axis and the y-axis taken in the horizontal direction with respect to the z-axis.

The surface shape measuring device according to claim 1, wherein the control means computes the movement amount and movement direction of the moving means based on the movement amount and movement direction of the surface pattern of the measurement object in the image captured by the imaging means.

The surface shape measuring device according to claim 1 or 2, wherein the control means sets up a coordinate system whose origin is the laser projection unit of the ranging data detection means with the z-axis taken downward, calculates coefficients from the relationship between the coordinates (u, v) on the image captured by the imaging means and the real coordinates (x, y, z), and obtains the real coordinates (x, y, z) using the coefficients, the coordinates (u, v) on the image, and the z coordinate from the ranging data detection means,
wherein the relationship between the coordinates (u, v) on the image captured by the imaging means and the real coordinates (x, y, z) is

Figure 0005669195

from which the coefficients h11 to h33 are calculated, and the real coordinates (x, y) are calculated from the coordinates (u, v) on the image and the z coordinate from the ranging data detection means as

Figure 0005669195

so that the real coordinates (x, y) are obtained from the coordinates (u, v) on the captured image and the z coordinate of the ranging data obtained by the ranging data detection means.

The surface shape measuring device according to claim 1 or 2, wherein the control means detects, on the image captured by the imaging means, a plurality of neighborhood regions of the area detected by the ranging data detection means, and determines the movement amount and movement direction of the moving means based on the coordinates of the neighborhood regions before and after the movement.

A surface shape measuring method for a surface shape measuring device comprising ranging data detection means, imaging means, moving means, position coordinate detection means, and control means, the device three-dimensionally measuring the shape of a surface while moving along the surface of a measurement object, the method comprising the steps of:
the ranging data detection means scanning the surface of the measurement object with a laser beam and outputting ranging data for each azimuth of the laser beam irradiated onto the surface of the measurement object;
the imaging means imaging the surface of the measurement object including a part of a projection plane of the laser beam;
the moving means moving the ranging data detection means and the imaging means along the surface of the measurement object;
the position coordinate detection means computing position coordinates in the z-axis direction from the ranging data obtained from the ranging data detection means when the moving means moves the ranging data detection means and the imaging means, and computing position coordinates in the x-axis and y-axis directions from the image data obtained from the imaging means; and
the control means computing the three-dimensional shape of the measurement object based on the position coordinates in the x-axis, y-axis, and z-axis directions obtained by the position coordinate detection means,
wherein the control means sets up a coordinate system whose origin is the laser projection unit of the ranging data detection means, with the z-axis taken in the vertical direction and the x-axis and the y-axis taken in the horizontal direction with respect to the z-axis.

The surface shape measuring method according to claim 5, comprising a step in which the control means computes the movement amount and movement direction of the moving means based on the movement amount and movement direction of the surface pattern of the measurement object in the image captured by the imaging means.

The surface shape measuring method according to claim 5 or 6, comprising a step in which the control means sets up a coordinate system whose origin is the laser projection unit of the ranging data detection means with the z-axis taken downward, calculates coefficients from the relationship between the coordinates (u, v) on the image captured by the imaging means and the real coordinates (x, y, z), and obtains the real coordinates (x, y, z) using the coefficients, the coordinates (u, v) on the image, and the z coordinate from the ranging data detection means,
wherein the relationship between the coordinates (u, v) on the image captured by the imaging means and the real coordinates (x, y, z) is

Figure 0005669195

from which the coefficients h11 to h33 are calculated, and the real coordinates (x, y) are calculated from the coordinates (u, v) on the image and the z coordinate from the ranging data detection means as

Figure 0005669195

so that the real coordinates (x, y) are obtained from the coordinates (u, v) on the captured image and the z coordinate of the ranging data obtained by the ranging data detection means.

The surface shape measuring method according to claim 5 or 6, comprising a step in which the control means detects, on the image captured by the imaging means, a plurality of neighborhood regions of the area detected by the ranging data detection means, and determines the movement amount and movement direction of the moving means based on the coordinates of the neighborhood regions before and after the movement.
JP2011021620A 2011-02-03 2011-02-03 Surface shape measuring device and surface shape measuring method Active JP5669195B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2011021620A JP5669195B2 (en) 2011-02-03 2011-02-03 Surface shape measuring device and surface shape measuring method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2011021620A JP5669195B2 (en) 2011-02-03 2011-02-03 Surface shape measuring device and surface shape measuring method

Publications (2)

Publication Number Publication Date
JP2012163346A JP2012163346A (en) 2012-08-30
JP5669195B2 true JP5669195B2 (en) 2015-02-12

Family

ID=46842882

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2011021620A Active JP5669195B2 (en) 2011-02-03 2011-02-03 Surface shape measuring device and surface shape measuring method

Country Status (1)

Country Link
JP (1) JP5669195B2 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101456565B1 (en) 2014-05-28 2014-10-31 하상우 Investigation and analysis system for pavement of a road, and method thereof
JP2016217801A (en) * 2015-05-18 2016-12-22 エヌ・ティ・ティ・アドバンステクノロジ株式会社 Omnidirectional mobile robot and deformation detection system by three-dimensional camera system
JP6602625B2 (en) * 2015-09-28 2019-11-06 倉敷紡績株式会社 Structure inspection system
CN108362223B (en) * 2017-11-24 2020-10-27 广东康云多维视觉智能科技有限公司 Portable 3D scanner, scanning system and scanning method
CN209224071U (en) * 2018-11-19 2019-08-09 炬星科技(深圳)有限公司 The sensor placement system of robot
KR102052203B1 (en) * 2019-06-20 2019-12-04 (주)케이에스알큰사람 Safety diagnosis system of road pavement using line camera

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6395307A (en) * 1986-10-10 1988-04-26 Tokyo Keiki Co Ltd Apparatus for measuring unevenness of road surface
JP4000417B2 (en) * 2002-07-22 2007-10-31 四国化工機株式会社 Quality evaluation method of weld bead and shape measurement method therefor
JP2005070840A (en) * 2003-08-25 2005-03-17 East Japan Railway Co Three dimensional model preparing device, three dimensional model preparing method and three dimensional model preparing program
JP3837431B2 (en) * 2004-07-26 2006-10-25 国立大学法人 宮崎大学 Pipe inner surface shape measuring device
JP5343219B2 (en) * 2008-03-27 2013-11-13 福岡県 Strain measurement method, strain measurement system
JP5278878B2 (en) * 2009-03-23 2013-09-04 国立大学法人 宮崎大学 Pipe inner surface shape measuring device

Also Published As

Publication number Publication date
JP2012163346A (en) 2012-08-30

Similar Documents

Publication Publication Date Title
JP5669195B2 (en) Surface shape measuring device and surface shape measuring method
CN103491897B (en) Motion blur compensation
JP4871352B2 (en) Automatic reference system and apparatus for 3D scanning
EP2438397B1 (en) Method and device for three-dimensional surface detection with a dynamic reference frame
JP4619962B2 (en) Road marking measurement system, white line model measurement system, and white line model measurement device
JP2016516196A (en) Structured optical scanner correction tracked in 6 degrees of freedom
JP5418176B2 (en) Pantograph height measuring device and calibration method thereof
Li et al. Large depth-of-view portable three-dimensional laser scanner and its segmental calibration for robot vision
US20200124406A1 (en) Method for referencing a plurality of sensors and associated measuring device
JP2509357B2 (en) Work position detector
JP5648831B2 (en) Inner surface shape measuring apparatus and inner surface shape measuring method
KR101090082B1 (en) System and method for automatic measuring of the stair dimensions using a single camera and a laser
TWI521471B (en) 3 - dimensional distance measuring device and method thereof
JP5481862B2 (en) Pantograph height measuring device and calibration method thereof
JP2014145735A (en) Shape measurement device, structure production system, evaluation device, shape measurement method, structure production method, and shape measurement program
JP4077755B2 (en) POSITION DETECTION METHOD, DEVICE THEREOF, PROGRAM THEREOF, AND CALIBRATION INFORMATION GENERATION METHOD
JP2008009916A (en) Measuring device and measuring method
CN114663486A (en) Building height measurement method and system based on binocular vision
JP2012013592A (en) Calibration method for three-dimensional shape measuring machine, and three-dimensional shape measuring machine
JP2009042147A (en) Apparatus and method for recognizing object
Isa et al. The Effect of Motion Blur on Photogrammetric Measurements of a Robotic Moving Target
Senjalia et al. Measurement of wheel alignment using Camera Calibration and Laser Triangulation
JP4876676B2 (en) POSITION MEASURING DEVICE, METHOD, AND PROGRAM, AND MOVEMENT DETECTION DETECTING DEVICE, METHOD, AND PROGRAM
KR101436097B1 (en) Non-Contacting Method for Measuring 6-DOF Motion Based on Laser Sensor
JPH0843044A (en) Measuring apparatus for three dimensional coordinate

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20140131

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20140206

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20140313

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20140903

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20140916

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20141110

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20141202

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20141211

R150 Certificate of patent or registration of utility model

Ref document number: 5669195

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250