JP2010014699A - Shape measuring apparatus and shape measuring method - Google Patents

Shape measuring apparatus and shape measuring method

Info

Publication number
JP2010014699A
Authority
JP
Japan
Prior art keywords
imaging
imaging means
image
measured
shape
Prior art date
Legal status
Pending
Application number
JP2009029605A
Other languages
Japanese (ja)
Inventor
Toru Mihashi
Hiroki Unten
Current Assignee
Toppan Inc
Original Assignee
Toppan Printing Co Ltd
Priority date
Filing date
Publication date
Application filed by Toppan Printing Co Ltd filed Critical Toppan Printing Co Ltd
Priority to JP2009029605A priority Critical patent/JP2010014699A/en
Publication of JP2010014699A publication Critical patent/JP2010014699A/en
Pending legal-status Critical Current

Abstract

PROBLEM TO BE SOLVED: To provide a shape measuring apparatus and a shape measuring method capable of measuring shape even when part or all of an object to be measured moves out of the imaging regions of a plurality of imaging means, such as cameras, while the object is being imaged with those imaging means.

SOLUTION: The shape measuring apparatus captures images while moving a plurality of imaging means that are in a predetermined positional relationship and have different viewpoints, and measures the distance to the object to be measured using the images. The apparatus includes an auxiliary imaging means that images the imaging regions of the plurality of imaging means beyond the point at which their fields of view begin to overlap, together with a region including the periphery of those imaging regions, and that moves integrally with the plurality of imaging means. When images are captured while the plurality of imaging means and the auxiliary imaging means are moved and the entire shape of the object is measured, the positions of the object captured by the plurality of imaging means are tracked using the images obtained from the auxiliary imaging means.

COPYRIGHT: (C)2010, JPO&INPIT

Description

The present invention relates to a shape measuring apparatus based on the principle of photogrammetry, and more particularly to a shape measuring apparatus that moves a plurality of imaging means, measures the distance to an object to be measured at each position of the imaging means, and integrates the measurement results to measure the shape of the object.

When an object is to be reproduced with computer graphics (CG) or the like, its shape must be created as three-dimensional data.

If the object is imaginary, a designer or operator creates the shape data using various CG creation tools.

When a shape sample, a prototype, or a real object is to be converted into data, the shape data can either be created entirely by hand while looking at the object itself or at photographs and video, or be created from data obtained with some kind of shape measuring device.

However, creating all of the shape data by hand in the former way imposes a very heavy workload. For the latter approach, using data from a shape measuring device, several devices are commercially available depending on the object and the required accuracy. One of the simplest is based on the principle of photogrammetry: the distance to each point common to a plurality of images taken from different viewpoints is determined. Examples include stereo cameras, multi-camera systems, and software that reads in and processes images taken from different viewpoints.

When a stereo camera is used, camera parameters such as the distance between the two cameras, the focal length of each camera, and the direction of each optical axis are determined in advance, and the distance from the cameras to each point of the object to be measured appearing in the images is obtained from the two cameras' images by the principle of triangulation. When shape measurement is performed with such a stereo camera, the distance to each point of the object within the range that can be imaged at one time is obtained and that part of the shape can be measured; however, to obtain the shape of the entire object it is necessary to image the object from all directions and to integrate the respective data into the same three-dimensional coordinate system.
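By way of illustration only (this is a minimal sketch and not part of the original disclosure; the focal length, baseline, and pixel coordinates below are assumed example values), the triangulation described above reduces, for a rectified stereo pair, to recovering depth from disparity:

```python
import numpy as np

def depth_from_disparity(f_px, baseline_m, x_left, x_right):
    """Depth Z = f * B / d for a rectified (parallel) stereo pair.

    f_px        : focal length in pixels (from calibration)
    baseline_m  : distance between the two camera centres in metres
    x_left/right: horizontal pixel coordinate of the same scene point
                  in the left and right images
    """
    d = x_left - x_right          # disparity in pixels
    if d <= 0:
        raise ValueError("point must have positive disparity")
    return f_px * baseline_m / d  # depth along the optical axis

# Assumed example values: 1200 px focal length, 12 cm baseline,
# the same feature found at x = 640 (left) and x = 610 (right).
Z = depth_from_disparity(1200.0, 0.12, 640.0, 610.0)
print(f"distance to the point: {Z:.2f} m")   # -> 4.80 m
```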

The stereo-camera moving measurement system of Non-Patent Document 1 has been realized as a system that automates this. It images the measurement target while moving a two-lens stereo camera and calculates the three-dimensional coordinates of the desired points from the shape information obtained in each frame (the image captured each time the stereo camera is moved).

In this way, shape data of the entire object can be obtained easily; however, if part or all of the object moves out of the imaging region during imaging, errors arise in the shape information and coordinates of the object obtained from the frames before and after that moment.

One way to prevent this is to increase the amount of information around each feature point of the object, described later; that is, to extract feature points by performing edge detection in all directions and multi-resolution analysis over a wider range of the image. However, increasing the information raises the computational load, so real-time processing becomes impossible. Obtaining the entire shape while moving the camera requires real-time processing at about 15 fps (frames per second). For example, the KLT method (Non-Patent Document 2), which is widely used in moving-image analysis, easily runs in real time at about 15 fps, whereas the SIFT method (Non-Patent Document 3), whose feature descriptors have higher dimensionality, is difficult to run at 15 fps.
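For reference, the KLT tracking mentioned above is available as the pyramidal Lucas-Kanade implementation in OpenCV; the sketch below is illustrative only (the frame file names and all parameter values are assumptions, not values taken from this publication):

```python
import cv2

# Two consecutive frames from the moving camera (assumed file names).
prev = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

# Detect corners to track in the first frame (Shi-Tomasi).
pts_prev = cv2.goodFeaturesToTrack(prev, maxCorners=200,
                                   qualityLevel=0.01, minDistance=10)

# Pyramidal Lucas-Kanade (KLT) optical flow into the next frame.
pts_curr, status, err = cv2.calcOpticalFlowPyrLK(
    prev, curr, pts_prev, None,
    winSize=(21, 21), maxLevel=3,
    criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))

# Keep only the points that were tracked successfully.
good_prev = pts_prev[status.ravel() == 1]
good_curr = pts_curr[status.ravel() == 1]
print(f"tracked {len(good_curr)} of {len(pts_prev)} feature points")
```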

Thus it has been difficult, when part or all of the object moves out of the imaging region during imaging, to measure the shape without increasing the computational load and without introducing errors into the shape information and coordinates of the object obtained from the preceding and following frames.

To address this problem, a wider-angle lens could be used on the stereo camera so that the object is less likely to leave the imaging region, but this also reduces the accuracy of the shape measurement.

Non-Patent Document 1: Hiroki Unten, Tomohito Masuda, Toru Mihashi, Makoto Ando, "Examination of an automatic VR model generation method using moving stereo-camera imaging," Transactions of the Virtual Reality Society of Japan, 12, 1, pp. 127-135 (2007)
Non-Patent Document 2: Bruce D. Lucas and Takeo Kanade, "An Iterative Image Registration Technique with an Application to Stereo Vision," International Joint Conference on Artificial Intelligence, pages 674-679, 1981
Non-Patent Document 3: David G. Lowe, "Distinctive image features from scale-invariant keypoints," International Journal of Computer Vision, 60, 2, pp. 91-110, 2004

An object of the present invention is to provide a shape measuring apparatus and a shape measuring method capable of measuring shape even when part or all of the object to be measured moves out of the imaging regions of a plurality of imaging means, such as cameras, while the object is imaged with those imaging means.

The invention according to claim 1 of the present invention is an apparatus that captures images while moving a plurality of imaging means having different viewpoints in a predetermined positional relationship and measures the shape of an object to be measured by measuring the distance to it using the images, the apparatus comprising an auxiliary imaging means that images the imaging regions of the plurality of imaging means beyond the point at which their fields of view begin to overlap, together with a region including the periphery of those imaging regions, and that moves integrally with the plurality of imaging means, wherein, when images are captured while the plurality of imaging means and the auxiliary imaging means are moved and the entire shape of the object is measured, the positions of the object captured by the plurality of imaging means are tracked using the images obtained from the auxiliary imaging means.

The invention according to claim 2 of the present invention is the shape measuring apparatus according to claim 1, in which, in an apparatus that captures images while moving a plurality of imaging means having different viewpoints in a predetermined positional relationship and measures the shape of an object to be measured by measuring the distance to it using the images, an auxiliary imaging means is provided that images the imaging regions of the plurality of imaging means beyond the point at which their fields of view begin to overlap, together with a region including the periphery of those imaging regions, and that moves integrally with the plurality of imaging means, and in which, when images are captured while the plurality of imaging means and the auxiliary imaging means are moved and the entire shape of the object is measured, changes in the relative position and orientation of the plurality of imaging means with respect to the position of the object are determined using the images obtained from the auxiliary imaging means.

The invention according to claim 3 of the present invention is a method of capturing images while moving a plurality of imaging means having different viewpoints in a predetermined positional relationship and measuring the shape of an object to be measured by measuring the distance to it using the images, in which an auxiliary imaging means, which images the imaging regions of the plurality of imaging means beyond the point at which their fields of view begin to overlap together with a region including the periphery of those imaging regions and which moves integrally with the plurality of imaging means, is used to capture images while the plurality of imaging means and the auxiliary imaging means are moved, and the positions of the object captured by the plurality of imaging means are tracked using the images obtained from the auxiliary imaging means.

The invention according to claim 4 of the present invention is a method of capturing images while moving a plurality of imaging means having different viewpoints in a predetermined positional relationship and measuring the shape of an object to be measured by measuring the distance to it using the images, in which an auxiliary imaging means, which images the imaging regions of the plurality of imaging means beyond the point at which their fields of view begin to overlap together with a region including the periphery of those imaging regions and which moves integrally with the plurality of imaging means, is used to capture images while the plurality of imaging means and the auxiliary imaging means are moved, and changes in the relative position and orientation of the plurality of imaging means with respect to the position of the object are determined using the images obtained from the auxiliary imaging means.

According to the shape measuring apparatus and shape measuring method of the present invention, feature points can be tracked even when the object to be measured moves out of the imaging regions of the imaging means used for shape measurement, so that the shape of the entire object can be measured quickly.

FIG. 1 is a schematic diagram showing an embodiment of the shape measuring apparatus of the present invention.
FIG. 2 is a schematic diagram showing the relationship between the imaging regions of the respective imaging means.
FIG. 3 is a diagram showing the relationship between the object imaged in each imaging region and the feature points extracted on the object.
FIG. 4 is a schematic diagram showing the relationship between the object imaged in each imaging region and the feature points extracted on the object.
FIG. 5 is a flowchart of shape measurement.
FIG. 6 is a flowchart of another shape measurement method.

The best mode of the shape measuring apparatus and measuring method of the present invention will now be described with reference to the drawings.

FIG. 1 is a schematic diagram showing an embodiment of the shape measuring apparatus of the present invention, and FIG. 2 is a schematic diagram showing the relationship between the imaging regions of the respective imaging means in the shape measuring apparatus according to the present invention.

FIG. 1(a) shows an example in which the auxiliary imaging means 13, a further imaging means according to the present invention, is added to a pair of imaging means 11 and 12 used for shape measurement; the upper drawing is a side view and the lower drawing is a plan view, and the auxiliary imaging means 13 is installed between the pair of imaging means 11 and 12. FIG. 1(b) shows an example in which the auxiliary imaging means 23, a further imaging means according to the present invention, is added to the frame 24 of an existing stereo camera containing a pair of imaging means 21 and 22; the upper drawing is a side view and the lower drawing is a plan view, and the auxiliary imaging means 23 is installed on the outside of the stereo camera frame 24 in which the pair of imaging means 21 and 22 are incorporated.

FIG. 2 shows the field of view A1 of the imaging means (11 or 21), the field of view B1 of the imaging means (12 or 22), and the field of view C1 of the auxiliary imaging means (13 or 23), together with their imaging regions A, B, and C. The imaging region D, where imaging region A and imaging region B overlap, is the region in which the distance to the object to be measured can be measured. The auxiliary imaging means is installed at a position farther from the object than the point L at which the field of view A1 of the imaging means (11 or 21) and the field of view B1 of the imaging means (12 or 22) begin to overlap, so that its field of view crosses the outer boundary lines of A1 and B1.

FIG. 3 shows the relationship between the object to be measured as imaged in each imaging region and the feature points extracted on the object. The imaging regions A and B of the respective imaging means partly overlap, and imaging region C is a region that includes imaging region A, imaging region B, and their surroundings.

The imaging means shown in FIG. 1(a) or 1(b) are moved to capture the images denoted F-1, F-2, F-3, and F-4. F-1 to F-4 are in the order of the captured frames (several frames have been skipped as appropriate). Because the imaging means (11 or 21), (12 or 22), and (13 or 23) are moved while imaging, the position of the object to be measured changes from frame to frame. For the sake of explanation, only one round object with a leaf-like appendage is shown as a representative imaging target.

Let the feature points on the object obtained for acquiring shape information be a, b, c, d, and e. These feature points are obtained from edge detection, corner detection, and the relationship with surrounding pixels, such as differences in brightness and colour. In F-1 all five points lie within the imaging regions of all the imaging means, but in F-2 points a, d, and e are outside imaging region B, and in F-3 all five points are outside imaging region B, with points a, d, and e also outside imaging region A. Thereafter, in F-4, the four points other than point a are again within the imaging regions of all the imaging systems, and point a is within imaging regions A and C.
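As an illustration of how such feature points might be obtained (the publication does not specify a particular detector; the file name, thresholds, and detector choice below are assumptions), edge detection and corner detection can be sketched as follows:

```python
import cv2
import numpy as np

img = cv2.imread("region_A.png", cv2.IMREAD_GRAYSCALE)  # assumed file name

# Edge detection (Canny) and corner detection (Harris), the kinds of
# operations from which points such as a, b, c, d, e are obtained.
edges = cv2.Canny(img, threshold1=80, threshold2=160)
harris = cv2.cornerHarris(np.float32(img), blockSize=2, ksize=3, k=0.04)

# Harris responses above a relative threshold give candidate feature
# points; edge pixels give candidate contour features.
ys, xs = np.where(harris > 0.01 * harris.max())
print(f"{len(xs)} candidate corner pixels, "
      f"{int(edges.sum() / 255)} edge pixels")
```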

When images are captured in this way, if feature points d'-e', whose surroundings resemble those of the feature points d-e, exist nearby, and the information used for feature point extraction is insufficient, d and e may be confused with d' and e' once they leave the imaging region. For example, in F-2, what imaging region A of the imaging means (11 or 21) recognizes as d-e may be incorrectly matched to d'-e' in imaging region B of the imaging means (12 or 22).

In the present invention, however, the auxiliary imaging means (13 or 23), which images a region including all of the imaging regions of the imaging means (11 or 21) and (12 or 22) and their surroundings, is added. Furthermore, by determining in advance camera parameters such as the distances between all the imaging means, their focal lengths, the directions of their optical axes, and their F-numbers, the relationships between the coordinates of each feature point on the image of imaging region A and on the image of imaging region B, between the coordinates on the image of imaging region A and on the image of imaging region C, and between the coordinates on the image of imaging region B and on the image of imaging region C can be obtained.

FIG. 4 shows this schematically. As shown in F-1, for a plane at a certain distance from the imaging means, the image of imaging region A or imaging region B can be thought of as that image projected into imaging region C. In other words, the image of imaging region A or B can be regarded as part of the image of imaging region C, so that, as shown in F-2, even when feature points leave those imaging regions, tracking can continue as long as they remain within the image of imaging region C, and when those feature points return to imaging region A or B, the correct information is obtained instantly without one feature point being mistaken for another.
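The statement that, for a plane at a given distance, the image of imaging region A (or B) can be regarded as part of the image of imaging region C can be made concrete with a plane-induced homography. The sketch below is illustrative only: the intrinsic matrices, the relative pose of the auxiliary camera, and the plane depth are placeholder values standing in for the calibration results, and the fronto-parallel plane is an assumption.

```python
import numpy as np

def plane_homography(K_src, K_dst, R, t, Z):
    """Homography mapping pixels of the source camera (region A or B) to the
    destination camera (region C) for points on a fronto-parallel plane at
    depth Z:  H = K_dst (R + t n^T / Z) K_src^-1  with n = (0, 0, 1)^T."""
    n = np.array([[0.0, 0.0, 1.0]])
    H = K_dst @ (R + t.reshape(3, 1) @ n / Z) @ np.linalg.inv(K_src)
    return H / H[2, 2]

# Placeholder calibration values (would come from step S3 / K3).
K_A = np.array([[1200.0, 0, 640], [0, 1200.0, 480], [0, 0, 1]])
K_C = np.array([[600.0, 0, 640], [0, 600.0, 480], [0, 0, 1]])  # wider lens
R = np.eye(3)                      # auxiliary camera roughly parallel
t = np.array([0.0, -0.05, 0.0])    # assumed small offset to camera C
H = plane_homography(K_A, K_C, R, t, Z=2.0)

# A feature point at pixel (1000, 700) of region A maps into region C as:
p_A = np.array([1000.0, 700.0, 1.0])
p_C = H @ p_A
print(p_C[:2] / p_C[2])
```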

Feature points are extracted from each of the images of imaging regions A, B, and C by techniques such as edge detection and corner detection, and the corresponding points between the images are found using techniques such as block matching. Thereafter, by also using corresponding-point matching between frames, feature points can be tracked within each image and between images.
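A block-matching search of the kind mentioned here can be sketched with normalized cross-correlation; the window size, search strategy, and file names below are assumptions for illustration, not details taken from this publication.

```python
import cv2

def match_block(img_src, img_dst, pt, half=15):
    """Find the pixel in img_dst whose (2*half+1)^2 neighbourhood best
    matches the block around pt = (x, y) in img_src, using normalized
    cross-correlation over the whole destination image."""
    x, y = int(pt[0]), int(pt[1])
    template = img_src[y - half:y + half + 1, x - half:x + half + 1]
    response = cv2.matchTemplate(img_dst, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(response)
    # max_loc is the top-left corner of the best match; shift to its centre.
    return (max_loc[0] + half, max_loc[1] + half)

# Corresponding point of a region-A feature in the region-C image
# (assumed file names).
img_a = cv2.imread("region_A.png", cv2.IMREAD_GRAYSCALE)
img_c = cv2.imread("region_C.png", cv2.IMREAD_GRAYSCALE)
print(match_block(img_a, img_c, (320, 240)))
```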

FIG. 5 shows a flowchart of shape measurement according to the present invention.

The flow of shape measurement is described below with reference to FIG. 3 and FIG. 5, using the example in which the auxiliary imaging means 23, a further imaging means according to the present invention, is incorporated into the existing stereo camera 24 containing the pair of imaging means 21 and 22 shown in FIG. 1(b).

After START (S1), each imaging means is set up, that is, its imaging region (angle of view) and installation position are determined (S2). Camera parameters such as the distances between the two imaging means of the stereo camera and the auxiliary imaging means, the focal length of each camera, and the direction of each optical axis are obtained (calibration) (S3). After these preparations, images are captured (S4). Next, feature points are extracted from the image of the object captured by each imaging means (S5) (here, the feature points a, b, c, d, and e in FIG. 3 are obtained from each imaging means), and then the images of imaging regions A, B, and C obtained from the respective imaging means are associated with one another (the feature points in the respective imaging regions are matched) (S6). Next, the distance between each imaging means and each feature point of the object is calculated and shape data are acquired (S7, S8).
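The distance calculation of steps S7 and S8 can be carried out by projective triangulation once the projection matrices of the two shape-measurement cameras are known from the calibration step S3; the sketch below uses placeholder intrinsics, an assumed 12 cm baseline, and assumed pixel coordinates purely for illustration.

```python
import cv2
import numpy as np

# Projection matrices P = K [R | t] of imaging means 21 and 22, as would be
# obtained from calibration (placeholder values: identical intrinsics and a
# 12 cm horizontal baseline).
K = np.array([[1200.0, 0, 640], [0, 1200.0, 480], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.12], [0.0], [0.0]])])

# Pixel coordinates of one corresponding feature point (e.g. point a) in the
# two images, as 2 x N arrays, which is the layout triangulatePoints expects.
pt1 = np.array([[700.0], [500.0]])
pt2 = np.array([[670.0], [500.0]])

X_h = cv2.triangulatePoints(P1, P2, pt1, pt2)   # homogeneous 4 x 1 result
X = (X_h[:3] / X_h[3]).ravel()                  # Euclidean 3D coordinates
print("distance from imaging means 21:",
      round(float(np.linalg.norm(X)), 2), "m")
```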

If the entire shape (S9) has not yet been covered, the imaging means are moved as a unit to change the viewpoint (S10) and an image is captured (S11). Feature points are extracted from the images captured by each imaging means in the same way as before (S12), the distances between the imaging means and the feature points are calculated, and shape data are acquired (S13, S14). Next, the feature points captured this time and those captured in the previous frame are matched and tracked (S15), the feature points are associated between frames as before (S16), and the shape data obtained in each frame are integrated (joined into one coordinate system; inter-frame alignment) (S17).
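The inter-frame alignment of step S17 amounts to estimating the rigid motion between the 3D feature points measured in consecutive frames. The sketch below uses the Kabsch (SVD) method on synthetic data as one possible way to do this; the publication does not state which alignment algorithm is actually used.

```python
import numpy as np

def rigid_transform(P, Q):
    """Least-squares rotation R and translation t with q_i ~ R p_i + t,
    from corresponding (N, 3) arrays of 3D feature points (Kabsch method)."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = cQ - R @ cP
    return R, t

# Synthetic example: 3D positions of feature points a..e measured in frame
# F-1 (P) and measured again in frame F-2 (Q) after the cameras moved.
rng = np.random.default_rng(1)
P = rng.random((5, 3))
angle = np.deg2rad(10)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
Q = P @ R_true.T + np.array([0.05, 0.0, 0.02])

R, t = rigid_transform(P, Q)
# (R, t) maps F-1 coordinates to F-2 coordinates; applying its inverse to the
# F-2 shape data expresses it in the F-1 coordinate system (step S17).
print(np.allclose(R, R_true), np.round(t, 3))
```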

In the feature point tracking (S15), even when some or all of the points a, b, c, d, and e have moved out of imaging region A or imaging region B, as in F-2 and F-3 of FIG. 3, the image of imaging region A or B can be regarded as part of the image of imaging region C, as shown in FIG. 4, so tracking can be continued as long as the points remain within imaging region C.

In this way, imaging, feature point extraction, feature point tracking, feature point association, distance calculation, and shape data acquisition are performed over the entire shape (S9) while the plurality of imaging means and the auxiliary imaging means are moved as a unit, and the shape measurement then ends (S18).

The shape measurement described above tracks, using the images obtained from the auxiliary imaging means, the positions of the object captured by the plurality of imaging means. A different method, in which the images obtained from the auxiliary imaging means are used to determine changes in the relative position and orientation of the plurality of imaging means with respect to the position of the object and thereby measure its shape, is described next.

After START (K1), each imaging means is set up, that is, its imaging region (angle of view) and installation position are determined (K2). Camera parameters such as the distances between the two imaging means of the stereo camera and the auxiliary imaging means, the focal length of each camera, and the direction of each optical axis are obtained (calibration) (K3). After these preparations, images are captured (K4). Next, feature points are extracted from the image of the object captured by each imaging means (K5) (here, the feature points a, b, c, d, and e in FIG. 3 are obtained from each imaging means), and then the images of imaging regions A, B, and C obtained from the respective imaging means are associated with one another (the feature points in the respective imaging regions are matched) (K6). Next, the position and orientation of the plurality of imaging means relative to the feature points of the object are determined, and shape data are acquired (K7, K8).
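One common way to realise a step like K7 and K13, determining the position and orientation of the cameras relative to feature points whose 3D coordinates are already known, is a perspective-n-point (PnP) solution. The sketch below is illustrative only; the 3D coordinates, pixel positions, and intrinsic matrix are placeholders, and the publication does not state that PnP is the method actually used.

```python
import cv2
import numpy as np

# 3D coordinates of feature points a..e in the object coordinate system,
# assumed known from earlier frames (placeholder values).
obj_pts = np.array([[0.00, 0.00, 0.00],
                    [0.10, 0.00, 0.00],
                    [0.10, 0.10, 0.00],
                    [0.00, 0.10, 0.05],
                    [0.05, 0.05, 0.10]], dtype=np.float32)

# Their pixel positions in the current image of one imaging means
# (placeholder values).
img_pts = np.array([[640.0, 480.0], [760.0, 478.0], [758.0, 360.0],
                    [642.0, 358.0], [700.0, 420.0]], dtype=np.float32)

K = np.array([[1200.0, 0, 640], [0, 1200.0, 480], [0, 0, 1]])

# Perspective-n-point: rotation (Rodrigues vector) and translation of the
# object frame relative to the camera, i.e. the camera pose w.r.t. the object.
ok, rvec, tvec = cv2.solvePnP(obj_pts, img_pts, K, None)
R, _ = cv2.Rodrigues(rvec)
print(ok, tvec.ravel())
```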

If the entire shape (K9) has not yet been covered, the imaging means are moved as a unit to change the viewpoint (K10) and an image is captured (K11). Feature points are extracted from the images captured by each imaging means in the same way as before (K12), the changes in the relative position and orientation of the plurality of imaging means with respect to the object are determined, and shape data are obtained from those changes (K13, K14). Next, the feature points captured this time and those captured in the previous frame are matched and tracked (K15), the feature points are associated between frames as before (K16), and the shape data obtained in each frame are integrated (joined into one coordinate system; inter-frame alignment) (K17). When the entire shape has been covered (Y at K9), the process ends (END) (K18).

As described above, according to the shape measuring apparatus of the present invention, the auxiliary imaging means, which is provided separately from the plurality of shape-measurement imaging means and images a region including all of their imaging regions, makes it possible to keep tracking the feature points and to measure the shape even when the object has moved out of the imaging regions of the plurality of imaging means. In addition, since the auxiliary imaging means is used for feature point tracking while the distance information is obtained by the plurality of imaging means, shape measurement is possible without any loss of measurement accuracy.

11, 12 ... one set of shape-measurement imaging means
13 ... auxiliary imaging means
21, 22 ... one set of shape-measurement imaging means
23 ... auxiliary imaging means
24 ... stereo camera
A1 ... field of view of imaging means 11 or 21
B1 ... field of view of imaging means 12 or 22
C1 ... field of view of auxiliary imaging means 13 or 23
A ... imaging region of imaging means 11 or 21
B ... imaging region of imaging means 12 or 22
C ... imaging region of auxiliary imaging means 13 or 23
D ... imaging region where imaging region A and imaging region B overlap
L ... point at which field of view A1 and field of view B1 begin to overlap

Claims (4)

1. A shape measuring apparatus that captures images while moving a plurality of imaging means having different viewpoints in a predetermined positional relationship and measures the distance to an object to be measured using the images, the apparatus comprising an auxiliary imaging means that images the imaging regions of the plurality of imaging means beyond the point at which the fields of view of the plurality of imaging means begin to overlap, together with a region including the periphery of those imaging regions, and that moves integrally with the plurality of imaging means, wherein, when images are captured while the plurality of imaging means and the auxiliary imaging means are moved and the entire shape of the object to be measured is measured, the positions of the object captured by the plurality of imaging means are tracked using the images obtained from the auxiliary imaging means.

2. A shape measuring apparatus that captures images while moving a plurality of imaging means having different viewpoints in a predetermined positional relationship and measures the distance to an object to be measured using the images, the apparatus comprising an auxiliary imaging means that images the imaging regions of the plurality of imaging means beyond the point at which the fields of view of the plurality of imaging means begin to overlap, together with a region including the periphery of those imaging regions, and that moves integrally with the plurality of imaging means, wherein, when images are captured while the plurality of imaging means and the auxiliary imaging means are moved and the entire shape of the object to be measured is measured, changes in the relative position and orientation of the plurality of imaging means with respect to the position of the object are determined using the images obtained from the auxiliary imaging means.
3. A shape measuring method of capturing images while moving a plurality of imaging means having different viewpoints in a predetermined positional relationship and measuring the distance to an object to be measured using the images, wherein an auxiliary imaging means, which images the imaging regions of the plurality of imaging means beyond the point at which the fields of view of the plurality of imaging means begin to overlap together with a region including the periphery of those imaging regions and which moves integrally with the plurality of imaging means, is used, images are captured while the plurality of imaging means and the auxiliary imaging means are moved, and the positions of the object captured by the plurality of imaging means are tracked using the images obtained from the auxiliary imaging means.

4. A shape measuring method of capturing images while moving a plurality of imaging means having different viewpoints in a predetermined positional relationship and measuring the distance to an object to be measured using the images, wherein an auxiliary imaging means, which images the imaging regions of the plurality of imaging means beyond the point at which the fields of view of the plurality of imaging means begin to overlap together with a region including the periphery of those imaging regions and which moves integrally with the plurality of imaging means, is used, images are captured while the plurality of imaging means and the auxiliary imaging means are moved, and changes in the relative position and orientation of the plurality of imaging means with respect to the position of the object are determined using the images obtained from the auxiliary imaging means.
JP2009029605A 2008-06-05 2009-02-12 Shape measuring apparatus and shape measuring method Pending JP2010014699A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2009029605A JP2010014699A (en) 2008-06-05 2009-02-12 Shape measuring apparatus and shape measuring method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2008147889 2008-06-05
JP2009029605A JP2010014699A (en) 2008-06-05 2009-02-12 Shape measuring apparatus and shape measuring method

Publications (1)

Publication Number Publication Date
JP2010014699A true JP2010014699A (en) 2010-01-21

Family

ID=41700924

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2009029605A Pending JP2010014699A (en) 2008-06-05 2009-02-12 Shape measuring apparatus and shape measuring method

Country Status (1)

Country Link
JP (1) JP2010014699A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0979820A (en) * 1995-09-18 1997-03-28 Toshiba Corp Restoring device of object shape and camera visual point movement and restoring method of object shape and camera visual point movement
JP2002218449A (en) * 2001-01-17 2002-08-02 Atr Media Integration & Communications Res Lab Device for tracking moving object
JP2005338977A (en) * 2004-05-25 2005-12-08 Aichi Gakuin Three-dimensional image processing system
JP2006214735A (en) * 2005-02-01 2006-08-17 Viewplus Inc Compound stereo vision device
JP2007257331A (en) * 2006-03-23 2007-10-04 Mitsubishi Electric Corp Video synthesizing device
JP2007319938A (en) * 2006-05-30 2007-12-13 Toyota Motor Corp Robot device and method of obtaining three-dimensional shape of object

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012013580A (en) * 2010-07-01 2012-01-19 Central Res Inst Of Electric Power Ind System and program for simultaneously measuring shape, diameter and temperature of particle and droplet
CN111076674A (en) * 2019-12-12 2020-04-28 天目爱视(北京)科技有限公司 Closely target object 3D collection equipment
CN111445570A (en) * 2020-03-09 2020-07-24 天目爱视(北京)科技有限公司 Customized garment design production equipment and method
CN111445570B (en) * 2020-03-09 2021-04-27 天目爱视(北京)科技有限公司 Customized garment design production equipment and method
CN112254677A (en) * 2020-10-15 2021-01-22 天目爱视(北京)科技有限公司 Multi-position combined 3D acquisition system and method based on handheld device

Similar Documents

Publication Publication Date Title
US9965870B2 (en) Camera calibration method using a calibration target
US20190102911A1 (en) Extrinsic calibration of camera systems
JP2019194616A (en) Position detection method, device and equipment based upon image, and storage medium
CN107084680B (en) A kind of target depth measurement method based on machine monocular vision
JP2008224626A5 (en)
WO2018235163A1 (en) Calibration device, calibration chart, chart pattern generation device, and calibration method
JP5156601B2 (en) Shape measuring apparatus and program
CN110926330B (en) Image processing apparatus, image processing method, and program
JP2010133751A (en) Shape measuring device and program
CN107895344B (en) Video splicing device and method
KR101469099B1 (en) Auto-Camera Calibration Method Based on Human Object Tracking
JP2010014699A (en) Shape measuring apparatus and shape measuring method
CN105865350A (en) 3D object point cloud imaging method
JP2010256296A (en) Omnidirectional three-dimensional space recognition input apparatus
JP4209637B2 (en) Distance correction apparatus and distance correction method for monitoring system
KR102065337B1 (en) Apparatus and method for measuring movement information of an object using a cross-ratio
JP7300895B2 (en) Image processing device, image processing method, program, and storage medium
JP2011095131A (en) Image processing method
CN109410272B (en) Transformer nut recognition and positioning device and method
JP2004364212A (en) Object photographing apparatus, object photographing method and object photographing program
JP2006317418A (en) Image measuring device, image measurement method, measurement processing program, and recording medium
JP4550081B2 (en) Image measurement method
JP2008224323A (en) Stereoscopic photograph measuring instrument, stereoscopic photograph measuring method, and stereoscopic photograph measuring program
JP2005031044A (en) Three-dimensional error measuring device
US20130076868A1 (en) Stereoscopic imaging apparatus, face detection apparatus and methods of controlling operation of same

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20120123

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20130219

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20130220

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20131210