JP2000076417A - Camera calibration method - Google Patents

Camera calibration method

Info

Publication number
JP2000076417A
JP2000076417A (application JP24507498A)
Authority
JP
Japan
Prior art keywords
camera
photograph
photo
points
photographs
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP24507498A
Other languages
Japanese (ja)
Inventor
Yoichiro Matsumura
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Meidensha Corp
Meidensha Electric Manufacturing Co Ltd
Original Assignee
Meidensha Corp
Meidensha Electric Manufacturing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Meidensha Corp, Meidensha Electric Manufacturing Co Ltd filed Critical Meidensha Corp
Priority to JP24507498A priority Critical patent/JP2000076417A/en
Publication of JP2000076417A publication Critical patent/JP2000076417A/en
Pending legal-status Critical Current

Landscapes

  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Image Processing (AREA)

Abstract

PROBLEM TO BE SOLVED: To improve camera calibration accuracy by merging, via rotation and translation of their corresponding points, photographs that share three or more feature points into a single coordinate system, and determining the orientation and position vector of each camera. SOLUTION: The three-dimensional coordinates of feature points in a set A of photographs are calculated, and the camera orientation and position are determined for a set B of photographs containing six or more such feature points, which is then removed from the set A. The camera orientation and position are determined for every pair of photographs containing eight or more corresponding points, defining a set C of photograph pairs. When one photograph of a pair is an element of the set B, the other photograph is also added to the set B and its relative camera orientation and position are determined. The three-dimensional coordinates of the feature points in the new set B are then calculated, and for each photograph remaining in the set A that contains six or more of the calculated points, the camera orientation and position are determined; the photograph is added to the set B and removed from the set A. From the remainder of the set A, a set D of photographs whose relative camera orientations and positions are known is formed and its three-dimensional coordinates are calculated, and the set B is added to the set D. The corresponding points of photographs in the new set D that share three or more feature points are merged by rotation and translation into one coordinate system, and the camera orientations and positions are obtained.

Description

DETAILED DESCRIPTION OF THE INVENTION

[0001]

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to a camera calibration method for obtaining information on the position and orientation of the camera that took each photograph when creating a three-dimensional model from a plurality of photographs using the principle of stereo vision.

[0002]

2. Description of the Related Art

With the improvement of computer graphics processing performance, various three-dimensional simulation systems using three-dimensional computer graphics (3DCG) have begun to be applied in industry. In industrial applications in particular, the aim is to model real equipment (create three-dimensional models) and to realize various simulation functions, status display functions, and the like in three-dimensional applications.

[0003]

A problem in creating a three-dimensional model is the large number of man-hours required. For example, when creating an equipment model, dividing the target equipment into primitive shapes such as rectangular parallelepipeds and cylinders makes the model easier to create than modeling with free-form surfaces, but acquiring the spatial information (vertex coordinates) of those primitives is laborious.

[0004]

As a method for easily extracting such spatial information about primitives, a model creation technique using photographs has been proposed. In this technique, a three-dimensional model is created by fitting primitives onto photographs of the object taken from several directions. No numerical input for the model is required, and because photographs are used, the model creator can easily visualize the result, which simplifies model creation. A further advantage is that a realistic three-dimensional model can be created by using the photographs as textures.

[0005]

The spatial information is acquired from a plurality of photographs using the principle of stereo vision. This requires information on the orientation and position of the camera that took each photograph, and obtaining this information is called camera calibration. The present applicant has already proposed a camera calibration method that uses information on corresponding points across multiple photographs.

[0006]

In this camera calibration method, for every pair, among f (f ≥ 3) photographs, having eight or more corresponding points, the camera orientation and position vector are obtained. For each corresponding point of a photograph pair, the straight line connecting the viewpoint and the image is obtained from the camera orientation and position vector, the distance between the two such straight lines is obtained, and the average of these distances over the corresponding points is obtained as the camera calibration error. The pairs are sorted in ascending order of this error to obtain the camera orientations R1, R2, …, Rf, and the camera position vectors T1, T2, …, Tf are obtained as those minimizing an evaluation function whose variables include the feature points of the object on the photographs and the camera position vectors and orientations.

[0007]

PROBLEMS TO BE SOLVED BY THE INVENTION

In some cases, additional information such as drawings exists besides the photographs of the object, and the three-dimensional coordinates of some of the object's feature points can be extracted from it. In such cases, using the three-dimensional coordinates of these feature points in addition to the corresponding-point information on the photographs can be expected to improve the accuracy of camera calibration.

[0008]

With the conventional camera calibration method, however, the corresponding-point information on a plurality of photographs and the three-dimensional coordinates of feature points cannot be used together, and the additional information such as drawings goes to waste.

[0009]

An object of the present invention is to provide a camera calibration method that improves camera calibration accuracy by using additional information such as drawings.

[0010]

MEANS FOR SOLVING THE PROBLEMS

(First Invention) The present invention is a camera calibration method in which, in order to create a three-dimensional model of an object, using the principle of stereo vision, from a plurality of photographs taken while changing the position and orientation of a camera, information on the position and orientation of the camera that took each photograph is obtained from the feature points of the object on each photograph. For a set A of f (f ≥ 2) photographs, the three-dimensional coordinates of those feature points whose coordinates can be calculated from additional information such as drawings are calculated in advance. For the set B of photographs, among the f photographs, containing six or more of the feature points calculated from the additional information, the camera orientation and position vector are obtained from the three-dimensional points projected onto each photograph, and the set B is removed from the set A. For every pair, among the f photographs, having eight or more corresponding points, the relative camera orientation and position vector are obtained using the epipolar condition on corresponding points (images of the same three-dimensional point), giving a set C of photograph pairs. For each pair of the set C, if one photograph is an element of the set B, the other is also added to the set B, and the relative camera orientation and position vector of the photograph so added are obtained. For each photograph of the new set B, the three-dimensional coordinates of the feature points appearing in it are obtained using the principle of stereo vision. Among the photographs remaining in the set A, for each photograph in which six or more points with calculated three-dimensional coordinates appear, the camera orientation and position vector are obtained from the three-dimensional points projected onto it, and the photograph is added to the set B and removed from the set A. From the photographs remaining in the set A, a set D of photographs whose relative camera orientations and position vectors are known is obtained, and the relative three-dimensional coordinates of the feature points are calculated for the photographs of the set D using the principle of stereo vision. The set B is added to the set D, and in the new set D the corresponding points of photographs sharing three or more feature points are merged by rotation and translation until finally one coordinate system remains, from which the camera orientations and position vectors are obtained.

[0011]

(Second Invention) The camera orientations R1, R2, …, Rf and position vectors T1, T2, …, Tf obtained above are used as initial values, and the evaluation function H of the following equation

[0012]

(Equation 3)

[0013]

Here, Pi is, for a feature point whose three-dimensional coordinates were calculated from the additional information such as drawings, the vector representing those three-dimensional coordinates; the other feature points are given by the following equation. qfi is the three-dimensional vector of the feature point Pi on the f-th photograph.

[0014]

(Equation 4)

[0015]

The method is characterized in that the position and orientation values of each camera are corrected by minimizing the above.

[0016]

DESCRIPTION OF THE PREFERRED EMBODIMENTS

In this embodiment, camera calibration is realized by jointly using corresponding-point information on a plurality of photographs and the three-dimensional coordinates of feature points extracted from additional information such as drawings. The basic matters and notation are explained first, followed by the embodiments.

[0017]

(Basic Matters and Notation) The transformation when a three-dimensional point (X, Y, Z) is projected onto a point (u, v) on a photograph is formulated by the following equation.

[0018]

(Equation 5)

[0019]

Here, the three-dimensional rotation matrix R represents the orientation of the camera, and the three-dimensional vector T represents its position.
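The projection described in [0017]-[0019] can be sketched in code. Since equation (1) itself is not reproduced in this text, the convention λ·(u, v, 1)ᵀ = Rᵀ(P − T) used below is an assumption (one common pinhole formulation), and all numbers in the example are illustrative, not the patent's.

```python
import numpy as np

def project(P, R, T):
    """Project a 3-D point P onto the image plane of a camera with
    orientation R (3x3 rotation matrix) and position T (3-vector).
    Assumed convention: lambda * (u, v, 1) = R^T (P - T)."""
    p_cam = R.T @ (np.asarray(P, float) - np.asarray(T, float))  # point in camera coordinates
    return p_cam[:2] / p_cam[2]                                  # (u, v) after perspective division

# Hypothetical example: identity orientation, camera at the origin.
R = np.eye(3)
T = np.zeros(3)
u, v = project([1.0, 2.0, 4.0], R, T)
print(u, v)  # 0.25 0.5
```

With R the identity and T at the origin, the camera coordinates equal the world coordinates, so the image point is simply (X/Z, Y/Z).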

[0020]

As shown in FIG. 2, when a point (u1, v1) on photograph 1 and a point (u2, v2) on photograph 2 are images of the same three-dimensional point, these points are called corresponding points, and the following epipolar condition holds.

[0021]

(Equation 6)

[0022]

Here, R1^T R2 and R1^T (T2 − T1) are, respectively, the orientation and position of the camera that took photograph 2 (camera 2) as seen in the coordinate system of the camera that took photograph 1 (camera 1); that is, the orientation and position of camera 2 relative to camera 1.
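Since equation (2) itself is not transcribed here, the epipolar condition can be sketched under the same assumed projection convention: the homogeneous image vectors q1, the relative translation t = R1ᵀ(T2 − T1), and the rotated q2 are coplanar, i.e. q1ᵀ [t]× (R1ᵀR2) q2 = 0. The essential-matrix form E = [t]× R_rel below is an assumption consistent with the relative pose just described, not the patent's literal equation.

```python
import numpy as np

def skew(v):
    # Cross-product matrix [v]_x such that skew(v) @ w == np.cross(v, w).
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def epipolar_residual(q1, q2, R1, T1, R2, T2):
    """Residual of the epipolar condition q1^T E q2 = 0 for homogeneous
    image vectors q = (u, v, 1), under the assumed projection
    lambda * q = R^T (P - T).  Relative pose of camera 2 in camera 1's
    frame: R_rel = R1^T R2, t = R1^T (T2 - T1), and E = [t]_x R_rel."""
    R_rel = R1.T @ R2
    t = R1.T @ (T2 - T1)
    E = skew(t) @ R_rel
    return float(q1 @ E @ q2)

def project(P, R, T):
    # Same assumed projection as above, kept homogeneous.
    p = R.T @ (P - T)
    return np.array([p[0] / p[2], p[1] / p[2], 1.0])

# Hypothetical check: project one 3-D point into two cameras and verify
# that the residual vanishes up to rounding.
P = np.array([0.3, -0.2, 5.0])
R1, T1 = np.eye(3), np.array([0.0, 0.0, 0.0])
c, s = np.cos(0.1), np.sin(0.1)
R2 = np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])  # small rotation about y
T2 = np.array([1.0, 0.0, 0.0])                               # baseline along x
r = epipolar_residual(project(P, R1, T1), project(P, R2, T2), R1, T1, R2, T2)
print(abs(r) < 1e-10)  # True
```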

[0023]

The number of photographs is f (f ≥ 2), and the photographs are denoted photograph 1, photograph 2, …, photograph f. The number of feature points of the object is N, and the i-th feature point is denoted Pi. The set of indices of the photographs in which Pi appears is denoted Ii. When the feature point Pi appears in the f-th photograph, that is, when f ∈ Ii, the position of the image of Pi on the f-th photograph is (ufi, vfi), and the three-dimensional vector (ufi, vfi, 1) is denoted qfi. The orientation of the camera that took the f-th photograph is denoted Rf and its position Tf.

[0024]

(First Embodiment) FIG. 1 shows the processing procedure of a camera calibration method according to an embodiment of the present invention. The algorithm that performs camera calibration of photographs using drawings in combination consists of steps S1 to S11 in the figure, described in detail below.

[0025]

(S1) Consider the set whose elements are photograph 1, photograph 2, …, photograph f, and denote it A.

[0026]

A = {photograph 1, photograph 2, …, photograph f}

(S2) For those of the feature points P1, P2, …, PN whose three-dimensional coordinates can be calculated from additional information such as drawings, the three-dimensional coordinates are calculated.

[0027]

(S3) Among photograph 1, photograph 2, …, photograph f, for each photograph in which six or more of the points whose three-dimensional coordinates were calculated in step S2 appear (the points must not lie on the same plane), camera calibration is performed using equation (1) above (the camera orientation and position are obtained). The set of photographs for which camera calibration has been performed is denoted B, and the elements of the set B are removed from the set A. For example:

B = {photograph 2, photograph 5}
A = {photograph 1, photograph 3, photograph 4, photograph 6, …, photograph f}

(S4) For every pair of photographs having eight or more corresponding points, relative camera calibration is performed using the epipolar condition of equation (2) above (the orientation and position of the camera that took one photograph are calculated with respect to the camera coordinate system of the other). The set of photograph pairs for which relative camera calibration has been performed is denoted C. For example:

C = {(photograph 1, photograph 2), (photograph 1, photograph 4), (photograph 3, photograph 6), …}

(S5) For each element (photograph pair) of the set C, if one of the photographs is an element of the set B, the other photograph is also added to the set B. The camera orientation and position of each photograph newly added to the set B are then obtained (calculated from the camera orientation and position of the photograph that was originally an element of the set B and the relative camera orientation and position of the newly added photograph).

[0028]

Among the elements of the set C, those consisting of a pair of photographs both contained in the set B are deleted. The elements of the set B are removed from the set A.

[0029]

The above is repeated until there are no more photographs to add to the set B. For example, if

A = {photograph 1, photograph 3, photograph 4, photograph 6, …, photograph f}
B = {photograph 2, photograph 5}
C = {(photograph 1, photograph 2), (photograph 1, photograph 4), (photograph 3, photograph 6)}

then the sets A, B, and C become

A = {photograph 3, photograph 6, …, photograph f}
B = {photograph 1, photograph 2, photograph 4, photograph 5}
C = {(photograph 6, photograph 7)}

(S6) Using the principle of stereo vision, the three-dimensional coordinates of the feature points appearing in each photograph of the set B are calculated (excluding those whose three-dimensional coordinates were calculated in step S2).
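The stereo computation in step S6 can be sketched as a midpoint triangulation under the same assumed ray model X = T + μ·(R q): the two rays through the corresponding image points are intersected in the least-squares sense. The function and data below are illustrative, not the patent's implementation.

```python
import numpy as np

def triangulate_midpoint(q1, R1, T1, q2, R2, T2):
    """Recover the 3-D point whose images are the homogeneous vectors
    q1, q2 (q = (u, v, 1)) in two calibrated cameras, under the assumed
    ray model X = T + mu * (R @ q).  Solves for the two ray parameters in
    the least-squares sense and returns the midpoint of closest approach."""
    d1 = R1 @ q1                        # ray directions in world coordinates
    d2 = R2 @ q2
    A = np.stack([d1, -d2], axis=1)     # [d1 -d2] [mu1 mu2]^T ~= T2 - T1
    mu, *_ = np.linalg.lstsq(A, T2 - T1, rcond=None)
    return (T1 + mu[0] * d1 + T2 + mu[1] * d2) / 2

def project(P, R, T):                   # same assumed projection convention
    p = R.T @ (P - T)
    return np.array([p[0] / p[2], p[1] / p[2], 1.0])

# Hypothetical two-camera setup: the triangulated point matches the original.
P = np.array([0.3, -0.2, 5.0])
R1, T1 = np.eye(3), np.zeros(3)
c, s = np.cos(0.1), np.sin(0.1)
R2 = np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])
T2 = np.array([1.0, 0.0, 0.0])
X = triangulate_midpoint(project(P, R1, T1), R1, T1, project(P, R2, T2), R2, T2)
print(np.allclose(X, P))  # True
```

With noise-free corresponding points the two rays intersect exactly, so the midpoint coincides with the original point; with noisy data it is the closest-approach compromise.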

[0030]

(S7) Among the photographs remaining in the set A, for each photograph in which six or more of the points whose three-dimensional coordinates were calculated in steps S2 and S6 appear (the points must not lie on the same plane), camera calibration is performed using equation (1) above (the camera orientation and position are obtained). The photographs for which camera calibration has been performed are added to the set B and removed from the set A. If any photograph was newly added to the set B, the process returns to step S5.
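Steps S3 to S5 (and the repetition via S7) amount to a closure computation over the pair set C: photographs pairwise-linked to an already-calibrated photograph are pulled into B until nothing more can be added. A minimal sketch with hypothetical photograph indices follows; the actual pose chaining is abstracted away, and only the set bookkeeping is shown.

```python
def propagate(B, C):
    """B: set of calibrated photograph indices; C: set of frozenset pairs
    with a known relative pose.  Grows B along the pairs (step S5) and
    drops pairs fully contained in B (paragraph [0028])."""
    changed = True
    while changed:
        changed = False
        for pair in list(C):
            a, b = tuple(pair)
            if (a in B) != (b in B):   # exactly one side already calibrated
                B |= pair              # chain the relative pose onto the other
                changed = True
            if pair <= B:              # both calibrated: the pair is spent
                C.discard(pair)
    return B, C

# Hypothetical data mirroring the example in [0027]:
B = {2, 5}
C = {frozenset({1, 2}), frozenset({1, 4}), frozenset({3, 6})}
B, C = propagate(B, C)
print(sorted(B))               # [1, 2, 4, 5]
print([sorted(p) for p in C])  # [[3, 6]]
```

Pairs neither of whose photographs can be reached from B (here {3, 6}) stay in C; they are the raw material for the set D of step S8.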

[0031]

(S8) From the photographs remaining in the set A, a set D of photographs whose camera orientations and positions relative to one another are known is obtained (by the same method as steps S5 to S7 above). For example, if

A = {photograph 3, photograph 6, photograph 7, photograph 8, photograph f}
C = {(photograph 6, photograph 7), (photograph 3, photograph 7), (photograph 8, photograph f)}

then the set D is

D = {{photograph 3, photograph 6, photograph 7}, {photograph 8, photograph f}}

(S9) For each element of the set D, the relative three-dimensional coordinates of the feature points (the three-dimensional coordinates in the coordinate system of that element) are calculated using the principle of stereo vision.

[0032]

(S10) The set B is added to the set D. For example, if

B = {photograph 1, photograph 2, photograph 4, photograph 5}
D = {{photograph 3, photograph 6, photograph 7}, {photograph 8, photograph f}}

then the set D becomes

D = {{photograph 1, photograph 2, photograph 4, photograph 5}, {photograph 3, photograph 6, photograph 7}, {photograph 8, photograph f}}

(S11) For any two elements of the set D that share three or more three-dimensional points (not lying on the same straight line), the transformation (rotation and translation) between their coordinate systems can be determined. The elements are merged in this way until finally one remains, and the camera positions T1, T2, …, Tf and orientations R1, R2, …, Rf are obtained.
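The rotation and translation between two coordinate systems in step S11 can be estimated from the shared points by the Kabsch (orthogonal Procrustes) algorithm; this is a standard technique substituted here for the patent's unspecified computation, and the data are hypothetical. As the text notes, at least three non-collinear correspondences are needed. (Any scale difference between relative reconstructions is not handled in this sketch.)

```python
import numpy as np

def align_rigid(src, dst):
    """Rigid transform (R, t) minimizing sum ||R @ src_i + t - dst_i||^2
    over corresponding 3-D points given as columns (Kabsch algorithm).
    Needs at least three non-collinear correspondences, as in step S11."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    cs = src.mean(axis=1, keepdims=True)   # centroids
    cd = dst.mean(axis=1, keepdims=True)
    H = (dst - cd) @ (src - cs).T          # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(U @ Vt))     # guard against a reflection
    R = U @ np.diag([1.0, 1.0, d]) @ Vt
    t = cd - R @ cs
    return R, t

# Hypothetical check: recover a known rotation about z and a translation.
src = np.array([[0.0, 1.0, 0.0, 0.0],
                [0.0, 0.0, 1.0, 0.0],
                [0.0, 0.0, 0.0, 1.0]])     # 4 non-collinear points as columns
c, s = np.cos(0.5), np.sin(0.5)
R0 = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
t0 = np.array([[1.0], [2.0], [3.0]])
R, t = align_rigid(src, R0 @ src + t0)
print(np.allclose(R, R0), np.allclose(t, t0))  # True True
```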

[0033]

(Second Embodiment) In this embodiment, the camera orientations and positions are first obtained from a plurality of photographs and additional information such as drawings by the same procedure as in the first embodiment. Thereafter, these camera positions T1, T2, …, Tf and orientations R1, R2, …, Rf are used as initial values, and the camera position and orientation values are corrected by minimizing the evaluation function H of the following equation, improving the accuracy.

[0034]

(Equation 7)

[0035]

Here, Pi is, for a feature point whose three-dimensional coordinates were calculated in step S2, the vector representing those three-dimensional coordinates; the other feature points are given by the following equation. The quasi-Newton method, the conjugate gradient method, or the like is used for the minimization.

[0036]

(Equation 8)

[0037]

EFFECTS OF THE INVENTION

As described above, the present invention provides the following effects.

[0038]

(1) There is no need for a person to specify the approximate position and orientation of the camera for each photograph. Specifying camera positions and orientations is tedious work, and this labor is eliminated.

[0039]

(2) Using three-dimensional coordinates calculated from additional information such as drawings, in addition to the corresponding-point information on the photographs, makes it possible to improve the accuracy of camera calibration.

[0040]

(3) Since the orientation and position of each camera are corrected by minimizing the evaluation function, with the obtained camera positions and orientations as initial values, camera calibration is performed with high accuracy.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is the processing procedure of camera calibration according to an embodiment of the present invention.

FIG. 2 is a diagram showing the relationship between photographs and corresponding points.

Claims (2)

1. A camera calibration method for obtaining information on the position and orientation of the camera that photographed an object from the feature points of the object on each photograph, in order to create a three-dimensional model of the object, using the principle of stereo vision, from a plurality of photographs taken while changing the position and orientation of a camera, the method comprising:

for a set A of f (f ≥ 2) photographs, calculating in advance the three-dimensional coordinates of those feature points of the object whose three-dimensional coordinates can be calculated from additional information such as drawings;

for a set B of photographs, among the f photographs, containing six or more of the feature points calculated from the additional information, obtaining the camera orientation and position vector from the three-dimensional points projected onto each photograph, and removing the set B from the set A;

for every pair of photographs, among the f photographs, having eight or more corresponding points, obtaining the relative camera orientation and position vector using the epipolar condition on the corresponding points, which are images of the same three-dimensional point, and obtaining a set C of these photograph pairs;

for each pair of the set C, when one of the photographs is an element of the set B, adding the other photograph to the set B as well, and obtaining the relative camera orientation and position vector of each photograph so added;

for each photograph of the new set B, obtaining the three-dimensional coordinates of the feature points appearing in it using the principle of stereo vision;

for each photograph remaining in the set A in which six or more of the points with calculated three-dimensional coordinates appear, obtaining the camera orientation and position vector from the three-dimensional points projected onto it, adding the photograph to the set B, and removing it from the set A;

from the photographs remaining in the set A, obtaining a set D of photographs whose relative camera orientations and position vectors are known;

calculating the relative three-dimensional coordinates of the feature points for the photographs of the set D using the principle of stereo vision; and

merging, by rotation and translation, the corresponding points of photographs sharing three or more feature points in the new set D obtained by adding the set B to the set D, until finally one remains, and obtaining the camera orientations and position vectors.
2. The camera calibration method according to claim 1, wherein the orientations R1, R2, …, Rf and the position vectors T1, T2, …, Tf of the cameras are used as initial values, and the position and orientation values of each camera are corrected by minimizing the evaluation function H of the following equation:

(Equation 1)

where Pi is, for a feature point calculated from the additional information such as drawings, the vector representing its three-dimensional coordinates, the other feature points being given by the following equation, and qfi is the three-dimensional vector of the feature point Pi on the f-th photograph:

(Equation 2)
JP24507498A 1998-08-31 1998-08-31 Camera calibration method Pending JP2000076417A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP24507498A JP2000076417A (en) 1998-08-31 1998-08-31 Camera calibration method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP24507498A JP2000076417A (en) 1998-08-31 1998-08-31 Camera calibration method

Publications (1)

Publication Number Publication Date
JP2000076417A (en) 2000-03-14

Family

ID=17128217

Family Applications (1)

Application Number Title Priority Date Filing Date
JP24507498A Pending JP2000076417A (en) 1998-08-31 1998-08-31 Camera calibration method

Country Status (1)

Country Link
JP (1) JP2000076417A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100453966C (en) * 2005-01-10 2009-01-21 北京航空航天大学 Spatial three-dimensional position attitude measurement method for video camera

