WO2015045834A1 - Marker image processing system - Google Patents
Marker image processing system
- Publication number
- WO2015045834A1 (PCT/JP2014/073682)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- marker
- camera
- index
- estimated
- dimensional
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01D—MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
- G01D5/00—Mechanical means for transferring the output of a sensing member; Means for converting the output of a sensing member to another variable where the form or nature of the sensing member does not constrain the means for converting; Transducers not specially adapted for a specific variable
- G01D5/26—Mechanical means for transferring the output of a sensing member; Means for converting the output of a sensing member to another variable where the form or nature of the sensing member does not constrain the means for converting; Transducers not specially adapted for a specific variable characterised by optical transfer means, i.e. using infrared, visible, or ultraviolet light
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/04—Indexing scheme for image data processing or generation, in general involving 3D image data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
Definitions
- The present invention relates to a marker image processing system that extracts various kinds of information by image processing of a marker.
- Markers are used together with cameras to recognize the position and orientation of objects in AR (Augmented Reality), in measurement applications, and in robotics.
- In general, a flat-plate marker carrying a planar pattern that can be attached to an object is widely used.
- Markers are an indispensable element in building systems that support reliable autonomous work by service robots.
- A typical marker consists of a square black frame with a two-dimensional pattern and code printed inside it. By recognizing the marker with a camera, the relative position and posture of the camera with respect to the marker can be recognized, and hence the position and posture of the object carrying the marker. Information recorded on the marker can also be read by the camera.
- FIG. 1 shows examples of commonly used ARToolKit, ARTag, CyberCode, and ARToolKitPlus markers.
- Markers that can record more information, such as QRCode, can also be used.
- In the AR field, CG (Computer Graphics) is superimposed on the captured image based on the recognized marker.
- In the robotics field, the robot recognizes the position and orientation of the object to which the marker is attached and manipulates the object.
- FIG. 2 is a diagram showing an example of a robot task using markers in the robotics field as an example.
- In this example, the camera mounted on the robot hand at the tip of the robot arm recognizes the refrigerator handle by its marker, and the robot hand opens and closes the door automatically.
- The robot autonomously generates a predetermined robot-arm trajectory based on its position and posture relative to the marker, and can thereby open the refrigerator door.
- Measurement of the marker by the camera is performed, for example, as follows. First, for a flat marker with a square outer frame, the image data read by the camera is processed to extract the outer frame and detect its four corners. Then, geometric calculation using the known positions of the four corners on the marker (the vertices of the square) and the positions of those corners in the captured image yields the position and posture of the camera with respect to the marker, from which the position and posture of the object carrying the marker can be recognized.
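As a hedged illustration of the simplest piece of this geometry (my own sketch, not code from the patent): under a pinhole model, a fronto-parallel marker of known physical size gives the camera-to-marker distance directly from its apparent size. Full six-degree-of-freedom recovery would instead use all four corner correspondences; the focal length here is an assumed camera intrinsic.

```python
def distance_from_apparent_size(side_mm, side_px, focal_px):
    """Pinhole-model distance to a fronto-parallel square marker.

    side_mm : known physical side length of the marker's outer frame
    side_px : side length of the marker as measured in the image, in pixels
    focal_px: camera focal length expressed in pixels (assumed intrinsic)
    """
    return focal_px * side_mm / side_px

# A 20 mm marker that appears 100 px wide to a camera with f = 800 px:
d = distance_from_apparent_size(20.0, 100.0, 800.0)  # -> 160.0 mm
```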
- However, the measurement error of the marker increases when the marker and the camera nearly face each other. Even though the variance of the marker measurements can be reduced by filtering (average, Kalman, particle filters, etc.) and by using the camera's measurement history, it remained difficult to determine reliably whether a given marker measurement was correct.
- Patent Document 1 describes providing, on the pattern constituting the marker, a lens that changes the observed shading pattern according to the viewing direction; by exploiting this change, the position and posture of the camera with respect to the marker are measured accurately, and the position and posture of the object carrying the marker are recognized accurately.
- Nevertheless, a measurement error occurs depending on the posture of the camera with respect to the marker.
- The appearance of the marker in the image varies with the size of the marker, the distance between the camera and the marker, and the angle of view (focal length) of the camera.
- As a result, recognition of the position and orientation of the marked object can become indeterminate.
- The left side of FIG. 3 shows the case where the camera and the marker are close to each other, or the camera has a large angle of view (wide-angle lens).
- The center of FIG. 3 shows the case where the camera and the marker are far apart, or the camera has a small angle of view (telephoto lens).
- In the latter case, the outer frame of the marker is projected onto the imaging surface, whose normal is the central axis of the lens. As shown on the right side of FIG. 3, even when the angle between the marker plane and the visual line through the lens center and the marker center point is tilted in the opposite direction, the deviation of the projected outer frame on the imaging surface is small, so the two postures cannot be distinguished.
- FIG. 4 shows actually captured images when the angle between the visual line through the lens center and the marker center point and the normal at the marker center point is 45°: (Camera1) is an image taken with a lens having an angle of view of 7.1°, and (Camera2) an image taken with a lens having an angle of view of 40.0°. Compared with the (Camera2) image, the (Camera1) image shows little difference from the directly facing case.
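This ambiguity can be reproduced numerically with a simple pinhole projection. The sketch below (my own illustration under an assumed pinhole model, not from the patent) projects a square marker tilted by +45° and by −45° about its X axis and measures how far apart the two projections are in pixels; with the apparent marker size held constant, the telephoto configuration leaves the two tilts nearly indistinguishable.

```python
import math

def project_tilted_square(f_px, dist_mm, tilt_rad, half_side_mm=10.0):
    """Project the 4 corners of a square marker tilted about its X axis."""
    pts = []
    for x, y in [(-1, -1), (1, -1), (1, 1), (-1, 1)]:
        xm, ym = x * half_side_mm, y * half_side_mm
        yc = ym * math.cos(tilt_rad)             # rotated corner (camera frame)
        zc = dist_mm - ym * math.sin(tilt_rad)   # depth from the lens
        pts.append((f_px * xm / zc, f_px * yc / zc))
    return pts

def max_corner_gap(f_px, dist_mm, tilt_deg=45.0):
    """Largest pixel distance between the +tilt and -tilt projections."""
    t = math.radians(tilt_deg)
    plus = project_tilted_square(f_px, dist_mm, t)
    minus = project_tilted_square(f_px, dist_mm, -t)
    return max(math.hypot(a[0] - b[0], a[1] - b[1])
               for a, b in zip(plus, minus))

# Same apparent marker size (f/dist held constant), different angles of view:
wide = max_corner_gap(f_px=400.0, dist_mm=100.0)    # wide angle, close
tele = max_corner_gap(f_px=4000.0, dist_mm=1000.0)  # telephoto, far
assert tele < wide  # the telephoto projections are much harder to tell apart
```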
- In FIG. 5, the posture of a flat marker held in the hand is analyzed continuously by image processing, and the normal (Z-axis) vector through the center point of the detected marker is indicated by an arrow.
- In frames 02, 06, and 08, the direction of the normal vector is reversed; at this stage, it can be seen that the estimate of the camera posture with respect to the AR marker fluctuates.
- One conceivable approach is to form six indices on a flat marker, including three-dimensional indices of different heights, identify each index in addition to the outer frame, and analyze the X axis, the Y axis (two-dimensional coordinates on the marker plane), the Z axis, and the rotation angle about each axis from the position and height data of each solid index relative to the flat marker. With such a method, however, all six indices must be identified accurately by image processing in order to determine the position and orientation of the camera with respect to the marker.
- To solve this problem, the present invention provides an image processing system that recognizes the position and posture of a camera with respect to a marker having plane indices, in which a single solid index of predetermined height is arranged at a predetermined position relative to the plane indices.
- The system comprises first estimation means for estimating the position and posture of the camera with respect to the marker by comparing the arrangement of the plane indices extracted from the camera image with the predetermined arrangement of the plane indices;
- second estimation means for estimating the center positions of the top and bottom surfaces of the solid index on the camera image; detection means for detecting, based on the camera image, the center position of the solid index estimated by the second estimation means;
- determination means for comparing the estimated center positions of the top and bottom surfaces of the solid index with those detected by the detection means, as three-dimensional vectors viewed from the camera, and determining that the camera posture with respect to the marker estimated by the first estimation means is erroneous when the error exceeds a predetermined value; and computation means which, when the determination means finds an error, reverses the signs of the camera's viewing angle about the X axis and viewing angle about the Y axis based on the posture estimated by the first estimation means, and then recomputes the correct posture of the camera with respect to the marker by applying a rotation transformation.
- Here, the X axis and the Y axis denote coordinate axes orthogonal to each other on the marker plane, in three-dimensional coordinates whose origin is one point on the marker.
- According to the present invention, whether an estimated posture is appropriate can be determined simply by placing one solid index of predetermined height at a predetermined position on a marker having plane indices, and, if the posture was misestimated, the already computed camera posture data can be corrected simply by transforming it according to a fixed rule. Accurate and reliable posture estimation is therefore possible at low cost and without occlusion by other three-dimensional indices.
- FIG. 1 is a diagram illustrating an example of a general marker.
- FIG. 2 is a diagram illustrating an example of a robot task using a marker.
- FIG. 3 is a diagram showing the case where the camera and the marker are close to each other or the camera's angle of view is large (left side), the case where the camera and the marker are far apart or the camera's angle of view is small (center), and, in contrast, the case in which erroneous estimation occurs (right side).
- FIG. 4 is a diagram comparing an image taken with a small angle of view (Camera1) and an image taken with a large angle of view (Camera2).
- FIG. 5 is a diagram showing the normal (Z-axis) direction passing through the center point of the marker obtained by continuously analyzing the posture of the marker held in the hand by image processing.
- FIG. 6 is a diagram illustrating an example of a marker used in the embodiment.
- FIG. 7 is a diagram illustrating the relationship between the estimated and detected positions of the center of the bottom surface of the solid index and of the white marker formed at the center of its top surface.
- FIG. 8 is a diagram comparing, with the camera fixed and the angle of the hand-held marker varied, a case where the posture is estimated correctly and a case where it is estimated erroneously.
- FIG. 9 is a diagram showing three-dimensional coordinates with the center point c of the marker as the origin.
- FIG. 10 is a diagram illustrating the change in the viewing angle θVXC calculated from the estimated posture when the angle of the marker is changed continuously and slowly.
- FIG. 11 is a diagram showing, with the camera fixed and the angle of the hand-held marker set in two different ways, the viewing angle θVXC about the X axis and the viewing angle θVYC about the Y axis obtained from the posture estimated in each state.
- FIG. 12 is a diagram explaining the correction of the estimated posture value by applying the rotation transformation.
- FIG. 13 shows experimental results demonstrating the effects of this example.
- As shown in FIG. 6, the flat marker consists of a 20 mm square outer frame, a white circle 1 of 12 mm diameter at the center, white circles 2 of 3 mm diameter at the same diagonal position from each vertex of the outer frame, and black elsewhere.
- In this embodiment, a solid index 3, a cylinder of 4.9 mm outer diameter and 3.2 mm height, was attached at a point 5 mm from the center of the bottom edge of the flat marker's outer frame.
- On the top surface of the solid index 3, a marker is formed that leaves a white circle of 2.2 mm diameter at the center, with the periphery painted black.
- First, the white circles 2 provided at the four corners of the marker are extracted from the image data as plane indices, and their arrangement is compared with their known positions on the marker (in this example, the vertices of a 13.5 mm square formed by the centers of the four corner circles). The first estimation means in the image processing system then estimates the position (three degrees of freedom) and posture (three degrees of freedom) of the camera by geometric analysis. Alternatively, without providing the white circles 2 at the four corners, the square outer frame may be image-processed to obtain its four vertices as plane indices, and the position of the camera with respect to the marker may be estimated by comparison with a square of 20.0 mm side.
- Meanwhile, the detection means in the image processing system identifies, from the image data, the center of the white marker at the center of the top surface of the solid index 3, and detects its position Pt,detected.
- When the posture is estimated accurately, the error between Pt,estimated and Pt,detected is small; when the posture is misestimated, the error between Pt,estimated and Pt,detected becomes large.
- The vector Vd is the vector whose start point is Pb,estimated and whose end point is Pt,detected.
- The vector Ve is the vector whose start point is Pb,estimated and whose end point is Pt,estimated.
- θ is the angle formed by the vectors Vd and Ve.
- Dividing the inner product Vd·Ve of the two vectors by |Vd||Ve| yields d, the value of cos θ.
- FIG. 8 shows the values of d obtained with the camera fixed while the angle of the hand-held marker was varied. On the left, the posture is judged correctly: d = 0.9999, that is, θ is nearly zero, and the vectors Vd and Ve point in nearly the same direction.
- In the center, d = −0.9992, that is, θ is approximately 180°, and the vectors Vd and Ve point in nearly opposite directions.
- The angle between the plane SVX defined by the visual line and the X axis and the X-Z plane is defined as the viewing angle θVXC about the X axis; similarly, the angle between the plane SVY defined by the visual line and the Y axis and the Y-Z plane is defined as the viewing angle θVYC about the Y axis.
- The three-dimensional vector of the visual line is specified by the line of intersection of the planes SVX and SVY, and this serves as a parameter expressing the camera posture with respect to the marker. Note that the origin of the three-dimensional coordinates is not limited to the center point c of the marker and may be set at any point on the marker plane.
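Under one plausible sign convention (an assumption on my part; the patent text does not fix one), these viewing angles reduce to arctangents of the visual-line components in marker coordinates:

```python
import math

def viewing_angles(view_dir):
    """Viewing angles (radians) of a visual-line direction vector.

    view_dir: (vx, vy, vz) direction from the marker origin toward the
    camera, in marker coordinates with Z along the marker normal.
    theta_vxc is the dihedral angle between the plane spanned by the
    X axis and the visual line and the X-Z plane; theta_vyc is defined
    analogously about the Y axis.
    """
    vx, vy, vz = view_dir
    theta_vxc = math.atan2(vy, vz)  # tilt of plane S_VX away from X-Z
    theta_vyc = math.atan2(vx, vz)  # tilt of plane S_VY away from Y-Z
    return theta_vxc, theta_vyc

# A camera directly above the marker sees both viewing angles as zero:
assert viewing_angles((0.0, 0.0, 1.0)) == (0.0, 0.0)
```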
- FIG. 10 shows the change in the viewing angle θVXC (solid line) calculated from the estimated posture while the angle of the marker was changed continuously and slowly, together with the change in the value of d (×10) at each point (circle marks).
- The points where θVXC reverses abruptly indicate that misestimation has occurred.
- At those points, most values of d fall below −0.6. Therefore, when the value of d (×10) is smaller than −6, that is, when d < −0.6, it can be determined that the posture has been misestimated by the first estimation means.
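A minimal sketch of this determination step, using the d < −0.6 threshold given above (the point coordinates in the usage lines are made-up camera-view positions, not values from the patent):

```python
import math

def cos_between(vd, ve):
    """d = cos(theta) between vectors Vd and Ve via the normalized dot product."""
    dot = sum(a * b for a, b in zip(vd, ve))
    return dot / (math.hypot(*vd) * math.hypot(*ve))

def misestimated(pb_est, pt_est, pt_det, threshold=-0.6):
    """Flag a pose as misestimated when Vd and Ve point opposite ways.

    Vd runs from the estimated bottom-center Pb,estimated to the detected
    top-center Pt,detected; Ve runs to the estimated top-center Pt,estimated.
    """
    vd = tuple(d - b for d, b in zip(pt_det, pb_est))
    ve = tuple(e - b for e, b in zip(pt_est, pb_est))
    return cos_between(vd, ve) < threshold

# Agreeing vectors -> pose accepted; opposing vectors -> flagged:
assert not misestimated((0, 0, 0), (0, 0, 3.2), (0.1, 0.0, 3.1))
assert misestimated((0, 0, 0), (0, 0, 3.2), (0.0, 0.0, -3.0))
```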
- FIG. 11 shows, with the camera fixed, the viewing angle θVXC about the X axis and the viewing angle θVYC about the Y axis obtained from the posture estimated at each of two angles of the hand-held marker. At each angle, posture misjudgments occur in response to minute hand tremors, and θVXC and θVYC change abruptly.
- Between a valid posture estimate and an erroneous one, the signs of θVXC and θVYC are reversed. By setting the correct viewpoint Pm at the sign-inverted position, the erroneous viewpoint Pi computed from the misestimate can be brought back to Pm by applying a rotation transformation, thereby correcting the posture estimate.
- Here, the viewpoint for the misjudged marker is denoted Pi and the corrected viewpoint Pm.
- The vector connecting the marker center point C and Pi is denoted CPi.
- The vector connecting the marker center point C and Pm is denoted CPm.
- The angle between the two vectors is denoted ρ.
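The rotation transformation built from these quantities can be sketched with Rodrigues' rotation formula; this is my own illustration of the stated construction (rotation axis from the cross product of CPi and CPm, rotation angle ρ between them), not code from the patent:

```python
import math

def cross(u, v):
    return (u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0])

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    return math.sqrt(dot(u, u))

def rotate_onto(cp_i, cp_m):
    """Rotate CP_i about the axis a = CP_i x CP_m by the angle rho between them.

    Rodrigues' rotation formula: the result should coincide with the
    direction of CP_m (exactly, when |CP_i| = |CP_m|).
    """
    a = cross(cp_i, cp_m)
    na = norm(a)
    if na == 0.0:                      # already parallel, nothing to rotate
        return cp_i
    a = tuple(x / na for x in a)       # unit rotation axis
    rho = math.acos(dot(cp_i, cp_m) / (norm(cp_i) * norm(cp_m)))
    c, s = math.cos(rho), math.sin(rho)
    ad = dot(a, cp_i)
    ax_cp = cross(a, cp_i)
    return tuple(cp_i[i]*c + ax_cp[i]*s + a[i]*ad*(1 - c) for i in range(3))

# Rotating an erroneous viewpoint vector onto the corrected one:
cp_i, cp_m = (1.0, 0.0, 1.0), (-1.0, 0.0, 1.0)
r = rotate_onto(cp_i, cp_m)
assert all(abs(x - y) < 1e-9 for x, y in zip(r, cp_m))
```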
- FIG. 13 shows the measurement results: about the X axis (θVXC), and on the right side about the Y axis (θVYC).
- As a result, misjudgments are reduced most in the case of VMP, which uses a lens that changes the shading pattern; however, it can be seen that the present method also reduces misjudgments effectively without using a lens that changes the shading pattern.
- Next, the marker to which the three-dimensional index used in this embodiment is attached will be described.
- Here, an existing square marker with a 20 mm outer frame is used, and a solid index, a cylinder of 4.9 mm outer diameter and 3.2 mm height, is bonded at a point 5 mm from the center of the bottom edge of the outer frame.
- However, various forms can be adopted, such as forming the three-dimensional index integrally with the mount of the flat marker.
- Since the arrangement and height of the solid index must be available to the image processing apparatus as known data, these parameters are stored in advance in a data file in the image processing apparatus.
- Alternatively, a QRCode or the like may be incorporated in the flat marker itself so that the data concerning the solid index are read on the image processing apparatus side. Note that specifications such as the arrangement and height of the solid index attached to each flat marker may be determined in advance and recorded as initial values on the image processing side.
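Such a parameter data file might look like the following sketch; the field names are hypothetical (my own), while the dimensions are those of the embodiment:

```python
import json

# Hypothetical parameter-file layout: the geometry the image processor
# must know about each marker before applying the correction.
MARKER_SPEC = json.loads("""
{
  "marker_id": "example-01",
  "outer_frame_mm": 20.0,
  "corner_circle_square_mm": 13.5,
  "solid_index": {
    "offset_from_bottom_edge_center_mm": 5.0,
    "outer_diameter_mm": 4.9,
    "height_mm": 3.2,
    "top_circle_diameter_mm": 2.2
  }
}
""")

assert MARKER_SPEC["solid_index"]["height_mm"] == 3.2
```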
- As described above, the present invention can determine whether an estimated posture is appropriate merely by placing one solid index of predetermined height at a predetermined position on a marker having plane indices, and, when the posture is misestimated, can correct the already computed camera posture data simply by transforming it according to a fixed rule. It can therefore be expected to be widely adopted as an image processing system capable of accurate and reliable posture estimation at low cost and without occlusion by other three-dimensional indices.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Geometry (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
Description
A typical marker includes a square black frame and a two-dimensional pattern and code printed inside it. By recognizing the marker with a camera, the relative position and posture of the camera with respect to the marker can be recognized, and the position and posture of the object carrying the marker can be recognized. The camera can also read information recorded on the marker.
First, when the marker is, for example, a flat marker with a square outer frame, the image data read by the camera is processed to extract the outer frame and detect its four corners. Then, by geometric calculation using the known positions of the four corners on this marker (the vertices of the square) and the positions of the four corners in the image read by the camera, the position and posture of the camera with respect to the marker are analyzed, and the position and posture of the object carrying the marker can be recognized.
When the camera and the marker are far apart, or when the camera's angle of view is small (telephoto lens), the marker's outer frame is projected onto an imaging surface whose normal is the central axis of the lens. As shown on the right side of FIG. 3, even when the angle between the marker plane and the visual line through the lens center and the marker center point is tilted in the opposite direction, the deviation of the marker's outer frame on the imaging surface is small, so the two cases cannot be distinguished. In the robotics field, for example, this causes the camera posture with respect to the marker to be misestimated and the robot hand to perform erroneous operations.
In frames 02, 06, and 08, the direction of the normal vector reverses in response to the hand's movement; at this stage, it can be seen that the camera posture estimate with respect to the AR marker fluctuates.
With such a method, however, six index points must be identified accurately by image processing in order to determine the camera's position and posture with respect to the marker. Indices shadowed by a solid index cannot be identified, so once the camera leaves the limited range in which all six points can be imaged accurately, the position and posture can no longer be determined accurately. Moreover, the position and height of each solid index must be selected with extremely high precision, which greatly increases the cost of the marker.
Here, the X axis and the Y axis denote coordinate axes orthogonal to each other on the marker plane, in three-dimensional coordinates whose origin is one point on the marker.
As shown in FIG. 6, the flat marker consists of a 20 mm square outer frame, a white circle 1 of 12 mm diameter at the center, white circles 2 of 3 mm diameter at the same diagonal position from each vertex of the outer frame, and black elsewhere. In this embodiment, a solid index 3, a cylinder of 4.9 mm outer diameter and 3.2 mm height, was attached at a point 5 mm from the center of the bottom edge of the flat marker's outer frame. On the top surface of the solid index 3, a marker is formed that leaves a white circle of 2.2 mm diameter at the center, with the periphery painted black.
Meanwhile, the detection means in the image processing system identifies, from the image data, the position of the center of the white marker at the center of the top surface of the solid index 3, and detects its position Pt,detected.
When the posture is estimated accurately, the error between Pt,estimated and Pt,detected is small; when the posture is misestimated, the error between Pt,estimated and Pt,detected becomes large.
By dividing by |Vd||Ve|, the value d of cos θ can be obtained.
FIG. 8 shows the values of d obtained with the camera fixed while the angle of the hand-held marker was varied. On the left, the judgment is correct: d = 0.9999, that is, θ is nearly zero, indicating that the vectors Vd and Ve point in nearly the same direction. In the center, d = −0.9992, that is, θ is approximately 180°, indicating that the vectors Vd and Ve point in nearly opposite directions.
Similarly, the angle between the plane SVY defined by the visual line and the Y axis, and the Y-Z plane, is defined as the viewing angle θVYC about the Y axis.
The three-dimensional vector of the visual line is specified by the line of intersection of the planes SVX and SVY, and this serves as the parameter expressing the camera posture with respect to the marker. Note that the origin of the three-dimensional coordinates is not limited to the marker's center point c and may be set at any point on the marker plane.
At each angle, posture misjudgments occur in response to minute hand tremors, and θVXC and θVYC change abruptly. Basically, between a valid posture estimate and an erroneous one, the signs of the viewing angle θVXC about the X axis and the viewing angle θVYC about the Y axis are reversed.
When the vector connecting the marker center point C and Pi is CPi, the vector connecting C and Pm is CPm, and the angle between the two vectors is ρ, the cross product of the vectors CPi and CPm gives the rotation-axis vector a = (ax ay az)T perpendicular to both vectors.
That is,
As a result, misjudgments are reduced most in the case of VMP, which uses a lens that changes the shading pattern; however, it can be seen that misjudgments are also effectively reduced without using a lens that changes the shading pattern.
However, when performing the above transformation, the image processing apparatus must obtain the arrangement and height of such a solid index relative to the plane indices as known data, so these parameters must be stored in advance in a data file in the image processing apparatus. Alternatively, a QRCode or the like may be incorporated in the flat marker itself so that each datum concerning the solid index is read on the image processing apparatus side.
Note that specifications such as the arrangement and height of the solid index attached to each flat marker may be determined in advance and recorded as initial values on the image processing side.
It can therefore be expected to be widely adopted as an image processing system capable of accurate and reliable posture estimation at low cost and without occlusion by other solid indices.
2 White circles arranged at the same diagonal position from each vertex of the outer frame
3 Solid index
Claims (3)
- In an image processing system that uses a marker having plane indices to recognize the position and posture of a camera with respect to the marker,
a single solid index having a predetermined height is arranged on the marker at a predetermined position relative to the plane indices,
and the system comprises: first estimation means for estimating the position and posture of the camera with respect to the marker by comparing the arrangement of the plane indices extracted from the camera image with the predetermined arrangement of the plane indices;
second estimation means for estimating the center positions of the top and bottom surfaces of the solid index on the camera image, based on the arrangement and height of the solid index and on the position of the camera with respect to the marker estimated by the first estimation means;
detection means for detecting the center position of the top surface of the solid index based on the camera image;
determination means for comparing the center positions of the top and bottom surfaces of the solid index estimated by the second estimation means with the center positions of the top and bottom surfaces of the solid index detected by the detection means, based on three-dimensional vectors viewed from the camera, and determining that the posture of the camera with respect to the marker estimated by the first estimation means is erroneous when the error reaches or exceeds a predetermined value; and
computation means which, when the determination means determines that there is an error, reverses the signs of the camera's viewing angle about the X axis and viewing angle about the Y axis based on the posture estimated by the first estimation means, and then recomputes the posture of the camera with respect to the marker by performing a rotation transformation, so that the posture of the camera with respect to the marker estimated by the first estimation means becomes correct.
Here, the X axis and the Y axis denote coordinate axes orthogonal to each other on the marker plane, in three-dimensional coordinates whose origin is one point on the marker plane. - A marker for use in the image processing system according to claim 1, wherein a solid index having a predetermined height is arranged at a predetermined position on a marker having plane indices.
- The marker according to claim 2, wherein the arrangement and height of the solid index on the marker are recorded as data on the marker.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015539078A JP6261016B2 (ja) | 2013-09-30 | 2014-09-08 | マーカ画像処理システム |
US15/025,102 US10262429B2 (en) | 2013-09-30 | 2014-09-08 | Marker image processing system |
CN201480054187.1A CN105612401B (zh) | 2013-09-30 | 2014-09-08 | 标记图像处理系统 |
EP14850001.0A EP3056854B1 (en) | 2013-09-30 | 2014-09-08 | Marker image processing system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013204947 | 2013-09-30 | ||
JP2013-204947 | 2013-09-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015045834A1 true WO2015045834A1 (ja) | 2015-04-02 |
Family
ID=52742981
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/073682 WO2015045834A1 (ja) | 2013-09-30 | 2014-09-08 | マーカ画像処理システム |
Country Status (5)
Country | Link |
---|---|
US (1) | US10262429B2 (ja) |
EP (1) | EP3056854B1 (ja) |
JP (1) | JP6261016B2 (ja) |
CN (1) | CN105612401B (ja) |
WO (1) | WO2015045834A1 (ja) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017007099A (ja) * | 2015-06-16 | 2017-01-12 | ファナック株式会社 | 射出成形システム |
WO2017160248A1 (en) * | 2016-03-18 | 2017-09-21 | Anadolu Universitesi | A fiducial marker, method for forming the fiducial marker, and system for sensing thereof |
WO2018135063A1 (ja) | 2017-01-17 | 2018-07-26 | 国立研究開発法人産業技術総合研究所 | マーカとマーカを用いた姿勢推定方法及び位置姿勢推定方法 |
WO2018147093A1 (ja) * | 2017-02-10 | 2018-08-16 | 株式会社エンプラス | マーカ |
WO2020036150A1 (ja) | 2018-08-15 | 2020-02-20 | 国立研究開発法人産業技術総合研究所 | マーカ |
KR102288194B1 (ko) * | 2021-03-05 | 2021-08-10 | 주식회사 맥스트 | 카메라 포즈 필터링 방법 및 이를 수행하기 위한 컴퓨팅 장치 |
JP2022536617A (ja) * | 2019-06-05 | 2022-08-18 | 北京外号信息技術有限公司 | 相対位置決めを実現するための装置及び対応する相対位置決め方法 |
US11521333B2 (en) | 2019-04-08 | 2022-12-06 | Nec Corporation | Camera calibration apparatus, camera calibration method, and non-transitory computer readable medium storing program |
JP2022186401A (ja) * | 2021-06-04 | 2022-12-15 | プライムプラネットエナジー&ソリューションズ株式会社 | 対象物の変位量測定方法および変位量測定装置 |
US11551419B2 (en) | 2018-10-10 | 2023-01-10 | Preferred Networks, Inc. | Method of generating three-dimensional model, training data, machine learning model, and system |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102016105496A1 (de) * | 2015-03-26 | 2016-09-29 | Faro Technologies Inc. | System zur Prüfung von Objekten mittels erweiterter Realität |
US10311596B2 (en) * | 2015-10-16 | 2019-06-04 | Seiko Epson Corporation | Image processing device, robot, robot system, and marker |
US10650591B1 (en) | 2016-05-24 | 2020-05-12 | Out of Sight Vision Systems LLC | Collision avoidance system for head mounted display utilized in room scale virtual reality system |
US10981060B1 (en) | 2016-05-24 | 2021-04-20 | Out of Sight Vision Systems LLC | Collision avoidance system for room scale virtual reality system |
CN109716061B (zh) * | 2016-09-13 | 2020-12-01 | 国立研究开发法人产业技术综合研究所 | 标记器和使用了标记器的姿势推定方法 |
JP6457469B2 (ja) * | 2016-12-08 | 2019-01-23 | ファナック株式会社 | 移動ロボットの干渉領域設定装置 |
WO2018143153A1 (ja) * | 2017-02-03 | 2018-08-09 | 三井住友建設株式会社 | 位置測定装置及び位置測定方法 |
CN107702714B (zh) * | 2017-07-31 | 2020-01-07 | 广州维绅科技有限公司 | 定位方法、装置及系统 |
US11413755B2 (en) * | 2017-12-31 | 2022-08-16 | Sarcos Corp. | Covert identification tags viewable by robots and robotic devices |
JP2019148865A (ja) * | 2018-02-26 | 2019-09-05 | パナソニックIpマネジメント株式会社 | 識別装置、識別方法、識別プログラムおよび識別プログラムを記録した一時的でない有形の記録媒体 |
CN108876900A (zh) * | 2018-05-11 | 2018-11-23 | 重庆爱奇艺智能科技有限公司 | 一种与现实场景融合的虚拟目标投射方法和系统 |
CN112051546B (zh) * | 2019-06-05 | 2024-03-08 | 北京外号信息技术有限公司 | 一种用于实现相对定位的装置以及相应的相对定位方法 |
JP7404011B2 (ja) * | 2019-09-24 | 2023-12-25 | 東芝テック株式会社 | 情報処理装置 |
DE102022200461A1 (de) | 2022-01-17 | 2023-07-20 | Volkswagen Aktiengesellschaft | Verfahren und Robotersystem zum Bearbeiten eines Werkstücks sowie Koordinatensystemmarker für ein Robotersystem |
CN114633263B (zh) * | 2022-01-19 | 2024-06-25 | 华东政法大学 | 一种电子数据取证监督机器人 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003281504A (ja) * | 2002-03-22 | 2003-10-03 | Canon Inc | 撮像部位置姿勢推定装置及びその制御方法並びに複合現実感提示システム |
JP2009288152A (ja) * | 2008-05-30 | 2009-12-10 | Nippon Soken Inc | 車載カメラのキャリブレーション方法 |
JP2012145559A (ja) | 2010-12-24 | 2012-08-02 | National Institute Of Advanced Industrial & Technology | マーカ |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2749329B2 (ja) * | 1988-08-04 | 1998-05-13 | 株式会社アマダ | 被補正物の位置補正装置 |
JPH04288541A (ja) * | 1991-03-14 | 1992-10-13 | Dainippon Printing Co Ltd | 画像歪補正装置 |
JP3729947B2 (ja) * | 1996-09-30 | 2005-12-21 | 住友電気工業株式会社 | 車両の位置算出装置 |
DE10131610C1 (de) * | 2001-06-29 | 2003-02-20 | Siemens Dematic Ag | Verfahren zur Kalibrierung des optischen Systems einer Lasermaschine zur Bearbeitung von elektrischen Schaltungssubstraten |
JP4136859B2 (ja) * | 2003-01-10 | 2008-08-20 | キヤノン株式会社 | 位置姿勢計測方法 |
US7845560B2 (en) * | 2004-12-14 | 2010-12-07 | Sky-Trax Incorporated | Method and apparatus for determining position and rotational orientation of an object |
JP6083747B2 (ja) * | 2012-10-24 | 2017-02-22 | 国立研究開発法人産業技術総合研究所 | 位置姿勢検出システム |
US9489738B2 (en) | 2013-04-26 | 2016-11-08 | Navigate Surgical Technologies, Inc. | System and method for tracking non-visible structure of a body with multi-element fiducial |
-
2014
- 2014-09-08 WO PCT/JP2014/073682 patent/WO2015045834A1/ja active Application Filing
- 2014-09-08 CN CN201480054187.1A patent/CN105612401B/zh active Active
- 2014-09-08 US US15/025,102 patent/US10262429B2/en active Active
- 2014-09-08 JP JP2015539078A patent/JP6261016B2/ja active Active
- 2014-09-08 EP EP14850001.0A patent/EP3056854B1/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003281504A (ja) * | 2002-03-22 | 2003-10-03 | Canon Inc | 撮像部位置姿勢推定装置及びその制御方法並びに複合現実感提示システム |
JP2009288152A (ja) * | 2008-05-30 | 2009-12-10 | Nippon Soken Inc | 車載カメラのキャリブレーション方法 |
JP2012145559A (ja) | 2010-12-24 | 2012-08-02 | National Institute Of Advanced Industrial & Technology | マーカ |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10035288B2 (en) | 2015-06-16 | 2018-07-31 | Fanuc Corporation | Injection molding system |
JP2017007099A (ja) * | 2015-06-16 | 2017-01-12 | ファナック株式会社 | 射出成形システム |
US10796183B2 (en) | 2016-03-18 | 2020-10-06 | Anadolu Universitesi | Fiducial marker, method for forming the fiducial marker, and system for sensing thereof |
WO2017160248A1 (en) * | 2016-03-18 | 2017-09-21 | Anadolu Universitesi | A fiducial marker, method for forming the fiducial marker, and system for sensing thereof |
WO2018135063A1 (ja) | 2017-01-17 | 2018-07-26 | 国立研究開発法人産業技術総合研究所 | マーカとマーカを用いた姿勢推定方法及び位置姿勢推定方法 |
US10928191B2 (en) | 2017-01-17 | 2021-02-23 | National Institute Of Advanced Industrial Science And Technology | Marker, and posture estimation method and position and posture estimation method using marker |
WO2018147093A1 (ja) * | 2017-02-10 | 2018-08-16 | Enplas Corporation | Marker |
JPWO2018147093A1 (ja) * | 2017-02-10 | 2019-11-14 | Enplas Corporation | Marker |
WO2020036150A1 (ja) | 2018-08-15 | 2020-02-20 | National Institute of Advanced Industrial Science and Technology | Marker |
US11551419B2 (en) | 2018-10-10 | 2023-01-10 | Preferred Networks, Inc. | Method of generating three-dimensional model, training data, machine learning model, and system |
US12020376B2 (en) | 2018-10-10 | 2024-06-25 | Preferred Networks, Inc. | Method of generating three-dimensional model, training data, machine learning model, and system |
US11521333B2 (en) | 2019-04-08 | 2022-12-06 | Nec Corporation | Camera calibration apparatus, camera calibration method, and non-transitory computer readable medium storing program |
US11830223B2 (en) | 2019-04-08 | 2023-11-28 | Nec Corporation | Camera calibration apparatus, camera calibration method, and nontransitory computer readable medium storing program |
JP2022536617A (ja) * | 2019-06-05 | 2022-08-18 | 北京外号信息技術有限公司 | Apparatus for realizing relative positioning, and corresponding relative positioning method |
KR102288194B1 (ko) * | 2021-03-05 | 2021-08-10 | Maxst Co., Ltd. | Camera pose filtering method and computing device for performing the same |
JP2022186401A (ja) * | 2021-06-04 | 2022-12-15 | Prime Planet Energy & Solutions, Inc. | Method and apparatus for measuring displacement of an object |
JP7414772B2 (ja) | 2021-06-04 | 2024-01-16 | Prime Planet Energy & Solutions, Inc. | Method and apparatus for measuring displacement of an object |
Also Published As
Publication number | Publication date |
---|---|
CN105612401A (zh) | 2016-05-25 |
CN105612401B (zh) | 2018-02-23 |
JP6261016B2 (ja) | 2018-01-17 |
US10262429B2 (en) | 2019-04-16 |
US20160239952A1 (en) | 2016-08-18 |
EP3056854A4 (en) | 2017-08-02 |
JPWO2015045834A1 (ja) | 2017-03-09 |
EP3056854B1 (en) | 2020-10-14 |
EP3056854A1 (en) | 2016-08-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6261016B2 (ja) | Marker image processing system | |
JP4593968B2 (ja) | Position and orientation measurement method and apparatus | |
CN105091744B (zh) | Pose detection device and method based on a visual sensor and a laser rangefinder | |
CN110570477B (zh) | Method, apparatus, and storage medium for calibrating the relative pose of a camera and a rotation axis | |
US9325969B2 (en) | Image capture environment calibration method and information processing apparatus | |
US20160117824A1 (en) | Posture estimation method and robot | |
JP5746477B2 (ja) | Model generation device, three-dimensional measurement device, control method therefor, and program | |
JP6324025B2 (ja) | Information processing apparatus and information processing method | |
JP2012002761A (ja) | Position and orientation measurement apparatus, processing method therefor, and program | |
JP6479296B2 (ja) | Position and orientation estimation apparatus and position and orientation estimation method | |
CN111279354A (zh) | Image processing method, device, and computer-readable storage medium | |
Ding et al. | A robust detection method of control points for calibration and measurement with defocused images | |
US20050069172A1 (en) | Index identifying method and system | |
CN113643380A (zh) | Robot arm guidance method based on monocular-camera visual target positioning | |
JP2008309595A (ja) | Object recognition device and program used therefor | |
JP5083715B2 (ja) | Three-dimensional position and orientation measurement method and apparatus | |
Puig et al. | Self-orientation of a hand-held catadioptric system in man-made environments | |
Ogata et al. | A robust position and posture measurement system using visual markers and an inertia measurement unit | |
WO2019093299A1 (ja) | Position information acquisition device and robot control device including the same | |
Merckel et al. | Evaluation of a method to solve the perspective-two-point problem using a three-axis orientation sensor | |
JPH07160881A (ja) | Environment recognition device | |
Mark | A Monocular Vision-based System Using Markers for a Real-Time 6D Pose Estimation of a Trailer | |
Caglioti et al. | Uncalibrated visual odometry for ground plane motion without auto-calibration | |
Palonen | Augmented Reality Based Human Machine Interface for Semiautonomous Work Machines | |
Bian et al. | Real-time tracking error estimation for augmented reality for registration with linecode markers |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 14850001; Country of ref document: EP; Kind code of ref document: A1 |
ENP | Entry into the national phase | Ref document number: 2015539078; Country of ref document: JP; Kind code of ref document: A |
WWE | Wipo information: entry into national phase | Ref document number: 15025102; Country of ref document: US |
REEP | Request for entry into the european phase | Ref document number: 2014850001; Country of ref document: EP |
WWE | Wipo information: entry into national phase | Ref document number: 2014850001; Country of ref document: EP |
NENP | Non-entry into the national phase | Ref country code: DE |