JP2007219704A - Image position measuring method, image position measuring device, and image position measuring program - Google Patents

Image position measuring method, image position measuring device, and image position measuring program

Info

Publication number
JP2007219704A
Authority
JP
Japan
Prior art keywords
image
similarity
estimation
peak
estimated
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2006037856A
Other languages
Japanese (ja)
Other versions
JP4887820B2 (en)
Inventor
Hiroaki Okamoto
浩明 岡本
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujitsu Ltd
Original Assignee
Fujitsu Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujitsu Ltd filed Critical Fujitsu Ltd
Priority to JP2006037856A priority Critical patent/JP4887820B2/en
Publication of JP2007219704A publication Critical patent/JP2007219704A/en
Application granted granted Critical
Publication of JP4887820B2 publication Critical patent/JP4887820B2/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)

Abstract

PROBLEM TO BE SOLVED: To provide a subpixel-accurate positioning method and device that achieve high-speed, real-time processing without complicating the computation, and that realize highly accurate position measurement even when the similarity distribution is anisotropic.
SOLUTION: The image similarity between input image data to be searched and template data of a reference image input in advance is calculated to create a similarity map. One-dimensional peak position estimation is then performed along the x and y axes centered on the maximum-similarity data point detected in the similarity map, so that a subpixel-accurate peak position is provisionally estimated. A similar one-dimensional peak position estimation is performed on the adjacent x and y axes straddling the estimated subpixel peak position, and the estimated peak position is corrected by linear interpolation using those estimation results, whereby the image position is determined.
COPYRIGHT: (C)2007,JPO&INPIT

Description

The present invention relates to a method and an apparatus for performing highly accurate position measurement of parts and marks using images, in the FA (factory automation) field and the like.

As one conventional method of subpixel-accurate position measurement using images, the image similarity is computed pixel by pixel through template matching, and the peak coordinates are estimated by interpolating those values, thereby measuring the position of an object or mark with subpixel accuracy. Specific examples include:
(1) a method that estimates the peak by independently fitting a function or the like to the one-dimensional similarity data along the x and y coordinates;
(2) a method that estimates the peak by fitting an isotropic curved surface to the two-dimensional similarity data on the x and y axes; and
(3) a method that estimates the peak on the two-dimensional data of the x and y axes by assuming an anisotropic two-dimensional model.
Method (1) is widely used because it is simple and requires little computation, but an error occurs when the distribution of the similarity is not isotropic, that is, when it is not distributed uniformly and concentrically around the estimated peak position. In a real image the distribution is never completely isotropic.
Method (2) requires more computation because it processes two-dimensional data, and it still produces an error when the similarity distribution is anisotropic.
Method (3) allows highly accurate measurement even when the similarity distribution is anisotropic, but the computational cost is high because the model is complicated.

Patent Document 1 describes a technique in which the grid point position having the maximum grid point value is first determined, the positions of the adjacent grid points having the next-largest grid point values are then determined along the x axis and the y axis so that a quadrangle containing the subpixel maximum is identified, a one-dimensional partial maximum is located along each of the four sides, and the intersection of the two straight lines connecting the one-dimensional partial maxima on opposite sides of the quadrangle is taken as the estimated position of the subpixel maximum. That technique differs, however, in how the position of the subpixel maximum is estimated and how the estimated peak position is corrected; unlike the present invention, it does not directly use a single one-dimensional estimate obtained from three points centered on the maximum point, and no prior technique has achieved high measurement accuracy with simple computation in fast real time.
JP 2002-517045 A

Fields such as FA require fast real-time processing. The object of the present invention is therefore to provide a method and an apparatus that realize highly accurate position measurement, without complicating the computation, even when the similarity distribution is anisotropic.

In the present invention, peak estimation by function fitting or the like is performed on the one-dimensional similarity data independently for the x and y coordinates, and the estimation results are corrected two-dimensionally, so that the peak position of the similarity is estimated simply and with high accuracy.

According to the present invention, the peak position is estimated by one-dimensional processing at two locations for each coordinate axis, and these estimates are used to correct each other, which improves the accuracy of peak position detection. Because peak estimation by function fitting or the like is a one-dimensional process, the increase in computation is small, and nearly the same measurement accuracy as a method that fits a two-dimensional curved surface can be achieved with far less computation. The peak position can also be estimated correctly even when the similarity is distributed anisotropically. Furthermore, as described later, of the eight points neighboring the maximum-similarity point, the one farthest away, whose similarity value is the least reliable, is not used, which reduces the influence of errors in the similarity values.

The image similarity between the reference-image template data and the image data to be searched is calculated, and from the resulting similarity map, one-dimensional peak position estimation is performed along the x and y axes centered on the detected maximum-similarity data point, so that a subpixel-accurate position is provisionally estimated. A similar one-dimensional peak position estimation is then performed on the adjacent x and y axes straddling the estimated subpixel peak position, and a correction is made using those estimates.

FIG. 1 is a processing flow showing one embodiment of the present invention, and FIG. 2 is a block diagram showing one embodiment of the present invention. In the processing flow of FIG. 1, template data is first input in step S1, and an image is input in step S2. In step S3 the image similarity between the input image and the template data is calculated, and in step S4 a similarity map is generated. The maximum similarity is then detected in step S5, and the subpixel position is estimated in step S6. The estimated position is corrected in step S7, and the measurement result is output in step S8.
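For illustration only, the following Python sketch mirrors steps S1 to S8 under the assumption that normalized cross-correlation is used as the similarity measure (the description does not fix a particular measure); the callbacks `estimate_subpixel` and `correct_estimate` are hypothetical placeholders for the estimation and correction routines sketched further below.

```python
import numpy as np

def similarity_map(image: np.ndarray, template: np.ndarray) -> np.ndarray:
    """Steps S3-S4: similarity map; normalized cross-correlation is used here
    only as one possible similarity measure."""
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.linalg.norm(t)
    out_h = image.shape[0] - th + 1
    out_w = image.shape[1] - tw + 1
    sim = np.zeros((out_h, out_w))
    for y in range(out_h):
        for x in range(out_w):
            w = image[y:y + th, x:x + tw] - image[y:y + th, x:x + tw].mean()
            denom = np.linalg.norm(w) * t_norm
            sim[y, x] = float((w * t).sum() / denom) if denom > 0 else 0.0
    return sim

def measure_position(image, template, estimate_subpixel, correct_estimate):
    """Steps S3-S8: similarity map, maximum detection, provisional subpixel
    estimate, and correction (the callbacks stand for the later routines)."""
    sim = similarity_map(image, template)                   # S3-S4
    my, mx = np.unravel_index(np.argmax(sim), sim.shape)    # S5: maximum similarity
    xa, ya = estimate_subpixel(sim, mx, my)                 # S6: provisional estimate
    return correct_estimate(sim, mx, my, xa, ya)            # S7: corrected result (S8: output)
```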

The block diagram of FIG. 2 comprises an image input unit 1 that receives the input image, an image similarity calculation unit 2 that receives the template data and calculates the image similarity with the image from the image input unit 1, a subpixel position estimation unit 3 that estimates the subpixel position from the output of the image similarity calculation unit 2, and an estimated position correction unit 4 that corrects the obtained subpixel position estimate; the measurement result is finally output.

FIGS. 3 and 4 explain the method of estimating the peak position by curve fitting of the correlation values and then estimating the subpixel position; together with FIGS. 1 and 2, they illustrate the subpixel position estimation concretely. FIG. 3 explains the maximum-similarity peak position estimation method of the present invention: using the input template data, the image similarity with the search image data is computed along the axis y=242, and of the three points x=319, 320, 321, the point x=320 is found to have the maximum similarity.

FIG. 4 explains peak position estimation by curve fitting of the correlation values. The correlation values at the three points on the axis y=242 (P, Q, R, corresponding to x=319, 320, 321) are plotted on the vertical axis and fitted with a parabolic quadratic curve; the x coordinate of the vertex of the parabola is the estimated peak position on the axis y=242 (point xa in FIG. 3).
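The three-point quadratic fit has a closed-form solution for its vertex; the sketch below is a standard formulation of that fit and is not quoted from the patent text.

```python
def parabola_peak(s_left: float, s_center: float, s_right: float) -> float:
    """Vertex of the parabola through (-1, s_left), (0, s_center), (+1, s_right).

    Returns the subpixel offset (in pixels) to add to the integer coordinate of
    the center sample; assumes s_center is a local maximum of the similarity.
    """
    denom = s_left - 2.0 * s_center + s_right
    if denom == 0.0:
        return 0.0  # flat neighborhood: keep the integer position
    return 0.5 * (s_left - s_right) / denom

# Example: the estimate xa on the row y = 242 from the values at x = 319, 320, 321
# xa = 320 + parabola_peak(sim[242, 319], sim[242, 320], sim[242, 321])
```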

Obtaining the one-dimensional peak estimate on each coordinate axis is explained with reference to FIGS. 5 and 6. FIG. 5 shows one embodiment of the x-coordinate correction in subpixel position estimation, and FIG. 6 shows one embodiment of the y-coordinate correction. The point found to have the maximum similarity in FIG. 3 is the point x=320, y=242 in FIGS. 5 and 6, denoted M. The points at x=319 and x=321 on the axis y=242 are A and B, respectively; the points at x=321, 319, 320 on the axis y=241 are C, D, and E, respectively; and in FIG. 6 the points at x=320 and x=319 on the axis y=243 are F and G, respectively.

In FIG. 5, the points M, A, B on the axis y=242 yield the point xa on the x coordinate, as explained for FIG. 3. Similarly, the image similarity is computed on the axis y=241; of the three points D, E, C at x=319, 320, 321, the point E at x=320 has the maximum similarity, and fitting a quadratic curve to D, E, C in the same way and estimating the peak position determines the point xb on the x coordinate.

Next, the method of correcting the subpixel position for the y coordinate on the basis of one-dimensional peak estimates is explained with reference to FIG. 6. In FIG. 6, the three points y=241, 242, 243 on each of the axes x=320 and x=319, that is, E, M, F and D, A, G (six points in total), are processed in the same way as for the x axis (the counterpart of FIG. 4 is not shown). On the column x=320 this determines ya, the estimated position closest to the similarity maximum within the subpixel, lying between E and M, that is, between y=241 and y=242. Likewise, on the column x=319 the same processing determines yb, the estimated position between D and A. In this way the x coordinates xa, xb and the y coordinates ya, yb are determined. Note that in FIGS. 5 and 6 the point H at the intersection of x=321 and y=243 is not used: of the eight points A to H neighboring the maximum-similarity point M, H is the farthest away and its similarity value is the least reliable, so excluding it reduces the influence of errors in the similarity values.
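As a hedged illustration of how the adjacent row and column might be chosen so that they straddle the provisional estimate, the following helper (a hypothetical name, not given in the patent) picks the neighbor on the side toward which the subpixel offset leans.

```python
def adjacent_index(center: int, subpixel_offset: float) -> int:
    """Index of the neighboring row/column on the side of the provisional estimate.

    If the provisional subpixel estimate lies below the integer maximum
    (negative offset), use the previous row/column; otherwise use the next one.
    In the example of FIGS. 5 and 6 the estimates lean toward x=319 and y=241,
    so the adjacent column is x=319 and the adjacent row is y=241.
    """
    return center - 1 if subpixel_offset < 0.0 else center + 1
```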

Next, the method of correcting the position within the subpixel is described. As shown in FIG. 5, on the line connecting the two x-coordinate estimates xa and xb, obtained on the axis y=242 passing through the maximum-similarity point and on the adjacent axis y=241, the point xc is obtained by internal division at the y coordinate of the peak estimate ya obtained on x=320; this further improves the accuracy of the x coordinate of the peak. Similarly, in FIG. 6, on the line connecting the two y-coordinate estimates ya and yb, obtained on the axis x=320 passing through the maximum-similarity point and on the adjacent axis x=319, the point yc is obtained by internal division at the x coordinate of the peak estimate xa obtained on y=242; this further improves the accuracy of the y coordinate of the peak. Both corrections may also be performed, and repeating them alternately can improve the accuracy further.
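The internal division described here amounts to linear interpolation between the two one-dimensional estimates. A minimal sketch with illustrative argument names, assuming xa, xb, ya, yb have already been obtained as above:

```python
def correct_x(xa: float, xb: float, y0: int, y1: int, ya: float) -> float:
    """Internally divide the segment from (xa, y0) to (xb, y1) at y = ya.

    y0 is the row of the maximum-similarity point (y=242 in the example),
    y1 the adjacent row (y=241), and ya the provisional subpixel y estimate.
    Returns the corrected x coordinate xc.
    """
    t = (ya - y0) / (y1 - y0)          # fractional distance from y0 toward y1
    return xa + t * (xb - xa)

def correct_y(ya: float, yb: float, x0: int, x1: int, xa: float) -> float:
    """Internally divide the segment from (x0, ya) to (x1, yb) at x = xa."""
    t = (xa - x0) / (x1 - x0)
    return ya + t * (yb - ya)

# Roles of the arguments in the example of FIGS. 5 and 6:
# xc = correct_x(xa, xb, 242, 241, ya)
# yc = correct_y(ya, yb, 320, 319, xa)
```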

In the second embodiment, the points xa, xb and ya, yb are obtained in the same way as described in detail in the first embodiment, but the position within the subpixel is corrected in two-dimensional coordinates at once. As shown in FIG. 7, the intersection of the straight line connecting the point xa on y=242 and the point xb on y=241 with the straight line connecting the point ya on x=320 and the point yb on x=319 gives the coordinates xc, yc, which are the corrected position within the subpixel.
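A sketch of this joint correction, solving for the intersection of the two straight lines; the parameter roles follow FIG. 7 and the names are illustrative only.

```python
def correct_xy(xa, xb, y0, y1, ya, yb, x0, x1):
    """Intersection of the line through (xa, y0), (xb, y1) with the line
    through (x0, ya), (x1, yb); returns the corrected (xc, yc).

    In the example of FIG. 7: y0, y1 = 242, 241 and x0, x1 = 320, 319.
    """
    # First line as x = ax * y + bx
    ax = (xb - xa) / (y1 - y0)
    bx = xa - ax * y0
    # Second line as y = ay * x + by
    ay = (yb - ya) / (x1 - x0)
    by = ya - ay * x0
    # Solve x = ax * (ay * x + by) + bx, then back-substitute for y.
    denom = 1.0 - ax * ay
    xc = (ax * by + bx) / denom   # denom != 0 unless the two lines are parallel
    yc = ay * xc + by
    return xc, yc
```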

FIG. 1 Processing flow showing one embodiment of the present invention
FIG. 2 Block diagram showing one embodiment of the present invention
FIG. 3 Diagram explaining the maximum-similarity peak position estimation method of the present invention
FIG. 4 Diagram explaining peak position estimation by curve fitting of correlation values
FIG. 5 One embodiment of the x-coordinate correction method in subpixel position estimation
FIG. 6 One embodiment of the y-coordinate correction method in subpixel position estimation
FIG. 7 One embodiment of batch correction of two-dimensional coordinates in subpixel position estimation

Explanation of symbols

1 Image input unit
2 Image similarity calculation unit
3 Subpixel position estimation unit
4 Estimated position correction unit

Claims (5)

1. An image position measurement method for measuring the position of an object based on the degree of matching of images, the method comprising the steps of:
inputting template data of a reference image prepared in advance;
inputting image data to be searched;
calculating an image similarity between the input image data to be searched and the template data of the reference image, and creating a similarity map;
performing one-dimensional peak position estimation along the x and y axes from three points centered on the maximum-similarity data point detected in the similarity map, thereby provisionally estimating a peak position with subpixel accuracy; and
performing the same one-dimensional peak position estimation on the adjacent x and y axes straddling the estimated subpixel peak position, and correcting the estimated peak position using those estimation results,
whereby the image position is determined.
2. The image position measurement method according to claim 1, wherein the correction of the estimated position is performed independently on the x and y axes, and one or both of the peak coordinates are corrected.
3. The image position measurement method according to claim 1, wherein the correction of the estimated position is performed collectively by linear interpolation using both the x and y coordinates.
4. An image position measuring apparatus that measures the position of an object based on the degree of matching of images, the apparatus comprising:
template data input means for inputting template data of a reference image prepared in advance;
image data input means for inputting image data to be searched;
similarity map creating means for calculating an image similarity between the input image data to be searched and the template data of the reference image, and creating a similarity map;
peak position estimating means for performing one-dimensional peak position estimation along the x and y axes from three points centered on the maximum-similarity data point detected in the similarity map, thereby provisionally estimating a peak position with subpixel accuracy, and for performing the same one-dimensional peak position estimation on the adjacent x and y axes straddling the estimated subpixel peak position; and
correction means for correcting the estimated peak position using the estimation results,
whereby the image position is determined.
5. A program for causing a computer to execute the steps of:
inputting template data of a reference image prepared in advance;
inputting image data to be searched;
calculating an image similarity between the input image data to be searched and the template data of the reference image, and creating a similarity map;
performing one-dimensional peak position estimation along the x and y axes from three points centered on the maximum-similarity data point detected in the similarity map, thereby provisionally estimating a peak position with subpixel accuracy; and
performing the same one-dimensional peak position estimation on the adjacent x and y axes straddling the estimated subpixel peak position, correcting the estimated peak position using those estimation results, and determining the image position.
JP2006037856A 2006-02-15 2006-02-15 Image position measuring method, image position measuring apparatus, and image position measuring program Expired - Fee Related JP4887820B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2006037856A JP4887820B2 (en) 2006-02-15 2006-02-15 Image position measuring method, image position measuring apparatus, and image position measuring program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2006037856A JP4887820B2 (en) 2006-02-15 2006-02-15 Image position measuring method, image position measuring apparatus, and image position measuring program

Publications (2)

Publication Number Publication Date
JP2007219704A (en) 2007-08-30
JP4887820B2 JP4887820B2 (en) 2012-02-29

Family

ID=38496958

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2006037856A Expired - Fee Related JP4887820B2 (en) 2006-02-15 2006-02-15 Image position measuring method, image position measuring apparatus, and image position measuring program

Country Status (1)

Country Link
JP (1) JP4887820B2 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010165183A (en) * 2009-01-15 2010-07-29 Panasonic Electric Works Co Ltd Human body detection device
KR101060897B1 (en) 2008-08-25 2011-08-30 미쓰비시덴키 가부시키가이샤 Content playback device and method
CN117474993A (en) * 2023-10-27 2024-01-30 哈尔滨工程大学 Underwater image feature point sub-pixel position estimation method and device
CN117495970A (en) * 2024-01-03 2024-02-02 中国科学技术大学 Template multistage matching-based chemical instrument pose estimation method, equipment and medium
CN117495970B (en) * 2024-01-03 2024-05-14 中国科学技术大学 Template multistage matching-based chemical instrument pose estimation method, equipment and medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0458376A (en) * 1990-06-28 1992-02-25 Matsushita Electric Ind Co Ltd Method for high-accuracy position recognition
JPH0721383A (en) * 1993-07-05 1995-01-24 Asia Electron Inc Picture processor
JP2000105828A (en) * 1998-09-29 2000-04-11 Dainippon Printing Co Ltd Position shift correcting device
JP2001195597A (en) * 2000-12-11 2001-07-19 Mitsubishi Electric Corp Image processor
JP2002517045A (en) * 1998-05-28 2002-06-11 アキュイティー イメージング エルエルシー How to determine the exact position of the template match point

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0458376A (en) * 1990-06-28 1992-02-25 Matsushita Electric Ind Co Ltd Method for high-accuracy position recognition
JPH0721383A (en) * 1993-07-05 1995-01-24 Asia Electron Inc Picture processor
JP2002517045A (en) * 1998-05-28 2002-06-11 アキュイティー イメージング エルエルシー How to determine the exact position of the template match point
JP2000105828A (en) * 1998-09-29 2000-04-11 Dainippon Printing Co Ltd Position shift correcting device
JP2001195597A (en) * 2000-12-11 2001-07-19 Mitsubishi Electric Corp Image processor

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101060897B1 (en) 2008-08-25 2011-08-30 미쓰비시덴키 가부시키가이샤 Content playback device and method
JP2010165183A (en) * 2009-01-15 2010-07-29 Panasonic Electric Works Co Ltd Human body detection device
CN117474993A (en) * 2023-10-27 2024-01-30 哈尔滨工程大学 Underwater image feature point sub-pixel position estimation method and device
CN117495970A (en) * 2024-01-03 2024-02-02 中国科学技术大学 Template multistage matching-based chemical instrument pose estimation method, equipment and medium
CN117495970B (en) * 2024-01-03 2024-05-14 中国科学技术大学 Template multistage matching-based chemical instrument pose estimation method, equipment and medium

Also Published As

Publication number Publication date
JP4887820B2 (en) 2012-02-29

Similar Documents

Publication Publication Date Title
US6208769B1 (en) Method of accurately locating the fractional position of a template match point
US8126260B2 (en) System and method for locating a three-dimensional object using machine vision
JP5029618B2 (en) Three-dimensional shape measuring apparatus, method and program by pattern projection method
EP1918877B1 (en) Correlation peak finding method for image correlation displacement sensing
US10102631B2 (en) Edge detection bias correction value calculation method, edge detection bias correction method, and edge detection bias correcting program
US9569850B2 (en) System and method for automatically determining pose of a shape
US9214024B2 (en) Three-dimensional distance measurement apparatus and method therefor
KR101453143B1 (en) Stereo matching process system, stereo matching process method, and recording medium
US8941732B2 (en) Three-dimensional measuring method
JP4887820B2 (en) Image position measuring method, image position measuring apparatus, and image position measuring program
CN102725774B (en) Similarity degree calculation device, similarity degree calculation method, and program
JP2009146150A (en) Method and device for detecting feature position
JP6951469B2 (en) How to calibrate an optical measuring device
JP4970118B2 (en) Camera calibration method, program thereof, recording medium, and apparatus
CN112991372B (en) 2D-3D camera external parameter calibration method based on polygon matching
JP6521988B2 (en) Wafer notch detection
JP6682167B2 (en) Template matching device, method, and program
JP6855271B2 (en) Long dimension measurement method
JP4153322B2 (en) Method and apparatus for associating measurement points in photogrammetry
JP2001229388A (en) Matching method for image data
CN103198452B (en) Based on the localization method on quick response matrix code the 4th summit on positioning pattern summit
JP4159373B2 (en) Method and apparatus for associating measurement points in photogrammetry
KR100784734B1 (en) Error compensation method for the elliptical trajectory of industrial robot
JP2018072937A (en) Template matching device, method and program
CN117252922A (en) Target pose positioning method and system for abnormal quantity estimation

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20081117

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20101203

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20101207

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20110204

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20110712

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20110824

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20111115

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20111128

R150 Certificate of patent or registration of utility model

Ref document number: 4887820

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20141222

Year of fee payment: 3

LAPS Cancellation because of no payment of annual fees