WO2011083665A1 - Similarity calculation device, similarity calculation method, and program - Google Patents
Similarity calculation device, similarity calculation method, and program
- Publication number
- WO2011083665A1 (PCT/JP2010/072576)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- displacement vector
- geometric transformation
- displacement
- unit
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/75—Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
- G06V10/753—Transform-based matching, e.g. Hough transform
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/75—Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
- G06V10/751—Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
- G06V10/7515—Shifting the patterns to accommodate for positional errors
Definitions
- the present invention relates to a similarity calculation device, a similarity calculation method, and a program.
- an image template matching technique is known as a technique for matching a pattern such as an image.
- in this technique, a template image is divided into block regions, template matching is performed for each block, and the displacement amount at which the matching rate is maximized in the search target image is found for each block.
- matching is then performed by voting the matching rate or the like at the coordinates of a voting space corresponding to the displacement amount, and detecting a peak in the voting space.
- the amount of displacement that provides the maximum matching rate varies with the geometric transformation.
- the amount of displacement due to the geometric transformation varies depending on the position, so the voting position in the voting space fluctuates and the peak in the voting space becomes less steep. For this reason, the matching accuracy deteriorates.
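As an illustration of this position dependence (an illustration only, not an equation taken from the patent), consider a pure rotation by an angle θ about the image origin; the displacement it induces at a reference point x = (s, t)ᵀ grows with the distance of the point from the origin:

```latex
% Illustration (assumption, not the patent's own equation): displacement induced at a
% reference point x = (s, t)^T by a pure rotation R_theta about the origin.
\[
  d(x) \;=\; R_\theta\,x - x \;=\; (R_\theta - I)\,x,
  \qquad
  R_\theta \;=\;
  \begin{pmatrix}
    \cos\theta & -\sin\theta \\
    \sin\theta & \cos\theta
  \end{pmatrix}.
\]
% Because d(x) depends linearly on x, blocks at different positions vote at different
% coordinates, and the peak in the voting space is smeared out.
```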
- Patent Document 1 discloses an image collation processing method comprising: a process of cutting out a plurality of characteristic local regions from each of an input image and a model image; a process of projecting the image information of the selected local regions onto a point set in a feature space; a process of, for each local region of one image, searching for the local region of the other image projected at the nearest position in the feature space and associating the local regions with each other; a geometric transformation parameter estimation process of estimating geometric transformation parameters between the input image and the model image based on the positional relationship between the associated local regions of the two images; and a process of applying the geometric transformation, using the estimated parameters, to either the image information of the local regions of one image or the image information of the entire image, and evaluating and collating the consistency between the associated local regions.
- an object of the present invention is to provide a similarity calculation device that can suppress deterioration in collation accuracy and reduce the amount of calculation even when a geometric transformation such as rotation occurs.
- the similarity calculation apparatus according to the present invention includes a displacement vector estimation unit that estimates a displacement vector between a first local region set in a first image and a second local region that is most similar to the first local region in a second image,
- a geometric transformation parameter estimation unit that estimates, based on a plurality of the displacement vectors, a geometric transformation parameter for geometrically transforming the first image into the second image,
- a displacement vector correction unit that corrects each displacement vector by subtracting the displacement due to the geometric transformation from it,
- a voting unit that votes the displacement vectors corrected by the displacement vector correction unit into a two-dimensional space defined by the elements of the displacement vectors,
- a peak detection unit that detects a peak in the voted two-dimensional space, and
- a similarity calculation unit that calculates the similarity between the first image and the second image according to the size of the peak.
- in this configuration, voting is performed after correcting the displacement vectors using the estimated geometric transformation parameters, and the similarity between the images is calculated from the voting result. As a result, deterioration of collation accuracy can be suppressed, and the amount of calculation can be reduced.
- FIG. 1 is a block diagram showing the structure of the pattern matching apparatus according to an embodiment of the present invention, and FIG. 2 is a flowchart showing its operation.
- FIGS. 3A to 3C are diagrams schematically showing examples of displacement vectors estimated at the respective reference points of the image.
- FIGS. 4A to 4C are diagrams schematically showing voting values obtained when voting processing is performed in the voting space on the respective displacement vectors shown in FIGS. 3A to 3C. A further figure illustrates the fluctuation of the displacement accompanying a geometric transformation.
- FIG. 1 is a block diagram showing a configuration of a pattern matching device (similarity calculation device) 100 according to an embodiment of the present invention.
- the pattern matching apparatus 100 includes a displacement vector estimation unit 101, a geometric transformation parameter estimation unit 102, a displacement vector correction unit 103, a voting unit 104, a peak detection unit 105, and a similarity calculation unit 106.
- the displacement vector estimation unit 101, the geometric transformation parameter estimation unit 102, the displacement vector correction unit 103, the voting unit 104, the peak detection unit 105, and the similarity calculation unit 106 are modules of operations performed by a computer processor according to a program; together they form the functions of the processor of the pattern matching apparatus 100.
- the displacement vector estimation unit 101 estimates a displacement vector, which is the amount of displacement between a first local region set in the first image and a second local region that is most similar to the first local region in the second image.
- the geometric transformation parameter estimation unit 102 estimates a geometric transformation parameter for geometric transformation of the first image to the second image.
- the displacement vector correction unit 103 corrects the displacement vector based on the geometric transformation parameter.
- the voting unit 104 votes the displacement vector corrected by the displacement vector correcting unit 103 in a two-dimensional space, and creates a voting image.
- the peak detection unit 105 detects a peak on the voting image.
- the similarity calculation unit 106 calculates the similarity between the first image and the second image according to the size of the peak.
- FIG. 2 is a flowchart showing an example of the operation of the pattern matching apparatus 100 according to the embodiment of the present invention.
- a description will be given by taking face image collation as an example.
- two images to be subjected to image matching are input to the pattern matching device 100 (step S201).
- the image can be input via a scanner or the like.
- one image (first image) of two images to be subjected to image matching is represented as f (x, y), and the other image (second image) is represented as g (x, y).
- Each image size is set to 256 ⁇ 256 pixels.
- the first image f(x, y) and the second image g(x, y) are roughly normalized in size and orientation in advance by detecting the eye positions and aligning the two images accordingly.
- Conventional techniques can be used for such normalization processing.
- the detected eye position or the like usually has a detection error, and there may be a geometric variation such as rotation between the two images.
- the displacement vector estimation unit 101 performs block matching or the like on a plurality of local regions, each containing one of a plurality of reference points (s, t) set in the input first image f(x, y).
- for each such local region, the most similar local region of the second image g(x, y) is found, and a displacement vector v(s, t), which is the amount of displacement between the similar local regions of the two images, is calculated (step S202).
- the correlation value R(s, t, p, q) is calculated by equation (1).
- the reference points (s, t) are set, for example, every 8 ⁇ 8 pixels, and 32 ⁇ 32 reference points are set in the first image.
- the displacement amount (p_max, q_max) that maximizes the correlation value R is obtained by equation (2) as the displacement vector v(s, t) for the reference point (s, t).
- here, max⁻¹ R denotes the pair (p, q) at which R takes its maximum value.
- the maximum correlation value at the reference point (s, t) is denoted R_max(s, t).
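The following is a minimal sketch of step S202 in Python with NumPy. Equation (1) is not reproduced in the text, so the sketch assumes a zero-mean normalized cross-correlation as the correlation value R; the function name and the parameters step, block, and search are illustrative, not taken from the patent.

```python
# Sketch of step S202: for each reference point (s, t) of f, search a window of g for the
# displacement (p, q) maximising a correlation value, here assumed to be zero-mean
# normalised cross-correlation (the patent's equation (1) is not reproduced in the text).
import numpy as np

def estimate_displacement_vectors(f, g, step=8, block=16, search=8):
    """Return reference points, displacement vectors v(s, t) and peak scores R_max(s, t)."""
    h, w = f.shape
    half = block // 2
    refs, vecs, scores = [], [], []
    for s in range(half + search, h - half - search, step):
        for t in range(half + search, w - half - search, step):
            patch = f[s - half:s + half, t - half:t + half].astype(float)
            patch -= patch.mean()
            best, best_pq = -np.inf, (0, 0)
            for p in range(-search, search + 1):
                for q in range(-search, search + 1):
                    cand = g[s + p - half:s + p + half,
                             t + q - half:t + q + half].astype(float)
                    cand -= cand.mean()
                    denom = np.linalg.norm(patch) * np.linalg.norm(cand)
                    r = float((patch * cand).sum() / denom) if denom > 0 else 0.0
                    if r > best:
                        best, best_pq = r, (p, q)
            refs.append((s, t))
            vecs.append(best_pq)
            scores.append(best)
    return np.array(refs), np.array(vecs, dtype=float), np.array(scores)
```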
- the geometric transformation parameter estimation unit 102 estimates the geometric transformation parameters between the image f(x, y) and the image g(x, y) using the displacement vectors v(s, t) obtained in step S202 (step S203). If the geometric transformation between the image f(x, y) and the image g(x, y) is constrained to be an affine transformation, the relationship between a reference point (s, t) of the image f(x, y) and the corresponding reference point (s′, t′) of the image g(x, y) is expressed by equation (3).
- A and b are the affine transformation parameters (the linear part and the translation, respectively).
- A and b are estimated by, for example, the least-squares method; specifically, the A and b that minimize J given by equation (4) are obtained.
- in this example, an affine transformation with six degrees of freedom is used to estimate the geometric transformation parameters A and b.
- alternatively, a more constrained model may be estimated, such as a three-parameter model (translation and rotation) or a four-parameter model (translation, rotation, and scaling).
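A minimal sketch of step S203, assuming the least-squares criterion J of equation (4) is the sum of squared residuals ‖A·x + b − (x + v(x))‖² over all reference points x; the unweighted formulation and the helper name estimate_affine are assumptions, not taken from the patent.

```python
# Sketch of step S203: least-squares estimation of the affine parameters A (2x2) and
# b (2-vector) from the reference points and their displacement vectors, assuming
# J = sum || A x + b - (x + v(x)) ||^2 as described for equation (4).
import numpy as np

def estimate_affine(refs, vecs):
    """Return A, b minimising the sum of squared residuals over all reference points."""
    x = refs.astype(float)          # (N, 2) reference points (s, t)
    y = x + vecs                    # (N, 2) matched points (s', t')
    n = len(x)
    # Unknown parameter vector: [a11, a12, a21, a22, b1, b2].
    M = np.zeros((2 * n, 6))
    M[0::2, 0:2] = x                # rows for s' = a11*s + a12*t + b1
    M[0::2, 4] = 1.0
    M[1::2, 2:4] = x                # rows for t' = a21*s + a22*t + b2
    M[1::2, 5] = 1.0
    params, *_ = np.linalg.lstsq(M, y.reshape(-1), rcond=None)
    A = params[:4].reshape(2, 2)
    b = params[4:]
    return A, b
```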
- the displacement vector correction unit 103 corrects the displacement vector v(s, t) according to equation (5), using the geometric transformation parameters A and b obtained in step S203, and calculates the corrected displacement vector v′(s, t) (step S204).
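Equation (5) is not reproduced in the text; the sketch below assumes the correction simply subtracts the displacement predicted by the estimated affine transformation from the measured displacement vector. As a sanity check, with A equal to the identity and b = 0 the correction leaves the vectors unchanged.

```python
# Sketch of step S204 (assumed form of equation (5)): subtract the displacement that the
# estimated affine transformation predicts at each reference point from the measured
# displacement vector, leaving only the residual displacement.
import numpy as np

def correct_displacement_vectors(refs, vecs, A, b):
    """v'(s, t) = v(s, t) - ((A @ (s, t) + b) - (s, t))."""
    x = refs.astype(float)               # (N, 2) reference points
    predicted = x @ A.T + b - x          # displacement caused by the geometric transformation
    return vecs - predicted              # corrected displacement vectors v'(s, t)
```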
- the voting unit 104 uses the corrected displacement vectors v′(s, t) obtained in step S204 to vote in the two-dimensional voting space H at the positions corresponding to v′(s, t) (step S205).
- the voting space H is represented as a two-dimensional array H([p′], [q′]), where [x] denotes the value obtained by discretizing x.
- the voting unit 104 adds either 1 or the maximum correlation value R_max(s, t), obtained when the displacement vector was calculated, to the corresponding cell of the voting space H, as shown in equations (6) and (7).
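A minimal sketch of step S205, corresponding to the weighted form described above (equations (6) and (7) themselves are not reproduced in the text); the bin size and the extent of the voting array are illustrative assumptions.

```python
# Sketch of step S205: accumulate votes in a two-dimensional array H centred on (0, 0).
# Each corrected displacement vector is discretised and either 1 or the maximum
# correlation value R_max(s, t) is added to the corresponding cell (weighted form shown).
import numpy as np

def vote(corrected_vecs, scores, extent=32, bin_size=1.0):
    """Return the (2*extent+1) x (2*extent+1) voting space H."""
    size = 2 * extent + 1
    H = np.zeros((size, size))
    for (p, q), w in zip(corrected_vecs, scores):
        i = int(round(p / bin_size)) + extent   # discretised [p']
        j = int(round(q / bin_size)) + extent   # discretised [q']
        if 0 <= i < size and 0 <= j < size:
            H[i, j] += w                        # pass w = 1.0 everywhere for unweighted voting
    return H
```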
- the peak detector 105 detects a peak indicating the maximum value in the voting space H (step S206).
- smoothing processing or the like may be performed on the voting space H obtained by the voting unit 104. By performing smoothing in this way, robustness against noise can be ensured.
- a Gaussian convolution operation may be performed on the voting space H which is a two-dimensional array.
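A minimal sketch of step S206 under the assumption that the optional smoothing is a Gaussian convolution (here via scipy.ndimage.gaussian_filter) followed by a simple argmax; the value of sigma is an arbitrary choice, not taken from the patent.

```python
# Sketch of step S206: optional Gaussian smoothing of the voting space for robustness to
# noise, followed by locating the maximum (the peak).
import numpy as np
from scipy.ndimage import gaussian_filter

def detect_peak(H, sigma=1.0):
    """Return the peak value and its (row, col) index in the smoothed voting space."""
    H_smooth = gaussian_filter(H, sigma=sigma)
    idx = np.unravel_index(np.argmax(H_smooth), H_smooth.shape)
    return H_smooth[idx], idx
```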
- the similarity calculation unit 106 obtains the maximum peak value in the voting space H, and calculates the similarity between the first image and the second image according to the size of the peak (step S207).
- alternatively, weighting such as a Gaussian window may be applied to the vicinity of the peak, and the sum of the values of the voting space H near the peak may be used as the similarity.
- through the above processing, the similarity between the first image and the second image is calculated.
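A minimal sketch of step S207, assuming the Gaussian-window variant described above; the window radius and sigma are illustrative choices. Chaining the sketches for steps S202 through S207 reproduces the flow of FIG. 2 under the stated assumptions.

```python
# Sketch of step S207: similarity as the Gaussian-window weighted sum of voting-space
# values in the neighbourhood of the detected peak; radius and sigma are illustrative.
import numpy as np

def similarity_from_peak(H, peak_idx, radius=3, sigma=1.5):
    """Return the weighted sum of H over a (2*radius+1)^2 window around the peak."""
    ci, cj = peak_idx
    total = 0.0
    for di in range(-radius, radius + 1):
        for dj in range(-radius, radius + 1):
            i, j = ci + di, cj + dj
            if 0 <= i < H.shape[0] and 0 <= j < H.shape[1]:
                weight = np.exp(-(di * di + dj * dj) / (2.0 * sigma * sigma))
                total += weight * H[i, j]
    return total
```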
- FIG. 3 is a diagram schematically illustrating an example of a displacement vector between two images estimated at each reference point.
- FIG. 3A shows an example of a displacement vector estimated when there is no positional deviation or rotation between two images and ideal matching is performed.
- FIG. 3B is an example of a displacement vector estimated when there is a translational misalignment between two images.
- FIG. 3C is an example of a displacement vector estimated when a rotational fluctuation occurs.
- FIGS. 4A to 4C schematically show voting values obtained when the respective displacement vectors shown in FIGS. 3A to 3C are voted on the voting space.
- in FIG. 4A, since there is no displacement or rotation between the two images, the displacement vectors are distributed in the vicinity of the coordinates (0, 0) of the voting space, and a steep peak is detected in the voting space.
- in FIG. 4B, there is a translational displacement between the two images, but since the displacement vectors are distributed in the vicinity of a coordinate value shifted in accordance with the translation, a steep peak is still detected.
- in the present embodiment, the amount of variation accompanying the geometric transformation is corrected, so corrected displacement vectors v′(s, t) distributed in the vicinity of the origin coordinates (0, 0) are obtained.
- the voting process can be performed directly using the displacement vectors corrected for the geometric transformation. Therefore, compared with the case where the geometric transformation is applied to the image information itself, it is not necessary to geometrically transform the image or to associate local regions again in the transformed image, so the amount of calculation can be greatly reduced.
- face image matching has been described above as an example, but the present invention may also be applied to matching of other biometric patterns such as fingerprint images and vein images, and to image matching in various fields other than biometric pattern matching.
- the present invention is suitable for suppressing deterioration in collation accuracy and reducing the amount of calculation even when a geometric transformation such as rotation occurs.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Computing Systems (AREA)
- Evolutionary Computation (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Software Systems (AREA)
- Databases & Information Systems (AREA)
- Artificial Intelligence (AREA)
- Health & Medical Sciences (AREA)
- Multimedia (AREA)
- Image Analysis (AREA)
- Collating Specific Patterns (AREA)
Abstract
Description
Patent Document 1 discloses an image collation processing method comprising: a process of cutting out a plurality of characteristic local regions from each of an input image and a model image; a process of projecting the image information of the selected local regions onto a point set in a feature space; a process of, for each local region of one image, searching for the local region of the other image projected at the nearest position in the feature space and associating the local regions with each other; a geometric transformation parameter estimation process of estimating geometric transformation parameters between the input image and the model image based on the positional relationship between the associated local regions of the two images; and a process of applying the geometric transformation, using the estimated parameters, to either the image information of the local regions of one image or the image information of the entire image, and evaluating and collating the consistency between the associated local regions.
FIG. 1 is a block diagram showing the configuration of a pattern matching device (similarity calculation device) 100 according to an embodiment of the present invention.
As shown in the figure, the pattern matching device 100 includes a displacement vector estimation unit 101, a geometric transformation parameter estimation unit 102, a displacement vector correction unit 103, a voting unit 104, a peak detection unit 105, and a similarity calculation unit 106.
The geometric transformation parameter estimation unit 102 estimates a geometric transformation parameter for geometrically transforming the first image into the second image.
The displacement vector correction unit 103 corrects the displacement vectors based on the geometric transformation parameter.
The voting unit 104 votes the displacement vectors corrected by the displacement vector correction unit 103 into a two-dimensional space to create a voting image.
The peak detection unit 105 detects a peak in the voting image.
The similarity calculation unit 106 calculates the similarity between the first image and the second image according to the size of the peak.
FIG. 2 is a flowchart showing an example of the operation of the pattern matching device 100 according to the embodiment of the present invention. Here, face image collation is taken as an example.
Here, max⁻¹ R denotes the p and q at which R takes its maximum value.
Through the above processing, the similarity between the first image and the second image is calculated.
As shown in the figures, in FIG. 4(A) there is no positional shift or rotation between the two images, so the displacement vectors are distributed near the coordinates (0, 0) of the voting space and a steep peak is detected in the voting space. In FIG. 4(B), there is a translational displacement between the two images, but since the displacement vectors are distributed near the coordinate values shifted in accordance with the translation, a steep peak is detected.
Claims (5)
- 1. A similarity calculation device comprising: a displacement vector estimation unit that estimates displacement vectors between a plurality of first local regions set in a first image and second local regions in a second image that are most similar to the respective first local regions; a geometric transformation parameter estimation unit that estimates, based on the plurality of displacement vectors, a geometric transformation parameter for geometrically transforming the first image into the second image; a displacement vector correction unit that corrects each displacement vector by subtracting, based on the geometric transformation parameter, the displacement due to the geometric transformation from the displacement vector; a voting unit that votes the displacement vectors corrected by the displacement vector correction unit into a two-dimensional space defined by the elements of each displacement vector; a peak detection unit that detects a peak in the voted two-dimensional space; and a similarity calculation unit that calculates a similarity between the first image and the second image according to the size of the peak.
- 2. The similarity calculation device according to claim 1, wherein the geometric transformation parameter is a parameter of an affine transformation.
- 3. The similarity calculation device according to claim 1 or 2, wherein the voting unit performs smoothing processing on the two-dimensional space in which the voting is performed.
- 4. A similarity calculation method comprising: a step of estimating displacement vectors between a plurality of first local regions set in a first image and second local regions in a second image that are most similar to the respective first local regions; a step of estimating, based on the plurality of displacement vectors, a geometric transformation parameter for geometrically transforming the first image into the second image; a step of correcting each displacement vector by subtracting, based on the geometric transformation parameter, the displacement due to the geometric transformation from the displacement vector; a step of voting the corrected displacement vectors into a two-dimensional space defined by the elements of each displacement vector; a step of detecting a peak in the voted two-dimensional space; and a step of calculating a similarity between the first image and the second image according to the size of the peak.
- 5. A program causing a computer to function as: a displacement vector estimation unit that estimates displacement vectors between a plurality of first local regions set in a first image and second local regions in a second image that are most similar to the respective first local regions; a geometric transformation parameter estimation unit that estimates, based on the plurality of displacement vectors, a geometric transformation parameter for geometrically transforming the first image into the second image; a displacement vector correction unit that corrects each displacement vector by subtracting, based on the geometric transformation parameter, the displacement due to the geometric transformation from the displacement vector; a voting unit that votes the displacement vectors corrected by the displacement vector correction unit into a two-dimensional space defined by the elements of each displacement vector; a peak detection unit that detects a peak in the voted two-dimensional space; and a similarity calculation unit that calculates a similarity between the first image and the second image according to the size of the peak.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/519,307 US8855373B2 (en) | 2010-01-08 | 2010-12-15 | Similarity calculation device, similarity calculation method, and program |
CN201080060873.1A CN102725774B (zh) | 2010-01-08 | 2010-12-15 | 相似度计算设备、相似度计算方法及程序 |
JP2011548937A JP5717055B2 (ja) | 2010-01-08 | 2010-12-15 | Similarity calculation device, similarity calculation method, and program |
EP10842178A EP2523161A1 (en) | 2010-01-08 | 2010-12-15 | Similarity degree calculation device, similarity degree calculation method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-002569 | 2010-01-08 | ||
JP2010002569 | 2010-01-08 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011083665A1 true WO2011083665A1 (ja) | 2011-07-14 |
Family
ID=44305402
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/072576 WO2011083665A1 (ja) | Similarity calculation device, similarity calculation method, and program | 2010-01-08 | 2010-12-15 |
Country Status (6)
Country | Link |
---|---|
US (1) | US8855373B2 (ja) |
EP (1) | EP2523161A1 (ja) |
JP (1) | JP5717055B2 (ja) |
KR (1) | KR20120094102A (ja) |
CN (1) | CN102725774B (ja) |
WO (1) | WO2011083665A1 (ja) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8988536B2 (en) | 2010-12-23 | 2015-03-24 | Samsung Electronics Co., Ltd. | Image processing circuit, method of operation thereof, and digital camera including same |
CN104917931B (zh) * | 2015-05-28 | 2018-03-02 | 京东方科技集团股份有限公司 | 运动图像补偿方法及装置、显示装置 |
CN108256566A (zh) * | 2018-01-10 | 2018-07-06 | 广东工业大学 | 一种基于余弦相似度的自适应模版匹配方法及装置 |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6621929B1 (en) * | 1999-06-22 | 2003-09-16 | Siemens Corporate Research, Inc. | Method for matching images using spatially-varying illumination change models |
JP3576987B2 (ja) * | 2001-03-06 | 2004-10-13 | Toshiba Corp | Image template matching method and image processing apparatus |
JP2002298141A (ja) * | 2001-03-29 | 2002-10-11 | Nec Corp | Pattern matching device, pattern matching method thereof, and pattern matching program |
EP1359536A3 (en) * | 2002-04-27 | 2005-03-23 | Samsung Electronics Co., Ltd. | Face recognition method and apparatus using component-based face descriptor |
US7646891B2 (en) | 2002-12-26 | 2010-01-12 | Mitshubishi Denki Kabushiki Kaisha | Image processor |
JP4178480B2 (ja) * | 2006-06-14 | 2008-11-12 | Sony Corp | Image processing device, image processing method, imaging device, and imaging method |
JP5487970B2 (ja) * | 2007-11-08 | 2014-05-14 | NEC Corp | Feature point arrangement collation device, image collation device, and method and program therefor |
US8340370B2 (en) * | 2008-02-19 | 2012-12-25 | Nec Corporation | Pattern verification apparatus, pattern verification method, and program |
US9305240B2 (en) * | 2011-12-07 | 2016-04-05 | Google Technology Holdings LLC | Motion aligned distance calculations for image comparisons |
-
2010
- 2010-12-15 KR KR1020127017705A patent/KR20120094102A/ko active IP Right Grant
- 2010-12-15 US US13/519,307 patent/US8855373B2/en active Active
- 2010-12-15 EP EP10842178A patent/EP2523161A1/en not_active Withdrawn
- 2010-12-15 CN CN201080060873.1A patent/CN102725774B/zh active Active
- 2010-12-15 WO PCT/JP2010/072576 patent/WO2011083665A1/ja active Application Filing
- 2010-12-15 JP JP2011548937A patent/JP5717055B2/ja active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000132691A (ja) * | 1998-10-23 | 2000-05-12 | Toshiba Corp | Video area tracking method and apparatus |
JP2001092963A (ja) * | 1999-09-27 | 2001-04-06 | Fujitsu Ltd | Image collation method and apparatus |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015179453A (ja) * | 2014-03-19 | 2015-10-08 | Fujitsu Ltd | Alignment device, alignment method, and computer program for alignment |
US20160071285A1 (en) * | 2014-09-09 | 2016-03-10 | Nec Corporation | Information processing device, information processing apparatus, information processing method, and program |
JP2016057793A (ja) * | 2014-09-09 | 2016-04-21 | NEC Corp | Information processing device, information processing method, and program |
US9547913B2 (en) | 2014-09-09 | 2017-01-17 | Nec Corporation | Information processing device, information processing apparatus, information processing method, and program |
Also Published As
Publication number | Publication date |
---|---|
JP5717055B2 (ja) | 2015-05-13 |
CN102725774A (zh) | 2012-10-10 |
CN102725774B (zh) | 2015-06-17 |
JPWO2011083665A1 (ja) | 2013-05-13 |
US20120294492A1 (en) | 2012-11-22 |
EP2523161A1 (en) | 2012-11-14 |
KR20120094102A (ko) | 2012-08-23 |
US8855373B2 (en) | 2014-10-07 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201080060873.1 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10842178 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13519307 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011548937 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010842178 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 20127017705 Country of ref document: KR Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |