CN103456000B - Feature point matching method and device - Google Patents

Feature point matching method and device

Info

Publication number
CN103456000B
CN103456000B CN201210243804.5A CN201210243804A
Authority
CN
China
Prior art keywords
image
unique point
visual angle
feature points
taking device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201210243804.5A
Other languages
Chinese (zh)
Other versions
CN103456000A (en)
Inventor
黄国唐
吕尚杰
谢伯璜
江博通
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Industrial Technology Research Institute ITRI
Original Assignee
Industrial Technology Research Institute ITRI
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Industrial Technology Research Institute ITRI filed Critical Industrial Technology Research Institute ITRI
Publication of CN103456000A publication Critical patent/CN103456000A/en
Application granted granted Critical
Publication of CN103456000B publication Critical patent/CN103456000B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Input (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

A feature point matching method includes: obtaining a feature point from an image of at least one object; obtaining a position Pn of the feature point; obtaining a position Pn' of the feature point corresponding to Pn; and determining whether a threshold is greater than an offset, and if so, obtaining the feature point.

Description

Feature point matching method and device thereof

Technical Field

The present invention relates to a feature point matching method and device, and in particular to a method and device applied to image detection.

Background Art

With the development of automation, industry has in recent years used robotic arms to achieve factory automation, but most applications are still limited to repetitive, fixed tasks. These tasks must be planned in advance, and an experienced operator must then operate a teach pendant to teach the robotic arm to move back and forth along a fixed path.

However, in the electronics industry, which is characterized by miniaturization and high-mix, low-volume production, the robotic arms used there require more teaching time and many fixtures for the many different kinds of objects, which inevitably drives up cost.

Therefore, in recent years, robots used in automated production have been combined with imaging devices, such as a fixed single CCD (Charge-Coupled Device), an eye-in-hand CCD, or a stereo vision module, to improve the robotic arm's ability to recognize workpiece orientation and to grasp workpieces, thereby shortening changeover time and reducing fixture fabrication cost.

However, in a vision-guided robot (VGR) system, all guidance tasks rely on feature points in the image of the object for judgment and position estimation, so the technique by which the imaging device obtains feature points merits discussion.

Summary of the Invention

In one embodiment, the present disclosure provides a feature point matching method, which includes:

obtaining a feature point from an image of at least one object;

obtaining a position Pn of the feature point;

obtaining a position Pn' of the feature point corresponding to Pn; and

determining whether a threshold is greater than an offset, and if so, obtaining the feature point.

In one embodiment, the present disclosure provides a feature point matching device, which includes:

a receiving unit having at least one light source;

an image capture device disposed above the receiving unit; and

a central processing unit electrically connected to the receiving unit and the image capture device, respectively.

Brief Description of the Drawings

FIG. 1 is a schematic diagram of an embodiment of a feature point matching device of the present disclosure.

FIG. 2 is a schematic diagram of another embodiment of the feature point matching device of the present disclosure.

FIG. 3 is a schematic flowchart of a feature point matching method of the present disclosure.

Description of Reference Numerals

10 receiving unit
100 light source
101 support plate
20 image capture device
200 first imaging device
201 first moving unit
202 second imaging device
203 second moving unit
30 central processing unit
300 output port
201A first moving unit
203A second moving unit
S1–S9 steps

Detailed Description

Implementations of the present disclosure are described below by way of specific embodiments; those skilled in the art can readily understand the present disclosure from the content disclosed in this specification.

Referring to FIG. 1, an embodiment of a feature point matching device of the present disclosure has a receiving unit 10, an image capture device 20, and a central processing unit 30.

The receiving unit 10 has at least one light source 100 and a support plate 101. The light source 100 is arranged on the outer edge of the support plate 101, and the support plate 101 can move axially relative to the light source 100; for example, the axial movement is movement along the vertical axis, and the light source 100 can be a light-emitting diode.

The image capture device 20 has at least one first imaging device 200, at least one first moving unit 201, at least one second imaging device 202, and at least one second moving unit 203.

The first moving unit 201 is disposed above the receiving unit 10.

The first imaging device 200 is mounted on the first moving unit 201 and has a first viewing angle relative to the receiving unit 10. If the first imaging device 200 is movable, it captures a plurality of feature point positions Pn of an object 40 from different positions (as described later); if the first imaging device 200 is fixed, the support plate 101 moves axially so that the first imaging device 200 captures the plurality of feature point positions Pn of the object 40 from different positions.

The second moving unit 203 is disposed above the receiving unit 10. For example, the second moving unit 203 and the first moving unit 201 can each be a combination of a linear slide rail and a slider, or a combination of a ball screw and a support seat.

The second imaging device 202 is mounted on the second moving unit 203 and has a second viewing angle relative to the receiving unit 10. For example, the second viewing angle and the first viewing angle can each be between 0 and 180 degrees, such as 15, 25, 30, 35, 40, 45, 50, 60, 70, 80, 90, 95, 100, 110, 120, 130, 140, 150, 160, 170, or 175 degrees. The second imaging device 202 and the first imaging device 200 can each be a CCD (Charge-Coupled Device) or a CMOS (Complementary Metal-Oxide-Semiconductor) sensor.

If the second imaging device 202 is movable, it captures a plurality of feature point positions Pn' of the object 40 from different positions (as described later); if the second imaging device 202 is fixed, the support plate 101 moves axially so that the second imaging device 202 captures the plurality of feature point positions Pn' of the object 40 from different positions.

The central processing unit 30 is electrically connected to the image capture device 20 and the receiving unit 10, respectively. The central processing unit 30 has at least one output port 300; for example, the output port 300 can be a USB (Universal Serial Bus), IEEE 1394a, or IEEE 1394b port.

In addition, the central processing unit 30 controls the operation of the first moving unit 201, the second moving unit 203, and the support plate 101, and receives the images of the object 40 captured by the first imaging device 200 and the second imaging device 202.

Referring to FIG. 2, in another embodiment of the feature point matching device of the present disclosure, the receiving unit 10, the light source 100, the support plate 101, the first imaging device 200, the second imaging device 202, the central processing unit 30, and the output port 300 are the same as in FIG. 1, so the same reference numerals are used.

In this embodiment, the first moving unit 201A and the second moving unit 203A of the image capture device 20A are each replaced by a robotic arm, and the first imaging device 200 and the second imaging device 202 are mounted on the free ends of the robotic arms.

The first moving unit 201A and the second moving unit 203A enable the first imaging device 200 and the second imaging device 202, respectively, to move longitudinally or laterally.

Referring to FIG. 3, the present disclosure provides a feature point matching method, which includes the following steps.

S1: Obtain feature points.

A feature point is obtained from an image of at least one object 40; for example, the object 40 can be a thin object.

Referring again to FIG. 1 and FIG. 2, at least one object 40 is placed on the support plate 101. The light source 100 projects light horizontally onto the surface of the object 40, so that the surface of the object 40 forms a plurality of continuous bright points and a plurality of continuous dark points; the bright points or dark points can be regarded as feature points, and the support plate 101 is moved axially at appropriate times to accentuate the feature points. For example, the feature points are obtained by corner detection or by manual designation, so as to obtain positions on the contour of the object 40 that serve as image points for matching; these image points are the feature points. Taking corner detection as an example, the surface of the object 40 has bright points and a plurality of continuous dark points, and these bright or dark points can be regarded as feature points. To illustrate further, the center of the object 40 is dark compared with the bright points, so the center can be regarded as a corner point and the bright points can be regarded as feature points.
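The patent leaves the choice of detector open ("corner detection or manual designation"), so the following is only a minimal sketch of step S1 using OpenCV's Shi–Tomasi corner detector as an illustrative stand-in; the function name, file name, and parameter values are assumptions, not part of the original disclosure.

```python
# Sketch of step S1 (corner detection). Illustrative only: the disclosure does
# not mandate a specific detector; Shi-Tomasi corners via goodFeaturesToTrack
# are used here as an example, with assumed parameter values.
import cv2
import numpy as np

def detect_feature_points(image_path, max_corners=50, quality=0.01, min_distance=10):
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise FileNotFoundError(image_path)
    corners = cv2.goodFeaturesToTrack(gray, max_corners, quality, min_distance)
    # goodFeaturesToTrack returns an (N, 1, 2) float32 array, or None if no corner is found.
    return [] if corners is None else corners.reshape(-1, 2)

# Example (hypothetical file name): feature points of the first-view image.
# points = detect_feature_points("first_view.png")
```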

S2: Set a matching size.

The matching size is a number of pixels and applies to the images of the object 40 captured by the first imaging device 200 and the second imaging device 202, described below.

S3: Obtain the position Pn of a feature point.

The first imaging device 200 obtains the position Pn of a feature point at the first viewing angle, where Pn is the position of a feature point in the image of the object 40 and n is an index such as 0, 1, 2, 3, 4, 5, 6, ..., n.

S4: Obtain the position Pn' of the feature point corresponding to Pn.

The second imaging device 202 obtains the position Pn' of the feature point at the second viewing angle, where Pn' is the position corresponding to Pn in the image at the first viewing angle.

S5: Obtain the position Pn'' of the feature point corresponding to Pn'.

The first imaging device 200 searches the image in the reverse direction with respect to Pn' and obtains the position Pn'' of the feature point at the first viewing angle, where Pn'' is the position corresponding to Pn' in the image at the first viewing angle.
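The disclosure does not name a particular matching algorithm for steps S3 to S5; one plausible reading is normalized cross-correlation over a square window of the current matching size. The sketch below follows that assumption using OpenCV's matchTemplate; the function names and window handling are illustrative only.

```python
# Sketch of steps S3-S5: match Pn from the first-view image into the second-view
# image (giving Pn'), then match back into the first-view image (giving Pn'').
# Normalized cross-correlation over a win x win window is an assumption.
import cv2
import numpy as np

def match_point(src_img, dst_img, pt, win):
    """Find in dst_img the position best matching the win x win patch of src_img centred on pt."""
    x, y = int(pt[0]), int(pt[1])
    h = win // 2
    template = src_img[y - h:y + h + 1, x - h:x + h + 1]
    if template.shape[0] != win or template.shape[1] != win:
        return None  # window falls outside the image
    score = cv2.matchTemplate(dst_img, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(score)
    # max_loc is the top-left corner of the best window; shift to its centre.
    return np.array([max_loc[0] + h, max_loc[1] + h], dtype=float)

def forward_backward(first_view, second_view, p_n, win):
    p_n_prime = match_point(first_view, second_view, p_n, win)         # S4: Pn -> Pn'
    if p_n_prime is None:
        return None, None
    p_n_dprime = match_point(second_view, first_view, p_n_prime, win)  # S5: Pn' -> Pn''
    return p_n_prime, p_n_dprime
```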

S6: Determine whether the threshold is greater than the offset (ΔXn > Xn), where the offset is Xn, Xn = Pn'' − Pn', and the threshold is ΔXn, ΔXn = Xn−1 − Xn. The above search for and judgment of the feature point positions is handled by the central processing unit 30, or is output through the output port 300 so that another computing unit performs the calculation to obtain the feature point positions and make the above judgment.

S7: If not, reduce the matching size, that is, reduce the number of pixels, and after the reduction perform S3 again.
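Building on the forward/backward sketch above, the loop below illustrates how steps S3 to S7 could iterate: compute the offset Xn as the magnitude of Pn'' − Pn' and the threshold ΔXn = Xn−1 − Xn, stop when the threshold exceeds the offset, and otherwise shrink the matching size and repeat S3. The initial window size, the shrink step, and the minimum window are assumptions made for illustration.

```python
# Sketch of the S3-S7 iteration under the assumptions stated above.
import numpy as np

def refine_match(first_view, second_view, p_n, win=61, shrink=10, min_win=11):
    x_prev = None
    while win >= min_win:
        p_prime, p_dprime = forward_backward(first_view, second_view, p_n, win)  # S3-S5
        if p_prime is None or p_dprime is None:
            break
        x_n = float(np.linalg.norm(p_dprime - p_prime))   # offset X_n
        if x_prev is not None:
            dx_n = x_prev - x_n                            # threshold dX_n = X_{n-1} - X_n
            if dx_n > x_n:                                 # S6: threshold > offset
                return p_n, p_prime                        # S8: matched pair obtained
        x_prev = x_n
        win -= shrink                                      # S7: reduce the matching size
    return None
```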

S8: If yes, the feature point is obtained, and a stereo vision ranging is then performed in S9. The first imaging device 200 and the second imaging device 202 capture the above feature points, and the coordinates of the feature points captured by the first imaging device 200 and those captured by the second imaging device 202 are substituted into a first bundle-intersection collinearity function and a second bundle-intersection collinearity function to calculate the three-dimensional spatial coordinates of the object 40; details can be found in Taiwan Patent Application No. 099107942.
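The patent computes the 3-D coordinates from a first and a second bundle-intersection collinearity function whose details are given in Taiwan Patent Application No. 099107942 and are not reproduced here. As a generic stand-in for step S9, the sketch below uses linear triangulation with 3×4 projection matrices P1 and P2 for the first and second imaging devices, which are assumed to come from a prior calibration.

```python
# Sketch of step S9 (stereo ranging) using generic linear triangulation.
# P1 and P2 are assumed calibrated projection matrices, not the patent's
# collinearity functions.
import cv2
import numpy as np

def triangulate(P1, P2, pt_first, pt_second):
    pts1 = np.asarray(pt_first, dtype=float).reshape(2, 1)
    pts2 = np.asarray(pt_second, dtype=float).reshape(2, 1)
    X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)  # 4x1 homogeneous coordinates
    return (X_h[:3] / X_h[3]).ravel()                # Euclidean 3-D point
```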

To summarize, the present disclosure obtains feature point positions from an image, matches two feature points between images at those positions, and, after the matching succeeds, uses the result to perform a reverse matching on the image to obtain the offset between the two matching results. The offset is a positional offset; if the offset is smaller than the threshold, the searched size is a preferred size, so feature matching accuracy is achieved, overcoming the image feature deformation caused by viewing-angle differences.
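For completeness, a minimal usage example chaining the sketches above; all file names, the projection matrices, and the default window sizes are hypothetical.

```python
# Hypothetical end-to-end usage of the sketches above.
import cv2
import numpy as np

first_view = cv2.imread("first_view.png", cv2.IMREAD_GRAYSCALE)
second_view = cv2.imread("second_view.png", cv2.IMREAD_GRAYSCALE)
P1 = np.loadtxt("P1.txt").reshape(3, 4)  # assumed calibrated projection matrices
P2 = np.loadtxt("P2.txt").reshape(3, 4)

for p_n in detect_feature_points("first_view.png"):
    pair = refine_match(first_view, second_view, p_n)
    if pair is not None:
        p_first, p_second = pair
        print(triangulate(P1, P2, p_first, p_second))
```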

However, the specific embodiments described above are intended only to illustrate the present disclosure and not to limit its scope. Any equivalent changes and modifications made using the content disclosed herein, without departing from the spirit and technical scope of the present disclosure, shall remain within the scope of protection of the appended claims.

Claims (17)

1. A feature point matching method, comprising:
obtaining a feature point from an image of at least one object;
obtaining a position Pn of a feature point;
obtaining a position Pn' of the feature point corresponding to Pn; and
determining whether a threshold is greater than an offset, and if so, obtaining the feature point,
wherein a first imaging device obtains the position Pn of a feature point at a first viewing angle, Pn being the position of the feature point in an image of the object, and a second imaging device obtains the position Pn' of the feature point at a second viewing angle, Pn' being the position corresponding to Pn in the image at the first viewing angle,
wherein the first imaging device searches the image in the reverse direction with respect to Pn' and obtains a position Pn'' of the feature point at the first viewing angle, Pn'' being the position corresponding to Pn' in the image at the first viewing angle, and wherein the offset is Xn, Xn = Pn'' − Pn', and the threshold is ΔXn, ΔXn = Xn−1 − Xn.
2. The feature point matching method as claimed in claim 1, wherein the feature point is obtained by corner detection or by manual designation.
3. The feature point matching method as claimed in claim 1, further comprising setting a matching size.
4. The feature point matching method as claimed in claim 3, wherein the matching size is a number of pixels.
5. The feature point matching method as claimed in claim 3, wherein, in determining whether the threshold is greater than the offset, if not, the matching size is reduced and the step of obtaining the position Pn of a feature point is performed again.
6. The feature point matching method as claimed in claim 1, wherein if the feature point is obtained, a stereo vision ranging is performed.
7. A feature point matching device, comprising:
a receiving unit having at least one light source;
an image capture device disposed above the receiving unit; and
a central processing unit electrically connected to the receiving unit and the image capture device, respectively,
wherein the image capture device has at least one first imaging device, the first imaging device having a first viewing angle relative to the receiving unit,
wherein the image capture device has at least one second imaging device, the second imaging device having a second viewing angle relative to the receiving unit,
wherein the first imaging device obtains a position Pn of a feature point from an image of at least one object at the first viewing angle, Pn being the position of the feature point in the image of the object, and the second imaging device obtains a position Pn' of the feature point from the image of the at least one object at the second viewing angle, Pn' being the position corresponding to Pn in the image at the first viewing angle,
wherein the first imaging device searches the image in the reverse direction with respect to Pn' and obtains a position Pn'' of the feature point at the first viewing angle, Pn'' being the position corresponding to Pn' in the image at the first viewing angle, and wherein an offset is Xn, Xn = Pn'' − Pn', and a threshold is ΔXn, ΔXn = Xn−1 − Xn.
8. The feature point matching device as claimed in claim 7, wherein the receiving unit further has a support plate, and the light source is arranged on an outer edge of the support plate.
9. The feature point matching device as claimed in claim 8, wherein the support plate is movable axially relative to the light source.
10. The feature point matching device as claimed in claim 9, wherein the light source is a light-emitting diode.
11. The feature point matching device as claimed in claim 7, wherein the image capture device has at least one first moving unit, and the first imaging device is mounted on the first moving unit.
12. The feature point matching device as claimed in claim 11, wherein the first moving unit is a combination of a linear slide rail and a slider, a combination of a ball screw and a support seat, or a robotic arm.
13. The feature point matching device as claimed in claim 7, wherein the first imaging device and the second imaging device are each a CCD or a CMOS sensor.
14. The feature point matching device as claimed in claim 13, wherein the image capture device has at least one second moving unit, and the second imaging device is mounted on the second moving unit.
15. The feature point matching device as claimed in claim 14, wherein the second moving unit is a combination of a linear slide rail and a slider, a combination of a ball screw and a support seat, or a robotic arm.
16. The feature point matching device as claimed in claim 7, wherein the central processing unit has at least one output port.
17. The feature point matching device as claimed in claim 16, wherein the output port is a USB, IEEE 1394a, or IEEE 1394b port.
CN201210243804.5A 2012-05-29 2012-07-13 Feature point matching method and device Active CN103456000B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TW101119148 2012-05-29
TW101119148A TWI595428B (en) 2012-05-29 2012-05-29 Method of feature point matching

Publications (2)

Publication Number Publication Date
CN103456000A CN103456000A (en) 2013-12-18
CN103456000B true CN103456000B (en) 2016-04-13

Family

ID=49738330

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210243804.5A Active CN103456000B (en) 2012-05-29 2012-07-13 Feature point matching method and device

Country Status (2)

Country Link
CN (1) CN103456000B (en)
TW (1) TWI595428B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI571805B (en) * 2016-04-15 2017-02-21 元智大學 Progressive image matching method and device based on hashing function

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB201014339D0 (en) * 2010-02-26 2010-10-13 Sony Corp A method and apparatus for determining misalignment
CN101996399A (en) * 2009-08-18 2011-03-30 三星电子株式会社 Device and method for estimating parallax between left image and right image
CN102313982A (en) * 2010-07-02 2012-01-11 索尼公司 Microscope and area determination method

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI321297B (en) * 2006-09-29 2010-03-01 Ind Tech Res Inst A method for corresponding, evolving and tracking feature points in three-dimensional space
WO2008076942A1 (en) * 2006-12-15 2008-06-26 Braintech Canada, Inc. System and method of identifying objects
TWI385597B (en) * 2009-11-03 2013-02-11 Teco Elec & Machinery Co Ltd Image processing method and image processing system
TWI420066B (en) * 2010-03-18 2013-12-21 Ind Tech Res Inst Object measuring method and system

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101996399A (en) * 2009-08-18 2011-03-30 三星电子株式会社 Device and method for estimating parallax between left image and right image
GB201014339D0 (en) * 2010-02-26 2010-10-13 Sony Corp A method and apparatus for determining misalignment
CN102169577A (en) * 2010-02-26 2011-08-31 索尼公司 Method and apparatus for determining misalignment
CN102313982A (en) * 2010-07-02 2012-01-11 索尼公司 Microscope and area determination method

Also Published As

Publication number Publication date
CN103456000A (en) 2013-12-18
TW201349125A (en) 2013-12-01
TWI595428B (en) 2017-08-11

Similar Documents

Publication Publication Date Title
US10899014B2 (en) Multiple lens-based smart mechanical arm and positioning and assembly method thereof
CN110163912B (en) Two-dimensional code pose calibration method, device and system
JP2023052266A (en) Systems and methods for combining machine vision coordinate spaces in a guided assembly environment
WO2019114339A1 (en) Method and device for correcting motion of robotic arm
KR100920931B1 (en) Object posture recognition method of robot using TFT camera
US10708479B2 (en) Optical measurement of object location in three dimensions
JP2006148745A (en) Camera calibration method and apparatus
CN114029946A (en) Method, device and equipment for guiding robot to position and grab based on 3D grating
CN101033958A (en) Mechanical vision locating method
CN104760812B (en) Product real-time positioning system and method on conveyer belt based on monocular vision
CN107263468A (en) A kind of SCARA robotic asssembly methods of utilization digital image processing techniques
Hsu et al. Development of a faster classification system for metal parts using machine vision under different lighting environments
US10535157B2 (en) Positioning and measuring system based on image scale
CN106964907A (en) A kind of method and apparatus of laser cutting
JP2014170368A (en) Image processing device, method and program and movable body
CN106276285A (en) Group material buttress position automatic testing method
CN103456000B (en) Feature point matching method and device
CN105488802A (en) Fingertip depth detection method and system
JP2010214546A (en) Device and method for assembling
JP2018179584A (en) Calibration device, calibration method, calibration program
JP2009250777A (en) Surface inspection device and surface inspection method
TWI765567B (en) Method of measureing position errors for feed drive system
KR101747350B1 (en) Method for recognizing coordinates of object for visual servoing
CN110246121A (en) A kind of electronic assemblies assembly accuracy detection method of view-based access control model
Chen et al. A fast positioning method with pattern tracking for automatic wafer alignment

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant