CN103456000A - Feature point matching method and device - Google Patents

Feature point matching method and device

Info

Publication number
CN103456000A
Authority
CN
China
Prior art keywords
image
feature point
matching
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2012102438045A
Other languages
Chinese (zh)
Other versions
CN103456000B (en)
Inventor
黄国唐
吕尚杰
谢伯璜
江博通
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Industrial Technology Research Institute ITRI
Original Assignee
Industrial Technology Research Institute ITRI
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Industrial Technology Research Institute ITRI filed Critical Industrial Technology Research Institute ITRI
Publication of CN103456000A publication Critical patent/CN103456000A/en
Application granted granted Critical
Publication of CN103456000B publication Critical patent/CN103456000B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Abstract

A feature point matching method includes: obtaining a feature point from an image of at least one object; obtaining a position Pn of the feature point; obtaining a position Pn' corresponding to Pn; and determining whether a threshold is greater than an offset, and if so, accepting the feature point.

Description

Feature point matching method and device thereof
Technical field
The present disclosure relates to a feature point matching method and a device thereof, and in particular to a method and device applied to image detection.
Background technology
With the development of automation, industry has in recent years used robotic arms to achieve factory automation. Most applications, however, are still limited to fixed, repetitive work: the tasks must be planned in advance, and an operator uses a teaching device to instruct the arm to move back and forth along a fixed path.
Yet in an electronics industry characterized by miniaturized, high-mix, low-volume production, a robotic arm requires more teaching time and many different fixtures for the various objects, which invisibly drives up cost.
Therefore, robotic arms for automated production have in recent years been combined with imaging devices, such as a fixed single CCD (Charge-Coupled Device), an Eye-in-Hand CCD, or a stereo vision module. The imaging device improves the arm's ability to recognize and grasp workpiece poses, shortening changeover time and reducing fixture cost.
In a Vision Guided Robot (VGR) system, all guidance tasks rely on the feature points of the object image for judgment and position estimation, so the technique of obtaining feature points through an imaging device merits discussion.
Summary of the invention
In one embodiment, the disclosure provides a feature point matching method, which includes:
obtaining a feature point from an image of at least one object;
obtaining a position Pn of the feature point;
obtaining a position Pn' corresponding to Pn; and
determining whether a threshold is greater than an offset; if so, accepting the feature point.
In one embodiment, the disclosure provides a feature point matching device, which includes:
an accommodation unit having at least one light source;
an image capturing device disposed above the accommodation unit; and
a central processing unit electrically connected to the accommodation unit and the image capturing device, respectively.
Brief description of the drawings
Fig. 1 is a schematic diagram of an embodiment of a feature point matching device of the present disclosure.
Fig. 2 is a schematic diagram of another embodiment of the feature point matching device of the present disclosure.
Fig. 3 is a schematic flowchart of a feature point matching method of the present disclosure.
[Description of main element symbols]
10 accommodation unit
100 light source
101 tray
20 image capturing device
200 first image-taking device
201 first mobile unit
202 second image-taking device
203 second mobile unit
30 central processing unit
300 output port
201A first mobile unit
203A second mobile unit
S1~S9 steps
Embodiment
Embodiments of the present disclosure are described below through specific examples; those skilled in the art can readily understand the disclosure from the content of this specification.
Referring to Fig. 1, an embodiment of a feature point matching device of the present disclosure has an accommodation unit 10, an image capturing device 20, and a central processing unit 30.
The accommodation unit 10 has at least one light source 100 and a tray 101. The light source 100 is disposed at the outer rim of the tray 101, and the tray 101 can move axially relative to the light source 100; for example, this axial movement is along the longitudinal axis. The light source 100 can be a light-emitting diode.
The image capturing device 20 has at least one first image-taking device 200, at least one first mobile unit 201, at least one second image-taking device 202, and at least one second mobile unit 203.
The first mobile unit 201 is disposed above the accommodation unit 10.
The first image-taking device 200 is disposed on the first mobile unit 201 and has a first viewing angle relative to the accommodation unit 10. If the first image-taking device 200 is movable, it captures a plurality of feature point positions Pn (described later) of an object 40 at different positions; if the first image-taking device 200 is immovable, the tray 101 moves axially instead, and the first image-taking device 200 captures the plurality of feature point positions Pn of the object 40 at different positions.
The second mobile unit 203 is disposed above the accommodation unit 10. For example, the second mobile unit 203 and the first mobile unit 201 can each be a combination of a linear slide rail and a slider, or a combination of a ball screw and a supporting seat.
The second image-taking device 202 is disposed on the second mobile unit 203 and has a second viewing angle relative to the accommodation unit 10. For example, the second viewing angle and the first viewing angle can each be between 0 and 180 degrees, such as 15, 25, 30, 35, 40, 45, 50, 60, 70, 80, 90, 95, 100, 110, 120, 130, 140, 150, 160, 170, or 175 degrees. The second image-taking device 202 and the first image-taking device 200 can each be a CCD (Charge-Coupled Device) or a CMOS (Complementary Metal-Oxide-Semiconductor) sensor.
If the second image-taking device 202 is movable, it captures a plurality of feature point positions Pn' (described later) of an object 40 at different positions; if the second image-taking device 202 is immovable, the tray 101 moves axially instead, and the second image-taking device 202 captures the plurality of feature point positions Pn' of the object 40 at different positions.
The central processing unit 30 is electrically connected to the image capturing device 20 and the accommodation unit 10, respectively, and has at least one output port 300. For example, the output port 300 can be a USB (Universal Serial Bus), IEEE 1394a, or IEEE 1394b port.
In addition, the central processing unit 30 controls the operation of the first mobile unit 201, the second mobile unit 203, and the tray 101, and receives the images of the object 40 captured by the first image-taking device 200 and the second image-taking device 202.
Referring to Fig. 2, in another embodiment of the feature point matching device of the present disclosure, the accommodation unit 10, the light source 100, the tray 101, the first image-taking device 200, the second image-taking device 202, the central processing unit 30, and the output port 300 are carried over from Fig. 1, so their reference symbols are reused.
In this embodiment, the first mobile unit 201A and the second mobile unit 203A of the image capturing device 20A are each replaced by a mechanical arm, and the first image-taking device 200 and the second image-taking device 202 are disposed at the free ends of the mechanical arms.
The first mobile unit 201A and the second mobile unit 203A allow the first image-taking device 200 and the second image-taking device 202, respectively, to move vertically or horizontally.
Referring to Fig. 3, a feature point matching method of the present disclosure includes:
S1: obtain a feature point.
A feature point is obtained from an image of at least one object 40; for example, the object 40 can be a thin object.
Referring again to Fig. 1 and Fig. 2, at least one object 40 is placed on the tray 101. The light source 100 projects light in a horizontal direction onto the surface of the object 40, so that the surface forms a plurality of continuous bright points and a plurality of continuous dark points; a bright point or a dark point can be regarded as a feature point. The tray 101 is moved axially as needed to highlight the feature points. The feature point is obtained, for example, by corner detection or by manual designation, to serve as the image point in the contour of the object 40 whose position is to be matched; this image point is the feature point. Described in terms of corner detection: the surface of the object 40 has bright points and a plurality of continuous dark points, and these bright or dark points can be regarded as feature points. More specifically, where the center of the object 40 is dark relative to the surrounding bright points, that center can be regarded as a corner point, and the bright points can be regarded as feature points.
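The patent names corner detection for S1 but does not disclose a specific detector. As an illustrative sketch only, and not the patented method, a minimal Harris-style corner response in pure Python might look like this (the 3x3 window and the constant k = 0.04 are conventional assumptions):

```python
def harris_response(img, k=0.04):
    """Return a Harris-style corner-response map for a 2-D grayscale
    image given as a list of lists of numbers."""
    h, w = len(img), len(img[0])
    ix = [[0.0] * w for _ in range(h)]
    iy = [[0.0] * w for _ in range(h)]
    # Central-difference image gradients.
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            ix[y][x] = (img[y][x + 1] - img[y][x - 1]) / 2.0
            iy[y][x] = (img[y + 1][x] - img[y - 1][x]) / 2.0
    resp = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Structure tensor summed over a 3x3 window.
            a = b = c = 0.0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    gx, gy = ix[y + dy][x + dx], iy[y + dy][x + dx]
                    a += gx * gx
                    b += gy * gy
                    c += gx * gy
            # Harris response: det(M) - k * trace(M)^2.
            resp[y][x] = (a * b - c * c) - k * (a + b) ** 2
    return resp

def detect_corners(img, threshold):
    """Return (x, y) positions whose corner response exceeds the threshold."""
    resp = harris_response(img)
    return [(x, y)
            for y, row in enumerate(resp)
            for x, r in enumerate(row) if r > threshold]
```

On a bright square against a dark background (the bright/dark point pattern the light source 100 produces), the four square corners score high while flat regions score zero and straight edges score low.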
S2: set a matching size.
The matching size is a number of pixels; it applies to the images of the object 40 captured by the first image-taking device 200 and the second image-taking device 202 described later.
S3: obtain a position Pn of a feature point.
The first image-taking device 200 obtains the position Pn of a feature point at the first viewing angle. Pn is the position of the feature point in the image of the object 40, and n is an index, such as 0, 1, 2, 3, 4, 5, 6, ..., n.
S4: obtain the position Pn' corresponding to Pn.
The second image-taking device 202 obtains the position Pn' of the feature point at the second viewing angle. Pn' is the position corresponding to Pn of the image at the first viewing angle.
S5: obtain the position Pn'' corresponding to Pn'.
The first image-taking device 200 reverse-searches its image relative to Pn' and obtains the position Pn'' at the first viewing angle. Pn'' is the position corresponding to Pn' in the image at the first viewing angle.
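Steps S3 to S5 amount to a forward match from the first view into the second, followed by a reverse search back into the first view. The following is a minimal sketch of that forward-backward pattern under two assumptions the patent does not make: the images are reduced to 1-D scanlines, and the similarity measure is a sum-of-squared-differences (SSD) window comparison. The function names are illustrative, not terms from the patent.

```python
def match_position(src, dst, p, size):
    """Return the position q in `dst` whose window of half-width `size`
    best matches (lowest SSD cost) the window centred at `p` in `src`."""
    best_q, best_cost = None, float("inf")
    for q in range(size, len(dst) - size):
        cost = sum((src[p + d] - dst[q + d]) ** 2
                   for d in range(-size, size + 1))
        if cost < best_cost:
            best_q, best_cost = q, cost
    return best_q

def forward_backward(view1, view2, p_n, size):
    """Forward match (S4) then reverse search (S5): returns (Pn', Pn'')."""
    p_n_prime = match_position(view1, view2, p_n, size)          # S4: Pn'
    p_n_dprime = match_position(view2, view1, p_n_prime, size)   # S5: Pn''
    return p_n_prime, p_n_dprime
```

For a well-matched feature, Pn'' returns to (or near) the original Pn; the gap between the forward and backward results is what steps S6 and S7 evaluate.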
S6: determine whether the threshold is greater than the offset (ΔXn > Xn). The offset is Xn, where Xn = Pn'' - Pn'; the threshold is ΔXn, where ΔXn = Xn-1 - Xn. The search for and judgment of the above feature point positions are processed by the central processing unit 30, or are output through the output port 300 so that another arithmetic unit performs the computation and obtains the feature point positions and the aforesaid judgment.
If not, the matching size is reduced (S7), that is, the number of pixels is decreased, and S3 is carried out again.
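The accept-or-shrink loop of S6 and S7 can be sketched as follows, using the patent's definitions of offset (Xn = Pn'' - Pn') and threshold (ΔXn = Xn-1 - Xn). The callback `match_fn`, which stands in for steps S3 to S5 by returning the pair (Pn', Pn'') for a given matching size, and the function name itself are illustrative assumptions:

```python
def match_with_shrinking_size(match_fn, size, min_size=1):
    """S3-S7 control loop: re-match with progressively smaller matching
    sizes until the threshold exceeds the offset, then accept (S8)."""
    x_prev = None
    while size >= min_size:
        p_prime, p_dprime = match_fn(size)   # S3-S5: forward/backward match
        x = p_dprime - p_prime               # S6: offset Xn = Pn'' - Pn'
        # Threshold dXn = X(n-1) - Xn; accept when dXn > Xn.
        if x_prev is not None and (x_prev - x) > x:
            return p_prime, size             # S8: feature point accepted
        x_prev = x
        size -= 1                            # S7: reduce matching size, redo S3
    return None                              # no acceptable match found
```

The first iteration can never accept (there is no previous offset yet), which matches the flow of Fig. 3: at least two matches are needed before ΔXn is defined.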
If so, the feature point is accepted (S8) and a stereoscopic ranging is carried out (S9): the first image-taking device 200 and the second image-taking device 202 capture the above feature points, and the coordinates of the feature points captured by the first image-taking device 200 and by the second image-taking device 202 are substituted into a first beam-intersection collinearity function and a second beam-intersection collinearity function, respectively, to calculate the solid-space coordinates of the object 40. A detailed description can be found in Taiwan patent application No. 099107942.
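The beam-intersection collinearity functions themselves are in the cited Taiwan application 099107942 and are not reproduced here. As a generic stand-in for step S9, the 3D position can be approximated by triangulating the two viewing rays: find the parameters minimizing the distance between the rays and return the midpoint of the shortest connecting segment. This is textbook midpoint triangulation, not the patented formulation:

```python
def triangulate_midpoint(c1, d1, c2, d2):
    """Return the midpoint of the shortest segment between the rays
    x = c1 + t*d1 and x = c2 + s*d2 (camera centres c, directions d,
    all given as 3-element lists)."""
    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))
    r = [a - b for a, b in zip(c1, c2)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, r), dot(d2, r)
    denom = a * c - b * b
    if abs(denom) < 1e-12:
        raise ValueError("rays are parallel")
    # Least-squares parameters along each ray.
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    p1 = [ci + t * di for ci, di in zip(c1, d1)]
    p2 = [ci + s * di for ci, di in zip(c2, d2)]
    return [(u + v) / 2.0 for u, v in zip(p1, p2)]
```

When the two rays from the first and second image-taking devices actually intersect at the feature point, the midpoint is that intersection; with matching noise, it is the point equidistant from both rays.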
In summary, the present disclosure obtains the feature point positions of an image, performs matching of feature points between two images based on those positions, and, after the images are matched successfully, uses the result to perform a reverse matching on the images to obtain the offset between the two matching results; this offset is a positional offset. If the offset is less than the threshold, the search size is a preferred size, so a high feature matching precision can be obtained, overcoming image feature distortion caused by viewing-angle differences.
The specific embodiments described above are merely illustrative of the disclosure and are not intended to limit its practical scope. Without departing from the above spirit and technical scope of the disclosure, any equivalent change or modification made using the content disclosed herein should still be covered by the scope of the appended claims.

Claims (23)

1. A feature point matching method, which includes:
obtaining a feature point from an image of at least one object;
obtaining a position Pn of the feature point;
obtaining a position Pn' corresponding to Pn; and
determining whether a threshold is greater than an offset; if so, accepting the feature point.
2. The feature point matching method as claimed in claim 1, wherein the feature point is obtained by corner detection or by manual designation.
3. The feature point matching method as claimed in claim 1, further including setting a matching size.
4. The feature point matching method as claimed in claim 3, wherein the matching size is a number of pixels.
5. The feature point matching method as claimed in claim 3, wherein, in determining whether the threshold is greater than the offset, if not, the matching size is reduced and the position Pn of a feature point is obtained again.
6. The feature point matching method as claimed in claim 1, wherein a first image-taking device obtains the position Pn of a feature point at a first viewing angle, Pn being the position of the feature point in an image of an object.
7. The feature point matching method as claimed in claim 6, wherein a second image-taking device obtains the position Pn' of the feature point at a second viewing angle, Pn' being the position corresponding to Pn of the image at the first viewing angle.
8. The feature point matching method as claimed in claim 7, wherein the first image-taking device reverse-searches its image relative to Pn' and obtains the position Pn'' of the feature point at the first viewing angle, Pn'' being the position corresponding to Pn' in the image at the first viewing angle.
9. The feature point matching method as claimed in claim 1, wherein, if the feature point is accepted, a stereoscopic ranging is carried out.
10. The feature point matching method as claimed in claim 1, wherein the offset is Xn, Xn = Pn'' - Pn', and the threshold is ΔXn, ΔXn = Xn-1 - Xn.
11. A feature point matching device, which includes:
an accommodation unit having at least one light source;
an image capturing device disposed above the accommodation unit; and
a central processing unit electrically connected to the accommodation unit and the image capturing device, respectively.
12. The feature point matching device as claimed in claim 11, wherein the accommodation unit further has a tray, and the light source is disposed at the outer rim of the tray.
13. The feature point matching device as claimed in claim 12, wherein the tray can move axially relative to the light source.
14. The feature point matching device as claimed in claim 13, wherein the light source is a light-emitting diode.
15. The feature point matching device as claimed in claim 11, wherein the image capturing device has at least one first image-taking device, and the first image-taking device has a first viewing angle relative to the accommodation unit.
16. The feature point matching device as claimed in claim 15, wherein the image capturing device has at least one first mobile unit, and the first image-taking device is disposed on the first mobile unit.
17. The feature point matching device as claimed in claim 16, wherein the first mobile unit can be a combination of a linear slide rail and a slider, a combination of a ball screw and a supporting seat, or a mechanical arm.
18. The feature point matching device as claimed in claim 11, wherein the image capturing device has at least one second image-taking device, and the second image-taking device has a second viewing angle relative to the accommodation unit.
19. The feature point matching device as claimed in claim 18, wherein the first image-taking device and the second image-taking device can each be a CCD or a CMOS.
20. The feature point matching device as claimed in claim 19, wherein the image capturing device has at least one second mobile unit, and the second image-taking device is disposed on the second mobile unit.
21. The feature point matching device as claimed in claim 20, wherein the second mobile unit can be a combination of a linear slide rail and a slider, a combination of a ball screw and a supporting seat, or a mechanical arm.
22. The feature point matching device as claimed in claim 11, wherein the central processing unit has at least one output port.
23. The feature point matching device as claimed in claim 22, wherein the output port can be a USB, IEEE 1394a, or IEEE 1394b port.
CN201210243804.5A 2012-05-29 2012-07-13 Feature point matching method and device Active CN103456000B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TW101119148 2012-05-29
TW101119148A TWI595428B (en) 2012-05-29 2012-05-29 Method of feature point matching

Publications (2)

Publication Number Publication Date
CN103456000A true CN103456000A (en) 2013-12-18
CN103456000B CN103456000B (en) 2016-04-13

Family

ID=49738330

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210243804.5A Active CN103456000B (en) 2012-05-29 2012-07-13 Feature point matching method and device

Country Status (2)

Country Link
CN (1) CN103456000B (en)
TW (1) TWI595428B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI571805B (en) * 2016-04-15 2017-02-21 元智大學 Progressive image matching method and device based on hashing function

Citations (3)

Publication number Priority date Publication date Assignee Title
GB201014339D0 (en) * 2010-02-26 2010-10-13 Sony Corp A method and apparatus for determining misalignment
CN101996399A (en) * 2009-08-18 2011-03-30 三星电子株式会社 Device and method for estimating parallax between left image and right image
CN102313982A (en) * 2010-07-02 2012-01-11 索尼公司 Method is confirmed in microscope and zone

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
TWI321297B (en) * 2006-09-29 2010-03-01 Ind Tech Res Inst A method for corresponding, evolving and tracking feature points in three-dimensional space
WO2008076942A1 (en) * 2006-12-15 2008-06-26 Braintech Canada, Inc. System and method of identifying objects
TWI385597B (en) * 2009-11-03 2013-02-11 Teco Elec & Machinery Co Ltd Image processing method and image processing system
TWI420066B (en) * 2010-03-18 2013-12-21 Ind Tech Res Inst Object measuring method and system

Patent Citations (4)

Publication number Priority date Publication date Assignee Title
CN101996399A (en) * 2009-08-18 2011-03-30 三星电子株式会社 Device and method for estimating parallax between left image and right image
GB201014339D0 (en) * 2010-02-26 2010-10-13 Sony Corp A method and apparatus for determining misalignment
CN102169577A (en) * 2010-02-26 2011-08-31 索尼公司 Method and apparatus for determining misalignment
CN102313982A (en) * 2010-07-02 2012-01-11 索尼公司 Method is confirmed in microscope and zone

Also Published As

Publication number Publication date
TWI595428B (en) 2017-08-11
TW201349125A (en) 2013-12-01
CN103456000B (en) 2016-04-13

Similar Documents

Publication Publication Date Title
US10899014B2 (en) Multiple lens-based smart mechanical arm and positioning and assembly method thereof
CN107263468B (en) SCARA robot assembly method using digital image processing technology
CN109612390B (en) Large-size workpiece automatic measuring system based on machine vision
JP7212236B2 (en) Robot Visual Guidance Method and Apparatus by Integrating Overview Vision and Local Vision
CN103453889B (en) Ccd video camera calibration alignment method
CN105690393A (en) Four-axle parallel robot sorting system based on machine vision and sorting method thereof
CN104180753A (en) Rapid calibration method of robot visual system
CN110276799B (en) Coordinate calibration method, calibration system and mechanical arm
CN104626169A (en) Robot part grabbing method based on vision and mechanical comprehensive positioning
CN116958146B (en) Acquisition method and device of 3D point cloud and electronic device
CN103676976A (en) Correction method for three-dimensional worktable repositioning error
CN114029946A (en) Method, device and equipment for guiding robot to position and grab based on 3D grating
CN111267094A (en) Workpiece positioning and grabbing method based on binocular vision
CN105945953A (en) Visual identification system for robot
CN104966681A (en) Vision-based wafer deflection angle detection method
CN104384902A (en) Press fit system based on machine vision positioning
CN103456000A (en) Feature point matching method and device
CN111906767A (en) Vision rectification mechanical arm based on binocular structured light and rectification method
CN205552536U (en) Four -axis parallel robot letter sorting system based on machine vision
CN208399978U (en) A kind of positioning measuring device
CN110568699A (en) control method for simultaneously automatically focusing most 12 cameras
CN108459558B (en) Positioning measurement device and positioning measurement method
CN106622990B (en) Part fixation and recognition processing system
JP6334528B2 (en) Imaging device and production equipment
JP2020146773A (en) Handling device and robot device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant