CN101033958A - Mechanical vision locating method - Google Patents

Mechanical vision locating method

Info

Publication number
CN101033958A
CN101033958A CN200710051446A
Authority
CN
China
Prior art keywords
image
camera
interest
region
close
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN 200710051446
Other languages
Chinese (zh)
Other versions
CN100547351C (en)
Inventor
刘宏
宋恩民
刘晶晶
李国宽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huazhong University of Science and Technology
Original Assignee
Huazhong University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huazhong University of Science and Technology filed Critical Huazhong University of Science and Technology
Priority to CNB2007100514467A priority Critical patent/CN100547351C/en
Publication of CN101033958A publication Critical patent/CN101033958A/en
Application granted granted Critical
Publication of CN100547351C publication Critical patent/CN100547351C/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

This invention discloses a machine vision positioning method. The detection task is completed with two ordinary cameras: a remote camera that captures a global image of the target object and a close camera that captures close-up images of the regions of interest at short range. First the remote camera is calibrated, and the ratio between actual physical distance and pixel distance on the close camera's imaging plane is determined. The remote camera then captures an image of the target object, from which the image coordinates and actual coordinates of each region-of-interest center are determined. A manipulator moves the close camera to the region-of-interest center, an image is acquired, its image coordinates are determined and converted to actual coordinates, and the actual position, shape, and size of the region are obtained. Using two ordinary cameras, this invention obtains high-precision detail information about the local regions of interest of the target object, meeting the high-precision requirements of industrial machine vision applications at low cost.

Description

A machine vision localization method
Technical field
The invention belongs to the field of machine vision in industrial applications and specifically relates to a machine vision localization method. The method uses two ordinary cameras to accurately obtain detailed information about local regions of interest of a target object within a larger scene, such as the precise length of a line segment or the exact position of a solder joint.
Background technology
In modern automated industrial production, machine vision technology is applied more and more widely, covering various inspection, measurement, and processing tasks, for example laser engraving, printed circuit board (PCB) pin welding, laser drilling, and trademark cutting. What these applications have in common is the need for continuous mass production together with very high requirements on product quality. The work is highly repetitive and is usually performed by manual inspection.
If this step were handled simply by hiring more inspection workers, it would add enormous labor and management costs for the factory; moreover, because the human eye fatigues when doing the same task for long periods, the probability of inspection errors increases, so a 100% inspection pass rate still cannot be guaranteed. Introducing machine vision technology into industrial production therefore has great value.
At present, research on measuring or inspecting planar objects almost always uses a single camera, as in the Chinese patent documents describing a plane measuring method based on a single image (publication number CN1427245, published 2003.07.02) and a single-view plane measuring method based on parallel lines (publication number CN1427246, published 2003.07.02). However, the camera must cover a relatively large area while, at the same time, certain local regions of interest within that area must be observed accurately and processed in real time, for example when solder joints on a circuit board require careful inspection and welding. This demands a high-end camera; otherwise the local regions of interest cannot be analyzed accurately, which inevitably makes the equipment expensive.
Existing machine vision applications in industry either lack sufficient inspection accuracy or achieve high accuracy only at increased equipment cost. Industry currently relies mainly on a single high-end camera to inspect products on a production line, which makes the whole system expensive. The present invention reduces the cost of the camera components while maintaining the same high accuracy, and is particularly suitable for systems in which a mechanical arm moves over the measured target plane, such as laser engraving, laser cutting, and machine welding systems.
Summary of the invention
The object of the present invention is to provide a machine vision localization method that can obtain precise information about the local regions of interest of a target object at low equipment cost.
The machine vision localization method provided by the invention uses two ordinary cameras working together to complete the detection task. One camera is fixed in position, relatively far from the measured target plane, and is used to capture a global image of the target object; it is called the remote camera. The other is mounted on a mechanical arm, is close to the measured target plane, can move over the measured target plane, and is used to capture close-up images of the regions of interest of the target object; it is called the close camera. The method comprises the following steps:
(1) Calibrate the remote camera to determine the coordinate transformation between its imaging plane and the measured target plane, and determine the ratio between pixel distance on the close camera's imaging plane and actual physical distance;
(2) Place the target object to be inspected on the measured target plane and take a global image containing the target object with the remote camera;
(3) Determine the regions of interest of the target object on the captured global image and record the image coordinates (u_i1, v_i1), i = 1, ..., m, of the region-of-interest centers, where m ≥ 1 is the number of regions of interest;
(4) Using the coordinate transformation between the remote camera's imaging plane and the measured target plane obtained in step (1), convert the image coordinates of the region-of-interest centers recorded in step (3) to actual coordinates (x_i1, y_i1), i = 1, ..., m;
(5) Use the mechanical arm to move the close camera to the actual coordinates of each region-of-interest center, so that the center of the region of interest coincides with the center of the close camera's imaging plane, and take a close-up image of that region;
(6) Determine the image coordinates of the region of interest in the close-up image, then use the ratio between pixel distance on the close camera's imaging plane and actual physical distance obtained in step (1) to convert the image coordinates in the close-up image to actual coordinates, obtaining the actual information of the region of interest.
The invention calibrates the remote camera with a polynomial calibration method, which removes distortion effects such as radial and tangential distortion and so determines the coordinate transformation between the imaging plane and the measured target plane more accurately. The mechanical arm then moves the close camera to each region-of-interest center so that the region of interest of the target object is located more precisely and its detailed information is obtained; this detailed information can be processed further as the application requires. Because ordinary cameras are cheap, using two of them costs about the same as using one while greatly improving detection accuracy, and compared with a single-camera machine vision system built around an expensive high-precision camera, the cost is much lower.
Description of drawings
Fig. 1 is the flow chart of the machine vision localization method of the present invention;
Fig. 2 is an image of the planar calibration board used by the remote camera of the present invention;
Fig. 3 is the template image used by the close camera of the present invention;
Fig. 4 shows the feature extraction result for the calibration board image in the embodiment of the invention.
Embodiment
The basic principle of the invention is as follows: two cameras are used. The remote camera handles the target object as a whole; it takes a global image of the target object, on which the regions of interest (which depend on the application scenario) are detected. The mechanical arm then moves the close camera to these regions of interest to take high-precision images for further detailed inspection and analysis.
The technical scheme of the invention is described in further detail below with reference to the accompanying drawings and an example.
As shown in Fig. 1, the machine vision localization method of the invention comprises the following steps:
(1) Calibrate the remote camera to determine the coordinate transformation between its imaging plane and the measured target plane, and determine the ratio between pixel distance on the close camera's imaging plane and actual physical distance;
(A) The remote camera is calibrated as follows:
(A1) Acquire a calibration board image: place the calibration board in the plane of the measured target object so that it fills the remote camera's field of view as much as possible, and take one image of it with the remote camera. The calibration board may be any image with distinct features and known physical dimensions, such as a chessboard pattern or a circular array;
(A2) Extract the feature points (such as corners or circle centers) in the calibration board image and determine the image coordinates of all extracted feature points;
For some types of calibration board, the acquired image may contain spurious points, corners shifted by the background, or corners that do not belong to the calibration board, so not every extracted feature point is useful and the useless ones should be rejected. A chessboard image, for example, requires this rejection step, whereas a circular array does not.
(A3) From the calibration board image, determine the actual coordinates corresponding to the feature points extracted in step (A2);
(A4) Use the polynomial calibration method with the image coordinates and actual coordinates of the feature points to determine the coordinate transformation between the imaging plane and the measured plane. Denote the image coordinates of a feature point by (u_i, v_i) and its actual coordinates by (x_i, y_i), i = 1, 2, ..., m. The formulas of the polynomial calibration method are:
x_i = a_1 u_i^k + a_2 v_i^k + a_3 u_i^(k-1) v_i + a_4 u_i v_i^(k-1) + a_5 u_i^(k-1) + ... + a_(4k-4) u_i + a_(4k-3) v_i + a_(4k-2)
y_i = b_1 u_i^k + b_2 v_i^k + b_3 u_i^(k-1) v_i + b_4 u_i v_i^(k-1) + b_5 u_i^(k-1) + ... + b_(4k-4) u_i + b_(4k-3) v_i + b_(4k-2)
where a_1, a_2, ..., a_(4k-2) and b_1, b_2, ..., b_(4k-2) are undetermined parameters, which are obtained by the least squares method.
(B) The ratio between pixel distance on the close camera's imaging plane and actual physical distance is determined as follows:
(B1) Place on the measured target plane a template containing one or more line segments of known length;
(B2) Take several images containing the above template, with the template placed at different positions or orientations;
(B3) Extract the line segment of known length from each template image and record the pixel length between its two endpoints, i.e., the number of pixels forming the segment;
(B4) Compute the ratio between pixel distance and actual physical distance.
(2) Place the target object to be inspected on the measured target plane and take a global image containing the target object with the remote camera;
(3) Determine the regions of interest of the target object on the captured global image and record the image coordinates (u_i1, v_i1), i = 1, ..., m, of the region-of-interest centers, where m ≥ 1 is the number of regions of interest;
(4) Using the coordinate transformation between the remote camera's imaging plane and the measured target plane obtained in step (1), convert the image coordinates of the region-of-interest centers recorded in step (3) to actual coordinates (x_i1, y_i1), i = 1, ..., m;
(5) Use the mechanical arm to move the close camera to the actual coordinates of each region-of-interest center, so that the center of the region of interest coincides with the center of the close camera's imaging plane, and take a close-up image of that region. Because this camera is very near the measured target plane, its field of view is small and the image is sharp, so precise detailed information about the region of interest can be obtained;
(6) Determine the image coordinates of the region of interest in the close-up image; from the relative position between these image coordinates and the center of the close camera's imaging plane, and using the ratio between pixel distance on the close camera's imaging plane and actual physical distance obtained in step (1), convert the image coordinates in the close-up image to actual coordinates and obtain the actual information of the region of interest, including its position, size, and shape. Then compare against the actual standard to determine whether the region shows abnormalities such as damage or breakage; if so, further processing can be carried out as required.
Example: circuit board solder joint detection
(1) Calibrate the remote camera to determine the coordinate transformation between its imaging plane and the measured target plane, and determine the ratio between pixel distance on the close camera's imaging plane and actual physical distance. The specific steps are as follows:
(A) The remote camera is calibrated as follows:
(A1) Acquire the calibration board image: use the chessboard image shown in Fig. 2 as the calibration board, place it in the plane of the target object so that it fills the remote camera's field of view as much as possible, and take one image of it with the remote camera; the physical side length of each square of the calibration board is known;
(A2) Extract the features of the calibration board image and determine the image coordinates of the feature points: since the invention uses a chessboard image, the corners can be extracted as the feature points of the calibration board. Corner extraction methods include SUSAN, Harris, and others; the Harris corner extraction algorithm is adopted here. The result is shown in Fig. 4, where the centers of the white crosses are the detected corners;
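For illustration, the following is a minimal Python sketch of step (A2) using OpenCV's Harris corner detector; the image file name and the detector parameters are assumptions for illustration, not values prescribed by the invention.

```python
import cv2
import numpy as np

# Calibration board image taken by the remote camera (file name assumed).
img = cv2.imread("calib_board.png")
gray = np.float32(cv2.cvtColor(img, cv2.COLOR_BGR2GRAY))

# Harris corner response; block size 2, Sobel aperture 3, k = 0.04 are common
# defaults, not values fixed by the patent.
response = cv2.cornerHarris(gray, 2, 3, 0.04)

# Keep points whose response exceeds a fraction of the maximum as candidate corners.
rows, cols = np.where(response > 0.01 * response.max())
image_points = list(zip(cols.tolist(), rows.tolist()))   # (u, v) pixel coordinates
print(f"{len(image_points)} candidate corner points extracted")
```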
(A3) Determine the actual coordinates of the feature points extracted in (A2);
Depending on the calibration board image chosen, the acquired image may contain spurious points, corners shifted by the background, or corners that do not belong to the calibration board, so not every feature point is useful; the useless feature points must be rejected and the actual coordinates of the valid feature points determined, as follows:
(a) Choose a coordinate system on the calibration board image, determining the origin O, the X and Y directions, and the effective region of the calibration board;
(b) Segment the effective region, extract edges, and extract the mutually orthogonal straight lines on the calibration board: the effective region can be binarized with methods such as the maximum entropy method or the OTSU algorithm; the OTSU algorithm is adopted in the present invention. After segmentation, the image contours are extracted, and the Hough transform is then used to extract each straight line;
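A minimal sketch of step (b) with OpenCV, assuming the effective region has already been cropped into a grayscale image; the file name and the Hough parameters are illustrative assumptions.

```python
import cv2
import numpy as np

# Effective region of the calibration board image (grayscale, file name assumed).
roi = cv2.imread("calib_board_roi.png", cv2.IMREAD_GRAYSCALE)

# OTSU binarization of the effective region.
_, binary = cv2.threshold(roi, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Edge extraction followed by the Hough transform to detect straight lines.
edges = cv2.Canny(binary, 50, 150)
lines = cv2.HoughLines(edges, 1, np.pi / 180, 120)   # each line is (rho, theta)

# The theta values later separate the two mutually orthogonal line families in step (c).
print(0 if lines is None else len(lines), "lines detected")
```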
(c) Sort the straight lines and obtain the intersections of the mutually orthogonal lines: first, all lines are divided into two classes, the lines of one class being perpendicular to those of the other and the lines within each class being parallel to each other; then the lines within each class are ordered by their distance from the origin; finally, the actual coordinates of the intersections of the orthogonal lines are obtained;
(d) Match the computed intersections against the corners obtained in (A2) to determine the actual coordinates of each valid corner; the nearest-neighbor method is adopted here. For example, suppose the intersection of two lines has actual coordinates (20, 30) (mm) and image coordinates (151, 124); if some corner's image coordinates lie within a Euclidean distance of 3 from those of the intersection, and the absolute difference in each coordinate component is less than 2, then that corner is accepted as a valid feature point and its actual coordinates are (20, 30) (mm);
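A minimal sketch of step (d): nearest-neighbor matching between the line intersections (whose real-world coordinates are known) and the Harris corners. The 3-pixel and 2-pixel thresholds follow the example in the text; the input data structures are assumptions for illustration.

```python
import math

def match_corners(intersections, corners, dist_thresh=3.0, comp_thresh=2.0):
    """intersections: list of ((u, v), (x_mm, y_mm)); corners: list of (u, v).
    Returns (corner, real-world coordinates) pairs for corners accepted as valid."""
    matched = []
    for (iu, iv), real_xy in intersections:
        for cu, cv in corners:
            du, dv = cu - iu, cv - iv
            if (math.hypot(du, dv) < dist_thresh
                    and abs(du) < comp_thresh and abs(dv) < comp_thresh):
                matched.append(((cu, cv), real_xy))
                break  # accept the first corner close enough to this intersection
    return matched

# Example from the text: intersection at (151, 124) px with real coordinates (20, 30) mm.
print(match_corners([((151, 124), (20, 30))], [(152.4, 123.1)]))
```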
(A4) In the present invention, the third-order polynomial calibration method is used to determine the coordinate transformation between the imaging plane and the measured target plane. Denote the image coordinates of each feature point by (u_i, v_i) and its actual coordinates by (x_i, y_i), i = 1, 2, ... (because the third-order polynomial calibration contains 20 undetermined parameters, at least 20 equations are needed to solve for them, so there must be at least 10 feature points). The relationship is:
x_i = a_1 u_i^3 + a_2 v_i^3 + a_3 u_i^2 v_i + a_4 u_i v_i^2 + a_5 u_i^2 + a_6 v_i^2 + a_7 u_i v_i + a_8 u_i + a_9 v_i + a_10
y_i = b_1 u_i^3 + b_2 v_i^3 + b_3 u_i^2 v_i + b_4 u_i v_i^2 + b_5 u_i^2 + b_6 v_i^2 + b_7 u_i v_i + b_8 u_i + b_9 v_i + b_10
where a_1, a_2, ..., a_10 and b_1, b_2, ..., b_10 are undetermined parameters.
The parameters a_j and b_j, j = 1, ..., 10, are obtained by the least squares method.
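A minimal sketch of step (A4): building the 10-term monomial basis of the third-order polynomial above and solving for a_1...a_10 and b_1...b_10 by least squares with NumPy. The variable names are assumptions; at least 10 matched points are required.

```python
import numpy as np

def poly3_terms(u, v):
    # Monomials in the order of the coefficients a_1 ... a_10 in the formula above.
    return [u**3, v**3, u**2 * v, u * v**2, u**2, v**2, u * v, u, v, 1.0]

def fit_poly3(image_pts, world_pts):
    """image_pts: list of (u_i, v_i); world_pts: list of (x_i, y_i); both of length m >= 10."""
    A = np.array([poly3_terms(u, v) for u, v in image_pts])   # m x 10 design matrix
    x = np.array([p[0] for p in world_pts])
    y = np.array([p[1] for p in world_pts])
    a, *_ = np.linalg.lstsq(A, x, rcond=None)                 # a_1 ... a_10
    b, *_ = np.linalg.lstsq(A, y, rcond=None)                 # b_1 ... b_10
    return a, b

def image_to_world(u, v, a, b):
    # Apply the fitted mapping to convert image coordinates to actual coordinates.
    t = np.array(poly3_terms(u, v))
    return float(t @ a), float(t @ b)
```

The same image_to_world mapping is what step (4) of the example below applies to each recorded solder joint center.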
(B) The ratio between pixel distance on the close camera's imaging plane and actual distance is determined as follows:
(B1) Place the template image shown in Fig. 3 on the measured target plane; the physical distance between the two points a and b on the template is known;
(B2) Take one or several images of the above template with the close camera (the placement may differ between images);
(B3) Extract the line segment of known length from the template and record its pixel length: obtain the image coordinates of the two points a and b, automatically or through human-machine interaction, and compute the pixel length D. The longer the segment, the more accurate the result;
(B4) Compute the ratio between pixel distance and actual physical distance: for each of the n extracted line segments, compute the ratio between pixel distance and actual physical distance, then average the n results to obtain the final value.
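A minimal sketch of step (B): computing the millimetres-per-pixel ratio of the close camera from segments of known physical length and averaging over the n segments. The input structure and the example numbers are illustrative assumptions.

```python
import math

def mm_per_pixel(segments):
    """segments: list of ((u_a, v_a), (u_b, v_b), length_mm) measured on the template images."""
    ratios = []
    for (ua, va), (ub, vb), length_mm in segments:
        d_pixels = math.hypot(ub - ua, vb - va)   # pixel length D of the segment
        ratios.append(length_mm / d_pixels)       # millimetres represented by one pixel
    return sum(ratios) / len(ratios)              # average over the n segments

# A 50 mm segment imaged as roughly 500 px gives about 0.1 mm per pixel.
print(mm_per_pixel([((100.0, 200.0), (600.0, 210.0), 50.0)]))
```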
(2) Place the circuit board on the measured target plane and take a global image containing the circuit board with the remote camera;
(3) Using human-machine interaction or image processing methods such as threshold segmentation, feature extraction, and template matching, find the approximate image coordinates (u_i1, v_i1), i = 1, ..., m (m ≥ 1), of all solder joints on the circuit board in the global image and record them;
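As one possible realization of the template-matching option in step (3), the sketch below locates solder-joint-like regions in the global image with OpenCV's normalized template matching; the file names and the 0.8 score threshold are assumptions, and overlapping detections would normally be merged, which is omitted here.

```python
import cv2
import numpy as np

board = cv2.imread("board_global.png", cv2.IMREAD_GRAYSCALE)          # global image (file name assumed)
joint_tpl = cv2.imread("solder_joint_template.png", cv2.IMREAD_GRAYSCALE)

# Normalized cross-correlation of the template over the whole global image.
scores = cv2.matchTemplate(board, joint_tpl, cv2.TM_CCOEFF_NORMED)
ys, xs = np.where(scores > 0.8)                                        # positions above the score threshold

h, w = joint_tpl.shape
joint_centers = [(x + w / 2.0, y + h / 2.0) for x, y in zip(xs, ys)]   # approximate (u_i1, v_i1)
print(f"{len(joint_centers)} candidate solder joints found")
```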
(4) Using the coordinate transformation between the remote camera's imaging plane and the measured target plane obtained in step (1), convert the image coordinates of the solder joints marked in step (3) to the actual coordinates (x_i1, y_i1) of each solder joint with the formulas below:
x_i1 = a_1 u_i1^3 + a_2 v_i1^3 + a_3 u_i1^2 v_i1 + a_4 u_i1 v_i1^2 + a_5 u_i1^2 + a_6 v_i1^2 + a_7 u_i1 v_i1 + a_8 u_i1 + a_9 v_i1 + a_10
y_i1 = b_1 u_i1^3 + b_2 v_i1^3 + b_3 u_i1^2 v_i1 + b_4 u_i1 v_i1^2 + b_5 u_i1^2 + b_6 v_i1^2 + b_7 u_i1 v_i1 + b_8 u_i1 + b_9 v_i1 + b_10
where a_1, a_2, ..., a_10 and b_1, b_2, ..., b_10 were obtained in step (1).
(5) According to the actual coordinates of each solder joint, use the mechanical arm to move the close camera to the actual position (x_i1, y_i1) of the solder joint, so that this position lies at the center of the close camera's imaging plane, i.e., the center has image coordinates (u_2, v_2) and actual coordinates (x_i1, y_i1), and take a close-up image of the solder joint;
(6) Determine the image coordinates (u_i2, v_i2) of the solder joint in its close-up image and obtain the relative position (u_i2 - u_2, v_i2 - v_2) with respect to the center of the close camera's imaging plane. Using the ratio between pixel distance on the close camera's imaging plane and actual physical distance obtained in step (1), convert this relative position into the relative offset (x_i2', y_i2') in actual coordinates. Then, from the actual coordinates (x_i1, y_i1) of the center of the close camera's imaging plane, the precise actual coordinates (x_i2, y_i2) of the solder joint are (x_i1 + x_i2', y_i1 + y_i2'), and the laser welding machine can be controlled to move to the actual position of the solder joint for welding.
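A minimal sketch of step (6): refining a solder joint's actual coordinates from its offset to the close-up image center and the mm-per-pixel ratio. All numbers are illustrative, and the sketch assumes the image axes and the world axes point the same way.

```python
def refine_joint(u_i2, v_i2, u_c, v_c, x_i1, y_i1, mm_per_px):
    """(u_i2, v_i2): joint in the close-up image; (u_c, v_c): image center;
    (x_i1, y_i1): actual coordinates of the image center. Returns (x_i2, y_i2)."""
    dx = (u_i2 - u_c) * mm_per_px   # x_i2' in mm
    dy = (v_i2 - v_c) * mm_per_px   # y_i2' in mm
    return x_i1 + dx, y_i1 + dy

# A joint 12 px right of and 5 px below the center at 0.1 mm/px lies 1.2 mm and
# 0.5 mm from the center's actual position.
print(refine_joint(652, 485, 640, 480, 120.0, 85.0, 0.1))
```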

Claims (3)

1. A machine vision localization method, the method using two ordinary cameras working together to complete the detection task, wherein one camera is fixed in position, relatively far from the measured target plane, is used to capture the global image of the target object, and is called the remote camera; the other is mounted on a mechanical arm, is close to the measured target plane, can move over the measured target plane, is used to capture close-up images of the regions of interest, and is called the close camera; the method comprising the steps of:
(1) calibrating the remote camera to determine the coordinate transformation between its imaging plane and the measured target plane, and determining the ratio between pixel distance on the close camera's imaging plane and actual physical distance;
(2) placing the target object to be inspected on the measured target plane and taking a global image containing the target object with the remote camera;
(3) determining the regions of interest of the target object on the captured global image and recording the image coordinates (u_i1, v_i1), i = 1, ..., m, of the region-of-interest centers, where m ≥ 1 is the number of regions of interest;
(4) using the coordinate transformation between the remote camera's imaging plane and the measured target plane obtained in step (1), converting the image coordinates of the region-of-interest centers recorded in step (3) to actual coordinates (x_i1, y_i1), i = 1, ..., m;
(5) using the mechanical arm to move the close camera to the actual coordinates of each region-of-interest center, so that the center of the region of interest coincides with the center of the close camera's imaging plane, and taking a close-up image of that region;
(6) determining the image coordinates of the region of interest in the close-up image and, from the position of these image coordinates relative to the center of the close camera's imaging plane and the ratio between pixel distance on the close camera's imaging plane and actual physical distance obtained in step (1), converting the image coordinates in the close-up image to actual coordinates, thereby obtaining the actual information of the region of interest.
2. The machine vision localization method according to claim 1, characterized in that in step (1) the remote camera is calibrated according to the following steps:
(A1) placing the calibration board image in the plane of the target object and taking one image of it with the remote camera;
(A2) extracting the feature points of the calibration board image and determining the image coordinates of all extracted feature points;
(A3) determining, from the calibration board image, the actual coordinates of the feature points extracted in step (A2);
(A4) using the following formulas with the image coordinates and actual coordinates of the feature points to determine the coordinate transformation between the imaging plane and the measured plane, where the image coordinates of a feature point are denoted (u_i, v_i) and its actual coordinates (x_i, y_i), i = 1, 2, ..., m:
x_i = a_1 u_i^k + a_2 v_i^k + a_3 u_i^(k-1) v_i + a_4 u_i v_i^(k-1) + a_5 u_i^(k-1) + ... + a_(4k-4) u_i + a_(4k-3) v_i + a_(4k-2)
y_i = b_1 u_i^k + b_2 v_i^k + b_3 u_i^(k-1) v_i + b_4 u_i v_i^(k-1) + b_5 u_i^(k-1) + ... + b_(4k-4) u_i + b_(4k-3) v_i + b_(4k-2)
where a_1, a_2, ..., a_(4k-2) and b_1, b_2, ..., b_(4k-2) are undetermined parameters, obtained by the least squares method.
3. The machine vision localization method according to claim 1 or 2, characterized in that in step (1) the ratio between pixel distance on the close camera's imaging plane and actual physical distance is determined by the following process:
(B1) placing on the measured target plane a template containing one or more line segments of known length;
(B2) taking several images containing the above template, with the template placed at different positions or orientations;
(B3) extracting the line segment of known length from each template image and recording the pixel length between its two endpoints, i.e., the number of pixels forming the segment;
(B4) computing the ratio between pixel distance and actual physical distance.
CNB2007100514467A 2007-02-01 2007-02-01 A kind of machine vision localization method Expired - Fee Related CN100547351C (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CNB2007100514467A CN100547351C (en) 2007-02-01 2007-02-01 A kind of machine vision localization method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CNB2007100514467A CN100547351C (en) 2007-02-01 2007-02-01 A kind of machine vision localization method

Publications (2)

Publication Number Publication Date
CN101033958A (en) 2007-09-12
CN100547351C CN100547351C (en) 2009-10-07

Family

ID=38730618

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB2007100514467A Expired - Fee Related CN100547351C (en) 2007-02-01 2007-02-01 A kind of machine vision localization method

Country Status (1)

Country Link
CN (1) CN100547351C (en)

Cited By (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101699223B (en) * 2009-10-27 2011-06-15 北京控制工程研究所 Calibration device of binocular vision navigation system of lunar surface vehicle
CN101876535A (en) * 2009-12-02 2010-11-03 北京中星微电子有限公司 Method, device and monitoring system for height measurement
CN101876535B (en) * 2009-12-02 2015-11-25 北京中星微电子有限公司 A kind of height measurement method, device and supervisory system
CN101840736A (en) * 2010-05-07 2010-09-22 中国科学院自动化研究所 Device and method for mounting optical glass under vision guide
CN102152033A (en) * 2011-02-14 2011-08-17 苏州工业园区华焊科技有限公司 Image centralizing location method for automatic tube plate welding
CN102152033B (en) * 2011-02-14 2014-02-05 苏州工业园区华焊科技有限公司 Image centralizing location method for automatic tube plate welding
CN103548341A (en) * 2011-06-08 2014-01-29 欧姆龙株式会社 Distributed image processing system
CN102506705A (en) * 2011-10-17 2012-06-20 罗艺 Method and device for obtaining coordinates of positioning mark on PCB (Printed Circuit Board) and patch device
CN102506705B (en) * 2011-10-17 2014-04-09 罗艺 Method and device for obtaining coordinates of positioning mark on PCB (Printed Circuit Board) and patch device
CN105279775A (en) * 2014-07-23 2016-01-27 广明光电股份有限公司 Correcting device and method of mechanical arm
CN105279775B (en) * 2014-07-23 2019-09-17 广明光电股份有限公司 The means for correcting and method of robotic arm
CN104217439A (en) * 2014-09-26 2014-12-17 南京工程学院 Indoor visual positioning system and method
CN104217439B (en) * 2014-09-26 2017-04-19 南京工程学院 Indoor visual positioning system and method
CN104280755A (en) * 2014-09-29 2015-01-14 南京航空航天大学 Relative navigation method based on Beidou reflected signals and vision measurement
CN106998463A (en) * 2016-01-26 2017-08-01 宁波舜宇光电信息有限公司 The method of testing of camera module based on latticed mark version
CN106204560A (en) * 2016-07-02 2016-12-07 上海大学 Colony picker automatic calibration method
CN106204560B (en) * 2016-07-02 2019-04-16 上海大学 Colony picker automatic calibration method
CN106092058A (en) * 2016-08-25 2016-11-09 广东欧珀移动通信有限公司 The processing method of information data, device and terminal
CN109801300A (en) * 2017-11-16 2019-05-24 北京百度网讯科技有限公司 Coordinate extraction method, device, equipment and the computer readable storage medium of X-comers
CN108181323A (en) * 2017-12-15 2018-06-19 河南飞优驰网络科技有限公司 Integrated circuit plate fault detection method and system based on electromagnetic signature
CN108646727A (en) * 2018-05-14 2018-10-12 珠海市微半导体有限公司 A kind of vision cradle and its localization method and recharging method
CN108876850A (en) * 2018-06-29 2018-11-23 闽江学院 A kind of pcb board punching localization method
CN108972562A (en) * 2018-09-13 2018-12-11 河南机电职业学院 A method of calibration sorting machine people work station carrier chain CountsPerMeter parameter
CN114585875A (en) * 2019-10-11 2022-06-03 莱卡地球系统公开股份有限公司 Metering system
CN110893269A (en) * 2019-11-26 2020-03-20 北京新松融通机器人科技有限公司 Fire-fighting robot water-supply hose joint butt joint method and system based on visual measurement
CN111425183A (en) * 2020-02-24 2020-07-17 中铁第四勘察设计院集团有限公司 Geological exploration hole site positioning method and positioning robot
CN111425183B (en) * 2020-02-24 2023-12-08 中铁第四勘察设计院集团有限公司 Geological exploration hole site positioning method and positioning robot
CN111310712A (en) * 2020-03-04 2020-06-19 杭州晟元数据安全技术股份有限公司 Fast searching method based on fingerprint bag-of-words features
CN111310712B (en) * 2020-03-04 2024-02-13 杭州晟元数据安全技术股份有限公司 Quick searching method based on fingerprint word bag characteristics
CN111486802A (en) * 2020-04-07 2020-08-04 东南大学 Rotating shaft calibration method based on self-adaptive distance weighting
CN111486802B (en) * 2020-04-07 2021-04-06 东南大学 Rotating shaft calibration method based on self-adaptive distance weighting
CN112518323A (en) * 2020-11-25 2021-03-19 武汉耀皮康桥汽车玻璃有限公司 Automobile rear windshield processing device
CN112669382A (en) * 2020-12-30 2021-04-16 联想未来通信科技(重庆)有限公司 Image-based distance determination method and device
CN113400196A (en) * 2021-06-18 2021-09-17 西安奕斯伟硅片技术有限公司 Cleaning method, device and equipment for grinding fixed disc groove and computer storage medium

Also Published As

Publication number Publication date
CN100547351C (en) 2009-10-07

Similar Documents

Publication Publication Date Title
CN101033958A (en) Mechanical vision locating method
CN103759648B (en) A kind of complicated angle welding method for detecting position based on Binocular stereo vision with laser
CN102636120B (en) Visual servo secondary locating system for LED (light emitting diode) chip and locating method of visual servo secondary locating system
CN111537517A (en) Unmanned intelligent stamping defect identification method
CN105783711B (en) Three-dimensional scanner correction system and correction method thereof
CN105865344A (en) Workpiece dimension measuring method and device based on machine vision
KR101631841B1 (en) 3d vision inspection system
CN109829911B (en) PCB surface detection method based on contour out-of-tolerance algorithm
CN110146017B (en) Industrial robot repeated positioning precision measuring method
CN107092905B (en) Method for positioning instrument to be identified of power inspection robot
CN105023018A (en) Jet code detection method and system
Wang et al. Error analysis and improved calibration algorithm for LED chip localization system based on visual feedback
JPH09101125A (en) Article shape measuring method and device
CN113538583A (en) Method for accurately positioning position of workpiece on machine tool and vision system
JP2021193400A (en) Method for measuring artefact
JP6621351B2 (en) Image processing apparatus and image processing method for laser processing
US20190080468A1 (en) Positioning and measuring system based on image scale
CN115131268A (en) Automatic welding system based on image feature extraction and three-dimensional model matching
CN112697112A (en) Method and device for measuring horizontal plane inclination angle of camera
US7139421B1 (en) Methods and apparatuses for detecting similar features within an image
US6614926B1 (en) Methods and apparatuses for generating from an image a model of an object
CN105718929B (en) The quick round object localization method of high-precision and system under round-the-clock circumstances not known
JP6052871B2 (en) Object moving apparatus, method, program, and recording medium
JP2008294065A (en) Mounting method and mounting device for electronic component
CN110726402A (en) Laser point vision guiding method of non-orthogonal shafting laser total station

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C17 Cessation of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20091007

Termination date: 20100201