CN105118086B - 3D point cloud measurement data registration method and system in 3D AOI equipment - Google Patents

Publication number: CN105118086B
Application number: CN201510382095.2A
Authority: CN (China)
Inventor: 夏雨晨
Original assignee: DONGGUAN MENTO PHOTOELECTRIC TECHNOLOGY Co Ltd
Current assignee: Dongguan Meng Tuo Intelligent Technology Co., Ltd.
Other versions: CN105118086A (Chinese)
Legal status: Active (granted)

Classifications
  • Image Processing
  • Other Investigation Or Analysis Of Materials By Electrical Means
  • Magnetic Resonance Imaging Apparatus

Abstract

The present invention relates to a 3D point cloud measurement data registration method for 3D AOI equipment, comprising the following steps: calculate the camera's intrinsic matrix, distortion parameters, and the image rotation matrix; compute the camera's movement interval for each capture from the preset overlap width of adjacent images and the width of each image, and acquire an image sequence accordingly; correct and rotate every image in the sequence; compute the translation matrix between each pair of adjacent images in the sequence, translate each image accordingly, and save the translation parameters to form a configuration file; finally, stitch and fuse the images in the sequence according to the configuration file and the preset overlap width to form the complete image. The invention reuses the calibration board, originally used only to calibrate camera parameters, exploiting its high precision to correct the error of the camera's mechanical movement and thereby obtain the registration parameters required for stitching 2D and 3D data. This greatly reduces the whole-board image stitching time and improves stitching precision.

Description

3D point cloud measurement data registration method and system in 3D-AOI equipment
Technical field
The present invention relates to the field of AOI (Automatic Optical Inspection) equipment, and more particularly to a 3D point cloud measurement data registration method and system in 3D-AOI equipment.
Background technology
The whole-board image of a traditional AOI circuit board is mostly formed by directly stitching images at their theoretical overlap regions. In traditional AOI equipment the overlap width used for image stitching is zero: no theoretical overlap region is set, and the stitch has no fusion step. To guarantee stitching precision, high-precision motors and lead screws must be used, and the requirements on installation precision are also very high. This inevitably increases the overall cost of the equipment.
Meanwhile, with the arrival of high-precision optical 3D technology, and because the camera's field of view is limited once a telecentric lens is fitted, stitching errors affect the overall detection and measurement accuracy of both the images and the 3D point cloud data. Traditional image stitching methods therefore urgently need improvement. To guarantee high detection accuracy, the captured 2D images and the computed 3D data must be continuously corrected during whole-board scanning of the circuit board so that they coincide exactly within the theoretical overlap regions.
Traditional 3D point cloud stitching algorithms mostly pursue the registration of point cloud data with overlapping subsets captured from unknown viewpoints, stitching them to reconstruct a 3D model; in industry, however, most data acquisition follows a fixed pattern. Existing 3D point cloud stitching in industry mostly relies on "auxiliary markers": Mark labels are pasted on the surface of the measured object and serve as strong corner points for registering adjacent point clouds during stitching. This method cannot handle the acquisition and processing of 3D data in large batches.
Summary of the invention
The technical problem to be solved by the invention is to provide a 3D point cloud measurement data registration method and system in 3D-AOI equipment that stitches 3D point cloud data without using auxiliary markers. The present invention is realized as follows:
A 3D point cloud measurement data registration method in 3D-AOI equipment comprises the following steps:
Step A: capture one image of a dot calibration board with the camera, and use this image to compute the camera's intrinsic matrix and distortion parameters, as well as the image rotation matrix; the center of the image coincides with the center of one of the dots in the image;
Step B: compute the camera's movement interval for each capture from the preset overlap width of adjacent images and the width of each image, and capture images of the dot calibration board sequentially at this movement interval until all regions of the board have been captured, forming an image sequence;
Step C: correct and rotate every image in the image sequence according to the intrinsic matrix, distortion parameters, and rotation matrix;
Step D: select one image from the corrected and rotated sequence as the reference image and set its translation parameters to zero; meanwhile, compute the translation matrix between each pair of adjacent images in the corrected and rotated sequence, translate each image in the sequence accordingly, and save each image's translation parameters to form a configuration file;
Step E: stitch and fuse the images in the sequence according to the configuration file and the overlap width to form the complete image.
Further, in step D the translation matrix is:
N = k·N_X + (1−k)·N_Y (0 < k < 1);
where k is a weight coefficient, N is the translation matrix, N_X is the translation matrix of the X-direction image, N_Y is the translation matrix of the Y-direction image, t_x, t'_x are translation pixel distances along the X axis, and t_y, t'_y are translation pixel distances along the Y axis.
Further, the overlap width is less than twice the pixel distance between adjacent dot centers.
Further, the method also comprises the step of stitching and fusing other images according to the configuration file.
Further, in step D, the first image of the sequence is selected as the reference image and each subsequent image is translated; when translating each image, its previous image serves as the reference, and the image is translated by the translation matrix between it and its previous image.
A 3D point cloud measurement data registration system in 3D-AOI equipment comprises:
a camera calibration module for capturing one image of a dot calibration board with the camera and using this image to compute the camera's intrinsic matrix and distortion parameters, as well as the image rotation matrix, where the center of the image coincides with the center of one of the dots in the image;
an image sequence acquisition module for computing the camera's movement interval for each capture from the preset overlap width of adjacent images and the width of each image, and capturing images of the dot calibration board sequentially at this movement interval until all regions of the board have been captured, forming an image sequence;
an image correction module for correcting and rotating every image in the image sequence according to the intrinsic matrix, distortion parameters, and rotation matrix;
a translation parameter configuration module for selecting one image from the corrected and rotated sequence as the reference image and setting its translation parameters to zero, meanwhile computing the translation matrix between each pair of adjacent images in the corrected and rotated sequence, translating each image in the sequence accordingly, and saving each image's translation parameters to form a configuration file;
a stitching and fusion module for stitching and fusing the images in the sequence according to the configuration file and the overlap width to form the complete image.
Further, in the translation parameter configuration module the translation matrix is: N = k·N_X + (1−k)·N_Y (0 < k < 1);
where k is a weight coefficient, N is the translation matrix, N_X is the translation matrix of the X-direction image, N_Y is the translation matrix of the Y-direction image, t_x, t'_x are translation pixel distances along the X axis, and t_y, t'_y are translation pixel distances along the Y axis.
Further, the overlap width is less than twice the pixel distance between adjacent dot centers.
Further, the stitching and fusion module is also used to stitch and fuse other images according to the configuration file.
Further, in the translation parameter configuration module, the first image of the sequence is selected as the reference image and each subsequent image is translated; when translating each image, its previous image serves as the reference, and the image is translated by the translation matrix between it and its previous image.
Compared with the prior art, the present invention reuses the calibration board, originally used only to calibrate camera parameters, exploiting its high precision to correct the error of the camera's mechanical movement and thereby obtain the registration parameters required for stitching 2D and 3D data. This greatly reduces the whole-board image stitching time and improves stitching precision.
Brief description of the drawings
Fig. 1: flow diagram of the 3D point cloud measurement data registration method in 3D-AOI equipment provided by an embodiment of the present invention;
Fig. 2: gray-scale image used for camera parameter calibration;
Fig. 3: dot-center marks of a single image;
Fig. 4: example calculation of image translation parameters;
Fig. 5: the whole-board image stitched from the calibration board;
Fig. 6: composition diagram of the 3D point cloud measurement data registration system in 3D-AOI equipment provided by an embodiment of the present invention.
Embodiments
To make the purpose, technical scheme, and advantages of the present invention clearer, the invention is further elaborated below in conjunction with the drawings and embodiments.
3D-AOI equipment is dedicated to 3D optical detection of circuit boards. The present invention is primarily intended to provide the 3D point cloud data registration method used in 3D-AOI equipment. As shown in Fig. 1, the 3D point cloud data registration method in 3D-AOI equipment provided by an embodiment of the present invention comprises the following steps:
Step A: capture one image of a dot calibration board with the camera, and use this image to compute the camera's intrinsic matrix and distortion parameters, as well as the image rotation matrix; the center of the image coincides with the center of one of the dots in the image.
Here, the intrinsic matrix relates each dot center's pixel coordinate to its world coordinate via q = M·Q, with
M = [ f_x 0 c_x ; 0 f_y c_y ; 0 0 1 ];
Radial distortion:
x_corrected = x(1 + k_1·r² + k_2·r⁴ + k_3·r⁶),
y_corrected = y(1 + k_1·r² + k_2·r⁴ + k_3·r⁶);
Tangential distortion:
x_corrected = x + [2·p_1·x·y + p_2·(r² + 2x²)],
y_corrected = y + [p_1·(r² + 2y²) + 2·p_2·x·y];
Rotation matrix:
R_z(θ) = [ cos θ −sin θ 0 ; sin θ cos θ 0 ; 0 0 1 ];
where q is the pixel coordinate of a dot center in the image, M is the intrinsic matrix, Q is the dot's world (actual) coordinate, k_1, k_2, p_1, p_2, k_3 are distortion parameters, r² = x² + y², x_corrected, y_corrected are the corrected coordinates, R_z(θ) is the rotation matrix, and θ is the rotation angle about the Z axis.
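As a minimal, illustrative sketch of the standard radial and tangential distortion model these symbols describe (function and variable names are the author's own, not from the patent), the distortion can be applied to a normalized image coordinate as follows:

```python
def distort(x, y, k1, k2, k3, p1, p2):
    # Standard pinhole distortion model: the radial terms (k1, k2, k3) scale
    # the point by 1 + k1*r^2 + k2*r^4 + k3*r^6; the tangential terms
    # (p1, p2) add the decentering correction. (x, y) is a normalized
    # image coordinate relative to the principal point.
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    x_d = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    y_d = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return x_d, y_d
```

Correction inverts this mapping, typically iteratively, as general calibration libraries such as OpenCV do.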
Step B: compute the camera's movement interval for each capture from the preset overlap width of adjacent images and the width of each image, and capture images of the dot calibration board sequentially at this movement interval until all regions of the board have been captured, forming an image sequence.
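The movement interval of step B is simply the image size minus the preset overlap. A sketch under the example figures used later in the text (900×800-pixel images, 100-pixel overlap; the function name is illustrative):

```python
def move_intervals(img_w_px, img_h_px, overlap_px):
    # Camera step between adjacent captures, in pixels: the part of the
    # field of view that does not overlap the previous image.
    return img_w_px - overlap_px, img_h_px - overlap_px
```

With a 900×800 image and a 100-pixel overlap this gives steps of 800 and 700 pixels along X and Y respectively.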
Step C: correct and rotate every image in the image sequence according to the intrinsic matrix, distortion parameters, and rotation matrix.
Step D: select one image from the corrected and rotated sequence as the reference image and set its translation parameters to zero; meanwhile, compute the translation matrix between each pair of adjacent images in the corrected and rotated sequence, translate each image in the sequence accordingly, and save each image's translation parameters to form a configuration file. In this step, the first image of the sequence is selected as the reference image and each subsequent image is translated; when translating each image, its previous image serves as the reference, and the image is translated by the translation matrix between it and its previous image. The translation matrix is:
N = k·N_X + (1−k)·N_Y (0 < k < 1);
where k is a weight coefficient, N is the translation matrix, N_X is the translation matrix of the X-direction image, N_Y is the translation matrix of the Y-direction image, t_x, t'_x are translation pixel distances along the X axis, and t_y, t'_y are translation pixel distances along the Y axis.
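A sketch of the weighted combination N = k·N_X + (1−k)·N_Y, under the simplifying assumption that each translation "matrix" is represented as a 2-vector of pixel offsets (names are illustrative):

```python
def combine_translations(n_x, n_y, k=0.5):
    # N = k*N_X + (1-k)*N_Y with 0 < k < 1: blend the offset estimated
    # against the X-direction neighbour (N_X) with the one estimated
    # against the Y-direction neighbour (N_Y), component-wise.
    assert 0 < k < 1
    return tuple(k * a + (1 - k) * b for a, b in zip(n_x, n_y))
```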
Step E: stitch and fuse the images in the sequence according to the configuration file and the overlap width to form the complete image. Steps A to D yield the 2D images of the calibration board and the configuration file. In practice, the 2D images of a circuit board can be corrected, stitched, and fused directly according to the configuration file. For the corresponding 3D point cloud data, after correction, the average height difference between the point clouds of adjacent images over the physical overlap region is computed directly across the sequence; the data are then moved along the Z axis to the same height, and the overlap regions of the 3D point clouds are fused and stitched.
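The Z-axis alignment described for step E can be sketched as follows, with point clouds simplified to dicts mapping (x, y) to height z (an illustrative data layout, not the patent's):

```python
def fuse_z(cloud_a, cloud_b, overlap_xy):
    # Average height difference D over the overlap region, then shift the
    # whole of cloud_b down by D so both clouds meet at the same height.
    d = sum(cloud_b[p] - cloud_a[p] for p in overlap_xy) / len(overlap_xy)
    shifted = {p: z - d for p, z in cloud_b.items()}
    return shifted, d
```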
Below is a specific embodiment of image stitching using the present invention:
Before image registration, the systematic error of the camera is corrected. In step A, to conveniently set the world coordinates for camera calibration, an image of the dot calibration board in which one dot center coincides with the image center is captured first, the gray-scale image shown in Fig. 2. The resolution of the captured image is 900×800, so the center point of the captured image is at (450, 400). The image center is taken as the coordinate origin, and the pixel coordinates of the dot centers obtained after image processing are shifted to this origin. Since the actual distance between dots is known, the world coordinate of each dot center is an integer multiple of this distance, which corresponds to a fixed pixel distance (125 in Fig. 2). From Fig. 2, 35 dot-center pixel coordinates are obtained, together with their corresponding world coordinates. The camera's intrinsic matrix, distortion parameters, and rotation matrix are then obtained by a general camera calibration method.
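The world-coordinate assignment described above can be sketched like so (the 900×800 resolution and 125-pixel dot spacing are the example's figures; the unit dot pitch and all names are assumptions for illustration):

```python
def dot_world_coord(px, py, cx=450, cy=400, spacing_px=125, pitch=1.0):
    # Shift the pixel origin to the image centre, then snap to integer
    # multiples of the dot pitch: spacing_px pixels correspond to one
    # pitch unit of actual distance between dot centres.
    return (round((px - cx) / spacing_px) * pitch,
            round((py - cy) / spacing_px) * pitch)
```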
Steps B and C complete the correction of the single image, both during calibration and in actual application. Fig. 3 shows the dot-center marks of a single image. Setting the theoretical X- and Y-axis overlap widths when the camera moves already completes the coarse registration of the image data. In an AOI device of a given model, the overlap width must be less than twice the pixel distance between adjacent dot centers; this ensures that the overlap region contains one and only one row or column of dots for deriving the translation matrix. Once an overlap width parameter has been selected, it must not be modified in subsequent use. In this example the X- and Y-axis overlaps are each 100 pixel distances, i.e. the overlap width is 100 pixels. Taking the first picture as the origin of image registration in step D gives the origin of the camera's mechanical movement and the placement origin of the circuit board when the whole circuit-board image is captured.
Step D then realizes precise registration. Starting from the image-sequence origin, each image has, in acquisition order, an overlap region with the previous image in the sequence. In Fig. 4 the upper-left corner is the image to be registered, while the lower and right images are already-configured images in the sequence (the "left image", "lower image", and "right image" referred to herein are respectively the upper-left image, the lower image, and the right image in Fig. 4). Taking left-right image registration as an example, all complete dots and their pixel coordinates are first obtained by processing the right image in Fig. 4. After removing rows or columns with missing dots, a coordinate set of H_n rows and W_n columns is obtained:
{(x_ij, y_ij) | 0 ≤ j ≤ W_n−1, 0 ≤ i ≤ H_n−1}.
The coordinates of row k are chosen to compute the step lengths from right to left: the X-axis step length Wdx and the Y-axis step length Wdy, given by:
Wdx = [Σ_{j=1}^{W_n−1} (x_{kj} − x_{k(j−1)})] / (W_n − 1)   (a);
Wdy = [Σ_{j=1}^{W_n−1} (y_{k(j−1)} − y_{kj})] / (W_n − 1)   (b).
If the first column satisfies x_{i0} < 100 (the value 100 here being the preset overlap width of 100 pixels), the first-column data are saved directly as the matching corners; otherwise the first-column data are stepped leftward by the step length until x_r < 100, i.e.:
if x_{i0} < 100: x_r = x_{r0}, y_r = y_{r0}, 0 ≤ r ≤ H_n−1   (c);
else: x_r = x_{r0} − Wdx, y_r = y_{r0} − Wdy, 0 ≤ r ≤ H_n−1   (d).
The matching corners of the right image are thus obtained as:
{(x_r, y_r) | 0 ≤ r ≤ H_n−1}; x_r < 0 is possible in this column of data.
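The prediction of the matching corner column just described, keep the first dot column if it already lies inside the 100-pixel overlap band, otherwise step it leftward by one dot spacing at a time, can be sketched as (names illustrative):

```python
def predict_overlap_corners(first_col, wdx, wdy, overlap_px=100):
    # first_col: [(x, y), ...] pixel coordinates of the first dot column.
    # Step the column leftward by one dot spacing per iteration until its
    # x coordinate falls inside the overlap band; x may end up negative.
    col = list(first_col)
    while col[0][0] >= overlap_px:
        col = [(x - wdx, y - wdy) for x, y in col]
    return col
```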
The coordinate set obtained for the left image to be registered is:
{(x'_ij, y'_ij) | 0 ≤ j ≤ W'_n−1, 0 ≤ i ≤ H'_n−1}.
Formula (e) moves the last column of coordinate points into the same coordinate system as the right image. The step parameters Wdx', Wdy' are computed by substituting the left image's coordinate set into formulas (a) and (b). Formula (f) then searches rightward for the column closest to the corners of the registered right image, which serves as the matching corners of the left image.
The corners of the left and right images are matched in nearest-neighbor fashion, giving:
{(x_m, y_m, x'_m, y'_m) | 0 ≤ m ≤ min(H'_n, H_n)};
where (x'_m, y'_m) ∈ {(x_l, y_l)} and (x_m, y_m) ∈ {(x_r, y_r)}. Taking the right image's corners as the reference, the left image's offset parameters t_x, t_y can be computed.
The offset parameters t'_x, t'_y between the image to be registered and the lower image are computed in the same way. The translation matrix is then finally obtained as:
N = k·N_X + (1−k)·N_Y (0 < k < 1).
The image is corrected according to this translation matrix.
After all whole-board images of the calibration board have been corrected, the correction parameters are saved into a configuration file. Fig. 5 shows the whole-board image stitched after correction; its pixel count is 16900×12700. In actual application, each image acquired at each camera step is first corrected by step A and then corrected by its corresponding translation parameters, which eliminates the registration time during data stitching.
As shown in Fig. 6, based on the above correction method, the present invention also provides a 3D point cloud data registration system in 3D-AOI equipment. The system comprises a camera calibration module 1, an image sequence acquisition module 2, an image correction module 3, a translation parameter configuration module 4, and a stitching and fusion module 5.
The camera calibration module 1 is used to capture one image of a dot calibration board with the camera and to use this image to compute the camera's intrinsic matrix and distortion parameters, as well as the image rotation matrix; the center of the image coincides with the center of one of the dots in the image.
The image sequence acquisition module 2 is used to compute the camera's movement interval for each capture from the preset overlap width of adjacent images and the width of each image, and to capture images of the dot calibration board sequentially at this movement interval until all regions of the board have been captured, forming an image sequence.
The image correction module 3 is used to correct and rotate every image in the image sequence according to the intrinsic matrix, distortion parameters, and rotation matrix.
The translation parameter configuration module 4 is used to select one image from the corrected and rotated sequence as the reference image and set its translation parameters to zero, and meanwhile to compute the translation matrix between each pair of adjacent images in the corrected and rotated sequence, translate each image in the sequence accordingly, and save each image's translation parameters to form a configuration file. The first image of the sequence is selected as the reference image and each subsequent image is translated; when translating each image, its previous image serves as the reference, and the image is translated by the translation matrix between it and its previous image.
The stitching and fusion module 5 is used to stitch and fuse the images in the sequence according to the configuration file and the overlap width to form the complete image.
In the translation parameter configuration module 4, the translation matrix is: N = k·N_X + (1−k)·N_Y (0 < k < 1), where k is a weight coefficient, N is the translation matrix, N_X is the translation matrix of the X-direction image, N_Y is the translation matrix of the Y-direction image, t_x, t'_x are translation pixel distances along the X axis, and t_y, t'_y are translation pixel distances along the Y axis.
The overlap width is less than twice the pixel distance between adjacent dot centers; in the present embodiment it is 100 pixels. The stitching and fusion module 5 is also used to stitch and fuse other images according to the configuration file. Data stitching is divided into two parts: registration first, then fusion. The configuration file comprises the X- and Y-axis config files. In the overlap region the image data (RGB) are fused directly, realizing seamless stitching of the image data. Because 2D image data have only X- and Y-axis components, their stitching can be realized directly. 3D data (also called height data or point cloud data) have X-, Y-, and Z-axis components, and the stored configuration file realizes their registration. The overlap region can be used for Z-axis fusion of the data: compute the average height difference D of the overlap region between the two images to be fused, subtract D from the whole of the second image (bringing the second image to the same height as the first image), and then fuse it directly with the first image, realizing seamless stitching of the data. In this way, the configuration file derived from image correction serves both for stitching 2D images and for registering and fusing 3D data.
The foregoing describes merely preferred embodiments of the present invention and is not intended to limit it; any modifications, equivalent substitutions, and improvements made within the spirit and principles of the present invention shall be included within its scope of protection.

Claims (4)

1. A 3D point cloud measurement data registration method in 3D-AOI equipment, characterized by comprising the following steps:
Step A: capture one image of a dot calibration board with the camera, and use this image to compute the camera's intrinsic matrix and distortion parameters, as well as the image rotation matrix; the center of the image coincides with the center of one of the dots in the image;
Step B: compute the camera's movement interval for each capture from the preset overlap width of adjacent images and the width of each image, and capture images of the dot calibration board sequentially at this movement interval until all regions of the board have been captured, forming an image sequence, wherein the overlap width is less than twice the pixel distance between adjacent dot centers;
Step C: correct and rotate every image in the image sequence according to the intrinsic matrix, distortion parameters, and rotation matrix;
Step D: select one image from the corrected and rotated sequence as the reference image and set its translation parameters to zero; meanwhile, compute the translation matrix between each pair of adjacent images in the corrected and rotated sequence, translate each image in the sequence accordingly, and save each image's translation parameters to form a configuration file;
Step E: stitch and fuse the images in the sequence according to the configuration file and the overlap width to form the complete image;
In step D, the first image of the sequence is selected as the reference image and each subsequent image is translated; when translating each image, its previous image serves as the reference, and the image is translated by the translation matrix between it and its previous image;
Starting from the image-sequence origin, each image has, in acquisition order, an overlap region with the previous image in the sequence; all complete dots and their pixel coordinates are obtained from the right image by processing, rows or columns with missing dots are removed, and a coordinate set of H_n rows and W_n columns is obtained:
{(x_ij, y_ij) | 0 ≤ j ≤ W_n−1, 0 ≤ i ≤ H_n−1};
The coordinates of row k are chosen to compute the step lengths from right to left: the X-axis step length Wdx and the Y-axis step length Wdy, where Wdx and Wdy are respectively:
Wdx = [Σ_{j=1}^{W_n−1} (x_{kj} − x_{k(j−1)})] / (W_n − 1),  1 ≤ j ≤ W_n−1   (a);
Wdy = [Σ_{j=1}^{W_n−1} (y_{k(j−1)} − y_{kj})] / (W_n − 1),  1 ≤ j ≤ W_n−1   (b);
If the first column satisfies x_{i0} < 100, the value 100 representing the preset overlap width of 100 pixels, the first-column data are saved directly as the matching corners; otherwise the first-column data are stepped leftward by the step length until x_r < 100, i.e.:
if x_{i0} < 100: x_r = x_{r0}, y_r = y_{r0},  0 ≤ r ≤ H_n−1   (c);
else: x_r = x_{r0} − Wdx, y_r = y_{r0} − Wdy,  0 ≤ r ≤ H_n−1   (d);
The matching corners of the right image are obtained as:
{(x_r, y_r) | 0 ≤ r ≤ H_n−1};
The coordinate set obtained for the image to be registered is:
{(x'_ij, y'_ij) | 0 ≤ j ≤ W'_n−1, 0 ≤ i ≤ H'_n−1};
Formula (e) moves the last column of coordinate points into the same coordinate system as the right image; the step parameters Wdx', Wdy' are computed by substituting the coordinate set of the image to be registered into formulas (a) and (b); formula (f) then searches rightward for the column closest to the corners of the registered right image, which serves as the matching corners of the image to be registered:
x_l = x'_{l(W'_n−1)} − 800, y_l = y'_{l(W'_n−1)} − 800,  0 ≤ l ≤ H'_n−1   (e);
if x_r − x_l > Wdx/2: x_l = x_{r0} + Wdx', y_l = y_{r0} + Wdy',  0 ≤ r ≤ H'_n−1   (f);
The corner points of the right figure and of the image subject to registration are matched by nearest neighbor, obtaining:
{(xm,ym,x′m,y′m)|0≤m≤min(H′n,Hn)};
wherein (x′m, y′m) ∈ {(xl, yl)} and (xm, ym) ∈ {(xr, yr)}; taking the corner points of the right figure as the reference, the offset parameters tx, ty of the image subject to registration are calculated:
tx = [Σm(xm - x′m)]/m, ty = [Σm(ym - y′m)]/m, 0 ≤ m ≤ min(H′n, Hn);
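The offset computation above reduces to averaging the coordinate differences of the matched pairs. A minimal Python sketch, assuming (as an illustration, not the patent's implementation) that the bracketed sums are coordinate differences and that the matches are held as (xm, ym, x′m, y′m) tuples:

```python
def mean_offset(matches):
    """Offset parameters (t_x, t_y): the mean displacement between each
    matched corner pair (x_m, y_m) of the right figure and
    (x'_m, y'_m) of the image subject to registration."""
    m = len(matches)
    t_x = sum(x - xp for x, y, xp, yp in matches) / m
    t_y = sum(y - yp for x, y, xp, yp in matches) / m
    return t_x, t_y
```

For example, corner pairs all displaced by a constant (3, 2) pixels yield exactly that offset.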
The offset parameters t′x, t′y between the image subject to registration and the lower figure can be calculated similarly;
The right figure and the lower figure are the preceding images of the image subject to registration;
In step D, the translation matrix is: N = k′*NX + (1 - k′)*NY (0 < k′ < 1);
where k′ is a weight coefficient, N is the translation matrix, NX is the translation matrix of images along the X axis, NY is the translation matrix of images along the Y axis, tx, t′x are the translation pixel distances along the X axis, and ty, t′y are the translation pixel distances along the Y axis.
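As a worked illustration of the weighted blend (the function names and the homogeneous-matrix form are assumptions made for this sketch; the claim itself only fixes N = k′*NX + (1 - k′)*NY):

```python
def translation(tx, ty):
    # 3x3 homogeneous translation matrix carrying the pixel offsets.
    return [[1.0, 0.0, float(tx)],
            [0.0, 1.0, float(ty)],
            [0.0, 0.0, 1.0]]

def blend(N_X, N_Y, k):
    # N = k'*N_X + (1-k')*N_Y, computed elementwise, with 0 < k' < 1.
    assert 0.0 < k < 1.0
    return [[k * a + (1.0 - k) * b for a, b in zip(row_x, row_y)]
            for row_x, row_y in zip(N_X, N_Y)]
```

With k′ = 0.5, blending translations of (3, 2) and (5, 4) pixels gives a translation of (4, 3) pixels.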
2. The 3D point cloud Registration of Measuring Data method in 3D-AOI equipment as claimed in claim 1, characterized by further comprising the following step: splicing and fusing the other images according to the configuration file.
3. A 3D point cloud Registration of Measuring Data system in 3D-AOI equipment, characterized by comprising:
a camera calibration module, configured to acquire an image of a dot calibration board through a camera and to use the image to calculate the internal reference matrix and distortion parameters of the camera and the rotation matrix of the image, the center of the image coinciding with the center of one of the dots in the image;
an image sequence acquisition module, configured to calculate the movement interval for each acquisition according to the preset overlapping width of adjacent images and the width of each image, and to acquire images of the dot calibration board sequentially at that movement interval until all regions of the dot calibration board have been captured, forming an image sequence, wherein the overlapping width is less than twice the pixel distance between the centers of adjacent dots;
an image correction module, configured to correct and rotate each image in the image sequence according to the internal reference matrix, the distortion parameters, and the rotation matrix;
a translation parameter configuration module, configured to select one image from the corrected and rotated image sequence as a reference image and set its translation parameters to zero, and meanwhile to calculate the translation matrix between every two adjacent images in the corrected and rotated image sequence, translate each image accordingly, and save the translation parameters of each image, forming a configuration file;
a splicing and fusion module, configured to splice and fuse the images in the image sequence according to the configuration file and the overlapping width, forming an entire image;
In the translation parameter configuration module, the first image in the image sequence is selected as the reference image, and each subsequent image is translated; when translating each image, its preceding image is taken as the reference, and each image is translated according to the translation matrix between it and its preceding image;
Starting from the origin of the image sequence and following the acquisition order, each image has an overlapping region with its preceding image in the image sequence; all complete dots and their pixel coordinates are obtained, the incomplete rows or columns are removed, and after processing an Hn-row, Wn-column coordinate set of the right figure is obtained, the coordinate set being:
{(xij,yij)|0≤j≤Wn-1,0≤i≤Hn-1};
The coordinates of row k are chosen to calculate the step lengths between point coordinates from right to left, namely the X-axis step length Wdx and the Y-axis step length Wdy, where Wdx and Wdy are respectively:
Wdx = [Σj=1…Wn-1 (xkj - xk(j-1))]/(Wn-1), 1 ≤ j ≤ Wn-1    (a);
Wdy = [Σj=1…Wn-1 (yk(j-1) - ykj)]/(Wn-1), 1 ≤ j ≤ Wn-1    (b);
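Formulas (a) and (b) average the spacing between horizontally adjacent dot centres in row k; the sums telescope, so Wdx is simply the total horizontal span divided by Wn - 1. A small sketch (the list layout of `row` is an assumption made for illustration):

```python
def step_lengths(row):
    """Average X/Y step between adjacent dot centres in one
    calibration-board row, given `row` as (x, y) centres ordered
    left to right."""
    W = len(row)  # W_n, the number of dots in the row
    Wdx = sum(row[j][0] - row[j - 1][0] for j in range(1, W)) / (W - 1)
    Wdy = sum(row[j - 1][1] - row[j][1] for j in range(1, W)) / (W - 1)
    return Wdx, Wdy
```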
If the first column satisfies xi0 < 100 (the value 100 representing a preset overlapping width of 100 pixels), the first-column data is saved directly as the matched corner points; otherwise, prediction proceeds leftward from the first-column data by the step length until xr < 100, i.e.:
if xi0 < 100: xr = xr0, yr = yr0, 0 ≤ r ≤ Hn-1    (c);
else: xr = xr0 - Wdx, yr = yr0 - Wdy, 0 ≤ r ≤ Hn-1    (d);
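Formulas (c) and (d) amount to stepping the first column of dot centres leftward by one step length at a time until it lands inside the 100-pixel overlap band. A hedged sketch (function and parameter names are illustrative, not from the patent):

```python
def predict_matched_column(first_col, Wdx, Wdy, overlap=100):
    """Return the matched corner-point column {(x_r, y_r)}: keep the
    first column if it already satisfies x < overlap, otherwise step it
    leftward by (Wdx, Wdy) until it does."""
    col = list(first_col)
    while col and col[0][0] >= overlap:
        col = [(x - Wdx, y - Wdy) for x, y in col]
    return col
```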
The matched corner points of the right figure are obtained as:
{(xr,yr)|0≤r≤Hn-1};
The coordinate set of the image subject to registration is obtained as:
{(x′ij,y′ij)|0≤j≤W′n-1,0≤i≤H′n-1};
Using formula (e), the last column of coordinate points is moved into the same coordinate system as the right figure; the step parameters Wdx′ and Wdy′ can be calculated by substituting the coordinate set data of the image subject to registration into formulas (a) and (b); formula (f) is then used to search rightward for the column whose corner points are closest to those of the registered right figure, and that column is taken as the matched corner points of the image subject to registration;
xl = x′l(W′n-1) - 800, yl = y′l(W′n-1) - 800, 0 ≤ l ≤ H′n-1    (e);
if xr - xl > Wdx/2: xl = xr0 + Wdx′, yl = yr0 + Wdy′, 0 ≤ r ≤ H′n-1    (f);
The corner points of the right figure and of the image subject to registration are matched by nearest neighbor, obtaining:
{(xm,ym,x′m,y′m)|0≤m≤min(H′n,Hn)};
wherein (x′m, y′m) ∈ {(xl, yl)} and (xm, ym) ∈ {(xr, yr)}; taking the corner points of the right figure as the reference, the offset parameters tx, ty of the image subject to registration are calculated:
tx = [Σm(xm - x′m)]/m, ty = [Σm(ym - y′m)]/m, 0 ≤ m ≤ min(H′n, Hn);
The offset parameters t′x, t′y between the image subject to registration and the lower figure can be calculated similarly;
The right figure and the lower figure are the preceding images of the image subject to registration;
In the translation parameter configuration module, the translation matrix is: N = k′*NX + (1 - k′)*NY (0 < k′ < 1);
where k′ is a weight coefficient, N is the translation matrix, NX is the translation matrix of images along the X axis, NY is the translation matrix of images along the Y axis, tx, t′x are the translation pixel distances along the X axis, and ty, t′y are the translation pixel distances along the Y axis.
4. The 3D point cloud Registration of Measuring Data system in 3D-AOI equipment as claimed in claim 3, characterized in that the splicing and fusion module is further configured to splice and fuse the other images according to the configuration file.
CN201510382095.2A 2015-06-30 2015-06-30 3D point cloud Registration of Measuring Data method and system in 3D AOI equipment Active CN105118086B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510382095.2A CN105118086B (en) 2015-06-30 2015-06-30 3D point cloud Registration of Measuring Data method and system in 3D AOI equipment


Publications (2)

Publication Number Publication Date
CN105118086A CN105118086A (en) 2015-12-02
CN105118086B true CN105118086B (en) 2017-09-29

Family

ID=54666059

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510382095.2A Active CN105118086B (en) 2015-06-30 2015-06-30 3D point cloud Registration of Measuring Data method and system in 3D AOI equipment

Country Status (1)

Country Link
CN (1) CN105118086B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105469085B (en) * 2015-12-23 2019-03-26 广州视源电子科技股份有限公司 Board card image acquisition method and system
CN109031339B (en) * 2018-05-23 2020-11-10 北京控制工程研究所 Three-dimensional point cloud motion compensation method
CN109345597B (en) * 2018-09-27 2020-08-25 四川大学 Camera calibration image acquisition method and device based on augmented reality
CN109509226B (en) * 2018-11-27 2023-03-28 广东工业大学 Three-dimensional point cloud data registration method, device and equipment and readable storage medium
CN110298799B (en) * 2019-06-25 2021-02-23 福建工程学院 PCB image positioning correction method
CN111982015B (en) * 2020-08-18 2022-06-21 深圳大学 Method for monitoring structure geometry
CN112286972B (en) * 2020-11-06 2024-02-20 精华电子(苏州)有限公司 AOI tracing system
CN112750169B (en) * 2021-01-13 2024-03-19 深圳瀚维智能医疗科技有限公司 Camera calibration method, device, system and computer readable storage medium
CN114638824B (en) * 2022-05-11 2022-08-23 成都数联云算科技有限公司 Fusion method, device, equipment and medium for collecting images based on AOI equipment

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5341439A (en) * 1989-09-21 1994-08-23 Hsu Shin Yi System for texture-based automatic detection of man-made objects in representations of sensed natural environmental scenes
CN101276465B (en) * 2008-04-17 2010-06-16 上海交通大学 Method for automatically split-jointing wide-angle image
CN101710932B (en) * 2009-12-21 2011-06-22 华为终端有限公司 Image stitching method and device


Similar Documents

Publication Publication Date Title
CN105118086B (en) 3D point cloud Registration of Measuring Data method and system in 3D AOI equipment
CN108648240B (en) Non-overlapping view field camera attitude calibration method based on point cloud feature map registration
CN105716542B (en) A kind of three-dimensional data joining method based on flexible characteristic point
CN106683173B (en) A method of Three-dimensional Gravity is improved based on neighborhood Block- matching and is laid foundations the dense degree of cloud
CN108648232B (en) Binocular stereoscopic vision sensor integrated calibration method based on precise two-axis turntable
CN108510530B (en) Three-dimensional point cloud matching method and system
CN102376089B (en) Target correction method and system
CN100470590C (en) Camera calibration method and calibration apparatus thereof
CN106529587B (en) Vision course recognition methods based on object detection
US20170287166A1 (en) Camera calibration method using a calibration target
CN108921901A (en) A kind of big visual field camera calibration method based on accurate two-axis platcform and laser tracker
CN109272574B (en) Construction method and calibration method of linear array rotary scanning camera imaging model based on projection transformation
CN110443879B (en) Perspective error compensation method based on neural network
CN106871787A (en) Large space line scanning imagery method for three-dimensional measurement
CN110378969A (en) A kind of convergence type binocular camera scaling method based on 3D geometrical constraint
CN105823416A (en) Method for measuring object through multiple cameras and device thereof
CN109579695B (en) Part measuring method based on heterogeneous stereoscopic vision
CN108716890A (en) A kind of high-precision size detecting method based on machine vision
CN113920205B (en) Calibration method of non-coaxial camera
CN110223355B (en) Feature mark point matching method based on dual epipolar constraint
CN109961485A (en) A method of target positioning is carried out based on monocular vision
CN106996748A (en) Wheel diameter measuring method based on binocular vision
CN109584310B (en) Splicing method for large object surface shape measurement based on verticality constraint
CN104167001B (en) Large-visual-field camera calibration method based on orthogonal compensation
CN114705122A (en) Large-field stereoscopic vision calibration method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: 523000 Second Floor, Executive Building No. 1 Minxing Road, Chang'an Town Corner Community, Dongguan City, Guangdong Province

Patentee after: Dongguan Meng Tuo Intelligent Technology Co., Ltd.

Address before: 523000 4th Floor, 118 Guanchang Road, Yongtou, Chang'an Town, Dongguan City, Guangdong Province

Patentee before: Dongguan MENTO Photoelectric Technology Co., Ltd.
