CN107726975A - Error analysis method for vision-based stitching measurement - Google Patents

Error analysis method for vision-based stitching measurement

Info

Publication number
CN107726975A
CN107726975A (application CN201710853804.XA)
Authority
CN
China
Prior art keywords
coordinate system
error
coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710853804.XA
Other languages
Chinese (zh)
Other versions
CN107726975B (en)
Inventor
刘巍
兰志广
张洋
张致远
邸宏图
逯永康
马建伟
贾振元
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dalian University of Technology
Original Assignee
Dalian University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dalian University of Technology
Priority to CN201710853804.XA
Publication of CN107726975A
Application granted
Publication of CN107726975B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B11/002: Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G01B11/005: Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates; coordinate measuring machines
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84: Systems specially adapted for particular applications
    • G01N21/88: Investigating the presence of flaws or contamination
    • G01N21/93: Detection standards; calibrating baseline adjustment, drift correction

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present invention, an error analysis method for vision-based stitching measurement, belongs to the technical field of computer vision measurement. The method performs stitching measurement with a laser tracker and a binocular vision system. Multiple common points are first arranged in their shared field of view; the binocular cameras capture images and extract pixel coordinates, while the laser tracker simultaneously acquires the coordinates of each common point in the world coordinate system. The method computes the influence of the extracted pixel error on the extrinsic parameter matrix, then the influence of the extrinsic-matrix error on the point's coordinate error in the world coordinate system and the influence of the point's coordinate in the visual coordinate system on its coordinate error in the world coordinate system, and finally obtains the composite error of the measured point in the world coordinate system. The analysis procedure is simple and the error propagation chain is clear; the layout of the common points can be optimized according to the error analysis, improving the overall precision of the measuring system.

Description

Error analysis method for vision-based stitching measurement
Technical field
The invention belongs to the technical field of computer vision measurement and relates to an error analysis method for vision-based stitching measurement.
Background technology
With the continuous improvement of parts manufacturing in fields such as aerospace and the automotive industry, precision requirements for parts keep rising. Traditional measuring methods for such parts include coordinate measuring machines, laser radar, and indoor GPS. Machine vision methods developed in recent years are also widely applied in the aerospace and automotive fields because they are non-contact, fast, and precise. A vision method extracts the pixel coordinates of points from camera images to solve for their three-dimensional coordinates in the visual coordinate system, then converts those coordinates to the world coordinate system using the visual-to-world transformation matrix, completing the data stitching. Because errors exist when the pixel coordinates of a point are extracted, these errors directly affect the point's final coordinates in the world coordinate system, that is, the measurement accuracy. Studying how pixel-extraction error influences the final three-dimensional coordinates is therefore important for improving instrument measuring precision and guaranteeing part quality.
By literature search: Chinese invention patent CN 104729534 A, "Monocular vision error measuring system and limit-error quantization method of a cooperative target" by Tan Qimeng, Li Jingdong, Hu Chengwei et al., proposes a monocular-vision error measuring system and analyzes camera intrinsic calibration error, the acquisition error of a visual marker point's three-dimensional coordinates, and the marker's two-dimensional positioning error in the image. It can identify the sources of vision measurement error, which is significant for error tracing, but it does not specify the quantitative relationship between these errors and the three-dimensional coordinate error of the final point. Chinese invention patent CN 106323337 A, "An error analysis method of a stereoscopic-vision relative measurement system" by Liu Zongming, Zhang Yu, Cao Shuqing et al., analyzes in detail the composite error of a measured spatial three-dimensional target point in terms of image-feature extraction precision, focal-length calibration precision, and rotation/translation-matrix calibration precision, but it only analyzes the error of the vision measurement system itself and cannot provide an error analysis of the whole system when large parts are stitch-measured by vision.
Summary of the invention
To overcome the defects of the prior art, the present invention provides an error analysis method for vision-based stitching measurement. Through the solution procedure of the point coordinates, the method describes in detail the influence of pixel-extraction error on each computation step and quantitatively analyzes its influence on the precision of the final point's three-dimensional coordinates in the world coordinate system. The calculation formulas are simple and easy to implement, which is significant for analyzing and improving the precision of a vision stitching system and for system layout.
The technical solution adopted by the present invention is an error analysis method for vision-based stitching measurement, characterized in that the method performs stitching measurement with a laser tracker and a binocular vision system. Multiple common points are arranged in their shared field of view; the binocular cameras capture images and extract pixel coordinates; and the pixel-extraction error is quantitatively propagated to the coordinate error of a point in the world coordinate system. First, the influence of an extracted pixel error on the extrinsic parameter matrix is calculated from the three-dimensional reconstruction formula of a visual point and the coordinate-transformation formula; then the influence of the extrinsic-matrix error on the point's world-frame coordinate error and the influence of the point's visual-frame coordinate on its world-frame coordinate error are calculated; finally the composite error of the measured point in the world coordinate system is obtained. The method comprises the following steps:
Step 1: build the laser-tracker-based binocular vision stitching measurement system and establish the coordinate systems.
First, the left and right cameras 3, 5 are fixed on the left and right camera bearings 2, 4, which are in turn fixed on the left and right crossbeams of the tripod 1. Taking the optical center of the left camera 3 as the origin of the local (visual) coordinate system, with the u direction of the camera image plane as the x axis and the optical axis as the z axis, a right-handed coordinate system is established. The laser tracker probe 8 mounted on the laser turntable is connected, and the laser tracker coordinate system is established as the global (world) coordinate system. Multiple common points 7 are arranged on the measured object 6 within the shared field of view; the binocular cameras capture images and extract pixel coordinates, and each point's coordinates in the visual coordinate system are obtained from the binocular reconstruction formula. The laser tracker simultaneously acquires the coordinates of each common point, which are expressed in the world coordinate system, and from these the transformation matrix from the visual to the world coordinate system is solved.
Step 2: compute the random error of the visual-to-world coordinate transformation matrix.
1) First compute the error of a point's three-dimensional coordinates in the visual coordinate system caused by its pixel-extraction error.
From the known left- and right-camera pixel coordinates (u1, v1) and (u2, v2) and the calibration results, the point's three-dimensional coordinates (xv, yv, zv) are computed:

$$\begin{cases} x_v = z_v \cdot X_1 \\ y_v = z_v \cdot Y_1 \\ z_v = \dfrac{t_{34}\,Y_2 - t_{24}}{t_{21}\,X_1 + t_{22}\,Y_1 + t_{23} - Y_2\,(t_{31}\,X_1 + t_{32}\,Y_1 + t_{33})} \end{cases} \tag{1}$$

where $X_1$, $Y_1$ and $Y_2$ are the normalized image coordinates obtained from the pixel coordinates, the principal points $(u_{01}, v_{01})$ of the left camera and $(u_{02}, v_{02})$ of the right camera, and the equivalent focal lengths $f_{x1}$, $f_{y1}$ (left) and $f_{x2}$, $f_{y2}$ (right); $t_{21}, \ldots, t_{34}$ are parameters of the transformation matrix from the left camera to the right camera. According to this three-dimensional reconstruction formula, an error in the extracted pixel position propagates through formula (1) onto the point's three-dimensional coordinates in the visual coordinate system. Let the extraction errors in the $u_1$, $v_1$, $u_2$ and $v_2$ directions be $\sigma_{u_1}$, $\sigma_{v_1}$, $\sigma_{u_2}$, $\sigma_{v_2}$ and mutually independent; the covariance matrix of the visual-frame coordinates with respect to the pixel-extraction error is then

$$C_{xyz} = J \cdot D_{uv} \cdot J^T \tag{2}$$

where $J$ is the Jacobian of the visual-frame coordinates with respect to the pixel coordinates and $D_{uv}$ is the (diagonal) covariance matrix of the pixel error itself.
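As an illustrative sketch of formulas (1) and (2), the propagation from pixel error to visual-frame covariance can be computed numerically. All calibration numbers below are placeholders, not the patent's values, and the Jacobian J is approximated by central differences rather than derived symbolically:

```python
import numpy as np

def triangulate(u1, v1, u2, v2, cam):
    """Reconstruct (xv, yv, zv) following the structure of formula (1).
    `cam` holds principal points, equivalent focal lengths and the
    left-to-right transform entries t21..t34 (placeholder values here).
    Note formula (1) uses the right image only through Y2 (i.e. v2)."""
    X1 = (u1 - cam['u01']) / cam['fx1']
    Y1 = (v1 - cam['v01']) / cam['fy1']
    Y2 = (v2 - cam['v02']) / cam['fy2']
    t = cam['t']
    zv = (t['t34'] * Y2 - t['t24']) / (
        t['t21'] * X1 + t['t22'] * Y1 + t['t23']
        - Y2 * (t['t31'] * X1 + t['t32'] * Y1 + t['t33']))
    return np.array([zv * X1, zv * Y1, zv])

def pixel_covariance(uv, cam, sigma=0.5, h=1e-4):
    """C_xyz = J * D_uv * J^T (formula (2)), J by central differences."""
    J = np.zeros((3, 4))
    for k in range(4):
        p, m = list(uv), list(uv)
        p[k] += h
        m[k] -= h
        J[:, k] = (triangulate(*p, cam) - triangulate(*m, cam)) / (2 * h)
    D_uv = np.eye(4) * sigma ** 2  # independent, equal pixel errors
    return J @ D_uv @ J.T

# Placeholder calibration, for demonstration only
cam = {'u01': 2048.0, 'v01': 1536.0, 'u02': 2048.0, 'v02': 1536.0,
       'fx1': 6400.0, 'fy1': 6400.0, 'fx2': 6400.0, 'fy2': 6400.0,
       't': {'t21': 0.0, 't22': 1.0, 't23': 0.0, 't24': -50.0,
             't31': 0.3, 't32': 0.0, 't33': 0.95, 't34': 3000.0}}

C_xyz = pixel_covariance((2500.0, 1400.0, 2300.0, 1380.0), cam)
```

By construction C_xyz is symmetric positive semi-definite; its diagonal gives the variances of x_v, y_v, z_v under a 0.5-pixel extraction error.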
2) Compute the influence of the common-point coordinate errors in the visual coordinate system on the transformation matrix.
After the common points' visual-frame coordinates are obtained, the transformation from the visual to the world coordinate system must be solved using at least three points. Take $n$ ($n \ge 3$, $n \in N$) common points with coordinates $Pv_i$ in the visual coordinate system and $Pw_i$ in the world coordinate system. The transformation between the two coordinate systems is

$$Pw_i = R_v^w \cdot Pv_i + T_v^w \tag{3}$$

where $R_v^w$ is the rotation matrix from the visual to the world coordinate system, parameterized by the rotation angles ψ, θ and φ about the x, y and z axes, and $T_v^w = [t_1, t_2, t_3]^T$ is the translation matrix; the two matrices share six unknown parameters. The formula can be rewritten as

$$\Delta_i = Pw_i - (R_v^w \cdot Pv_i + T_v^w) \tag{4}$$

Because the measurement of the common points has error, $\Delta_i \ne 0$. To obtain the optimal transformation, each $\Delta_i$ should be minimized. Let RT denote the transformation-parameter vector and PC the vector of common-point coordinates in the visual coordinate system; the transformation is then obtained from the objective function

$$f(RT, PC) = \sum_{i=1}^{n} \Delta_i^T \cdot \Delta_i = \min \tag{5}$$

At the minimum the derivative with respect to the transformation parameters is zero:

$$Df(RT, PC) = \frac{\partial f(RT, PC)}{\partial RT} = 0 \tag{6}$$

This is an implicit function of RT and PC, that is, a functional relation RT = f(PC) exists between them. By the implicit-function differentiation rule,

$$G = \frac{\partial RT}{\partial PC} = -\left(\frac{\partial Df(RT, PC)}{\partial RT}\right)^{-1} \cdot \frac{\partial Df(RT, PC)}{\partial PC} \tag{7}$$

The covariance matrix of the extrinsic parameters RT with respect to the common-point extraction errors is then

$$C_{RT} = G \cdot C_{xyz} \cdot G^T \tag{8}$$

where G is the partial derivative of RT with respect to PC.
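A numerical realisation of this step can be sketched as follows. The SVD-based (Kabsch) fit below is a standard closed-form stand-in for minimising the objective f(RT, PC), not the patent's own solver; the Jacobian G = ∂RT/∂PC of formula (7) is approximated by perturbing each common-point coordinate and re-fitting, and the x-then-y-then-z Euler convention is an assumption:

```python
import numpy as np

def fit_transform(Pv, Pw):
    """Least-squares R, t with Pw ~ R*Pv + t (Kabsch/SVD solution of the
    objective f(RT, PC)). Returns RT = (psi, theta, phi, t1, t2, t3),
    assuming R = Rz(phi)*Ry(theta)*Rx(psi), away from gimbal lock."""
    cv, cw = Pv.mean(axis=0), Pw.mean(axis=0)
    H = (Pv - cv).T @ (Pw - cw)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T               # proper rotation (det = +1)
    t = cw - R @ cv
    theta = -np.arcsin(R[2, 0])
    psi = np.arctan2(R[2, 1], R[2, 2])
    phi = np.arctan2(R[1, 0], R[0, 0])
    return np.array([psi, theta, phi, t[0], t[1], t[2]])

def transform_covariance(Pv, Pw, C_xyz_list, h=1e-5):
    """C_RT = G * C_PC * G^T (formula (8)); G = dRT/dPC is approximated
    by re-fitting after perturbing each visual-frame coordinate, and
    C_PC stacks the per-point covariances block-diagonally."""
    n = Pv.shape[0]
    G = np.zeros((6, 3 * n))
    for i in range(n):
        for j in range(3):
            Pp, Pm = Pv.copy(), Pv.copy()
            Pp[i, j] += h
            Pm[i, j] -= h
            G[:, 3 * i + j] = (fit_transform(Pp, Pw)
                               - fit_transform(Pm, Pw)) / (2 * h)
    C_PC = np.zeros((3 * n, 3 * n))
    for i, C in enumerate(C_xyz_list):
        C_PC[3 * i:3 * i + 3, 3 * i:3 * i + 3] = C
    return G @ C_PC @ G.T
```

With exact correspondences the fit recovers the true angles and translation; with noisy common points, the returned 6×6 matrix plays the role of C_RT in the steps that follow.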
Step 3: compute the error of the measured point's coordinates in the world coordinate system.
After the extrinsic matrix is obtained, the measured point's coordinates in the world coordinate system are obtained from its visual-frame coordinates by

$$Pwm = R_v^w \cdot Pvm + T_v^w \tag{9}$$

where $Pwm = [x_{wm}, y_{wm}, z_{wm}]^T$ is the measured point's three-dimensional coordinate in the world coordinate system and $Pvm = [x_{vm}, y_{vm}, z_{vm}]^T$ its three-dimensional coordinate in the visual coordinate system. Since the visual-frame coordinate Pvm and the transformation matrices $R_v^w$, $T_v^w$ both carry errors, and the two are mutually independent, their contributions to the measured point's world-frame coordinate error are computed separately.
1) Error caused in the measured point's world-frame coordinates by the extrinsic-matrix error.
Differentiate formula (9) with respect to the extrinsic parameters ψ, θ, φ, t1, t2, t3:

$$H = \frac{\partial Pwm}{\partial RT} = \begin{bmatrix} \dfrac{\partial Pwm}{\partial \psi} & \dfrac{\partial Pwm}{\partial \theta} & \dfrac{\partial Pwm}{\partial \phi} & \dfrac{\partial Pwm}{\partial t_1} & \dfrac{\partial Pwm}{\partial t_2} & \dfrac{\partial Pwm}{\partial t_3} \end{bmatrix}_{3 \times 6} \tag{10}$$

where H is the partial derivative of the world-frame coordinates with respect to the extrinsic parameters. The covariance matrix of the measured point's world-frame coordinates with respect to the extrinsic parameter matrix is

$$C_{PRT} = H \cdot C_{RT} \cdot H^T \tag{11}$$

Therefore, the error of the measured point's world-frame coordinates with respect to the extrinsic parameters is expressed as

$$E_{RT} = \mathrm{trace}(C_{PRT}) \tag{12}$$

where $\mathrm{trace}(C_{PRT})$ denotes the trace of the matrix $C_{PRT}$.
2) Error caused in the measured point's world-frame coordinates by the pixel-extraction error of its visual-frame coordinates.
Differentiate formula (9) with respect to the measured point's visual-frame coordinate $Pvm = [x_{vm}, y_{vm}, z_{vm}]^T$:

$$L = \frac{\partial Pwm}{\partial Pvm} = R_v^w \tag{13}$$

The covariance matrix of the measured point's world-frame coordinates with respect to its visual-frame coordinates is then

$$C_{Pvm} = L \cdot C_{xyz} \cdot L^T \tag{14}$$

Therefore, the error of the measured point's world-frame coordinates with respect to its visual-frame coordinates is expressed as

$$E_{Pvm} = \mathrm{trace}(C_{Pvm}) \tag{15}$$

The composite error EP is the sum of the measured point's world-frame coordinate error with respect to the extrinsic parameter matrix and with respect to its visual-frame coordinate, that is,

$$EP = E_{RT} + E_{Pvm} \tag{16}$$
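The final assembly of formulas (11) to (16) amounts to two congruence transforms and two traces. A minimal sketch, taking H, C_RT and C_xyz as given (in practice they would come from the previous steps):

```python
import numpy as np

def composite_error(H, C_RT, R_vw, C_xyz):
    """EP = E_RT + E_Pvm per formulas (11)-(16).
    H:     3x6 derivative of Pwm w.r.t. the extrinsic parameters (formula (10))
    C_RT:  6x6 extrinsic covariance (formula (8))
    R_vw:  visual-to-world rotation; L = dPwm/dPvm = R_vw (formula (13))
    C_xyz: 3x3 visual-frame covariance of the measured point (formula (2))."""
    C_PRT = H @ C_RT @ H.T             # formula (11)
    E_RT = np.trace(C_PRT)             # formula (12)
    C_Pvm = R_vw @ C_xyz @ R_vw.T      # formulas (13)-(14)
    E_Pvm = np.trace(C_Pvm)            # formula (15)
    return E_RT + E_Pvm, E_RT, E_Pvm   # formula (16)
```

Because R_vw is orthogonal, E_Pvm equals trace(C_xyz): a pure rotation redistributes the coordinate variance among axes but does not change its total.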
The beneficial effect of the invention is that it quantitatively analyzes the influence of pixel-extraction error on a point's coordinates in the visual and world coordinate systems and on the transformation-matrix error, accurately expressing the precision of points at different locations. The analysis procedure is simple and the error propagation chain is clear; the layout of the common points can be optimized according to the error analysis, improving the overall precision of the measuring system.
Brief description of the drawings
Accompanying drawing 1 is the schematic diagram of the vision-based stitching measurement system, in which: 1 - tripod; 2 - left camera bearing; 3 - left camera; 4 - right camera bearing; 5 - right camera; 6 - measured object; 7 - common points; 8 - laser tracker probe; $O_W X_W Y_W Z_W$ - world coordinate system; $O_V X_V Y_V Z_V$ - local (visual) coordinate system; $R_v^w$ - rotation matrix from the visual to the world coordinate system; $T_v^w$ - translation matrix from the visual to the world coordinate system.
Accompanying drawing 2 is the flow chart of the vision-based stitching measurement error analysis.
Embodiment
The embodiments of the present invention are described in detail below in combination with the accompanying drawings and the technical scheme.
Embodiment 1. First, the laser-tracker-based binocular vision stitching measurement system shown in Figure 1 is built. The laser tracker probe 8 is a Leica AT960 MR with a measurement range of 1-20 m. The left camera 3 and right camera 5 are VC-12MC-M cameras with a resolution of 3072×4096 and a maximum frame rate of 60 Hz. The binocular vision system is calibrated with a checkerboard; the obtained camera calibration parameters are: left-camera principal point u01 = 2140.397824, v01 = 1510.250152 and equivalent focal lengths fx1 = 6447.987913, fy1 = 6454.015281; right-camera principal point u02 = 2124.090030, v02 = 1526.184441 and equivalent focal lengths fx2 = 6417.044403, fy2 = 6420.363610; the transformation matrix from the left camera to the right camera is also obtained. Nine common points are arranged in the shared field of view of the laser tracker and the vision measurement system; the vision system captures the common-point images and the image of an arbitrary point, and the laser tracker measures the common points' coordinates in the world coordinate system. The error of the arbitrary point is then analyzed as follows:
Step 1: compute the random error of the visual-to-world coordinate transformation matrix.
1) First compute the error of each point's three-dimensional coordinates in the visual coordinate system caused by pixel-extraction error. Pixel extraction is performed on the captured images to obtain each point's left and right pixel coordinates, and formula (1) gives the points' coordinates in the visual coordinate system. According to the experimental conditions, the pixel-extraction errors are set equal, at 0.5 pixel, and formula (2) then gives each point's covariance matrix in the visual coordinate system.
2) Compute the influence of the common-point coordinate errors in the visual coordinate system on the transformation matrix. Nine common points are taken, with coordinates expressed in the visual and world coordinate systems. The least-squares objective f(RT, PC) is established according to formula (5) and solved, giving ψ = 0.580206, θ = 0.306834, φ = -2.343679, t1 = 2982.129371, t2 = 330.777454, t3 = -1482.893136, from which the visual-to-world transformation matrices $R_v^w$ and $T_v^w$ follow. Keeping all parameters in formula (6), the implicit function RT = f(PC) is established; the derivative of RT with respect to PC is obtained from the implicit-function rule, formula (7); and substituting all known values into formula (8) yields the covariance matrix of the extrinsic parameters RT.
Step 2: compute the measured point's coordinate error in the world coordinate system. After the extrinsic covariance matrix is obtained, the influences of the extrinsic parameter matrix and of the measured point's visual-frame coordinates on its world-frame coordinates are computed separately.
1) Error caused in the measured point's world-frame coordinates by the extrinsic-matrix error.
The measured point is imaged and its pixel coordinates extracted: (u1, v1) = (3215.06, 434.58) and (u2, v2) = (2895.27, 375.02). Substituting into formula (11) gives the covariance matrix of the measured point's world-frame coordinates with respect to the extrinsic parameter matrix; taking its trace gives the corresponding error, E_RT = 0.1221.
2) Error caused in the measured point's world-frame coordinates by the pixel-extraction error of its visual-frame coordinates.
Substituting the measured point's pixel coordinates into formula (2) gives the error matrix of its coordinates in the visual coordinate system; substituting this, together with the extrinsic parameter matrix, into formula (14) gives the covariance matrix of the measured point's world-frame coordinates with respect to its visual-frame coordinates; taking its trace gives E_Pvm = 0.0571.
Finally, the two errors are summed according to formula (16), giving the composite error of the measured point, EP = 0.1792.
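The embodiment's figures can be checked directly against formula (16): the two reported error terms sum to the reported composite error.

```python
E_RT, E_Pvm = 0.1221, 0.0571  # values reported in the embodiment
EP = E_RT + E_Pvm             # formula (16)
print(round(EP, 4))           # prints 0.1792
```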
The embodiment shows that the error of any measured point's world-frame coordinates caused by pixel-extraction error can be obtained, quantifying the precision of the measured point, which is significant for laying out the common points and improving system precision.

Claims (1)

1. An error analysis method for vision-based stitching measurement, characterized in that the method performs stitching measurement with a laser tracker and a binocular vision system; a binocular vision system is first built and multiple common points are arranged in its shared field of view; the binocular cameras capture images and extract pixel coordinates; the pixel-extraction error is quantitatively propagated to the coordinate error of a point in the world coordinate system; first, the influence of an extracted pixel error on the extrinsic parameter matrix is calculated from the three-dimensional reconstruction formula of a visual point and the coordinate-transformation formula; then the influence of the extrinsic-matrix error on the point's world-frame coordinate error and the influence of the point's visual-frame coordinate on its world-frame coordinate error are calculated; finally, the composite error of the measured point in the world coordinate system is obtained; the method comprises the following steps:
Step 1: build the laser-tracker-based binocular vision stitching measurement system and establish the coordinate systems;
first, the left and right cameras (3, 5) are fixed on the left and right camera bearings (2, 4), which are in turn fixed on the left and right crossbeams of the tripod (1); taking the optical center of the left camera (3) as the origin of the visual coordinate system, with the u direction of the camera image plane as the x axis and the optical axis as the z axis, the visual coordinate system $O_V X_V Y_V Z_V$ is established; the laser tracker probe (8) mounted on the laser turntable is connected, and the laser tracker coordinate system is established as the world coordinate system $O_W X_W Y_W Z_W$; multiple common points (7) are arranged on the measured object (6) within the shared field of view; the binocular cameras capture images and extract pixel coordinates, and each point's coordinates in the visual coordinate system are obtained from the binocular reconstruction formula; the laser tracker simultaneously acquires the coordinates of each common point, which are expressed in the world coordinate system, and the transformation matrix from the visual to the world coordinate system is solved by least squares;
Step 2: compute the random error of the visual-to-world coordinate transformation matrix;
1) first compute the error of a point's three-dimensional coordinates in the visual coordinate system caused by its pixel-extraction error;
from the known left- and right-camera pixel coordinates (u1, v1) and (u2, v2) and the calibration results, the point's three-dimensional coordinates (xv, yv, zv) are computed:

$$\begin{cases} x_v = z_v \cdot X_1 \\ y_v = z_v \cdot Y_1 \\ z_v = \dfrac{t_{34}\,Y_2 - t_{24}}{t_{21}\,X_1 + t_{22}\,Y_1 + t_{23} - Y_2\,(t_{31}\,X_1 + t_{32}\,Y_1 + t_{33})} \end{cases} \tag{1}$$

where $X_1$, $Y_1$ and $Y_2$ are the normalized image coordinates obtained from the pixel coordinates, the principal points $(u_{01}, v_{01})$ of the left camera and $(u_{02}, v_{02})$ of the right camera, and the equivalent focal lengths $f_{x1}$, $f_{y1}$ (left) and $f_{x2}$, $f_{y2}$ (right); $t_{21}, \ldots, t_{34}$ are parameters of the transformation matrix from the left camera to the right camera; according to this three-dimensional reconstruction formula, an error in the extracted pixel position propagates through formula (1) onto the point's three-dimensional coordinates in the visual coordinate system; let the extraction errors in the $u_1$, $v_1$, $u_2$ and $v_2$ directions be $\sigma_{u_1}$, $\sigma_{v_1}$, $\sigma_{u_2}$, $\sigma_{v_2}$ and mutually independent; the covariance matrix of the visual-frame coordinates with respect to the pixel-extraction error is then

$$C_{xyz} = J \cdot D_{uv} \cdot J^T \tag{2}$$

where $J$ is the Jacobian of the visual-frame coordinates with respect to the pixel coordinates and $D_{uv}$ is the (diagonal) covariance matrix of the pixel error itself;
2) compute the influence of the common-point coordinate errors in the visual coordinate system on the transformation matrix;
after the common points' visual-frame coordinates are obtained, the transformation matrix from the visual to the world coordinate system must be solved using at least three points; take $n$ ($n \ge 3$, $n \in N$) common points with coordinates $Pv_i$ in the visual coordinate system and $Pw_i$ in the world coordinate system; the transformation between the two coordinate systems is

$$Pw_i = R_v^w \cdot Pv_i + T_v^w \tag{3}$$

where $R_v^w$ is the rotation matrix from the visual to the world coordinate system, parameterized by the rotation angles ψ, θ and φ about the x, y and z axes, and $T_v^w = [t_1, t_2, t_3]^T$ is the translation matrix; the two matrices share six unknown parameters, and the formula can be rewritten as

$$\Delta_i = Pw_i - (R_v^w \cdot Pv_i + T_v^w) \tag{4}$$

because the measurement of the common points has error, $\Delta_i \ne 0$; to obtain the optimal transformation, each $\Delta_i$ should be minimized; let RT denote the transformation-parameter vector and PC the vector of common-point coordinates in the visual coordinate system; the transformation is then obtained from the objective function

$$f(RT, PC) = \sum_{i=1}^{n} \Delta_i^T \cdot \Delta_i = \min \tag{5}$$

at the minimum the derivative with respect to the transformation parameters is zero:

$$Df(RT, PC) = \frac{\partial f(RT, PC)}{\partial RT} = 0 \tag{6}$$

this is an implicit function of RT and PC, that is, a functional relation RT = f(PC) exists between them; by the implicit-function differentiation rule,

$$G = \frac{\partial RT}{\partial PC} = -\left(\frac{\partial Df(RT, PC)}{\partial RT}\right)^{-1} \cdot \frac{\partial Df(RT, PC)}{\partial PC} \tag{7}$$

the covariance matrix of the extrinsic parameters RT with respect to the common-point extraction errors is then

$$C_{RT} = G \cdot C_{xyz} \cdot G^T \tag{8}$$

where G is the partial derivative of RT with respect to PC;
The error calculation of 3rd step tested point coordinate value under world coordinate system
After trying to achieve outer ginseng matrix, tested point can be tried to achieve under visual coordinate system by following formula in the coordinate of world coordinate system Value:
P_{wm} = R_v^w \cdot P_{vm} + T_v^w    (9)
where P_{wm} is the three-dimensional coordinate of the measured point in the world coordinate system, P_{wm} = [x_{wm}, y_{wm}, z_{wm}]^T, and P_{vm} is its three-dimensional coordinate in the vision coordinate system, P_{vm} = [x_{vm}, y_{vm}, z_{vm}]^T. Since both the coordinate P_{vm} in the vision coordinate system and the transformation matrix [R_v^w, T_v^w] contain errors, and these errors are mutually independent, their contributions to the error of the measured-point coordinate values in the world coordinate system should be calculated separately;
1) Error in the world coordinates of the measured point caused by the extrinsic parameter matrix
Differentiating formula (9) with respect to the parameters [ψ, θ, φ, t_1, t_2, t_3] of the extrinsic matrix:
H = \partial P_{wm} / \partial RT = [\partial P_{wm}/\partial\psi \;\; \partial P_{wm}/\partial\theta \;\; \partial P_{wm}/\partial\phi \;\; \partial P_{wm}/\partial t_1 \;\; \partial P_{wm}/\partial t_2 \;\; \partial P_{wm}/\partial t_3]_{3\times 6}    (10)
where H is the partial derivative of the measured-point coordinates in the world coordinate system with respect to the extrinsic parameters. The covariance matrix of the measured-point world coordinates with respect to the extrinsic parameter matrix is then:
C_{PRT} = H \cdot C_{RT} \cdot H^T    (11)
Therefore, the error of the measured-point coordinates in the world coordinate system with respect to the extrinsic parameters is expressed as:
E_{RT} = trace(C_{PRT})    (12)
where trace(C_{PRT}) denotes the trace of the matrix C_{PRT};
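Eqs. (10)-(12) can be sketched with a finite-difference Jacobian. The Z-Y-X Euler convention, the extrinsic values, and the pose covariance below are all assumptions for illustration; the patent does not fix a rotation convention:

```python
import numpy as np

def rot(psi, theta, phi):
    """Rotation matrix from Z-Y-X Euler angles (assumed convention)."""
    cz, sz = np.cos(psi), np.sin(psi)
    cy, sy = np.cos(theta), np.sin(theta)
    cx, sx = np.cos(phi), np.sin(phi)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    return Rz @ Ry @ Rx

def pwm(rt, pvm):
    """Eq. (9): map a point from the vision frame to the world frame."""
    return rot(*rt[:3]) @ pvm + rt[3:]

def jacobian_H(rt, pvm, eps=1e-7):
    """Central-difference 3x6 Jacobian of Pwm w.r.t. RT, Eq. (10)."""
    H = np.zeros((3, 6))
    for j in range(6):
        d = np.zeros(6)
        d[j] = eps
        H[:, j] = (pwm(rt + d, pvm) - pwm(rt - d, pvm)) / (2 * eps)
    return H

rt = np.array([0.1, -0.2, 0.3, 10.0, 5.0, 2.0])  # hypothetical extrinsics
pvm = np.array([1.0, 2.0, 3.0])                  # measured point, vision frame
C_RT = 1e-6 * np.eye(6)                          # hypothetical pose covariance
H = jacobian_H(rt, pvm)
E_RT = np.trace(H @ C_RT @ H.T)                  # Eqs. (11)-(12)
print(E_RT)
```

The translation columns of H are exactly the 3x3 identity, since Eq. (9) is linear in T_v^w; only the three angle columns depend on the point.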
2) Error in the world coordinates of the measured point caused by the three-dimensional coordinate error in the vision coordinate system due to pixel extraction
Differentiating formula (9) with respect to the three-dimensional coordinate P_{vm} = [x_{vm}, y_{vm}, z_{vm}]^T of the measured point in the vision coordinate system:
L = \partial P_{wm} / \partial P_{vm} = [\partial P_{wm}/\partial x_{vm} \;\; \partial P_{wm}/\partial y_{vm} \;\; \partial P_{wm}/\partial z_{vm}]_{3\times 3}^{T}    (13)
The covariance matrix of the coordinates of the measured point in the world coordinate system with respect to its three-dimensional coordinates in the vision coordinate system is then expressed as:
C_{Pvm} = L \cdot C_{xyz} \cdot L^T    (14)
Therefore, the error of the coordinates of the measured point in the world coordinate system with respect to its coordinates in the vision coordinate system is expressed as:
E_{Pvm} = trace(C_{Pvm})    (15)
The composite error EP is the sum of the above two errors of the measured-point coordinates in the world coordinate system, i.e. the error with respect to the extrinsic parameter matrix and the error with respect to the vision-frame coordinates:
EP = E_{RT} + E_{Pvm}    (16).
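Because Eq. (9) is linear in P_{vm}, the Jacobian L of Eq. (13) is simply the rotation matrix R_v^w. A minimal sketch of Eqs. (14)-(16) follows; the rotation, the point covariance, and the E_RT value are illustrative assumptions:

```python
import numpy as np

# Composite error, Eq. (16): the two independent contributions are summed.
R = np.array([[0., -1., 0.],
              [1.,  0., 0.],
              [0.,  0., 1.]])          # R_v^w: 90-degree rotation about z
C_xyz = np.diag([1e-4, 1e-4, 4e-4])    # assumed vision-frame point covariance

L_jac = R                              # Eq. (13): L = dPwm/dPvm = R_v^w
C_Pvm = L_jac @ C_xyz @ L_jac.T        # Eq. (14)
E_Pvm = np.trace(C_Pvm)                # Eq. (15)

E_RT = 2.5e-4                          # hypothetical extrinsic term, Eq. (12)
EP = E_RT + E_Pvm                      # Eq. (16)
print(EP)
```

Note that a rotation is orthogonal, so the trace of C_Pvm equals the trace of C_xyz: rotating the frame redistributes the point error between axes but does not change its total magnitude.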
CN201710853804.XA 2017-09-20 2017-09-20 An error analysis method for vision-based stitching measurement Active CN107726975B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710853804.XA CN107726975B (en) 2017-09-20 2017-09-20 An error analysis method for vision-based stitching measurement


Publications (2)

Publication Number Publication Date
CN107726975A true CN107726975A (en) 2018-02-23
CN107726975B CN107726975B (en) 2019-05-14

Family

ID=61206703

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710853804.XA Active CN107726975B (en) 2017-09-20 2017-09-20 An error analysis method for vision-based stitching measurement

Country Status (1)

Country Link
CN (1) CN107726975B (en)


Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102155923A (en) * 2011-03-17 2011-08-17 北京信息科技大学 Splicing measuring method and system based on three-dimensional target
KR20110132835A (en) * 2010-06-03 2011-12-09 한국전자통신연구원 Method and apparatus contrasting image through perspective distortion correction
US20120082340A1 (en) * 2006-09-25 2012-04-05 Sri International System and method for providing mobile range sensing
CN103247053A (en) * 2013-05-16 2013-08-14 大连理工大学 Accurate part positioning method based on binocular microscopy stereo vision
CN103471618A (en) * 2013-09-22 2013-12-25 电子科技大学 Coordinate error determination method for image acquisition device of visual inspection system
CN104457569A (en) * 2014-11-27 2015-03-25 大连理工大学 Geometric parameter visual measurement method for large composite board
CN104729534A (en) * 2015-03-12 2015-06-24 北京空间飞行器总体设计部 Monocular visual error measurement system for cooperative target and error limit quantification method
CN105716542A (en) * 2016-04-07 2016-06-29 大连理工大学 Method for three-dimensional data registration based on flexible feature points
CN105957018A (en) * 2016-07-15 2016-09-21 武汉大学 Unmanned aerial vehicle image filtering frequency division jointing method
CN106323337A (en) * 2016-08-02 2017-01-11 上海航天控制技术研究所 Stereoscopic-vision relative-measurement-system error analysis method
CN106412497A (en) * 2016-08-30 2017-02-15 中国南方电网有限责任公司 Binocular vision stereo matching method based on panoramic mosaic staring technique
CN107067473A (en) * 2015-12-31 2017-08-18 达索系统公司 3D modeling object is reconstructed


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
刘佳音 et al., "An error analysis method for a binocular stereo vision system", Optical Technique (《光学技术》) *
孟环标 et al., "Error analysis of a measurement system based on binocular vision", Proceedings of the 5th National Conference on Geometric Design and Computing (《第五届全国几何设计与计算学术会议论文集》) *
李晓峰 et al., "A teaching discussion on three-dimensional positioning error analysis in stereo vision", Journal of Electrical and Electronic Education (《电气电子教学学报》) *
董明伦 et al., "The transformation between apparent geographic coordinates and pixel rectangular coordinates of geostationary meteorological satellites and its error analysis", Marine Forecasts (《海洋预报》) *

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108332708A (en) * 2018-03-29 2018-07-27 苏州凌创瑞地测控技术有限公司 Laser leveler automatic checkout system and detection method
CN108332708B (en) * 2018-03-29 2023-09-05 苏州瑞地测控技术有限公司 Automatic detection system and detection method for laser level meter
CN109085561A (en) * 2018-07-08 2018-12-25 河北数冶科技有限公司 Three-dimensional laser radar measuring system and scaling method
CN109993802A (en) * 2019-04-03 2019-07-09 浙江工业大学 A kind of Hybrid camera scaling method in urban environment
CN110720983B (en) * 2019-09-05 2021-05-25 北京万特福医疗器械有限公司 Visual identification method and system
CN110720983A (en) * 2019-09-05 2020-01-24 北京万特福医疗器械有限公司 Visual identification method and system
CN110966937A (en) * 2019-12-18 2020-04-07 哈尔滨工业大学 Large member three-dimensional configuration splicing method based on laser vision sensing
CN110966937B (en) * 2019-12-18 2021-03-09 哈尔滨工业大学 Large member three-dimensional configuration splicing method based on laser vision sensing
CN112051160B (en) * 2020-09-09 2022-04-19 中山大学 Segment joint bending stiffness measuring method, system, equipment and storage medium
CN112051160A (en) * 2020-09-09 2020-12-08 中山大学 Segment joint bending stiffness measuring method, system, equipment and storage medium
CN112762825A (en) * 2020-12-24 2021-05-07 复旦大学 Matrix spectrum radius method for representing three-dimensional coordinate reconstruction error of photogrammetric system
CN112907727A (en) * 2021-01-25 2021-06-04 中国科学院空天信息创新研究院 Calibration method, device and system of relative transformation matrix
CN112907727B (en) * 2021-01-25 2023-09-01 中国科学院空天信息创新研究院 Calibration method, device and system of relative transformation matrix
CN114565714A (en) * 2022-02-11 2022-05-31 山西支点科技有限公司 Monocular vision sensor hybrid high-precision three-dimensional structure recovery method
CN114565714B (en) * 2022-02-11 2023-05-23 山西支点科技有限公司 Monocular vision sensor hybrid high-precision three-dimensional structure recovery method
CN115690205A (en) * 2022-10-09 2023-02-03 北京自动化控制设备研究所 Visual relative pose measurement error estimation method based on point-line comprehensive characteristics
CN115690205B (en) * 2022-10-09 2023-12-05 北京自动化控制设备研究所 Visual relative pose measurement error estimation method based on point-line comprehensive characteristics

Also Published As

Publication number Publication date
CN107726975B (en) 2019-05-14

Similar Documents

Publication Publication Date Title
CN107726975A (en) A kind of error analysis method of view-based access control model stitching measure
CN111414798B (en) Head posture detection method and system based on RGB-D image
CN103162622B (en) The Portable ball target of single camera vision system and use thereof and measuring method thereof
CN103578117B (en) Determine the photographic head method relative to the attitude of environment
CN103971353B (en) Splicing method for measuring image data with large forgings assisted by lasers
CN101699313B (en) Method and system for calibrating external parameters based on camera and three-dimensional laser radar
CN106548462B (en) Non-linear SAR image geometric correction method based on thin-plate spline interpolation
CN110296691A (en) Merge the binocular stereo vision measurement method and system of IMU calibration
CN109993800A (en) A kind of detection method of workpiece size, device and storage medium
CN104634248B (en) Revolving shaft calibration method under binocular vision
CN106408609A (en) Parallel mechanism end motion pose detection method based on binocular vision
CN103559711A (en) Motion estimation method based on image features and three-dimensional information of three-dimensional visual system
CN104835144A (en) Solving camera intrinsic parameter by using image of center of sphere and orthogonality
CN103759669A (en) Monocular vision measuring method for large parts
CN110517325A (en) The vehicle body surrounding objects localization method and system of a kind of coordinate transform and coordinate transform
CN109523595A (en) A kind of architectural engineering straight line corner angle spacing vision measuring method
CN110009667A (en) Multi-viewpoint cloud global registration method based on Douglas Rodríguez transformation
CN110044374A (en) A kind of method and odometer of the monocular vision measurement mileage based on characteristics of image
CN109448043A (en) Standing tree height extracting method under plane restriction
CN104268880A (en) Depth information obtaining method based on combination of features and region matching
CN110264527A (en) Real-time binocular stereo vision output method based on ZYNQ
Li et al. 3D triangulation based extrinsic calibration between a stereo vision system and a LIDAR
CN110196031A (en) A kind of scaling method of three-dimensional point cloud acquisition system
CN106500625A (en) A kind of telecentricity stereo vision measuring apparatus and its method for being applied to the measurement of object dimensional pattern micron accuracies
CN109584157A (en) Object plane degree measurement method and device, storage medium, electronic metering equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant