CN107977997A - A camera self-calibration method combining lidar three-dimensional point cloud data


Info

Publication number
CN107977997A (application CN201711225701.5A)
Authority
CN
China
Prior art keywords
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201711225701.5A
Other languages
Chinese (zh)
Other versions
CN107977997B (en)
Inventor
李小路
徐立军
冯静
刘畅
Current Assignee
Beihang University
Original Assignee
Beihang University
Priority date
Filing date
Publication date
Application filed by Beihang University
Priority to CN201711225701.5A
Publication of CN107977997A
Application granted
Publication of CN107977997B
Legal status: Active


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85 Stereo camera calibration

Abstract

The present invention relates to the fields of photogrammetry and three-dimensional reconstruction, and discloses a camera self-calibration method combining lidar three-dimensional point cloud data. The measured object is scanned in three dimensions and images of the same scene are collected from different viewpoints; feature extraction is performed on the collected images to obtain feature points; closest-point searches are carried out on the feature points in the overlapping region of any two images, and matching point pairs are obtained with a random-consistency algorithm. The fundamental matrix F is solved from the matching point pairs in a way that makes F insensitive to Gaussian white noise; an objective function on the camera intrinsic parameters is established from the relation between the fundamental matrix and the intrinsics, and the intrinsics are solved with an optimization algorithm. The initial intrinsic values required by the optimization algorithm are configured from the lidar three-dimensional point cloud data, the pinhole imaging principle, the pixel size of the CCD camera, and the image center location. The invention achieves accurate calibration of the camera intrinsic parameters.

Description

A camera self-calibration method combining lidar three-dimensional point cloud data
Technical field
The present invention relates to the fields of photogrammetry and three-dimensional reconstruction, and in particular to a camera self-calibration method combining lidar three-dimensional point cloud data.
Background art
In photogrammetry and three-dimensional reconstruction, the camera parameters are a necessary condition for recovering three-dimensional data from two-dimensional images, and their precision directly affects the reconstruction result; how to calibrate a camera quickly, conveniently, and accurately has therefore always been a research hotspot. Camera calibration methods fall into three categories: traditional camera calibration, active-vision calibration, and camera self-calibration. Traditional calibration requires a calibration object of precisely known shape and size; the resulting precision is high, but the procedure needs human intervention and is complicated, operating errors introduce inaccuracies, and manufacturing errors of the calibration object itself strongly affect the result. In addition, because this method demands a high-precision calibration object, its cost is high. Active-vision calibration collects images while the camera motion is precisely controlled; with the camera position known, the camera model can be linearized, which simplifies the computation, and the algorithm is robust. However, the operation is extremely complex and hard to realize in practice, so its range of application is small and it is unsuitable for ordinary situations. Camera self-calibration needs no calibration object: the scene is photographed several times and the mutual constraints among the images themselves are used for calibration. It is easy to operate, flexible, and widely applicable, but its precision is low and its robustness poor.
According to a search of published patents, camera self-calibration methods fall mainly into two classes. The first class performs self-calibration with vanishing points: it requires that mutually orthogonal parallel lines be present in the collected images and solves the camera intrinsics from the perspective-geometry constraints of the orthogonal lines. The algorithm is concise and efficient, but its range of application is not wide and its precision is limited. Such patents include "A camera self-calibration method based on two vanishing points" disclosed in Chinese patent 201510708196.4. The second class performs self-calibration with the feature-point pairs and epipolar constraints among several images: matching point pairs between two images are found by block matching combined with the epipolar constraint, the fundamental matrix is solved with the simple 8-point method, and the camera intrinsics are then obtained by decomposition. The solution obtained by such a decomposition may be neither unique nor stable, so although the method is flexible, the non-uniqueness of the decomposition makes its precision low and its stability poor. Such patents include "Camera self-calibration method and device based on multiple photos" disclosed in Chinese patent 201510131895.7, which merely performs a simple decomposition of the fundamental matrix, with low precision and poor stability. The main differences between the present invention and the above patents are: lidar scanning data are combined to pre-estimate the camera parameter range; an improved fundamental-matrix solving method makes the matrix insensitive to Gaussian noise on the images; and the camera intrinsics are finally obtained not by simply decomposing the fundamental matrix but by optimizing the established objective function with an optimization algorithm.
The invention discloses a camera self-calibration method combining lidar scanning data. The measured object is scanned in three dimensions and images of the same scene are collected from different viewpoints; feature extraction is performed on the collected images to obtain feature points; closest-point searches are carried out on the feature points in the overlapping region of any two images, and matching point pairs are obtained with a random-consistency algorithm. The fundamental matrix is solved from the matching point pairs so that it is insensitive to Gaussian white noise; an objective function on the camera intrinsics is established from the relation between the fundamental matrix and the intrinsics, and the optimal solution of the objective function, found with an optimization algorithm, is the camera calibration result. During optimization, the three-dimensional coordinates obtained from lidar scanning, which represent true spatial positions, are combined with the pinhole imaging principle to estimate the camera focal length as an initialization parameter and thereby improve the stability of the parameters. Compared with traditional camera self-calibration methods, the proposed method fuses the high-precision point cloud data obtained by lidar scanning; it is easy to operate, places no special requirement on the photographed scene, and is stable and precise, with high practical value in the field of texture mapping between lidar data and images.
Summary of the invention
The present invention discloses a camera self-calibration method combining lidar three-dimensional point cloud data, which solves the problems of complicated operation in traditional camera calibration and of poor robustness and low precision in traditional self-calibration.
The purpose of the present invention is achieved through the following technical solution:
A camera self-calibration method combining lidar three-dimensional point cloud data, characterized by comprising:
Step 1: the camera is fixed above the lidar while the same scene is scanned and photographed; a three-dimensional coordinate system with the lidar position as origin is established, and the spatial coordinates (x, y, z) in this system are calculated;
Step 2: the lidar scans from a fixed position, while the camera is moved repeatedly to photograph the same scene from multiple viewpoints, with image overlap between the photos taken at different viewpoints;
Step 3: feature extraction is performed on the image overlapping regions to obtain the pixel coordinates (u, v) of the feature points of each image; the feature points of the overlapping region between any two viewpoints are matched coarsely and then finely, yielding the twice-purified matching point pairs A(u_i, v_i, 1) and A'(u'_i, v'_i, 1);
Step 4: the purified matching point pairs A(u_i, v_i, 1) and A'(u'_i, v'_i, 1) are used to solve the fundamental matrix F and the epipole e';
Step 5: an objective function on the camera intrinsics f_x, f_y, u_o, v_o is established from the fundamental matrix F, where f_x is the effective focal length in the x direction (unit: pixels), f_y the effective focal length in the y direction (unit: pixels), and (u_o, v_o) the principal point of the captured image (unit: pixels); the camera intrinsic matrix is written as

K = \begin{bmatrix} f_x & 0 & u_o \\ 0 & f_y & v_o \\ 0 & 0 & 1 \end{bmatrix}   (1)
The relation between the fundamental matrix F and the camera intrinsics f_x, f_y, u_o, v_o is

FCF^T = \lambda^2 [e']_x C [e']_x^T   (2)

where C = KK^T is a symmetric matrix, [e']_x is the antisymmetric matrix of the epipole e', and \lambda^2 is a positive constant factor;
Because \lambda^2 is a positive constant factor, each element of the left-hand matrix FCF^T of equation (2) can be divided by the corresponding element of the right-hand matrix [e']_x C [e']_x^T, and the result equals \lambda^2;
Since the matrix C is symmetric, let

C = \begin{bmatrix} c_1 & c_5 & c_3 \\ c_5 & c_2 & c_4 \\ c_3 & c_4 & 1 \end{bmatrix}   (3)

c = (c_1, c_2, c_3, c_4, c_5)   (4)

Then the matrices on the two sides of equation (2) can be expressed in the forms of formulas (5) and (6). Dividing the corresponding elements of the two matrices, let r_j^{(i)} denote the quotient of the j-th corresponding elements of (5) and (6) formed by the i-th fundamental matrix, and let \bar r^{(i)} denote the mean of all such quotients; in theory each quotient equals \lambda^2. Taking this property as the basis, the objective function f(c) shown in formula (7) is established.
Step 6: in theory, as stated in step 5, each quotient equals \lambda^2, so the objective function should ideally equal 0; an optimization algorithm is therefore used to drive its value arbitrarily close to 0, which yields the camera intrinsics f_x, f_y, u_o, v_o. During this solution process, the optimization algorithm requires initial-value bounds on the input parameters, i.e. on the camera intrinsics f_x, f_y, u_o, v_o. In the present invention, a vertex of the measured object is extracted from the spatial coordinates (x, y, z) calculated in the coordinate system with the lidar position as origin, and its distance from the ground is computed; combining this distance, the distance of the same vertex from the ground on the image, the pinhole imaging principle, and the camera pixel size, initial focal lengths f_x0, f_y0 are computed. The image center coordinates are defined as (u_o0, v_o0), and the image principal point is constrained around them. The constraint range must be reasonable, so the optimal solution is sought within ±15% around (u_o0, v_o0); the constraint range of the camera focal length is defined in the same way, likewise limited to within ±15% of the initial values. With the bounds defined above as input to the optimization algorithm, camera intrinsics of high precision and good robustness are then solved.
The feature points of the images between any two viewpoints are matched twice. The first matching selects pairs of feature points from the two overlapping image regions, with pixel coordinates (u, v), whose Euclidean distance is below a set threshold; the second matching rejects mismatches with random sample consensus, keeping the points whose distance to the epipolar line derived from the fundamental matrix F is below a set threshold, which yields the purified matching point pairs A(u_i, v_i, 1) and A'(u'_i, v'_i, 1).
The fundamental matrix F is insensitive to Gaussian white noise; it is solved from the purified matching point pairs A(u_i, v_i, 1) and A'(u'_i, v'_i, 1) as follows:
The purified matching point pairs A(u_i, v_i, 1) and A'(u'_i, v'_i, 1) are normalized in image coordinates: each set is translated and scaled so that, after the transformation, the origin lies at the image center and all feature points lie within a circle of radius √2 centered there. The transformation can be written as

B = TA,  B' = T'A'   (8)

where A and A' are the purified matching point pairs, T and T' their corresponding normalization matrices, and B and B' the normalized matching point pairs B(u_ib, v_ib, 1) and B'(u'_ib, v'_ib, 1);
An equation is established from the relation between the normalized matching point pairs B, B' and the fundamental matrix, as shown in equation (9):

B'^T F_B B = 0   (9)

Solving for the matrix F_B and denormalizing then gives the fundamental matrix \hat F = T'^T F_B T.
Taking \hat F as the initial value, the fundamental matrix F is solved by maximum-likelihood estimation over the purified matching point pairs A(u_i, v_i, 1) and A'(u'_i, v'_i, 1), which makes it insensitive to Gaussian white noise.
Using the fundamental matrix F, the epipole e' is solved from F^T e' = 0.
The objective function is essentially a function of the camera intrinsics f_x, f_y, u_o, v_o. The optimization algorithm minimizes it with the intrinsics as input; the intrinsics that minimize the objective function are the camera calibration result.
The spatial coordinates calculated in the coordinate system with the lidar position as origin are used to estimate the focal-length range, in the following steps:
Step 1: the scan data are resolved into three-dimensional point cloud coordinates in the coordinate system with the lidar position as origin;
Step 2: one vertex of the measured object is extracted and its distance from the ground is calculated;
Step 3: combining the distance of this vertex from the ground on the image plane with the pinhole imaging principle, the camera focal length is computed, and the effective focal lengths f_x0, f_y0 are obtained from the pixel size of the CCD camera used. The search is bounded within ±15% around f_x0, f_y0; the image principal point is constrained around the image center (u_o0, v_o0), with ranges [0.85u_o0, 1.15u_o0] and [0.85v_o0, 1.15v_o0]. The constraint range must be reasonable: only with a reasonable initial estimate can the optimal solution be found and camera intrinsics of high precision and good stability obtained.
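The pinhole initialization in step 3 reduces to similar triangles: an object of known height at a known range projects to an image height proportional to the focal length. A minimal sketch, with illustrative numbers and function names of our own choosing (the patent does not name them):

```python
def focal_initial_estimate(object_height_m, object_dist_m, image_height_px):
    """Pinhole-model initial focal length in pixels.

    From similar triangles: image_height_px / f_px = object_height / object_dist,
    so f_px = image_height_px * object_dist / object_height. Converting a
    focal length in millimetres to pixels via the CCD pixel size gives the
    same quantity.
    """
    return image_height_px * object_dist_m / object_height_m

def bounded_range(v, frac=0.15):
    """The +/-15% search interval used to constrain the optimizer."""
    return (1 - frac) * v, (1 + frac) * v
```

For example, an object 2 m tall at 10 m range spanning 160 pixels gives an initial focal length of 800 pixels and a search interval of roughly [680, 920].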
Brief description of the drawings
Fig. 1 is a schematic of the experimental scheme of the camera self-calibration method combining lidar three-dimensional point cloud data;
Fig. 2 is the implementation flow chart of the method;
Fig. 3 is a schematic of the three-dimensional lidar scanning and camera acquisition system used in the embodiment;
Fig. 4 is a schematic of resolving the raw lidar measurements into three-dimensional point cloud data;
Fig. 5 is a schematic of the camera photographing the same scene from multiple viewpoints;
Fig. 6 is the flow chart of feature-point extraction and matching between any two images of the same scene taken from multiple viewpoints;
Fig. 7 is the flow chart of estimating the optimization-algorithm initial values from the lidar three-dimensional point cloud data;
Fig. 8 shows the feature-point extraction result for the image captured at any viewpoint;
Fig. 9 shows the feature points of any two images after the second matching.
Embodiments
The present invention is described in further detail below with reference to the accompanying drawings and specific embodiments.
The present invention provides a camera self-calibration method combining lidar three-dimensional point cloud data, implemented on a three-dimensional lidar scanning and camera acquisition system; the experimental setup and scheme are shown in Fig. 1, and the implementation flow chart in Fig. 2.
(1) Lidar three-dimensional point cloud and image data acquisition
The camera is first fixed above the lidar while the same scene is scanned and photographed. The system, shown in Fig. 3, comprises a CCD camera and a two-dimensional lidar in an integrated system that can be dismounted at any time; the lidar mainly uses the vertical rotation of a tilted mirror to realize a two-dimensional scan, and a horizontally rotating platform to extend it to a three-dimensional scan. Since the scan yields the target range d, the horizontal angle α, and the vertical angle β, a three-dimensional coordinate system with the lidar position as origin is established (shown schematically in Fig. 4), and the spatial coordinates (x, y, z) in this system are calculated from the relation between (x, y, z) and d, α, β given in formula (10).
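Formula (10) is not reproduced in this text; the following sketch shows one common convention for resolving a range/angle measurement into Cartesian coordinates. The axis orientation here is an assumption on our part, and the patent's formula (10) may orient the axes differently:

```python
import math

def polar_to_cartesian(d, alpha, beta):
    """Convert a lidar measurement (range d, horizontal angle alpha,
    vertical angle beta, angles in radians) to Cartesian coordinates.

    Assumed convention: alpha = 0 points along the +y axis in the
    horizontal plane, beta is elevation above that plane, z points up.
    """
    x = d * math.cos(beta) * math.sin(alpha)
    y = d * math.cos(beta) * math.cos(alpha)
    z = d * math.sin(beta)
    return x, y, z
```

Whatever the convention, the recovered point always lies at distance d from the origin, which is a convenient sanity check.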
The camera is then removed from the lidar and moved to photograph the same scene from multiple viewpoints, as shown schematically in Fig. 5; the overlap between the photos taken at different viewpoints is required to reach 70% or more, yielding several images of the same scene from different viewpoints.
(2) Feature-point extraction and matching on the multi-view images
SIFT feature extraction is performed on each image to obtain its feature points (extraction result shown in Fig. 8), and the feature points of the images between any two viewpoints are matched twice: the first matching keeps pairs of feature points of the two images whose Euclidean distance is below a set threshold, and the second matching rejects mismatches with random sample consensus, yielding the correct matching point pairs A(u_i, v_i, 1) and A'(u'_i, v'_i, 1). The flow chart of this feature extraction and matching is shown in Fig. 6, and the final correct matching result in Fig. 9. Only correct matches allow the subsequent operations to yield an accurate calibration of the camera intrinsics.
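The first (coarse) matching stage can be sketched as a mutual-nearest-neighbour search with a Euclidean threshold on the descriptors. SIFT extraction itself and the RANSAC second pass are not shown here; the array shapes, the mutual-consistency check, and the threshold value are illustrative choices, not details fixed by the patent:

```python
import numpy as np

def coarse_match(desc1, desc2, max_dist):
    """Mutual-nearest-neighbour matching of two descriptor sets.

    desc1: (N, D) array, desc2: (M, D) array. Returns index pairs (i, j)
    whose descriptors are each other's nearest neighbour and whose
    Euclidean distance is below max_dist (the preset threshold).
    """
    # Pairwise Euclidean distances, shape (N, M).
    d = np.linalg.norm(desc1[:, None, :] - desc2[None, :, :], axis=2)
    nn12 = d.argmin(axis=1)   # best match in set 2 for each point of set 1
    nn21 = d.argmin(axis=0)   # best match in set 1 for each point of set 2
    return [(i, j) for i, j in enumerate(nn12)
            if nn21[j] == i and d[i, j] < max_dist]
```

The surviving pairs would then be handed to the RANSAC stage, which discards any pair lying too far from its epipolar line.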
(3) Solving the fundamental matrix F and the epipolar constraint
The correct matching point pairs A(u_i, v_i, 1) and A'(u'_i, v'_i, 1) are used to solve the fundamental matrix F and the epipolar constraint, with F insensitive to Gaussian white noise; the specific procedure is as follows:
The correct matching point pairs A(u_i, v_i, 1) and A'(u'_i, v'_i, 1) are normalized in image coordinates: each set is translated and scaled so that, after the transformation, the origin lies at the image center and all feature points lie within a circle of radius √2 centered there. The transformation can be written as

B = TA,  B' = T'A'   (11)

where A and A' are the correct matching point pairs, T and T' their corresponding normalization matrices, and B and B' the normalized matching point pairs B(u_ib, v_ib, 1) and B'(u'_ib, v'_ib, 1);
An equation is established and solved from the relation between the normalized matching point pairs B, B' and the fundamental matrix, namely B'^T F_B B = 0; the solution F_B then gives the fundamental matrix \hat F = T'^T F_B T.
Taking \hat F as the initial value, the fundamental matrix F, insensitive to Gaussian white noise, is solved by maximum-likelihood estimation over the correct matching point pairs A(u_i, v_i, 1) and A'(u'_i, v'_i, 1); using this F, the epipole e' is solved from F^T e' = 0.
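The normalization and linear-solution stages above follow the normalized 8-point scheme; a minimal numpy sketch of that part (the subsequent maximum-likelihood refinement is omitted):

```python
import numpy as np

def normalize(pts):
    """Translate the centroid to the origin and scale so the mean distance
    from the origin is sqrt(2). pts is an (N, 2) array."""
    c = pts.mean(axis=0)
    s = np.sqrt(2) / np.linalg.norm(pts - c, axis=1).mean()
    T = np.array([[s, 0, -s * c[0]],
                  [0, s, -s * c[1]],
                  [0, 0, 1.0]])
    h = np.hstack([pts, np.ones((len(pts), 1))])
    return (T @ h.T).T, T

def eight_point(pts1, pts2):
    """Normalized 8-point estimate of F with x2^T F x1 = 0."""
    b1, T1 = normalize(pts1)
    b2, T2 = normalize(pts2)
    # Each correspondence contributes one row of the linear system A f = 0.
    A = np.array([np.kron(p2, p1) for p1, p2 in zip(b1, b2)])
    _, _, Vt = np.linalg.svd(A)
    Fb = Vt[-1].reshape(3, 3)
    # Enforce rank 2 by zeroing the smallest singular value.
    U, S, Vt = np.linalg.svd(Fb)
    Fb = U @ np.diag([S[0], S[1], 0.0]) @ Vt
    # Denormalize: F = T2^T Fb T1.
    F = T2.T @ Fb @ T1
    return F / F[2, 2]
```

On noise-free synthetic correspondences this recovers F up to scale, so the epipolar residual x2^T F x1 vanishes for every pair.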
(4) Establishing the objective function
The objective function on the camera intrinsics f_x, f_y, u_o, v_o is established from the Kruppa equations and the fundamental matrix F. The relation between the Kruppa equations and the fundamental matrix F is

FCF^T = \lambda^2 [e']_x C [e']_x^T

where C = KK^T, \lambda^2 is a positive constant factor, and [e']_x is the antisymmetric matrix of e';
Let

C = \begin{bmatrix} c_1 & c_5 & c_3 \\ c_5 & c_2 & c_4 \\ c_3 & c_4 & 1 \end{bmatrix},  c = (c_1, c_2, c_3, c_4, c_5)

Each element of FCF^T and of [e']_x C [e']_x^T is then a linear expression in c = (c_1, c_2, c_3, c_4, c_5), and both matrices are symmetric; because \lambda^2 is a positive constant factor, the objective function is established from the element-wise quotients of the left-hand matrix FCF^T and the right-hand matrix [e']_x C [e']_x^T of the Kruppa equation formed by the i-th fundamental matrix, together with the mean of those quotients.
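The exact form of the objective function is not reproduced in this text. One plausible realization, consistent with the description (every element-wise quotient should equal the same constant \lambda^2), penalizes the spread of the quotients around their mean; the function below is a sketch under that assumption, not the patent's stated formula:

```python
import numpy as np

def skew(e):
    """Antisymmetric matrix [e]_x such that [e]_x v = e x v."""
    return np.array([[0.0, -e[2], e[1]],
                     [e[2], 0.0, -e[0]],
                     [-e[1], e[0], 0.0]])

def kruppa_cost(c, F, ep):
    """Spread of the element-wise quotients of FCF^T and [e']_x C [e']_x^T.

    Zero exactly when every quotient equals the same constant lambda^2
    (assumes no element of the right-hand matrix vanishes).
    """
    c1, c2, c3, c4, c5 = c
    C = np.array([[c1, c5, c3],
                  [c5, c2, c4],
                  [c3, c4, 1.0]])
    left = F @ C @ F.T
    right = skew(ep) @ C @ skew(ep).T
    iu = np.triu_indices(3)   # 6 independent elements of each symmetric matrix
    r = left[iu] / right[iu]
    r = r / r.mean()          # scale-free: ideal quotients all equal 1
    return float(np.sum((r - 1.0) ** 2))
```

For a genuine two-view geometry with C = KK^T, the cost vanishes at the true c and grows when c is perturbed, which is what the optimizer exploits.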
(5) Solving the camera intrinsics with particle swarm optimization
The objective function is essentially a function of the camera intrinsics f_x, f_y, u_o, v_o; particle swarm optimization minimizes it with the intrinsics as input, and the minimizing intrinsics are the camera calibration result.
Particle swarm optimization requires bounds on the input parameters. The present invention initializes the principal point at the image center: centered on the image center coordinates (u_o0, v_o0), the constraint ranges are [0.85u_o0, 1.15u_o0] and [0.85v_o0, 1.15v_o0]; the range should be neither too large nor too small, so the optimal solution is sought within ±15% around (u_o0, v_o0). The spatial coordinates calculated in the coordinate system with the lidar position as origin are used to estimate the focal-length range; the flow is shown in Fig. 7, and the specific steps are:
Step 1: the scan data are resolved into three-dimensional point cloud coordinates in the coordinate system with the lidar position as origin;
Step 2: one vertex of the measured object is extracted manually and its distance from the ground is calculated;
Step 3: combining the distance of this vertex from the ground on the image plane with the pinhole imaging principle, the camera focal length is computed, and the effective focal lengths f_x0, f_y0 are obtained from the pixel size of the CCD camera used. The optimal solution is sought within ±15% around f_x0, f_y0, which yields the camera intrinsics f_x, f_y, u_o, v_o with high precision and good stability.
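A minimal particle swarm optimizer over the ±15% box bounds can be sketched as follows; the swarm size, inertia, and acceleration constants are generic textbook choices, not values given by the patent, and the toy quadratic in the usage note merely stands in for the Kruppa objective:

```python
import numpy as np

def pso(f, lo, hi, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm minimizer over the box [lo, hi] (1-D arrays).

    Each particle tracks its personal best; all particles are attracted
    toward their personal bests and the global best. Returns the best
    position found.
    """
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    x = rng.uniform(lo, hi, size=(n_particles, len(lo)))
    v = np.zeros_like(x)
    pbest = x.copy()
    pcost = np.array([f(p) for p in x])
    gbest = pbest[pcost.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)   # keep particles inside the +/-15% box
        cost = np.array([f(p) for p in x])
        improved = cost < pcost
        pbest[improved], pcost[improved] = x[improved], cost[improved]
        gbest = pbest[pcost.argmin()].copy()
    return gbest
```

Used on a 4-parameter search (f_x, f_y, u_o, v_o), `lo` and `hi` would be 0.85 and 1.15 times the initial estimates from step 3.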
In conclusion, the present invention proposes a camera self-calibration method combining lidar three-dimensional point cloud data. By initializing the camera intrinsics to be calibrated from the lidar point cloud, improving the fundamental-matrix solving method, and establishing a new objective function, it finally achieves high-precision, high-stability calibration of the camera intrinsics. Lidar data are used because integrated lidar-camera systems are increasingly widespread and have been applied in fields such as three-dimensional city modeling, cultural-relic protection, and basic surveying, so the invention has excellent practical value.
The above is only the basic scheme of the specific implementation of the present invention, but the protection scope of the present invention is not limited thereto. Any change or replacement that a person skilled in the art can conceive within the technical scope disclosed by the present invention shall be included within the protection scope of the present invention, which shall be subject to the scope of the claims; changes within the equivalent meaning and scope of the claims are intended to be included therein.

Claims (5)

  1. A camera self-calibration method combining lidar three-dimensional point cloud data, characterized by comprising:
    Step 1: the camera is fixed above the lidar while the same scene is scanned and photographed; a three-dimensional coordinate system with the lidar position as origin is established, and the spatial coordinates (x, y, z) in this system are calculated;
    Step 2: the lidar scans from a fixed position, while the camera is moved repeatedly to photograph the same scene from multiple viewpoints, with image overlap between the photos taken at different viewpoints;
    Step 3: feature extraction is performed on the image overlapping regions to obtain the pixel coordinates (u, v) of the feature points of each image; the feature points of the overlapping region between any two viewpoints are matched coarsely and then finely, yielding the twice-purified matching point pairs A(u_i, v_i, 1) and A'(u'_i, v'_i, 1);
    Step 4: the purified matching point pairs A(u_i, v_i, 1) and A'(u'_i, v'_i, 1) are used to solve the fundamental matrix F and the epipole e';
    Step 5: an objective function on the camera intrinsics f_x, f_y, u_o, v_o is established from the fundamental matrix F, where f_x is the effective focal length in the x direction (unit: pixels), f_y the effective focal length in the y direction (unit: pixels), and (u_o, v_o) the principal point of the captured image (unit: pixels); the camera intrinsic matrix is written as:
    K = \begin{bmatrix} f_x & 0 & u_o \\ 0 & f_y & v_o \\ 0 & 0 & 1 \end{bmatrix}   (1)
    The relation between the fundamental matrix F and the camera intrinsics f_x, f_y, u_o, v_o is given by:
    FCF^T = \lambda^2 [e']_x C [e']_x^T   (2)
    where C = KK^T is a symmetric matrix, [e']_x is the antisymmetric matrix of the epipole e', and \lambda^2 is a positive constant factor;
    because \lambda^2 is a positive constant factor, each element of the left-hand matrix FCF^T of equation (2) can be divided by the corresponding element of the right-hand matrix [e']_x C [e']_x^T, and the result equals \lambda^2;
    since the matrix C is symmetric, let
    C = \begin{bmatrix} c_1 & c_5 & c_3 \\ c_5 & c_2 & c_4 \\ c_3 & c_4 & 1 \end{bmatrix}   (3)
    c = (c_1, c_2, c_3, c_4, c_5)   (4)
    Then the matrices on the two sides of equation (2) can be expressed in the forms of formulas (5) and (6), where the left side of the matrix equation can be written as:
    $$FCF^{T}=\begin{bmatrix} P_1(c) & P_4(c) & P_5(c) \\ P_4(c) & P_2(c) & P_6(c) \\ P_5(c) & P_6(c) & P_3(c) \end{bmatrix}\qquad(5)$$
    and the right-hand side of the matrix equation is written as:
    $$[e']_{x}\,C\,[e']_{x}^{T}=\begin{bmatrix} p_1(c) & p_4(c) & p_5(c) \\ p_4(c) & p_2(c) & p_6(c) \\ p_5(c) & p_6(c) & p_3(c) \end{bmatrix}\qquad(6)$$
    Dividing the corresponding elements of the two matrices, a term of the form $P_j^{(i)}(c)/p_j^{(i)}(c)$ denotes the ratio of the $j$-th pair of corresponding elements of equations (5) and (6) formed from the $i$-th fundamental matrix, and $\overline{\left(P^{(i)}(c)/p^{(i)}(c)\right)}$ denotes the average of all such ratios for the $i$-th fundamental matrix; every such ratio equals λ². Taking this property as the basis, the objective function f(c) shown in equation (7) is established:
    $$f(c)=\sum_{i=1}^{n}\left\{\frac{1}{5}\sum_{j=1}^{6}\left[\frac{P_j^{(i)}(c)}{p_j^{(i)}(c)}-\overline{\left(\frac{P^{(i)}(c)}{p^{(i)}(c)}\right)}\right]^{2}\right\}\qquad(7)$$
    Step 6, in theory each ratio $P_j^{(i)}(c)/p_j^{(i)}(c)$ described in Step 5 equals λ², so the objective function should equal 0 in the ideal case; an optimization algorithm is therefore used to drive its value arbitrarily close to 0, which yields the camera intrinsic parameters f_x, f_y, u_o, v_o. During the solution of the camera intrinsics f_x, f_y, u_o, v_o by the optimization algorithm, the input parameters, i.e. the initial values of f_x, f_y, u_o, v_o, must be constrained. The present invention extracts any vertex of the measured object, computes its three-dimensional space coordinates (x, y, z) in the coordinate system whose origin is the laser radar position, and calculates the distance of that vertex from the ground; combining this distance with the pinhole imaging principle and the pixel size of the camera image, the initial focal lengths f_{x0}, f_{y0} are calculated, and the image center coordinates are defined as (u_{o0}, v_{o0}). The image principal point is constrained around the image center (u_{o0}, v_{o0}); the constraint range must be reasonable, so the optimal solution is sought within ±15% of (u_{o0}, v_{o0}), and the camera focal lengths are constrained in the same way, their initial-value range likewise being limited to ±15%. Using the initial values constrained above as inputs, the optimization algorithm then solves for camera intrinsic parameters with high accuracy and good robustness.
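The objective function of equation (7) and the bounded search of Step 6 can be sketched as follows (Python with NumPy/SciPy; the synthetic fundamental matrices, the random seed, the intrinsic values, and the choice of the Powell optimizer are all illustrative assumptions, and for simplicity the ±15% bounds are applied to the components of c rather than to the intrinsics themselves). Since C = KK^T gives c_1 = f_x² + u_o², c_2 = f_y² + v_o², c_3 = u_o, c_4 = v_o, c_5 = u_o·v_o, the intrinsics are recovered from the minimizer at the end:

```python
import numpy as np
from scipy.optimize import minimize

def skew(v):
    return np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]], float)

def C_of(c):                         # symmetric matrix of equation (3)
    c1, c2, c3, c4, c5 = c
    return np.array([[c1, c5, c3], [c5, c2, c4], [c3, c4, 1.0]])

IDX = [(0, 0), (1, 1), (2, 2), (0, 1), (0, 2), (1, 2)]  # the 6 distinct elements

def f_obj(c, Fs, es):                # objective function of equation (7)
    C = C_of(c)
    total = 0.0
    for F, e in zip(Fs, es):
        P = F @ C @ F.T                   # equation (5)
        p = skew(e) @ C @ skew(e).T       # equation (6)
        r = np.array([P[i] / p[i] for i in IDX])
        total += np.sum((r - r.mean()) ** 2) / 5.0
    return total

# synthetic data with known (hypothetical) intrinsics
fx, fy, uo, vo = 800.0, 820.0, 320.0, 240.0
K = np.array([[fx, 0, uo], [0, fy, vo], [0, 0, 1.0]])
rng = np.random.default_rng(1)
Fs, es = [], []
for _ in range(4):                   # n = 4 fundamental matrices
    Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    e = rng.normal(size=3)
    Fs.append(skew(e) @ K @ (Q * np.linalg.det(Q)) @ np.linalg.inv(K))
    es.append(e)

c_true = np.array([fx**2 + uo**2, fy**2 + vo**2, uo, vo, uo * vo])
c0 = c_true * 1.10                   # start 10% off, inside the +/-15% bounds
bounds = [(0.85 * v, 1.15 * v) for v in c_true]
res = minimize(f_obj, c0, args=(Fs, es), method="Powell", bounds=bounds)
uo_est, vo_est = res.x[2], res.x[3]
fx_est = np.sqrt(res.x[0] - uo_est**2)    # from c1 = fx^2 + uo^2
fy_est = np.sqrt(res.x[1] - vo_est**2)    # from c2 = fy^2 + vo^2
```

At the true parameter vector the objective evaluates to essentially zero, which is the property the optimization exploits.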
  2. The method according to claim 1, wherein the feature points between any two viewpoints of the images are matched twice: the first match finds, among the feature points of the overlapping region of the two images, the pairs whose Euclidean distance between pixel coordinates (u, v) is less than a set threshold; the second match uses random sample consensus (RANSAC) to reject mismatched points, retaining the point pairs whose distance from the epipolar line given by the fundamental matrix F is less than a set threshold, which are the purified matching point pairs A(u_i, v_i, 1) and A'(u'_i, v'_i, 1).
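The second, epipolar-distance stage of the matching can be sketched in isolation (Python with NumPy; the fundamental matrix, point coordinates, and 1-pixel threshold below are made-up test data, and the first-stage descriptor matching and RANSAC itself are not shown). A pair is kept when the distance from A'_i to the epipolar line of A_i under F is below the threshold:

```python
import numpy as np

def epipolar_filter(F, A, A2, thresh):
    """Indices of pairs whose distance from A2_i to the epipolar
    line F^T A_i (convention A_i^T F A2_i = 0) is below `thresh`."""
    lines = A @ F                          # row i = epipolar line of A_i in image 2
    num = np.abs(np.sum(lines * A2, axis=1))
    den = np.hypot(lines[:, 0], lines[:, 1])
    return np.where(num / den < thresh)[0]

# synthetic rank-2 "fundamental matrix" and exact correspondences on its epipolar lines
F = np.outer([1.0, 2.0, 3.0], [0.5, -1.0, 2.0]) + np.outer([0.0, 1.0, 1.0], [1.0, 0.0, -1.0])
rng = np.random.default_rng(2)
A = np.column_stack([rng.uniform(0, 640, 6), rng.uniform(0, 480, 6), np.ones(6)])
A2 = []
for a in A:
    l = F.T @ a                            # epipolar line of a in image 2
    u2 = rng.uniform(0, 640)
    A2.append([u2, -(l[0] * u2 + l[2]) / l[1], 1.0])   # point exactly on the line
A2 = np.array(A2)
for k in (4, 5):                           # push two pairs 40 px off their lines
    l = F.T @ A[k]
    n = np.array([l[0], l[1], 0.0]) / np.hypot(l[0], l[1])
    A2[k] += 40.0 * n
kept = epipolar_filter(F, A, A2, thresh=1.0)
print(kept)                                # the two displaced pairs are rejected
```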
  3. The method according to claim 1, wherein the fundamental matrix F is solved from the purified matching point pairs A(u_i, v_i, 1) and A'(u'_i, v'_i, 1) in such a way that it is insensitive to white Gaussian noise:
    The purified matching point pairs A(u_i, v_i, 1) and A'(u'_i, v'_i, 1) are normalized in image coordinates: a translation and scale transformation is applied to each set so that the origin of the transformed image lies at the image center and all feature points lie within a circle of radius √2 centered at the image center; the transformation is expressed as:
    $$B=TA,\quad B'=T'A'\qquad(8)$$
    wherein A and A' are the purified matching point pairs A(u_i, v_i, 1) and A'(u'_i, v'_i, 1), T and T' are the normalization matrices corresponding to them, and B and B' are the normalized matching point pairs B(u_{ib}, v_{ib}, 1) and B'(u'_{ib}, v'_{ib}, 1);
    An equation is established from the relation between the normalized matching point pairs B, B' and the fundamental matrix, as shown in equation (9):
    $$\begin{bmatrix} u_{ib} & v_{ib} & 1 \end{bmatrix}\begin{bmatrix} f_1 & f_2 & f_3 \\ f_4 & f_5 & f_6 \\ f_7 & f_8 & f_9 \end{bmatrix}\begin{bmatrix} u'_{ib} \\ v'_{ib} \\ 1 \end{bmatrix}=0\qquad(9)$$
    The matrix $\hat F=\begin{bmatrix} f_1 & f_2 & f_3 \\ f_4 & f_5 & f_6 \\ f_7 & f_8 & f_9 \end{bmatrix}$ is solved, and the fundamental matrix $F_0=T^{T}\hat F\,T'$ is then recovered by denormalization;
    Taking the fundamental matrix $F_0$ as the initial value, together with the purified matching point pairs A(u_i, v_i, 1) and A'(u'_i, v'_i, 1), the fundamental matrix F is solved by maximum-likelihood estimation, so that it is insensitive to white Gaussian noise;
    Using the fundamental matrix F, the epipole e' is solved from $F^{T}e'=0$.
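Excluding the maximum-likelihood refinement, the pipeline of claim 3 amounts to the normalized eight-point method; a sketch follows (Python with NumPy; the intrinsics, motion, and points of the synthetic two-view setup are illustrative assumptions, and the normalization here centers on the data centroid, a common variant of the image-center normalization described above). The linear system of equation (9) is solved by SVD, rank 2 is enforced, the result is denormalized, and the epipole is the null vector of F^T:

```python
import numpy as np

def normalize(pts):
    """Shift the centroid to the origin and scale so the RMS radius is sqrt(2)."""
    c = pts[:, :2].mean(axis=0)
    s = np.sqrt(2.0) / np.sqrt(((pts[:, :2] - c) ** 2).sum(axis=1).mean())
    T = np.array([[s, 0, -s * c[0]], [0, s, -s * c[1]], [0, 0, 1.0]])
    return (T @ pts.T).T, T

def eight_point(A, A2):
    """Linear F for pairs satisfying A_i^T F A2_i = 0 (the convention of eq. 9)."""
    B, T = normalize(A)
    B2, T2 = normalize(A2)
    D = np.stack([np.outer(b, b2).ravel() for b, b2 in zip(B, B2)])
    Fh = np.linalg.svd(D)[2][-1].reshape(3, 3)   # null vector of the design matrix
    U, S, Vt = np.linalg.svd(Fh)
    Fh = U @ np.diag([S[0], S[1], 0.0]) @ Vt     # enforce rank 2
    return T.T @ Fh @ T2                         # denormalize: F0 = T^T Fh T'

# synthetic two-view data (hypothetical intrinsics and camera motion)
K = np.array([[700.0, 0, 320], [0, 700.0, 240], [0, 0, 1.0]])
t = np.array([1.0, 0.2, 0.1])
rng = np.random.default_rng(3)
X = np.column_stack([rng.uniform(-1, 1, 12), rng.uniform(-1, 1, 12), rng.uniform(4, 8, 12)])
A = (K @ X.T).T
A /= A[:, 2:3]                                   # pixels in view 1, P = K[I|0]
A2 = (K @ (X + t).T).T
A2 /= A2[:, 2:3]                                 # pixels in view 2, P' = K[I|t]
F = eight_point(A, A2)
ep = np.linalg.svd(F.T)[2][-1]                   # epipole e' satisfying F^T e' = 0
```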
  4. The method according to claim 1, wherein the objective function is essentially a function of the camera intrinsic parameters f_x, f_y, u_o, v_o; the optimization algorithm performs optimization on the objective function, taking the camera intrinsics as inputs, so that the objective function is minimized, and the resulting camera intrinsics are the intrinsic-parameter result of the camera to be calibrated.
  5. The method according to claim 1, wherein the focal-length range is estimated from the three-dimensional space coordinates computed in the coordinate system with the laser radar position as the origin, the specific steps comprising:
    Step 1, solving the scan data into three-dimensional point cloud coordinates in the coordinate system with the laser radar position as the origin;
    Step 2, extracting one vertex of the measured object and calculating the distance between that vertex and the ground;
    Step 3, calculating the camera focal length from the distance between that vertex of the measured object and the ground, its position on the image plane, and the pinhole imaging principle, and calculating the effective focal lengths f_{x0}, f_{y0} of the camera with reference to the pixel size of the CCD camera used; the focal lengths are constrained within ±15% of f_{x0}, f_{y0}, and the image principal point is constrained around the image center coordinates (u_{o0}, v_{o0}), the ranges being limited to [0.85u_{o0}, 1.15u_{o0}] and [0.85v_{o0}, 1.15v_{o0}]; the constraint range must be reasonable, for only by searching for the optimal solution with reasonably estimated initial values can camera intrinsics with high accuracy and good stability be obtained.
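The initialization of Step 3 can be sketched with concrete numbers (Python; the object height, distance, pixel extent, and image size are made-up values for illustration only). The pinhole model gives the focal length in pixels as the object's pixel extent times its distance divided by its real size, and the ±15% windows follow directly:

```python
def focal_init(h_obj_m, dist_m, h_img_px):
    """Pinhole model: focal length [px] = pixel extent * distance / object size."""
    return h_img_px * dist_m / h_obj_m

# hypothetical measurement: vertex 1.2 m above ground, 6 m from the camera,
# spanning 240 px vertically in a 1280x720 image
fx0 = focal_init(1.2, 6.0, 240.0)            # ~1200 px
uo0, vo0 = 1280 / 2.0, 720 / 2.0             # image center (u_o0, v_o0)
bounds_fx = (0.85 * fx0, 1.15 * fx0)         # +/-15% focal window
bounds_uo = (0.85 * uo0, 1.15 * uo0)         # [0.85*u_o0, 1.15*u_o0]
bounds_vo = (0.85 * vo0, 1.15 * vo0)         # [0.85*v_o0, 1.15*v_o0]
print(fx0, bounds_fx)
```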
CN201711225701.5A 2017-11-29 2017-11-29 Camera self-calibration method combined with laser radar three-dimensional point cloud data Active CN107977997B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711225701.5A CN107977997B (en) 2017-11-29 2017-11-29 Camera self-calibration method combined with laser radar three-dimensional point cloud data

Publications (2)

Publication Number Publication Date
CN107977997A true CN107977997A (en) 2018-05-01
CN107977997B CN107977997B (en) 2020-01-17

Family

ID=62008533

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711225701.5A Active CN107977997B (en) 2017-11-29 2017-11-29 Camera self-calibration method combined with laser radar three-dimensional point cloud data

Country Status (1)

Country Link
CN (1) CN107977997B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102339463A (en) * 2010-07-22 2012-02-01 首都师范大学 System and method for calibrating linear array camera based on laser scanner
CN102982548A (en) * 2012-12-11 2013-03-20 清华大学 Multi-view stereoscopic video acquisition system and camera parameter calibrating method thereof
CN103837869A (en) * 2014-02-26 2014-06-04 北京工业大学 Vector-relation-based method for calibrating single-line laser radar and CCD camera
EP2960859A1 (en) * 2014-06-19 2015-12-30 Tata Consultancy Services Limited Constructing a 3d structure
CN106228537A (en) * 2016-07-12 2016-12-14 北京理工大学 A kind of three-dimensional laser radar and the combined calibrating method of monocular-camera

Cited By (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108876909A (en) * 2018-06-08 2018-11-23 桂林电子科技大学 A kind of three-dimensional rebuilding method based on more image mosaics
CN109029284A (en) * 2018-06-14 2018-12-18 大连理工大学 A kind of three-dimensional laser scanner based on geometrical constraint and camera calibration method
CN109215063A (en) * 2018-07-05 2019-01-15 中山大学 A kind of method for registering of event triggering camera and three-dimensional laser radar
CN109308714A (en) * 2018-08-29 2019-02-05 清华大学苏州汽车研究院(吴江) Camera and laser radar information method for registering based on classification punishment
CN110543871A (en) * 2018-09-05 2019-12-06 天目爱视(北京)科技有限公司 point cloud-based 3D comparison measurement method
CN110543871B (en) * 2018-09-05 2022-01-04 天目爱视(北京)科技有限公司 Point cloud-based 3D comparison measurement method
CN109360241A (en) * 2018-10-17 2019-02-19 江西洪都航空工业集团有限责任公司 One camera measures three-dimensional undercarriage wheel center displacement method
CN109360241B (en) * 2018-10-17 2022-03-15 江西洪都航空工业集团有限责任公司 Method for measuring center displacement of three-dimensional undercarriage wheel by single camera
CN111123242B (en) * 2018-10-31 2022-03-15 北京亚兴智数科技有限公司 Combined calibration method based on laser radar and camera and computer readable storage medium
CN111123242A (en) * 2018-10-31 2020-05-08 北京亚兴智数科技有限公司 Combined calibration method based on laser radar and camera and computer readable storage medium
CN109901123A (en) * 2018-12-24 2019-06-18 文远知行有限公司 Transducer calibration method, device, computer equipment and storage medium
CN109901123B (en) * 2018-12-24 2023-12-01 文远知行有限公司 Sensor calibration method, device, computer equipment and storage medium
CN109741402B (en) * 2018-12-26 2023-04-07 上海交通大学 Small-coincidence-field multi-camera combined calibration method based on laser radar
CN109741402A (en) * 2018-12-26 2019-05-10 上海交通大学 Small coincidence visual field multiple-camera combined calibrating method based on laser radar
CN111666935A (en) * 2019-03-06 2020-09-15 北京京东尚科信息技术有限公司 Article center positioning method and device, logistics system and storage medium
CN109828262A (en) * 2019-03-15 2019-05-31 苏州天准科技股份有限公司 Laser radar and the automatic combined calibrating method of camera based on plane and space characteristics
CN110033493A (en) * 2019-04-17 2019-07-19 深圳金三立视频科技股份有限公司 Video camera 3D scaling method and terminal
CN110033492A (en) * 2019-04-17 2019-07-19 深圳金三立视频科技股份有限公司 Camera marking method and terminal
CN110109086A (en) * 2019-04-24 2019-08-09 禾多科技(北京)有限公司 The off-line calibration method of the multi-thread beam laser radar in field end
CN110322518B (en) * 2019-07-05 2021-12-17 深圳市道通智能航空技术股份有限公司 Evaluation method, evaluation system and test equipment of stereo matching algorithm
CN110322518A (en) * 2019-07-05 2019-10-11 深圳市道通智能航空技术有限公司 Evaluation method, evaluation system and the test equipment of Stereo Matching Algorithm
CN110390697B (en) * 2019-07-11 2021-11-05 浙江大学 Millimeter wave radar and camera combined calibration method based on LM algorithm
CN110390697A (en) * 2019-07-11 2019-10-29 浙江大学 A kind of millimetre-wave radar based on LM algorithm and camera combined calibrating method
CN110503692A (en) * 2019-07-31 2019-11-26 武汉大学 Single line battle array optical satellite geometric calibration method and system based on small range calibration field
CN110601068A (en) * 2019-08-08 2019-12-20 国家电网有限公司 Full-automatic dynamic laser aiming device and method
CN110456330A (en) * 2019-08-27 2019-11-15 中国人民解放军国防科技大学 Method and system for automatically calibrating external parameter without target between camera and laser radar
CN110456330B (en) * 2019-08-27 2021-07-09 中国人民解放军国防科技大学 Method and system for automatically calibrating external parameter without target between camera and laser radar
CN112669388A (en) * 2019-09-30 2021-04-16 上海禾赛科技股份有限公司 Calibration method and device for laser radar and camera device and readable storage medium
CN110703230B (en) * 2019-10-15 2023-05-19 西安电子科技大学 Position calibration method between laser radar and camera
CN110910431A (en) * 2019-10-15 2020-03-24 西安理工大学 Monocular camera-based multi-view three-dimensional point set recovery method
CN110703230A (en) * 2019-10-15 2020-01-17 西安电子科技大学 Position calibration method between laser radar and camera
CN111160098A (en) * 2019-11-21 2020-05-15 长春理工大学 Expression change face recognition method based on SIFT features
CN111008602B (en) * 2019-12-06 2023-07-25 青岛海之晨工业装备有限公司 Scribing feature extraction method combining two-dimensional vision and three-dimensional vision for small-curvature thin-wall part
CN111008602A (en) * 2019-12-06 2020-04-14 青岛海之晨工业装备有限公司 Two-dimensional and three-dimensional visual combined lineation feature extraction method for small-curvature thin-wall part
CN111292382B (en) * 2020-02-10 2023-11-14 阿波罗智能技术(北京)有限公司 Method and device for calibrating vehicle-mounted image acquisition equipment, electronic equipment and medium
CN111292382A (en) * 2020-02-10 2020-06-16 北京百度网讯科技有限公司 Method and device for calibrating vehicle-mounted image acquisition equipment, electronic equipment and medium
CN111578862A (en) * 2020-05-27 2020-08-25 山东大学 Point cloud precision calibration device and method for ground three-dimensional laser scanner
CN111578862B (en) * 2020-05-27 2021-05-04 山东大学 Point cloud precision calibration device and method for ground three-dimensional laser scanner
CN111963249A (en) * 2020-07-30 2020-11-20 中煤科工集团西安研究院有限公司 Distributed transparent working face full roadway monitoring system and method
CN112102458A (en) * 2020-08-31 2020-12-18 湖南盛鼎科技发展有限责任公司 Single-lens three-dimensional image reconstruction method based on laser radar point cloud data assistance
CN112711033B (en) * 2020-12-09 2022-02-18 中科视语(北京)科技有限公司 Slope safety monitoring and early warning device and method
CN112711033A (en) * 2020-12-09 2021-04-27 中科视语(北京)科技有限公司 Slope safety monitoring and early warning device and method
CN113325434A (en) * 2021-04-16 2021-08-31 盎锐(上海)信息科技有限公司 Explosion point display method and system for actual measurement actual quantity and laser radar
CN113256696A (en) * 2021-06-28 2021-08-13 中国人民解放军国防科技大学 External parameter calibration method of laser radar and camera based on natural scene
CN113658264A (en) * 2021-07-12 2021-11-16 华南理工大学 Single image camera focal length estimation method based on distance information
CN113658264B (en) * 2021-07-12 2023-08-18 华南理工大学 Single image camera focal length estimation method based on distance information
CN115790449A (en) * 2023-01-06 2023-03-14 威海晶合数字矿山技术有限公司 Three-dimensional shape measurement method for long and narrow space
CN115790449B (en) * 2023-01-06 2023-04-18 威海晶合数字矿山技术有限公司 Three-dimensional shape measurement method for long and narrow space

Also Published As

Publication number Publication date
CN107977997B (en) 2020-01-17

Similar Documents

Publication Publication Date Title
CN107977997A (en) A kind of Camera Self-Calibration method of combination laser radar three dimensional point cloud
CN104156972B (en) Perspective imaging method based on laser scanning distance measuring instrument and multiple cameras
CN103106688B (en) Based on the indoor method for reconstructing three-dimensional scene of double-deck method for registering
Remondino et al. Dense image matching: Comparisons and analyses
CN103868460B (en) Binocular stereo vision method for automatic measurement based on parallax optimized algorithm
CN106023303B (en) A method of Three-dimensional Gravity is improved based on profile validity and is laid foundations the dense degree of cloud
CN112927360A (en) Three-dimensional modeling method and system based on fusion of tilt model and laser point cloud data
CN102592124A (en) Geometrical correction method, device and binocular stereoscopic vision system of text image
CN106534670B (en) It is a kind of based on the panoramic video generation method for connecting firmly fish eye lens video camera group
CN110322485A (en) A kind of fast image registration method of isomery polyphaser imaging system
CN107084680A (en) A kind of target depth measuring method based on machine monocular vision
CN105931185A (en) Automatic splicing method of multiple view angle image
WO2018032841A1 (en) Method, device and system for drawing three-dimensional image
CN108881717A (en) A kind of Depth Imaging method and system
CN108924408A (en) A kind of Depth Imaging method and system
CN114359406A (en) Calibration of auto-focusing binocular camera, 3D vision and depth point cloud calculation method
CN107909543A (en) A kind of flake binocular vision Stereo matching space-location method
CN108230242A (en) A kind of conversion method from panorama laser point cloud to video flowing
CN116958437A (en) Multi-view reconstruction method and system integrating attention mechanism
CN116957987A (en) Multi-eye polar line correction method, device, computer equipment and storage medium
Peng et al. LF-fusion: Dense and accurate 3D reconstruction from light field images
JP7300895B2 (en) Image processing device, image processing method, program, and storage medium
CN116721149A (en) Weed positioning method based on binocular vision
CN107123135A (en) A kind of undistorted imaging method of unordered three-dimensional point cloud
CN108924407A (en) A kind of Depth Imaging method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant