CN108828606B - Joint measurement method based on a laser radar and a binocular visible-light camera - Google Patents

Joint measurement method based on a laser radar and a binocular visible-light camera

Info

Publication number
CN108828606B
CN108828606B (application CN201810240140.4A)
Authority
CN
China
Prior art keywords
matrix
visible light
laser radar
light camera
binocular
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810240140.4A
Other languages
Chinese (zh)
Other versions
CN108828606A (en)
Inventor
曹剑中
胡国良
周祚峰
郭惠楠
黄会敏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
XiAn Institute of Optics and Precision Mechanics of CAS
Original Assignee
XiAn Institute of Optics and Precision Mechanics of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by XiAn Institute of Optics and Precision Mechanics of CAS filed Critical XiAn Institute of Optics and Precision Mechanics of CAS
Priority to CN201810240140.4A priority Critical patent/CN108828606B/en
Publication of CN108828606A publication Critical patent/CN108828606A/en
Application granted granted Critical
Publication of CN108828606B publication Critical patent/CN108828606B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/36Videogrammetry, i.e. electronic processing of video signals from a single source or from different sources to give parallax or range information
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/40Means for monitoring or calibrating

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Manufacturing & Machinery (AREA)
  • Signal Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present invention provides a joint measurement method based on a laser radar and a binocular visible-light camera, which obtains accurate and dense three-dimensional information of a target simply and efficiently. In the method, the laser radar is treated as a camera with fixed intrinsic parameters, so that its three-dimensional point cloud can be projected directly into a two-dimensional image; image processing is then used to match the projected image against the camera images and to calibrate the rotation and translation between the laser radar and the binocular camera. A derivation based on the matrix norm and the matrix trace is innovatively introduced to solve for the rotation matrix. Finally, the laser radar point cloud and the binocular stereo-vision point cloud are fused, which yields not only accurate position and attitude information but also the texture and feature information of the target surface. The method has high application value in military fields such as spacecraft rendezvous and docking and the capture of non-cooperative satellites, as well as in civil fields such as workpiece detection and autonomous driving.

Description

Joint measurement method based on a laser radar and a binocular visible-light camera
Technical field
The invention belongs to the technical fields of radar and vision measurement and relates to a method for measuring three-dimensional information.
Background art
Binocular vision studies how the functions of the human eyes can be reproduced with a computer, that is, how two-dimensional projected images can be used to perceive, recognize and understand a three-dimensional scene. With the rapid development of computer technology, the demand for three-dimensional models keeps growing. Binocular stereo vision works by photographing the same scene from two different positions; the three-dimensional coordinates of a spatial point are then obtained from its parallax between the two images.
Vision measurement systems have matured over a long period of development, and many advanced image-processing methods are available. Nevertheless, such systems are still mainly restricted to controlled engineering environments such as industrial or laboratory scenes. In applications with strict accuracy requirements, for example the autonomous rendezvous, docking and collision avoidance of satellites, or the capture of non-cooperative spacecraft, even a small error can lead to catastrophic consequences, so a vision system cannot be used alone as the primary measurement means. The limitation stems from the measuring principle itself: whether active illumination or a passive light source is used, the result depends strongly on the intensity of the reflected light. Under unfavourable lighting conditions, feature extraction and matching fail in weakly textured regions of the image, and the measurement becomes inaccurate.
A laser radar, also called a laser scanner, is an emerging remote-sensing instrument. Based on the principle of laser ranging, it obtains the three-dimensional coordinates of points on the target surface by recording the spatial angular displacement of the laser beam in real time. Three-dimensional laser scanning integrates optical, mechanical and electronic technologies with precise sensing, has grown out of traditional surveying and mapping techniques, and is regarded as another technological revolution in surveying after GPS. By breaking through the traditional single-point measurement method, it offers high accuracy, high speed and faithful reproduction of the original shape, and it is currently one of the research hotspots in the surveying field at home and abroad.
Compared with a camera measurement system, a lidar measurement system has higher single-point accuracy, but when an entire target scene is scanned the data are relatively sparse and the scan rate is relatively low. Moreover, a laser radar senses only distance and is insensitive to texture, so its point cloud lacks texture information as well as surface features and colour information.
Chinese patent document CN 107421465 A ("A binocular vision stitching method based on a laser tracker") proposes stitching three-dimensional data with a laser tracker (not a lidar measurement system) and a binocular vision system. A laser tracker is essentially a total station capable of laser-interferometric ranging and automatic angle tracking; the difference is that it has no telescope. Its key accessory is a retro-reflector target, which is expensive: it reflects any ray incident along the optical axis back along the same path into the interference system, where interference with the reference beam yields a high-accuracy displacement measurement. The tracker steers its mirror according to the deviation of the returned beam on a position detector until the incident beam again passes through the centre of the target, keeping the system in tracking equilibrium. In that scheme the target itself is measured entirely by the binocular vision system; to deal with occlusion, the binocular system is rotated around the target and several sets of three-dimensional point clouds are reconstructed. The laser tracker is introduced only to transform all the binocular point clouds into a single laser-tracker coordinate system, so that all surfaces of the target can be reconstructed. To fuse the point clouds, the rotation and translation between the camera and the laser tracker must first be obtained. The method establishes three coordinate systems, namely the laser-tracker coordinate system, the reflector-target coordinate system and the binocular-camera coordinate system; it calibrates the relative pose between the laser tracker and the target and between the binocular camera and the target, and from these derives the pose between the laser tracker and the binocular camera, in preparation for the subsequent point-cloud registration. The retro-reflector targets used in this method are expensive to manufacture and cannot be used in large numbers, which is unfavourable for the calibration between the binocular camera and the target coordinate system: the feature points are too few and occupy too small a part of the camera field of view, so the calibration accuracy cannot be high. For workpieces whose surface texture is poor (for example a single-colour surface), measurement by binocular vision alone is also difficult to realize, because the feature mismatch rate is very high.
A doctoral thesis from the National University of Defense Technology, "Research on compound three-dimensional fine imaging technology based on an imaging laser radar and dual CCDs", proposes a calibration method between a laser radar and a camera that uses a three-dimensional target. The target consists of three planes, each painted with a grid. The three-dimensional coordinates of the grid points cannot be found directly in the laser-radar scan, because the scan data contain no planar feature points. To perform the calibration, the point sets belonging to each plane must first be segmented from the laser-radar data and each plane fitted; the intersection point of the three fitted planes is taken as the key three-dimensional point in the laser-radar coordinate system, and the three-dimensional coordinates of the other grid points are then computed from the grid size defined when the target was designed and the equations of the three planes. In the camera image, the two-dimensional coordinates of all grid points of this three-plane target are easy to extract, so a projection matrix can be computed from the 3D-2D correspondences in a manner similar to Zhang's calibration method, and the rotation and translation matrices are then derived from it, completing the calibration between the laser radar and the camera. This calibration method has several problems:
1) The amount of calculation is excessive and the procedure is cumbersome. The point sets of the three planes must first be segmented accurately and the planes fitted, and the three-dimensional grid points must be matched with the grid points in the image before the calibration itself can start.
2) Accumulated errors are easily introduced and the accuracy is not guaranteed. The plane fitting inevitably has errors, so the computed intersection of the three planes is in error, and consequently all the other computed grid coordinates are in error as well. The subsequent step, similar to Zhang's method, first solves a projection matrix and then extracts rotation and translation from it; the result is often poor, because a good calibration value requires very many calibration points and an additional optimization of the calibration data.
3) To reduce the error, as many grid points as possible must be used. Since the target consists of three planes, it must be placed carefully so that the camera sees as many grid points as possible; if it is placed arbitrarily, the grid points captured in the image are laid out irregularly, matching the three-dimensional grid points to the two-dimensional image points is difficult to automate, and manual intervention becomes very troublesome when the number of points is large.
Summary of the invention
The present invention provides a joint measurement method based on a laser radar and a binocular visible-light camera, which obtains accurate and dense three-dimensional information of a target simply and efficiently; it yields not only accurate position and attitude information but also reconstructs the texture and feature information of the target surface.
The key points of the technique are the calibration between the laser radar and the binocular camera and the calibration between the two cameras of the binocular pair. With these calibrations, the point cloud obtained by the laser radar and the point cloud measured by the binocular camera can be fused, finally giving a comparatively accurate and dense point cloud of a non-cooperative target.
The design concept and theoretical analysis of the joint calibration of the laser radar and the binocular camera are summarized as follows:
For the calibration between the laser radar and the binocular visible-light camera, the same three-dimensional target is both scanned by the laser radar and imaged by the left camera of the binocular pair. The laser radar is treated as a camera with fixed intrinsic parameters: with suitably chosen parameters, the three-dimensional target scan is projected onto a two-dimensional image. Image processing is used to obtain the target centre coordinates in both images. Using the epipolar geometry of computer vision, the fundamental matrix F between the two devices is first computed with the eight-point algorithm, and the essential matrix E is then derived from it. Starting from E, the invention introduces a derivation based on the matrix norm and the matrix trace and proposes a new method of solving the rotation matrix R and the translation matrix T by decomposing the essential matrix.
The same procedure applies to the calibration between the two cameras of the binocular pair.
The theoretical derivation of the calibration method is as follows:
Let pl and pr be a pair of corresponding feature points in the left and right images, and let Ml and Mr be the intrinsic matrices of the two imaging devices. According to the epipolar geometry of computer vision:

    pr^T · Mr^(-T) · E · Ml^(-1) · pl = 0    (1)

The 3×3 matrix E is called the essential matrix. Formula (1) can be rewritten as:

    pr^T · F · pl = 0    (2)

where pl = [x1 y1]^T and pr = [x2 y2]^T are the image coordinates (used in homogeneous form [x y 1]^T in the formulas above), and

    F = Mr^(-T) · E · Ml^(-1)    (3)

is the 3×3 fundamental matrix.
The fundamental matrix F can be solved from eight or more pairs of matched points, and the essential matrix E is then recovered from F through formula (3) as E = Mr^T · F · Ml. To reduce the error, as many matched points as possible are used in the solution.
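By way of illustration, a minimal Python sketch of this step is given below (OpenCV and NumPy assumed; the argument names pts_l, pts_r, M_l, M_r are hypothetical): the fundamental matrix F is computed from eight or more matched points with the eight-point algorithm, and the essential matrix E is recovered from it through the relation E = Mr^T · F · Ml implied by formula (3).

    import numpy as np
    import cv2

    def fundamental_and_essential(pts_l, pts_r, M_l, M_r):
        """pts_l, pts_r: Nx2 arrays of matched pixel coordinates (N >= 8).
        M_l, M_r: 3x3 intrinsic matrices of the two imaging devices.
        Returns the fundamental matrix F and the essential matrix E."""
        pts_l = np.asarray(pts_l, dtype=np.float64)
        pts_r = np.asarray(pts_r, dtype=np.float64)
        # Eight-point algorithm; with many points the overdetermined system reduces the error.
        F, mask = cv2.findFundamentalMat(pts_l, pts_r, cv2.FM_8POINT)
        # Epipolar constraint: pr^T F pl = 0 with F = Mr^-T E Ml^-1, hence E = Mr^T F Ml.
        E = M_r.T @ F @ M_l
        return F, E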
The quantities of greatest interest in the calibration are the rotation and translation matrices between the two imaging devices.
Properties of the essential matrix:

    E = R·S    (4)

    E·T = R·S·T = 0    (5)

where R is the rotation matrix between the devices and S is the antisymmetric matrix constructed from the translation vector T.
The value of T can be obtained by singular value decomposition. Note that this T is normalized: it lacks a scale factor and is not yet the true translation between the two devices. The scale factor can only be recovered after R has been solved.
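By way of illustration, a minimal Python sketch of this step (NumPy assumed): under the convention E = R·S used here, E·T = 0, so the normalized T is the right singular vector of E associated with its smallest singular value, determined only up to sign and scale.

    import numpy as np

    def translation_from_essential(E):
        """Normalized translation T (up to sign and scale) under the convention
        E = R * skew(T), so that E @ T = 0: T is the right null vector of E."""
        U, d, Vt = np.linalg.svd(E)
        T = Vt[-1, :]          # right singular vector for the (near-)zero singular value
        return T / np.linalg.norm(T)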
After T has been found, write T = [t1 t2 t3]^T; what remains is to solve the rotation matrix R. Solving R is in fact an optimization problem; the invention innovatively introduces a derivation based on the matrix norm and the matrix trace, as follows:

    min ||E − R·S||    (6)

The problem converts to:

    max trace(M·R),  where M = S·E^T    (7)

Carry out a singular value decomposition of M:

    M = U·D·V^T    (8)

    trace(M·R) = trace(D·V^T·R·U) = trace(D·Z),  with Z = V^T·R·U    (9)

Because Z = V^T·R·U is an orthogonal matrix, Zii ≤ 1, so trace(M·R) is maximal when Z = V^T·R·U = I, that is:

    R = V·U^T    (10)

It was found in the experiments that R has two solutions. The reason is a property of the essential matrix: one of its singular values is zero and the other two are equal, so for the singular value decomposition

    M = [U1 U2 U3]·D·[V1 V2 V3]^T    (11)

the decomposition holds for either sign of U3. How is the correct R determined? The entry R(1,1) represents the projection of the X-axis of the left device onto the X-axis of the right device and should be positive; if R(1,1) turns out to be negative during the solution, U3 is negated and the computation is repeated, which then gives the correct result. R is thus obtained.
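By way of illustration, a minimal Python sketch of this derivation (NumPy assumed; T is the normalized translation obtained above): M = S·E^T is decomposed by SVD, R = V·U^T, and the sign of U3 is flipped when R(1,1) comes out negative, following the rule stated above.

    import numpy as np

    def rotation_from_essential(E, T):
        """Rotation R between the two devices from E = R * skew(T),
        obtained by maximizing trace(S E^T R)."""
        t1, t2, t3 = T
        S = np.array([[0.0, -t3, t2],
                      [t3, 0.0, -t1],
                      [-t2, t1, 0.0]])   # antisymmetric matrix of the normalized translation
        M = S @ E.T                      # objective: maximize trace(M R)
        U, _, Vt = np.linalg.svd(M)
        R = Vt.T @ U.T                   # R = V U^T maximizes trace(D V^T R U)
        if R[0, 0] < 0:                  # R(1,1) should be positive (left X-axis projected on right X-axis)
            U[:, 2] = -U[:, 2]           # take the other sign of U3 and recompute
            R = Vt.T @ U.T
        return R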
As noted above, T lacks a scale factor. If the calibrated R and T are used to reconstruct two corresponding points, the required scale factor is obtained by dividing the true distance between the two three-dimensional points by the computed distance; to reduce the error, several groups of data can be computed and averaged.
To verify the correctness of the essential-matrix decomposition described above, 20 groups of R and T matrices were constructed in the experiments; the essential matrix was computed with formula (4) and then decomposed with the method above to recover R and T. The recovered values were identical to the constructed ones, confirming the correctness of the decomposition.
For the scanning and imaging of the laser radar, the invention designs the three-dimensional target as a spherical target. Whether viewed by the laser radar or by the camera, a sphere appears as a circular spot from any scanning or shooting angle: the sphere point cloud scanned by the laser radar necessarily projects to a circular spot on the two-dimensional image. Extracting the centre coordinates of the circular spots is therefore straightforward, and since the layout order of the spots is the same in the projected laser-radar image and in the camera image, the centre points can be matched automatically and very quickly.
Because the circle centres can be matched automatically, many spheres can be used on the target to reduce the error. The extraction of each circle centre is independent; the more feature points there are, the smaller the error of the essential matrix E obtained with the eight-point algorithm, and no accumulated error is introduced as in conventional methods. Using the proposed decomposition of the essential matrix, the rotation matrix R and the translation matrix T between the devices are then obtained accurately.
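By way of illustration, a Python sketch of the circle-centre extraction (OpenCV 4 and NumPy assumed; the blur kernel, threshold and row-binning values are placeholder assumptions): the image is filtered, binarized, and the centroid of each blob is taken as a circle centre; sorting the centres row by row keeps the layout order identical in the projected laser-radar image and in the camera image, so the two sets can be matched one to one.

    import numpy as np
    import cv2

    def extract_circle_centers(gray, thresh=128, min_area=20.0, row_bin=50.0):
        """Extract the centre coordinates of the circular spots from a grayscale image."""
        blurred = cv2.GaussianBlur(gray, (5, 5), 0)                         # filtering
        _, binary = cv2.threshold(blurred, thresh, 255, cv2.THRESH_BINARY)  # binarization
        contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        centers = []
        for c in contours:
            m = cv2.moments(c)
            if m["m00"] > min_area:                                         # discard tiny blobs / noise
                centers.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))  # blob centroid
        # sort row by row so the layout order matches between the two images
        centers.sort(key=lambda p: (round(p[1] / row_bin), p[0]))
        return np.array(centers, dtype=np.float64)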
Based on the above design concept and theoretical analysis, the present invention finally provides the following solution:
A joint measurement method based on a laser radar and a binocular visible-light camera, characterized by comprising the following steps:
1) Build the measuring system and set up the coordinate systems; the experimental equipment comprises a laser radar and a binocular visible-light camera;
2) Calibration of the laser radar and the binocular visible-light camera
A spherical target is set up, i.e. a plurality of spheres are arranged on a flat substrate, with known spacing between the spheres;
The laser radar scans the spherical target to obtain three-dimensional point cloud data; the laser radar is treated as a camera with fixed intrinsic parameters and the point cloud is projected to obtain a first group of two-dimensional images;
One of the cameras of the binocular visible-light camera images the spherical target to obtain a second group of two-dimensional images;
The centre coordinates of all spheres in the two groups of two-dimensional images are obtained by image processing; based on the two groups of centre coordinates and the epipolar geometry of computer vision, the fundamental matrix F between the laser radar and the binocular visible-light camera is first computed with the eight-point algorithm, and the essential matrix E is then derived;
According to the relation E·T = R·S·T = 0, where S is the antisymmetric matrix constructed from the translation vector T, the translation matrix T between the laser radar and the binocular visible-light camera is solved from the essential matrix E;
Then, starting from E = R·S, an optimization derivation based on the matrix norm and the matrix trace shows that ||E − R·S|| is minimal when trace(S·E^T·R) is maximal; the rotation matrix R is computed as R = V·U^T, where V and U are the two unitary matrices obtained from the singular value decomposition of S·E^T. This completes the calibration of the laser radar and the binocular visible-light camera;
3) Calibration between the cameras of the binocular visible-light camera
A target is set up and imaged by each of the two cameras of the binocular visible-light camera, giving two groups of two-dimensional images; the coordinates of corresponding feature points of the target in the two groups of images are obtained by image processing; based on the two groups of feature point coordinates and the epipolar geometry of computer vision, the fundamental matrix F between the two cameras is first computed with the eight-point algorithm, and the essential matrix E is then derived;
According to the relation E·T = R·S·T = 0, where S is the antisymmetric matrix constructed from the translation vector T, the translation matrix T between the cameras of the binocular visible-light camera is solved from the essential matrix E;
Then, starting from E = R·S, the same optimization derivation based on the matrix norm and the matrix trace shows that ||E − R·S|| is minimal when trace(S·E^T·R) is maximal; the rotation matrix R is computed as R = V·U^T, where V and U are the two unitary matrices obtained from the singular value decomposition of S·E^T. This completes the calibration between the cameras of the binocular visible-light camera;
It should be noted that the numbering of step 2) and step 3) above only indicates that they are two independent processes and does not impose an order: step 2) may be carried out before step 3) or vice versa; moreover, if a spherical target is chosen for both, the two steps can be carried out simultaneously.
4) Measurement and fusion
The laser radar scans the measured target to obtain a first group of three-dimensional point cloud data;
The binocular visible-light camera images the measured target; three-dimensional reconstruction is carried out with the translation matrix T and the rotation matrix R calibrated in step 3), giving a second group of three-dimensional point cloud data;
According to the translation matrix T and the rotation matrix R calibrated in step 2), the first group of three-dimensional point cloud data is translated and rotated accordingly, completing the fusion of the two groups of three-dimensional point cloud data (a sketch of this fusion step is given below).
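By way of illustration, a minimal Python sketch of the fusion step (NumPy assumed; the function and argument names are hypothetical, and R_2, T_2 are taken to map laser-radar coordinates into the left-camera coordinate system):

    import numpy as np

    def fuse_point_clouds(lidar_cloud, binocular_cloud, R_2, T_2):
        """lidar_cloud, binocular_cloud: Nx3 / Mx3 arrays of 3D points.
        R_2 (3x3), T_2 (3,): rotation and translation calibrated in step 2),
        assumed to map laser-radar coordinates into the left-camera frame."""
        lidar_in_cam = (R_2 @ lidar_cloud.T).T + T_2   # rotate, then translate every lidar point
        return np.vstack([binocular_cloud, lidar_in_cam])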
As a further optimization, the target in step 3) may be a checkerboard calibration board. Compared with the spherical target set in step 2), a conventional checkerboard calibration board in step 3) makes the final pose measurement more accurate.
As a further optimization, the spheres of the spherical target set in step 2) are of identical size and are equally spaced on the substrate; this makes the calibration of step 2) more accurate, so that the final point cloud fusion is better.
As a further optimization, the translation matrix T may be solved from the essential matrix E by means of SVD decomposition; many other conventional solution methods also exist in the prior art.
As a further optimization, the two cameras of the binocular visible-light camera are placed on the left and right sides of the laser radar. Since the joint measurement fuses only the overlapping region, this arrangement enlarges the overlap, so that the finally fused range is more complete.
As a further optimization, if the binocular cameras used are not industrial cameras with known intrinsic parameters, the intrinsic parameters of each single camera of the binocular pair are preferably calibrated before step 2), specifically with Zhang's calibration method (a sketch is given below).
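By way of illustration, a Python sketch of this single-camera intrinsic calibration with Zhang's method (OpenCV and NumPy assumed; the image path pattern, board size and square size are placeholder assumptions):

    import glob
    import numpy as np
    import cv2

    def calibrate_intrinsics(image_glob, board_size=(9, 6), square_mm=25.0):
        """Zhang-style intrinsic calibration from checkerboard images."""
        # 3D coordinates of the checkerboard corners in the board plane (Z = 0)
        objp = np.zeros((board_size[0] * board_size[1], 3), np.float32)
        objp[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2) * square_mm
        obj_points, img_points, img_size = [], [], None
        for path in glob.glob(image_glob):
            gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
            found, corners = cv2.findChessboardCorners(gray, board_size)
            if found:
                corners = cv2.cornerSubPix(
                    gray, corners, (11, 11), (-1, -1),
                    (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
                obj_points.append(objp)
                img_points.append(corners)
                img_size = gray.shape[::-1]
        rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_points, img_points, img_size, None, None)
        return K, dist   # intrinsic matrix and distortion coefficients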
The technical effects of the invention are as follows:
1. By fusing the laser radar point cloud with the binocular stereo-vision point cloud, the present invention compensates the respective shortcomings of the two sensors and obtains accurate and dense three-dimensional information of the target; it yields not only accurate position and attitude information but also reconstructs the texture and feature information of the target surface. This has high application value in military fields such as spacecraft rendezvous and docking and the capture of non-cooperative satellites, as well as in civil fields such as workpiece detection and autonomous driving.
2. For the fusion of multiple point clouds, existing methods mostly perform matching based on point coordinates and the similarity of local features, such as iterative closest point, thin-plate-spline matching, coherent point drift, and matching based on global and local features. Since one group of point cloud data often contains millions of points, registering every fusion anew with such traditional methods would require a huge amount of work at very low efficiency.
In the present invention the laser radar is instead treated as a camera with fixed intrinsic parameters, its three-dimensional point cloud is projected directly into a two-dimensional image, and image processing together with matching and calibration between two-dimensional images yields the rotation and translation between the laser radar and the binocular camera. Point clouds from any number of groups can then be fused directly with the calibrated rotation and translation. The calibration only needs to be performed once; the time and space complexity are far smaller than those of the traditional methods, greatly reducing the workload.
3. The present invention innovatively solves the rotation matrix through the matrix norm and the matrix trace. Compared with the conventional approach of first solving several groups of R and T and then optimizing them, the proposed solution requires less computation, calibrates faster, introduces no additional accumulated error, and is more accurate, so that the final pose measurement is more precise.
4. The three-dimensional target of the invention is designed as a spherical target, which makes the extraction of the corresponding feature point coordinates of the target simple and fast, and many spheres can be used on the target to reduce the error.
Detailed description of the invention
Fig. 1 is the general measurement flow chart.
Fig. 2 is the flow chart of the calibration between the camera and the laser radar.
Fig. 3 is the flow chart of the binocular measurement.
Fig. 4 is the checkerboard calibration board used for the binocular camera calibration.
Fig. 5 is the three-dimensional target used for the joint calibration of the laser radar and the camera.
Fig. 6 is the laser radar three-dimensional point cloud.
Fig. 7 is the three-dimensional point cloud cropped from the laser radar point cloud at the target.
Fig. 8 is the two-dimensional image obtained by projecting the laser radar point cloud of the target.
Fig. 9 is the filtered image.
Fig. 10 is the image after binarization.
Fig. 11 shows the 40 extracted circle centres.
Fig. 12 shows the target centres extracted from the processed camera image.
Specific embodiment
The present invention is described in detail below with reference to the accompanying drawings and specific embodiments.
In the measuring system built in this embodiment, the laser radar is mounted between the two cameras of the binocular pair. The global coordinate system of the binocular measuring system is established on the left camera, with its origin at the optical centre: the u direction of the camera imaging plane is the X direction, the v direction is the Y direction, and the optical axis is the Z direction.
The cameras used in the measuring system are JHSM300f cameras with a pixel size of 3.2 × 3.2 μm and a resolution of 2048 × 1536; two of them, placed arbitrarily, form the binocular measuring system.
The laser radar is a FARO Focus3D X330 HDR, a high-speed three-dimensional scanner with an extra-long scanning range of 0.6-330 m. Its ranging error at 10 m and 25 m is ±2 mm, at reflectivities of 90% and 10% respectively; the resolution is greater than 70 million colour pixels; both the vertical and the horizontal angular resolution are 0.009 degrees.
As shown in Fig. 1, the overall measurement procedure is as follows:
First, the intrinsic parameters of each single camera of the binocular pair are calibrated with Zhang's method, and the rotation and translation between the two cameras are calibrated with the matrix solution method proposed by the invention. The rotation and translation between the left binocular camera and the laser radar are then calibrated. Using these calibration results, the binocular camera measurement is performed first to reconstruct the three-dimensional point cloud of the target, and finally the binocular point cloud and the laser radar point cloud are fused.
As shown in Fig. 2, the calibration procedure between the binocular camera (one of its cameras) and the laser radar is as follows:
The laser radar is first treated as a camera with fixed intrinsic parameters: its focal length and principal point are assumed, and the scanned three-dimensional point cloud is projected into a 2D image. After filtering and binarization, the three-dimensional target can be extracted and its centres located. The same method is applied to the target image of the left binocular camera to extract the target centres. From the two groups of target centres, the fundamental matrix F is solved with the eight-point algorithm using the epipolar geometry principle, the essential matrix E is derived, and the SVD decomposition of the essential matrix finally gives the rotation and translation between the two devices.
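By way of illustration, a minimal Python sketch of the point-cloud projection (NumPy assumed; the focal length, principal point and image size stand for the assumed intrinsic parameters and are placeholders):

    import numpy as np

    def project_cloud_to_image(cloud, fx=1000.0, fy=1000.0, cx=512.0, cy=384.0, size=(768, 1024)):
        """Project an Nx3 laser-radar point cloud (X right, Y down, Z forward)
        into a binary 2D image under an assumed pinhole model."""
        img = np.zeros(size, dtype=np.uint8)
        pts = cloud[cloud[:, 2] > 0]                      # keep points in front of the assumed camera
        u = np.round(fx * pts[:, 0] / pts[:, 2] + cx).astype(int)
        v = np.round(fy * pts[:, 1] / pts[:, 2] + cy).astype(int)
        keep = (u >= 0) & (u < size[1]) & (v >= 0) & (v < size[0])
        img[v[keep], u[keep]] = 255                       # splat every projected point as a white pixel
        return img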
The calibration between the two binocular cameras is similar to the above method.
The flow of the binocular camera measurement is shown in Fig. 3. The camera calibration (calibration of the binocular intrinsic parameters and calibration between the two cameras) is performed first; feature extraction and matching are then applied to the left and right images to obtain the disparity map, from which the three-dimensional point cloud of the target is reconstructed.
The invention is described below with reference to a specific test case; the steps are as follows:
Step 1) Build the experimental equipment and set up the coordinate systems.
Step 2) Calibration of the binocular camera measuring system:
The binocular measuring system consists of two JHSM300f cameras. The cameras are calibrated with Zhang's method and the checkerboard calibration board shown in Fig. 4; the calibration results are as follows:
Left camera intrinsic matrix:
Right camera intrinsic matrix:
The essential matrix is solved using 1458 groups of corresponding points.
The rotation and translation matrices are then obtained with the decomposition method proposed by the invention. The decomposition gives the following rotation and translation from the right camera to the left camera:
To verify that the rotation and translation solved by the invention are better, the three-dimensional coordinates of these 1458 points are reconstructed with the obtained matrices, the error between the true distance and the computed value is compared, and the result is contrasted with the conventional method built into Matlab.
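By way of illustration, a Python sketch of this verification (OpenCV and NumPy assumed; K_l, K_r and the matched point arrays are placeholders, and R, T are taken to map left-camera coordinates into the right-camera frame): corresponding points are triangulated with the calibrated pose, and the distance between two reconstructed points is compared with the known true distance.

    import numpy as np
    import cv2

    def reconstruct_points(pts_l, pts_r, K_l, K_r, R, T):
        """Triangulate Nx2 matched pixel points into 3D points in the left-camera frame."""
        P_l = K_l @ np.hstack([np.eye(3), np.zeros((3, 1))])      # left projection matrix
        P_r = K_r @ np.hstack([R, T.reshape(3, 1)])               # right projection matrix
        X_h = cv2.triangulatePoints(P_l, P_r,
                                    pts_l.T.astype(np.float64),
                                    pts_r.T.astype(np.float64))
        return (X_h[:3] / X_h[3]).T                               # de-homogenize to Nx3

    def distance_error(X, i, j, true_mm):
        """Compare the reconstructed distance between points i and j with the known distance."""
        d = np.linalg.norm(X[i] - X[j])
        return d, d - true_mm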
The rotation and translation matrices solved by the conventional method built into Matlab are as follows:
T = [-332.284802560441, -2.10062465311190, 6.67091696474354]
Table 1 Comparison of the solution methods
True value (mm) The method of the present invention (mm) Matlab conventional method (mm)
50 49.7691 50.2390
100 99.8966 100.6317
150 149.8056 150.5668
200 199.8594 200.4737
70.7107 70.3406 71.1591
400 399.6709 400.6941
It is apparent from the data in Table 1 that the results solved by the method of the present invention are considerably more accurate than those of the built-in Matlab method.
Step 3) Calibration of the laser radar and the left binocular camera
The scan data of the laser radar are first obtained with the spherical target shown in Fig. 5, as shown in Fig. 6; the three-dimensional data at the target are then cropped out separately, as shown in Fig. 7.
The laser radar is now treated as a camera and a suitable intrinsic matrix is set for it:
The target point cloud is projected into a 2D image, as shown in Fig. 8. The image is filtered, giving the image shown in Fig. 9, and binarization then gives the image shown in Fig. 10. The circle centres are extracted, as shown in Fig. 11.
The left camera image is acquired, a suitable threshold is set for binarization, and the target centres are extracted, as shown in Fig. 12.
The essential matrix is solved from the two groups of target centres with the eight-point algorithm and then decomposed to give the rotation and translation between the two devices.
T = [0.203788030063619, -0.124632794510692, -0.0517513804995891]
Step 4) Point cloud fusion
According to the calibration results, the point cloud reconstructed by the binocular measuring system is kept as the reference and the laser radar point cloud data are translated and rotated accordingly, completing the fusion of the two groups of three-dimensional point clouds.
The results show that:
The binocular measuring system reconstructs well the strongly textured regions of the target, such as the distinctly red and blue rings and the corner regions, but the weakly textured regions elsewhere are difficult to reconstruct.
The point cloud obtained by the laser radar covers a larger range, but the laser radar senses only distance and is insensitive to colour and surface features, so the scanned point cloud contains only range information.
By fusing the two groups of point cloud data, the respective shortcomings can be well compensated and more accurate and richer three-dimensional information of the non-cooperative target is obtained.
To verify the measurement accuracy of this experiment, a high-precision rotary stage (Zolix RAK200, positioning error less than 0.005 degrees) was used. By comparing the computed values with the actual rotations and translations, the algorithm achieves, at a distance of 1 m, a relative distance measurement error of less than 0.05 m @ 1 m and a relative angle measurement error of less than 0.4° @ 1 m. These high-accuracy results over several groups of data also verify the correctness of the proposed method of decomposing the essential matrix to obtain the rotation and translation matrices; the method has high application value in both military and civil fields.
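By way of illustration, a minimal Python sketch of how the angular error between a computed rotation and the rotary-stage ground truth can be evaluated (NumPy assumed):

    import numpy as np

    def rotation_angle_error_deg(R_true, R_calc):
        """Angle of the relative rotation R_true^T R_calc, i.e. the rotational error in degrees."""
        c = (np.trace(R_true.T @ R_calc) - 1.0) / 2.0
        return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))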

Claims (7)

1. A joint measurement method based on a laser radar and a binocular visible-light camera, characterized by comprising the following steps:
1) building a measuring system and setting up coordinate systems, the measuring system comprising a laser radar and a binocular visible-light camera;
2) calibration of the laser radar and the binocular visible-light camera
setting up a spherical target, i.e. arranging a plurality of spheres on a flat substrate with known spacing between the spheres;
scanning the spherical target with the laser radar to obtain three-dimensional point cloud data, treating the laser radar as a camera with fixed intrinsic parameters, and projecting the three-dimensional point cloud data to obtain a first group of two-dimensional images;
imaging the spherical target with one of the cameras of the binocular visible-light camera to obtain a second group of two-dimensional images;
obtaining the centre coordinates of all spheres in the two groups of two-dimensional images by image processing; based on the two groups of centre coordinates and the epipolar geometry of computer vision, first computing the fundamental matrix F between the laser radar and the binocular visible-light camera with the eight-point algorithm, and then deriving the essential matrix E;
according to the relation E·T = R·S·T = 0, where S is the antisymmetric matrix constructed from the translation vector T, solving the translation matrix T between the laser radar and the binocular visible-light camera from the essential matrix E;
then, starting from E = R·S, deriving by an optimization based on the matrix norm and the matrix trace that ||E − R·S|| is minimal when trace(S·E^T·R) is maximal, and computing the rotation matrix R as R = V·U^T, where V and U are the two unitary matrices obtained from the singular value decomposition of S·E^T, thereby completing the calibration of the laser radar and the binocular visible-light camera;
3) calibration between the cameras of the binocular visible-light camera
setting up a target and imaging it with each of the two cameras of the binocular visible-light camera to obtain two groups of two-dimensional images; obtaining the coordinates of corresponding feature points of the target in the two groups of two-dimensional images by image processing; based on the two groups of feature point coordinates and the epipolar geometry of computer vision, first computing the fundamental matrix F' between the two cameras of the binocular visible-light camera with the eight-point algorithm, and then deriving the essential matrix E';
according to the relation E'·T' = R'·S'·T' = 0, where S' is the antisymmetric matrix constructed from the translation vector T', solving the translation matrix T' between the cameras of the binocular visible-light camera from the essential matrix E';
then, starting from E' = R'·S', deriving by the same optimization based on the matrix norm and the matrix trace that ||E' − R'·S'|| is minimal when trace(S'·E'^T·R') is maximal, and computing the rotation matrix R' as R' = V'·U'^T, where V' and U' are the two unitary matrices obtained from the singular value decomposition of S'·E'^T, thereby completing the calibration between the cameras of the binocular visible-light camera;
4) measurement and fusion
scanning the measured target with the laser radar to obtain a first group of three-dimensional point cloud data;
imaging the measured target with the binocular visible-light camera and performing three-dimensional reconstruction with the translation matrix T' and the rotation matrix R' calibrated in step 3) to obtain a second group of three-dimensional point cloud data;
according to the translation matrix T and the rotation matrix R calibrated in step 2), translating and rotating the first group of three-dimensional point cloud data accordingly, thereby completing the fusion of the two groups of three-dimensional point cloud data.
2. The joint measurement method based on a laser radar and a binocular visible-light camera according to claim 1, characterized in that: the target in step 3) is a checkerboard calibration board.
3. The joint measurement method based on a laser radar and a binocular visible-light camera according to claim 1, characterized in that: the spheres on the substrate of the spherical target are of identical size and are equally spaced.
4. The joint measurement method based on a laser radar and a binocular visible-light camera according to claim 1, characterized in that: the translation matrices T and T' are solved from the essential matrices E and E' respectively by means of SVD decomposition.
5. The joint measurement method based on a laser radar and a binocular visible-light camera according to claim 1, characterized in that: the two cameras of the binocular visible-light camera are placed on the left and right sides of the laser radar respectively.
6. The joint measurement method based on a laser radar and a binocular visible-light camera according to claim 1, characterized in that: before step 2), the intrinsic parameters of each single camera of the binocular visible-light camera are calibrated.
7. The joint measurement method based on a laser radar and a binocular visible-light camera according to claim 6, characterized in that: the calibration of the intrinsic parameters of each single camera of the binocular visible-light camera is completed with Zhang's calibration method.
CN201810240140.4A 2018-03-22 2018-03-22 Joint measurement method based on laser radar and binocular visible-light camera Active CN108828606B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810240140.4A CN108828606B (en) 2018-03-22 2018-03-22 Joint measurement method based on laser radar and binocular visible-light camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810240140.4A CN108828606B (en) 2018-03-22 2018-03-22 Joint measurement method based on laser radar and binocular visible-light camera

Publications (2)

Publication Number Publication Date
CN108828606A CN108828606A (en) 2018-11-16
CN108828606B true CN108828606B (en) 2019-04-30

Family

ID=64154253

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810240140.4A Active CN108828606B (en) 2018-03-22 2018-03-22 Joint measurement method based on laser radar and binocular visible-light camera

Country Status (1)

Country Link
CN (1) CN108828606B (en)

Families Citing this family (60)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109472831A (en) * 2018-11-19 2019-03-15 东南大学 Obstacle recognition range-measurement system and method towards road roller work progress
CN109614889B (en) * 2018-11-23 2020-09-18 华为技术有限公司 Object detection method, related device and computer storage medium
CN109785431A (en) * 2018-12-18 2019-05-21 天津理工大学 A kind of road ground three-dimensional feature acquisition method and device based on laser network
CN109598765B (en) * 2018-12-21 2023-01-03 浙江大学 Monocular camera and millimeter wave radar external parameter combined calibration method based on spherical calibration object
CN109901123B (en) * 2018-12-24 2023-12-01 文远知行有限公司 Sensor calibration method, device, computer equipment and storage medium
CN111383279B (en) * 2018-12-29 2023-06-20 阿里巴巴集团控股有限公司 External parameter calibration method and device and electronic equipment
CN109816774B (en) * 2018-12-31 2023-11-17 江苏天策机器人科技有限公司 Three-dimensional reconstruction system and three-dimensional reconstruction method based on unmanned aerial vehicle
CN110021046B (en) * 2019-03-05 2021-11-19 中国科学院计算技术研究所 External parameter calibration method and system for camera and laser radar combined sensor
CN109934877B (en) * 2019-03-15 2023-06-09 苏州天准科技股份有限公司 Calibration method for combined calibration of 2D laser and binocular camera
CN109828250B (en) * 2019-03-28 2020-07-21 森思泰克河北科技有限公司 Radar calibration method, calibration device and terminal equipment
CN110223226B (en) * 2019-05-07 2021-01-15 中国农业大学 Panoramic image splicing method and system
CN110349221A (en) * 2019-07-16 2019-10-18 北京航空航天大学 A kind of three-dimensional laser radar merges scaling method with binocular visible light sensor
CN110349257B (en) * 2019-07-16 2020-02-28 四川大学 Phase pseudo mapping-based binocular measurement missing point cloud interpolation method
CN110412564A (en) * 2019-07-29 2019-11-05 哈尔滨工业大学 A kind of identification of train railway carriage and distance measuring method based on Multi-sensor Fusion
CN110579764B (en) * 2019-08-08 2021-03-09 北京三快在线科技有限公司 Registration method and device for depth camera and millimeter wave radar, and electronic equipment
CN110456377B (en) * 2019-08-15 2021-07-30 中国人民解放军63921部队 Satellite foreign matter attack detection method and system based on three-dimensional laser radar
CN110599546A (en) * 2019-08-28 2019-12-20 贝壳技术有限公司 Method, system, device and storage medium for acquiring three-dimensional space data
CN110675436A (en) * 2019-09-09 2020-01-10 中国科学院微小卫星创新研究院 Laser radar and stereoscopic vision registration method based on 3D feature points
CN112785649A (en) * 2019-11-11 2021-05-11 北京京邦达贸易有限公司 Laser radar and camera calibration method and device, electronic equipment and medium
US10859684B1 (en) 2019-11-12 2020-12-08 Huawei Technologies Co., Ltd. Method and system for camera-lidar calibration
WO2021092805A1 (en) * 2019-11-13 2021-05-20 中新智擎科技有限公司 Multi-modal data fusion method and apparatus, and intellignet robot
CN110969669B (en) * 2019-11-22 2021-12-03 大连理工大学 Visible light and infrared camera combined calibration method based on mutual information registration
CN110930442B (en) * 2019-11-26 2020-07-31 广东技术师范大学 Method and device for determining positions of key points in robot hand-eye calibration based on calibration block
CN113376617B (en) * 2020-02-25 2024-04-05 北京京东乾石科技有限公司 Method, device, storage medium and system for evaluating accuracy of radar calibration result
CN111340797B (en) * 2020-03-10 2023-04-28 山东大学 Laser radar and binocular camera data fusion detection method and system
CN111487642A (en) * 2020-03-10 2020-08-04 国电南瑞科技股份有限公司 Transformer substation inspection robot positioning navigation system and method based on three-dimensional laser and binocular vision
CN111508020B (en) * 2020-03-23 2024-05-07 北京国电富通科技发展有限责任公司 Cable three-dimensional position calculation method and device for fusing image and laser radar
WO2021195939A1 (en) * 2020-03-31 2021-10-07 深圳市大疆创新科技有限公司 Calibrating method for external parameters of binocular photographing device, movable platform and system
CN111880192B (en) * 2020-07-31 2021-06-29 湖南国天电子科技有限公司 Ocean monitoring buoy device and system based on water surface and underwater target early warning
CN112648998A (en) * 2020-08-06 2021-04-13 成都道克科技有限公司 Unmanned aerial vehicle cooperative target autonomous guidance measurement method based on shape and color
CN112212784B (en) * 2020-09-01 2022-02-08 长春工程学院 Method and system for fusing coordinates of point laser displacement sensor and binocular camera
CN112101222A (en) * 2020-09-16 2020-12-18 中国海洋大学 Sea surface three-dimensional target detection method based on unmanned ship multi-mode sensor
CN112258517A (en) * 2020-09-30 2021-01-22 无锡太机脑智能科技有限公司 Automatic map repairing method and device for laser radar grid map
CN112416000B (en) * 2020-11-02 2023-09-15 北京信息科技大学 Unmanned equation motorcycle race environment sensing and navigation method and steering control method
CN112099031B (en) * 2020-11-09 2021-02-02 天津天瞳威势电子科技有限公司 Vehicle distance measuring method and device
CN112379390A (en) * 2020-11-18 2021-02-19 成都通甲优博科技有限责任公司 Pose measurement method, device and system based on heterogeneous data and electronic equipment
CN112489110A (en) * 2020-11-25 2021-03-12 西北工业大学青岛研究院 Optical hybrid three-dimensional imaging method for underwater dynamic scene
CN112529965A (en) * 2020-12-08 2021-03-19 长沙行深智能科技有限公司 Calibration method and device for laser radar and monocular camera
CN112598729B (en) * 2020-12-24 2022-12-23 哈尔滨工业大学芜湖机器人产业技术研究院 Target object identification and positioning method integrating laser and camera
CN112634318B (en) * 2020-12-31 2022-11-08 中国海洋大学 Teleoperation system and method for underwater maintenance robot
CN112902874B (en) * 2021-01-19 2022-09-27 中国汽车工程研究院股份有限公司 Image acquisition device and method, image processing method and device and image processing system
CN112907681A (en) * 2021-02-26 2021-06-04 北京中科慧眼科技有限公司 Combined calibration method and system based on millimeter wave radar and binocular camera
CN113176544B (en) * 2021-03-05 2022-11-11 河海大学 Mismatching correction method for slope radar image and terrain point cloud
CN113109833A (en) * 2021-04-02 2021-07-13 北京理明智能科技有限公司 Bionic three-dimensional imaging system and method based on fusion of visible light and laser radar
CN113288424A (en) * 2021-04-14 2021-08-24 上海大学 Calibration plate and calibration method for field calibration of optical surgical navigation system
CN113379844B (en) * 2021-05-25 2022-07-15 成都飞机工业(集团)有限责任公司 Method for detecting large-range surface quality of airplane
CN113447948B (en) * 2021-05-28 2023-03-21 淮阴工学院 Camera and multi-laser-radar fusion method based on ROS robot
CN113205563B (en) * 2021-06-03 2022-11-18 河南科技大学 Automatic driving sensor combined calibration target and calibration method
CN113379894B (en) * 2021-06-10 2023-08-01 西安亚思工业自动化控制有限公司 Three-dimensional data model reconstruction method for bar
CN113538591B (en) * 2021-06-18 2024-03-12 深圳奥锐达科技有限公司 Calibration method and device for distance measuring device and camera fusion system
CN113776515B (en) * 2021-08-31 2022-06-10 南昌工学院 Robot navigation method and device, computer equipment and storage medium
CN113759365B (en) * 2021-10-11 2024-05-31 内蒙古方向图科技有限公司 Binocular vision three-dimensional optical image and foundation radar data fusion method and system
CN113763303B (en) * 2021-11-10 2022-03-18 北京中科慧眼科技有限公司 Real-time ground fusion method and system based on binocular stereo vision and intelligent terminal
CN114167866B (en) * 2021-12-02 2024-04-12 桂林电子科技大学 Intelligent logistics robot and control method
CN114723825A (en) * 2022-04-21 2022-07-08 中冶赛迪重庆信息技术有限公司 Camera coordinate mapping method, system, medium and electronic terminal used in unmanned driving scene
CN115239821B (en) * 2022-07-15 2023-03-31 小米汽车科技有限公司 Parameter information determination method and device, vehicle, electronic equipment and storage medium
CN115797185B (en) * 2023-02-08 2023-05-02 四川精伍轨道交通科技有限公司 Coordinate conversion method based on image processing and complex sphere
CN118115598A (en) * 2023-02-10 2024-05-31 深圳市中图仪器股份有限公司 Method for calibrating binocular camera and laser tracker with binocular camera
CN116027269B (en) * 2023-03-29 2023-06-06 成都量芯集成科技有限公司 Plane scene positioning method
CN117523105B (en) * 2023-11-24 2024-05-28 哈工大郑州研究院 Three-dimensional scene reconstruction method for laser radar and multi-camera data fusion

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104574376B (en) * 2014-12-24 2017-08-08 重庆大学 Avoiding collision based on binocular vision and laser radar joint verification in hustle traffic
CN104573646B (en) * 2014-12-29 2017-12-12 长安大学 Chinese herbaceous peony pedestrian detection method and system based on laser radar and binocular camera
CN107421465B (en) * 2017-08-18 2018-12-21 大连理工大学 A kind of binocular vision joining method based on laser tracker

Also Published As

Publication number Publication date
CN108828606A (en) 2018-11-16

Similar Documents

Publication Publication Date Title
CN108828606B (en) Joint measurement method based on laser radar and binocular visible-light camera
CN110044300B (en) Amphibious three-dimensional vision detection device and detection method based on laser
CN103971353B (en) Splicing method for measuring image data with large forgings assisted by lasers
CN105203044B (en) To calculate stereo vision three-dimensional measurement method and system of the laser speckle as texture
CN102927908B (en) Robot eye-on-hand system structured light plane parameter calibration device and method
CN105716542B (en) A kind of three-dimensional data joining method based on flexible characteristic point
CN109323650B (en) Unified method for measuring coordinate system by visual image sensor and light spot distance measuring sensor in measuring system
Shang et al. Measurement methods of 3D shape of large-scale complex surfaces based on computer vision: A review
CN107121109A (en) A kind of structure light parameter calibration device and method based on preceding plated film level crossing
CN108594245A (en) A kind of object movement monitoring system and method
CN105115560B (en) A kind of non-contact measurement method of cabin volume of compartment
Yang et al. Panoramic UAV surveillance and recycling system based on structure-free camera array
CN103292710A (en) Distance measuring method applying binocular visual parallax error distance-measuring principle
CN102692214A (en) Narrow space binocular vision measuring and positioning device and method
CN103759669A (en) Monocular vision measuring method for large parts
CN110132226A (en) The distance and azimuth angle measurement system and method for a kind of unmanned plane line walking
CN110044374A (en) A kind of method and odometer of the monocular vision measurement mileage based on characteristics of image
CN110889873A (en) Target positioning method and device, electronic equipment and storage medium
CN108682028A (en) Laser point cloud based on radiation correcting and optical image automatic matching method
CN108010125A (en) True scale three-dimensional reconstruction system and method based on line-structured light and image information
CN104165598A (en) Automatic reflection light spot positioning method for large-caliber mirror interferometer vertical type detection
CN108413865A (en) The secondary reflection minute surface type detection method converted based on three-dimensional measurement and coordinate system
CN112461204A (en) Method for satellite to dynamic flying target multi-view imaging combined calculation of navigation height
Wu et al. Passive ranging based on planar homography in a monocular vision system
CN105737803B (en) The two-sided battle array stereo mapping system of aviation

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant