CN113837981B - Automatic fusion method for acquiring three-dimensional point cloud by multi-terminal equipment - Google Patents

Automatic fusion method for acquiring three-dimensional point cloud by multi-terminal equipment

Info

Publication number
CN113837981B
CN113837981B (application CN202111390247.5A)
Authority
CN
China
Prior art keywords
point cloud
three-dimensional point cloud data
scaling factor
Prior art date
2021-11-23
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111390247.5A
Other languages
Chinese (zh)
Other versions
CN113837981A (en)
Inventor
陶天阳
姚立
李辉
刘佳
夏鑫成
张奔
展亚南
徐俊瑜
梁石
徐仁
周旭
王一迪
倪苏东
张亚军
郑欣洋
刘天宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CETC 28 Research Institute
Original Assignee
CETC 28 Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
2021-11-23
Publication date
2022-03-08
Application filed by CETC 28 Research Institute filed Critical CETC 28 Research Institute
Priority to CN202111390247.5A priority Critical patent/CN113837981B/en
Publication of CN113837981A publication Critical patent/CN113837981A/en
Application granted granted Critical
Publication of CN113837981B publication Critical patent/CN113837981B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00: Image enhancement or restoration
    • G06T 5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T 7/00: Image analysis
    • G06T 7/30: Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/60: Analysis of geometric attributes
    • G06T 7/66: Analysis of geometric attributes of image moments or centre of gravity
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10028: Range image; Depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Geometry (AREA)
  • Image Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses an automatic fusion method for three-dimensional point clouds acquired by multiple terminal devices, which solves the problem that point clouds acquired by different terminal instruments carry different scaling coefficients and are therefore difficult to fuse automatically. The invention provides a coarse-to-fine, multi-step method for calculating point cloud scaling coefficients: it computes the relative scaling coefficient between different point clouds, improves the efficiency of fusing point clouds obtained by different devices, and meets the need for complementary point clouds from different devices in a given scene. Crucially, the method places no restriction on how the point clouds are acquired, so it is highly general and highly automated.

Description

Automatic fusion method for acquiring three-dimensional point cloud by multi-terminal equipment
Technical Field
The invention relates to a three-dimensional point cloud fusion method, in particular to an automatic fusion method for acquiring three-dimensional point cloud by multi-terminal equipment.
Background
Three-dimensional imaging now has important applications in industry, national defense, and other fields. Current techniques include, but are not limited to, stereoscopic imaging, multi-view imaging, oblique photography, structured-light three-dimensional imaging, and lidar three-dimensional imaging. The key to all of them is to acquire depth information of an object or scene, or the corresponding three-dimensional point cloud data, so as to give a machine depth perception or a three-dimensional spatial data model. After years of development these technologies have matured and found considerable application in entertainment, industrial inspection, national defense, and other fields. For complex application scenarios such as urban building modeling, however, a single three-dimensional imaging technique can rarely capture complete point cloud data; point clouds acquired by several imaging techniques usually have to be fused to obtain relatively complete data. Point cloud fusion has therefore become a key technology for acquiring complete three-dimensional point clouds in these application scenarios.
Existing point cloud fusion technology mainly targets point clouds from a single (homologous) source. A typical sweeping robot, for example, reconstructs a three-dimensional map with a single lidar; because the map is acquired continuously, the homologous point clouds and the continuity of acquisition together stabilize map fusion. In some application scenarios, however, neither homologous point clouds nor continuous acquisition can be guaranteed, and the difficulty of point cloud fusion rises sharply. One bottleneck in fusing heterogeneous point clouds is the scaling factor: it may be impossible to estimate in advance, and differing calibration methods and imaging principles make it hard to keep the scaling of point clouds acquired by different imaging techniques highly consistent. In addition, because acquisition may be discontinuous across methods, or because the same method may use different coordinate systems, the relative spatial positions of the point clouds are not known in advance, so the scaling coefficient cannot be estimated directly by combining direct scaling with fine registration.
In summary, a key problem in fusing point clouds from different sources is that the scaling factor between them is difficult to estimate. Existing commercial software offers a rich set of point cloud tools, including scaling-factor estimation and point cloud fusion, but it requires considerable manual intervention to achieve good results. For application scenarios that must fuse large numbers of point clouds from different sources, this is extremely inefficient or simply inapplicable.
Disclosure of Invention
The purpose of the invention is as follows: in view of the deficiencies of the prior art, the invention aims to provide an automatic fusion method for three-dimensional point clouds acquired by multiple terminal devices.
In order to solve the technical problem, the invention discloses an automatic fusion method for acquiring three-dimensional point cloud by multi-terminal equipment, which comprises the following steps:
step 1, acquiring three-dimensional point cloud data A and three-dimensional point cloud data B from two different sources;
step 2, calculating the center of gravity c_A of the three-dimensional point cloud data A and the center of gravity c_B of the three-dimensional point cloud data B, and calculating the average distance d_A from each point a_i in A to c_A and the average distance d_B from each point in B to c_B; taking A as the reference, calculating the scaling factor s1 of B relative to A, and scaling B to obtain the scaled three-dimensional point cloud data B1;
step 3, performing coarse registration on A and B1 to obtain the corresponding feature point sequences {p_i} and {q_i}, wherein the subscript i denotes the i-th feature point; calculating the centers of gravity of the feature point sequences and the average distances d_P and d_Q from the feature points to those centers of gravity, obtaining the scaling factor s2 of B1 relative to A, and scaling B1 by s2 to obtain the point cloud B2;
step 4, performing fine registration on A and B2, calculating the scaling factor s3, and scaling B2 by s3 to obtain the point cloud B3;
step 5, around the scaling factor s3, setting a scaling range Δ and a step size t to obtain the corresponding scaling coefficient sequence {s3(k)}, wherein the maximum step index K = Δ/t, and scaling B3 with each coefficient to obtain the corresponding point clouds {C_k};
step 6, calculating through fine registration the mean square error e_k between each point cloud C_k and the point cloud A, selecting the scaling factor s4 corresponding to the minimum of the errors e_k as the final scaling factor, and scaling B3 by s4 to obtain the final three-dimensional point cloud data B4;
step 7, completing the fusion of the three-dimensional point cloud data A and the three-dimensional point cloud data B4 through fine registration.
In step 2 of the invention, the center of gravity c_A of the three-dimensional point cloud data A is calculated as:

c_A = (1/N) Σ_{i=1..N} a_i

wherein a_i is any point in A and N is the number of points in A; the center of gravity c_B of the three-dimensional point cloud data B is calculated in the same way.

In step 2 of the invention, the average distance d_A from each point a_i in A to the center of gravity c_A is calculated as:

d_A = (1/N) Σ_{i=1..N} ‖a_i − c_A‖

and the average distance d_B from each point of B to the center of gravity c_B is calculated in the same way.

In step 2, the average distance from the points of a point cloud to its center of gravity is taken as a representation of the scale of the point cloud; taking the point cloud A as the reference, the scaling factor s1 of B relative to A is calculated as:

s1 = d_A / d_B

In step 2 of the invention, the point cloud B is scaled according to the scaling factor s1 to obtain the scaled three-dimensional point cloud data:

B1 = s1 · B

In the present invention, step 3 comprises: performing coarse registration on the three-dimensional point cloud data A and B1 based on the RANSAC algorithm, and obtaining the corresponding feature point sequences {p_i} and {q_i}; further calculating the centers of gravity of the feature point sequences and the average distances d_P and d_Q from the feature points to those centers of gravity; and calculating the scaling factor s2 of B1 relative to A, then scaling B1 by s2 to obtain the point cloud B2.

In step 4 of the invention, the fine registration of the three-dimensional point cloud data A and B2 is performed by the ICP algorithm.

Step 5 of the invention comprises: around the scaling factor s3 obtained in step 4, setting a scaling range Δ and a step size t to obtain the scaling coefficient sequence {s3(k)}, wherein the maximum step index K = Δ/t; and scaling B3 with each coefficient of the sequence to obtain the corresponding three-dimensional point cloud data {C_k}.

In step 6 of the invention, the mean square error e_k between each point cloud C_k and the point cloud A is calculated through ICP fine registration, and the final scaling factor is selected by:

s4 = argmin_k e_k

that is, the scaling factor corresponding to the minimum error is taken as the final scaling factor, and B3 is scaled by s4 to obtain the final three-dimensional point cloud data B4.

In step 7 of the invention, the fusion of the three-dimensional point cloud data A and B4 is completed by fine registration according to the ICP algorithm.
The invention has the following advantages:
(1) The distance from each point of a point cloud to its center of gravity is fully exploited as a representation of the point cloud's scale, so the method neither restricts how the point clouds are acquired nor requires their scales to be known in advance.
(2) The complementary characteristics of coarse and fine registration are fully exploited: one coarse registration is combined with multiple fine registrations, and the exact relative scaling coefficient between the point clouds is determined progressively, from coarse to fine.
Drawings
The foregoing and/or other advantages of the invention will become further apparent from the following detailed description of the invention when taken in conjunction with the accompanying drawings.
Fig. 1 is a schematic flow chart of the automated three-dimensional point cloud fusion method of the present invention.
Detailed Description
The invention provides an automatic fusion method for acquiring three-dimensional point cloud by multi-terminal equipment, which comprises the following steps of:
Step 1: obtain three-dimensional point cloud data A and B from two different sources, that is, from different terminal devices or methods such as stereo vision, oblique photography, lidar, or structured-light scanning.
Step 2: compute the center of gravity c_A of the point cloud A by the formula

c_A = (1/N) Σ_{i=1..N} a_i

wherein a_i is any point in A and N is the number of points in A; the center of gravity c_B of the point cloud B is obtained in the same way. Further, compute by the formula

d_A = (1/N) Σ_{i=1..N} ‖a_i − c_A‖

the average distance d_A from each point a_i in A to the center of gravity c_A, and in the same way the average distance d_B from each point of B to c_B. Taking the average point-to-centroid distance as a representation of the scale of a point cloud, and taking the point cloud A as the reference, the scaling factor s1 of B relative to A is

s1 = d_A / d_B

The point cloud B is then scaled according to B1 = s1 · B, which gives the scaled point cloud B1.
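The step-2 computation is simple enough to state directly in code. Below is a minimal numpy sketch; it is not part of the patent text, and the array shapes and helper names are illustrative assumptions:

```python
import numpy as np

def centroid_and_spread(P):
    """Center of gravity of an (N, 3) point array and the mean
    Euclidean distance of its points to that center."""
    c = P.mean(axis=0)                        # c = (1/N) * sum(a_i)
    d = np.linalg.norm(P - c, axis=1).mean()  # d = (1/N) * sum(||a_i - c||)
    return c, d

def initial_scale(A, B):
    """Step 2: scaling factor s1 = d_A / d_B of B relative to A,
    and the cloud B scaled by s1."""
    _, d_A = centroid_and_spread(A)
    _, d_B = centroid_and_spread(B)
    s1 = d_A / d_B
    return s1, s1 * B                         # B1 = s1 * B

# usage: A, B are (N, 3) and (M, 3) float arrays from two devices
A = np.random.rand(1000, 3)
B = 2.5 * np.random.rand(800, 3)              # B at a different scale
s1, B1 = initial_scale(A, B)
```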
Step 3: perform coarse registration on the point clouds A and B1 with the RANSAC algorithm (see Martin A. Fischler & Robert C. Bolles (June 1981), "Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography", Comm. ACM 24(6): 381-395), obtaining the corresponding feature point sequences {p_i} and {q_i}. By the computations of step 2, obtain the centers of gravity of the feature point sequences and the average distances d_P and d_Q from the feature points to those centers of gravity, then compute the scaling factor s2 = d_P / d_Q of B1 relative to A, and scale B1 by s2 to obtain the point cloud B2.
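One concrete way to realize the step-3 coarse registration is with the open-source Open3D library, whose feature-matching RANSAC also returns the correspondence set needed for the scale estimate. The patent does not prescribe a library; the voxel size, feature radii, and RANSAC parameters below are illustrative assumptions (Open3D 0.15+ API assumed):

```python
import numpy as np
import open3d as o3d

def coarse_register(A, B1, voxel=0.05):
    """RANSAC coarse registration on FPFH features; returns the matched
    feature point pairs (p_i, q_i) used for the step-3 scale estimate."""
    def prep(pts):
        pcd = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(pts))
        down = pcd.voxel_down_sample(voxel)
        down.estimate_normals(
            o3d.geometry.KDTreeSearchParamHybrid(radius=2 * voxel, max_nn=30))
        fpfh = o3d.pipelines.registration.compute_fpfh_feature(
            down, o3d.geometry.KDTreeSearchParamHybrid(radius=5 * voxel, max_nn=100))
        return down, fpfh

    src, src_f = prep(B1)
    tgt, tgt_f = prep(A)
    result = o3d.pipelines.registration.registration_ransac_based_on_feature_matching(
        src, tgt, src_f, tgt_f, True, 1.5 * voxel,
        o3d.pipelines.registration.TransformationEstimationPointToPoint(False),
        3,
        [o3d.pipelines.registration.CorrespondenceCheckerBasedOnDistance(1.5 * voxel)],
        o3d.pipelines.registration.RANSACConvergenceCriteria(100000, 0.999))
    corr = np.asarray(result.correspondence_set)  # index pairs (src, tgt)
    q = np.asarray(src.points)[corr[:, 0]]        # feature points in B1
    p = np.asarray(tgt.points)[corr[:, 1]]        # feature points in A
    return p, q

# step-3 scale: reuse the step-2 formulas on the matched feature sequences
# p, q = coarse_register(A, B1)
# _, d_P = centroid_and_spread(p); _, d_Q = centroid_and_spread(q)
# s2 = d_P / d_Q; B2 = s2 * B1
```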
Step 4: perform fine registration on A and B2 with the ICP algorithm (see Besl P. J., McKay N. D., "Method for registration of 3-D shapes", Sensor Fusion IV: Control Paradigms and Data Structures, International Society for Optics and Photonics, 1992, 1611: 586-606), compute the scaling factor s3 once more by the procedures of steps 2 and 3, and scale B2 by s3 to obtain the point cloud B3.
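Step 4 can be sketched with Open3D's point-to-point ICP in the same spirit (the correspondence threshold is an assumption); the scale s3 is then re-estimated from the ICP correspondences exactly as in steps 2 and 3, using centroid_and_spread from the step-2 sketch:

```python
import numpy as np
import open3d as o3d

def icp_refine(A, B2, threshold=0.1):
    """Point-to-point ICP between B2 (source) and A (target); returns the
    matched point pairs and the registration RMSE."""
    src = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(B2))
    tgt = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(A))
    result = o3d.pipelines.registration.registration_icp(
        src, tgt, threshold, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    corr = np.asarray(result.correspondence_set)
    q = B2[corr[:, 0]]                 # matched points in B2
    p = A[corr[:, 1]]                  # matched points in A
    return p, q, result.inlier_rmse

# s3 from the fine-registration correspondences, as in step 2:
# p, q, _ = icp_refine(A, B2)
# _, d_P = centroid_and_spread(p); _, d_Q = centroid_and_spread(q)
# s3 = d_P / d_Q; B3 = s3 * B2
```

As an aside, Open3D's TransformationEstimationPointToPoint also accepts with_scaling=True, which estimates a similarity transform directly; the method described here instead derives the scale from the average point-to-centroid distances of the matched sequences.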
Step 5: around the scaling factor s3, set a scaling range Δ and a step size t to obtain the scaling coefficient sequence {s3(k)}, wherein the maximum step index K = Δ/t; scale the point cloud B3 with each coefficient of the sequence to obtain the corresponding three-dimensional point cloud data {C_k}.
Step 6: through ICP fine registration, compute the mean square error e_k between each point cloud C_k and the point cloud A, and select the final scaling factor by

s4 = argmin_k e_k

that is, the scaling factor whose error e_k is minimal; scale B3 by s4 to obtain the final three-dimensional point cloud data B4.
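Steps 5 and 6 together form a one-dimensional search over scale, scored by fine-registration residuals. The sketch below reuses icp_refine from the step-4 sketch and assumes the candidate coefficients are multiplicative perturbations 1 + k·t applied to the already-scaled cloud B3; the patent text leaves the exact parameterization open:

```python
import numpy as np

def scale_sweep(A, B3, delta=0.05, step=0.01):
    """Steps 5-6: try scale factors around 1.0 on B3, score each by the
    ICP inlier RMSE against A, and keep the best one."""
    K = int(round(delta / step))              # maximum step index K = delta / t
    candidates = [1.0 + k * step for k in range(-K, K + 1)]
    errors = []
    for s in candidates:
        _, _, rmse = icp_refine(A, s * B3)    # C_k = s * B3, e_k = error vs A
        errors.append(rmse)
    s4 = candidates[int(np.argmin(errors))]   # s4 = argmin_k e_k
    return s4, s4 * B3                        # B4 = s4 * B3
```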
Step 7: complete the point cloud fusion of A and B4 by fine registration according to the ICP algorithm.
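Finally, step 7 reduces to one more ICP alignment followed by a merge of the two clouds. A minimal sketch (the fixed threshold is again an assumption):

```python
import numpy as np
import open3d as o3d

def fuse(A, B4, threshold=0.1):
    """Step 7: ICP-align B4 to A, then merge the two clouds."""
    src = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(B4))
    tgt = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(A))
    result = o3d.pipelines.registration.registration_icp(
        src, tgt, threshold, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    src.transform(result.transformation)       # move B4 into A's frame
    return np.vstack([A, np.asarray(src.points)])

# full pipeline: s1/B1 (step 2), s2/B2 (step 3), s3/B3 (step 4),
# s4/B4 (steps 5-6), then fuse(A, B4) for the final merged cloud.
```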
The above describes the idea and method of the invention for automatically fusing three-dimensional point clouds acquired by multiple terminal devices; there are many specific methods and ways of implementing this technical scheme. All components not specified in the present embodiment can be realized with the prior art.

Claims (10)

1. An automatic fusion method for acquiring three-dimensional point clouds by multi-terminal equipment, characterized by comprising the following steps:
step 1, acquiring three-dimensional point cloud data A and three-dimensional point cloud data B from two different sources;
step 2, calculating the center of gravity c_A of the three-dimensional point cloud data A and the center of gravity c_B of the three-dimensional point cloud data B, and calculating the average distance d_A from each point a_i in A to c_A and the average distance d_B from each point in B to c_B; taking A as the reference, calculating the scaling factor s1 of B relative to A, and scaling B to obtain the scaled three-dimensional point cloud data B1;
step 3, performing coarse registration on A and B1 to obtain the corresponding feature point sequences {p_i} and {q_i}, wherein the subscript i denotes the i-th feature point; calculating the centers of gravity of the feature point sequences and the average distances d_P and d_Q from the feature points to those centers of gravity, obtaining the scaling factor s2 of B1 relative to A, and scaling B1 by s2 to obtain the point cloud B2;
step 4, performing fine registration on A and B2, calculating the scaling factor s3, and scaling B2 by s3 to obtain the point cloud B3;
step 5, around the scaling factor s3, setting a scaling range Δ and a step size t to obtain the corresponding scaling coefficient sequence {s3(k)}, wherein the maximum step index K = Δ/t, and scaling B3 with each coefficient to obtain the corresponding point clouds {C_k};
step 6, calculating through fine registration the mean square error e_k between each point cloud C_k and the point cloud A, selecting the scaling factor s4 corresponding to the minimum of the errors e_k as the final scaling factor, and scaling B3 by s4 to obtain the final three-dimensional point cloud data B4;
step 7, completing the fusion of the three-dimensional point cloud data A and the three-dimensional point cloud data B4 through fine registration.
2. The automatic fusion method for acquiring three-dimensional point clouds by multi-terminal equipment according to claim 1, characterized in that in step 2 the center of gravity c_A of the three-dimensional point cloud data A is calculated as:

c_A = (1/N) Σ_{i=1..N} a_i

wherein a_i is any point in A and N is the number of points in A; the center of gravity c_B of the three-dimensional point cloud data B is calculated in the same way.

3. The automatic fusion method for acquiring three-dimensional point clouds by multi-terminal equipment according to claim 2, characterized in that in step 2 the average distance d_A from each point a_i in A to the center of gravity c_A is calculated as:

d_A = (1/N) Σ_{i=1..N} ‖a_i − c_A‖

and the average distance d_B from each point of B to the center of gravity c_B is calculated in the same way.

4. The automatic fusion method for acquiring three-dimensional point clouds by multi-terminal equipment according to claim 3, characterized in that in step 2 the average distance from each point of a point cloud to its center of gravity is taken as a representation of the scale of the point cloud, and, taking the point cloud A as the reference, the scaling factor s1 of B relative to A is calculated as:

s1 = d_A / d_B

5. The automatic fusion method for acquiring three-dimensional point clouds by multi-terminal equipment according to claim 4, characterized in that in step 2 the point cloud B is scaled according to the scaling factor s1 to obtain the scaled three-dimensional point cloud data:

B1 = s1 · B

6. The automatic fusion method for acquiring three-dimensional point clouds by multi-terminal equipment according to claim 5, characterized in that step 3 comprises: performing coarse registration on the three-dimensional point cloud data A and B1 based on the RANSAC algorithm, and obtaining the corresponding feature point sequences {p_i} and {q_i}; further calculating the centers of gravity of the feature point sequences and the average distances d_P and d_Q from the feature points to those centers of gravity; and calculating the scaling factor s2 of B1 relative to A, then scaling B1 by s2 to obtain the point cloud B2.

7. The automatic fusion method for acquiring three-dimensional point clouds by multi-terminal equipment according to claim 6, characterized in that in step 4 the fine registration of the three-dimensional point cloud data A and B2 is performed by an ICP algorithm.

8. The automatic fusion method for acquiring three-dimensional point clouds by multi-terminal equipment according to claim 7, characterized in that step 5 comprises: around the scaling factor s3 obtained in step 4, setting a scaling range Δ and a step size t to obtain the scaling coefficient sequence {s3(k)}, wherein the maximum step index K = Δ/t; and scaling B3 with each coefficient of the sequence to obtain the corresponding three-dimensional point cloud data {C_k}.

9. The automatic fusion method for acquiring three-dimensional point clouds by multi-terminal equipment according to claim 8, characterized in that in step 6 the mean square error e_k between each point cloud C_k and the point cloud A is calculated through ICP fine registration, and the final scaling factor is selected by:

s4 = argmin_k e_k

the point cloud B3 being scaled by s4 to obtain the final three-dimensional point cloud data B4.

10. The automatic fusion method for acquiring three-dimensional point clouds by multi-terminal equipment according to claim 9, characterized in that step 7 comprises completing the fusion of the three-dimensional point cloud data A and the three-dimensional point cloud data B4 by fine registration according to the ICP algorithm.
CN202111390247.5A 2021-11-23 2021-11-23 Automatic fusion method for acquiring three-dimensional point cloud by multi-terminal equipment Active CN113837981B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111390247.5A CN113837981B (en) 2021-11-23 2021-11-23 Automatic fusion method for acquiring three-dimensional point cloud by multi-terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111390247.5A CN113837981B (en) 2021-11-23 2021-11-23 Automatic fusion method for acquiring three-dimensional point cloud by multi-terminal equipment

Publications (2)

Publication Number Publication Date
CN113837981A (en) 2021-12-24
CN113837981B (en) 2022-03-08

Family

ID=78971618

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111390247.5A Active CN113837981B (en) 2021-11-23 2021-11-23 Automatic fusion method for acquiring three-dimensional point cloud by multi-terminal equipment

Country Status (1)

Country Link
CN (1) CN113837981B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106485690A (en) * 2015-08-25 2017-03-08 南京理工大学 Cloud data based on a feature and the autoregistration fusion method of optical image
JP2020035448A (en) * 2018-08-30 2020-03-05 バイドゥ オンライン ネットワーク テクノロジー (ベイジン) カンパニー リミテッド Method, apparatus, device, storage medium for generating three-dimensional scene map
CN112102458A (en) * 2020-08-31 2020-12-18 湖南盛鼎科技发展有限责任公司 Single-lens three-dimensional image reconstruction method based on laser radar point cloud data assistance
CN113223145A (en) * 2021-04-19 2021-08-06 中国科学院国家空间科学中心 Sub-pixel measurement multi-source data fusion method and system for planetary surface detection

Also Published As

Publication number Publication date
CN113837981A (en) 2021-12-24

Similar Documents

Publication Publication Date Title
CN110567469B (en) Visual positioning method and device, electronic equipment and system
CN107977997B (en) Camera self-calibration method combined with laser radar three-dimensional point cloud data
CN111383279B (en) External parameter calibration method and device and electronic equipment
JP6883608B2 (en) Depth data processing system that can optimize depth data by aligning images with respect to depth maps
CN107588721A (en) The measuring method and system of a kind of more sizes of part based on binocular vision
CN103810685A (en) Super resolution processing method for depth image
CN110361005B (en) Positioning method, positioning device, readable storage medium and electronic equipment
CN112097732A (en) Binocular camera-based three-dimensional distance measurement method, system, equipment and readable storage medium
CN112017248B (en) 2D laser radar camera multi-frame single-step calibration method based on dotted line characteristics
CN110675436A (en) Laser radar and stereoscopic vision registration method based on 3D feature points
CN116309813A (en) Solid-state laser radar-camera tight coupling pose estimation method
CN111429571B (en) Rapid stereo matching method based on spatio-temporal image information joint correlation
CN114782636A (en) Three-dimensional reconstruction method, device and system
Ann et al. Study on 3D scene reconstruction in robot navigation using stereo vision
CN113393524A (en) Target pose estimation method combining deep learning and contour point cloud reconstruction
JP2005322128A (en) Calibration method for stereo three-dimensional measurement and three-dimensional position calculating method
CN104236468A (en) Method and system for calculating coordinates of target space and mobile robot
CN114543787A (en) Millimeter-scale indoor map positioning method based on fringe projection profilometry
WO2019012004A1 (en) Method for determining a spatial uncertainty in images of an environmental area of a motor vehicle, driver assistance system as well as motor vehicle
CN113837981B (en) Automatic fusion method for acquiring three-dimensional point cloud by multi-terminal equipment
KR101634283B1 (en) The apparatus and method of 3d modeling by 3d camera calibration
CN116804537A (en) Binocular range finding system and method
CN110068308B (en) Distance measurement method and distance measurement system based on multi-view camera
CN106548482A (en) It is a kind of based on sparse matching and the dense matching method and system of image border
CN113792645A (en) AI eyeball fusing image and laser radar

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 210000 No.1, Lingshan South Road, Qixia District, Nanjing City, Jiangsu Province

Applicant after: THE 28TH RESEARCH INSTITUTE OF CHINA ELECTRONICS TECHNOLOGY Group Corp.

Address before: 210007 1 East Street, alfalfa garden, Qinhuai District, Nanjing, Jiangsu.

Applicant before: THE 28TH RESEARCH INSTITUTE OF CHINA ELECTRONICS TECHNOLOGY Group Corp.

GR01 Patent grant