CN112084875B - Method for unifying multiple lidar coordinate systems - Google Patents

Method for unifying multiple lidar coordinate systems

Info

Publication number
CN112084875B
CN112084875B (application CN202010802368.5A)
Authority
CN
China
Prior art keywords
point cloud
laser radar
straight line
coordinate system
points
Prior art date
Legal status
Active
Application number
CN202010802368.5A
Other languages
Chinese (zh)
Other versions
CN112084875A (en)
Inventor
袁诚
赖际舟
雍成优
吕品
付相可
季博文
方玮
Current Assignee
Nanjing University of Aeronautics and Astronautics
Original Assignee
Nanjing University of Aeronautics and Astronautics
Priority date
Filing date
Publication date
Application filed by Nanjing University of Aeronautics and Astronautics
Priority to CN202010802368.5A
Publication of CN112084875A
Application granted
Publication of CN112084875B
Legal status: Active
Anticipated expiration


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/10: Terrestrial scenes
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/87: Combinations of systems using electromagnetic waves other than radio waves
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 17/00: Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F 17/10: Complex mathematical operations
    • G06F 17/11: Complex mathematical operations for solving equations, e.g. nonlinear equations, general mathematical optimization problems
    • G06F 17/12: Simultaneous equations, e.g. systems of linear equations
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 17/00: Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F 17/10: Complex mathematical operations
    • G06F 17/15: Correlation function computation including computation of convolution operations
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 17/00: Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F 17/10: Complex mathematical operations
    • G06F 17/16: Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Theoretical Computer Science (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Pure & Applied Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Algebra (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • Electromagnetism (AREA)
  • Computing Systems (AREA)
  • Operations Research (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention discloses a method for unifying multiple lidar coordinate systems, which comprises the following steps. Step 1: periodically collect the point cloud data of several two-dimensional lidars at time k. Step 2: screen out the point cloud data shared by adjacent lidars according to their overlap angle. Step 3: group the overlapping point cloud data of the adjacent lidars according to spatial geometric structure and screen out the straight-line features. Step 4: fit straight lines to the groups obtained in step 3 and solve their line equations in the respective lidar coordinate systems. Step 5: solve the coordinate transformation matrix from the different line equations of the same straight-line feature in the different lidar coordinate systems. Step 6: repeat steps 2 to 5 until the transformation matrices between all adjacent lidar coordinate systems are obtained. The beneficial effect of the invention is that it solves the coordinate transformation matrices among multiple lidars and thereby unifies their coordinate systems.

Description

Method for unifying multiple lidar coordinate systems
Technical Field
The invention relates to the technical field of autonomous robot navigation, and in particular to a method for unifying multiple lidar coordinate systems.
Background
A two-dimensional lidar is an active detection sensor that does not depend on external illumination and provides high-precision range measurements, which makes it a navigation sensor commonly used on mobile robots. However, because of the limits of its field of view, mounting position and scanning angle, a single lidar sometimes cannot meet the requirements of robot navigation and positioning, so multiple lidars are mounted on the mobile robot.
When several lidars are used on the same mobile robot, their coordinate systems differ because of their different mounting positions, and pose estimation with simultaneous localization and mapping (SLAM) techniques therefore yields different pose information for each of them. In practical use, the coordinate systems of the multiple lidars consequently need to be unified.
At present, unifying the coordinate systems of several two-dimensional lidars is mostly done with a high-precision external reference, such as a high-precision lidar calibration target, a motion capture system or a total station. Such approaches are costly, troublesome to set up and relatively complicated to carry out, and the calibration result is strongly affected by the calibration equipment and the skill of the operator.
Disclosure of Invention
The technical problem to be solved by the invention is as follows: to provide a method for unifying the coordinate systems of several two-dimensional lidars that lie in the same horizontal plane on the same mobile robot.
To solve this technical problem, the invention adopts the following technical solution:
A method for unifying multiple lidar coordinate systems, comprising the following steps:
Step 1: collect the point cloud data of several two-dimensional lidars at time k;
Step 2: according to the overlap angle of adjacent lidars, screen out the overlapping point cloud data Ŝ_m(k) and Ŝ_{m+1}(k);
Step 3: after converting the overlapping point cloud data Ŝ_m(k) and Ŝ_{m+1}(k) from the original polar coordinate system to the rectangular coordinate system, detect the tear points P_i^m and P_i^{m+1} by the nearest-neighbour rule and divide the overlapping point clouds of the adjacent lidars into clusters C_q^m and C_q^{m+1}; compute the curvature c_i of every point in each cluster, extract the corner points to further divide the clusters into point cloud groups G_j^m and G_j^{m+1}, and then screen out the straight-line features according to the curvature information;
Step 4: fit straight lines to the groups obtained in step 3 by least squares, solve the straight-line equations in the respective lidar coordinate systems, and match the straight-line features established in the different lidar coordinate systems to find the same straight line L_{m+1}, L_m;
Step 5: from the different straight-line equations of the same straight-line feature in the different lidar coordinate systems, solve the coordinate transformation matrix to obtain the rotation angle θ' and the translations dx and dy;
Step 6: repeat steps 2 to 5 until the coordinate transformation matrices between all adjacent lidar coordinate systems are obtained, completing the calibration of the multiple lidars.
Further, the specific process of step 2 is as follows:
Let S_m(k) be the point cloud data of the m-th lidar at time k and S_{m+1}(k) the point cloud data of the (m+1)-th lidar at time k; the m-th and (m+1)-th lidars are two adjacent lidars whose approximate overlap angle is α_{m,m+1}. According to this overlap angle, the overlapping point cloud regions Ŝ_m(k) and Ŝ_{m+1}(k) are selected, where Ŝ_m(k) is the portion of the m-th lidar's scan that overlaps the (m+1)-th lidar, expressed in the m-th lidar coordinate system, and Ŝ_{m+1}(k) is the portion of the (m+1)-th lidar's scan that overlaps the m-th lidar, expressed in the (m+1)-th lidar coordinate system.
Further, the specific process of step 3 is as follows:
Step 3.1: convert the point cloud information in Ŝ_m(k) and Ŝ_{m+1}(k) from the polar coordinate system to the rectangular coordinate system.
Let P_i^m be the i-th valid point of Ŝ_m(k), i = 1, 2, 3, …, N_m, where N_m is the number of lidar points in Ŝ_m(k), and let (ρ_i^m, φ_i^m) be its coordinates in the polar coordinate system. Its rectangular coordinates (x_i^m, y_i^m) in the m-th lidar coordinate system are computed as
x_i^m = ρ_i^m · cos(φ_i^m),  y_i^m = ρ_i^m · sin(φ_i^m).
Let P_i^{m+1} be the i-th valid point of Ŝ_{m+1}(k), i = 1, 2, 3, …, N_{m+1}, where N_{m+1} is the number of lidar points in Ŝ_{m+1}(k), and let (ρ_i^{m+1}, φ_i^{m+1}) be its coordinates in the polar coordinate system. Its rectangular coordinates (x_i^{m+1}, y_i^{m+1}) in the (m+1)-th lidar coordinate system are computed as
x_i^{m+1} = ρ_i^{m+1} · cos(φ_i^{m+1}),  y_i^{m+1} = ρ_i^{m+1} · sin(φ_i^{m+1}).
Step 3.2: compute the distance between adjacent valid laser points in the overlapping point clouds Ŝ_m(k) and Ŝ_{m+1}(k) and segment them by the nearest-neighbour rule.
For the overlapping point cloud Ŝ_m(k), the distance between adjacent laser points is
d_i^m = sqrt( (x_{i+1}^m - x_i^m)^2 + (y_{i+1}^m - y_i^m)^2 ),
where P_i^m and P_{i+1}^m are the i-th and (i+1)-th valid points of the cloud, d_i^m is the distance between the two adjacent points, and (x_i^m, y_i^m) and (x_{i+1}^m, y_{i+1}^m) are their coordinates in the m-th lidar coordinate system.
If d_i^m exceeds the distance threshold, P_i^m and P_{i+1}^m are regarded as tear points. The point cloud is divided at the tear points, and each part of the divided point cloud is denoted C_1^m, C_2^m, …, C_{Q_m}^m, where Q_m is the number of point cloud clusters.
For the overlapping point cloud Ŝ_{m+1}(k), the distance between adjacent laser points is
d_i^{m+1} = sqrt( (x_{i+1}^{m+1} - x_i^{m+1})^2 + (y_{i+1}^{m+1} - y_i^{m+1})^2 ),
where P_i^{m+1} and P_{i+1}^{m+1} are the i-th and (i+1)-th valid points of the cloud, d_i^{m+1} is the distance between the two adjacent points, and (x_i^{m+1}, y_i^{m+1}) and (x_{i+1}^{m+1}, y_{i+1}^{m+1}) are their coordinates in the (m+1)-th lidar coordinate system.
If d_i^{m+1} exceeds the distance threshold, P_i^{m+1} and P_{i+1}^{m+1} are regarded as tear points. The point cloud is divided at the tear points, and each part of the divided point cloud is denoted C_1^{m+1}, C_2^{m+1}, …, C_{Q_{m+1}}^{m+1}, where Q_{m+1} is the number of point cloud clusters.
Step 3.3: extract the straight-line features from the segmented clusters C_q^m of Ŝ_m(k) and C_q^{m+1} of Ŝ_{m+1}(k).
For each partial point cloud C_q^m, corner points are extracted by curvature: the curvature c_i^m of a valid point P_i^m is computed over a window of 2n+1 consecutive points centred on P_i^m, where 2n+1 is a preset window size. If c_i^m exceeds the curvature threshold, P_i^m is a corner point.
The point cloud is split at the corner points, and the resulting point cloud groups are denoted G_1^m, G_2^m, …, G_{M_m}^m, where M_m is the number of point cloud groups.
For each partial point cloud C_q^{m+1}, corner points are extracted in the same way: the curvature c_i^{m+1} of a valid point P_i^{m+1} is computed over a window of 2n+1 consecutive points centred on P_i^{m+1}. If c_i^{m+1} exceeds the curvature threshold, P_i^{m+1} is a corner point.
The point cloud is split at the corner points, and the resulting point cloud groups are denoted G_1^{m+1}, G_2^{m+1}, …, G_{M_{m+1}}^{m+1}, where M_{m+1} is the number of point cloud groups.
Further, the specific process of step 4 is as follows:
First, fit a straight line to each of the M_m point cloud groups G_j^m of Ŝ_m(k) by least squares, the objective function being the minimisation of Σ_i (k_j^m · x_{j,i}^m + b_j^m - y_{j,i}^m)^2.
Denote by P_{j,i}^m = (x_{j,i}^m, y_{j,i}^m) the i-th laser point of the point cloud group G_j^m obtained after segmentation, i = 1, 2, …, n_j^m, where n_j^m is the number of laser points in G_j^m and the coordinates are expressed in the m-th lidar coordinate system.
The equation parameters (k_j^m, b_j^m) of the extracted straight-line feature are computed as
k_j^m = ( n_j^m · Σ_i x_{j,i}^m y_{j,i}^m - Σ_i x_{j,i}^m · Σ_i y_{j,i}^m ) / ( n_j^m · Σ_i (x_{j,i}^m)^2 - (Σ_i x_{j,i}^m)^2 ),
b_j^m = ( Σ_i y_{j,i}^m - k_j^m · Σ_i x_{j,i}^m ) / n_j^m.
Likewise, fit a straight line to each of the M_{m+1} point cloud groups G_j^{m+1} of Ŝ_{m+1}(k) by least squares, with objective function Σ_i (k_j^{m+1} · x_{j,i}^{m+1} + b_j^{m+1} - y_{j,i}^{m+1})^2, where P_{j,i}^{m+1} = (x_{j,i}^{m+1}, y_{j,i}^{m+1}) is the i-th laser point of group G_j^{m+1}, i = 1, 2, …, n_j^{m+1}, n_j^{m+1} is the number of laser points in G_j^{m+1}, and the coordinates are expressed in the (m+1)-th lidar coordinate system. The equation parameters (k_j^{m+1}, b_j^{m+1}) of the extracted straight-line feature are computed with the same formulas.
The straight-line features established in the different lidar coordinate systems are then matched to find the same physical straight line, denoted L_m in the m-th lidar coordinate system and L_{m+1} in the (m+1)-th lidar coordinate system.
Further, the specific process of step 5 is as follows:
From step 4, the same straight line satisfies, in the two adjacent lidar coordinate systems,
y_1 = k^m · x_1 + b^m  (in the m-th lidar coordinate system),
y_2 = k^{m+1} · x_2 + b^{m+1}  (in the (m+1)-th lidar coordinate system),
where (k^m, b^m) are the straight-line parameters obtained from the point cloud group of Ŝ_m(k) and (k^{m+1}, b^{m+1}) are the straight-line parameters obtained from the point cloud group of Ŝ_{m+1}(k).
Let the coordinate conversion between the two adjacent lidar coordinate systems be
x_1 = x_2 · cos θ - y_2 · sin θ + dx,
y_1 = x_2 · sin θ + y_2 · cos θ + dy,
where (x_1, y_1) and (x_2, y_2) are the coordinates of the same laser point in the Ŝ_m(k) and Ŝ_{m+1}(k) coordinate systems respectively, θ is the angle between the two coordinate systems, and dx and dy are the translations along the X and Y axes. Then
θ' = arctan(k^m) - arctan(k^{m+1}),
where θ' is either θ or θ ± π. Let P_s^m and P_e^m be the start and end points of L_m, and P_s^{m+1} and P_e^{m+1} be the start and end points of L_{m+1}; the direction vector P_e^m - P_s^m is compared with the direction vector P_e^{m+1} - P_s^{m+1} rotated by θ', and θ is taken as θ' if the two directions agree and as θ' ± π otherwise.
dx and dy are then solved by combining the equation sets of several corresponding straight lines. The equations of two straight-line features L1 and L2 in the adjacent lidar coordinate systems are known to be
L1: y_1 = k_1^m · x_1 + b_1^m and y_2 = k_1^{m+1} · x_2 + b_1^{m+1},
L2: y_1 = k_2^m · x_1 + b_2^m and y_2 = k_2^{m+1} · x_2 + b_2^{m+1},
where (k_1^m, b_1^m) and (k_1^{m+1}, b_1^{m+1}) are the parameters of the straight line L1 in the point cloud groups of Ŝ_m(k) and Ŝ_{m+1}(k), and (k_2^m, b_2^m) and (k_2^{m+1}, b_2^{m+1}) are the parameters of the straight line L2 in the point cloud groups of Ŝ_m(k) and Ŝ_{m+1}(k).
Substituting the coordinate conversion into the straight-line equations and equating the constant terms gives one linear equation in dx and dy for each straight line:
k_1^m · dx - dy = (cos θ + k_1^m · sin θ) · b_1^{m+1} - b_1^m,
k_2^m · dx - dy = (cos θ + k_2^m · sin θ) · b_2^{m+1} - b_2^m.
Solving these two equations simultaneously (the two straight lines having different slopes) yields dx and dy.
The invention has the following beneficial effects: it solves the coordinate transformation matrices among multiple lidars and unifies their coordinate systems, which enriches the point cloud information perceived by the lidars, raises the success rate of point cloud matching in various environments, extends the usable scenarios of conventional lidars, and thereby broadens the application range of the carrier and improves its navigation accuracy.
Drawings
FIG. 1 is a flow chart of the method of the invention for unifying multiple lidar coordinate systems.
FIG. 2 shows the raw point clouds in the individual lidar coordinate systems before unification.
FIG. 3 shows the point clouds after the lidar coordinate systems have been unified.
Detailed Description
Embodiments of the present invention are described in detail below, and examples of these embodiments are illustrated in the accompanying drawings. The embodiments described below with reference to the drawings are exemplary, are intended only to explain the present invention, and are not to be construed as limiting it.
To solve the above technical problem, the invention adopts the following technical solution:
A method for unifying multiple lidar coordinate systems, whose flow chart is shown in FIG. 1, comprising the following steps:
Step 1: collect the point cloud data of several two-dimensional lidars at time k, as shown in FIG. 2;
Step 2: according to the overlap angle of adjacent lidars, screen out the overlapping point cloud data; the specific method is as follows:
Let S_m(k) be the point cloud data of the m-th lidar at time k and S_{m+1}(k) the point cloud data of the (m+1)-th lidar at time k; the m-th and (m+1)-th lidars are two adjacent lidars whose approximate overlap angle is α_{m,m+1}. According to this overlap angle, the overlapping point cloud regions Ŝ_m(k) and Ŝ_{m+1}(k) are selected, where Ŝ_m(k) is the portion of the m-th lidar's scan that overlaps the (m+1)-th lidar, expressed in the m-th lidar coordinate system, and Ŝ_{m+1}(k) is the portion of the (m+1)-th lidar's scan that overlaps the m-th lidar, expressed in the (m+1)-th lidar coordinate system.
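By way of illustration, the overlap screening of step 2 can be sketched in Python as follows; the function name, the parallel-array scan representation (ranges and bearing angles), and the way the overlap sector is passed in are assumptions made for this sketch and are not prescribed by the method itself.

```python
import numpy as np

def select_overlap(ranges, angles, sector_start, sector_end):
    """Return the polar points of one 2-D lidar scan whose bearing lies
    inside the approximate overlap sector [sector_start, sector_end] (rad).

    ranges, angles : 1-D arrays of equal length (one scan at time k).
    Invalid returns (non-finite or non-positive range) are discarded.
    """
    angles = np.mod(angles, 2.0 * np.pi)
    lo = np.mod(sector_start, 2.0 * np.pi)
    hi = np.mod(sector_end, 2.0 * np.pi)
    if lo <= hi:
        in_sector = (angles >= lo) & (angles <= hi)
    else:                                   # sector wraps around 2*pi
        in_sector = (angles >= lo) | (angles <= hi)
    valid = np.isfinite(ranges) & (ranges > 0.0)
    keep = in_sector & valid
    return ranges[keep], angles[keep]
```

The overlap sector of each lidar pair would be derived from the approximate mounting layout, i.e. from the overlap angle α_{m,m+1}.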
Step 3: segment the overlapping point cloud data of the adjacent lidars by the nearest-neighbour rule and screen out the straight-line features according to curvature information, as follows:
Step 3.1: convert the point cloud information in Ŝ_m(k) and Ŝ_{m+1}(k) from the polar coordinate system to the rectangular coordinate system.
Let P_i^m be the i-th valid point of Ŝ_m(k), i = 1, 2, 3, …, N_m, where N_m is the number of lidar points in Ŝ_m(k), and let (ρ_i^m, φ_i^m) be its coordinates in the polar coordinate system. Its rectangular coordinates (x_i^m, y_i^m) in the m-th lidar coordinate system are computed as
x_i^m = ρ_i^m · cos(φ_i^m),  y_i^m = ρ_i^m · sin(φ_i^m).
Let P_i^{m+1} be the i-th valid point of Ŝ_{m+1}(k), i = 1, 2, 3, …, N_{m+1}, where N_{m+1} is the number of lidar points in Ŝ_{m+1}(k), and let (ρ_i^{m+1}, φ_i^{m+1}) be its coordinates in the polar coordinate system. Its rectangular coordinates (x_i^{m+1}, y_i^{m+1}) in the (m+1)-th lidar coordinate system are computed as
x_i^{m+1} = ρ_i^{m+1} · cos(φ_i^{m+1}),  y_i^{m+1} = ρ_i^{m+1} · sin(φ_i^{m+1}).
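The polar-to-rectangular conversion of step 3.1 is a direct vectorised operation; the sketch below (variable names assumed) applies it to the overlap points of one lidar.

```python
import numpy as np

def polar_to_cartesian(ranges, angles):
    """Convert overlap points (rho_i, phi_i) of one lidar into rectangular
    coordinates in that lidar's own frame:
        x_i = rho_i * cos(phi_i),  y_i = rho_i * sin(phi_i).
    Returns an (N, 2) array of (x, y) pairs."""
    x = ranges * np.cos(angles)
    y = ranges * np.sin(angles)
    return np.column_stack((x, y))
```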
Step 3.2: compute the distance between adjacent valid laser points in the overlapping point clouds Ŝ_m(k) and Ŝ_{m+1}(k) and segment them by the nearest-neighbour rule.
For the overlapping point cloud Ŝ_m(k), the distance between adjacent laser points is
d_i^m = sqrt( (x_{i+1}^m - x_i^m)^2 + (y_{i+1}^m - y_i^m)^2 ),
where P_i^m and P_{i+1}^m are the i-th and (i+1)-th valid points of the cloud, d_i^m is the distance between the two adjacent points, and (x_i^m, y_i^m) and (x_{i+1}^m, y_{i+1}^m) are their coordinates in the m-th lidar coordinate system.
If d_i^m exceeds the distance threshold, P_i^m and P_{i+1}^m are regarded as tear points. The point cloud is divided at the tear points, and each part of the divided point cloud is denoted C_1^m, C_2^m, …, C_{Q_m}^m, where Q_m is the number of point cloud clusters.
For the overlapping point cloud Ŝ_{m+1}(k), the distance between adjacent laser points is
d_i^{m+1} = sqrt( (x_{i+1}^{m+1} - x_i^{m+1})^2 + (y_{i+1}^{m+1} - y_i^{m+1})^2 ),
where P_i^{m+1} and P_{i+1}^{m+1} are the i-th and (i+1)-th valid points of the cloud, d_i^{m+1} is the distance between the two adjacent points, and (x_i^{m+1}, y_i^{m+1}) and (x_{i+1}^{m+1}, y_{i+1}^{m+1}) are their coordinates in the (m+1)-th lidar coordinate system.
If d_i^{m+1} exceeds the distance threshold, P_i^{m+1} and P_{i+1}^{m+1} are regarded as tear points. The point cloud is divided at the tear points, and each part of the divided point cloud is denoted C_1^{m+1}, C_2^{m+1}, …, C_{Q_{m+1}}^{m+1}, where Q_{m+1} is the number of point cloud clusters.
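The tear-point segmentation of step 3.2 can be sketched as follows; the distance threshold is a tuning parameter whose value the description leaves open, and the function name is illustrative.

```python
import numpy as np

def split_at_tear_points(points, dist_threshold):
    """Split an ordered overlap point cloud (shape (N, 2)) into clusters.

    The distance d_i between consecutive points P_i and P_{i+1} is computed;
    wherever d_i exceeds dist_threshold the two points are treated as tear
    points and the cloud is cut between them.  Returns a list of clusters.
    """
    if len(points) == 0:
        return []
    diffs = np.diff(points, axis=0)                  # P_{i+1} - P_i
    dists = np.hypot(diffs[:, 0], diffs[:, 1])       # d_i
    cut_after = np.where(dists > dist_threshold)[0]  # gaps exceeding the threshold
    return np.split(points, cut_after + 1)
```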
Step 3.3: extract the straight-line features from the segmented clusters C_q^m of Ŝ_m(k) and C_q^{m+1} of Ŝ_{m+1}(k).
For each partial point cloud C_q^m, corner points are extracted by curvature: the curvature c_i^m of a valid point P_i^m is computed over a window of 2n+1 consecutive points centred on P_i^m, where 2n+1 is a preset window size. If c_i^m exceeds the curvature threshold, P_i^m is a corner point.
The point cloud is split at the corner points, and the resulting point cloud groups are denoted G_1^m, G_2^m, …, G_{M_m}^m, where M_m is the number of point cloud groups.
For each partial point cloud C_q^{m+1}, corner points are extracted in the same way: the curvature c_i^{m+1} of a valid point P_i^{m+1} is computed over a window of 2n+1 consecutive points centred on P_i^{m+1}. If c_i^{m+1} exceeds the curvature threshold, P_i^{m+1} is a corner point.
The point cloud is split at the corner points, and the resulting point cloud groups are denoted G_1^{m+1}, G_2^{m+1}, …, G_{M_{m+1}}^{m+1}, where M_{m+1} is the number of point cloud groups.
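Step 3.3 relies on a per-point curvature evaluated over a window of 2n+1 points. Since the exact curvature definition used by the method is not reproduced here, the sketch below substitutes a LOAM-style measure (the norm of the sum of difference vectors to the 2n neighbours) as an assumed stand-in, and cuts each cluster at points whose curvature exceeds the threshold.

```python
import numpy as np

def split_at_corners(cluster, n, curvature_threshold):
    """Split one cluster (shape (N, 2)) into straight-line candidate groups.

    For each interior point, a local curvature is computed over the window of
    2n+1 points centred on it; here the LOAM-style norm of the summed
    difference vectors to the 2n neighbours is used as a stand-in measure.
    Points exceeding the threshold are corner points, and the cluster is cut
    there.  Groups with fewer than 2 points are dropped.
    """
    N = len(cluster)
    if N < 2 * n + 1:
        return [cluster]
    corners = []
    for i in range(n, N - n):
        window = np.vstack((cluster[i - n:i], cluster[i + 1:i + n + 1]))
        c_i = np.linalg.norm(np.sum(window - cluster[i], axis=0))
        if c_i > curvature_threshold:
            corners.append(i)
    return [g for g in np.split(cluster, corners) if len(g) >= 2]
```

Because this stand-in measure is not normalised, the value of the threshold depends on the point spacing and window size chosen.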
Step 4: fit straight lines to the groups obtained in step 3 and solve the straight-line equations in the different lidar coordinate systems; the specific method is as follows:
First, fit a straight line to each of the M_m point cloud groups G_j^m of Ŝ_m(k) by least squares, the objective function being the minimisation of Σ_i (k_j^m · x_{j,i}^m + b_j^m - y_{j,i}^m)^2.
Denote by P_{j,i}^m = (x_{j,i}^m, y_{j,i}^m) the i-th laser point of the point cloud group G_j^m obtained after segmentation, i = 1, 2, …, n_j^m, where n_j^m is the number of laser points in G_j^m and the coordinates are expressed in the m-th lidar coordinate system.
The equation parameters (k_j^m, b_j^m) of the extracted straight-line feature are computed as
k_j^m = ( n_j^m · Σ_i x_{j,i}^m y_{j,i}^m - Σ_i x_{j,i}^m · Σ_i y_{j,i}^m ) / ( n_j^m · Σ_i (x_{j,i}^m)^2 - (Σ_i x_{j,i}^m)^2 ),
b_j^m = ( Σ_i y_{j,i}^m - k_j^m · Σ_i x_{j,i}^m ) / n_j^m.
Likewise, fit a straight line to each of the M_{m+1} point cloud groups G_j^{m+1} of Ŝ_{m+1}(k) by least squares, with objective function Σ_i (k_j^{m+1} · x_{j,i}^{m+1} + b_j^{m+1} - y_{j,i}^{m+1})^2, where P_{j,i}^{m+1} = (x_{j,i}^{m+1}, y_{j,i}^{m+1}) is the i-th laser point of group G_j^{m+1}, i = 1, 2, …, n_j^{m+1}, n_j^{m+1} is the number of laser points in G_j^{m+1}, and the coordinates are expressed in the (m+1)-th lidar coordinate system. The equation parameters (k_j^{m+1}, b_j^{m+1}) of the extracted straight-line feature are computed with the same formulas.
The straight-line features established in the different lidar coordinate systems are then matched to find the same physical straight line, denoted L_m in the m-th lidar coordinate system and L_{m+1} in the (m+1)-th lidar coordinate system.
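The least-squares fit of step 4 has the closed-form solution given above; a direct translation into Python (names assumed) is sketched below.

```python
import numpy as np

def fit_line(group):
    """Least-squares fit y = k*x + b to one point cloud group (shape (N, 2)).

    Uses the closed-form normal equations:
        k = (N*sum(x*y) - sum(x)*sum(y)) / (N*sum(x^2) - sum(x)^2)
        b = (sum(y) - k*sum(x)) / N
    Returns (k, b), or None for nearly vertical segments.
    """
    x, y = group[:, 0], group[:, 1]
    N = len(x)
    denom = N * np.sum(x * x) - np.sum(x) ** 2
    if abs(denom) < 1e-9:
        return None          # vertical line; not representable as y = k*x + b
    k = (N * np.sum(x * y) - np.sum(x) * np.sum(y)) / denom
    b = (np.sum(y) - k * np.sum(x)) / N
    return k, b
```

Nearly vertical segments are rejected here because the slope-intercept form y = k·x + b cannot represent them; in practice such groups would either be discarded or fitted with a different line parameterisation.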
Step 5: solve the coordinate transformation matrix from the different straight-line equations of the same straight-line feature in the different lidar coordinate systems; the specific method is as follows:
From step 4, the same straight line satisfies, in the two adjacent lidar coordinate systems,
y_1 = k^m · x_1 + b^m  (in the m-th lidar coordinate system),
y_2 = k^{m+1} · x_2 + b^{m+1}  (in the (m+1)-th lidar coordinate system),
where (k^m, b^m) are the straight-line parameters obtained from the point cloud group of Ŝ_m(k) and (k^{m+1}, b^{m+1}) are the straight-line parameters obtained from the point cloud group of Ŝ_{m+1}(k).
Let the coordinate conversion between the two adjacent lidar coordinate systems be
x_1 = x_2 · cos θ - y_2 · sin θ + dx,
y_1 = x_2 · sin θ + y_2 · cos θ + dy,
where (x_1, y_1) and (x_2, y_2) are the coordinates of the same laser point in the Ŝ_m(k) and Ŝ_{m+1}(k) coordinate systems respectively, θ is the angle between the two coordinate systems, and dx and dy are the translations along the X and Y axes. Then
θ' = arctan(k^m) - arctan(k^{m+1}),
where θ' is either θ or θ ± π. Let P_s^m and P_e^m be the start and end points of L_m, and P_s^{m+1} and P_e^{m+1} be the start and end points of L_{m+1}; the direction vector P_e^m - P_s^m is compared with the direction vector P_e^{m+1} - P_s^{m+1} rotated by θ', and θ is taken as θ' if the two directions agree and as θ' ± π otherwise.
dx and dy are then solved by combining the equation sets of several corresponding straight lines. The equations of two straight-line features L1 and L2 in the adjacent lidar coordinate systems are known to be
L1: y_1 = k_1^m · x_1 + b_1^m and y_2 = k_1^{m+1} · x_2 + b_1^{m+1},
L2: y_1 = k_2^m · x_1 + b_2^m and y_2 = k_2^{m+1} · x_2 + b_2^{m+1},
where (k_1^m, b_1^m) and (k_1^{m+1}, b_1^{m+1}) are the parameters of the straight line L1 in the point cloud groups of Ŝ_m(k) and Ŝ_{m+1}(k), and (k_2^m, b_2^m) and (k_2^{m+1}, b_2^{m+1}) are the parameters of the straight line L2 in the point cloud groups of Ŝ_m(k) and Ŝ_{m+1}(k).
Substituting the coordinate conversion into the straight-line equations and equating the constant terms gives one linear equation in dx and dy for each straight line:
k_1^m · dx - dy = (cos θ + k_1^m · sin θ) · b_1^{m+1} - b_1^m,
k_2^m · dx - dy = (cos θ + k_2^m · sin θ) · b_2^{m+1} - b_2^m.
Solving these two equations simultaneously (the two straight lines having different slopes) yields dx and dy.
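Step 5 can be sketched as follows, under two stated assumptions: the transform convention is taken as p1 = R(θ)·p2 + [dx, dy] (mapping a point from the (m+1)-th lidar frame into the m-th), and the correspondence between the two lines in the two frames is already known. The π-ambiguity of the rotation angle, which the method resolves with the line end points, is ignored in this sketch.

```python
import numpy as np

def solve_transform(line1_m, line1_m1, line2_m, line2_m1):
    """Recover (theta, dx, dy) such that a point p2 in lidar (m+1)'s frame
    maps into lidar m's frame as p1 = R(theta) @ p2 + [dx, dy] (assumed
    convention), from two corresponding non-parallel lines given as (k, b):
        line1_m = (k1_m, b1_m), line1_m1 = (k1_m1, b1_m1), etc.
    The rotation follows from the slopes; substituting the transform into the
    line equations and matching constant terms gives, per line:
        k_m*dx - dy = (cos(t) + k_m*sin(t)) * b_m1 - b_m.
    """
    k1m, b1m = line1_m
    k1n, b1n = line1_m1
    k2m, b2m = line2_m
    k2n, b2n = line2_m1
    theta = np.arctan(k1m) - np.arctan(k1n)          # rotation angle (mod pi)
    A = np.array([[k1m, -1.0],
                  [k2m, -1.0]])
    rhs = np.array([(np.cos(theta) + k1m * np.sin(theta)) * b1n - b1m,
                    (np.cos(theta) + k2m * np.sin(theta)) * b2n - b2m])
    dx, dy = np.linalg.solve(A, rhs)                 # requires k1m != k2m
    return theta, dx, dy
```

With more than two matched line pairs, the same constant-term equations form an overdetermined system that could be solved by least squares instead of np.linalg.solve.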
Step 6: repeat steps 2 to 5 until the coordinate transformation matrices between all adjacent lidar coordinate systems are obtained; the result after calibration is shown in FIG. 3.

Claims (1)

1. A method for unifying multiple lidar coordinate systems, comprising the following steps:
Step 1: collecting the point cloud data of several two-dimensional lidars at time k;
Step 2: screening out the overlapping point cloud data Ŝ_m(k) and Ŝ_{m+1}(k) according to the overlap angle of adjacent lidars, the specific process being as follows:
S_m(k) is the point cloud data of the m-th lidar at time k and S_{m+1}(k) is the point cloud data of the (m+1)-th lidar at time k; the m-th and (m+1)-th lidars are two adjacent lidars whose overlap angle is α_{m,m+1}; according to this overlap angle, the overlapping point cloud regions Ŝ_m(k) and Ŝ_{m+1}(k) are selected, where Ŝ_m(k) is the portion of the m-th lidar's scan that overlaps the (m+1)-th lidar, expressed in the m-th lidar coordinate system, and Ŝ_{m+1}(k) is the portion of the (m+1)-th lidar's scan that overlaps the m-th lidar, expressed in the (m+1)-th lidar coordinate system;
Step 3: converting the overlapping point cloud data Ŝ_m(k) and Ŝ_{m+1}(k) from the original polar coordinate system to the rectangular coordinate system, detecting the tear points P_i^m and P_i^{m+1} by the nearest-neighbour rule and dividing the overlapping point clouds of the adjacent lidars into clusters, computing the curvature c_i of every point in each cluster, extracting the corner points to further divide the clusters into point cloud groups, and then screening out the straight-line features according to the curvature information; the specific process of step 3 is as follows:
Step 3.1: convert the point cloud information in Ŝ_m(k) and Ŝ_{m+1}(k) from the polar coordinate system to the rectangular coordinate system.
Let P_i^m be the i-th valid point of Ŝ_m(k), i = 1, 2, 3, …, N_m, where N_m is the number of lidar points in Ŝ_m(k), and let (ρ_i^m, φ_i^m) be its coordinates in the polar coordinate system. Its rectangular coordinates (x_i^m, y_i^m) in the m-th lidar coordinate system are computed as
x_i^m = ρ_i^m · cos(φ_i^m),  y_i^m = ρ_i^m · sin(φ_i^m).
Let P_i^{m+1} be the i-th valid point of Ŝ_{m+1}(k), i = 1, 2, 3, …, N_{m+1}, where N_{m+1} is the number of lidar points in Ŝ_{m+1}(k), and let (ρ_i^{m+1}, φ_i^{m+1}) be its coordinates in the polar coordinate system. Its rectangular coordinates (x_i^{m+1}, y_i^{m+1}) in the (m+1)-th lidar coordinate system are computed as
x_i^{m+1} = ρ_i^{m+1} · cos(φ_i^{m+1}),  y_i^{m+1} = ρ_i^{m+1} · sin(φ_i^{m+1}).
Step 3.2: compute the distance between adjacent valid laser points in the overlapping point clouds Ŝ_m(k) and Ŝ_{m+1}(k) and segment them by the nearest-neighbour rule.
For the overlapping point cloud Ŝ_m(k), the distance between adjacent laser points is
d_i^m = sqrt( (x_{i+1}^m - x_i^m)^2 + (y_{i+1}^m - y_i^m)^2 ),
where P_i^m and P_{i+1}^m are the i-th and (i+1)-th valid points of the cloud, d_i^m is the distance between the two adjacent points, and (x_i^m, y_i^m) and (x_{i+1}^m, y_{i+1}^m) are their coordinates in the m-th lidar coordinate system.
If d_i^m exceeds the distance threshold, P_i^m and P_{i+1}^m are regarded as tear points; the point cloud is divided at the tear points, and each part of the divided point cloud is denoted C_1^m, C_2^m, …, C_{Q_m}^m, where Q_m is the number of point cloud clusters.
For the overlapping point cloud Ŝ_{m+1}(k), the distance between adjacent laser points is
d_i^{m+1} = sqrt( (x_{i+1}^{m+1} - x_i^{m+1})^2 + (y_{i+1}^{m+1} - y_i^{m+1})^2 ),
where P_i^{m+1} and P_{i+1}^{m+1} are the i-th and (i+1)-th valid points of the cloud, d_i^{m+1} is the distance between the two adjacent points, and (x_i^{m+1}, y_i^{m+1}) and (x_{i+1}^{m+1}, y_{i+1}^{m+1}) are their coordinates in the (m+1)-th lidar coordinate system.
If d_i^{m+1} exceeds the distance threshold, P_i^{m+1} and P_{i+1}^{m+1} are regarded as tear points; the point cloud is divided at the tear points, and each part of the divided point cloud is denoted C_1^{m+1}, C_2^{m+1}, …, C_{Q_{m+1}}^{m+1}, where Q_{m+1} is the number of point cloud clusters.
Step 3.3: extract the straight-line features from the segmented clusters C_q^m of Ŝ_m(k) and C_q^{m+1} of Ŝ_{m+1}(k).
For each partial point cloud C_q^m, corner points are extracted by curvature: the curvature c_i^m of a valid point P_i^m is computed over a window of 2n+1 consecutive points centred on P_i^m, where 2n+1 is a preset window size; if c_i^m exceeds the curvature threshold, P_i^m is a corner point.
The point cloud is split at the corner points, and the resulting point cloud groups are denoted G_1^m, G_2^m, …, G_{M_m}^m, where M_m is the number of point cloud groups.
For each partial point cloud C_q^{m+1}, corner points are extracted in the same way: the curvature c_i^{m+1} of a valid point P_i^{m+1} is computed over a window of 2n+1 consecutive points centred on P_i^{m+1}; if c_i^{m+1} exceeds the curvature threshold, P_i^{m+1} is a corner point.
The point cloud is split at the corner points, and the resulting point cloud groups are denoted G_1^{m+1}, G_2^{m+1}, …, G_{M_{m+1}}^{m+1}, where M_{m+1} is the number of point cloud groups.
Step 4: performing least-squares straight-line fitting on the groups obtained in step 3 to solve the straight-line equations, and matching the straight-line features established in the different lidar coordinate systems to find the same straight line L_{m+1}, L_m; the specific process is as follows:
First, fit a straight line to each of the M_m point cloud groups G_j^m of Ŝ_m(k) by least squares, the objective function being the minimisation of Σ_i (k_j^m · x_{j,i}^m + b_j^m - y_{j,i}^m)^2.
Denote by P_{j,i}^m = (x_{j,i}^m, y_{j,i}^m) the i-th laser point of the point cloud group G_j^m obtained after segmentation, i = 1, 2, …, n_j^m, where n_j^m is the number of laser points in G_j^m and the coordinates are expressed in the m-th lidar coordinate system.
The equation parameters (k_j^m, b_j^m) of the extracted straight-line feature are computed as
k_j^m = ( n_j^m · Σ_i x_{j,i}^m y_{j,i}^m - Σ_i x_{j,i}^m · Σ_i y_{j,i}^m ) / ( n_j^m · Σ_i (x_{j,i}^m)^2 - (Σ_i x_{j,i}^m)^2 ),
b_j^m = ( Σ_i y_{j,i}^m - k_j^m · Σ_i x_{j,i}^m ) / n_j^m.
Likewise, fit a straight line to each of the M_{m+1} point cloud groups G_j^{m+1} of Ŝ_{m+1}(k) by least squares, with objective function Σ_i (k_j^{m+1} · x_{j,i}^{m+1} + b_j^{m+1} - y_{j,i}^{m+1})^2, where P_{j,i}^{m+1} = (x_{j,i}^{m+1}, y_{j,i}^{m+1}) is the i-th laser point of group G_j^{m+1}, i = 1, 2, …, n_j^{m+1}, n_j^{m+1} is the number of laser points in G_j^{m+1}, and the coordinates are expressed in the (m+1)-th lidar coordinate system; the equation parameters (k_j^{m+1}, b_j^{m+1}) of the extracted straight-line feature are computed with the same formulas;
Step 5: solving the coordinate transformation matrix from the different straight-line equations of the same straight-line feature in the different lidar coordinate systems to obtain the rotation angle θ' and the translations dx and dy; the specific process is as follows:
From step 4, the same straight line satisfies, in the two adjacent lidar coordinate systems,
y_1 = k^m · x_1 + b^m  (in the m-th lidar coordinate system),
y_2 = k^{m+1} · x_2 + b^{m+1}  (in the (m+1)-th lidar coordinate system),
where (k^m, b^m) are the straight-line parameters obtained from the point cloud group of Ŝ_m(k) and (k^{m+1}, b^{m+1}) are the straight-line parameters obtained from the point cloud group of Ŝ_{m+1}(k).
Let the coordinate conversion between the two adjacent lidar coordinate systems be
x_1 = x_2 · cos θ - y_2 · sin θ + dx,
y_1 = x_2 · sin θ + y_2 · cos θ + dy,
where (x_1, y_1) and (x_2, y_2) are the coordinates of the same laser point in the Ŝ_m(k) and Ŝ_{m+1}(k) coordinate systems respectively, θ is the angle between the two coordinate systems, and dx and dy are the translations along the X and Y axes. Then
θ' = arctan(k^m) - arctan(k^{m+1}),
where θ' is either θ or θ ± π. Let P_s^m and P_e^m be the start and end points of L_m, and P_s^{m+1} and P_e^{m+1} be the start and end points of L_{m+1}; the direction vector P_e^m - P_s^m is compared with the direction vector P_e^{m+1} - P_s^{m+1} rotated by θ', and θ is taken as θ' if the two directions agree and as θ' ± π otherwise.
dx and dy are then solved by combining the equation sets of several corresponding straight lines. The equations of two straight-line features L1 and L2 in the adjacent lidar coordinate systems are known to be
L1: y_1 = k_1^m · x_1 + b_1^m and y_2 = k_1^{m+1} · x_2 + b_1^{m+1},
L2: y_1 = k_2^m · x_1 + b_2^m and y_2 = k_2^{m+1} · x_2 + b_2^{m+1},
where (k_1^m, b_1^m) and (k_1^{m+1}, b_1^{m+1}) are the parameters of the straight line L1 in the point cloud groups of Ŝ_m(k) and Ŝ_{m+1}(k), and (k_2^m, b_2^m) and (k_2^{m+1}, b_2^{m+1}) are the parameters of the straight line L2 in the point cloud groups of Ŝ_m(k) and Ŝ_{m+1}(k).
Substituting the coordinate conversion into the straight-line equations and equating the constant terms gives one linear equation in dx and dy for each straight line:
k_1^m · dx - dy = (cos θ + k_1^m · sin θ) · b_1^{m+1} - b_1^m,
k_2^m · dx - dy = (cos θ + k_2^m · sin θ) · b_2^{m+1} - b_2^m.
Solving these two equations simultaneously (the two straight lines having different slopes) yields dx and dy;
Step 6: repeating steps 2 to 5 until the coordinate transformation matrices between all adjacent lidar coordinate systems are obtained, completing the calibration of the multiple lidars.
CN202010802368.5A 2020-08-11 2020-08-11 Method for unifying multiple lidar coordinate systems Active CN112084875B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010802368.5A CN112084875B (en) Method for unifying multiple lidar coordinate systems

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010802368.5A CN112084875B (en) Method for unifying multiple lidar coordinate systems

Publications (2)

Publication Number Publication Date
CN112084875A CN112084875A (en) 2020-12-15
CN112084875B true CN112084875B (en) 2024-06-11

Family

ID=73735835

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010802368.5A Active CN112084875B (en) Method for unifying multiple lidar coordinate systems

Country Status (1)

Country Link
CN (1) CN112084875B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113340266A (en) * 2021-06-02 2021-09-03 江苏豪杰测绘科技有限公司 Indoor space surveying and mapping system and method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105787933A (en) * 2016-02-19 2016-07-20 武汉理工大学 Water front three-dimensional reconstruction apparatus and method based on multi-view point cloud registration
CN108445505A (en) * 2018-03-29 2018-08-24 南京航空航天大学 Feature significance detection method based on laser radar under thread environment
CN108562289A (en) * 2018-06-07 2018-09-21 南京航空航天大学 Quadrotor laser radar air navigation aid in continuous polygon geometry environment
CN111275748A (en) * 2020-01-15 2020-06-12 南京航空航天大学 Point cloud registration method based on laser radar in dynamic environment

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105787933A (en) * 2016-02-19 2016-07-20 武汉理工大学 Water front three-dimensional reconstruction apparatus and method based on multi-view point cloud registration
CN108445505A (en) * 2018-03-29 2018-08-24 南京航空航天大学 Feature significance detection method based on laser radar under thread environment
CN108562289A (en) * 2018-06-07 2018-09-21 南京航空航天大学 Quadrotor laser radar air navigation aid in continuous polygon geometry environment
CN111275748A (en) * 2020-01-15 2020-06-12 南京航空航天大学 Point cloud registration method based on laser radar in dynamic environment

Also Published As

Publication number Publication date
CN112084875A (en) 2020-12-15

Similar Documents

Publication Publication Date Title
CN110781827B (en) Road edge detection system and method based on laser radar and fan-shaped space division
CN110084272B (en) Cluster map creation method and repositioning method based on cluster map and position descriptor matching
CN110033457B (en) Target point cloud segmentation method
CN112070770B (en) High-precision three-dimensional map and two-dimensional grid map synchronous construction method
CN109947097B (en) Robot positioning method based on vision and laser fusion and navigation application
CN101567046B (en) Target recognition method of unmanned aerial vehicle based on minimum circle-cover matching
CN111781608B (en) Moving target detection method and system based on FMCW laser radar
CN112665575B (en) SLAM loop detection method based on mobile robot
CN111968177A (en) Mobile robot positioning method based on fixed camera vision
CN110794396B (en) Multi-target identification method and system based on laser radar and navigation radar
CN103065131A (en) Method and system of automatic target recognition tracking under complex scene
CN104517289A (en) Indoor scene positioning method based on hybrid camera
CN111198496A (en) Target following robot and following method
CN112084875B (en) Method for unifying multiple lidar coordinate systems
CN113985435A (en) Mapping method and system fusing multiple laser radars
CN112232139A (en) Obstacle avoidance method based on combination of Yolo v4 and Tof algorithm
CN112947526A (en) Unmanned aerial vehicle autonomous landing method and system
CN113947636B (en) Laser SLAM positioning system and method based on deep learning
CN112907610B (en) LeGO-LOAM-based step-by-step interframe pose estimation algorithm
CN102831388A (en) Method and system for detecting real-time characteristic point based on expanded active shape model
CN115144879B (en) Multi-machine multi-target dynamic positioning system and method
CN111239761B (en) Method for indoor real-time establishment of two-dimensional map
CN115436968A (en) Bitmap repositioning method based on laser radar
CN112731923B (en) Cluster robot co-positioning system and method
Guo et al. 3D Lidar SLAM Based on Ground Segmentation and Scan Context Loop Detection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant