CN107917710B - Indoor real-time positioning and three-dimensional map construction method based on single line laser


Info

Publication number
CN107917710B
CN107917710B
Authority
CN
China
Prior art keywords
plane
line segment
straight
dimensional
straight line
Prior art date
Legal status
Active
Application number
CN201711092661.1A
Other languages
Chinese (zh)
Other versions
CN107917710A (en)
Inventor
姚剑 (Jian Yao)
谈彬 (Bin Tan)
罗磊 (Lei Luo)
李礼 (Li Li)
Current Assignee
Wuhan University WHU
Original Assignee
Wuhan University WHU
Priority date
Filing date
Publication date
Application filed by Wuhan University WHU filed Critical Wuhan University WHU
Priority to CN201711092661.1A priority Critical patent/CN107917710B/en
Publication of CN107917710A publication Critical patent/CN107917710A/en
Application granted granted Critical
Publication of CN107917710B publication Critical patent/CN107917710B/en


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20: Instruments for performing navigational calculations
    • G01C21/206: Instruments for performing navigational calculations specially adapted for indoor navigation
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88: Lidar systems specially adapted for specific applications
    • G01S17/89: Lidar systems specially adapted for specific applications for mapping or imaging

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses an indoor real-time positioning and three-dimensional map construction method based on single-line laser. By fixing a single-line laser range finder on a rotating device, the method overcomes the limitation that such a range finder can only acquire planar 2D data, and thereby acquires 3D data of the indoor environment. For the acquired 3D scanning data, an accurate structure map construction algorithm and a robust scanning line matching algorithm estimate the six-degree-of-freedom pose of each laser scanning line, yielding the motion trajectory of the sensor; combined with the laser scanning data, this produces a three-dimensional map of the scanned environment. The invention thus realizes an accurate method for indoor real-time positioning and map construction.

Description

Indoor real-time positioning and three-dimensional map construction method based on single line laser
Technical Field
The invention belongs to the technical field of indoor three-dimensional environment mapping, and relates to a method for indoor real-time positioning and three-dimensional map construction based on single-line laser. The method can be used for real-time positioning services in indoor environments and meets the construction requirements of indoor three-dimensional environment maps.
Background
With the rapid development of computer technology, automation, artificial intelligence, and related technologies, and with the steady rise in living standards, demand for indoor service robots keeps increasing. Unlike traditional industrial robots, a service robot must cope with more complex and changeable working environments. Automation and intelligence are therefore two essential characteristics of today's indoor service robots, and an autonomous, intelligent service robot can work effectively in complex, unknown environments. To achieve autonomous intelligence, a robot must be able to perceive the three-dimensional environment and determine its own position within it. Conventional sensors for acquiring environmental data are mainly vision sensors and laser sensors. Vision sensors are cheap to manufacture and convenient to use; typical examples are monocular cameras (e.g. industrial cameras), binocular cameras, and depth cameras (e.g. RGB-D cameras). However, the field of view of such vision sensors is generally small, and they are sensitive to changes in ambient lighting. Compared with vision sensors, laser sensors have a larger field of view and are far less sensitive to illumination changes. In addition, laser sensors offer higher ranging accuracy, which favors the construction of high-precision environment maps.
Laser sensors can be subdivided into 2D and 3D laser sensors. A 3D laser sensor can directly acquire three-dimensional scan data of the environment and is therefore widely used for navigation in outdoor environments. However, 3D laser sensors are expensive and ill-suited to service robots working indoors. By contrast, a 2D laser sensor is low in cost while offering ranging accuracy and a viewing angle comparable to those of a 3D laser sensor. Because a 2D laser sensor can only acquire planar two-dimensional scan data, however, it is usually used only to estimate the planar motion trajectory of the sensor and to construct an indoor planar map; the open-source Hector SLAM and GMapping are typical such algorithms at present. For an indoor service robot, determining only its planar pose and constructing only a planar environment map is far from enough.
Disclosure of Invention
In order to determine the six-degree-of-freedom pose of the robot, construct an indoor three-dimensional environment map and reduce the cost of a sensor, the invention provides an indoor real-time positioning and three-dimensional environment map construction method based on a 2D laser sensor.
The technical scheme adopted by the invention is as follows: a method for indoor real-time positioning and three-dimensional map construction based on single line laser is characterized by comprising the following steps:
step 1: acquiring original point cloud data on a two-dimensional scanning line (Scan) and extracting straight line segments from it; the set formed by the scanning lines that completely cover the three-dimensional space is called a Sweep;
step 2: extracting planes from the straight line segment set obtained in step 1;
step 3: for the first Sweep, extracting the main directions using the extracted plane information and thinning the point cloud;
step 4: constructing an initial structure map;
step 5: matching a single Scan with the structure map and extracting the point pairs needing to be optimized;
step 6: optimizing the distance d between corresponding points and calculating the pose estimation value of the Scan;
step 7: iterating steps 5-6 until all Scans in a Sweep have been optimized, then adding the point cloud into the structure map and growing the structure map;
step 8: iterating steps 5-7 until all the point cloud data has been processed.
Compared with the prior art, the invention has the following beneficial effects: using a lower-cost laser scanning strategy, it realizes real-time pose estimation of the sensor and accurate mapping of the indoor three-dimensional environment through an accurate structure map construction algorithm and a robust scanning line matching algorithm, providing an effective technical means for the development of indoor intelligent service robots.
Drawings
FIG. 1 is a flow chart of an embodiment of the present invention;
FIG. 2 is a diagram illustrating a result of constructing a structure map according to an embodiment of the present invention;
fig. 3 is a schematic diagram of a result of superimposing a three-dimensional scene map and a motion trail in the embodiment of the present invention.
Detailed Description
In order to facilitate understanding and implementation of the present invention by those of ordinary skill in the art, the invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the embodiments described here are merely illustrative and explanatory and are not restrictive of the invention.
Referring to fig. 1, the method for indoor real-time positioning and three-dimensional map construction based on single line laser provided by the invention comprises the following steps:
step 1.1: starting from a starting point on a scanning line, performing least-squares straight line fitting on the N scanning points behind it (including the starting point) and calculating the fitting error;

The fitting error is expressed as:

$$\varepsilon_l = \sqrt{\frac{1}{N}\sum_{i=j}^{j+N-1}\left(a x_i + b - y_i\right)^2}$$

where a and b are the straight line fitting parameters, $(x_i, y_i)$ are the two-dimensional coordinates of the i-th scanning point, j is the serial number of the starting point of the current straight line segment on the current scanning line, and N is a preset value, taken as 10 in this embodiment;

step 1.2: when the fitting error $\varepsilon_l < T_l$, an initial straight line segment is considered obtained; otherwise, the current starting point is skipped and the operation of step 1.1 is repeated from the next point until an initial line segment is found; here $T_l$ is the threshold of the straight line fitting error, taken as 50 mm;
step 1.3: after an initial straight line segment is obtained, expanding it; this specifically comprises the following substeps:

step 1.3.1: starting from a point $P_i$ after the current initial straight line segment, calculating the distance d from $P_i$ to the current straight line segment, where i > j + N, j represents the serial number of the starting point of the current straight line segment in the scanning line, and N may be taken as 10; if $d < T_d$, the point belongs to the current straight line segment: $P_i$ is added to the current straight line segment to expand it, the straight line segment is fitted again, and its fitting parameters are updated; otherwise, the current straight line segment cannot be expanded further and the expansion operation ends; here $T_d$ is the maximum threshold of the point-to-line distance and may be taken as 50 mm;

step 1.3.2: if the current straight line segment was expanded successfully, repeating the operation of step 1.3.1 until the segment cannot be expanded further;

step 1.4: after the expansion of one straight line segment ends, adding it to the candidate straight line segment set and repeating the operations of steps 1.1-1.3 from the point following the current straight line segment, until no new straight line segment can be extracted from the current scanning line;
step 1.5: performing merging and deleting operations on all candidate straight line segments on the current scanning line to obtain the final straight line segments; this specifically comprises the following substeps:

step 1.5.1: starting from the i-th candidate straight line segment, judging whether the adjacent (i+1)-th candidate straight line segment can be merged with it, under the conditions:

$$\theta(v_i, v_{i+1}) < T_\theta \quad \text{and} \quad \left\| p_i^{\mathrm{end}} - p_{i+1}^{\mathrm{start}} \right\| < T_d'$$

where $v_i$ is the direction vector of the i-th straight line segment, $\theta(v_i, v_{i+1})$ is the angle between the direction vectors of the two segments, $T_\theta$ is a threshold that may be taken as 1°, $p_i^{\mathrm{end}}$ is the last point of the i-th candidate straight line segment, $p_{i+1}^{\mathrm{start}}$ is the first point of the (i+1)-th candidate straight line segment, $\|\cdot\|$ denotes the distance between the two points, and $T_d'$ is a threshold that may be taken as 200 mm;

when the angle between the direction vectors of two adjacent candidate straight line segments is smaller than the threshold and the distance between their head and tail points is also smaller than the threshold, the two candidate straight line segments are merged into one, and the fitting parameters of the merged straight line segment are recalculated;
step 1.5.2: repeating the operation of step 1.5.1 until no more candidate straight line segments can be merged; calculating the length of each merged straight line segment, removing segments shorter than 100 mm, taking the remaining segments as the finally extracted straight line segments on the current scanning line, and storing them in order.
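For illustration, the seed-and-grow extraction of steps 1.1-1.4 can be sketched in Python with NumPy as follows. This is a minimal sketch under the thresholds stated above (units in mm); the function names are illustrative, the fit assumes the segment is not vertical in the sensor frame, and the merge/prune pass of step 1.5 is omitted for brevity:

```python
import numpy as np

def fit_line(pts):
    """Least-squares fit y = a*x + b; returns (a, b, RMS residual)."""
    a, b = np.polyfit(pts[:, 0], pts[:, 1], 1)
    rms = np.sqrt(np.mean((a * pts[:, 0] + b - pts[:, 1]) ** 2))
    return a, b, rms

def point_line_distance(p, a, b):
    """Perpendicular distance from point p to the line y = a*x + b."""
    return abs(a * p[0] - p[1] + b) / np.hypot(a, 1.0)

def extract_segments(scan, N=10, T_l=50.0, T_d=50.0):
    """Steps 1.1-1.4: seed N-point windows, keep those with fitting error
    below T_l, then grow each seed while the next point stays within T_d."""
    segments, j = [], 0
    while j + N <= len(scan):
        a, b, err = fit_line(scan[j:j + N])
        if err >= T_l:                    # no valid seed here: slide one point
            j += 1
            continue
        end = j + N                       # grow the seed point by point
        while end < len(scan) and point_line_distance(scan[end], a, b) < T_d:
            end += 1
            a, b, _ = fit_line(scan[j:end])   # refit with the absorbed point
        segments.append((j, end, a, b))   # candidate segment covers [j, end)
        j = end                           # continue after this segment
    return segments
```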
Step 2: extracting planes from the straight line segment set obtained in step 1. The specific steps are as follows:
step 2.1: converting the two-dimensional straight line segments on each scanning line into three-dimensional straight line segments according to the pose value of that scanning line; if the scanning line is acquired statically, its pose can be obtained directly from the motor rotation angle provided by the scanning device; if it is not acquired statically, the pose estimation value of the scanning line can be obtained by the methods of steps 5-6. When the three-dimensional conversion has been completed for all scanning lines in a Sweep, a set of three-dimensional straight line segments distributed over the whole scanned environment is obtained, recorded as

$$L = \{\, l_i^j \,\},$$

where $l_i^j$ represents the j-th straight line segment on the i-th scanning line;
step 2.2: choosing any straight line segment $l_i^j$ in L; on the scanning lines $S_{i+1}$ and $S_{i-1}$ adjacent to the scanning line $S_i$ associated with $l_i^j$, searching for straight line segments coplanar with $l_i^j$;

The plane equation is described as:

$$x \cdot n = d,$$

where x is a point on the plane, n is the normal vector of the plane, and d is a constant. Least-squares plane fitting is performed on $l_i^j$ and the straight line segments on the adjacent scanning lines according to this plane equation; if the fitting error is smaller than a preset value, an initial plane Pl is considered found. Otherwise, another straight line segment is chosen and the operation of step 2.2 is repeated until an initial plane is found;
step 2.3: searching around Pl for straight line segments belonging to Pl; a straight line segment l around Pl is considered to belong to the plane Pl if it satisfies the condition

$$\left| x_i \cdot n - d \right| < T_p$$

for every point $x_i$ on the straight line segment l, where $T_p$ is a preset point-to-plane distance threshold; the straight line segment l is then added into the plane Pl, and the fitting parameters of the plane Pl are recalculated;
step 2.4: repeating the operation of step 2.3 and continuously expanding the plane Pl until no straight line segment around Pl can be merged into it; a complete plane Pl is then obtained;

step 2.5: for the remaining straight line segments in L, repeating the operations of steps 2.2-2.4 until no new plane can be extracted.
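The least-squares plane fitting and region growing of steps 2.2-2.4 can be sketched as follows; the SVD-based fit and the helper names are illustrative assumptions, with the point-to-plane threshold T_p standing in for the patent's preset value:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane x . n = d through 3D points; returns the unit
    normal n, the offset d, and the RMS point-to-plane residual."""
    centroid = points.mean(axis=0)
    # the normal is the right singular vector of the smallest singular value
    _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
    n = vt[-1]
    d = centroid @ n
    return n, d, np.sqrt(np.mean((points @ n - d) ** 2))

def grow_plane(seed_points, segments, T_p=50.0):
    """Steps 2.3-2.4: absorb every segment whose points all lie within T_p
    of the current plane, refitting the plane after each absorption."""
    pts = np.asarray(seed_points, dtype=float)
    n, d, _ = fit_plane(pts)
    remaining, changed = list(segments), True
    while changed:
        changed, kept = False, []
        for seg in remaining:                       # seg: (m, 3) point array
            if np.all(np.abs(seg @ n - d) < T_p):   # every point near the plane
                pts = np.vstack([pts, seg])
                n, d, _ = fit_plane(pts)
                changed = True
            else:
                kept.append(seg)
        remaining = kept
    return n, d, pts
```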
Step 3: for the first Sweep, extracting the main directions using the extracted plane information and thinning the point cloud. The main steps are as follows:
step 3.1: initializing three orthogonal main directions $D = \{d_1, d_2, d_3\} = \{(1,0,0), (0,1,0), (0,0,1)\}$, calculating the normal vector $n_p$ of each plane and normalizing it to unit length;

step 3.2: cross-multiplying the normal vector of each plane with the three main directions, $n_i = n_p \times d_i$, and selecting the main direction for which $\|n_i\|$ is smallest as the main direction $d_p$ matched with that plane;
Step 3.3: performing main direction optimization by using the matched normal vector pair and by using ceres, and further extracting an optimal main direction;
the main direction optimization by means of ceres specifically comprises the following sub-steps:
step 3.3.1: d ═ D1,d2,d3The transformation matrix r of { rotateX, rotay, rotaz } is used as an optimization term of ceres;
step 3.3.2: cross-multiplication of the matched vector pairs is calculated for each plane, s ═ np×dpTaking s as residual error term of ceres optimization;
step 3.3.3: through continuous iteration optimization R, a corresponding rotation matrix of the main direction is obtained when the optimization error is smaller than a preset value, the R at the moment is converted into a 3 x 3 rotation matrix R, and then the optimized main direction
Figure GDA0002797079920000053
Step 3.4: and uniformly thinning the structural map point cloud.
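The embodiment optimizes the main directions with the Ceres solver; the sketch below reproduces the same residual (the cross product s = n_p × d_p for each matched pair) using scipy.optimize.least_squares as a stand-in, with the rotation r parameterized as a rotation vector. The function names are illustrative:

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

D0 = np.eye(3)   # initial main directions d1, d2, d3 as rows

def match_direction(n_p, D):
    """Step 3.2: the main direction whose cross product with n_p is smallest."""
    return D[np.argmin(np.linalg.norm(np.cross(n_p, D), axis=1))]

def residuals(rotvec, normals):
    """Step 3.3.2: stack the cross products s = n_p x d_p as residuals."""
    D = Rotation.from_rotvec(rotvec).apply(D0)
    return np.concatenate([np.cross(n, match_direction(n, D)) for n in normals])

def refine_main_directions(normals):
    """normals: unit normal vectors of the extracted planes."""
    sol = least_squares(residuals, np.zeros(3), args=(normals,))
    R = Rotation.from_rotvec(sol.x).as_matrix()
    return D0 @ R.T   # optimized main directions D' = R * D (as rows)
```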
Step 4: constructing an initial structure map. The basic steps are as follows:
step 4.1: for each plane, determining the main direction $d_p$ nearest to it and calculating the angle between the plane normal and that direction:

$$\theta = \arccos\!\left(\frac{n_p \cdot d_p}{\|n_p\|\,\|d_p\|}\right);$$

if θ is not larger than a preset threshold, which may be taken as 15°, plane direction optimization is performed;

the plane direction optimization specifically comprises the following substeps:
step 4.1.1: translating the point cloud C into a coordinate system with its center of gravity $P_g$ as origin, forming the point cloud

$$C_g = \{\, x - P_g : x \in C \,\};$$

step 4.1.2: solving the rotation matrix R from the rotation axis $s = n_p \times d_p$ and the rotation angle θ;

step 4.1.3: transforming the point cloud, $C'_g = R\,C_g$;

step 4.1.4: transforming the point cloud $C'_g$ back to the world coordinate system,

$$C' = \{\, x + P_g : x \in C'_g \,\};$$
step 4.2: judging whether the structure map has been initialized. If it has not been initialized, the optimized plane is directly added into the structure map, completing the initialization of the structure map. If the structure map has been initialized, judging whether a plane similar to the optimized plane exists in the structure map;

if a similar plane exists, merging is performed; the merging condition is that the plane normal vectors are equal and the plane distance d' is smaller than a preset threshold, taken as 0.1 m in this embodiment:

$$d' = \frac{\left| A x_p + B y_p + C z_p + D \right|}{\sqrt{A^2 + B^2 + C^2}},$$

where $(x_p, y_p, z_p)$ is the center of gravity of one plane and A, B, C, D are the plane equation coefficients of the other;

the merging process includes the following substeps:

step 4.2.1: calculating the merged center of gravity $P_g$:

$$P_g = \frac{\mathrm{size}_1\,P_{g1} + \mathrm{size}_2\,P_{g2}}{\mathrm{size}_1 + \mathrm{size}_2},$$

where $P_{g1}, P_{g2}$ are the centers of gravity of the two planes and $\mathrm{size}_1, \mathrm{size}_2$ represent the numbers of points of the two planes;
step 4.2.2: translating each point cloud toward the merged center of gravity along the plane normal direction n,

$$T_k = \big((P_g - P_{gk}) \cdot n\big)\,n, \quad k = 1, 2,$$

where $T_1, T_2$ represent the translation amounts of the point clouds of the two planes;
if no similar plane exists, the points of the plane are projected onto the absolute plane of its plane equation, and the plane is added into the structure map;

step 4.3: cycling through steps 4.1-4.2 until no plane remains to be optimized; the structure map is then formed.
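The plane direction optimization of steps 4.1.1-4.1.4 amounts to rotating a plane's point cloud about its center of gravity until its normal coincides with the nearest main direction; a minimal sketch, with the rotation obtained from SciPy and the 15° gate of the embodiment:

```python
import numpy as np
from scipy.spatial.transform import Rotation

def snap_plane_to_direction(C, n_p, d_p, theta_max=np.deg2rad(15.0)):
    """Steps 4.1.1-4.1.4: rotate point cloud C about its center of gravity so
    that its unit normal n_p aligns with the matched main direction d_p."""
    theta = np.arccos(np.clip(np.dot(n_p, d_p), -1.0, 1.0))
    if theta < 1e-9 or theta > theta_max:
        return C                           # already aligned, or too far off
    axis = np.cross(n_p, d_p)              # rotation axis s = n_p x d_p
    R = Rotation.from_rotvec(axis / np.linalg.norm(axis) * theta).as_matrix()
    P_g = C.mean(axis=0)                   # center of gravity
    return (C - P_g) @ R.T + P_g           # rotate locally, translate back
```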
Step 5: matching a single Scan with the structure map and extracting the point pairs needing to be optimized.
The basic steps are as follows:
step 5.1: determining the initial pose of the current Scan:

$$\hat{P}_i^s = P_{i-1}^s \left(P_{i-1}^m\right)^{-1} P_i^m,$$

where $P_{i-1}^s, P_i^s$ represent the poses of the (i-1)-th and i-th Scan, and $P_{i-1}^m, P_i^m$ represent the motor poses corresponding to the (i-1)-th and i-th Scan;
step 5.2: extracting matching points: the straight line segments of the Scan are down-sampled, and the matching point of each down-sampled point is searched for in the structure map, forming matching point pairs.
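Reading the poses as 4 × 4 homogeneous transforms, the initial pose of step 5.1 is the previous Scan pose propagated by the relative motor motion; a one-line sketch under that assumption:

```python
import numpy as np

def predict_scan_pose(P_prev_s, P_prev_m, P_curr_m):
    """Step 5.1: previous Scan pose composed with the relative motor motion;
    all arguments are 4 x 4 homogeneous transformation matrices."""
    return P_prev_s @ np.linalg.inv(P_prev_m) @ P_curr_m
```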
Step 6: optimizing the distance d between corresponding points,

$$d = \sqrt{(x_i - x_i^m)^2 + (y_i - y_i^m)^2 + (z_i - z_i^m)^2},$$

where $(x_i, y_i, z_i)$ are the three-dimensional coordinates of the i-th scanning point and $(x_i^m, y_i^m, z_i^m)$ those of its matching point in the structure map; the pose of each Scan is taken as the optimization term, the distances between corresponding points are taken as the residual terms, and the optimization is performed with Ceres; when the difference between two successive optimized poses falls below a threshold, the optimization ends and the optimized pose of each scanning line is obtained.
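As in step 3.3, the embodiment performs this optimization with Ceres; the sketch below again substitutes scipy.optimize.least_squares, parameterizing the Scan pose as a rotation vector plus translation and minimizing the corresponding-point distances. The matched pairs are assumed to come from step 5.2:

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def pose_residuals(x, scan_pts, map_pts):
    """x = (rx, ry, rz, tx, ty, tz); residuals are the distances between the
    transformed scan points and their matched structure-map points."""
    moved = scan_pts @ Rotation.from_rotvec(x[:3]).as_matrix().T + x[3:]
    return np.linalg.norm(moved - map_pts, axis=1)

def optimize_scan_pose(scan_pts, map_pts, x0=np.zeros(6)):
    """Step 6: refine the 6-DoF pose of one scanning line; in practice x0
    would be seeded from the step 5.1 prediction rather than zeros."""
    return least_squares(pose_residuals, x0, args=(scan_pts, map_pts)).x
```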
Step 7: iterating steps 5-6 until the point cloud of a Sweep has been optimized, then adding the point cloud into the structure map and growing the structure map. The specific steps are as follows:
step 7.1: performing plane extraction on the new Sweep according to steps 1-2;
step 7.2: for each extracted plane $P_i$, searching the structure map for candidate faces; the candidate conditions are that the angle between the two faces is smaller than the angle threshold $T_\theta'$, which may be taken as 15°, and the distance between the two faces is smaller than the distance threshold $T_d''$, which may be taken as 100 mm:

$$\theta' = \arccos\!\left(n_{\mathrm{new}} \cdot n_{\mathrm{struct}}\right), \qquad d'' = \frac{\left| A x_p + B y_p + C z_p + D \right|}{\sqrt{A^2 + B^2 + C^2}},$$

where $(x_p, y_p, z_p)$ is the center of gravity of one face, A, B, C, D are the plane equation coefficients of the other, $n_{\mathrm{new}}$ denotes the normal vector of the extracted plane, and $n_{\mathrm{struct}}$ denotes the normal vector of the candidate face in the structure map.
If no candidate face exists, the points of the plane are projected onto the absolute plane of its plane equation, and the plane is added into the structure map; if a candidate face exists, the process proceeds to step 7.3;
step 7.3: determining the merge face $P_{\mathrm{merge}}$ by building a k-d tree over the candidate faces and searching it;

this specifically comprises the following substeps:

step 7.3.1: building a k-d tree for the candidate face point cloud and establishing, for each point, an index to the face it belongs to;

step 7.3.2: for each line segment composing $P_i$, taking its two end points and its middle point as search points;

step 7.3.3: searching the k-d tree for the nearest point of each search point and recording the plane that nearest point belongs to; finally, the plane owning the most nearest points is taken as the merge face $P_{\mathrm{merge}}$;

step 7.4: optimizing the direction of $P_i$ to agree with that of $P_{\mathrm{merge}}$, merging the two coplanar faces, and growing the structure map.
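The nearest-neighbour vote of steps 7.3.1-7.3.3 maps directly onto a SciPy k-d tree; a minimal sketch, where candidate_face_ids (the face index of every candidate-face point) and the search-point selection are assumptions matching the text above:

```python
import numpy as np
from scipy.spatial import cKDTree

def find_merge_face(search_pts, candidate_pts, candidate_face_ids):
    """Steps 7.3.1-7.3.3: search_pts are the two end points and middle point
    of each segment of the new plane P_i; the merge face P_merge is the
    candidate face owning the most nearest neighbours."""
    tree = cKDTree(candidate_pts)           # step 7.3.1: build the k-d tree
    _, idx = tree.query(search_pts)         # step 7.3.3: nearest map points
    votes = np.bincount(candidate_face_ids[idx])
    return int(np.argmax(votes))            # face with the most hits
```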
Step 8: cyclically iterating steps 5-7 until all the point cloud data has been processed; the final structure map is shown in fig. 2, and a schematic diagram of the result of superimposing the three-dimensional scene map and the motion trail is shown in fig. 3.
In terms of hardware, the invention fixes a two-dimensional laser range finder on a stable, high-precision handheld rotating platform. The rotating platform can drive the two-dimensional laser range finder in two modes, continuous rotation and oscillating rotation, so that three-dimensional environment data can be acquired. For the acquired three-dimensional environment data, the invention designs an accurate structure map construction algorithm and a robust scanning line matching algorithm, realizing real-time positioning in an indoor environment and construction of a three-dimensional environment map.
The invention provides an accurate structure map construction algorithm based on structural information and a robust scanning line matching algorithm, realizing a real-time pose trajectory of the sensor and accurate mapping of the indoor three-dimensional environment. The method provides an effective technical basis for the development of indoor intelligent service robots; it is fast, accurate, and highly practical, and can be applied to scenarios such as three-dimensional modeling and service robots.
It should be understood that parts of the specification not set forth in detail are well within the prior art.
It should be understood that the above description of the preferred embodiments is given for clarity and not for any purpose of limitation, and that various changes, substitutions and alterations can be made herein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (7)

1. A method for indoor real-time positioning and three-dimensional map construction based on single line laser is characterized by comprising the following steps:
step 1: acquiring original point cloud data on a two-dimensional scanning line and extracting straight line segments from the two-dimensional scanning line; the set formed by the scanning lines that completely cover the three-dimensional space is recorded as a Sweep;

the extraction of the straight line segments in the two-dimensional scanning line specifically comprises the following steps:
step 1.1: starting from a starting point on a two-dimensional scanning line, performing least square straight line fitting on N scanning points behind the starting point, and calculating a fitting error;
the expression of the fitting error is as follows:
$$\varepsilon_l = \sqrt{\frac{1}{N}\sum_{i=j}^{j+N-1}\left(a x_i + b - y_i\right)^2}$$

where a and b are the straight line fitting parameters, $(x_i, y_i)$ are the two-dimensional coordinates of the i-th scanning point, j is the serial number of the starting point of the current straight line segment on the current two-dimensional scanning line, and N is a preset value;

step 1.2: when the fitting error $\varepsilon_l < T_l$, an initial straight line segment is considered obtained; otherwise, the current starting point is skipped and the operation of step 1.1 is repeated from the next point until an initial straight line segment is found; where $T_l$ is the threshold of the straight line fitting error;
step 1.3: after an initial straight line segment is obtained, expanding the initial straight line segment; the method specifically comprises the following substeps:
step 1.3.1: starting from a point $P_i$ after the current initial straight line segment, calculating the distance L from $P_i$ to the current initial straight line segment, where i > j + N and j represents the serial number of the starting point of the current initial straight line segment in the two-dimensional scanning line; if $L < T_d$, the point belongs to the current initial straight line segment: $P_i$ is added to the current initial straight line segment to expand it, the initial straight line segment is fitted again, and its fitting parameters are updated; otherwise, the current initial straight line segment cannot be expanded further and the expansion operation ends; where $T_d$ is the maximum threshold of the point-to-line distance;
step 1.3.2: if the current initial straight-line segment is successfully expanded, continuing to repeat the operation of the step 1.3.1 until the current initial straight-line segment cannot be expanded continuously;
step 1.4: after the expansion of one initial straight line segment is finished, adding the initial straight line segment into a candidate straight line segment set, and repeating the operations of the steps 1.1-1.3 from the next point of the current initial straight line segment until a new straight line segment can not be extracted from the current two-dimensional scanning line;
step 1.5: performing merging and deleting operations on all candidate straight-line segments on the current two-dimensional scanning line to obtain a final straight-line segment; the method specifically comprises the following substeps:
step 1.5.1: starting from the i-th candidate straight line segment, judging whether the adjacent (i+1)-th candidate straight line segment can be merged with it, under the conditions:

$$\theta(v_i, v_{i+1}) < T_\theta \quad \text{and} \quad \left\| p_i^{\mathrm{end}} - p_{i+1}^{\mathrm{start}} \right\| < T_d'$$

where $v_i$ is the direction vector of the i-th straight line segment, $\theta(v_i, v_{i+1})$ is the angle between the direction vectors of the two segments, $T_\theta$ is a threshold, $p_i^{\mathrm{end}}$ is the last point of the i-th candidate straight line segment, $p_{i+1}^{\mathrm{start}}$ is the first point of the (i+1)-th candidate straight line segment, $\|\cdot\|$ denotes the distance between the two points, and $T_d'$ is a threshold;
when the included angle of the direction vectors of two adjacent candidate straight-line segments is smaller than a threshold value and the distance between head and tail points is also smaller than the threshold value, combining the two candidate straight-line segments into one straight-line segment, and recalculating the fitting parameters of the combined straight-line segment;
step 1.5.2: repeating the operation of 1.5.1 until there are no more candidate straight-line segments that can be recombined; calculating the length of the combined straight-line segment, removing the straight-line segment with the length smaller than a preset threshold value, taking the remaining straight-line segment as the straight-line segment on the current two-dimensional scanning line which is finally extracted, and storing the straight-line segments in sequence;
step 2: extracting planes from the straight line segment set obtained in step 1;

step 3: for the first Sweep, extracting the main directions using the extracted plane information and thinning the point cloud;

step 4: constructing an initial structure map;

step 5: matching a single two-dimensional scanning line with the structure map and extracting the point pairs to be optimized;

step 6: optimizing the distance d' between corresponding points and calculating the pose estimation value of the two-dimensional scanning line;

step 7: cyclically iterating steps 5-6 until all two-dimensional scanning lines in a Sweep have been optimized, then adding the point cloud into the structure map and growing the structure map;

step 8: cyclically iterating steps 5-7 until all the point cloud data has been processed.
2. The method for indoor real-time positioning and three-dimensional map building based on single line laser as claimed in claim 1, wherein the step 2 comprises the following sub-steps:
step 2.1: converting the two-dimensional straight line segments on each two-dimensional scanning line into three-dimensional straight line segments according to the pose value of that two-dimensional scanning line;

judging whether the two-dimensional scanning line is statically acquired; if the two-dimensional scanning line is statically acquired, its pose is obtained directly from the motor rotation angle provided by the scanning device; if the two-dimensional scanning line is not statically acquired, the pose estimation value of the two-dimensional scanning line is obtained by the methods of steps 5-6;

when the conversion into three-dimensional straight line segments has been completed for all two-dimensional scanning lines in a Sweep, a set of three-dimensional straight line segments distributed over the whole scanned environment is obtained, recorded as

$$L = \{\, l_i^j \,\},$$

where $l_i^j$ represents the j-th straight line segment on the i-th two-dimensional scanning line;
step 2.2: choosing any straight line segment $l_i^j$ in L; on the two-dimensional scanning lines $S_{i+1}$ and $S_{i-1}$ adjacent to the two-dimensional scanning line $S_i$ associated with $l_i^j$, searching for straight line segments coplanar with $l_i^j$;

the plane equation is described as:

$$x \cdot N = d,$$

where x is a point on the plane, N is the normal vector of the plane, and d is a constant; least-squares plane fitting is performed on $l_i^j$ and the straight line segments on the adjacent two-dimensional scanning lines according to this plane equation, and if the fitting error is smaller than a preset value, an initial plane Pl is considered found; otherwise, another straight line segment is chosen and the operation of step 2.2 is repeated until an initial plane is found;
step 2.3: searching around Pl for straight line segments belonging to Pl; a straight line segment l around Pl is considered to belong to the plane Pl if it satisfies the condition

$$\left| x_i \cdot N - d \right| < T_p$$

for every point $x_i$ on the straight line segment l, where $T_p$ is a preset point-to-plane distance threshold; the straight line segment l is then added into the plane Pl, and the fitting parameters of the plane Pl are recalculated;
step 2.4: repeating the operation of step 2.3 and continuously expanding the plane Pl until no straight line segment around Pl can be merged into it, whereby a complete plane Pl is obtained;
step 2.5: for the remaining straight line segments in L, repeating the operations of step 2.2-step 2.4 until no new plane can be extracted.
3. The method for indoor real-time positioning and three-dimensional map building based on single line laser as claimed in claim 1, wherein the step 3 comprises the following sub-steps:
step 3.1: initializing three orthogonal main directions $D = \{d_1, d_2, d_3\} = \{(1,0,0), (0,1,0), (0,0,1)\}$, calculating the normal vector $n_p$ of each plane and normalizing it to unit length;

step 3.2: cross-multiplying the normal vector of each plane with the three main directions respectively, $n_i = n_p \times d_i$, and selecting the main direction for which $\|n_i\|$ is smallest as the main direction $d_p$ matched with that plane, where i = 1, 2, 3;
step 3.3: performing main direction optimization with Ceres using the matched normal vector pairs, thereby extracting the optimal main directions;

the main direction optimization with Ceres specifically comprises the following substeps:

step 3.3.1: taking the rotation r = (rotateX, rotateY, rotateZ) applied to $D = \{d_1, d_2, d_3\}$ as the optimization term of Ceres;

step 3.3.2: for each plane, calculating the cross product of the matched vector pair, $s = n_p \times d_p$, and taking s as the residual term of the Ceres optimization;

step 3.3.3: continuously and iteratively optimizing r; when the optimization error is smaller than a preset value, the rotation corresponding to the main directions is obtained; r at this time is converted into a 3 × 3 rotation matrix R, giving the optimized main directions

$$D' = R\,D;$$
Step 3.4: and uniformly thinning the structural map point cloud.
4. The method for indoor real-time positioning and three-dimensional map building based on single line laser as claimed in claim 3, wherein the step 4 comprises the following sub-steps:
step 4.1: for each plane, determining the main direction $d_p$ nearest to the plane and calculating the included angle between the main direction and the normal vector:

$$\theta = \arccos\!\left(\frac{n_p \cdot d_p}{\|n_p\|\,\|d_p\|}\right);$$

if |θ| is not larger than a preset threshold, carrying out plane direction optimization;
the performing of the plane direction optimization specifically includes the following substeps:
step 4.1.1: translating the point cloud C into a coordinate system with the center of gravity as origin, forming the point cloud $C_g$;

step 4.1.2: solving the rotation matrix R from the rotation axis $s = n_p \times d_p$ and the angle θ;

step 4.1.3: transforming the point cloud, $C'_g = R\,C_g$;

step 4.1.4: transforming the point cloud $C'_g$ back to the world coordinate system;
step 4.2: judging whether the structure map is initialized or not;
if the plane map is not initialized, directly adding the optimized plane into the structure map to finish the initialization of the structure map;
if the structure map is initialized, judging whether a plane similar to the optimized plane exists in the structure map;
if similar planes exist, merging is performed; the merging condition is that the plane normal vectors are equal and the plane distance d' is smaller than a preset threshold:

$$d' = \frac{\left| A x_p + B y_p + C z_p + D \right|}{\sqrt{A^2 + B^2 + C^2}},$$

where $(x_p, y_p, z_p)$ is the center of gravity of one plane and A, B, C, D are the plane equation coefficients of the other;
the merging process includes the following substeps:
step 4.2.1: calculating the merged center of gravity $P_g$:

$$P_g = \frac{\mathrm{size}_1\,P_{g1} + \mathrm{size}_2\,P_{g2}}{\mathrm{size}_1 + \mathrm{size}_2},$$

where $P_{g1}, P_{g2}$ are the centers of gravity of the two planes and $\mathrm{size}_1, \mathrm{size}_2$ represent the numbers of points of the two planes;
step 4.2.2: translating the point cloud to a gravity center point according to the normal direction of the plane;
if no similar plane exists, projecting points on the plane onto an absolute plane of a plane equation, and adding the plane into a structure map;
step 4.3: and (4) circulating the steps 4.1-4.2 until no plane is optimized, and forming the structure map.
5. The method for indoor real-time positioning and three-dimensional map building based on single line laser as claimed in claim 1, wherein the step 5 comprises the following sub-steps:
step 5.1: determining the initial pose of the current two-dimensional scanning line:

$$\hat{P}_i^s = P_{i-1}^s \left(P_{i-1}^m\right)^{-1} P_i^m,$$

where $P_{i-1}^s, P_i^s$ represent the poses of the (i-1)-th and i-th two-dimensional scanning line, and $P_{i-1}^m, P_i^m$ represent the poses of the motor corresponding to the (i-1)-th and i-th two-dimensional scanning line;
step 5.2: extracting matching points; and performing down-sampling on the straight line segment of the two-dimensional scanning line, and searching matching points of down-sampling points in the structural map to form a matching point pair.
6. The method for indoor real-time positioning and three-dimensional map building based on single line laser as claimed in claim 1, wherein the specific implementation process of step 6 is as follows:

$$d' = \sqrt{(x_l - x_m)^2 + (y_l - y_m)^2 + (z_l - z_m)^2},$$

where $(x_l, y_l, z_l)$ and $(x_m, y_m, z_m)$ are the three-dimensional coordinates of the l-th scanning point and of its matching point, respectively; the pose of each two-dimensional scanning line is taken as the optimization term, the distances between corresponding points are taken as residual terms, and the optimization is performed with Ceres; when the difference dP between two successive optimized poses is smaller than $T_{dP}$, the optimization ends, and the optimized pose of each two-dimensional scanning line is obtained.
7. The method for indoor real-time positioning and three-dimensional map building based on single line laser as claimed in claim 1, wherein the step 7 comprises the following sub-steps:
step 7.1: carrying out surface extraction on the new Sweep according to steps 1-2;
step 7.2: for each extracted plane $P_i$, searching the structure map for candidate faces; the candidate conditions are that the included angle θ' of the two faces is smaller than the angle threshold $T_\theta'$ and the distance between the two faces is smaller than the distance threshold $T_d''$:

$$\theta' = \arccos\!\left(n_{\mathrm{new}} \cdot n_{\mathrm{struct}}\right), \qquad d'' = \frac{\left| A x_p + B y_p + C z_p + D \right|}{\sqrt{A^2 + B^2 + C^2}},$$

where $(x_p, y_p, z_p)$ is the center of gravity of one face, A, B, C, D are the plane equation coefficients of the other, $n_{\mathrm{new}}$ represents the normal vector of the extracted plane, and $n_{\mathrm{struct}}$ represents the normal vector of the candidate face in the structure map;
if no candidate surface exists, projecting points on the candidate surface onto an absolute plane of a plane equation, and adding the plane into a structure map; if the candidate face exists, the step 7.3 is carried out;
step 7.3: determining the merge face $P_{\mathrm{merge}}$ by building a k-d tree over the candidate faces and searching it;
The method specifically comprises the following substeps:
step 7.3.1: establishing a k-d tree for the candidate surface point cloud, and establishing an index relation of a belonging surface for the candidate surface point cloud;
step 7.3.2: for each line segment composing $P_i$, taking its two end points and middle point as search points;

step 7.3.3: searching the k-d tree for the nearest points, recording the plane information of the nearest points, and finally taking the plane with the most nearest points as the merge face $P_{\mathrm{merge}}$;

step 7.4: optimizing the direction of $P_i$ to agree with that of $P_{\mathrm{merge}}$, merging the two coplanar faces, and growing the structure map.
CN201711092661.1A 2017-11-08 2017-11-08 Indoor real-time positioning and three-dimensional map construction method based on single line laser Active CN107917710B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711092661.1A CN107917710B (en) 2017-11-08 2017-11-08 Indoor real-time positioning and three-dimensional map construction method based on single line laser


Publications (2)

Publication Number Publication Date
CN107917710A CN107917710A (en) 2018-04-17
CN107917710B true CN107917710B (en) 2021-03-16

Family

ID=61895285

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711092661.1A Active CN107917710B (en) 2017-11-08 2017-11-08 Indoor real-time positioning and three-dimensional map construction method based on single line laser

Country Status (1)

Country Link
CN (1) CN107917710B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110895833A (en) * 2018-09-13 2020-03-20 北京京东尚科信息技术有限公司 Method and device for three-dimensional modeling of indoor scene
CN109655805B (en) * 2019-01-25 2021-12-10 南京理工大学 Laser radar positioning method based on scan line segment coincidence length estimation
CN110764096A (en) * 2019-09-24 2020-02-07 浙江华消科技有限公司 Three-dimensional map construction method for disaster area, robot and robot control method
CN111077495B (en) * 2019-12-10 2022-02-22 亿嘉和科技股份有限公司 Positioning recovery method based on three-dimensional laser
CN111308481B (en) * 2020-02-21 2021-10-15 深圳市银星智能科技股份有限公司 Laser positioning method and device and mobile robot
CN112418288B (en) * 2020-11-17 2023-02-03 武汉大学 GMS and motion detection-based dynamic vision SLAM method
CN112700464B (en) * 2021-01-15 2022-03-29 腾讯科技(深圳)有限公司 Map information processing method and device, electronic equipment and storage medium
CN113256722B (en) * 2021-06-21 2021-10-15 浙江华睿科技股份有限公司 Pose determination method, pose determination device and storage medium
CN114332232B (en) * 2022-03-11 2022-05-31 中国人民解放军国防科技大学 Smart phone indoor positioning method based on space point, line and surface feature hybrid modeling

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150020900A (en) * 2013-08-19 2015-02-27 부경대학교 산학협력단 System for location recognization and mapping using laser scanner, and method for location recognization using the same
CN104615880A (en) * 2015-01-31 2015-05-13 电子科技大学中山学院 Rapid ICP (inductively coupled plasma) method for point cloud matching of three-dimensional laser radar
CN105354883A (en) * 2015-11-25 2016-02-24 武汉大学 3ds Max fast and precise three-dimensional modeling method and system based on point cloud
CN106599108A (en) * 2016-11-30 2017-04-26 浙江大学 Method for constructing multi-mode environmental map in three-dimensional environment
CN107239076A (en) * 2017-06-28 2017-10-10 仲训昱 The AGV laser SLAM methods matched based on virtual scan with ranging


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Li Li, et al., "Laser-based SLAM with efficient occupancy likelihood map learning for dynamic indoor scenes", ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, vol. 3, no. 4, 31 Dec. 2016, pp. 119-126 *

Also Published As

Publication number Publication date
CN107917710A (en) 2018-04-17

Similar Documents

Publication Publication Date Title
CN107917710B (en) Indoor real-time positioning and three-dimensional map construction method based on single line laser
CN111968129B (en) Instant positioning and map construction system and method with semantic perception
CN109166149B (en) Positioning and three-dimensional line frame structure reconstruction method and system integrating binocular camera and IMU
US11074701B2 (en) Interior photographic documentation of architectural and industrial environments using 360 panoramic videos
CN106127739B (en) Monocular vision combined RGB-D SLAM method
CN110189399B (en) Indoor three-dimensional layout reconstruction method and system
CN110853075B (en) Visual tracking positioning method based on dense point cloud and synthetic view
CN109186606B (en) Robot composition and navigation method based on SLAM and image information
CN106767827B (en) Mobile robot point cloud map creation method based on laser data
CN108010081B (en) RGB-D visual odometer method based on Census transformation and local graph optimization
CN107590827A (en) A kind of indoor mobile robot vision SLAM methods based on Kinect
CN104040590A (en) Method for estimating pose of object
WO2012014545A1 (en) Three-dimensional object recognizing device and three-dimensional object recognizing method
CN113012122B (en) Category-level 6D pose and size estimation method and device
CN112070770A (en) High-precision three-dimensional map and two-dimensional grid map synchronous construction method
CN110570474B (en) Pose estimation method and system of depth camera
CN113192200B (en) Method for constructing urban real scene three-dimensional model based on space-three parallel computing algorithm
CN110749308B (en) SLAM-oriented outdoor positioning method using consumer-grade GPS and 2.5D building models
Agrawal et al. PCE-SLAM: A real-time simultaneous localization and mapping using LiDAR data
CN115471748A (en) Monocular vision SLAM method oriented to dynamic environment
CN116429116A (en) Robot positioning method and equipment
CN114792338A (en) Vision fusion positioning method based on prior three-dimensional laser radar point cloud map
Rothermel et al. Fast and robust generation of semantic urban terrain models from UAV video streams
CN116597080A (en) Complete scene 3D fine model construction system and method for multi-source spatial data
CN115031735A (en) Pose estimation method of monocular vision inertial odometer system based on structural features

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant