CN118014850A - SLAM precision enhancement method based on plane characteristics - Google Patents

SLAM precision enhancement method based on plane characteristics Download PDF

Info

Publication number
CN118014850A
CN118014850A CN202410418294.3A CN202410418294A CN118014850A CN 118014850 A CN118014850 A CN 118014850A CN 202410418294 A CN202410418294 A CN 202410418294A CN 118014850 A CN118014850 A CN 118014850A
Authority
CN
China
Prior art keywords
point cloud
plane
slam
clouds
error term
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202410418294.3A
Other languages
Chinese (zh)
Other versions
CN118014850B (en
Inventor
魏占营
刘洋
杨东辉
陈学霞
王坤
陈利东
姚琰燚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Vision Zhixing Technology Co ltd
Original Assignee
Beijing Vision Zhixing Technology Co ltd
Huajian Technology Shenzhen Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Vision Zhixing Technology Co ltd, Huajian Technology Shenzhen Co ltd filed Critical Beijing Vision Zhixing Technology Co ltd
Priority to CN202410418294.3A priority Critical patent/CN118014850B/en
Priority claimed from CN202410418294.3A external-priority patent/CN118014850B/en
Publication of CN118014850A publication Critical patent/CN118014850A/en
Application granted granted Critical
Publication of CN118014850B publication Critical patent/CN118014850B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention provides a SLAM precision enhancement method based on plane characteristics, which comprises the following steps: acquiring Ping Miandian clouds according to the point cloud achievements to be enhanced; the planar point cloud includes: parallel Ping Miandian clouds, perpendicular Ping Miandian clouds, and coplanar point clouds; based on the plane point cloud, obtaining a fragment point cloud through cloud clustering and segmentation; performing plane fitting on the fragment point cloud to obtain a plane equation and a corresponding normal; obtaining a plane error term according to the plane equation and the corresponding normal; obtaining a target SLAM track according to the plane error item; and performing joint optimization on the target SLAM track and an initial target SLAM track corresponding to the point cloud achievement to be enhanced so as to perform precision enhancement on the initial target SLAM track corresponding to the point cloud achievement to be enhanced. The invention solves the problems of low SLAM achievement precision and high acquisition cost in the prior art.

Description

SLAM precision enhancement method based on plane characteristics
Technical Field
The invention relates to the technical field of point cloud acquisition, in particular to a SLAM precision enhancement method based on plane characteristics.
Background
The laser SLAM system can collect high-precision point cloud of indoor environment, but the SLAM system needs a plurality of pre-conditions to collect high-precision point cloud results, such as small scenes, closed loops and the like, but the requirements cannot be met under complex scenes, so that the SLAM results cannot meet the high-precision requirements of surveying and mapping, and precision losses of different degrees occur. To avoid these problems, the conventional solution is to add a GNSS/RTK (if it is outside), and a control point solution is needed indoors, namely: and arranging control points (such as a sticker, a steel nail and the like) at positions where accuracy loss is easy to occur in an indoor scene, measuring and acquiring coordinates by using a surveying instrument (total station), contacting the control points by using a fixed position of equipment when the SLAM system works and passing the positions, recording the time, and accurately aligning the trajectories corresponding to SLAM, thereby realizing accuracy control as control points (also called anchor points).
The control point method can well control the precision, but the control points are required to be distributed uniformly and densely, for example, the indoor parking lot is generally required to be distributed according to the density of 20-30 meters, if the control points are fewer, the area far away from the control points cannot be restrained, and the precision still is lost. Therefore, the cost for arranging the control points is high, the construction and measurement difficulty is high, the measurement of the transfer station is very troublesome especially in places with poor vision conditions, and the workload for arranging the control points exceeds half of the whole acquisition work according to experience. In addition, in some situations, control points, such as galleries or places where manpower cannot reach, cannot be laid out due to condition restrictions.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention aims to provide a SLAM precision enhancement method based on plane characteristics, and solves the problems of low SLAM achievement precision and high acquisition cost in the prior art.
In order to achieve the above object, the present invention provides the following solutions:
a planar feature-based SLAM accuracy enhancement method, comprising:
acquiring Ping Miandian clouds according to the point cloud achievements to be enhanced; the planar point cloud includes: parallel Ping Miandian clouds, perpendicular Ping Miandian clouds, and coplanar point clouds;
Based on the plane point cloud, obtaining a fragment point cloud through cloud clustering and segmentation;
performing plane fitting on the fragment point cloud to obtain a plane equation and a corresponding normal;
Obtaining a plane error term according to the plane equation and the corresponding normal;
Obtaining a target SLAM track according to the plane error item;
And performing joint optimization on the target SLAM track and an initial target SLAM track corresponding to the point cloud achievement to be enhanced so as to perform precision enhancement on the initial target SLAM track corresponding to the point cloud achievement to be enhanced.
Preferably, the plane error term includes:
A parallel error term, a vertical error term, and a coplanar error term.
Preferably, the parallel error term is:
=/>
Wherein, Representing the error between any two parallel planes in all parallel (x total) planes.
Preferably, the vertical error term is:
=/>
Wherein, Is a vertical error term,/>Representing the error between any two vertical planes in all vertical (total x) planes.
According to the specific embodiment provided by the invention, the invention discloses the following technical effects:
The invention provides a SLAM precision enhancement method based on plane characteristics, which comprises the following steps: acquiring Ping Miandian clouds according to the point cloud achievements to be enhanced; the planar point cloud includes: parallel Ping Miandian clouds, perpendicular Ping Miandian clouds, and coplanar point clouds; based on the plane point cloud, obtaining a fragment point cloud through cloud clustering and segmentation; performing plane fitting on the fragment point cloud to obtain a plane equation and a corresponding normal; obtaining a plane error term according to the plane equation and the corresponding normal; obtaining a target SLAM track according to the plane error item; and performing joint optimization on the target SLAM track and an initial target SLAM track corresponding to the point cloud achievement to be enhanced so as to perform precision enhancement on the initial target SLAM track corresponding to the point cloud achievement to be enhanced. According to the invention, the plane error term is obtained based on the plane characteristics, and the SLAM track is jointly optimized from the plane, the vertical and the coplanar angles 3, so that the accuracy of the SLAM track is improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions of the prior art, the drawings that are needed in the embodiments will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a SLAM precision enhancement method based on planar features provided by an embodiment of the present invention;
FIG. 2 is a schematic view of a feature plane provided by an embodiment of the present invention;
FIG. 3 is a schematic view of a parallel plane provided by an embodiment of the present invention;
FIG. 4 is a schematic vertical plan view of an embodiment of the present invention;
fig. 5 is a schematic plan view of a coplanar structure according to an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The invention aims to provide a SLAM precision enhancement method based on plane characteristics, which solves the problems of low SLAM achievement precision and over-high acquisition cost in the prior art
In order that the above-recited objects, features and advantages of the present invention will become more readily apparent, a more particular description of the invention will be rendered by reference to the appended drawings and appended detailed description.
As shown in fig. 1, the present invention provides a SLAM accuracy enhancement method based on planar features, including:
Step 100: acquiring Ping Miandian clouds according to the point cloud achievements to be enhanced; the planar point cloud includes: parallel Ping Miandian clouds, perpendicular Ping Miandian clouds, and coplanar point clouds;
Specifically, a point cloud selects a rule. Representative plane point clouds such as column sides, walls and floors are selected from SLAM point cloud achievements, all the selected point clouds are guaranteed to be stored in groups on planes which are parallel (or vertical or coplanar) with each other according to coefficients of plane equations, planes of each group of point clouds must be in a parallel (or vertical or coplanar) relation, and each group of point clouds has corresponding ID numbers to show differences. For example, of the 8 facades in fig. 2, ①③⑤⑦ is one set and ②④⑥⑧ is another set. The point cloud can be selected according to actual conditions, not every face (different from sample labeling based on a deep learning technology) needs not to be selected, and every face also does not need to be selected, only a part of representative point cloud needs to be selected, but the number of points of the selected point cloud is ensured not to be too small, otherwise, the point cloud cannot be fitted into an accurate plane, and the constraint relation is too small.
Step 200: based on the plane point cloud, obtaining a fragment point cloud through cloud clustering and segmentation;
specifically, based on the selected point cloud, a plurality of 'slice' point clouds are formed through point cloud clustering and segmentation. Multiple pieces may be selected on a larger surface, such as surface ① being large in fig. 3.
Step 300: performing plane fitting on the fragment point cloud to obtain a plane equation and a corresponding normal;
specifically, plane fitting is performed on the fragment point cloud, and a plane equation is generated. The sliced point cloud comprises a certain number of laser frames, the point cloud in the slice belongs to the same plane, and plane fitting is carried out by utilizing Ransac algorithm to generate a plane equation and a normal. The manually selected point cloud is inherently less noisy, so that the desired plane equation can be obtained using Ransac.
Step 400: obtaining a plane error term according to the plane equation and the corresponding normal;
Step 500: obtaining a target SLAM track according to the plane error item;
specifically, an error term is constructed according to the fitted normal of each point cloud and the track range corresponding to the point cloud in the chip, and a new high-precision SLAM track is calculated.
Step 600: and performing joint optimization on the target SLAM track and an initial target SLAM track corresponding to the point cloud achievement to be enhanced so as to perform precision enhancement on the initial target SLAM track corresponding to the point cloud achievement to be enhanced.
Specifically, all high-precision SLAM observation values are integrated, and combined optimization is carried out with the initial SLAM track, so that SLAM precision enhancement is realized.
Specifically, based on priori knowledge in the scanning scene, such as spatial relationships of coplanarity, parallelism, verticality and the like among walls, vertical faces and the ground, the spatial relationships are respectively: the SLAM precision enhancement based on the parallel plane, the SLAM precision enhancement based on the vertical plane and the SLAM precision enhancement based on the coplanar feature can be restrained at the same time by three common features, so that the precision enhancement based on the plane feature is realized.
Further, the plane error term includes:
A parallel error term, a vertical error term, and a coplanar error term.
Specifically, SLAM accuracy enhancement based on parallel features: one of the manifestations of SLAM accuracy loss is: in a scanning scenario, two (or more) walls, facades, floors that should be parallel to each other appear non-parallel in the resulting point cloud, as in fig. 2, ①③⑤⑦ is a plane parallel to each other and ②④⑥⑧ is another set of planes parallel to each other.
Let the equation for two planes be:
(1);
(2);
Is provided with ,/>Normalized normal to two planes
If the two planes are parallel, it can be seen that:
(3);
wherein A, B, C is the plane equation coefficient, i.e., the plane normal, and different subscripts represent different planes.
Specifically, as shown in fig. 3, two surfaces (consisting of laser point clouds) in a scanned scene are shown;
The six-degree-of-freedom (x, y, z, h, p, r) pose corresponding to the track of the SLAM system at the time t is set, and the pose is expressed as a matrix The coordinate system of the laser point cloud at the moment in the base itself is denoted/>(X, y, z), then the coordinates of the laser point cloud in the world coordinate system can be expressed as:
(x,y,z)=/>(x,y,z)·/>
wherein, local (denoted by a subscript l) refers to a local coordinate system of the laser itself; global (denoted by the subscript g) refers to the coordinate system of the resulting point cloud generated by the SLAM system, the global coordinate system
At any point in the same planeThe method comprises the following steps:
=/>=0 (4);
Wherein the method comprises the steps of And/>Is the normalized normal of the object plane at local coordinates and world coordinates.
The point on the same plane, whether in the world coordinate system (global) or the laser coordinate system (local), satisfies the formula (4), i.e., the dot product of the normal line and the coordinate point is zero.
And because the following conditions are satisfied:=/>(5);
i.e. point cloud under laser coordinate system Via SLAM track/>Transformation can obtain the point cloud/>, under the global coordinate systemAnd (4) in combination, then:
=/>(6);
the further derivation is as follows:
=/>(7);
If there is a plane A and a plane B parallel, the construction error term is:
=/> (8);
All selected parallel planes are assembled in this way, i.e. the parallel error term is obtained:
=/> (9);
Wherein, Is a parallel error term
Based on optimization theory to minimize error term, the selected plane in the scene can be obtainedSix-degree-of-freedom trajectories corresponding to SLAMs at different times: /(I)=/>Wherein/>The spatial location information is represented by a representation of the spatial location information,Representing three axis angle information, heading, pitch, roll. The track is used as high-precision observation and is combined and optimized with the track fusion before SLAM, so that SLAM precision enhancement can be realized.
Specifically, SLAM accuracy enhancement based on parallel features:
The vertical plane-based SLAM precision enhancement is that a plurality of planes, which should be vertical in space, are not vertical in a SLAM point cloud, two vertical planes in upper and lower layers, or two vertical planes in the same layer but far away or after a relatively complex path are not vertical in a point cloud, and only two vertical planes are selected for convenience of discussion, as shown in fig. 4.
The two perpendicular planes can be expressed as:
(10);
Can also be expressed as:
(11);
The vertical planes are as shown in fig. 2 for ① and ②、③ and ④、⑤ and ⑥, etc., whereby:
=/>(12);
The vertical error term is:
=/>(13);
Specifically, precision enhancement based on coplanar features:
Coplanarity is a special case of parallelism, and as shown in fig. 5, two coplanarity planes can be expressed as:
(14);
the error terms of three SLAM precision enhancements, parallel, perpendicular, and coplanar, can be collectively expressed as:
=/>+/>
a plane is thus generally denoted by N.
Specifically, the invention also provides the following embodiments:
Five-layer ring corridor for a building is 300 meters long and about 2.0 meters wide, and the ring corridor is shaped like a multi-fold line. In order to test the SLAM precision enhancement effect based on the characteristic plane, acquisition is intentionally interrupted at the position of the head-tail closed loop by 10 meters, because the common view area is smaller, the head-tail deviation occurs after calculation, the error of the head-tail plane and the elevation is about 1-2 meters after measurement, and the whole point cloud has certain distortion.
And 7 point clouds are selected together to form two vertical planes, three horizontal planes and two common planes (ground), and accordingly, a mixed characteristic plane is arranged. Wherein ①②③ and ④⑤ are vertical planes (parallel planes within a group, vertical planes between groups), and ⑥⑦ are coplanar.
The invention realizes the precision correction of the plane and the elevation, and the error of the plane and the elevation at the joint of the head and the tail is superior to 5cm through actual measurement; the whole corridor is straight, the distance between the north and the south is measured, and the distance between the north and the east is measured, and the internal error is within 10cm, so that the effect consistent with priori knowledge is achieved.
The beneficial effects of the invention are as follows:
The invention provides a SLAM precision enhancement method based on plane characteristics, which comprises the following steps: acquiring Ping Miandian clouds according to the point cloud achievements to be enhanced; the planar point cloud includes: parallel Ping Miandian clouds, perpendicular Ping Miandian clouds, and coplanar point clouds; based on the plane point cloud, obtaining a fragment point cloud through cloud clustering and segmentation; performing plane fitting on the fragment point cloud to obtain a plane equation and a corresponding normal; obtaining a plane error term according to the plane equation and the corresponding normal; obtaining a target SLAM track according to the plane error item; and performing joint optimization on the target SLAM track and an initial target SLAM track corresponding to the point cloud achievement to be enhanced so as to perform precision enhancement on the initial target SLAM track corresponding to the point cloud achievement to be enhanced. According to the invention, the plane error term is obtained based on the plane characteristics, and the SLAM track is jointly optimized from the plane, the vertical and the coplanar angles 3, so that the accuracy of the SLAM track is improved.
In the present specification, each embodiment is described in a progressive manner, and each embodiment is mainly described in a different point from other embodiments, and identical and similar parts between the embodiments are all enough to refer to each other.
The principles and embodiments of the present invention have been described herein with reference to specific examples, the description of which is intended only to assist in understanding the methods of the present invention and the core ideas thereof; also, it is within the scope of the present invention to be modified by those of ordinary skill in the art in light of the present teachings. In view of the foregoing, this description should not be construed as limiting the invention.

Claims (4)

1. A planar feature-based SLAM accuracy enhancement method, comprising:
acquiring Ping Miandian clouds according to the point cloud achievements to be enhanced; the planar point cloud includes: parallel Ping Miandian clouds, perpendicular Ping Miandian clouds, and coplanar point clouds;
Based on the plane point cloud, obtaining a fragment point cloud through cloud clustering and segmentation;
performing plane fitting on the fragment point cloud to obtain a plane equation and a corresponding normal;
Obtaining a plane error term according to the plane equation and the corresponding normal;
Obtaining a target SLAM track according to the plane error item;
And performing joint optimization on the target SLAM track and an initial target SLAM track corresponding to the point cloud achievement to be enhanced so as to perform precision enhancement on the initial target SLAM track corresponding to the point cloud achievement to be enhanced.
2. The SLAM accuracy enhancement method based on planar features of claim 1, wherein the planar error term comprises:
A parallel error term, a vertical error term, and a coplanar error term.
3. The SLAM accuracy enhancement method based on planar features of claim 2, wherein the parallel error term is:
=/>
Wherein, Representing the error between any two parallel planes in all parallel planes.
4. The SLAM accuracy enhancement method based on planar features of claim 2, wherein the vertical error term is:
=/>
Wherein, Is a vertical error term,/>Representing the error between any two vertical planes in all vertical planes.
CN202410418294.3A 2024-04-09 SLAM precision enhancement method based on plane characteristics Active CN118014850B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410418294.3A CN118014850B (en) 2024-04-09 SLAM precision enhancement method based on plane characteristics

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410418294.3A CN118014850B (en) 2024-04-09 SLAM precision enhancement method based on plane characteristics

Publications (2)

Publication Number Publication Date
CN118014850A true CN118014850A (en) 2024-05-10
CN118014850B CN118014850B (en) 2024-06-28

Family

ID=

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110111421A (en) * 2019-05-10 2019-08-09 武汉海达数云技术有限公司 A kind of method and device of mobile mapping point cloud
US11481970B1 (en) * 2021-05-28 2022-10-25 End To End, Inc. Modeling indoor scenes using measurements captured using mobile devices
CN115409954A (en) * 2021-05-26 2022-11-29 西华大学 Dense point cloud map construction method based on ORB feature points
CN116429084A (en) * 2023-03-08 2023-07-14 北京交通大学 Dynamic environment 3D synchronous positioning and mapping method
CN116973893A (en) * 2023-07-19 2023-10-31 广州高新兴机器人有限公司 Three-dimensional laser radar calibration method and device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110111421A (en) * 2019-05-10 2019-08-09 武汉海达数云技术有限公司 A kind of method and device of mobile mapping point cloud
CN115409954A (en) * 2021-05-26 2022-11-29 西华大学 Dense point cloud map construction method based on ORB feature points
US11481970B1 (en) * 2021-05-28 2022-10-25 End To End, Inc. Modeling indoor scenes using measurements captured using mobile devices
CN116429084A (en) * 2023-03-08 2023-07-14 北京交通大学 Dynamic environment 3D synchronous positioning and mapping method
CN116973893A (en) * 2023-07-19 2023-10-31 广州高新兴机器人有限公司 Three-dimensional laser radar calibration method and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
范都耀: "基于深度信息的三维重建关键技术研究与实现", 《中国优秀硕士学位论文全文数据库 信息科技辑》, 15 February 2024 (2024-02-15), pages 138 - 1236 *

Similar Documents

Publication Publication Date Title
CN112525164B (en) Method for detecting deformation of super high-rise building based on unmanned aerial vehicle oblique photography technology
CN106846308B (en) The detection method and device of topographic map precision based on cloud
CN105203023B (en) A kind of one-stop scaling method of vehicle-mounted three-dimensional laser scanning system placement parameter
CN103217688B (en) Airborne laser radar point cloud adjustment computing method based on triangular irregular network
CN108413988B (en) Method for quickly calibrating coordinate system of theodolite at tail end of robot
CN101413785B (en) Error compensation method of positioning system based on double-rotating laser plane transmitter network
CN109916406B (en) Surrounding target positioning method based on unmanned aerial vehicle cluster
CN113401308B (en) Ship large-line type subsection total assembly precision control method
CN112884647A (en) Embedded part construction positioning method based on BIM point cloud technology guidance
WO2021098808A1 (en) Method and system for determining laser tracker station, electronic device, and medium
CN112924975B (en) Adaptive observation method and system for AOI (automatic optical inspection) applicable to networking weather radar
CN103400416A (en) City environment robot navigation method based on multi-layer probabilistic terrain
CN109297426A (en) A kind of large-scale precision industrial equipment deflection and servo angle detecting method
CN109959898A (en) A kind of seat bottom type underwater sound Passive Positioning basic matrix method for self-calibrating
CN109856616A (en) A kind of radar fix relative systematic error modification method
CN110779512B (en) Flight test route planning method for accuracy identification of measurement and control equipment
CN110856104B (en) Ultra-wideband indoor positioning method combining least square positioning and trilateral positioning
CN118014850B (en) SLAM precision enhancement method based on plane characteristics
CN118014850A (en) SLAM precision enhancement method based on plane characteristics
CN106546229B (en) A kind of surveying and locating method convenient for floor manager
CN109283539A (en) A kind of localization method suitable for high-rise non-flat configuration
CN106646413A (en) Radar networking vertical line crossing integration positioning method and error calculating method thereof
CN110487181A (en) A kind of 3 D laser scanning method suitable for marine oil and gas platform
CN115793002A (en) Double-satellite combined passive positioning method based on direction finding error weight
CN115371599A (en) High-precision ground flatness measuring system and method

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20240606

Address after: Room 301-303, 3rd Floor, Building 19, 11th District, No. 188 South Fourth Ring West Road, Fengtai District, Beijing, 100000

Applicant after: Beijing Vision Zhixing Technology Co.,Ltd.

Country or region after: China

Address before: 518000 Room 301, building 2, Shenzhen new generation industrial park, No. 136, Zhongkang Road, Meidu community, Meilin street, Futian District, Shenzhen, Guangdong Province

Applicant before: Huajian Technology (Shenzhen) Co.,Ltd.

Country or region before: China

Applicant before: Beijing Vision Zhixing Technology Co.,Ltd.

GR01 Patent grant