CN114494391A - SLAM map precision confirmation method and system based on EVO - Google Patents
- Publication number
- CN114494391A (application CN202111537125.4A)
- Authority
- CN
- China
- Prior art keywords
- track
- evo
- error
- map
- real
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/579—Depth or shape recovery from multiple images from motion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/25—Integrating or interfacing systems involving database management systems
- G06F16/258—Data format conversion from or to a database
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/29—Geographical information databases
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30241—Trajectory
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
Abstract
The embodiment of the invention provides an EVO-based SLAM map accuracy confirmation method and system, wherein the accuracy and effect of SLAM mapping are evaluated through the trajectory evaluation tool EVO. EVO supports the trajectory formats of multiple data sets (TUM, KITTI, EuRoC MAV and ROS bag) and also supports conversion between these data formats. The core function of EVO is to plot the trajectory of the camera and evaluate the error between the estimated trajectory and the ground truth. When the performance of a single sensor or the effect of a multi-sensor fusion algorithm is poor, the SLAM mapping effect can be accurately evaluated through the absolute pose error (APE) and the relative pose error (RPE), so that the sensor or the multi-sensor fusion algorithm can be improved, the accuracy of the map data is ensured, and the comfort of the driver and the safety of the vehicle are greatly improved.
Description
Technical Field
The embodiment of the invention relates to the technical field of advanced driver assistance, and in particular to an EVO-based SLAM map accuracy confirmation method and system.
Background
Advanced Driver Assistance Systems (ADAS) aim to help drivers avoid collisions and thereby substantially reduce road traffic accidents and the related injuries and deaths. These systems react faster than humans, remain alert at all times, and have been widely deployed across the automotive industry, from premium to commercial vehicle models.
An ADAS continuously monitors the vehicle's surroundings, alerts the driver to dangerous road conditions, and takes remedial action such as slowing down or stopping. These systems use inputs from a number of sensors such as cameras, radar, and inertial navigation units. The inputs are fused and processed, and the resulting information is passed to the actuators and other parts of the system. The same sensor technology serves both current ADAS and the upcoming fully autonomous driving systems.
Simultaneous Localization and Mapping (SLAM), also called CML (Concurrent Mapping and Localization), was first applied in the field of robotics, where the problem can be described as follows: a robot starts to move from an unknown position in an unknown environment, localizes itself during motion according to its position estimates and the map, and simultaneously builds an incremental map on the basis of its own localization, thereby achieving autonomous localization and navigation. SLAM is now also widely applied in the ADAS field: for example, data sent by sensors such as cameras and inertial navigation units are received, a trajectory map is drawn, and the vehicle is controlled by a control module to complete trajectory-following tasks.
In the process of actually using SLAM mapping, differences in the performance of a single sensor or of a multi-sensor fusion algorithm cause the accuracy of the collected map data to fluctuate greatly, affecting the SLAM mapping effect. If the mapping effect is not ideal, the autonomous vehicle may deviate from its intended track, and serious safety accidents may result.
Disclosure of Invention
The embodiment of the invention provides an EVO-based SLAM map accuracy confirmation method and system, aiming to solve the problem in the prior art that differences in the performance of a single sensor or a multi-sensor fusion algorithm cause the accuracy of acquired map data to fluctuate greatly, affecting the SLAM mapping effect.
In order to solve the above technical problem, in a first aspect, an embodiment of the present invention provides an EVO-based SLAM map accuracy confirmation method, the method comprising:
obtaining map data in an instant positioning and map building SLAM map, and analyzing and processing the map data based on a track evaluation tool EVO to draw a real track of a camera, wherein the real track comprises real poses of the camera at a plurality of track points;
and performing track evaluation based on the real track and the SLAM map to determine the error of each track point, and determining whether the precision of the SLAM map meets the requirement based on the error.
Preferably, the map data comprises one or more of a TUM data set, a KITTI data set, a EuRoC MAV data set, and a ROS bag data set.
Preferably, the analyzing and processing of the map data based on the trajectory evaluation tool EVO specifically includes:
plotting an estimated trajectory of the camera based on the trajectory evaluation tool EVO and the map data, the estimated trajectory comprising estimated poses of the camera at a plurality of track points; and calculating errors between the estimated poses and the real poses, and determining a transformation matrix between the estimated poses and the real poses.
Preferably, the track evaluation is performed based on the real track and the SLAM map to determine the error of each track point, and specifically includes:
determining relative and absolute pose errors of the estimated trajectory and the true trajectory.
Preferably, the determining the relative pose error of the estimated trajectory and the real trajectory specifically includes:
obtaining the estimated pose and the real pose corresponding to two frames of images separated by a set time interval Δ, and determining the relative pose error RPE at frame i:

$$E_i = \left(Q_i^{-1} Q_{i+\Delta}\right)^{-1}\left(P_i^{-1} P_{i+\Delta}\right)$$

where $P_1,\dots,P_i,\dots,P_n$ is the estimated trajectory and $Q_1,\dots,Q_i,\dots,Q_n$ is the real trajectory, $P_i$ being the estimated pose and $Q_i$ the real pose at time i;
determining an overall relative pose error value based on the root mean square error RMSE:

$$RMSE(E_{1:n},\Delta)=\left(\frac{1}{m}\sum_{i=1}^{m}\left\|\operatorname{trans}(E_i)\right\|^{2}\right)^{1/2}$$

where m = n − Δ and trans(E_i) denotes the translational component of the relative pose error; and computing the average of RMSE(E_{1:n}, Δ) over all values of Δ:

$$RMSE(E_{1:n})=\frac{1}{n}\sum_{\Delta=1}^{n}RMSE(E_{1:n},\Delta)$$
preferably, determining the absolute pose error of the estimated trajectory and the true trajectory specifically includes:
performing coordinate alignment of the estimated trajectory and the real trajectory, determining the transformation matrix S from the estimated trajectory to the real trajectory based on a least square method, and determining the absolute pose error APE at time i:

$$F_i = Q_i^{-1} S P_i$$

and determining an overall absolute pose error value based on the root mean square error RMSE:

$$RMSE(F_{1:n})=\left(\frac{1}{n}\sum_{i=1}^{n}\left\|\operatorname{trans}(F_i)\right\|^{2}\right)^{1/2}$$
in a second aspect, an embodiment of the present invention provides an EVO-based SLAM map accuracy confirmation system, including:
the track drawing module is used for acquiring map data in the instant positioning and map building (SLAM) map, and analyzing and processing the map data based on the trajectory evaluation tool EVO to draw a real track of the camera, the real track comprising real poses of the camera at a plurality of track points;
and the track evaluation module is used for carrying out track evaluation on the basis of the real track and the SLAM map so as to determine the error of each track point, and determining whether the precision of the SLAM map meets the requirement or not on the basis of the error.
In a third aspect, an embodiment of the present invention provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the method for determining accuracy of an EVO-based SLAM map according to the embodiment of the first aspect of the present invention when executing the program.
In a fourth aspect, an embodiment of the present invention provides a non-transitory computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the steps of the method for determining accuracy of an EVO-based SLAM map according to an embodiment of the first aspect of the present invention.
According to the EVO-based SLAM map accuracy confirmation method and system provided by the embodiment of the invention, the accuracy and effect of SLAM mapping are evaluated through the trajectory evaluation tool EVO. EVO supports the trajectory formats of multiple data sets (TUM, KITTI, EuRoC MAV and ROS bag) and also supports conversion between these data formats. The core function of EVO is to plot the trajectory of the camera and evaluate the error between the estimated trajectory and the ground truth. When the performance of a single sensor or the effect of a multi-sensor fusion algorithm is poor, the SLAM mapping effect can be accurately evaluated through the absolute pose error (APE) and the relative pose error (RPE), so that the sensor or the multi-sensor fusion algorithm can be improved, the accuracy of the map data is ensured, and the comfort of the driver and the safety of the vehicle are greatly improved. In practical application, whether the SLAM mapping effect meets the requirements of subsequent development can be judged from the absolute error values.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and those skilled in the art can also obtain other drawings according to the drawings without creative efforts.
Fig. 1 is a flow chart of an EVO-based SLAM map accuracy validation method according to an embodiment of the invention;
FIG. 2 is an EVO plotting trace graph according to an embodiment of the present invention;
FIG. 3 is a diagram illustrating changes in xyz coordinates in a trajectory according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating the variation of roll, pitch, yaw angles in a track according to an embodiment of the present invention;
FIG. 5 is an EVO trajectory evaluation graph according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the process of actually using SLAM mapping, differences in the performance of a single sensor or of a multi-sensor fusion algorithm cause the accuracy of the collected map data to fluctuate greatly, affecting the SLAM mapping effect. If the mapping effect is not ideal, the autonomous vehicle may deviate from its intended track, and serious safety accidents may result.
Therefore, the embodiment of the invention provides an EVO-based SLAM map accuracy confirmation method and system. When the performance of a single sensor or the effect of a multi-sensor fusion algorithm is poor, the SLAM mapping effect can be accurately evaluated through the absolute pose error (APE) and the relative pose error (RPE), so that the sensor or the multi-sensor fusion algorithm can be improved, the accuracy of the map data is ensured, and the comfort of the driver and the safety of the vehicle are greatly improved. The invention is described below with reference to various embodiments.
Fig. 1 shows an EVO-based SLAM map accuracy confirmation method provided by an embodiment of the present invention, comprising:
obtaining map data in an instant positioning and map building SLAM map, and analyzing and processing the map data based on a track evaluation tool EVO to draw a real track of a camera, wherein the real track comprises real poses of the camera at a plurality of track points;
in the SLAM mapping process, the map data accuracy is often poor due to the performance of a single sensor or the deficiency of a multi-sensor fusion algorithm, so that the effect of subsequent vehicle tracking or other automatic driving functions is affected, and more serious safety accidents may be caused.
The embodiment of the invention evaluates the accuracy and effect of SLAM mapping with the trajectory evaluation tool EVO. EVO supports the trajectory formats of multiple data sets (TUM, KITTI, EuRoC MAV and ROS bag) and also supports conversion between these data formats. Its core function is to plot the trajectory of the camera and evaluate the error between the estimated trajectory and the ground truth. The trajectory accuracy metrics used in this method are APE (Absolute Pose Error) and RPE (Relative Pose Error).
SLAM map data is collected, and the data format can be tum/kitti/bag/euroc and the like.
Installing EVO:
the system is ubuntu and ensures that the system is installed Python.
The installation method comprises the following steps:
a) Install directly from pip: pip install evo --upgrade --no-binary evo
b) Install from source:
git clone git@github.com:MichaelGrupp/evo.git
cd evo
pip install --editable . --upgrade --no-binary evo
the following command is entered and if successful, a response is made.
evo_ape -h
And performing track evaluation based on the real track and the SLAM map to determine the error of each track point, and determining whether the precision of the SLAM map meets the requirement based on the error.
The method specifically comprises the following steps:
plotting an estimated trajectory of the camera based on the trajectory evaluation tool EVO and the map data, the estimated trajectory comprising estimated poses of the camera at a plurality of track points; and calculating errors between the estimated poses and the real poses, and determining a transformation matrix between the estimated poses and the real poses.
Performing track evaluation based on the real track and the SLAM map to determine errors of track points, specifically comprising:
determining relative and absolute pose errors of the estimated trajectory and the true trajectory. Whether the SLAM map precision meets the requirements or not can be determined by comparing the relative pose error and the absolute pose error with preset error thresholds.
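The comparison against preset error thresholds described above can be sketched as a simple acceptance check. The threshold values below are illustrative assumptions only; the embodiment does not fix concrete limits:

```python
# Hypothetical threshold values (metres); the embodiment leaves the concrete
# limits to the requirements of subsequent development.
APE_RMSE_MAX = 0.10  # overall absolute pose error limit
RPE_RMSE_MAX = 0.05  # overall relative pose error limit

def map_accuracy_ok(ape_rmse: float, rpe_rmse: float,
                    ape_max: float = APE_RMSE_MAX,
                    rpe_max: float = RPE_RMSE_MAX) -> bool:
    """Accept the SLAM map only if both overall error values stay within their thresholds."""
    return ape_rmse <= ape_max and rpe_rmse <= rpe_max
```

In practice the two RMSE values would come from the EVO evaluation results described below, and the thresholds from the accuracy requirements of the downstream ADAS functions.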
The relative pose error compares motions (pose increments). It gives local accuracy, such as the translational or rotational drift of the SLAM system per meter. The relative pose error mainly describes the accuracy of the pose difference between two frames separated by a fixed time interval Δ, compared with the real pose difference, which is equivalent to directly measuring the odometry error. Determining the relative pose error of the estimated trajectory and the real trajectory specifically comprises:
acquiring the estimated pose and the real pose corresponding to two frames separated by a set time interval Δ (or a set number of frames), and determining the relative pose error RPE at frame i:

$$E_i = \left(Q_i^{-1} Q_{i+\Delta}\right)^{-1}\left(P_i^{-1} P_{i+\Delta}\right)$$

where $P_1,\dots,P_i,\dots,P_n$ is the estimated trajectory and $Q_1,\dots,Q_i,\dots,Q_n$ is the real trajectory, $P_i$ being the estimated pose and $Q_i$ the real pose at time i. In this embodiment, it is assumed that the estimated poses and the real poses are aligned frame by frame in time and that the two trajectories have the same total number of frames.
Determining an overall relative pose error value based on the root mean square error RMSE:

$$RMSE(E_{1:n},\Delta)=\left(\frac{1}{m}\sum_{i=1}^{m}\left\|\operatorname{trans}(E_i)\right\|^{2}\right)^{1/2}$$

where m = n − Δ. The RPE contains two components, a rotation error and a translation error; trans(E_i) denotes the translational component of the relative pose error. Given the total frame count n and the interval Δ, m = n − Δ relative pose errors can be obtained, and the root mean square error RMSE of these errors yields an overall value. Evaluating the translation error is usually sufficient, but the rotation error can be computed by the same method if needed. The performance of the algorithm can then be judged by the magnitude of the RMSE value. In practice, however, many choices of Δ are possible; to measure performance comprehensively, RMSE(E_{1:n}, Δ) is averaged over all values of Δ:

$$RMSE(E_{1:n})=\frac{1}{n}\sum_{\Delta=1}^{n}RMSE(E_{1:n},\Delta)$$
Sometimes, to reduce computational complexity, a fixed number of RPE samples is computed and their statistic is used as the final result.
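As a rough illustration of the RPE computation described above, the following NumPy sketch computes the translational RPE RMSE for one fixed Δ, assuming (as in this embodiment) 4×4 homogeneous pose matrices that are time-aligned and of equal count. The function and variable names are our own, not EVO's:

```python
import numpy as np

def rpe_rmse(P, Q, delta=1):
    """Translational RPE RMSE for pose lists P (estimated) and Q (ground truth).

    Each pose is a 4x4 homogeneous transform; the two trajectories are assumed
    time-aligned with the same total number of frames.
    """
    errors = []
    for i in range(len(P) - delta):
        dQ = np.linalg.inv(Q[i]) @ Q[i + delta]   # real relative motion over delta
        dP = np.linalg.inv(P[i]) @ P[i + delta]   # estimated relative motion
        E = np.linalg.inv(dQ) @ dP                # relative pose error E_i
        errors.append(np.linalg.norm(E[:3, 3]))   # translational part trans(E_i)
    return float(np.sqrt(np.mean(np.square(errors))))
```

Averaging this value over all admissible Δ, as in the formula above, gives the comprehensive overall value.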
On the basis of the above embodiment, determining the absolute pose error of the estimated trajectory and the true trajectory specifically includes:
the absolute track error is a direct difference value between an estimated pose and a real pose, and can reflect the algorithm precision and the track global consistency very intuitively. Usually, the estimated pose and the real track are not in the same coordinate system, so the estimated pose and the real track need to be aligned firstly, that is, a transformation matrix S from the estimated pose to the real pose is calculated by a least square method, and an absolute pose error APE at a time i is determined:
Determining an overall absolute pose error value based on the root mean square error RMSE:

$$RMSE(F_{1:n})=\left(\frac{1}{n}\sum_{i=1}^{n}\left\|\operatorname{trans}(F_i)\right\|^{2}\right)^{1/2}$$
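The APE computation can be sketched similarly (again assuming 4×4 homogeneous poses; the alignment transform S is taken as given, defaulting to identity when the trajectories are already in the same coordinate system). Names are illustrative, not EVO's:

```python
import numpy as np

def ape_rmse(P, Q, S=None):
    """Overall absolute pose error: RMSE of trans(Q_i^{-1} S P_i) over all poses.

    P, Q are lists of 4x4 homogeneous estimated / ground-truth poses; S is the
    4x4 alignment transform obtained by least squares (identity if aligned).
    """
    if S is None:
        S = np.eye(4)
    errs = [np.linalg.norm((np.linalg.inv(Q[i]) @ S @ P[i])[:3, 3])
            for i in range(len(P))]
    return float(np.sqrt(np.mean(np.square(errs))))
```

A constant offset between the two trajectories shows up directly as the RMSE value, which is what makes APE a measure of global consistency.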
in the actual use process, the step of visualizing the track through the evo comprises the following steps:
the instructions for evo trace drawing are: evo _ traj
The necessary heel parameters are: format of data (tum/kitti/bag/euroc et al) + track file (single/multiple). For example: evo _ traj tum traj1.txt traj2.txt
The instruction is only basic information of a track, and if the track is to be drawn, an optional parameter-p or-plot is added, such as: evo _ traj tum traj1. txt-p
Drawing a track graph as shown in FIG. 2, wherein the change of the vehicle attitude information is shown in FIGS. 3 and 4, and FIG. 3 is a change graph of xyz coordinates in the track; FIG. 4 is a graph showing changes in roll, pitch, and yaw angles in a track.
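The roll, pitch and yaw curves of FIG. 4 can be obtained from the rotation part of each pose. A minimal sketch using the common ZYX Euler-angle convention (an assumption; the embodiment does not specify which convention EVO plots):

```python
import numpy as np

def rotmat_to_rpy(R):
    """Extract (roll, pitch, yaw) from a 3x3 rotation matrix, ZYX convention.

    Assumes the non-degenerate case |pitch| < pi/2 (no gimbal lock).
    """
    pitch = -np.arcsin(R[2, 0])
    roll = np.arctan2(R[2, 1], R[2, 2])
    yaw = np.arctan2(R[1, 0], R[0, 0])
    return roll, pitch, yaw
```

Applying this to the rotation block of every pose and plotting the three angles against time reproduces the kind of curves shown in FIG. 4.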
The track alignment and scaling step comprises:
the drawing command without trajectory alignment and scaling (leaving a position and angular offset at the initial position) is:
evo_traj tum realTraj.txt estTraj.txt -p
the drawing command with trajectory alignment and scaling is:
evo_traj tum estTraj.txt --ref realTraj.txt -p -s -a
--ref: specifies the reference trajectory (here realTraj.txt)
-s: corrects the scale during alignment
-a: aligns the estimated trajectory to the reference
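Conceptually, the -a/-s alignment corresponds to a least-squares (Umeyama-style) fit of a rotation, scale and translation between matched trajectory positions. A simplified sketch of that idea, not EVO's actual implementation:

```python
import numpy as np

def umeyama_alignment(x, y, with_scale=True):
    """Least-squares similarity transform (R, c, t) minimising ||y - (c*R@x + t)||^2.

    x, y: (N, 3) arrays of matched trajectory positions (estimate, reference).
    """
    mx, my = x.mean(axis=0), y.mean(axis=0)
    xc, yc = x - mx, y - my
    cov = yc.T @ xc / len(x)                      # cross-covariance of the point sets
    U, d, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:  # guard against a reflection
        S[2, 2] = -1.0
    R = U @ S @ Vt
    var_x = (xc ** 2).sum() / len(x)              # variance of the source points
    c = float(np.trace(np.diag(d) @ S) / var_x) if with_scale else 1.0
    t = my - c * R @ mx
    return R, c, t
```

The resulting (R, c, t) plays the role of the transformation matrix S used in the APE definition above.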
The trajectory evaluation step includes:
evo_rpe:
Command syntax: command, data format, reference trajectory, estimated trajectory, options
The formats include tum, euroc and other data formats; the options include alignment, plotting, saving results, and the like.
An example of a commonly used command:
evo_rpe tum traj1.txt traj2.txt -r full -va -p --plot_mode xz --save_results results/orb_rpe.zip
evo_ape:
Command syntax: command, data format, reference trajectory, estimated trajectory, options
The formats include tum, euroc and other data formats; the options include alignment, plotting, saving results, and the like.
An example of a commonly used command:
evo_ape tum traj1.txt traj2.txt -r full -va -p --plot_mode xz --save_results results/orb_ape.zip
The trajectory evaluation graph is shown in fig. 5. The real trajectory and the SLAM mapping trajectory are both drawn in the map, and the absolute error of each track point is marked; in practical application, whether the SLAM mapping effect meets the requirements of subsequent development can be judged from these absolute error values.
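The per-point marking described above can be sketched as flagging every trajectory point whose absolute error exceeds a limit (the threshold value is hypothetical):

```python
import numpy as np

def flag_bad_points(abs_errors, threshold=0.1):
    """Indices of trajectory points whose absolute error exceeds a (hypothetical) limit."""
    return np.flatnonzero(np.asarray(abs_errors) > threshold).tolist()
```

The flagged indices identify the map regions where the sensor setup or the fusion algorithm most needs improvement.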
The embodiment of the invention further provides an EVO-based SLAM map accuracy confirmation system, corresponding to the EVO-based SLAM map accuracy confirmation method of the above embodiments, the system comprising:
the track drawing module is used for acquiring map data in the instant positioning and map building (SLAM) map, and analyzing and processing the map data based on the trajectory evaluation tool EVO to draw a real track of the camera, the real track comprising real poses of the camera at a plurality of track points;
and the track evaluation module is used for carrying out track evaluation on the basis of the real track and the SLAM map so as to determine the error of each track point, and determining whether the precision of the SLAM map meets the requirement or not on the basis of the error.
Based on the same concept, an embodiment of the present invention further provides a schematic diagram of the physical structure of an electronic device. As shown in fig. 6, the electronic device may include: a processor 810, a communication interface 820, a memory 830 and a communication bus 840, wherein the processor 810, the communication interface 820 and the memory 830 communicate with each other via the communication bus 840. The processor 810 may invoke logic instructions in the memory 830 to perform the steps of the EVO-based SLAM map accuracy confirmation method as described in the various embodiments above. Examples include:
obtaining map data in an instant positioning and map building SLAM map, and analyzing and processing the map data based on a track evaluation tool EVO to draw a real track of a camera, wherein the real track comprises real poses of the camera at a plurality of track points;
and performing track evaluation based on the real track and the SLAM map to determine the error of each track point, and determining whether the precision of the SLAM map meets the requirement based on the error.
Based on the same concept, embodiments of the present invention further provide a non-transitory computer-readable storage medium storing a computer program, where the computer program includes at least one code, and the at least one code is executable by a master device to control the master device to implement the steps of the method for determining accuracy of an EVO-based SLAM map according to the embodiments. Examples include:
obtaining map data in an instant positioning and map building SLAM map, and analyzing and processing the map data based on a track evaluation tool EVO to draw a real track of a camera, wherein the real track comprises real poses of the camera at a plurality of track points;
and performing track evaluation based on the real track and the SLAM map to determine the error of each track point, and determining whether the precision of the SLAM map meets the requirement based on the error.
In an embodiment, the present application further provides a computer program, which is used to implement the above method embodiment when the computer program is executed by the main control device.
The program may be stored in whole or in part on a storage medium packaged with the processor, or in part or in whole on a memory not packaged with the processor.
In one embodiment, the present application further provides a processor, which is configured to implement the foregoing method embodiment. The processor may be a chip.
In summary, according to the EVO-based SLAM map accuracy confirmation method and system provided by the embodiment of the invention, the accuracy and effect of SLAM mapping are evaluated through the trajectory evaluation tool EVO, which supports the trajectory formats of multiple data sets (TUM, KITTI, EuRoC MAV and ROS bag) and conversion between these formats. The core function of EVO is to plot the trajectory of the camera and evaluate the error between the estimated trajectory and the ground truth. When the performance of a single sensor or the effect of a multi-sensor fusion algorithm is poor, the SLAM mapping effect can be accurately evaluated through the absolute pose error (APE) and the relative pose error (RPE), so that the sensor or the multi-sensor fusion algorithm can be improved, the accuracy of the map data is ensured, and the comfort of the driver and the safety of the vehicle are greatly improved. In practical application, whether the SLAM mapping effect meets the requirements of subsequent development can be judged from the absolute error values.
It should be noted that, in the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to relevant descriptions of other embodiments for parts that are not described in detail in a certain embodiment.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (devices), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including the preferred embodiment and all changes and modifications that fall within the scope of the invention.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.
Claims (9)
1. An EVO-based SLAM map accuracy confirmation method, characterized by comprising the following steps:
obtaining map data in an instant positioning and map building SLAM map, and analyzing and processing the map data based on a track evaluation tool EVO to draw a real track of a camera, wherein the real track comprises real poses of the camera at a plurality of track points;
and performing track evaluation based on the real track and the SLAM map to determine the error of each track point, and determining whether the precision of the SLAM map meets the requirement based on the error.
2. The EVO-based SLAM map accuracy confirmation method of claim 1, wherein the map data comprises one or more of a TUM data set, a KITTI data set, a EuRoC MAV data set, and a ROS bag data set.
3. The method for determining the accuracy of the SLAM map based on the EVO as claimed in claim 1, wherein analyzing and processing the map data based on a trajectory evaluation tool EVO specifically comprises:
plotting an estimated trajectory of the camera based on the trajectory evaluation tool EVO and the map data, the estimated trajectory comprising estimated poses of the camera at a plurality of track points; and calculating errors between the estimated poses and the real poses, and determining a transformation matrix between the estimated poses and the real poses.
4. The EVO-based SLAM map accuracy confirmation method of claim 3, wherein performing trajectory evaluation based on the real trajectory and the SLAM map to determine the error of each trajectory point specifically comprises:
determining the relative pose error and the absolute pose error between the estimated trajectory and the real trajectory.
5. The EVO-based SLAM map accuracy confirmation method of claim 4, wherein determining the relative pose error between the estimated trajectory and the real trajectory specifically comprises:
obtaining the estimated pose and the real pose corresponding to two frames of images separated by a set time interval Δ, and determining the relative pose error RPE of the i-th frame:
E_i = (Q_i^{-1} Q_{i+Δ})^{-1} (P_i^{-1} P_{i+Δ})
in the above formula, P_1, …, P_i, …, P_n are the estimated poses, P_i being the estimated pose at time i, and Q_1, …, Q_i, …, Q_n are the real poses, Q_i being the real pose at time i;
determining an overall value of the relative pose error based on the root mean square error RMSE:
RMSE(E_{1:n}, Δ) = ((1/m) Σ_{i=1}^{m} ‖trans(E_i)‖²)^{1/2}
in the formula, m = n − Δ, and trans(E_i) denotes the translational component of the relative pose error E_i; and calculating the average of RMSE(E_{1:n}, Δ) over all Δ:
RMSE(E_{1:n}) = (1/n) Σ_{Δ=1}^{n} RMSE(E_{1:n}, Δ)
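A minimal numeric sketch of the RPE computation in claim 5, with poses represented as 4×4 homogeneous matrices; the helper `se3` and the sample trajectories are illustrative assumptions:

```python
import numpy as np

def se3(t):
    """4x4 pose with identity rotation and translation t (illustration only)."""
    T = np.eye(4)
    T[:3, 3] = t
    return T

def rpe_rmse(P, Q, delta):
    """RMSE of the translational relative pose error for a fixed frame gap delta.

    P, Q: lists of 4x4 estimated / real poses.
    E_i = (Q_i^-1 Q_{i+delta})^-1 (P_i^-1 P_{i+delta})
    """
    m = len(P) - delta  # m = n - delta
    errs = []
    for i in range(m):
        dQ = np.linalg.inv(Q[i]) @ Q[i + delta]  # true relative motion
        dP = np.linalg.inv(P[i]) @ P[i + delta]  # estimated relative motion
        E = np.linalg.inv(dQ) @ dP
        errs.append(np.linalg.norm(E[:3, 3]))    # trans(E_i)
    return float(np.sqrt(np.mean(np.square(errs))))

# Ground truth moves 1 m per frame along x; the estimate drifts 0.1 m per step.
Q = [se3([i, 0.0, 0.0]) for i in range(5)]
P = [se3([1.1 * i, 0.0, 0.0]) for i in range(5)]
print(rpe_rmse(P, Q, delta=1))  # ≈ 0.1 (the constant per-step drift)
```

Averaging this quantity over all admissible values of `delta`, as in the last formula of claim 5, then yields the overall RPE value.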
6. The EVO-based SLAM map accuracy confirmation method of claim 5, wherein determining the absolute pose error between the estimated trajectory and the real trajectory specifically comprises:
aligning the coordinates of the estimated trajectory with those of the real trajectory, determining the transformation matrix S from the estimated trajectory to the real trajectory based on a least squares method, and determining the absolute pose error APE at time i:
F_i = Q_i^{-1} S P_i
and determining an overall value of the absolute pose error based on the root mean square error RMSE:
RMSE(F_{1:n}) = ((1/n) Σ_{i=1}^{n} ‖trans(F_i)‖²)^{1/2}
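The APE of claim 6 can be sketched in the same style; here the alignment S is assumed to be precomputed (identity by default), which is an illustrative simplification:

```python
import numpy as np

def ape_rmse(P, Q, S=None):
    """RMSE of the translational absolute pose error F_i = Q_i^-1 S P_i.

    P, Q: lists of 4x4 estimated / real poses; S: 4x4 alignment from the
    estimated to the real trajectory (assumed precomputed, e.g. by a
    least-squares fit; identity by default)."""
    S = np.eye(4) if S is None else S
    errs = [np.linalg.norm((np.linalg.inv(Qi) @ S @ Pi)[:3, 3])
            for Pi, Qi in zip(P, Q)]
    return float(np.sqrt(np.mean(np.square(errs))))

def pose(t):
    """4x4 pose with identity rotation and translation t (illustration only)."""
    T = np.eye(4)
    T[:3, 3] = t
    return T

# Estimate offset from ground truth by a constant 0.2 m along x.
Q = [pose([i, 0.0, 0.0]) for i in range(5)]
P = [pose([i + 0.2, 0.0, 0.0]) for i in range(5)]
print(ape_rmse(P, Q))  # ≈ 0.2
```

Unlike the RPE, the APE compares poses in a common global frame, so it is sensitive to accumulated drift and requires the prior coordinate alignment that claim 6 describes.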
7. An EVO-based SLAM map accuracy confirmation system, characterized by comprising:
a trajectory drawing module, configured to obtain map data of a simultaneous localization and mapping (SLAM) map, and to analyze and process the map data based on the trajectory evaluation tool EVO to draw the real trajectory of the camera, wherein the real trajectory comprises the real poses of the camera at a plurality of trajectory points; and
a trajectory evaluation module, configured to perform trajectory evaluation based on the real trajectory and the SLAM map to determine the error of each trajectory point, and to determine, based on the error, whether the accuracy of the SLAM map meets the requirement.
8. An electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the program, implements the steps of the EVO-based SLAM map accuracy confirmation method according to any one of claims 1 to 6.
9. A non-transitory computer-readable storage medium having stored thereon a computer program, wherein the computer program, when executed by a processor, implements the steps of the EVO-based SLAM map accuracy confirmation method according to any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111537125.4A CN114494391A (en) | 2021-12-15 | 2021-12-15 | SLAM map precision confirmation method and system based on EVO |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114494391A true CN114494391A (en) | 2022-05-13 |
Family
ID=81493486
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111537125.4A Pending CN114494391A (en) | 2021-12-15 | 2021-12-15 | SLAM map precision confirmation method and system based on EVO |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114494391A (en) |
History
- 2021-12-15: Application CN202111537125.4A filed; patent CN114494391A, status Pending
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116659529A (en) * | 2023-05-26 | 2023-08-29 | 小米汽车科技有限公司 | Data detection method, device, vehicle and storage medium |
CN116659529B (en) * | 2023-05-26 | 2024-02-06 | 小米汽车科技有限公司 | Data detection method, device, vehicle and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110673115B (en) | Combined calibration method, device, equipment and medium for radar and integrated navigation system | |
CN110160542B (en) | Method and device for positioning lane line, storage medium and electronic device | |
CN110968087B (en) | Calibration method and device for vehicle control parameters, vehicle-mounted controller and unmanned vehicle | |
CN109901138B (en) | Laser radar calibration method, device, equipment and storage medium | |
Lee et al. | Robust multirate on-road vehicle localization for autonomous highway driving vehicles | |
US10369993B2 (en) | Method and device for monitoring a setpoint trajectory to be traveled by a vehicle for being collision free | |
WO2020140431A1 (en) | Camera pose determination method and apparatus, electronic device and storage medium | |
CN112835085B (en) | Method and device for determining vehicle position | |
KR102620325B1 (en) | Methods, devices, electronic devices and storage media for determining traffic flow information | |
CN112146682B (en) | Sensor calibration method and device for intelligent automobile, electronic equipment and medium | |
CN107782304B (en) | Mobile robot positioning method and device, mobile robot and storage medium | |
CN113183975B (en) | Control method, device, equipment and storage medium for automatic driving vehicle | |
CN113920198B (en) | Coarse-to-fine multi-sensor fusion positioning method based on semantic edge alignment | |
CN111176270A (en) | Positioning using dynamic landmarks | |
CN112286049A (en) | Motion trajectory prediction method and device | |
CN111812669B (en) | Winding machine inspection device, positioning method thereof and storage medium | |
CN115127576A (en) | Path planning method, device, chip, terminal, electronic equipment and storage medium | |
CN114943952A (en) | Method, system, device and medium for obstacle fusion under multi-camera overlapped view field | |
EP3644293B1 (en) | Travel control method and travel control device | |
CN114494391A (en) | SLAM map precision confirmation method and system based on EVO | |
CN110637209A (en) | Method, apparatus, and computer-readable storage medium having instructions for estimating a pose of a motor vehicle | |
CN112925302A (en) | Robot pose control method and device | |
CN110083158B (en) | Method and equipment for determining local planning path | |
CN112304322B (en) | Restarting method after visual positioning failure and vehicle-mounted terminal | |
CN108961337B (en) | Vehicle-mounted camera course angle calibration method and device, electronic equipment and vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |