CN114964212B - Multi-machine collaborative fusion positioning and mapping method oriented to unknown space exploration


Info

Publication number
CN114964212B
CN114964212B
Authority
CN
China
Prior art keywords
odometer
point
degradation
fusion
pose
Prior art date
Legal status
Active
Application number
CN202210622880.0A
Other languages
Chinese (zh)
Other versions
CN114964212A (en)
Inventor
徐雍
凌胜财
陈梓杰
饶红霞
黄增鸿
鲁仁全
Current Assignee
Guangdong University of Technology
Original Assignee
Guangdong University of Technology
Priority date
Filing date
Publication date
Application filed by Guangdong University of Technology filed Critical Guangdong University of Technology
Priority to CN202210622880.0A priority Critical patent/CN114964212B/en
Publication of CN114964212A publication Critical patent/CN114964212A/en
Application granted granted Critical
Publication of CN114964212B publication Critical patent/CN114964212B/en

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/38: Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804: Creation or updating of map data
    • G01C21/3833: Creation or updating of map data characterised by the source of data
    • G01C21/3841: Data obtained from two or more sources, e.g. probe vehicles
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/23: Clustering techniques
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/25: Fusion techniques
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00: Reducing energy consumption in communication networks
    • Y02D30/70: Reducing energy consumption in communication networks in wireless communication networks
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00: Road transport of goods or passengers
    • Y02T10/10: Internal combustion engine [ICE] based vehicles
    • Y02T10/40: Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a multi-machine collaborative fusion positioning and mapping method for unknown space exploration. A lightweight multi-sensor fusion odometry method realizes tightly coupled state estimation across an inertial odometer, a laser-inertial odometer and a vision-inertial odometer. A degradation identification and fusion decision algorithm based on observability analysis judges whether each odometer is degraded and fuses the odometer information according to the degradation state, improving the robot's pose estimation in feature-sparse scenes, realizing multi-source information fusion, and providing a reliable guarantee for autonomous navigation, real-time obstacle avoidance and map construction in complex unknown spaces. A multi-machine collaborative mapping algorithm realizes map information sharing and optimized allocation of computing resources among the robots, completing the construction of a globally consistent map and the allocation of collaborative mapping tasks over an information sharing network, thereby improving the efficiency with which the robots map an unknown space.

Description

Multi-machine collaborative fusion positioning and mapping method oriented to unknown space exploration
Technical Field
The invention relates to the technical field of robots, in particular to a multi-machine collaborative fusion positioning and mapping method for unknown space exploration.
Background
Complex unknown spaces such as underground karst caves and primeval forests are typically characterized by blocked satellite positioning signals, limited communication, complex terrain, and the impossibility of placing landmark points in advance. Without prior environmental information, it is therefore essential that a robot exploring such a space model its surroundings and estimate its own pose using only onboard sensors during motion. In recent years researchers have continuously improved real-time positioning and mapping technology, but most application scenarios target indoor or open outdoor environments, and relatively little research addresses unknown-space exploration.
A single sensor is limited in perception capability and robustness and cannot meet the varied challenges posed by complex environments, so multi-sensor fusion is currently the mainstream approach. Because a robot exploring an unknown space is constrained by its own payload and power consumption, its onboard computing resources are limited, and a communication-limited unknown space offers no access to cloud computing, making it very challenging to run a multi-sensor fusion algorithm on the robot platform in real time. Existing schemes reduce computation mainly by coupling the sensors loosely; this neglects the characteristics of the individual sensors and is not conducive to improving positioning accuracy.
When exploring an unknown space, a robot encounters feature-sparse scenes with many repeated geometric structures or uniform texture. Lacking constraints in certain directions, such scenes generally degrade the state estimation of some sensors and seriously affect system stability. Existing methods mainly judge whether a sensor is degraded by comparing predicted values with observed values; they do not analyze the sensor's own observation equations and therefore have certain limitations.
Limited by sensor mounting positions, occlusion by surrounding obstacles and other factors, a robot exploring an unknown space has large sensing blind areas. Moreover, when a single robot executes a long-duration mapping task, mapping takes a long time, demands substantial computing power, and struggles to meet real-time requirements. Research on multi-robot collaborative mapping is still in its infancy and mostly targets indoor mapping, so it cannot meet the demands that unknown-space exploration places on multi-robot collaboration.
Disclosure of Invention
The invention aims to provide a multi-machine collaborative fusion positioning and mapping method for unknown space exploration that realizes high-precision positioning and mapping of robots in unknown, complex environments.
In order to achieve this purpose, the invention adopts the following technical scheme. The multi-machine collaborative fusion positioning and mapping method for unknown space exploration comprises the following steps:
a lightweight multi-sensor fusion odometry method designs an inertial odometer, a laser-inertial odometer and a vision-inertial odometer, realizing the fusion of multi-sensor information; the inertial odometer, the laser-inertial odometer and the vision-inertial odometer are each used to estimate pose information of the robot;
a degradation identification and fusion decision strategy algorithm based on observability analysis judges whether the laser-inertial odometer and the vision-inertial odometer are degraded; if degradation exists, degradation processing is performed on the pose estimation information acquired by the laser-inertial odometer and the vision-inertial odometer, and the degradation-processed pose estimation information is fused with the pose estimation information of the inertial odometer according to a fusion decision to obtain the total pose estimation output; if no degradation exists, the pose estimation information acquired by the inertial odometer, the laser-inertial odometer and the vision-inertial odometer is fused according to a fusion decision to obtain the total pose estimation output;
and in the multi-machine cooperative mapping algorithm, the construction of a globally consistent map and the distribution of a cooperative mapping task are completed among all the robots through the information sharing network.
As an alternative embodiment, the method for fusing multi-sensor information by the odometer system includes:
the inertial odometer forms a dead reckoning system from an inertial measurement unit and an airspeed meter, and the dead reckoning system is used to provide high-frequency initial pose estimation for the robot; the height constraint generated by the barometer and the laser ranging module eliminates the inertial odometer's pose drift in the vertical direction;
the laser-inertial odometer receives initial pose estimation of the inertial odometer and observed quantity of the solid-state laser radar, and provides a corrected pose for the inertial odometer by constructing an iterative error state Kalman filtering estimator;
the vision-inertial odometer receives the initial pose estimation of the inertial odometer, image data collected by a vision sensor, and point cloud data, and a local sliding-window optimization estimator is constructed using formula (1):

$$\hat{\mathcal{X}}_k=\arg\min_{\mathcal{X}_k}\left\{\left\|r_0\right\|_{\Sigma_0}^2+\sum_{(i,j)\in\mathcal{K}_k}\left(\left\|r_{ij}^{LIO}\right\|_{\Sigma_{LIO}}^2+\left\|r_{ij}^{VIO}\right\|_{\Sigma_{VIO}}^2+\left\|r_{ij}^{IMU}\right\|_{\Sigma_{IMU}}^2\right)\right\}\quad(1)$$

wherein $\hat{\mathcal{X}}_k$ denotes the optimal estimate of the state set $\mathcal{X}_k$; $\mathcal{K}_k$ denotes the set of all keyframes in the sliding window at time $k$; $r_{ij}^{LIO}$, $r_{ij}^{VIO}$ and $r_{ij}^{IMU}$ respectively denote the relative constraints of the laser-inertial odometer, the vision-inertial odometer and the inertial odometer between time $i$ and time $j$; $r_0$ denotes the prior constraint on the first state quantity in the sliding window; $\Sigma_{LIO}$, $\Sigma_{VIO}$, $\Sigma_{IMU}$ and $\Sigma_0$ respectively denote the covariance matrices of the constraints, the norms being the corresponding Mahalanobis norms.
As an alternative embodiment, the method for determining whether the laser-inertia odometer and the vision-inertia odometer are degraded based on the observability analysis comprises the following steps:
constructing information matrixes of the laser-inertia odometer and the vision-inertia odometer according to the laser constraint and the vision constraint;
performing eigenvalue decomposition on the information matrix, and characterizing the degradation degree of the odometer by the magnitude of the eigenvalues:
if the characteristic value is smaller than the set threshold value, judging that the odometer is degraded, and regarding the characteristic vector corresponding to the characteristic value as a degradation direction vector; and if the characteristic value is greater than or equal to the set threshold value, judging that the odometer is not degraded.
As an alternative embodiment, the degradation process comprises the steps of:
determining, from the eigenvalues and their corresponding eigenvectors, the direction vectors of the degradation and non-degradation directions of the nonlinear optimization solution of the pose estimation problem;
decomposing the nonlinear optimization solution of the pose estimation into a projection of a real solution in a non-degradation direction and a projection of a predicted value in a degradation direction, considering that constraint is added in a partial subspace of the pose estimation problem, not performing iterative optimization in the degradation direction, and only outputting the solution in the non-degradation direction;
by discarding the component of the optimal solution along the degradation direction, adding only the increment of the solution in the non-degradation direction to the state quantity to obtain the degradation-processed pose estimation information.
As an alternative embodiment, obtaining the total pose estimation output includes the following steps:
inputting the degradation state information of the laser-inertial subsystem and the vision-inertial subsystem into the fusion decision table, and obtaining the decision output of the fusion decision table based on the degradation state information;
fusing the decision output of the decision table with the observation data of each odometer subsystem, deciding the total pose estimation output of the odometer system, and inputting it into a back-end optimization system;
the back-end optimization system constructs the relationships between the state quantities at all times through a pose graph and models a batch recursive smoothing problem, as shown in formula (2):

$$\hat{\mathcal{X}}_k=\arg\min_{\mathcal{X}_k}\sum_{(i,j)\in\mathcal{K}_k}\left(\left\|r_{ij}^{odom}\right\|_{\Sigma_{odom}}^2+\left\|r_{ij}^{loop}\right\|_{\Sigma_{loop}}^2\right)\quad(2)$$

wherein $\hat{\mathcal{X}}_k$ denotes the optimal estimate of the state set $\mathcal{X}_k$; $\mathcal{K}_k$ denotes the set of all keyframes up to time $k$; $r_{ij}^{odom}$ and $r_{ij}^{loop}$ respectively denote the odometry and loop-closure relative constraints between time $i$ and time $j$; $\Sigma_{odom}$ and $\Sigma_{loop}$ respectively denote the covariance matrices of the constraints.
As an alternative embodiment, the step of completing the construction of the global consistent map and the distribution of the collaborative map construction task between the robots through the information sharing network includes the following steps:
based on a clustering algorithm constrained by communication quality, cluster structure and field of view, maximizing under standard constraints a coupling metric composed of UAV communication quality and the cluster's operational field-of-view set, obtaining a set of optimal solutions that partition the UAV cluster into information sharing areas, and selecting a network master control unit in each information sharing area on the principle of optimal computing resources;
performing information association by searching for overlapping areas between robot maps, determining the coordinate transformation relationships between robots, and constructing a globally consistent map based on those relationships; overlap detection is based on a keyframe bag-of-words model, which uses the similarity between two images to judge whether they come from the same scene; if two robots have passed through the same scene, their maps are likely to overlap;
after receiving the similar keyframes and local map data sent by a robot, the master control unit first performs feature matching between those keyframes and its own keyframes, obtains an initial pose transformation matrix from the epipolar constraint relationship, provides it as the initial value to a normal distribution transformation algorithm, iteratively computes an accurate pose transformation matrix with that algorithm, and fuses the robot's local map with the global map maintained by the master control unit according to the final pose transformation matrix;
the normal distribution transformation algorithm performs registration using the probability density function of the point distribution and determines the best match between point clouds with an optimization algorithm, and comprises the following steps:
step1: gridding, namely dividing a global point cloud map maintained by a main control unit into three-dimensional cells with uniform size;
step2: calculating the multi-dimensional normal distribution parameters; the mean vector $q$ and covariance matrix $\Sigma$ of the spatial points in each cell are calculated according to formulas (3) and (4):

$$q=\frac{1}{n}\sum_{k=1}^{n}x_k\quad(3)$$

$$\Sigma=\frac{1}{n}\sum_{k=1}^{n}\left(x_k-q\right)\left(x_k-q\right)^{T}\quad(4)$$

where $n$ is the total number of points in each cell and $x_k$ is a point in the cell;

step3: coordinate transformation; let the pose transformation matrix be $T$, with its initial value provided by the epipolar constraint relationship:

$$T=\begin{bmatrix}R&t\\0&1\end{bmatrix}$$

where $R$ is a $3\times 3$ rotation matrix and $t$ is a $3\times 1$ translation vector; each point in the local point cloud map is transformed into the global map using the pose transformation matrix $T$, giving transformed points $y_i\,(i=1,\dots,m)$, where $m$ is the total number of points in the local point cloud map;

step4: calculating the probability density; based on the normal distribution parameters of the cell containing the transformed point $y_i$, the probability density of each transformed point is calculated according to formula (5):

$$p(y_i)=\frac{1}{(2\pi)^{3/2}\sqrt{\det\Sigma}}\exp\left(-\frac{\left(y_i-q\right)^{T}\Sigma^{-1}\left(y_i-q\right)}{2}\right)\quad(5)$$

where $\det\Sigma$ denotes the determinant of the covariance matrix $\Sigma$;

step5: creating an objective function; the probability densities of all points are combined by formula (6) into an objective function for evaluating the pose transformation matrix $T$:

$$E(T)=-\sum_{i=1}^{m}p\left(y_i\right)\quad(6)$$
step6: optimizing an objective function, and solving the minimum value of the objective function by using a Newton optimization method so as to obtain an optimal pose transformation matrix T;
boundary points between the mapped area and the unknown area are found based on a rapidly-exploring random tree (RRT) algorithm, which comprises the following steps:
step1: calling an Init function to initialize the tree T;
step2: calling the Sample function to randomly sample a point $x_{rand}$ in the state space $X$;
step3: calling the Nearest function to find the point $x_{near}$ on the current tree $T$ that is closest to $x_{rand}$;
step4: calling the Steer function with $x_{near}$ as the starting point and $\varepsilon$ as the step size, iteratively checking whether the points between $x_{near}$ and $x_{rand}$ are collision-free; if a collision occurs, stopping the iteration, at which time the farthest passable point from $x_{near}$ becomes the new node $x_{new}$;
step5: adding the new node $x_{new}$, together with the new edge between $x_{near}$ and $x_{new}$, to the tree $T$, completing the expansion of the tree and entering the next iteration;
step6: after the iteration is finished, generating a new tree $T$;
the main control unit screens a large number of boundary points obtained by the global and local detectors by using a mean shift clustering algorithm to obtain a required task target point, wherein the mean shift clustering algorithm comprises the following steps:
step1: randomly selecting one of the unmarked boundary points as the starting center point $x_{center}$;
step2: finding and marking all boundary points within the spherical region of radius $r$ centered at $x_{center}$;
step3: computing the vectors from $x_{center}$ to each boundary point in the spherical region and summing them to obtain the drift vector $x_{shift}$;
step4: moving the center point according to $x_{center}=x_{center}+x_{shift}$, i.e., $x_{center}$ moves along the direction of $x_{shift}$ by a distance of $\|x_{shift}\|$;
step5: repeating step2 to step4 until $\|x_{shift}\|$ is sufficiently small, indicating that the iteration has converged; recording the current center point $x_{center}$; all boundary points encountered during the iteration are marked;
step6: if, at convergence, the distance between the current center point $x_{center}$ and an existing center point is smaller than a threshold, retaining whichever center point has more boundary points within its spherical region; otherwise, taking $x_{center}$ as a new center point;
step7: repeating the previous steps until all points are marked as visited;
the master control unit will assign the task target points to the robots in the shared network using an auction mechanism.
The invention also provides a multi-machine collaborative fusion positioning and mapping device for unknown space exploration, which comprises a memory, a processor and a multi-machine collaborative fusion positioning and mapping method program for unknown space exploration, wherein the multi-machine collaborative fusion positioning and mapping method program for unknown space exploration is stored in the memory and can run on the processor, and the steps of the multi-machine collaborative fusion positioning and mapping method for unknown space exploration are realized when the processor executes the multi-machine collaborative fusion positioning and mapping method program for unknown space exploration.
A computer readable storage medium, on which a multi-machine collaborative fusion positioning and mapping method program for unknown space exploration is stored, and when being executed by a processor, the multi-machine collaborative fusion positioning and mapping method program for unknown space exploration realizes the steps of the multi-machine collaborative fusion positioning and mapping method for unknown space exploration.
Compared with the prior art, the embodiment of the invention has the following beneficial effects:
the lightweight multi-sensor fusion mileage calculation method designed by the invention can realize the close coupling state estimation of the inertial odometer, the laser-inertial odometer and the vision-inertial odometer. Compared with the existing method, the method can effectively improve the accuracy of robot alignment posture estimation and the robustness for dealing with various challenges in unknown space, and realize the real-time operation of the multi-sensor fusion algorithm on an airborne computer.
The degradation recognition and fusion decision algorithm based on the observability analysis can judge whether the odometer is degraded or not and can complete the fusion of the odometer information based on the degradation state of the odometer. Compared with the existing method, the pose estimation result of the robot in the characteristic sparse scene can be improved, the multi-source information fusion of the robot is realized, and the reliable guarantee is provided for autonomous navigation, real-time obstacle avoidance and map construction of the robot in a complex unknown space.
The multi-machine collaborative map building algorithm designed by the invention can realize the map information sharing of each robot and the optimized distribution of computing resources, and complete the construction of a global consistent map and the distribution of collaborative map building tasks based on an information sharing network. Compared with the existing method, the method designs a multi-machine collaborative map building algorithm for unknown space exploration, and improves the efficiency of the robot in building the unknown space map.
Drawings
FIG. 1 is a schematic diagram of an algorithmic framework in accordance with one embodiment of the present invention;
FIG. 2 is a schematic diagram of a sensor information fusion process according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of an iterative error Kalman filter of one embodiment of the present invention;
FIG. 4 is a schematic representation of an odometer fusion process according to one embodiment of the invention;
FIG. 5 is a schematic diagram of degradation determination based on an observability analysis according to one embodiment of the present invention;
FIG. 6 is a schematic diagram of an information fusion framework according to one embodiment of the invention;
FIG. 7 is a schematic diagram of an overlap region detection process according to an embodiment of the present invention;
FIG. 8 is a schematic flow chart of constructing a globally consistent map, in accordance with one embodiment of the present invention;
FIG. 9 is a schematic diagram of a collaborative mapping system framework according to an embodiment of the invention;
figure 10 is a schematic flow diagram of an auction mechanism according to one embodiment of the invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention and are not to be construed as limiting the present invention.
The method and system for multi-machine collaborative fusion positioning and mapping oriented to unknown space exploration according to the embodiment of the present invention are described below with reference to fig. 1 to 10.
For complex unknown space environments, the autonomous positioning and environment sensing capability of robots urgently needs improvement. As shown in fig. 1, the robustness and smoothness of positioning and mapping while a robot explores an unknown, complex environment are improved by designing a lightweight multi-sensor fusion odometry method, a degradation identification and fusion decision strategy algorithm based on observability analysis, and a multi-machine collaborative mapping algorithm.
As an optional embodiment, a lightweight multi-sensor fusion odometry method is designed. As shown in fig. 2, it adopts a redundant design with three subsystems, an inertial odometer, a laser-inertial odometer and a vision-inertial odometer, realizing the fusion of multi-sensor information and enhancing the robot's robustness against the varied challenges of an unknown space.
Firstly, the inertial odometer forms a dead reckoning system from an inertial measurement unit and an airspeed meter, providing high-frequency initial pose estimation for the robot. Meanwhile, the height constraint generated by the barometer and the laser ranging module effectively eliminates pose drift in the vertical direction. Because the inertial odometer relies only on these proprioceptive sensors to estimate the robot's state, the accuracy of the pose estimate can be guaranteed over a short time even if the vision-inertial and laser-inertial odometers degrade or fail.
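As an illustration of this dead-reckoning idea, the sketch below propagates a pose from the IMU angular rate and the airspeed and applies a complementary height correction; the function names, the first-order quaternion integration and the correction gain are assumptions for exposition, not the patent's implementation.

```python
import numpy as np

def quat_mul(a, b):
    # Hamilton product of quaternions [w, x, y, z]
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def rotate(q, v):
    # rotate body-frame vector v into the world frame by unit quaternion q
    qv = np.concatenate(([0.0], v))
    q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
    return quat_mul(quat_mul(q, qv), q_conj)[1:]

def dead_reckon(p, q, gyro, airspeed, dt):
    # first-order attitude integration from the IMU angular rate
    q = q + 0.5 * quat_mul(q, np.concatenate(([0.0], gyro))) * dt
    q = q / np.linalg.norm(q)
    # the airspeed meter gives the body-frame forward velocity
    p = p + rotate(q, np.array([airspeed, 0.0, 0.0])) * dt
    return p, q

def constrain_height(p, z_measured, gain=0.1):
    # complementary correction toward the barometer / laser-ranging height
    p[2] += gain * (z_measured - p[2])
    return p
```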
Secondly, the laser-inertial odometer receives initial pose estimation and solid-state laser radar observation of the inertial odometer, and as shown in fig. 3, an iterative error state Kalman filtering estimator is constructed to provide a corrected pose for the inertial odometer, so that an accumulated error of the inertial odometer is eliminated.
And finally, the vision-inertial odometer receives the initial pose estimation of the inertial odometer, the vision sensor image data and the point cloud data, and a local sliding-window optimization estimator is constructed using the following formula (1):

$$\hat{\mathcal{X}}_k=\arg\min_{\mathcal{X}_k}\left\{\left\|r_0\right\|_{\Sigma_0}^2+\sum_{(i,j)\in\mathcal{K}_k}\left(\left\|r_{ij}^{LIO}\right\|_{\Sigma_{LIO}}^2+\left\|r_{ij}^{VIO}\right\|_{\Sigma_{VIO}}^2+\left\|r_{ij}^{IMU}\right\|_{\Sigma_{IMU}}^2\right)\right\}\quad(1)$$

wherein $\hat{\mathcal{X}}_k$ denotes the optimal estimate of the state set $\mathcal{X}_k$; $\mathcal{K}_k$ denotes the set of all keyframes in the sliding window at time $k$; $r_{ij}^{LIO}$, $r_{ij}^{VIO}$ and $r_{ij}^{IMU}$ respectively denote the relative constraints of the laser-inertial odometer, the vision-inertial odometer and the inertial odometer between time $i$ and time $j$; $r_0$ denotes the prior constraint on the first state quantity in the sliding window; $\Sigma_{LIO}$, $\Sigma_{VIO}$, $\Sigma_{IMU}$ and $\Sigma_0$ respectively denote the covariance matrices of the constraints. This fusion method uses the point cloud data to assist the visual depth estimation, overcoming the problem of inaccurate depth estimates degrading the pose estimate.
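A minimal Python sketch of the cost in formula (1) follows. The residual callables r_lio, r_vio, r_imu and r_prior are placeholders standing in for the constraint functions (assumptions, not the patent's API); each residual is whitened by the Cholesky factor of its covariance so that summing squared terms realizes the Mahalanobis norms.

```python
import numpy as np

def whiten(r, cov):
    # Mahalanobis whitening: whiten(r, cov) @ whiten(r, cov) == r' cov^-1 r
    return np.linalg.solve(np.linalg.cholesky(cov), r)

def sliding_window_cost(states, pairs, r_lio, r_vio, r_imu, r_prior,
                        cov_lio, cov_vio, cov_imu, cov_prior):
    terms = [whiten(r_prior(states[0]), cov_prior)]   # prior on the first state
    for i, j in pairs:                                # keyframe pairs in the window
        terms.append(whiten(r_lio(states[i], states[j]), cov_lio))
        terms.append(whiten(r_vio(states[i], states[j]), cov_vio))
        terms.append(whiten(r_imu(states[i], states[j]), cov_imu))
    r = np.concatenate(terms)
    return float(r @ r)   # the scalar objective minimized over the window states
```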
Further, an embodiment of designing a degradation recognition and fusion decision algorithm based on the observability analysis is as follows:
pose estimation of the robot can be respectively obtained through the inertial odometer, the vision-inertial odometer and the laser-inertial odometer, and pose estimation information of the three odometers needs to be fused in the next step.
For the vision-inertial and laser-inertial odometers, the vision sensor needs texture information to determine feature points, while the laser sensor needs geometric structure. In feature-sparse scenes with uniform texture or repeated geometry, such as long straight corridors, long tunnels, single-sided walls, open roads and bridges, the vision and laser sensors cannot extract enough feature points, which degrades the state estimators of the vision-inertial and laser-inertial odometers and affects the accuracy and stability of pose estimation. To address this possible degradation, the invention judges whether an odometer is degraded based on observability analysis and fuses the pose estimation information of the odometer subsystems according to their degradation states.
As shown in fig. 4, since the inertial odometer is not affected by feature-sparse scenes, its pose estimate is input directly into the fusion decision system. The laser-inertial and vision-inertial odometers must first be checked for degradation: if an odometer is not degraded, its pose estimation information is input directly into the fusion decision system, like the inertial odometer's; the pose estimation information of a degraded odometer is processed before being input into the fusion decision system.
As shown in FIG. 5, the present invention determines whether the visual-inertial odometer and the laser-inertial odometer are degraded in a characteristic sparse scene based on an observability analysis.
Firstly, according to laser constraint and visual constraint, information matrixes of a laser-inertial odometer and a visual-inertial odometer are constructed, and further, the problem of observability analysis is converted into the problem of analyzing an observable subspace. And then, carrying out eigenvalue decomposition on the information matrix, representing the degradation degree of the odometer according to the magnitude of the eigenvalue, considering that the odometer is degraded when the eigenvalue is smaller than a set threshold value, and regarding the eigenvector corresponding to the eigenvalue as a degradation direction vector.
When the laser-inertial odometer and/or the vision-inertial odometer is judged to be degraded, its pose estimation information must be processed. For an odometer system, the pose estimation problem can ultimately be cast as a nonlinear optimization problem, and degradation of the odometer does not necessarily make that problem unsolvable: even if the odometer is degraded in some directions, the problem can still be solved in the others. The direction vectors of the degraded and non-degraded directions of the nonlinear optimization solution can be determined from the eigenvalues and their corresponding eigenvectors. The invention therefore decomposes the nonlinear optimization solution of the pose estimate into the projection of the true solution onto the non-degraded directions and the projection of the predicted value onto the degraded directions; this amounts to adding constraints in a subspace of the pose estimation problem, performing no iterative optimization along the degraded directions and outputting the solution only in the non-degraded directions. Finally, by discarding the component of the optimal solution along the degraded directions, only the increment of the solution in the non-degraded directions is added to the state quantity, preventing the solution from drifting in the null space and ensuring its accuracy. The processed pose estimation information is then input into the fusion decision system; if the odometer is degraded in all directions, its pose estimation information is discarded rather than input into the fusion decision system.
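The following sketch illustrates the eigenvalue test and the solution remapping described above; the threshold, matrix shapes and the additive state update are simplifying assumptions.

```python
import numpy as np

def degraded_directions(info_matrix, threshold):
    # eigen-decomposition of the symmetric information matrix; eigenvalues
    # below the threshold indicate degraded directions
    eigvals, eigvecs = np.linalg.eigh(info_matrix)
    return eigvecs[:, eigvals < threshold]

def remap_increment(dx, v_deg):
    # discard the component of the update along the degraded subspace so the
    # state is only corrected in well-constrained directions
    if v_deg.shape[1] == 0:
        return dx                      # no degradation: use the full increment
    return dx - v_deg @ (v_deg.T @ dx)

# usage sketch (H is the information matrix, dx the optimization increment):
# v_deg = degraded_directions(H, threshold)
# if v_deg.shape[1] == H.shape[0]:
#     pass                             # degraded in all directions: discard
# else:
#     x = x + remap_increment(dx, v_deg)
```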
The fusion decision system receives the pose estimation information of the three odometers. For the information fusion problem of multiple odometer subsystems, the invention designs a decision-driven fusion method based on degradation state. The degradation state information of the laser-inertial and vision-inertial subsystems is first input into a decision table, and the decision output of the table is obtained from the degradation states of the odometer subsystems; an odometer clearly receives a higher weight in the decision output when it is not degraded or only slightly degraded. The decision data of the table is then fused with the observation data of each odometer subsystem to decide the total pose estimation output of the lightweight multi-sensor fusion odometer, which is input into the back-end optimization system.
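As a toy illustration of such a decision table, the sketch below keys fusion weights on the two subsystems' degradation flags; the weight values and the componentwise blending are illustrative assumptions only.

```python
import numpy as np

FUSION_WEIGHTS = {
    # (lio_degraded, vio_degraded): (w_lio, w_vio, w_imu) -- assumed values
    (False, False): (0.45, 0.45, 0.10),
    (True,  False): (0.10, 0.70, 0.20),
    (False, True):  (0.70, 0.10, 0.20),
    (True,  True):  (0.00, 0.00, 1.00),   # fall back on the inertial odometer
}

def fuse(lio_pose, vio_pose, imu_pose, lio_degraded, vio_degraded):
    w_lio, w_vio, w_imu = FUSION_WEIGHTS[(lio_degraded, vio_degraded)]
    # linear blending of pose vectors; a full implementation would blend
    # rotations on the SE(3) manifold rather than componentwise
    return (w_lio * np.asarray(lio_pose) + w_vio * np.asarray(vio_pose)
            + w_imu * np.asarray(imu_pose))
```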
The back-end optimization system constructs the relationships among the state quantities at all times through a pose graph and models a batch recursive smoothing problem, as shown in formula (2):

$$\hat{\mathcal{X}}_k=\arg\min_{\mathcal{X}_k}\sum_{(i,j)\in\mathcal{K}_k}\left(\left\|r_{ij}^{odom}\right\|_{\Sigma_{odom}}^2+\left\|r_{ij}^{loop}\right\|_{\Sigma_{loop}}^2\right)\quad(2)$$

wherein $\hat{\mathcal{X}}_k$ denotes the optimal estimate of the state set $\mathcal{X}_k$; $\mathcal{K}_k$ denotes the set of all keyframes up to time $k$; $r_{ij}^{odom}$ and $r_{ij}^{loop}$ respectively denote the odometry and loop-closure relative constraints between time $i$ and time $j$; $\Sigma_{odom}$ and $\Sigma_{loop}$ respectively denote the covariance matrices of the constraints. Finally, a globally consistent trajectory is obtained by solving formula (2), from which a dense three-dimensional point cloud map is further obtained.
As shown in fig. 6, each odometer subsystem performs state estimation by fusing multi-sensor data in a tightly coupled manner, and then performs fusion decision on pose estimation information of each odometer in a loosely coupled manner. The fusion mode combining tight coupling and loose coupling improves the positioning precision and ensures the stability of the system.
Further, an embodiment of designing a multi-machine collaborative mapping algorithm is as follows:
Firstly, for the problems of a complex unknown-space network and limited communication, the invention uses a clustering algorithm constrained by communication quality, cluster structure and field of view: under standard constraints it maximizes a coupling metric composed of UAV communication quality and the cluster's operational field-of-view set, obtains a set of good solutions that efficiently partition the UAV cluster into information sharing areas, and selects a network master control unit in each information sharing area on the principle of optimal computing resources.
Secondly, for the problems that robots rarely encounter one another in an unknown space and that road signs cannot be arranged in advance, the invention constructs a globally consistent map from map overlap areas. Information association is performed by searching for overlapping areas between the robot maps, from which the coordinate transformation relationships between robots are determined and a globally consistent map is constructed. Overlap detection is accomplished with a keyframe-based bag-of-words model, which measures the similarity between two images to judge whether they come from the same scene. The bag-of-words model is therefore used to compare keyframes between two robots to judge whether they have passed through the same scene; if so, an overlapping area between their maps is likely to exist.
Meanwhile, as shown in fig. 7, in order to avoid that the real-time performance of system communication is affected by continuously transmitting three-dimensional map data, the master control unit may first send the global descriptor corresponding to the key frame to each robot in the network in a manner of broadcasting in the communication network. And after receiving the global descriptor sent by the main control unit, the robot in the network judges whether a map overlapping area exists with the main control unit or not through the word bag model. And the robot performs data transmission only when an overlapping area exists between the robot and the main control unit, and sends the corresponding similar key frame and the local map data to the main control unit. The mechanism can effectively avoid the transmission of a large amount of redundant data and improve the transmission efficiency.
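A sketch of this screening step, assuming each keyframe is summarized as a normalized bag-of-words histogram (the descriptor layout and the threshold are assumptions):

```python
import numpy as np

def bow_vector(word_ids, vocab_size):
    # normalized visual-word histogram of one keyframe
    v = np.bincount(word_ids, minlength=vocab_size).astype(float)
    n = np.linalg.norm(v)
    return v / n if n > 0 else v

def same_scene(desc_a, desc_b, threshold=0.8):
    # cosine similarity between two global descriptors
    return float(desc_a @ desc_b) >= threshold

# A robot sends its matching keyframe and local map to the master control
# unit only when same_scene(...) holds for some broadcast descriptor,
# avoiding transmission of large amounts of redundant map data.
```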
As shown in fig. 8, after receiving the similar key frame and the local map data sent by the robot, the main control unit first performs feature matching on the similar key frame and its own key frame, obtains an initial pose transformation matrix by using an epipolar constraint relationship, provides the initial pose transformation matrix to a normal distribution transformation algorithm as an initial value, iteratively calculates an accurate pose transformation matrix by using the normal distribution transformation algorithm, and fuses the local map sent by the robot and the global map maintained by the main control unit according to the final pose transformation matrix.
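The initial-value step can be sketched with OpenCV as below; the choice of ORB features and the RANSAC settings are assumptions, not specified by the patent.

```python
import numpy as np
import cv2

def initial_pose(img_a, img_b, K):
    # feature matching between the received keyframe and a local keyframe
    orb = cv2.ORB_create()
    kp_a, des_a = orb.detectAndCompute(img_a, None)
    kp_b, des_b = orb.detectAndCompute(img_b, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des_a, des_b)
    pts_a = np.float32([kp_a[m.queryIdx].pt for m in matches])
    pts_b = np.float32([kp_b[m.trainIdx].pt for m in matches])
    # epipolar constraint: essential matrix, then relative rotation/translation
    E, _ = cv2.findEssentialMat(pts_a, pts_b, K, method=cv2.RANSAC)
    _, R, t, _ = cv2.recoverPose(E, pts_a, pts_b, K)
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t.ravel()
    return T   # handed to NDT as the initial value of the pose transform
```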
The Normal Distribution Transformation (NDT) algorithm performs registration using the probability density function of the point distribution and determines the best match between point clouds with an optimization algorithm. The specific steps are as follows (a short code sketch follows Step6):
and Step1, meshing, namely dividing the global point cloud map maintained by the main control unit into three-dimensional cells with uniform size.
Step2: calculating the multi-dimensional normal distribution parameters; the mean vector $q$ and covariance matrix $\Sigma$ of the spatial points in each cell are calculated according to formulas (3) and (4):

$$q=\frac{1}{n}\sum_{k=1}^{n}x_k\quad(3)$$

$$\Sigma=\frac{1}{n}\sum_{k=1}^{n}\left(x_k-q\right)\left(x_k-q\right)^{T}\quad(4)$$

where $n$ is the total number of points in each cell and $x_k$ is a point in the cell.

Step3: coordinate transformation; let the pose transformation matrix be $T$, with its initial value provided by the epipolar constraint relationship:

$$T=\begin{bmatrix}R&t\\0&1\end{bmatrix}$$

where $R$ is a $3\times 3$ rotation matrix and $t$ is a $3\times 1$ translation vector. Each point in the local point cloud map is transformed into the global map using the pose transformation matrix $T$, giving transformed points $y_i\,(i=1,\dots,m)$, where $m$ is the total number of points in the local point cloud map.

Step4: calculating the probability density; based on the normal distribution parameters of the cell containing the transformed point $y_i$, the probability density of each transformed point is calculated according to formula (5):

$$p(y_i)=\frac{1}{(2\pi)^{3/2}\sqrt{\det\Sigma}}\exp\left(-\frac{\left(y_i-q\right)^{T}\Sigma^{-1}\left(y_i-q\right)}{2}\right)\quad(5)$$

where $\det\Sigma$ denotes the determinant of the covariance matrix $\Sigma$.

Step5: creating an objective function; the probability densities of all points are combined by formula (6) into an objective function for evaluating the pose transformation matrix $T$:

$$E(T)=-\sum_{i=1}^{m}p\left(y_i\right)\quad(6)$$
step6: and optimizing the objective function, and solving the minimum value of the objective function by using a Newton optimization method so as to obtain the optimal pose transformation matrix T.
Global optimization is then performed based on a pose graph optimization algorithm, yielding a higher-precision globally consistent map and the pose constraint relationships between the robots.
Finally, for the problem of how multiple robots build a map collaboratively while exploring an unknown space, a Rapidly-exploring Random Tree (RRT) algorithm is used to search for boundary points between the mapped region and the unknown region. The RRT algorithm comprises the following steps (a code sketch follows Step6):
step1: calling an Init function to initialize the tree T;
step2: calling the Sample function to randomly sample a point $x_{rand}$ in the state space $X$;
step3: calling the Nearest function to find the point $x_{near}$ on the current tree $T$ that is closest to $x_{rand}$;
step4: calling the Steer function with $x_{near}$ as the starting point and $\varepsilon$ as the step size, iteratively checking whether the points between $x_{near}$ and $x_{rand}$ are collision-free; if a collision occurs, stopping the iteration, at which time the farthest passable point from $x_{near}$ becomes the new node $x_{new}$;
step5: adding the new node $x_{new}$, together with the new edge between $x_{near}$ and $x_{new}$, to the tree $T$, completing the expansion of the tree and entering the next iteration;
step6: ending the iteration and generating a new tree $T$.
As shown in fig. 9, the invention balances the efficiency and completeness of the boundary-point search by running both a global boundary detector and local boundary detectors. Each robot in the network maintains a local detector that randomly probes the surrounding environment with the RRT algorithm; after a boundary point is detected, the original tree is deleted and expansion restarts from the current position, which speeds up boundary-point detection. The global boundary detector is maintained by the master control unit; unlike the local detectors, the root node of its tree is never updated and only one tree is generated over the whole task, so the global detector can find boundary points the local detectors miss and supply more valuable boundary points when a local detector's RRT falls into a growing "trap". The master control unit then screens the large number of boundary points produced by the global and local detectors with a mean shift clustering algorithm to obtain the required task target points.
The specific steps of the mean shift clustering algorithm are as follows:
Step1: randomly selecting one of the unmarked boundary points as the starting center point $x_{center}$.
Step2: finding and marking all boundary points within the spherical region of radius $r$ centered at $x_{center}$.
Step3: computing the vectors from $x_{center}$ to each boundary point in the spherical region and summing them to obtain the drift vector $x_{shift}$.
Step4: moving the center point according to $x_{center}=x_{center}+x_{shift}$, i.e., $x_{center}$ moves along the direction of $x_{shift}$ by a distance of $\|x_{shift}\|$.
Step5: repeating Step2 to Step4 until $\|x_{shift}\|$ is sufficiently small, indicating that the iteration has converged; recording the current center point $x_{center}$; all boundary points encountered during the iteration are marked.
Step6: if, at convergence, the distance between the current center point $x_{center}$ and an existing center point is smaller than a threshold, retaining whichever center point has more boundary points within its spherical region; otherwise, taking $x_{center}$ as a new center point.
Step7: repeating the previous steps until all points are marked as visited.
After mean shift clustering, the original huge number of boundary points are clustered to some central points, the central points are reserved, and the rest boundary points are deleted, wherein the central points are the task target points of the robot.
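A simplified sketch of this clustering pass follows; the radius, tolerance and merge threshold are assumed, the drift uses the mean of the Step3 vectors, and ties between nearby centers are resolved by simple distance rather than by point counts.

```python
import numpy as np

def mean_shift_centers(points, r=2.0, tol=1e-3, merge_dist=1.0):
    pts = np.asarray(points, dtype=float)
    marked = np.zeros(len(pts), dtype=bool)
    centers = []
    while not marked.all():
        center = pts[np.flatnonzero(~marked)[0]].copy()   # Step1: unmarked start
        while True:
            in_ball = np.linalg.norm(pts - center, axis=1) <= r
            marked |= in_ball                             # Step2: mark in-ball points
            if not in_ball.any():
                break
            shift = (pts[in_ball] - center).mean(axis=0)  # Step3: drift vector
            center += shift                               # Step4: move the center
            if np.linalg.norm(shift) < tol:               # Step5: converged
                break
        # Step6: merge with a nearby existing center, else keep as a new one
        if all(np.linalg.norm(center - c) >= merge_dist for c in centers):
            centers.append(center)
    return centers   # the retained centers are the task target points
```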
After the task target points are obtained, the master control unit assigns them to the robots in the shared network using the auction mechanism shown in fig. 10. The auction algorithm simulates a market auction: the master control unit acts as the auctioneer, and the idle robots in the information sharing network act as bidders; the auctioneer selects the bidder with the highest return as the winner so as to maximize the system's profit. Taking the exploration path, field-of-view overlap and isolated islands as cost criteria, task nodes are allocated by measuring each robot's path length to the task node, its repeated-coverage rate, and the number of islands the assignment would create, realizing optimal allocation of the multi-robot collaborative mapping task.
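A minimal sketch of the award step; the cost table combining path length, repeated-coverage rate and island count is assumed to be precomputed with weights of the user's choosing.

```python
def auction_assign(idle_robots, targets, cost):
    # cost[(robot, target)] combines path length to the target, repeated
    # coverage rate, and the number of islands the assignment would create
    assignment = {}
    for target in targets:
        winner = min(idle_robots, key=lambda robot: cost[(robot, target)])
        assignment[target] = winner   # the auctioneer awards the best bid
    return assignment

# usage sketch with assumed weights w1..w3 and hypothetical helper metrics:
# cost = {(r, t): w1*path_len(r, t) + w2*overlap(r, t) + w3*islands(r, t)
#         for r in idle_robots for t in targets}
```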
The invention also discloses a multi-machine collaborative fusion positioning and mapping device for unknown space exploration, which comprises a memory, a processor and a multi-machine collaborative fusion positioning and mapping method program for unknown space exploration, wherein the multi-machine collaborative fusion positioning and mapping method program for unknown space exploration is stored in the memory and can run on the processor, and the steps of the multi-machine collaborative fusion positioning and mapping method for unknown space exploration are realized when the processor executes the multi-machine collaborative fusion positioning and mapping method program for unknown space exploration.
The device comprises at least one processor and a memory. The processor is an integrated circuit chip having signal processing capability. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in the processor or by instructions in the form of software. The processor may be a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components. The various methods, steps and logic blocks disclosed in the embodiments of the present invention may be implemented or performed. A general purpose processor may be a microprocessor, or the processor may be any conventional processor. The software module may be located in RAM, flash memory, ROM, PROM, EPROM, registers, or other storage media well known in the art. The storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the method in combination with its hardware.
It will be appreciated that the memory in embodiments of the invention may be volatile memory or nonvolatile memory, or may include both. The nonvolatile memory may be Read-Only Memory (ROM), Programmable ROM (PROM), Erasable PROM (EPROM), Electrically Erasable PROM (EEPROM), or flash memory. The volatile memory may be Random Access Memory (RAM), which acts as an external cache. By way of illustration and not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), and Direct Rambus RAM (DRRAM). The memory of the systems and methods described in connection with the embodiments of the invention is intended to comprise, without being limited to, these and any other suitable types of memory.
The invention also discloses a computer readable storage medium, wherein the computer readable storage medium is stored with a multi-machine collaborative fusion positioning and mapping method program for unknown space exploration, and the steps of the multi-machine collaborative fusion positioning and mapping method for unknown space exploration are realized when the multi-machine collaborative fusion positioning and mapping method program for unknown space exploration is executed by a processor.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
Other components and operations of the multi-machine collaborative fusion positioning and mapping method for unknown space exploration according to the embodiment of the present invention are known to those skilled in the art and will not be described in detail herein.
In the description herein, references to the description of the terms "embodiment," "example," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
While embodiments of the invention have been shown and described, it will be understood by those of ordinary skill in the art that: various changes, modifications, substitutions and alterations can be made to the embodiments without departing from the principles and spirit of the invention, the scope of which is defined by the claims and their equivalents.

Claims (7)

1. An unknown space exploration-oriented multi-machine collaborative fusion positioning and mapping method is characterized in that: the method comprises the following steps:
designing an inertial odometer, a laser-inertial odometer and a vision-inertial odometer by a lightweight multi-sensor fusion odometry method to realize the fusion of multi-sensor information; the inertial odometer, the laser-inertial odometer and the vision-inertial odometer are respectively used for estimating pose information of the robot;
designing a degradation identification and fusion decision strategy algorithm based on observability analysis, and judging whether the laser-inertial odometer and the vision-inertial odometer are degraded based on the observability analysis; if degradation exists, performing degradation processing on the pose estimation information acquired by the laser-inertial odometer and the vision-inertial odometer, and fusing the degradation-processed pose estimation information with the pose estimation information of the inertial odometer according to a fusion decision to obtain the total pose estimation output; if no degradation exists, fusing the pose estimation information acquired by the inertial odometer, the laser-inertial odometer and the vision-inertial odometer according to a fusion decision to obtain the total pose estimation output;
the multi-machine cooperation mapping algorithm is adopted, and the construction of a global consistent map and the distribution of a cooperation mapping task are completed among all robots through an information sharing network;
the fusion method of the multi-sensor information comprises the following steps:
the inertial odometer forms a dead reckoning system from an inertial measurement unit and an airspeed meter, and is used for providing a high-frequency initial pose estimate for the robot; the position drift of the inertial odometer in the vertical direction is eliminated through height constraints generated by a barometer and a laser ranging module;
the laser-inertial odometer receives the initial pose estimation of the inertial odometer and the observations of the solid-state laser radar, and provides a corrected pose for the inertial odometer by constructing an iterated error-state Kalman filter estimator;
the vision-inertial odometer receives the initial pose estimation of the inertial odometer, the image data collected by a vision sensor, and the point cloud data, and constructs a local sliding-window optimization estimator according to formula (1):
$$\hat{\mathcal{X}}=\arg\min_{\mathcal{X}}\left\{\|r_p\|_{\Sigma_p}^{2}+\sum_{k\in\mathcal{K}}\left(\|r_{L,k}\|_{\Sigma_L}^{2}+\|r_{V,k}\|_{\Sigma_V}^{2}+\|r_{I,k}\|_{\Sigma_I}^{2}\right)\right\} \qquad (1)$$

wherein $\hat{\mathcal{X}}$ denotes the optimal estimate of the state set $\mathcal{X}$; $\mathcal{K}$ denotes the set of all keyframes within the sliding window at time $t$; $r_{L,k}$, $r_{V,k}$ and $r_{I,k}$ respectively denote the relative constraints of the laser-inertial odometer, the vision-inertial odometer and the inertial odometer between the $k$-th and $(k+1)$-th times; $r_p$ denotes the prior constraint on the first state quantity in the sliding window; and $\Sigma_L$, $\Sigma_V$, $\Sigma_I$ and $\Sigma_p$ respectively denote the covariance matrices of the corresponding constraints.
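A minimal numerical sketch of how the cost in formula (1) could be evaluated, assuming the residual vectors and covariance matrices for each constraint are already available; the function names and data layout are illustrative assumptions, not the patented estimator:

```python
import numpy as np

def mahalanobis_sq(r, cov):
    # squared Mahalanobis norm ||r||^2_Sigma = r^T Sigma^{-1} r
    r = np.asarray(r, dtype=float)
    return float(r @ np.linalg.solve(cov, r))

def sliding_window_cost(r_prior, cov_prior, window_residuals, covs):
    # cost of formula (1): prior constraint plus the laser (L), visual (V)
    # and inertial (I) relative constraints over all keyframes in the window
    cost = mahalanobis_sq(r_prior, cov_prior)
    for r_L, r_V, r_I in window_residuals:
        cost += mahalanobis_sq(r_L, covs["L"])
        cost += mahalanobis_sq(r_V, covs["V"])
        cost += mahalanobis_sq(r_I, covs["I"])
    return cost
```

In an actual estimator this cost would be minimized over the window states by a nonlinear least-squares solver; here it only illustrates how the three constraint families and the prior enter the objective.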
2. The unknown space exploration-oriented multi-machine collaborative fusion positioning and mapping method as claimed in claim 1, wherein: the method for judging whether the laser-inertial odometer and the vision-inertial odometer are degraded based on the observability analysis comprises the following steps:
constructing the information matrices of the laser-inertial odometer and the vision-inertial odometer from the laser constraints and the vision constraints;
carrying out eigenvalue decomposition on the information matrix, and characterizing the degradation degree of the odometer through the magnitude of the eigenvalues:
if an eigenvalue is smaller than a set threshold, judging that the odometer is degraded, and taking the eigenvector corresponding to that eigenvalue as a degradation direction vector; if the eigenvalue is greater than or equal to the set threshold, judging that the odometer is not degraded.
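A minimal sketch of this eigenvalue-based degradation test, assuming the information matrix has already been assembled from the laser or visual constraints; the threshold value and the return layout are illustrative assumptions:

```python
import numpy as np

def detect_degradation(info_matrix, eig_threshold):
    # eigen-decompose the (symmetric) information matrix built from the
    # laser or visual constraints; eigenvalues below the threshold indicate
    # directions that the measurements do not constrain
    eigvals, eigvecs = np.linalg.eigh(np.asarray(info_matrix, dtype=float))
    degenerate = eigvals < eig_threshold
    # columns of `directions` are the degradation direction vectors
    directions = eigvecs[:, degenerate]
    return bool(degenerate.any()), degenerate, directions
```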
3. The unknown space exploration-oriented multi-machine collaborative fusion positioning and mapping method as claimed in claim 1, wherein: the degradation processing comprises the following steps:
determining, through the eigenvalues, the direction vectors of the degradation directions and the non-degradation directions of the nonlinear optimization solution of the pose estimation problem;
decomposing the nonlinear optimization solution of the pose estimation into the projection of the true solution in the non-degradation directions and the projection of the predicted value in the degradation directions, which is equivalent to adding constraints in a subspace of the pose estimation problem: no iterative optimization is performed in the degradation directions, and only the solution in the non-degradation directions is output;
superimposing the increment of the solution in the non-degradation directions onto the state quantity while discarding the component of the optimal solution in the degradation directions, to obtain the pose estimation information after degradation processing.
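A minimal sketch of this projection step; it resembles the well-known solution-remapping technique for degenerate optimization, and the variable names are assumptions:

```python
import numpy as np

def remap_increment(delta_x, eigvecs, degenerate):
    # keep only the component of the optimization increment that lies in the
    # non-degradation subspace; along the degradation directions the predicted
    # value is implicitly retained because its correction is discarded
    V_ok = eigvecs[:, ~degenerate]      # eigenvectors of well-constrained directions
    projector = V_ok @ V_ok.T           # projector onto that subspace
    return projector @ np.asarray(delta_x, dtype=float)

# the projected increment is then superimposed onto the state quantity:
# x = x_predicted + remap_increment(delta_x, eigvecs, degenerate)
```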
4. The unknown space exploration-oriented multi-machine collaborative fusion positioning and mapping method as claimed in claim 3, wherein: obtaining the total pose estimation output comprises the following steps:
inputting the degradation state information of the laser-inertial subsystem and the vision-inertial subsystem into a fusion decision table, and obtaining the decision output of the fusion decision table based on the degradation state information (an illustrative sketch of such a table follows this claim);
obtaining the total pose estimation output by decision from the decision output of the fusion decision table and the observation data of each odometer, and inputting the total pose estimation output into a back-end optimization system;
the back-end optimization system constructs the relationships between the state quantities at all times through a pose graph and models a batch recursive smoothing problem, as shown in formula (2):
$$\hat{\mathcal{X}}=\arg\min_{\mathcal{X}}\sum_{(i,j)\in\mathcal{K}}\left(\|r_{O,ij}\|_{\Sigma_O}^{2}+\|r_{C,ij}\|_{\Sigma_C}^{2}\right) \qquad (2)$$

wherein $\hat{\mathcal{X}}$ denotes the optimal estimate of the state set $\mathcal{X}$; $\mathcal{K}$ denotes the set of all keyframes up to time $t$; $r_{O,ij}$ and $r_{C,ij}$ respectively denote the odometer and loop-closure relative constraints between the $i$-th and $j$-th times; and $\Sigma_O$ and $\Sigma_C$ respectively denote the covariance matrices of the corresponding constraints.
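A minimal sketch of a fusion decision table keyed by the degradation states of the two subsystems; the claim does not disclose the table's actual contents, so the entries below are purely illustrative assumptions:

```python
# hypothetical fusion decision table keyed by the degradation states of the
# laser-inertial (LIO) and vision-inertial (VIO) subsystems; the entries are
# illustrative only, since the claim does not disclose the actual table
FUSION_DECISIONS = {
    (False, False): "fuse LIO + VIO + inertial odometer",
    (False, True):  "fuse LIO + inertial odometer; VIO output degradation-processed",
    (True,  False): "fuse VIO + inertial odometer; LIO output degradation-processed",
    (True,  True):  "fall back to the inertial odometer; both outputs degradation-processed",
}

def fusion_decision(lio_degraded: bool, vio_degraded: bool) -> str:
    return FUSION_DECISIONS[(lio_degraded, vio_degraded)]
```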
5. The unknown space exploration-oriented multi-machine collaborative fusion positioning and mapping method as claimed in claim 1, wherein: completing the construction of a globally consistent map and the allocation of collaborative mapping tasks among all robots through the information sharing network comprises the following steps:
based on a clustering algorithm constrained by communication quality and cluster field of view, maximizing the coupling performance composed of the robots' communication quality and the cluster operation view set under the specified constraints, obtaining a set of optimal solutions for dividing the information sharing areas of the unmanned aerial vehicle cluster, and selecting a network main control unit within each information sharing area on the principle of optimal computing resources;
performing information association by searching for overlapping areas between robot maps, determining the coordinate transformation relationships between robots, and constructing a globally consistent map based on these coordinate transformation relationships; the overlapping areas are detected with a bag-of-words model over keyframes, which uses the similarity between two images to judge whether they come from the same scene: if two robots have passed through the same scene, it is highly likely that their maps overlap and an overlapping area exists, as sketched below;
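A minimal sketch of the bag-of-words overlap test, assuming each keyframe has already been converted to a visual-word histogram; the cosine-similarity scoring and the threshold are assumptions, since the claim does not fix a particular similarity measure:

```python
import numpy as np

def bow_similarity(hist_a, hist_b):
    # cosine similarity between two bag-of-words histograms; a high score
    # suggests the two keyframes observe the same scene
    a = np.asarray(hist_a, dtype=float)
    b = np.asarray(hist_b, dtype=float)
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom > 0.0 else 0.0

def maybe_overlapping(hist_a, hist_b, threshold=0.7):
    # threshold is an assumed tuning parameter
    return bow_similarity(hist_a, hist_b) >= threshold
```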
after receiving the similar keyframes and local map data sent by a robot, the main control unit first performs feature matching between the received keyframes and its own keyframes, obtains an initial pose transformation matrix using the epipolar constraint relationship, provides this matrix as the initial value to the normal distribution transformation algorithm, then iteratively computes an accurate pose transformation matrix with the normal distribution transformation algorithm, and fuses the local map sent by the robot with the global map maintained by the main control unit according to the final pose transformation matrix;
the normal distribution transformation algorithm performs registration using the probability density functions of the point distributions and determines the optimal match between point clouds through an optimization algorithm; it comprises the following steps (a compact sketch follows Step 6 below):
Step 1, gridding: dividing the global point cloud map maintained by the main control unit into three-dimensional cells of uniform size;
Step 2, calculating the multidimensional normal distribution parameters: solving the mean vector $\mu$ and the covariance matrix $\Sigma$ of the spatial points in each cell according to formulas (3) and (4) respectively:

$$\mu=\frac{1}{m}\sum_{i=1}^{m}x_i \qquad (3)$$

$$\Sigma=\frac{1}{m-1}\sum_{i=1}^{m}(x_i-\mu)(x_i-\mu)^{T} \qquad (4)$$

wherein $m$ is the total number of points in each cell and $x_i$ denotes a point in the cell;
Step 3, coordinate transformation: assuming a pose transformation matrix $T=\begin{bmatrix}R & t\\ \mathbf{0} & 1\end{bmatrix}$ whose initial value is provided by the epipolar constraint relationship, wherein $R$ is the rotation matrix of $T$ and $t$ is the translation vector of $T$; using the pose transformation matrix $T$, each point of the local point cloud map is transformed into the global map, the transformed points being denoted $x_i'$ for $i=1,\dots,n$, where $n$ is the total number of points in the local point cloud map;
Step 4, calculating the probability density: for each transformed point $x_i'$, the probability density is calculated from the normal distribution parameters of the cell into which the point falls, according to formula (5):

$$p(x_i')=\frac{1}{(2\pi)^{3/2}\sqrt{|\Sigma|}}\exp\left(-\frac{(x_i'-\mu)^{T}\Sigma^{-1}(x_i'-\mu)}{2}\right) \qquad (5)$$

wherein $|\Sigma|$ denotes the determinant of the covariance matrix $\Sigma$;
Step 5, creating an objective function: from the probability densities of all points, an objective function for evaluating the pose transformation matrix $T$ is constructed using formula (6):

$$s(T)=-\sum_{i=1}^{n}p(x_i') \qquad (6)$$
Step 6, optimizing the objective function: the minimum of the objective function is solved with the Newton optimization method, thereby obtaining the optimal pose transformation matrix $T$;
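A compact sketch of Steps 1-6 above, assuming 3-D numpy point arrays; a yaw-only transform parameterization and a derivative-free optimizer stand in for the full pose and the Newton method purely to keep the sketch short, so this is an illustration rather than the claimed procedure:

```python
import numpy as np
from scipy.optimize import minimize

def build_ndt_cells(global_pts, cell):
    # Steps 1-2: voxelize the global map and fit a Gaussian per cell
    buckets = {}
    for p in np.asarray(global_pts, dtype=float):
        buckets.setdefault(tuple((p // cell).astype(int)), []).append(p)
    cells = {}
    for key, pts in buckets.items():
        pts = np.asarray(pts)
        if len(pts) >= 5:                             # enough points for a stable fit
            mu = pts.mean(axis=0)
            cov = np.cov(pts.T) + 1e-6 * np.eye(3)    # regularized, as in formula (4)
            cells[key] = (mu, np.linalg.inv(cov), np.linalg.det(cov))
    return cells

def ndt_score(x, local_pts, cells, cell):
    # Steps 3-5: transform the local points and sum negative densities, formula (6);
    # x = [tx, ty, tz, yaw] is a reduced parameterization used only for brevity
    c, s = np.cos(x[3]), np.sin(x[3])
    R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    score = 0.0
    for p in local_pts @ R.T + x[:3]:
        key = tuple((p // cell).astype(int))
        if key in cells:
            mu, cov_inv, det = cells[key]
            d = p - mu
            score -= np.exp(-0.5 * d @ cov_inv @ d) / np.sqrt((2 * np.pi) ** 3 * det)
    return score

def ndt_register(local_pts, cells, cell, x0):
    # Step 6: the claim uses the Newton method; a derivative-free optimizer is
    # substituted here purely to keep the sketch self-contained
    return minimize(ndt_score, x0, args=(local_pts, cells, cell),
                    method="Nelder-Mead").x
```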
boundary points between the mapped area and the unknown area are found based on a rapidly-exploring random tree (RRT) algorithm, which comprises the following steps (a minimal sketch follows Step 6 below):
Step 1, calling the Init function to initialize a tree $T$;
Step 2, calling the RandomSample function to randomly sample a point $x_{rand}$ in the state space $X$;
Step 3, calling the Nearest function to find the node $x_{near}$ of the tree $T$ nearest to $x_{rand}$;
Step 4, calling the Steer function with $x_{near}$ as the starting point and $\varepsilon$ as the step size, iteratively checking whether the points between $x_{near}$ and $x_{rand}$ are collision-free; if not, the iteration stops, and the farthest passable point from $x_{near}$ becomes the new node $x_{new}$;
Step 5, adding the new node $x_{new}$ and the new edge between $x_{near}$ and $x_{new}$ to the tree $T$, completing the expansion of the tree and entering the next iteration;
Step 6, ending the iteration and outputting the new tree $T$;
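A minimal 2-D sketch of the six RRT steps above, assuming an axis-aligned sampling region and a caller-supplied collision test; the function and variable names follow the claim's Init/RandomSample/Nearest/Steer terminology, but the data layout is an assumption:

```python
import numpy as np

def rrt_expand(x_init, n_iters, step, bounds, collision_free):
    # nodes: list of points; edges: child index -> parent index
    nodes, edges = [np.asarray(x_init, dtype=float)], {0: None}      # Step 1: Init
    low, high = bounds
    for _ in range(n_iters):
        x_rand = np.random.uniform(low, high)                        # Step 2: RandomSample
        near = min(range(len(nodes)),
                   key=lambda i: np.linalg.norm(nodes[i] - x_rand))  # Step 3: Nearest
        x_near = nodes[near]
        # Step 4: Steer from x_near toward x_rand in increments of `step`,
        # keeping the farthest collision-free point as the new node
        gap = np.linalg.norm(x_rand - x_near)
        if gap == 0.0:
            continue
        direction = (x_rand - x_near) / gap
        x_new, d = None, step
        while d <= gap and collision_free(x_near + d * direction):
            x_new, d = x_near + d * direction, d + step
        if x_new is not None:                                        # Step 5: add node and edge
            nodes.append(x_new)
            edges[len(nodes) - 1] = near
    return nodes, edges                                              # Step 6: the expanded tree

# e.g. expansion over a 10 m x 10 m obstacle-free area:
# nodes, edges = rrt_expand([0.0, 0.0], 500, 0.2, ([0, 0], [10, 10]), lambda p: True)
```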
the main control unit screens the large number of boundary points obtained by the global and local detectors using a mean-shift clustering algorithm to obtain the required task target points; the mean-shift clustering algorithm comprises the following steps (a minimal sketch follows Step 7 below):
Step 1, randomly selecting one point from the unmarked boundary points as the starting center point $c$;
Step 2, marking all boundary points within the spherical region of radius $r$ centered at $c$;
Step 3, computing the vectors from $c$ to each boundary point within the spherical region, and adding these vectors to obtain the drift vector $M$;
Step 4, moving the center point according to $c=c+M$, i.e., $c$ moves along the direction of $M$ by a distance of $\|M\|$;
Step 5, repeating Step 2 to Step 4 until $\|M\|$ is very small, at which point the iteration has converged; recording the current center point $c$; all boundary points encountered during the iteration are marked;
Step 6, if the distance between the converged current center point $c$ and an existing center point is smaller than a threshold, retaining the center point whose spherical region contains more boundary points; otherwise, taking $c$ as a new center point;
Step 7, repeating the previous steps until all points are marked as visited;
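A minimal sketch of the mean-shift screening above; note that the claim sums the offset vectors, while this sketch normalizes by the point count (the classical, convergent mean-shift update), and the merge threshold is an assumed parameter:

```python
import numpy as np

def mean_shift_targets(points, radius, eps=1e-3, merge_dist=None):
    # screens boundary points into cluster centers (candidate task target points)
    pts = np.asarray(points, dtype=float)
    merge_dist = radius / 2 if merge_dist is None else merge_dist  # assumed threshold
    unmarked = np.ones(len(pts), dtype=bool)
    centers, counts = [], []
    while unmarked.any():
        c = pts[np.flatnonzero(unmarked)[0]].copy()        # Step 1: pick a start point
        while True:
            in_ball = np.linalg.norm(pts - c, axis=1) <= radius
            if not in_ball.any():
                break
            unmarked[in_ball] = False                      # Step 2: mark in-ball points
            # Steps 3-4: the claim sums the offset vectors; dividing by the
            # count gives the classical, convergent mean-shift update
            M = (pts[in_ball] - c).mean(axis=0)
            c = c + M
            if np.linalg.norm(M) < eps:                    # Step 5: converged
                break
        n = int(in_ball.sum())
        # Step 6: merge with a nearby existing center, keeping the denser one
        for i, (ci, ni) in enumerate(zip(centers, counts)):
            if np.linalg.norm(ci - c) < merge_dist:
                if n > ni:
                    centers[i], counts[i] = c, n
                break
        else:
            centers.append(c)
            counts.append(n)
    return centers                                         # Step 7: all points marked
```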
the main control unit assigns the task target points to the robots in the shared network using an auction mechanism, as sketched below.
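The claim does not detail the auction mechanism, so the following is a minimal sketch of a sequential single-item auction with an assumed bid function (negative travel distance):

```python
import numpy as np

def auction_assign(targets, robot_positions):
    # sequential single-item auction: each target point is awarded to the
    # robot with the best bid; the bid (negative travel distance) is an
    # assumed utility, since the claim does not specify the bid function
    assignment = {}
    positions = {r: np.asarray(p, dtype=float) for r, p in robot_positions.items()}
    for t_id, target in enumerate(targets):
        target = np.asarray(target, dtype=float)
        bids = {r: -np.linalg.norm(target - p) for r, p in positions.items()}
        winner = max(bids, key=bids.get)
        assignment[t_id] = winner
        positions[winner] = target   # later bids reflect the winner's new task
    return assignment
```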
6. The unknown space exploration-oriented multi-machine collaborative fusion positioning and mapping device is characterized by comprising a memory, a processor and an unknown space exploration-oriented multi-machine collaborative fusion positioning and mapping method program which is stored in the memory and can run on the processor, wherein the processor realizes the steps of the unknown space exploration-oriented multi-machine collaborative fusion positioning and mapping method recited in any one of claims 1 to 5 when executing the unknown space exploration-oriented multi-machine collaborative fusion positioning and mapping method program.
7. A computer-readable storage medium, wherein the computer-readable storage medium stores a multi-machine collaborative fusion positioning and mapping method program for unknown space exploration, and when the multi-machine collaborative fusion positioning and mapping method program for unknown space exploration is executed by a processor, the steps of the multi-machine collaborative fusion positioning and mapping method for unknown space exploration according to any one of claims 1 to 5 are implemented.
CN202210622880.0A 2022-06-02 2022-06-02 Multi-machine collaborative fusion positioning and mapping method oriented to unknown space exploration Active CN114964212B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210622880.0A CN114964212B (en) 2022-06-02 2022-06-02 Multi-machine collaborative fusion positioning and mapping method oriented to unknown space exploration

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210622880.0A CN114964212B (en) 2022-06-02 2022-06-02 Multi-machine collaborative fusion positioning and mapping method oriented to unknown space exploration

Publications (2)

Publication Number Publication Date
CN114964212A CN114964212A (en) 2022-08-30
CN114964212B true CN114964212B (en) 2023-04-18

Family

ID=82959466

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210622880.0A Active CN114964212B (en) 2022-06-02 2022-06-02 Multi-machine collaborative fusion positioning and mapping method oriented to unknown space exploration

Country Status (1)

Country Link
CN (1) CN114964212B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115597585A * 2022-12-16 2023-01-13 Qingdao Tongchan Intelligent Technology Co., Ltd. (CN) Robot positioning method, device, equipment and storage medium
CN116380935A (en) * 2023-06-02 2023-07-04 中南大学 High-speed railway box girder damage detection robot car and damage detection method
CN116704388B (en) * 2023-08-09 2023-11-03 南京航空航天大学 Multi-unmanned aerial vehicle cooperative target positioning method based on vision
CN117213470B (en) * 2023-11-07 2024-01-23 武汉大学 Multi-machine fragment map aggregation updating method and system

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3078935A1 (en) * 2015-04-10 2016-10-12 The European Atomic Energy Community (EURATOM), represented by the European Commission Method and device for real-time mapping and localization
CN109682373B (en) * 2018-12-28 2021-03-09 中国兵器工业计算机应用技术研究所 Perception system of unmanned platform
CN110726406A (en) * 2019-06-03 2020-01-24 北京建筑大学 Improved nonlinear optimization monocular inertial navigation SLAM method
CN111707261A (en) * 2020-04-10 2020-09-25 南京非空航空科技有限公司 High-speed sensing and positioning method for micro unmanned aerial vehicle
CN114022519A (en) * 2020-07-28 2022-02-08 清华大学 Coordinate system conversion method and device and multi-source fusion SLAM system comprising device
CN112304307A (en) * 2020-09-15 2021-02-02 浙江大华技术股份有限公司 Positioning method and device based on multi-sensor fusion and storage medium
CN114001733B (en) * 2021-10-28 2024-03-15 浙江大学 Map-based consistent efficient visual inertial positioning algorithm

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110261870A * 2019-04-15 2019-09-20 浙江工业大学 Synchronous positioning and mapping method for vision-inertia-laser fusion
CN113819905A * 2020-06-19 2021-12-21 北京图森未来科技有限公司 Multi-sensor fusion-based odometer method and device
CN112634451A * 2021-01-11 2021-04-09 福州大学 Outdoor large-scene three-dimensional mapping method integrating multiple sensors
CN113433553A * 2021-06-23 2021-09-24 哈尔滨工程大学 Precise navigation method for multi-source acoustic information fusion of underwater robot

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Raouf Rasoulzadeh et al. Implementation of a low-cost multi-IMU hardware by using a homogenous multi-sensor fusion. 2016 4th International Conference on Control, Instrumentation, and Automation (ICCIA), 2016, pp. 451-456. *
Wei Wenle; Jin Guodong; Tan Lining; Lu Libin; Chen Danqi. Real-time SLAM algorithm using an inertial measurement unit to determine keyframes. Journal of Computer Applications, 2020, (04), 233-239. *

Also Published As

Publication number Publication date
CN114964212A (en) 2022-08-30

Similar Documents

Publication Publication Date Title
CN114964212B (en) Multi-machine collaborative fusion positioning and mapping method oriented to unknown space exploration
Li et al. Multi-sensor fusion for navigation and mapping in autonomous vehicles: Accurate localization in urban environments
CN113168717B (en) Point cloud matching method and device, navigation method and equipment, positioning method and laser radar
CN110930495A (en) Multi-unmanned aerial vehicle cooperation-based ICP point cloud map fusion method, system, device and storage medium
CN112639502A (en) Robot pose estimation
CN114526745B (en) Drawing construction method and system for tightly coupled laser radar and inertial odometer
Cui et al. Efficient large-scale structure from motion by fusing auxiliary imaging information
CN110487286B (en) Robot pose judgment method based on point feature projection and laser point cloud fusion
US11880931B2 (en) High-definition city mapping
CN113674412B (en) Pose fusion optimization-based indoor map construction method, system and storage medium
Tong et al. Multi-UAV collaborative absolute vision positioning and navigation: A survey and discussion
CN115494533A (en) Vehicle positioning method, device, storage medium and positioning system
CN117218350A (en) SLAM implementation method and system based on solid-state radar
Fasiolo et al. Comparing LiDAR and IMU-based SLAM approaches for 3D robotic mapping
Jiang et al. 3D SLAM based on NDT matching and ground constraints for ground robots in complex environments
Shang et al. Topology-based UAV path planning for multi-view stereo 3D reconstruction of complex structures
CN114047766B (en) Mobile robot data acquisition system and method for long-term application of indoor and outdoor scenes
Zhong et al. Low‐cost solid‐state LiDAR/inertial‐based localization with prior map for autonomous systems in urban scenarios
Zhang et al. Range-Aided Drift-Free Cooperative Localization and Consistent Reconstruction of Multi-Ground Robots
Li et al. An SLAM Algorithm Based on Laser Radar and Vision Fusion with Loop Detection Optimization
Li et al. BA-LIOM: tightly coupled laser-inertial odometry and mapping with bundle adjustment
MAILKA et al. An efficient end-to-end EKF-SLAM architecture based on LiDAR, GNSS, and IMU data sensor fusion for autonomous ground vehicles
CN113850915A (en) Vehicle tracking method based on Autoware
Liu et al. Laser 3D tightly coupled mapping method based on visual information
Ahmed et al. An Extensive Analysis and Fine-Tuning of Gmapping’s Initialization Parameters.

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant