CN110929402A - Probabilistic terrain estimation method based on uncertain analysis - Google Patents

Probabilistic terrain estimation method based on uncertain analysis Download PDF

Info

Publication number
CN110929402A
CN110929402A CN201911155457.9A CN201911155457A CN110929402A CN 110929402 A CN110929402 A CN 110929402A CN 201911155457 A CN201911155457 A CN 201911155457A CN 110929402 A CN110929402 A CN 110929402A
Authority
CN
China
Prior art keywords
coordinate system
terrain
estimation
sensor
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911155457.9A
Other languages
Chinese (zh)
Inventor
白成超
郭继峰
郑红星
刘天航
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin Institute of Technology
Original Assignee
Harbin Institute of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Institute of Technology filed Critical Harbin Institute of Technology
Priority to CN201911155457.9A priority Critical patent/CN110929402A/en
Publication of CN110929402A publication Critical patent/CN110929402A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01DMEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D21/00Measuring or testing not otherwise provided for
    • G01D21/02Measuring two or more variables by means not covered by a single other subclass
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10Complex mathematical operations
    • G06F17/16Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization

Abstract

The invention relates to a probabilistic terrain estimation method based on uncertain analysis. The method comprises the steps of establishing a patrol coordinate system, and collecting ranging data, vibration and gyroscope data when a patrol platform runs to a measuring point P based on the established patrol coordinate system; adopting Gaussian probability distribution to carry out approximate processing on the estimation result; setting a new measurement update at a point i in the grid, and determining the influence of noise errors generated by the motion of the inspection platform on terrain fusion estimation; determining sensor position information and a terrain covariance estimation value based on the triaxial vibration touch sensing unit and the uniaxial gyroscope; and corresponding the measuring points to uncertain terrain estimation corresponding to each point in the terrain, and corresponding the measuring points p to uncertain terrain updating corresponding to each point in the terrain. The terrain estimation method provided by the invention has higher terrain precision in two environments, the estimation error is within 2cm, and the method has practical application value.

Description

Probabilistic terrain estimation method based on uncertain analysis
Technical Field
The invention relates to the technical field of terrain reconstruction, in particular to a probabilistic terrain estimation method based on uncertain analysis.
Background
The research aiming at the terrain perception problem can be analyzed from three aspects, namely sensor type selection, algorithm and application. The sensor is developed from early vision and laser radar mainly to multi-sensor fusion gradually; the algorithm gradually develops from previous triangular measurement and filtering optimization to machine learning and biological inspiration; the application gradually changes from the initial terrain estimation to the geographic environment modeling and the scientific attribute detection. The general trend is also more towards multi-sensor fusion terrain estimation that does not rely on external information. In 2007, Olson realizes high-precision interframe matching in the region near the patroller through adjustment by a light beam method based on a large number of pictures shot in the on-orbit and patrolling processes, realizes remote terrain reconstruction by using wide baseline binocular vision, and provides an implementation idea of Mars long-distance patrolling navigation. Li et al, based on beam-method adjustment, gave topological terrain estimates of the courage number near GusevCrater, and generated digital elevation Maps based on orthographic Maps (Ortho Maps). In 2011, a Toronto university Barfoot team applies the SLAM technology to an extraterrestrial celestial body inspection process, global consistent terrain accurate estimation is achieved through laser radar and odometer compensation, and the sparse feature method and the batch alignment algorithm effectively solve the robustness problems of feature correlation and measurement abnormal values. 2013, an MRPTA project initiated by the Canadian space agency realizes terrain construction based on minimum sensor configuration, provides local annular grid representation with a patrol instrument as the center, and performs fusion management on the ranging maps at different moments through a map manager, thereby obtaining globally optimized terrain representation. Then, Bajpai et al from Surrey university propose monocular-based extraterrestrial celestial body synchronous positioning and mapping technology (PM-SLAM), the innovation of the method is inspired by biological semantic feature detection, and a visual saliency model is proposed, namely, a method for generating mixed saliency features is given based on point descriptors firstly, and then the features are used for terrain state estimation, and the method can effectively improve the robustness of visual perception as shown by test results of a plurality of scenes. In 2017, Gonzalez et al from MIT propose a wheel-ground interactive slip detection method based on machine learning aiming at the slip problem possibly encountered in the process of patrolling extraterrestrial bodies and only relying on an internal sensing module, on one hand, the complexity of the system is not increased due to the fact that original internal sensing equipment is still used, on the other hand, the adaptability to conditions such as illumination is improved, and the detection effectiveness of a supervised and unsupervised learning method is contrastively analyzed in the actual verification process. Meanwhile, with the continuous development of deep learning and convolutional neural networks, more and more algorithms are applied to the patroller task. Carrio et al propose a fusion SLAM terrain estimation method based on a visual odometer, an IMU and a wheel-type odometer, and the innovation is that terrain reconstruction is performed through multi-sensor fusion, a non-system error caused by wheel-ground interaction is predicted based on an odometer error model in a Gaussian process, and estimation accuracy is improved. Shaukat et al also proposed a fusion strategy of vision and lidar, and verified the advantages of this mode in terms of distance, flexibility, accuracy, etc. through experiments. Recently, with the breakthrough achievement in the aspect of surface configuration estimation of extraterrestrial celestial bodies, the estimation of the terrain with multi-attribute state gradually goes to the next stage, and the method no longer only focuses on whether the environmental terrain is flat and has obstacles, but more characteristic attributes such as hardness and material are fused. A Dunconzon professor team from Harbin industry university provides a new idea of representing the pressure characteristic of the terrain by using equivalent stiffness and representing the shear characteristic by using a friction angle, and a digital elevation map containing physical characteristics is provided by proving that an interactive mechanical model of stress between a wheel and soil and a contact model of stress between the wheel and rocks have equivalence.
Although good precision is obtained based on the sensing mode and the terrain reconstruction method, the effect is obvious in practical application, but the uncertain extraterrestrial environment still needs to be considered, and how to still have the terrain perception capability when an accident (failure or partial failure) occurs, of course, many factors causing the accident are caused, and probably because the vibration frequency is too high in the soft landing process and the environment changes such as illumination, temperature and the like can also be caused, so that how to construct the terrain perception capability with higher robustness is the next difficult point to be broken through urgently in the existing terrain reconstruction capability. On the basis of the research, a vibration/gyroscope coupling elevation terrain construction method based on multipoint ranging is mainly discussed in combination with an extraterrestrial celestial body patrol environment, so that the robustness of a patrol device to environmental changes is improved, and support is provided for subsequent motion planning and 3D perception semantic field construction.
Disclosure of Invention
The invention provides a probabilistic terrain estimation method based on uncertain analysis, aiming at solving the challenging problem that the terrain environment cannot be accurately sensed under the condition that sensors such as vision and the like fail, and the invention provides the following technical scheme:
a probabilistic terrain estimation method based on uncertain analysis, which is based on a patrol platform and a sensor, comprises the following steps:
step 1: establishing a patrol coordinate system, and determining covariance matrixes of poses of the patrol platform at different moments;
step 2: based on the established patrol coordinate system, acquiring ranging data, vibration and gyroscope data when the patrol platform operates to a measuring point P;
and step 3: carrying out terrain fusion estimation according to the acquired patrol operation distance measurement data, vibration and gyroscope data at the measurement point P, and carrying out approximate processing on the estimation result by adopting Gaussian probability distribution;
and 4, step 4: setting a new measurement update at a point i in the grid, and determining the influence of noise errors generated by the motion of the inspection platform on terrain fusion estimation;
and 5: determining sensor position information and a terrain covariance estimation value based on the triaxial vibration touch sensing unit and the uniaxial gyroscope;
step 6: the uncertainty terrain estimation corresponding to each point in the terrain is carried out on the measuring point, the time from the k moment to the k +1 moment is obtained, and the measuring point p is updated corresponding to the uncertainty terrain of each point in the terrain;
and 7: and repeating the steps 2 to 6 to obtain the estimation update of the terrain.
Preferably, the step 1 specifically comprises:
step 1.1: establishing a patrol coordinate system, wherein the patrol coordinate system comprises an inertia coordinate system I, a terrain coordinate system T, a patrol platform body coordinate system R and a sensor coordinate system S;
the system comprises an inertial coordinate system I, a patrolling platform body coordinate system R, a sensor coordinate system S and a sensor body mass center, wherein the inertial coordinate system I is fixed in an inertial space, the patrolling platform body coordinate system R is fixed at the position of the mass center of the patrolling platform, and the sensor coordinate system S is fixedly connected with the mass center of the sensor body;
step 1.2: because the poses of the inspection platform at different moments have height uncertainty for the inertial coordinate system I, the covariance matrix of the pose at each moment is synchronously given:
Figure BDA0002284677640000031
wherein r isIRAnd phiIRThe coordinate system is a conversion coordinate between an inertial coordinate system and a coordinate system of the inspection device body;
determining the change of the three-dimensional attitude of the patrol platform through pitching, yawing and rolling when the patrol platform advances in unknown terrain, and determining the conversion between the patrol platform body coordinate system R and the inertial coordinate system I through the following formula:
Figure BDA0002284677640000032
wherein the content of the first and second substances,
Figure BDA0002284677640000033
an intermediate coordinate system is obtained for the rotation of the coordinate system R of the inspection platform body by a psi angle around the Z axis
Figure BDA0002284677640000034
The transformation from the middle coordinate system to the coordinate system R of the inspection platform body is realized, theta is a pitch angle,
Figure BDA0002284677640000035
is the roll angle.
Preferably, set psi(I→T)By
Figure BDA0002284677640000036
And the conversion between the terrain coordinate system T and the inspection platform body coordinate system R only has two degrees of freedom of pitching and rolling, so that the dimension reduction processing between coordinate conversion is realized.
Preferably, the step 3 specifically comprises:
step 3.1: coordinates for determining point P as a measurement point
Figure BDA0002284677640000037
The measuring point P is under the terrain coordinate system (x)p,yp) The height estimate at a point is
Figure BDA0002284677640000038
The estimation of the height is approximated by using gaussian probability distribution, the mean value of the height distribution is obtained by the conversion from the sensor coordinate system S to the terrain coordinate system T, and the mean value of the height distribution is represented by the following formula:
Figure BDA0002284677640000039
wherein H ═ 001],hpIs the average of the height distribution,
Figure BDA00022846776400000310
for the variance of the distribution, the measured values of the measuring points P in the sensor coordinate system S areSrSPSrSTFor the position of the sensor in the topographic coordinate system, phiSTA transformation matrix from a sensor coordinate system to a terrain coordinate system;
step 3.2: extracting the three-dimensional coordinate of the measuring point P in the height direction, determining the relation between the height estimation and conversion matrix and the sensor measuring value, solving a first derivative of the formula (3) to obtain a Jacobian matrix corresponding to the error, and expressing the Jacobian matrix measured by the sensor and the rotating Jacobian matrix of the sensor coordinate system through the following formulas:
Figure BDA0002284677640000041
Figure BDA0002284677640000042
wherein, JSMeasuring the Jacobian matrix for the sensor, JΦRotating the Jacobian matrix for the sensor coordinate system, wherein C (phi) is a mapping of a corresponding rotation matrix;
determining the variance according to equation (4) and equation (5)
Figure BDA0002284677640000043
Is determined by the following equation
Figure BDA0002284677640000044
Figure BDA0002284677640000045
Wherein phiISConverting a matrix from a sensor coordinate system to an inertial coordinate system;
step 3.3: noise error estimation based on sensor measurement is already obtained, corresponding altitude estimation exists for each measurement update, and the newly obtained altitude measurement estimation and the existing elevation topographic map are fused based on a fusion form of Kalman filtering.
Preferably, the step 4 specifically includes:
step 4.1: setting a new measurement update at a point i in the grid, determining the covariance of i, and expressing the covariance of i by the following formula:
Figure BDA0002284677640000046
Figure BDA0002284677640000047
wherein, Σ piIs the covariance of the point i,
Figure BDA0002284677640000048
and
Figure BDA0002284677640000049
are uncertainties in the horizontal direction, d is the side length of the grid,
Figure BDA00022846776400000410
is the altitude uncertainty;
step 4.2: determining the relative position relation between the k time and the k +1 time,
Figure BDA00022846776400000411
the representation in the terrain coordinate system at time k is shown as follows:
Figure BDA00022846776400000412
wherein the content of the first and second substances,
Figure BDA00022846776400000413
for the measurement point in the topographic coordinate system at time k +1,
Figure BDA00022846776400000414
is the distance relation between the front and the rear moments of the terrain coordinate system,
Figure BDA00022846776400000415
is a measuring point in a terrain coordinate system at the moment k;
determining a k +1 time terrain coordinate system, and expressing the k +1 time terrain coordinate system by the following formula:
Figure BDA00022846776400000416
step 4.3: aligning the Z axis of the inspection platform body coordinate system with the Z axis of the inertial coordinate system, obtaining that the processed attitude uncertainty is only related to the yaw angle, and giving covariance matrix representation of the inspection platform pose at the moment k again through dimension reduction processing:
Figure BDA0002284677640000051
wherein x isIRIn order to tour the platform covariance matrix,
Figure BDA0002284677640000052
the coordinate system of the inspection platform body after alignment is satisfied at the same time
Figure BDA0002284677640000053
Figure BDA0002284677640000054
Preferably, the step 5 specifically comprises:
step 5.1: based on the sensor position information, the output values of the three-axis vibration touch sensor are the amplitude and frequency values of the measuring point, and the output of the single-axis gyroscope is the angular acceleration around the Z axis and is recorded as the angular acceleration
Figure BDA0002284677640000055
In order to reduce the error caused by nonlinearity, the sensor fixing connection point is used as a coordinate origin, and the sensor position information is determined by the following formula:
Figure BDA0002284677640000056
where Δ v is the velocity difference, Δ t is the measurement time interval,
Figure BDA0002284677640000057
is an acceleration signal;
step 5.2: in the sampling process, there is uncertainty, where the gaussian noise vector is n ═ n (n)a,nβ)TDetermining a Jacobian matrix for the state and the noise by the following formula, wherein the Jacobian matrix for the state and the noise is the Jacobian matrix for the state and the noise:
Figure BDA0002284677640000058
Figure BDA0002284677640000059
while the translation of the patrol platform from time k to time k +1 is determined by translation and rotation, the covariance can be given by
Figure BDA00022846776400000510
Solving for the value of the first derivative only in the Z-axis direction
Figure BDA0002284677640000061
Is determined by
Figure BDA0002284677640000062
Figure BDA0002284677640000063
Wherein the content of the first and second substances,
Figure BDA0002284677640000064
the invention has the following beneficial effects:
aiming at the condition that the sensor is easy to fail in a special environment, the invention provides the probabilistic terrain estimation method considering uncertainty influence based on the ranging information and the vibration information, namely simultaneously considering the influence of uncertainty of the sensor and uncertainty of platform motion, so that the method still has reconstruction capability on terrain under the condition that the image information fails, and the value of a task is guaranteed to the maximum extent. Comprehensive verification is respectively carried out in a lunar surface simulation environment and an indoor Optitrack environment based on Unity3D/ROS, and the result shows that the terrain estimation method provided by the invention has higher terrain accuracy in the two environments, the estimation error is within 2 centimeters, and the method has practical application value.
Aiming at the requirement that the patrolling device still has terrain perception capability under the condition that image information is unavailable, the invention provides the vibration/gyroscope coupling terrain estimation method considering uncertainty influence, and considers the influence of sensor measurement uncertainty and platform motion uncertainty. Comprehensive tests are carried out on a lunar surface simulation environment, an indoor Optitrack auxiliary environment and an outdoor soil environment based on the Unity3D/ROS, the result can show that the proposed terrain estimation algorithm has better estimation accuracy in the simulation environment and the indoor environment, wherein the simulation environment accuracy is within 1cm, the estimation accuracy in the indoor environment is within 2cm, and the proposed sensor configuration and reconstruction algorithm has centimeter-level reconstruction capability.
Drawings
FIG. 1 is a coordinate system definition diagram;
FIG. 2 is a schematic diagram of the position relationship at different times;
FIG. 3 is a schematic diagram of a lunar surface simulation environment based on Unity 3D/ROS;
FIG. 4 is a diagram illustrating a result of terrain reconstruction;
FIG. 5 is a detail view of a terrain intersection;
FIG. 6 is a graph of comparative analysis of estimated values and actual values;
FIG. 7 is a result graph of a terrain estimation profile;
FIG. 8 is a schematic diagram of a point cloud result of a terrain estimation profile;
FIG. 9 is a schematic diagram of an indoor verification environment;
FIG. 10 is a schematic diagram of an experimental platform and a rigid body capturing construction;
FIG. 11 is a schematic diagram of an experiment at different times;
FIG. 12 is a diagram illustrating terrain reconstruction results;
FIG. 13 is a graph showing a comparative analysis of a topographic reconstruction result;
FIG. 14 is a graph of a comparison analysis of terrain estimates with real values;
FIG. 15 is a schematic cross-sectional view of the result of the terrain reconstruction;
fig. 16 is a schematic diagram of a terrain reconstruction three-dimensional point cloud.
Detailed Description
The present invention will be described in detail with reference to specific examples.
The first embodiment is as follows:
the invention provides a probabilistic terrain estimation method based on uncertain analysis, which is based on a patrol platform and a sensor, wherein the patrol platform comprises a patrol device body and the sensor inherent to the patrol platform, and comprises the following steps:
step 1: establishing a patrol coordinate system, wherein the coordinate system comprises an inertial coordinate system I, a terrain coordinate system T, a patrol platform body coordinate system R and a sensor coordinate system S, and determining covariance matrixes of poses of the patrol platform at different moments;
according to the illustration of fig. 1, the present invention defines four coordinate systems, namely an inertial coordinate system I, a terrain coordinate system T, a rover body coordinate system R, and a sensor coordinate system S. The inertial coordinate system is fixed in the inertial space, the coordinate system of the patrolling device body is fixed at the mass center position of the patrolling device, and the coordinate system of the sensor is fixedly connected with the mass center of the sensor body, which is shown in detail in figure 3-1. The transformation relationship between the coordinate systems is given by transformation matrices, namely three-dimensional translation r and three-dimensional rotation phi, wherein the sensor and the rover are both fixedly and adjustably mounted while performing the task, so that the mutual transformation relationship between the rover body coordinate system and the sensor coordinate system is known, defined herein as (r)RSRS) Similarly, the conversion between the inertial coordinate system and the coordinate system of the rover body is (r)IRIR). Because the poses of the patroller at different moments have height uncertainty relative to the inertial coordinate system, the covariance matrix of the pose at each moment is synchronously given:
Figure BDA0002284677640000071
the three-dimensional attitude change of the patrolling device can be described by pitching, yawing and rolling when the patrolling device advances on unknown terrain, and the conversion between the coordinate system of the patrolling device body and the inertial coordinate system can be described by the following formula on the assumption that the terrain is fixed relative to the inertial coordinate system in the current chapter:
Figure BDA0002284677640000072
wherein the content of the first and second substances,
Figure BDA0002284677640000073
describing the rotation of the inertial frame by an angle psi about the Z-axis to obtain an intermediate frame
Figure BDA0002284677640000074
Pitch and roll transformations from the intermediate coordinate system to the rover body system are described.
To simplify the subsequent derivation and calculation, the inertial frame Z axis is set to face vertically upwards, and the terrain frame Z axis is always parallel to the inertial frame Z axis, so that there is only one degree of freedom in the conversion between the two frames, i.e. yaw around the Z axis. Setting psi at the same time(I→T)By
Figure BDA0002284677640000081
The method has the advantages that only two degrees of freedom of pitching and rolling exist in the conversion of the terrain coordinate system and the patrolling device body coordinate system, and therefore dimension reduction processing between coordinate conversion is achieved.
Step 2: based on the established patrol coordinate system, acquiring ranging data, vibration and gyroscope data when the patrol platform operates to a measuring point P;
and step 3: carrying out terrain fusion estimation according to the acquired patrol operation distance measurement data, vibration and gyroscope data at the measurement point P, and carrying out approximate processing on the estimation result by adopting Gaussian probability distribution;
the step 3 specifically comprises the following steps: for each measurement, there are different numbers of sampling results according to the sensing ability and task requirement of the sensing unit, and for simplifying understanding, any point is taken for analysis, as shown in fig. 1, a point P is a measurement point and coordinates thereof are
Figure BDA0002284677640000082
I.e. in the topographic coordinate system (x)p,yp) The height estimate at a point is
Figure BDA0002284677640000083
For the height estimation, this chapter adopts the Gaussian probability distribution to itApproximation treatment, i.e.
Figure BDA0002284677640000084
Wherein h ispIs the average of the distribution of the number of pixels,
Figure BDA0002284677640000085
is the variance of the distribution. As can be seen from FIG. 1, the measured value of point P in the sensor coordinate system isSrSPBy conversion of the sensor coordinate system into the terrain coordinate system
Figure BDA0002284677640000086
Wherein H ═ 001]And extracting the three-dimensional coordinates of the point P in the height direction.SrSTFor the position of the sensor in the topographic coordinate system, phiSTFor the transformation matrix from the sensor coordinate system to the terrain coordinate system, it can be further known that the altitude estimation is directly related to the transformation matrix and the sensor measurement value, corresponding to the error source analyzed before, so that a first derivative is obtained from the above formula to obtain the Jacobian matrix of the corresponding error, that is
1) The sensor measures the Jacobian matrix:
Figure BDA0002284677640000087
2) sensor coordinate system rotation Jacobian matrix:
Figure BDA0002284677640000088
wherein C (Φ) is defined as the mapping of the corresponding rotation matrix, i.e., C: SO (3) →3×3Φ (r) C (Φ) r. The Jacobian matrix is substituted into the following formula to obtain the variance
Figure BDA0002284677640000089
Error transfer relationship of (1):
Figure BDA00022846776400000810
the first term is the error transfer caused by the sensor noise, which is determined by the nature of the sensor used, and the covariance value is solved by a noise model, which is described in the second chapter of sensor model analysis. The second term is the transfer of errors due to the transformation between coordinate systems, it being noted that the transformation consists of two parts, i.e. translation and rotation, defining the orientation of the terrain coordinate system when the previous coordinate system was defined, so that the effect of the translation is negligible here.
Thus far, noise error estimates based on sensor measurements have been obtained, and for each measurement update, there will be a corresponding elevation estimate, so the newly obtained elevation measurement estimate will need to be fused with the existing elevation topography map in the next step. Because the height measurement estimation has no complex dynamic relationship with each measurement point, the transfer equation of the state is more intuitive, which is only the update of the measurement value of a certain point (x, y), so that each point in the topographic map is greatly updated along with the measurement of the sensor, and conversely, if the measurement value is not updated, the state is kept unchanged. A form of fusion based on kalman filtering is presented here.
First, a simplified discrete kalman filter equation is given:
and (3) time updating:
Figure BDA0002284677640000091
and (3) updating the state:
Kk=Pk-1HT(HPk-1HT+R)-1(8)
xk=xk-1+Kk(zk-Hxk-1) (9)
Pk=(I-KkH)Pk-1(10)
wherein x iskIs a variable at time K, uk-1Is a control quantity at the time k-1, xk-1Is a time variable of k-1, Pk-1The state is the state at the moment k-1; kkIs a middle variable at the K moment;
for the rover terrain estimation, the state vector is actually the height scale of each measurement point, so the term H is I, and the state z iskHeight estimate h corresponding to the current measurement pointpThe observation covariance R corresponds to the variance of the current measurement point height estimate
Figure BDA0002284677640000092
Observed value xkCorresponding to the existing height value
Figure BDA0002284677640000093
The error covariance P corresponds to
Figure BDA0002284677640000094
Can be obtained by substituting the formulas (8) to (10),
Figure BDA0002284677640000095
Figure BDA0002284677640000096
Figure BDA0002284677640000097
wherein the content of the first and second substances,
Figure BDA0002284677640000098
is the height at time K;
Figure BDA0002284677640000099
is a height value at the time k-1,
Figure BDA00022846776400000910
is the variance of the height at time k-1;
by substituting formula (11) for formula (12)
Figure BDA00022846776400000911
By substituting formula (11) for formula (13)
Figure BDA0002284677640000101
Thereby giving a new measured height
Figure BDA0002284677640000102
Estimation from existing elevation maps
Figure BDA0002284677640000103
The fusion method of (1), wherein the upper left (k-1) represents the estimation before updating and (k) represents the estimation after updating.
And 4, step 4: setting a new measurement update at a point i in the grid, and determining the influence of noise errors generated by the motion of the inspection platform on terrain fusion estimation;
according to the previous definition, i.e. the terrain coordinate system is associated with the current rover pose, without loss of generality, assuming that there is a new measurement update at one point i in the grid, its covariance can be set to
Figure BDA0002284677640000104
Wherein the altitude estimation variance
Figure BDA0002284677640000105
As given by the calculations in the previous section,
Figure BDA0002284677640000106
and
Figure BDA0002284677640000107
the value of (a) is approximated to reflect the uncertainty in the horizontal direction, the calculation of which is given by
Figure BDA0002284677640000108
Wherein d is the side length of the square grid. Therefore, even if the current time does not receive the updating of the sensor measurement, the current time is before the patrol deviceRelative motion change at later time, Σ piUpdating is also carried out, so that the robustness of the system to motion noise is ensured. The following gives a terrain correlation derivation based on motion information, as can be seen from fig. 2, corresponding to the relative position relationship between time k and time k +1,
Figure BDA0002284677640000109
the representation in the terrain coordinate system at time k is shown as follows:
Figure BDA00022846776400001010
wherein the content of the first and second substances,
Figure BDA00022846776400001011
for the representation of the measurement point in the topographic coordinate system at the time k +1,
Figure BDA00022846776400001012
is the distance relation between the front and the rear moments of the terrain coordinate system,
Figure BDA00022846776400001013
is a representation of the measurement point in the terrain coordinate system at time k. Converting the above equation to the terrain coordinate system at the time k +1 can obtain:
Figure BDA0002284677640000111
therefore, the value of the measuring point in the terrain coordinate system can be given corresponding to each moment, but the result newly estimated at each moment needs to be fused and optimized with the previous result, so that errors and complex calculation are brought, if the terrain coordinate system of the moment k and the terrain coordinate system of the moment k +1 can be kept consistent, the influence is avoided, the addition and deletion of terrain data are not needed, and the actual operation and running are facilitated. Assumptions can be made from translation and rotation, i.e.
Figure BDA0002284677640000112
Figure BDA0002284677640000113
From the above equation, the topographic coordinate system at time k and time k +1 are the same reference coordinate system, and equation (20) is further developed to obtain:
Figure BDA0002284677640000114
unifying it to the terrain coordinate system at the moment k +1 has:
Figure BDA0002284677640000115
similarly, equation (21) can be expanded to obtain:
Figure BDA0002284677640000116
unifying the time coordinates to a k +1 terrain coordinate system to obtain:
Figure BDA0002284677640000117
therefore, the following conclusions can be drawn by combining the formulae (19), (23) and (25).
Figure BDA0002284677640000118
Namely, the k-time terrain coordinate system and the k + 1-time terrain coordinate system are aligned, the value of the point P in the k-time terrain coordinate system is equal to the value in the k + 1-time terrain coordinate system, and for a dynamic process, terrain representation references are unified, so that the difficulty of fusion and registration of a large amount of data is simplified.
In combination with equation (19), the covariance transfer relationship due to motion at different time instants can be obtained, namely:
Figure BDA0002284677640000119
Figure BDA00022846776400001110
therein, sigmaP,k+1The covariance of point p at time point K +1,
Figure BDA00022846776400001111
a point p Jacobian matrix inverse matrix; j. the design is a squarerA Jacobian matrix for distance;
Figure BDA00022846776400001112
a Jacobian matrix for rotation;
knowing the estimate of the point P at the k +1 time
Figure BDA00022846776400001113
And estimation of point P at time k
Figure BDA00022846776400001114
And the coordinate conversion of the patrolling device body from the k moment to the k +1 moment
Figure BDA00022846776400001115
Therefore, the corresponding Jacobian matrix can be obtained by taking the first derivative of equation (19).
1) Error propagation at time k +1 due to observation at time k:
Figure BDA0002284677640000121
the combination formula (21) shows
JP=I (30)
2) Error transmission caused by translation transformation of the patrolling device body coordinate system from the k moment to the k +1 moment:
Figure BDA0002284677640000122
3) error transmission caused by rotation transformation of the coordinate system of the patrolling device body from the k moment to the k +1 moment:
Figure BDA0002284677640000123
wherein the content of the first and second substances,
Figure BDA0002284677640000124
for intermediate quantities, which up to now are not sufficient to solve the covariance at time k +1, it is also necessary to know the uncertainty effect caused by the motion estimation error at time k to time k +1, i.e. on ∑ er,∑ΦIs expressed as follows
Figure BDA0002284677640000125
Figure BDA0002284677640000126
According to the relationship of the coordinate system defined previously, the Z axis of the coordinate system of the rover body is aligned with the Z axis of the inertial coordinate system, the processed attitude uncertainty is only related to the yaw angle, and therefore the covariance matrix representation of the pose of the rover at the moment k can be given again through dimension reduction processing, namely
Figure BDA0002284677640000127
Wherein the content of the first and second substances,
Figure BDA0002284677640000128
for the aligned coordinate system of the patrol instrument body, simultaneously satisfies
Figure BDA0002284677640000129
And 5: determining sensor position information and a terrain covariance estimation value based on the triaxial vibration touch sensing unit and the uniaxial gyroscope;
(1) location resolution based on sensor information
The invention adopts the output value of the three-axis vibration touch sensor as the amplitude sum of the measuring pointsFrequency value, which can be converted into an acceleration signal, i.e. input as
Figure BDA0002284677640000131
Single axis gyroscope output is angular acceleration about the Z axis, recorded as
Figure BDA0002284677640000132
In order to reduce the error caused by nonlinearity, the fixed connection point of the sensor is taken as the origin of coordinates, and the time interval between the front measurement and the back measurement is taken as delta t, so that
Figure BDA0002284677640000133
Where Δ v is the velocity difference, Δ t is the measurement time interval,
Figure BDA0002284677640000134
is an acceleration signal;
Figure BDA0002284677640000135
and because the tour device is relatively stable and gentle in the process of traveling, the displacement change from the time k to the time k +1 can be obtained as Δ x by the following formula, where Δ x is the distance difference:
Figure BDA0002284677640000136
similarly, the change in yaw angle from time k to time k +1, i.e. based on the gyroscope input, can be found
Figure BDA0002284677640000137
Figure BDA0002284677640000138
Wherein, Δ ω is an angular velocity difference, and Δ ψ is an angular deviation;
(2) terrain covariance estimation
In the actual sampling process, storeWith uncertainty, it is assumed that its gaussian noise vector is n ═ n (n)a,nβ)TIs obtained by
Figure BDA0002284677640000139
Wherein the content of the first and second substances,
Figure BDA00022846776400001310
the state is set to be in the state at the moment K,
Figure BDA00022846776400001311
the state is at the moment K + 1;
by bringing formulae (37) and (39) into (40)
Figure BDA00022846776400001312
Wherein the content of the first and second substances,
Figure BDA0002284677640000141
in order to be the amount of translation,
Figure BDA0002284677640000142
is the rotation amount;
a first derivative is obtained from the above equation to obtain a Jacobian matrix for state and noise.
Figure BDA0002284677640000143
Figure BDA0002284677640000144
JxIs a Jacobian matrix for states, JnIs a Jacobian matrix for noise;
its covariance transfer relationship can be written as:
Figure BDA0002284677640000145
wherein the content of the first and second substances,
Figure BDA0002284677640000146
omega is the noise covariance relative to the covariance at the moment k +1 of the inertial system;
further, the conversion relation of the patrol instrument from the k time to the k +1 time, namely
Figure BDA0002284677640000147
Wherein the content of the first and second substances,
Figure BDA0002284677640000148
the state change from the time k to the time k + 1;
Figure BDA0002284677640000149
the distance from the time K to the time K +1 is under the system of the patrol device at the time K +1,
Figure BDA00022846776400001410
rotation from time K to time K + 1;
at the same time, the displacement and the angle value are decomposed to obtain
Figure BDA00022846776400001411
Figure BDA00022846776400001412
Brought into formula (45) to obtain
Figure BDA0002284677640000151
Then can find out
Figure BDA0002284677640000152
Covariance of
Figure BDA0002284677640000153
Figure BDA0002284677640000154
As can be seen from the formula (44),
Figure BDA0002284677640000155
wherein the content of the first and second substances,
Figure BDA0002284677640000156
is a matrix of the noise Jacobian,
Figure BDA0002284677640000157
is a noise Jacobian matrix inverse matrix;
is brought into (49)
Figure BDA0002284677640000158
Wherein the content of the first and second substances,
Figure BDA0002284677640000159
Figure BDA00022846776400001510
wherein the content of the first and second substances,
Figure BDA00022846776400001511
is a state Jacobian matrix;
while the transition of the rover from time k to time k +1 is determined by translation and rotation, the covariance thereof can be given by
Figure BDA00022846776400001512
Wherein the content of the first and second substances,
Figure BDA00022846776400001513
from time k to time k for the roverThe covariance at the time K +1 is,
Figure BDA00022846776400001514
is the covariance caused by the rotation.
Therefore, sigma can be obtained by combining formulas (51) and (54)rThe value of (a). The last one to be solved is
Figure BDA00022846776400001515
Since this chapter is defined by a coordinate system, only the azimuth is changed, and the first derivative thereof also has only the value in the Z-axis direction, it can be obtained:
Figure BDA0002284677640000161
wherein the content of the first and second substances,
Figure BDA0002284677640000162
Figure BDA0002284677640000163
for the covariance of point p at time k caused by translation,
Figure BDA0002284677640000164
covariance caused by rotation at time k for point p
Step 6: the uncertainty terrain estimation corresponding to each point in the terrain is carried out on the measuring point, the time from the k moment to the k +1 moment is obtained, and the measuring point p is updated corresponding to the uncertainty terrain of each point in the terrain; the uncertainty terrain update corresponding to each point i in the terrain for the measured point P from time k to time k +1 is given in connection with the previous steps (14) (15) (16) (28).
And 7: and repeating the steps 2 to 6 to obtain the estimation update of the terrain.
The accuracy of the proposed algorithm is tested from two aspects, namely 1) a terrain estimation experiment based on a Unity3D/ROS lunar surface simulation environment; 2) indoor terrain testing is assisted based on the Optitrack. It should be noted that, in order to increase the contrast from simulation to physical verification, the sensors are all configured identically except for different environments, i.e. the simulation environment is a high-precision simulation of physical objects. Wherein, the simulation environment is to build a simulated lunar terrain based on a Unity3D/ROS system, and the physical test is to be carried out in an indoor Optitrack auxiliary environment. The unmanned operation platform adopts a Jacklal unmanned vehicle of Clearpath company of Canada, carries ROS operation environment, and the sensor is selected from a Microsoft Kinect V1.0 depth camera, an AFT601D three-axis vibration touch sensor and a single-axis gyroscope.
Test 1: lunar surface simulation environment terrain test based on Unity3D/ROS
A simulated lunar surface simulation environment built on the basis of Unity3D/ROS in the section is shown in FIG. 3, wherein a large graph is a Unity3D environment operation schematic diagram, and a small graph is an ROS environment terrain reconstruction result schematic diagram. From the foregoing, the accuracy of reconstruction of the terrain is improved by updating the height and covariance estimates for each detection point, where 3 σ distribution estimates are synchronously given based on the estimated values to better judge the reasonability of the terrain estimation.
Fig. 4 shows the result of estimating the terrain in the detection process, where the blue curved surface and the yellow curved surface are the upper and lower 3 σ boundaries of the terrain estimation, the red curved surface is the result of estimating the terrain, the green curved surface is the actual value of the terrain, and most of the areas meet the requirement that the estimated value and the actual value are in the reasonable boundary area. Meanwhile, with reference to fig. 5, it can be clearly seen that the estimated value almost coincides with the true value, and since the test region has an upward slope trend, the estimated boundary value at the later stage is significantly reduced, and it can be seen that the estimation accuracy in the flat region is higher than that in the region with a slope.
In order to compare the error of the estimated value of the algorithm with the actual shape value, as shown in fig. 6, the detail comparison of the red and blue two regions is given, and from the red oval region, the maximum height estimation error is 0.44cm when the x and y values are the same; from the blue elliptical area, the maximum height estimation error is 1.363cm when x and y are the same. This error meets the accuracy requirements in actual operation compared to the diameter of the moving platform wheels.
Next, based on the result of the terrain surface, the estimation accuracy of the algorithm for irregular terrain is analyzed from the height estimation profile. As can be seen from FIG. 7, the estimated value represented by the red curve and the real value represented by the yellow curve almost coincide in the operation process and are completely within reasonable upper and lower boundaries, and meanwhile, the height deviation is 0.52cm as can be seen from the partially enlarged schematic diagram, which all illustrates the rationality of the algorithm for the estimation of irregular terrain. Moreover, for the 3 σ estimation distribution area of the terrain, it can be seen that the distribution boundary is narrower than the undulating part at the initial and final relatively flat parts, which also accords with the characteristics of high estimation precision of the flat terrain and large estimation deviation of the undulating terrain, and meanwhile, the above-mentioned point can be verified from the distribution of the point cloud graph in combination with fig. 8. Generally speaking, from the reconstruction result of the simulation environment algorithm, the accuracy is still more ideal, but after all, compared with the real environment, the differences exist in the aspects of environmental noise, running deviation, wheel-to-wheel interaction uncertainty and the like, so the performance comparison test is carried out in the real environment based on the unmanned vehicle platform and the sensing unit in the next section.
And (3) testing 2: indoor terrain test based on Optitack assistance
This section has built the simulation topography under the Optitrack experimental environment, as shown in fig. 9, contains barrier and simulation relief topography in the place. Meanwhile, capturing points are arranged on the unmanned vehicle platform and the sensor, so that a rigid body is constructed and tracked in an Optitrack environment, as shown in fig. 10, a capturing schematic of an experimental platform is given.
The experimental process is shown in fig. 11, which shows the process of traveling at different times. Fig. 12 shows a result of reconstructing the terrain, where the sampling range of the vision sensor is limited in order to improve the calculation efficiency, and it can be seen from the result that the relief trend of the terrain is substantially consistent with the set terrain, at the position 1, corresponding to a place where the slope change of the actual terrain is large, because the sensor is in a blind area with respect to the slope during the operation of the vehicle body in the upward-inclined trend, and cannot be effectively detected, a missing phenomenon occurs at the terrain, and in the subsequent analysis, the full-connection estimation of the terrain is realized through the fitting of the change trend between sampling points; in position 2, the actual carpet is green, and since the end of the carpet is bent to a certain curvature, the effective detection of such details can be clearly seen from the experimental results.
Fig. 13 shows an experimental result of a test terrain, in which a red curved surface is an estimated value of the terrain, a blue curved surface and a green curved surface respectively represent upper and lower boundaries of 3 σ distribution of the estimated value of the terrain, and a yellow curved surface is an actual terrain value, and according to the limitation of a measurement range of a sensor, a range of an effective estimation area is given by a yellow dotted line in the figure, and a detail display diagram of a large deviation part is given, so that the estimated value and the actual value of the terrain are both within the upper and lower boundary ranges from a small diagram, and the deviation of the upper and lower boundary ranges is less than 5 cm.
Meanwhile, in combination with fig. 14, the error between the estimated value and the true value of the terrain of the algorithm can be compared, a detail map of terrain comparison is given for the terrain relief of the effective identification area, and the same x and y points are selected for comparing the height values, as shown in the left graph, the error is 0.26cm in relatively flat terrain, as shown in the right graph, and the deviation of the height of the tail end is 1.51 cm. The process deviation is also substantially within a 2cm deviation.
Fig. 15 analyzes the terrain profile, because the experiment is not based on fixed straight line walking and completely follows the three-dimensional actual motion process, the terrain profile corresponding to the middle measurement value is selected for analysis, the part with the maximum deviation between the estimated value and the true value is in the range of a blue dotted line frame in the large graph, and as can be seen from the detail graph, the height deviation corresponding to the same x value is in the range of 0.8cm, because the selected reference tangent plane is the more concentrated and average position of the camera sampling, the measurement precision is relatively higher, and meanwhile, as can be seen from the curve change trend, the distribution range of the terrain change position is relatively wider than that of the flat area 3 sigma, which also accords with the characteristics of high flat terrain estimation precision and large fluctuation terrain estimation deviation. By combining the global point cloud information shown in fig. 16, it can be analyzed that the overall estimation effect of the terrain is close to the actual situation, and compared with the diameter of the platform wheel, the error magnitude completely meets the requirement of the actual running process.
The above is only a preferred embodiment of the probabilistic terrain estimation method based on the uncertainty analysis, and the protection range of the probabilistic terrain estimation method based on the uncertainty analysis is not limited to the above embodiments, and all technical solutions belonging to the idea belong to the protection range of the present invention. It should be noted that modifications and variations which do not depart from the gist of the invention will be those skilled in the art to which the invention pertains and which are intended to be within the scope of the invention.

Claims (6)

1. A probabilistic terrain estimation method based on uncertain analysis is based on a patrol platform and a sensor, and is characterized in that: the method comprises the following steps:
step 1: establishing a patrol coordinate system, and determining covariance matrixes of poses of the patrol platform at different moments;
step 2: based on the established patrol coordinate system, acquiring ranging data, vibration and gyroscope data when the patrol platform operates to a measuring point P;
and step 3: carrying out terrain fusion estimation according to the acquired patrol operation distance measurement data, vibration and gyroscope data at the measurement point P, and carrying out approximate processing on the estimation result by adopting Gaussian probability distribution;
and 4, step 4: setting a new measurement update at a point i in the grid, and determining the influence of noise errors generated by the motion of the inspection platform on terrain fusion estimation;
and 5: determining sensor position information and a terrain covariance estimation value based on the triaxial vibration touch sensing unit and the uniaxial gyroscope;
step 6: the uncertainty terrain estimation corresponding to each point in the terrain is carried out on the measuring point, the time from the k moment to the k +1 moment is obtained, and the measuring point p is updated corresponding to the uncertainty terrain of each point in the terrain;
and 7: and repeating the steps 2 to 6 to obtain the estimation update of the terrain.
2. A probabilistic terrain estimation method based on uncertainty analysis, as defined in claim 1, wherein: the step 1 specifically comprises the following steps:
step 1.1: establishing a patrol coordinate system which comprises an inertial coordinate system I, a terrain coordinate system T, a patrol platform body coordinate system R and a sensor coordinate system S,
the system comprises an inertial coordinate system I, a patrolling platform body coordinate system R, a sensor coordinate system S and a sensor body mass center, wherein the inertial coordinate system I is fixed in an inertial space, the patrolling platform body coordinate system R is fixed at the position of the mass center of the patrolling platform, and the sensor coordinate system S is fixedly connected with the mass center of the sensor body;
step 1.2: because the poses of the inspection platform at different moments have height uncertainty for the inertial coordinate system I, the covariance matrix of the pose at each moment is synchronously given:
Figure FDA0002284677630000011
wherein r isIRAnd phiIRThe coordinate system is a conversion coordinate between an inertial coordinate system and a coordinate system of the inspection device body;
determining the change of the three-dimensional attitude of the patrol platform through pitching, yawing and rolling when the patrol platform advances in unknown terrain, and determining the conversion between the patrol platform body coordinate system R and the inertial coordinate system I through the following formula:
Figure FDA0002284677630000012
wherein the content of the first and second substances,
Figure FDA0002284677630000021
an intermediate coordinate system is obtained for the rotation of the coordinate system R of the inspection platform body by a psi angle around the Z axis
Figure FDA0002284677630000022
The transformation from the middle coordinate system to the coordinate system R of the inspection platform body is realized, theta is a pitch angle,
Figure FDA0002284677630000023
is the roll angle.
3. A probabilistic terrain estimation method based on uncertainty analysis, as defined in claim 2, wherein: setting psi(I→T)By
Figure FDA0002284677630000024
And the conversion between the terrain coordinate system T and the inspection platform body coordinate system R only has two degrees of freedom of pitching and rolling, so that the dimension reduction processing between coordinate conversion is realized.
4. A probabilistic terrain estimation method based on uncertainty analysis, as defined in claim 1, wherein: the step 3 specifically comprises the following steps:
step 3.1: coordinates for determining point P as a measurement point
Figure FDA0002284677630000025
The measuring point P is under the terrain coordinate system (x)p,yp) The height estimate at a point is
Figure FDA0002284677630000026
The estimation of the height is approximated by using gaussian probability distribution, the mean value of the height distribution is obtained by the conversion from the sensor coordinate system S to the terrain coordinate system T, and the mean value of the height distribution is represented by the following formula:
Figure FDA0002284677630000027
wherein H ═ 001],hpIs the average of the height distribution,
Figure FDA0002284677630000028
for the variance of the distribution, the measured values of the measuring points P in the sensor coordinate system S areSrSPSrSTFor the position of the sensor in the topographic coordinate system, phiSTA transformation matrix from a sensor coordinate system to a terrain coordinate system;
step 3.2: extracting the three-dimensional coordinate of the measuring point P in the height direction, determining the relation between the height estimation and conversion matrix and the sensor measuring value, solving a first derivative of the formula (3) to obtain a Jacobian matrix corresponding to the error, and expressing the Jacobian matrix measured by the sensor and the rotating Jacobian matrix of the sensor coordinate system through the following formulas:
Figure FDA0002284677630000029
Figure FDA00022846776300000210
wherein, JSMeasuring the Jacobian matrix for the sensor, JΦRotating the Jacobian matrix for the sensor coordinate system, wherein C (phi) is a mapping of a corresponding rotation matrix;
determining the variance according to equation (4) and equation (5)
Figure FDA00022846776300000211
Is determined by the following equation
Figure FDA00022846776300000212
Figure FDA00022846776300000213
Wherein phiISConverting a matrix from a sensor coordinate system to an inertial coordinate system;
step 3.3: noise error estimation based on sensor measurement is already obtained, corresponding altitude estimation exists for each measurement update, and the newly obtained altitude measurement estimation and the existing elevation topographic map are fused based on a fusion form of Kalman filtering.
5. A probabilistic terrain estimation method based on uncertainty analysis, as defined in claim 1, wherein: the step 4 specifically comprises the following steps:
step 4.1: setting a new measurement update at a point i in the grid, determining the covariance of i, and expressing the covariance of i by the following formula:
Figure FDA0002284677630000031
Figure FDA0002284677630000032
wherein, Σ piIs the covariance of the point i,
Figure FDA0002284677630000033
estimating sum of variances for x-direction
Figure FDA0002284677630000034
The variance is estimated for the y direction, d is the side length of the grid,
Figure FDA0002284677630000035
is the altitude uncertainty;
step 4.2: determining the relative position relation between the k time and the k +1 time,
Figure FDA0002284677630000036
the representation in the terrain coordinate system at time k is shown as follows:
Figure FDA0002284677630000037
wherein the content of the first and second substances,
Figure FDA0002284677630000038
for the measurement point in the topographic coordinate system at time k +1,
Figure FDA0002284677630000039
is the distance relation between the front and the rear moments of the terrain coordinate system,
Figure FDA00022846776300000310
is a measuring point in a terrain coordinate system at the moment k;
determining a k +1 time terrain coordinate system, and expressing the k +1 time terrain coordinate system by the following formula:
Figure FDA00022846776300000311
step 4.3: aligning the Z axis of the inspection platform body coordinate system with the Z axis of the inertial coordinate system, obtaining that the processed attitude uncertainty is only related to the yaw angle, and giving covariance matrix representation of the inspection platform pose at the moment k again through dimension reduction processing:
Figure FDA00022846776300000312
wherein x isIRIn order to tour the platform covariance matrix,
Figure FDA00022846776300000313
the coordinate system of the inspection platform body after alignment is satisfied at the same time
Figure FDA00022846776300000314
Figure FDA00022846776300000315
6. A probabilistic terrain estimation method based on uncertainty analysis, as defined in claim 1, wherein: the step 5 specifically comprises the following steps:
step 5.1: based on the sensor position information, the output values of the three-axis vibration touch sensor are the amplitude and frequency values of the measuring point, and the output of the single-axis gyroscope is the angular acceleration around the Z axis and is recorded as the angular acceleration
Figure FDA00022846776300000316
In order to reduce the error caused by nonlinearity, the sensor fixing connection point is used as a coordinate origin, and the sensor position information is determined by the following formula:
Figure FDA0002284677630000041
where Δ v is the velocity difference, Δ t is the measurement time interval,
Figure FDA0002284677630000042
is an acceleration signal;
step 5.2: in the sampling process, there is uncertainty, where the gaussian noise vector is n ═ n (n)a,nβ)TDetermining a Jacobian matrix for the state and the noise by the following formula, wherein the Jacobian matrix for the state and the noise is the Jacobian matrix for the state and the noise:
Figure FDA0002284677630000043
Figure FDA0002284677630000044
while the translation of the patrol platform from time k to time k +1 is determined by translation and rotation, the covariance can be given by
Figure FDA0002284677630000045
Solving for the value of the first derivative only in the Z-axis direction
Figure FDA0002284677630000046
Is determined by
Figure FDA0002284677630000047
Figure FDA0002284677630000048
Wherein the content of the first and second substances,
Figure FDA0002284677630000049
CN201911155457.9A 2019-11-22 2019-11-22 Probabilistic terrain estimation method based on uncertain analysis Pending CN110929402A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911155457.9A CN110929402A (en) 2019-11-22 2019-11-22 Probabilistic terrain estimation method based on uncertain analysis

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911155457.9A CN110929402A (en) 2019-11-22 2019-11-22 Probabilistic terrain estimation method based on uncertain analysis

Publications (1)

Publication Number Publication Date
CN110929402A true CN110929402A (en) 2020-03-27

Family

ID=69850712

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911155457.9A Pending CN110929402A (en) 2019-11-22 2019-11-22 Probabilistic terrain estimation method based on uncertain analysis

Country Status (1)

Country Link
CN (1) CN110929402A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111896027A (en) * 2020-07-15 2020-11-06 北京控制工程研究所 Distance measuring sensor simulation modeling method considering topography fluctuation
CN112987560A (en) * 2021-04-19 2021-06-18 长沙智能驾驶研究院有限公司 Filter control method, device, equipment and computer storage medium
CN113639946A (en) * 2021-08-13 2021-11-12 吉林大学 Method for determining mechanism bumping and vibrating conditions during movement of patrol device
CN114021376A (en) * 2021-11-17 2022-02-08 中国北方车辆研究所 Terrain slope estimation method for quadruped robot
WO2022241951A1 (en) * 2021-05-21 2022-11-24 魔门塔(苏州)科技有限公司 Method for fusing data of multiple sensors

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103363993A (en) * 2013-07-06 2013-10-23 西北工业大学 Airplane angular rate signal reconstruction method based on unscented kalman filter
CN107314718A (en) * 2017-05-31 2017-11-03 中北大学 High speed rotating missile Attitude estimation method based on magnetic survey rolling angular rate information
CN108036785A (en) * 2017-11-24 2018-05-15 浙江大学 A kind of aircraft position and orientation estimation method based on direct method and inertial navigation fusion
CN109931955A (en) * 2019-03-18 2019-06-25 北京工业大学 Strapdown inertial navigation system Initial Alignment Method based on the filtering of state correlation Lie group
CN110231029A (en) * 2019-05-08 2019-09-13 西安交通大学 A kind of underwater robot Multi-sensor Fusion data processing method

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103363993A (en) * 2013-07-06 2013-10-23 西北工业大学 Airplane angular rate signal reconstruction method based on unscented kalman filter
CN107314718A (en) * 2017-05-31 2017-11-03 中北大学 High speed rotating missile Attitude estimation method based on magnetic survey rolling angular rate information
CN108036785A (en) * 2017-11-24 2018-05-15 浙江大学 A kind of aircraft position and orientation estimation method based on direct method and inertial navigation fusion
CN109931955A (en) * 2019-03-18 2019-06-25 北京工业大学 Strapdown inertial navigation system Initial Alignment Method based on the filtering of state correlation Lie group
CN110231029A (en) * 2019-05-08 2019-09-13 西安交通大学 A kind of underwater robot Multi-sensor Fusion data processing method

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
CHENGCHAO BAI 等: "Uncertainty-Based Vibration/Gyro Composite Planetary Terrain Mapping", 《SENSORS 2019》 *
CHENGCHAO BAI 等: "Uncertainty-Based Vibration/Gyro Composite Planetary Terrain Mapping", 《SENSORS 2019》, 13 June 2019 (2019-06-13), pages 1 - 26 *
JENELTEN, FABIAN: "Perceptive locomotion for legged robots in rough terrain", 《HTTPS://DOI.ORG/10.3929/ETHZ-B-000284254》, 31 December 2018 (2018-12-31), pages 158 *
郭继峰 等: "基于不确定性分析的巡视器地形估计方法", 《无人系统技术》, vol. 2, no. 3, 13 June 2019 (2019-06-13), pages 1 - 19 *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111896027A (en) * 2020-07-15 2020-11-06 北京控制工程研究所 Distance measuring sensor simulation modeling method considering topography fluctuation
CN111896027B (en) * 2020-07-15 2022-07-29 北京控制工程研究所 Distance measuring sensor simulation modeling method considering topographic relief
CN112987560A (en) * 2021-04-19 2021-06-18 长沙智能驾驶研究院有限公司 Filter control method, device, equipment and computer storage medium
CN112987560B (en) * 2021-04-19 2021-09-10 长沙智能驾驶研究院有限公司 Filter control method, device, equipment and computer storage medium
WO2022222889A1 (en) * 2021-04-19 2022-10-27 长沙智能驾驶研究院有限公司 Filter control method and apparatus, and device and computer storage medium
WO2022241951A1 (en) * 2021-05-21 2022-11-24 魔门塔(苏州)科技有限公司 Method for fusing data of multiple sensors
CN113639946A (en) * 2021-08-13 2021-11-12 吉林大学 Method for determining mechanism bumping and vibrating conditions during movement of patrol device
CN114021376A (en) * 2021-11-17 2022-02-08 中国北方车辆研究所 Terrain slope estimation method for quadruped robot
CN114021376B (en) * 2021-11-17 2024-04-09 中国北方车辆研究所 Terrain gradient estimation method for quadruped robot

Similar Documents

Publication Publication Date Title
CN112347840B (en) Vision sensor laser radar integrated unmanned aerial vehicle positioning and image building device and method
CN111156998B (en) Mobile robot positioning method based on RGB-D camera and IMU information fusion
CN110929402A (en) Probabilistic terrain estimation method based on uncertain analysis
CN109885080B (en) Autonomous control system and autonomous control method
Kelly et al. Combined visual and inertial navigation for an unmanned aerial vehicle
CN110726406A (en) Improved nonlinear optimization monocular inertial navigation SLAM method
CN107728182B (en) Flexible multi-baseline measurement method and device based on camera assistance
CN106017463A (en) Aircraft positioning method based on positioning and sensing device
Zhang et al. Lidar-IMU and wheel odometer based autonomous vehicle localization system
JP2020064056A (en) Device and method for estimating position
CN110455301A (en) A kind of dynamic scene SLAM method based on Inertial Measurement Unit
Steiner et al. A vision-aided inertial navigation system for agile high-speed flight in unmapped environments: Distribution statement a: Approved for public release, distribution unlimited
CN114526745A (en) Drawing establishing method and system for tightly-coupled laser radar and inertial odometer
Lambert et al. Visual odometry aided by a sun sensor and inclinometer
CN114485643B (en) Coal mine underground mobile robot environment sensing and high-precision positioning method
Karam et al. Integrating a low-cost mems imu into a laser-based slam for indoor mobile mapping
Deschênes et al. Lidar scan registration robust to extreme motions
Peng et al. Vehicle odometry with camera-lidar-IMU information fusion and factor-graph optimization
Hsu et al. Application of multisensor fusion to develop a personal location and 3D mapping system
Hussnain et al. Enhanced trajectory estimation of mobile laser scanners using aerial images
Bai et al. Graph-optimisation-based self-calibration method for IMU/odometer using preintegration theory
Wang et al. Micro aerial vehicle navigation with visual-inertial integration aided by structured light
CN115797490A (en) Drawing construction method and system based on laser vision fusion
CN117128953A (en) Dead reckoning method, equipment and storage medium for pipeline wall-climbing robot
Liu et al. Relative Flash LiDAR Aided-Inertial Navigation using Surfel Grid Maps

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination