CN110207704B - Pedestrian navigation method based on intelligent identification of building stair scene - Google Patents
Pedestrian navigation method based on intelligent identification of building stair scene
- Publication number: CN110207704B
- Application number: CN201910422538.4A
- Authority: CN (China)
- Prior art keywords: time, pedestrian, scene, moment, matrix
- Legal status: Active (the legal status is an assumption and is not a legal conclusion)
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
- G01C21/206—Instruments for performing navigational calculations specially adapted for indoor navigation
Abstract
The invention discloses a pedestrian navigation method based on intelligent identification of building stair scenes. The method comprises the following steps: mount an inertial sensor on the pedestrian's foot and have the pedestrian walk repeatedly through the scenes in a building, the scenes comprising flat-ground walking, stair ascent and stair descent, and establish a scene recognition model based on gait features; while the pedestrian walks in the building, acquire inertial sensor data comprising acceleration and gyroscope information; predict the pedestrian's walking attitude, velocity and position; judge whether the foot is in a zero-velocity state and, if so, correct the predicted velocity and position through a Kalman filter; judge the pedestrian's walking scene in the building and, if the pedestrian is ascending or descending stairs, correct the pedestrian's position through Kalman filtering based on the stair positions in the building map. The invention improves positioning accuracy for pedestrians walking in buildings with stairs.
Description
Technical Field
The invention belongs to the technical field of pedestrian navigation, and particularly relates to a pedestrian navigation method based on a building stair scene.
Background
Pedestrian navigation is an important branch of the navigation and positioning field; it has attracted increasing attention from researchers in recent years and is widely applied in fields such as emergency rescue and military operations. Traditional pedestrian navigation mainly relies on GPS positioning, but GPS signals are lost in indoor and urban environments and civilian-grade accuracy is poor, so GPS cannot satisfy indoor navigation demands. With the development of micro-electro-mechanical system (MEMS) technology, the advantages of MEMS inertial measurement units, namely small volume, low power consumption, light weight and portability, have become increasingly prominent, and research on MEMS-IMU-based indoor pedestrian navigation systems has become a hotspot.
Inertial sensors exhibit drift errors that accumulate over time and are the main source of position and heading divergence in pedestrian navigation. A zero-velocity correction algorithm can suppress the divergence of the velocity error and improve navigation accuracy. However, because the position and heading errors are poorly observable, the zero-velocity correction algorithm cannot correct them, and the trajectory eventually diverges as time accumulates.
Disclosure of Invention
In order to solve the technical problems mentioned in the background art, the invention provides a pedestrian navigation method based on intelligent identification of a building stair scene.
In order to achieve the technical purpose, the technical scheme of the invention is as follows:
a pedestrian navigation method based on intelligent identification of building stair scenes comprises the following steps:
(1) arranging an inertial sensor on the foot of a pedestrian, enabling the pedestrian to walk for multiple times in a scene in a building, wherein the scene comprises a flat ground walking scene, a stair climbing scene and a stair descending scene, and establishing a scene recognition model based on gait characteristics;
(2) when a pedestrian walks in a building, acquiring inertial sensing data comprising acceleration information and gyroscope information;
(3) predicting the walking posture, speed and position of the pedestrian based on an inertial navigation algorithm;
(4) judging whether the foot is in a zero-speed state or not based on the inertial sensing data, and if so, correcting the predicted speed and position information through a Kalman filter; otherwise, directly entering the step (5);
(5) judging the walking scene of the pedestrian in the building based on the inertial sensing data and the scene recognition model established in the step (1), and correcting the position of the pedestrian through Kalman filtering based on the stair position in the building map if the pedestrian is in the state of going up and down the stairs; otherwise, jumping to the step (2).
Further, in step (1), the specific process of establishing the gait feature-based scene recognition model is as follows:
(101) Record the j-th gait of the pedestrian as T_j; using the pedestrian's gait characteristics, divide the single gait T_j into an active data segment T_j1 and an inactive data segment T_j2, and take T_j1 as the value range over which the gait features are computed;
(102) Denote by a^b_ib(jn) and ω^b_ib(jn), n = 1, 2, …, N, the inertial sample data acquired within the j-th gait, where N is the number of samples, and extract the feature vector

T = (acc_xmin, acc_ysk, acc_zmax, acc_max, gyro_xmean, gyro_yvar, gyro_zmax, gyro_zmean, gyro_zsk, gyro_zvar, h_diff, V_max)

where a^b_ib(jn) denotes the components of the acceleration of the body frame relative to the inertial frame on the body-frame X, Y, Z axes at the n-th time of the j-th gait; ω^b_ib(jn) denotes the components of the angular velocity of the body frame relative to the inertial frame on the body-frame X, Y, Z axes at the n-th time of the j-th gait; V_n(jn) is the navigation-frame velocity at the n-th time of the j-th gait; n = 1, 2, …, N; h(j) is the initial height of the j-th gait in the navigation frame; E denotes the mathematical expectation;
(103) Train a scene recognition model using a random forest algorithm.
Further, the specific process of step (103) is as follows:
(a) generating a training set for each decision tree by using bootstrap sampling:
The training set is D = (X, Y), where X is the collected gait features and Y is the label corresponding to each feature; from the data set D, randomly draw with replacement n training samples D_1, D_2, …, D_n, each of the same size as D, and use each training sample D_i to train and construct a decision tree;
(b) constructing a decision tree:
For the training sample D_i, randomly draw L of the attributes from the total attribute set S without replacement to form the attribute set A_i of the decision tree, and train one decision tree with the decision-tree training model;
(c) generating a random forest:
Label the flat-ground walking scene as 0, the stair-ascent scene as 1 and the stair-descent scene as 2, repeat step (b), and train multiple decision trees to obtain the scene recognition model.
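As an illustration of steps (a) and (b), the bootstrap sampling and random attribute selection can be sketched in Python as follows. The function and variable names are ours, and the tree count, attribute count and synthetic data are arbitrary examples, not values fixed by the patent:

```python
import numpy as np

def bootstrap_training_sets(X, y, n_trees, n_attrs, rng):
    """For each of n_trees decision trees, draw a bootstrap sample of row
    indices (same size as the data set, with replacement) and a random
    subset of n_attrs attributes (without replacement)."""
    n_samples, n_total_attrs = X.shape
    sets = []
    for _ in range(n_trees):
        rows = rng.integers(0, n_samples, size=n_samples)       # with replacement
        attrs = rng.choice(n_total_attrs, size=n_attrs, replace=False)
        sets.append((rows, attrs))
    return sets

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 12))    # 12 gait features, as in the patent's vector T
y = rng.integers(0, 3, size=50)  # labels: 0 flat ground, 1 upstairs, 2 downstairs
sets = bootstrap_training_sets(X, y, n_trees=10, n_attrs=4, rng=rng)
```

Each `(rows, attrs)` pair would then be fed to one decision-tree learner; a library implementation such as a random-forest classifier performs the same sampling internally.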
Further, in step (3), the attitude is predicted using the following equation:

[q_0(k) q_1(k) q_2(k) q_3(k)]^T = ( I_4×4 + (ΔT/2)·Ω(ω^b_nb(k)) ) · [q_0(k-1) q_1(k-1) q_2(k-1) q_3(k-1)]^T

where Ω(·) denotes the 4×4 skew-symmetric quaternion rate matrix of its argument and

ω^b_nb(k) = ω^b_ib(k) − C^b_n(k-1)·ω^n_in(k-1), ω^n_in = ω^n_ie + ω^n_en,
ω^n_ie = [0, ω_ie·cosL, ω_ie·sinL]^T,
ω^n_en = [ −V_y/(R_M + h), V_x/(R_N + h), V_x·tanL/(R_N + h) ]^T.

In the above, q_0(k), q_1(k), q_2(k), q_3(k) is the attitude quaternion at time k and q_0(k-1), q_1(k-1), q_2(k-1), q_3(k-1) the attitude quaternion at time k-1; ω^b_ib(k) is the angular velocity of the body frame relative to the inertial frame at time k; ω^b_nb(k) is the component of the angular velocity of the body frame relative to the navigation frame on the body-frame X, Y, Z axes at time k; ΔT is the sampling period; C^n_b(k-1) is the attitude transition matrix at time k-1 (C^b_n its transpose); ω^n_in(k-1) is the component of the angular velocity of the navigation frame relative to the inertial frame on the navigation-frame X, Y, Z axes at time k-1; V_x(k-1), V_y(k-1), V_z(k-1) are the components of the carrier velocity on the navigation-frame X, Y, Z axes at time k-1; L(k-1) and h(k-1) are the latitude and height of the carrier at time k-1; R_M and R_N are the meridian and prime-vertical radii of curvature of the earth; ω_ie is the rotational angular velocity of the earth;
The velocity is predicted using the following equation:

V^n(k) = V^n(k-1) + ( C^n_b(k)·â^b_ib(k) − [0, 0, g]^T )·ΔT

In the above formula, â^b_ib(k) denotes the estimates of the components of the acceleration of the body frame relative to the inertial frame on the body-frame X, Y, Z axes; V^n(k) is the component of the carrier velocity on the navigation-frame X, Y, Z axes at time k; g is the gravitational acceleration;
The position is predicted using the following formula:

λ(k) = λ(k-1) + ΔT·V_x(k) / ( (R_N + h(k-1))·cosL(k-1) )
L(k) = L(k-1) + ΔT·V_y(k) / ( R_M + h(k-1) )
h(k) = h(k-1) + ΔT·V_z(k)

In the above formula, λ(k), L(k), h(k) are the longitude, latitude and height at time k, and λ(k-1), L(k-1), h(k-1) are the longitude, latitude and height at time k-1.
Further, in step (4), the method for determining whether the foot is in the zero velocity state is as follows:
The zero-velocity instants are judged using a three-condition method:

C1 = 1 if th_amin < ‖a^b_ib(k)‖ < th_amax, otherwise C1 = 0
C2 = 1 if ‖ω^b_ib(k)‖ < th_ωmax, otherwise C2 = 0
C3 = 1 if σ²_a(k) < th_σ, otherwise C3 = 0

where a^b_ib(k) and ω^b_ib(k) are the components of the acceleration and angular velocity of the body frame relative to the inertial frame on the body-frame X, Y, Z axes at time k; th_amin, th_amax, th_ωmax and th_σ are the judgment thresholds; σ²_a(k) is the variance of the acceleration magnitude over a sliding window w centred at time k, the overbar in its definition denoting the averaging operation.

The final zero-velocity detection result is ZUPT(k) = C1 & C2 & C3, where & denotes the AND operation; ZUPT(k) = 1 indicates a zero-velocity state and ZUPT(k) = 0 a non-zero-velocity state.
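A minimal sketch of the three-condition zero-velocity detector follows. The threshold values are illustrative only; the patent does not fix their numeric values:

```python
import numpy as np

def zupt_detect(acc, gyro, k, w=5,
                th_amin=9.0, th_amax=11.0, th_gmax=0.6, th_var=0.5):
    """Three-condition zero-velocity test at sample k.
    acc, gyro: (N, 3) arrays of body-frame specific force / angular rate."""
    a_mag = np.linalg.norm(acc[k])
    c1 = th_amin < a_mag < th_amax              # C1: acceleration magnitude window
    c2 = np.linalg.norm(gyro[k]) < th_gmax      # C2: angular-rate magnitude
    lo, hi = max(0, k - w), min(len(acc), k + w + 1)
    win = np.linalg.norm(acc[lo:hi], axis=1)
    c3 = np.var(win) < th_var                   # C3: sliding-window variance
    return int(c1 and c2 and c3)                # ZUPT(k) = C1 & C2 & C3
```

During stance the acceleration magnitude stays near g with low variance and the angular rate is small, so all three conditions hold; during swing at least one fails.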
Further, in step (4), the process of correcting the predicted speed and position information by the kalman filter is as follows:
(A) Calculating a one-step predicted mean square error:

P(k|k-1) = A(k,k-1)·P(k-1|k-1)·A(k,k-1)^T + G(k-1)·W(k-1)·G(k-1)^T

In the above formula, A(k,k-1) is the one-step transition matrix of the filter from time k-1 to time k:

A(k,k-1) = [ I_3×3   ΔT·I_3×3   0_3×3
             0_3×3   I_3×3      S(f^n(k))·ΔT
             0_3×3   0_3×3      I_3×3 ]

where I_3×3 is the 3×3 identity matrix, 0_3×3 the 3×3 zero matrix, ΔT the sampling period, and S(f^n(k)) the skew-symmetric matrix of f^n(k), the component of the acceleration of the body frame relative to the navigation frame on the navigation-frame X, Y, Z axes at time k;

P(k-1|k-1) is the state-estimation mean square error at time k-1, and P(k|k-1) the one-step prediction mean square error from time k-1 to time k;

G(k-1) is the noise coefficient matrix of the filter at time k-1, constructed from the attitude transition matrix C^n_b(k-1), which maps the body-frame sensor noises into the navigation frame; W(k-1) = [ε_rx ε_ry ε_rz ε_ax ε_ay ε_az]^T is the state noise at time k-1, where ε_rx, ε_ry, ε_rz are the model noises of ω^b_x(k), ω^b_y(k), ω^b_z(k), the components of the angular velocity of the body frame relative to the inertial frame on the body-frame X, Y, Z axes at time k, and ε_ax, ε_ay, ε_az are the model noises of a^b_x(k), a^b_y(k), a^b_z(k), the components of the acceleration of the body frame relative to the inertial frame on the body-frame X, Y, Z axes at time k; the superscript T denotes matrix transposition;
(B) calculating the filtering gain of a Kalman filter at the k moment:
K(k)=P(k|k-1)H(k)T[H(k)P(k|k-1)H(k)T+R(k)]-1
In the above formula, K(k) is the filter gain at time k; R(k) = diag([ε_vx ε_vy ε_vz]²) is the measurement noise at time k, where diag denotes matrix diagonalisation and ε_vx, ε_vy, ε_vz are the measurement noises of V_x(k), V_y(k), V_z(k), the components of the carrier velocity on the navigation-frame X, Y, Z axes; H(k) is the measurement matrix at time k, H(k) = [0_3×3 I_3×3 0_3×3];
(C) Calculating the Kalman filter state estimate at time k:

x̂(k) = x̂(k|k-1) + K(k)·( y(k) − H(k)·x̂(k|k-1) )

In the above formula, x̂(k) is the estimate of the state quantity at time k and x̂(k|k-1) the one-step prediction of the state variables from time k-1 to time k; the state involves L, λ, h (latitude, longitude and height) and θ, γ, ψ (pitch angle, roll angle and yaw angle); y(k) = [0 0 0]^T is the measurement at time k;
(D) calculating an estimated mean square error of a Kalman filter at the k moment:
P(k|k)=[I-K(k)H(k)]P(k|k-1)
in the above equation, P (k | k) is the estimated mean square error at time k, and I is the identity matrix.
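Steps (B) through (D), the velocity-measurement update with H(k) = [0_3×3 I_3×3 0_3×3] and y(k) = [0 0 0]^T, can be sketched as follows. The identity of the first and last 3-element blocks of the 9-dimensional error state is an assumption of this sketch; the patent only fixes the middle block as velocity:

```python
import numpy as np

def zupt_kalman_update(x_pred, P_pred, R):
    """Velocity-measurement Kalman update for a 9-state error filter: H picks
    the middle (velocity) block and the pseudo-measurement y(k) = 0 encodes
    the zero-velocity condition."""
    H = np.hstack([np.zeros((3, 3)), np.eye(3), np.zeros((3, 3))])
    S = H @ P_pred @ H.T + R                  # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)       # filter gain K(k)
    y = np.zeros(3)                           # zero-velocity pseudo-measurement
    x_est = x_pred + K @ (y - H @ x_pred)     # state estimate at time k
    P_est = (np.eye(9) - K @ H) @ P_pred      # P(k|k) = [I - K(k)H(k)] P(k|k-1)
    return x_est, P_est, K
```

With a small measurement noise R, the update pulls the velocity error components strongly toward zero and shrinks the covariance.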
Further, the specific process of step (5) is as follows:
(501) Scene association is performed as follows. SR(k) denotes the scene recognition result of the k-th gait: 0 indicates the pedestrian's gait is flat-ground walking, 1 indicates ascending stairs and 2 indicates descending stairs, "|" denoting "or". When SR(k) = 1 | 2, i.e. the pedestrian is on the stairs, acquire the position coordinate in the database closest to the pedestrian's current position and select that point as the associated stair position point;
(502) Position correction is performed by Kalman filtering:
(502a) calculating a one-step predicted mean square error:
P(k|k-1) = A(k,k-1)·P(k-1|k-1)·A(k,k-1)^T + G(k-1)·W(k-1)·G(k-1)^T

In the above formula, A(k,k-1) is the one-step transition matrix of the filter from time k-1 to time k:

A(k,k-1) = [ I_3×3   ΔT·I_3×3   0_3×3
             0_3×3   I_3×3      S(f^n(k))·ΔT
             0_3×3   0_3×3      I_3×3 ]

where I_3×3 is the 3×3 identity matrix, 0_3×3 the 3×3 zero matrix, ΔT the sampling period, and S(f^n(k)) the skew-symmetric matrix of f^n(k), the component of the acceleration of the body frame relative to the navigation frame on the navigation-frame X, Y, Z axes at time k;

P(k-1|k-1) is the state-estimation mean square error at time k-1, and P(k|k-1) the one-step prediction mean square error from time k-1 to time k;

G(k-1) is the noise coefficient matrix of the filter at time k-1, constructed from the attitude transition matrix C^n_b(k-1); W(k-1) = [ε_rx ε_ry ε_rz ε_ax ε_ay ε_az]^T is the state noise at time k-1, where ε_rx, ε_ry, ε_rz are the model noises of ω^b_x(k), ω^b_y(k), ω^b_z(k), the components of the angular velocity of the body frame relative to the inertial frame on the body-frame X, Y, Z axes at time k, and ε_ax, ε_ay, ε_az are the model noises of a^b_x(k), a^b_y(k), a^b_z(k), the components of the acceleration of the body frame relative to the inertial frame on the body-frame X, Y, Z axes at time k;
(502b) calculating the filtering gain of a Kalman filter at the k moment:
K(k)=P(k|k-1)H(k)T[H(k)P(k|k-1)H(k)T+R(k)]-1
In the above formula, K(k) is the filter gain at time k; R(k) = diag([ε_L ε_λ]²) is the measurement noise at time k, where diag denotes matrix diagonalisation and ε_L, ε_λ are the noises in latitude and longitude respectively; H(k) is the measurement matrix at time k, H(k) = [I_2×2 0_2×3 0_2×2 0_2×2], where I_2×2 is the 2×2 identity matrix, 0_2×3 the 2×3 zero matrix and 0_2×2 the 2×2 zero matrix;
(502c) calculating an estimated mean square error of the k-time extended Kalman filter:
P(k|k)=[I-K(k)H(k)]P(k|k-1)
in the above equation, P (k | k) is the estimated mean square error at time k, and I is the identity matrix.
Adopt the beneficial effect that above-mentioned technical scheme brought:
the pedestrian walking scene identification method based on the established scene identification model identifies the pedestrian walking scene in real time, and if the pedestrian walking scene is in the stage of going up and down stairs, the position information of the pedestrian is corrected by using the known stair position. Compared with the traditional pedestrian navigation algorithm, the method can effectively improve the positioning precision of the pedestrian when the pedestrian walks in the building with stairs.
Drawings
FIG. 1 is a flow chart of a method of the present invention;
FIG. 2 is a diagram of a solution result obtained by a conventional method, including two sub-diagrams (a) and (b), which are a two-dimensional diagram and a three-dimensional diagram, respectively;
FIG. 3 is a diagram of the result of the solution obtained by the method of the present invention, which includes two sub-diagrams (a) and (b), respectively a two-dimensional diagram and a three-dimensional diagram.
Detailed Description
The technical scheme of the invention is explained in detail in the following with the accompanying drawings.
The invention designs a pedestrian navigation method based on intelligent identification of a building stair scene, which comprises the following steps as shown in figure 1.
Step 1: arranging an inertial sensor on the foot of a pedestrian, enabling the pedestrian to walk for multiple times in a scene in a building, establishing a gait feature library comprising scenes of walking on the flat ground, going upstairs and going downstairs, and establishing a scene recognition model based on gait features;
step 2: when a pedestrian walks in a building, acquiring inertial sensing data comprising acceleration information and gyroscope information;
and step 3: predicting the walking posture, speed and position of the pedestrian based on an inertial navigation algorithm;
and 4, step 4: judging whether the foot is in a zero-speed state or not based on the inertial sensing data, and if so, correcting the predicted speed and position information through a Kalman filter; otherwise, directly entering the step 5;
and 5: judging the walking scene of the pedestrian in the building based on the inertial sensing data and the scene recognition model established in the step 1, and correcting the position of the pedestrian through Kalman filtering based on the stair position in the building map if the pedestrian is in the state of going up and down the stairs; otherwise, jumping to step 2.
In this embodiment, step 1 is implemented by the following preferred scheme:
the specific process of establishing the gait feature-based scene recognition model is as follows:
101. Data segmentation: record the j-th gait of the pedestrian as T_j; using the pedestrian's gait characteristics, divide the single gait T_j into an active data segment T_j1 and an inactive data segment T_j2, and take T_j1 as the value range over which the gait features are computed;
102. Feature extraction: denote by a^b_ib(jn) and ω^b_ib(jn), n = 1, 2, …, N, the inertial sample data acquired within the j-th gait, where N is the number of samples, and extract the feature vector

T = (acc_xmin, acc_ysk, acc_zmax, acc_max, gyro_xmean, gyro_yvar, gyro_zmax, gyro_zmean, gyro_zsk, gyro_zvar, h_diff, V_max)

where a^b_ib(jn) denotes the components of the acceleration of the body frame relative to the inertial frame on the body-frame X, Y, Z axes at the n-th time of the j-th gait; ω^b_ib(jn) denotes the components of the angular velocity of the body frame relative to the inertial frame on the body-frame X, Y, Z axes at the n-th time of the j-th gait; V_n(jn) is the navigation-frame velocity at the n-th time of the j-th gait; n = 1, 2, …, N; h(j) is the initial height of the j-th gait in the navigation frame; E denotes the mathematical expectation;
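The 12-element feature vector of step 102 can be computed as follows. The exact definitions of acc_max, h_diff and V_max were lost with the formula images, so the magnitude-based readings below are assumptions of this sketch:

```python
import numpy as np

def skewness(x):
    """Sample skewness E[(x - mean)^3] / std^3 (0 for constant input)."""
    m, s = x.mean(), x.std()
    return ((x - m) ** 3).mean() / s ** 3 if s > 0 else 0.0

def gait_features(acc, gyro, h, V):
    """Build the 12-element feature vector T from one gait's active segment.
    acc, gyro: (N, 3) body-frame samples; h: (N,) heights; V: (N, 3) nav-frame velocity."""
    return np.array([
        acc[:, 0].min(),                     # acc_xmin
        skewness(acc[:, 1]),                 # acc_ysk
        acc[:, 2].max(),                     # acc_zmax
        np.linalg.norm(acc, axis=1).max(),   # acc_max (magnitude, assumed)
        gyro[:, 0].mean(),                   # gyro_xmean
        gyro[:, 1].var(),                    # gyro_yvar
        gyro[:, 2].max(),                    # gyro_zmax
        gyro[:, 2].mean(),                   # gyro_zmean
        skewness(gyro[:, 2]),                # gyro_zsk
        gyro[:, 2].var(),                    # gyro_zvar
        h[-1] - h[0],                        # h_diff: height change (assumed)
        np.linalg.norm(V, axis=1).max(),     # V_max (assumed)
    ])
```

The height change h_diff is what chiefly separates flat-ground gaits from stair gaits, which is why it appears in the feature set.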
(103) training a scene recognition model by adopting a random forest algorithm, and the specific process is as follows:
(a) generating a training set for each decision tree by using bootstrap sampling:
The training set is D = (X, Y), where X is the collected gait features and Y is the label corresponding to each feature; from the data set D, randomly draw with replacement n training samples D_1, D_2, …, D_n, each of the same size as D, and use each training sample D_i to train and construct a decision tree;
(b) constructing a decision tree:
For the training sample D_i, randomly draw L of the attributes from the total attribute set S without replacement to form the attribute set A_i of the decision tree, and train one decision tree with the decision-tree training model. When a node is split, the splitting rule adopted is the minimum-Gini-value principle, calculated as:

Gini = 1 − Σ_l p_l²

In the above formula, p_l is the probability that a sample point belongs to class l;
(c) generating a random forest:
Label the flat-ground walking scene as 0, the stair-ascent scene as 1 and the stair-descent scene as 2, repeat step (b), and train multiple decision trees to obtain the scene recognition model.
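The minimum-Gini splitting rule of step (b) can be sketched as follows (a single-feature threshold search; function names are ours):

```python
import numpy as np

def gini(labels):
    """Gini value 1 - sum_l p_l^2, with p_l the proportion of class l."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_split(x, y):
    """Threshold on feature x minimising the weighted Gini of the two children."""
    best = (None, np.inf)
    for t in np.unique(x)[:-1]:
        left, right = y[x <= t], y[x > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
        if score < best[1]:
            best = (t, score)
    return best
```

A pure node has Gini 0, so a threshold that perfectly separates the classes achieves the minimum weighted score of 0.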
In this embodiment, step 3 is implemented by the following preferred scheme:
The attitude is predicted using the following equation:

[q_0(k) q_1(k) q_2(k) q_3(k)]^T = ( I_4×4 + (ΔT/2)·Ω(ω^b_nb(k)) ) · [q_0(k-1) q_1(k-1) q_2(k-1) q_3(k-1)]^T

where Ω(·) denotes the 4×4 skew-symmetric quaternion rate matrix of its argument and

ω^b_nb(k) = ω^b_ib(k) − C^b_n(k-1)·ω^n_in(k-1), ω^n_in = ω^n_ie + ω^n_en,
ω^n_ie = [0, ω_ie·cosL, ω_ie·sinL]^T,
ω^n_en = [ −V_y/(R_M + h), V_x/(R_N + h), V_x·tanL/(R_N + h) ]^T.

In the above, q_0(k), q_1(k), q_2(k), q_3(k) is the attitude quaternion at time k and q_0(k-1), q_1(k-1), q_2(k-1), q_3(k-1) the attitude quaternion at time k-1; ω^b_ib(k) is the angular velocity of the body frame relative to the inertial frame at time k; ω^b_nb(k) is the component of the angular velocity of the body frame relative to the navigation frame on the body-frame X, Y, Z axes at time k; ΔT is the sampling period; C^n_b(k-1) is the attitude transition matrix at time k-1 (C^b_n its transpose); ω^n_in(k-1) is the component of the angular velocity of the navigation frame relative to the inertial frame on the navigation-frame X, Y, Z axes at time k-1; V_x(k-1), V_y(k-1), V_z(k-1) are the components of the carrier velocity on the navigation-frame X, Y, Z axes at time k-1; L(k-1) and h(k-1) are the latitude and height of the carrier at time k-1; R_M and R_N are the meridian and prime-vertical radii of curvature of the earth; ω_ie is the rotational angular velocity of the earth;
The velocity is predicted using the following equation:

V^n(k) = V^n(k-1) + ( C^n_b(k)·â^b_ib(k) − [0, 0, g]^T )·ΔT

In the above formula, â^b_ib(k) denotes the estimates of the components of the acceleration of the body frame relative to the inertial frame on the body-frame X, Y, Z axes; V^n(k) is the component of the carrier velocity on the navigation-frame X, Y, Z axes at time k; g is the gravitational acceleration;
The position is predicted using the following formula:

λ(k) = λ(k-1) + ΔT·V_x(k) / ( (R_N + h(k-1))·cosL(k-1) )
L(k) = L(k-1) + ΔT·V_y(k) / ( R_M + h(k-1) )
h(k) = h(k-1) + ΔT·V_z(k)

In the above formula, λ(k), L(k), h(k) are the longitude, latitude and height at time k, and λ(k-1), L(k-1), h(k-1) are the longitude, latitude and height at time k-1.
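A minimal sketch of the first-order quaternion propagation used for the attitude prediction of step 3 follows; the Ω matrix convention is the standard one for q = [q_0, q_1, q_2, q_3] and an assumption of this sketch:

```python
import numpy as np

def quat_update(q, w_nb_b, dt):
    """First-order quaternion propagation q(k) = (I + dt/2 * Omega(w)) q(k-1),
    where w is the body rate relative to the navigation frame."""
    wx, wy, wz = w_nb_b
    Omega = np.array([[0.0, -wx, -wy, -wz],
                      [wx,  0.0,  wz, -wy],
                      [wy, -wz,  0.0,  wx],
                      [wz,  wy, -wx,  0.0]])
    q_new = (np.eye(4) + 0.5 * dt * Omega) @ q
    return q_new / np.linalg.norm(q_new)   # re-normalise to unit length
```

The re-normalisation keeps the quaternion on the unit sphere, compensating the truncation error of the first-order discretization.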
In this embodiment, step 4 is implemented by using the following preferred scheme:
The zero-velocity instants are judged using a three-condition method:

C1 = 1 if th_amin < ‖a^b_ib(k)‖ < th_amax, otherwise C1 = 0
C2 = 1 if ‖ω^b_ib(k)‖ < th_ωmax, otherwise C2 = 0
C3 = 1 if σ²_a(k) < th_σ, otherwise C3 = 0

where a^b_ib(k) and ω^b_ib(k) are the components of the acceleration and angular velocity of the body frame relative to the inertial frame on the body-frame X, Y, Z axes at time k; th_amin, th_amax, th_ωmax and th_σ are the judgment thresholds; σ²_a(k) is the variance of the acceleration magnitude over a sliding window w centred at time k, the overbar in its definition denoting the averaging operation.

The final zero-velocity detection result is ZUPT(k) = C1 & C2 & C3, where & denotes the AND operation; ZUPT(k) = 1 indicates a zero-velocity state and ZUPT(k) = 0 a non-zero-velocity state.
The process of correcting the predicted speed and position information by the kalman filter is as follows:
(A) calculating a one-step predicted mean square error:
P(k|k-1) = A(k,k-1)·P(k-1|k-1)·A(k,k-1)^T + G(k-1)·W(k-1)·G(k-1)^T

In the above formula, A(k,k-1) is the one-step transition matrix of the filter from time k-1 to time k:

A(k,k-1) = [ I_3×3   ΔT·I_3×3   0_3×3
             0_3×3   I_3×3      S(f^n(k))·ΔT
             0_3×3   0_3×3      I_3×3 ]

where I_3×3 is the 3×3 identity matrix, 0_3×3 the 3×3 zero matrix, ΔT the sampling period, and S(f^n(k)) the skew-symmetric matrix of f^n(k), the component of the acceleration of the body frame relative to the navigation frame on the navigation-frame X, Y, Z axes at time k;

P(k-1|k-1) is the state-estimation mean square error at time k-1, and P(k|k-1) the one-step prediction mean square error from time k-1 to time k;

G(k-1) is the noise coefficient matrix of the filter at time k-1, constructed from the attitude transition matrix C^n_b(k-1); W(k-1) = [ε_rx ε_ry ε_rz ε_ax ε_ay ε_az]^T is the state noise at time k-1, where ε_rx, ε_ry, ε_rz are the model noises of ω^b_x(k), ω^b_y(k), ω^b_z(k), the components of the angular velocity of the body frame relative to the inertial frame on the body-frame X, Y, Z axes at time k, and ε_ax, ε_ay, ε_az are the model noises of a^b_x(k), a^b_y(k), a^b_z(k), the components of the acceleration of the body frame relative to the inertial frame on the body-frame X, Y, Z axes at time k; the superscript T denotes matrix transposition;
(B) calculating the filtering gain of a Kalman filter at the k moment:
K(k)=P(k|k-1)H(k)T[H(k)P(k|k-1)H(k)T+R(k)]-1
In the above formula, K(k) is the filter gain at time k; R(k) = diag([ε_vx ε_vy ε_vz]²) is the measurement noise at time k, where diag denotes matrix diagonalisation and ε_vx, ε_vy, ε_vz are the measurement noises of V_x(k), V_y(k), V_z(k), the components of the carrier velocity on the navigation-frame X, Y, Z axes; H(k) is the measurement matrix at time k, H(k) = [0_3×3 I_3×3 0_3×3];
(C) Calculating the Kalman filter state estimate at time k:

x̂(k) = x̂(k|k-1) + K(k)·( y(k) − H(k)·x̂(k|k-1) )

In the above formula, x̂(k) is the estimate of the state quantity at time k and x̂(k|k-1) the one-step prediction of the state variables from time k-1 to time k; the state involves L, λ, h (latitude, longitude and height) and θ, γ, ψ (pitch angle, roll angle and yaw angle); y(k) = [0 0 0]^T is the measurement at time k;
(D) calculating an estimated mean square error of a Kalman filter at the k moment:
P(k|k)=[I-K(k)H(k)]P(k|k-1)
in the above equation, P (k | k) is the estimated mean square error at time k, and I is the identity matrix.
In this embodiment, step 5 is implemented by the following preferred scheme:
501. Scene association is performed as follows. SR(k) denotes the scene recognition result of the k-th gait: 0 indicates the pedestrian's gait is flat-ground walking, 1 indicates ascending stairs and 2 indicates descending stairs, "|" denoting "or". When SR(k) = 1 | 2, i.e. the pedestrian is on the stairs, acquire the position coordinate in the database closest to the pedestrian's current position and select that point as the associated stair position point;
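The stair-point association described above, finding the database coordinate nearest to the current position, can be sketched as:

```python
import numpy as np

def nearest_stair(position, stair_db):
    """Return the stair coordinate in the building database closest to the
    current pedestrian position (planar coordinates in this sketch)."""
    d = np.linalg.norm(stair_db - position, axis=1)   # distance to each stair point
    return stair_db[np.argmin(d)]
```

For a handful of stair landings per floor a brute-force search is adequate; a spatial index would only matter for very large databases.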
502. Position correction is performed by Kalman filtering:
(502a) calculating a one-step predicted mean square error:
P(k|k-1) = A(k,k-1)·P(k-1|k-1)·A(k,k-1)^T + G(k-1)·W(k-1)·G(k-1)^T

In the above formula, A(k,k-1) is the one-step transition matrix of the filter from time k-1 to time k:

A(k,k-1) = [ I_3×3   ΔT·I_3×3   0_3×3
             0_3×3   I_3×3      S(f^n(k))·ΔT
             0_3×3   0_3×3      I_3×3 ]

where I_3×3 is the 3×3 identity matrix, 0_3×3 the 3×3 zero matrix, ΔT the sampling period, and S(f^n(k)) the skew-symmetric matrix of f^n(k), the component of the acceleration of the body frame relative to the navigation frame on the navigation-frame X, Y, Z axes at time k;

P(k-1|k-1) is the state-estimation mean square error at time k-1, and P(k|k-1) the one-step prediction mean square error from time k-1 to time k;

G(k-1) is the noise coefficient matrix of the filter at time k-1, constructed from the attitude transition matrix C^n_b(k-1); W(k-1) = [ε_rx ε_ry ε_rz ε_ax ε_ay ε_az]^T is the state noise at time k-1, where ε_rx, ε_ry, ε_rz are the model noises of ω^b_x(k), ω^b_y(k), ω^b_z(k), the components of the angular velocity of the body frame relative to the inertial frame on the body-frame X, Y, Z axes at time k, and ε_ax, ε_ay, ε_az are the model noises of a^b_x(k), a^b_y(k), a^b_z(k), the components of the acceleration of the body frame relative to the inertial frame on the body-frame X, Y, Z axes at time k;
(502b) calculating the filtering gain of a Kalman filter at the k moment:
K(k)=P(k|k-1)H(k)T[H(k)P(k|k-1)H(k)T+R(k)]-1
In the above formula, K(k) is the filter gain at time k; R(k) = diag([ε_L ε_λ]²) is the measurement noise at time k, where diag denotes matrix diagonalisation and ε_L, ε_λ are the noises in latitude and longitude respectively; H(k) is the measurement matrix at time k, H(k) = [I_2×2 0_2×3 0_2×2 0_2×2], where I_2×2 is the 2×2 identity matrix, 0_2×3 the 2×3 zero matrix and 0_2×2 the 2×2 zero matrix;
(502c) calculating an estimated mean square error of the k-time extended Kalman filter:
P(k|k)=[I-K(k)H(k)]P(k|k-1)
in the above equation, P (k | k) is the estimated mean square error at time k, and I is the identity matrix.
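The position-correction update of step 502, with the 2-row measurement matrix H(k) = [I_2×2 0_2×3 0_2×2 0_2×2] picking the latitude-longitude components, can be sketched as follows. The position-first ordering of the state vector is an assumption of this sketch, consistent with that H:

```python
import numpy as np

def stair_position_update(x_pred, P_pred, stair_ll, R):
    """Map-aided position fix: measure (latitude, longitude) against the
    associated stair point; H picks the first two state components."""
    H = np.hstack([np.eye(2), np.zeros((2, 7))])             # H(k), 2 x 9
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)   # gain K(k)
    x_est = x_pred + K @ (stair_ll - H @ x_pred)             # state estimate
    P_est = (np.eye(9) - K @ H) @ P_pred                     # P(k|k)
    return x_est, P_est
```

With a small R, reflecting high confidence in the mapped stair coordinates, the estimated position is pulled almost onto the stair point.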
In a practical experiment, the inertial device used was the MTw Awinda. The pedestrian walked around the corridor of the experimental site, covering flat-ground walking, going upstairs and going downstairs, with a total mileage of 1100 m. In the same environment, the trajectory was solved with the traditional navigation method and with the proposed method; the results are shown in Fig. 2 and Fig. 3 respectively, and the experimental accuracy is given in Table 1.
TABLE 1

| Experimental result | Conventional method | Method of the invention |
| --- | --- | --- |
| Maximum position error | 12 m | 1.9 m |
| Course drift | 13° | 1° |
| Positioning accuracy | 1.1% | 0.2% |
The above embodiment merely illustrates the technical idea of the present invention and does not limit its scope of protection; any modification made on the basis of the technical scheme according to the technical idea of the present invention falls within the scope of the present invention.
Claims (6)
1. A pedestrian navigation method based on intelligent identification of a building stair scene is characterized by comprising the following steps:
(1) mounting an inertial sensor on the pedestrian's foot, having the pedestrian walk multiple times in the in-building scenes, including a flat-ground walking scene, a stair-ascending scene and a stair-descending scene, and establishing a scene recognition model based on gait features;
the specific process of establishing the gait feature-based scene recognition model is as follows:
(101) recording the jth gait of the pedestrian as T_j; using the pedestrian's gait characteristics, dividing the single gait T_j into an active data segment T_j1 and an inactive data segment T_j2, and taking T_j1 as the value range for extracting the gait features;
(102) recording the inertial sample data collected for the jth gait, where N is the number of samples, and constructing the feature vector T = (acc_xmin, acc_ysk, acc_zmax, acc_max, gyro_xmean, gyro_yvar, gyro_zmax, gyro_zmean, gyro_zsk, gyro_zvar, h_diff, V_max), in which:
acc_x^b(jn), acc_y^b(jn) and acc_z^b(jn) are the components of the acceleration of the body frame relative to the inertial frame on the body-frame X, Y, Z axes at the nth instant of the jth gait; ω_x^b(jn), ω_y^b(jn) and ω_z^b(jn) are the components of the angular velocity of the body frame relative to the inertial frame on the body-frame X, Y, Z axes at the nth instant of the jth gait; V^n(jn) is the speed in the navigation frame at the nth instant of the jth gait; n = 1, 2, …, N; h(j) is the initial height of the jth gait in the navigation frame; E denotes the mathematical expectation;
(103) training a scene recognition model by adopting a random forest algorithm;
(2) when a pedestrian walks in a building, acquiring inertial sensing data comprising acceleration information and gyroscope information;
(3) predicting the walking posture, speed and position of the pedestrian based on an inertial navigation algorithm;
(4) judging, based on the inertial sensing data, whether the foot is in the zero-velocity state; if so, correcting the predicted velocity and position information through a Kalman filter; otherwise, proceeding directly to step (5);
(5) judging the pedestrian's walking scene in the building based on the inertial sensing data and the scene recognition model established in step (1); if the pedestrian is going up or down stairs, correcting the pedestrian's position through Kalman filtering based on the stair positions in the building map; otherwise, returning to step (2).
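The feature vector of step (102) can be sketched as a small NumPy routine. The exact statistic definitions, such as taking h_diff as the height change across the gait and the sample-skewness formula, are assumptions made for illustration:

```python
import numpy as np

def skewness(x):
    # sample skewness: E[(x - mean)^3] / std^3, zero for a constant signal
    m, s = x.mean(), x.std()
    return 0.0 if s == 0 else ((x - m) ** 3).mean() / s ** 3

def gait_features(acc, gyro, height, speed):
    """12-element feature vector for one gait cycle (active segment T_j1).

    acc, gyro : (N, 3) body-frame acceleration / angular-rate samples
    height    : (N,) height trace in the navigation frame
    speed     : (N,) speed magnitude in the navigation frame
    """
    return np.array([
        acc[:, 0].min(),                    # acc_xmin
        skewness(acc[:, 1]),                # acc_ysk
        acc[:, 2].max(),                    # acc_zmax
        np.linalg.norm(acc, axis=1).max(),  # acc_max
        gyro[:, 0].mean(),                  # gyro_xmean
        gyro[:, 1].var(),                   # gyro_yvar
        gyro[:, 2].max(),                   # gyro_zmax
        gyro[:, 2].mean(),                  # gyro_zmean
        skewness(gyro[:, 2]),               # gyro_zsk
        gyro[:, 2].var(),                   # gyro_zvar
        height[-1] - height[0],             # h_diff: height change over the gait
        speed.max(),                        # V_max
    ])
```

A height change near zero with moderate V_max then points at flat-ground walking, while a clearly positive or negative h_diff separates stair ascent from descent.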
2. The pedestrian navigation method based on intelligent identification of the building stair scene as claimed in claim 1, wherein the specific process of the step (103) is as follows:
(a) generating a training set for each decision tree by using bootstrap sampling:
denoting the training set D = (X_1, Y_1), where X_1 is the collected gait features and Y_1 is the label corresponding to each feature, randomly selecting from the data set D n training samples (D_1, D_2, …, D_n) of the same size as D, and using each training sample D_i to train and construct a decision tree;
(b) constructing a decision tree:
for each training sample D_i, randomly extracting, without replacement, L attributes from the full attribute set S to form the attribute set A_i of that decision tree, and training a decision tree with the decision-tree training model;
(c) generating a random forest:
marking the flat-ground walking scene as 0, the stair-ascending scene as 1 and the stair-descending scene as 2, repeating step (b) to train a number of decision trees, and thereby obtaining the scene recognition model.
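Steps (a)-(c) can be sketched with a numpy-only miniature: bootstrap sampling, a random attribute subset per tree, and depth-1 "stump" trees standing in for full decision trees (an intentional simplification; the patent trains full decision trees):

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_stump(X, y):
    """Best single-feature threshold split with majority-vote leaves (a depth-1 tree)."""
    best, best_err = None, np.inf
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            mask = X[:, f] <= t
            left, right = y[mask], y[~mask]
            # misclassification count with majority-vote leaves
            err = sum(len(p) - np.bincount(p).max() for p in (left, right) if len(p))
            if err < best_err:
                lbl_l = int(np.bincount(left).argmax()) if len(left) else 0
                lbl_r = int(np.bincount(right).argmax()) if len(right) else lbl_l
                best, best_err = (f, t, lbl_l, lbl_r), err
    return best

def fit_forest(X, y, n_trees=25, n_feats=1):
    """(a) bootstrap sample D_i, (b) random attribute subset A_i + one tree, (c) ensemble."""
    forest = []
    for _ in range(n_trees):
        idx = rng.integers(0, len(X), len(X))                        # bootstrap sample D_i
        feats = rng.choice(X.shape[1], size=n_feats, replace=False)  # attribute set A_i
        f, t, lbl_l, lbl_r = fit_stump(X[idx][:, feats], y[idx])
        forest.append((int(feats[f]), t, lbl_l, lbl_r))
    return forest

def predict_forest(forest, x):
    """Majority vote over the trees; labels: 0 flat ground, 1 up stairs, 2 down stairs."""
    votes = [lbl_l if x[f] <= t else lbl_r for f, t, lbl_l, lbl_r in forest]
    return int(np.bincount(votes).argmax())
```

In practice a library implementation with full trees (e.g. scikit-learn's RandomForestClassifier) would replace the stumps; the bootstrap-plus-feature-subset structure is the same.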
3. The pedestrian navigation method based on intelligent identification of the building stair scene as claimed in claim 1, wherein in step (3) the attitude is predicted by the following formula:

[q_0(k) q_1(k) q_2(k) q_3(k)]^T = { I_{4×4} + (Δt/2)·M(ω_nb^b(k)) } · [q_0(k−1) q_1(k−1) q_2(k−1) q_3(k−1)]^T

with

M(ω) = [[0, −ω_x, −ω_y, −ω_z], [ω_x, 0, ω_z, −ω_y], [ω_y, −ω_z, 0, ω_x], [ω_z, ω_y, −ω_x, 0]],  ω_nb^b(k) = ω_ib^b(k) − (C_b^n(k−1))^T · ω_in^n(k−1)

wherein q_0(k), q_1(k), q_2(k), q_3(k) are the attitude quaternion at time k and q_0(k−1), q_1(k−1), q_2(k−1), q_3(k−1) are the attitude quaternion at time k−1; ω_ib^b(k) is the angular velocity of the body frame relative to the inertial frame at time k; ω_x, ω_y, ω_z are the components of the angular velocity ω_nb^b(k) of the body frame relative to the navigation frame at time k on the body-frame X, Y, Z axes; Δt is the sampling period; C_b^n(k−1) is the attitude transition matrix at time k−1; ω_in^n(k−1) is the angular velocity of the navigation frame relative to the inertial frame at time k−1, whose components on the navigation-frame X, Y, Z axes are

ω_in^n(k−1) = [ −v_y^n(k−1)/(R_M + h(k−1)),  ω_ie·cos L(k−1) + v_x^n(k−1)/(R_N + h(k−1)),  ω_ie·sin L(k−1) + v_x^n(k−1)·tan L(k−1)/(R_N + h(k−1)) ]^T;

v_x^n(k−1), v_y^n(k−1), v_z^n(k−1) are the components of the carrier velocity on the navigation-frame X, Y, Z axes at time k−1; L(k−1) and h(k−1) are the latitude and height of the carrier at time k−1; R_M and R_N are the meridian and prime-vertical radii of curvature of the earth; ω_ie is the rotational angular velocity of the earth;
the velocity is predicted using the following equation:

v^n(k) = v^n(k−1) + [ C_b^n(k)·â^b(k) − g^n ]·Δt,  g^n = [0 0 g]^T

In the above formula, â_x^b(k), â_y^b(k), â_z^b(k) are the estimates of the components of the acceleration of the body frame relative to the inertial frame on the body-frame X, Y, Z axes; v_x^n(k), v_y^n(k), v_z^n(k) are the components of the carrier velocity on the navigation-frame X, Y, Z axes at time k; g is the gravitational acceleration;
the position is predicted using the following formula:

λ(k) = λ(k−1) + v_x^n(k)·Δt / ((R_N + h(k−1))·cos L(k−1))
L(k) = L(k−1) + v_y^n(k)·Δt / (R_M + h(k−1))
h(k) = h(k−1) + v_z^n(k)·Δt

In the above formula, λ(k), L(k), h(k) are the longitude, latitude and height at time k, and λ(k−1), L(k−1), h(k−1) are the longitude, latitude and height at time k−1.
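The velocity and position prediction of claim 3 can be sketched as follows, assuming an East-North-Up navigation frame and a spherical-earth shortcut for R_M and R_N (both assumptions made for illustration):

```python
import numpy as np

def predict_vel_pos(v, lam, L, h, C_bn, f_b, dt, g=9.8):
    """One-sample velocity and position prediction (claim 3, ENU assumed).

    v    : (3,) velocity in the navigation frame (East, North, Up)
    lam, L, h : longitude (rad), latitude (rad), height (m)
    C_bn : (3, 3) body-to-navigation attitude matrix C_b^n
    f_b  : (3,) specific force measured in the body frame
    dt   : sampling period
    """
    RM = RN = 6378137.0  # spherical-earth shortcut for the two curvature radii
    # v^n(k) = v^n(k-1) + [C_b^n f^b - g^n] dt
    v_new = v + (C_bn @ f_b - np.array([0.0, 0.0, g])) * dt
    # longitude, latitude and height updates
    lam_new = lam + v_new[0] * dt / ((RN + h) * np.cos(L))
    L_new = L + v_new[1] * dt / (RM + h)
    h_new = h + v_new[2] * dt
    return v_new, lam_new, L_new, h_new
```

With the foot at rest and the specific force cancelling gravity, the state is unchanged, which is exactly the situation the zero-velocity detector of claim 4 exploits.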
4. The pedestrian navigation method based on intelligent identification of the building stair scene as claimed in claim 1, wherein in step (4) the method for judging whether the foot is in the zero-velocity state is as follows:
the zero-velocity moment is judged by a three-condition decision method:

C1 = 1 if th_amin < |a^b(k)| < th_amax, otherwise C1 = 0
C2 = 1 if (1/w)·Σ_{i=k−w+1}^{k} ( |a^b(i)| − ā(k) )² < th_σ, otherwise C2 = 0
C3 = 1 if |ω^b(k)| < th_ω, otherwise C3 = 0

wherein a_x^b(k), a_y^b(k), a_z^b(k) are the components of the acceleration of the body frame relative to the inertial frame at time k on the body-frame X, Y, Z axes, and ω_x^b(k), ω_y^b(k), ω_z^b(k) are the components of the angular velocity of the body frame relative to the inertial frame at time k on the body-frame X, Y, Z axes; th_amin, th_amax, th_σ and th_ω are the decision thresholds, w is the sliding window, and the overbar denotes the averaging operation;
the final zero-velocity detection result is ZUPT(k) = C1 & C2 & C3, where & denotes the AND operation; ZUPT(k) = 1 indicates the zero-velocity state and ZUPT(k) = 0 indicates the non-zero-velocity state.
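The three-condition zero-velocity test of claim 4 might look like the following sketch; the threshold values and the exact window statistics are illustrative assumptions:

```python
import numpy as np

def zupt_detect(acc, gyro, k, w=5,
                th_amin=9.0, th_amax=11.0, th_var=0.5, th_w=0.6):
    """Three-condition zero-velocity test at sample k (thresholds illustrative).

    acc, gyro : (N, 3) body-frame acceleration / angular-rate samples
    w         : sliding-window half-width for the local statistics
    """
    lo, hi = max(0, k - w), min(len(acc), k + w + 1)
    a_mag = np.linalg.norm(acc[lo:hi], axis=1)
    # C1: acceleration magnitude inside the stance band
    C1 = th_amin < np.linalg.norm(acc[k]) < th_amax
    # C2: windowed acceleration variance below threshold (overbar = window average)
    C2 = np.mean((a_mag - a_mag.mean()) ** 2) < th_var
    # C3: angular-rate magnitude below threshold
    C3 = np.linalg.norm(gyro[k]) < th_w
    return int(C1 and C2 and C3)  # ZUPT(k) = C1 & C2 & C3
```

During the stance phase the acceleration magnitude sits near gravity with low variance and the angular rate is small, so all three conditions fire together.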
5. The pedestrian navigation method based on intelligent identification of the building stair scene as claimed in claim 1, wherein in step (4) the process of correcting the predicted velocity and position information through the Kalman filter is as follows:
(A) calculating a one-step predicted mean square error:
P(k|k-1)=A(k,k-1)P(k-1|k-1)A(k,k-1)T+G(k-1)W(k-1)G(k-1)T
In the above formula, A(k, k−1) is the filter one-step transfer matrix from time k−1 to time k, built from 3×3 blocks in which I_{3×3} is a 3×3 identity matrix and 0_{3×3} is a 3×3 zero matrix; ΔT is the sampling period; a_x^n(k), a_y^n(k), a_z^n(k) are the components of the acceleration of the carrier relative to the navigation frame at time k on the navigation-frame X, Y, Z axes;
P(k−1|k−1) is the state estimation mean square error at time k−1, and P(k|k−1) is the one-step prediction mean square error from time k−1 to time k;
G(k−1) is the filter noise coefficient matrix at time k−1, built from the attitude transition matrix C_b^n(k−1); W(k−1) = [ε_rx ε_ry ε_rz ε_ax ε_ay ε_az]^T is the state noise at time k−1, where ε_rx, ε_ry and ε_rz are respectively the model noise of ω_x^b(k), ω_y^b(k) and ω_z^b(k), and ε_ax, ε_ay and ε_az are respectively the model noise of a_x^b(k), a_y^b(k) and a_z^b(k); ω_x^b(k), ω_y^b(k) and ω_z^b(k) are the components of the angular velocity of the body frame relative to the inertial frame at time k on the body-frame X, Y, Z axes, and a_x^b(k), a_y^b(k) and a_z^b(k) are the components of the acceleration of the body frame relative to the inertial frame at time k on the body-frame X, Y, Z axes; the superscript T denotes matrix transposition;
(B) calculating the filter gain of the Kalman filter at time k:
K(k)=P(k|k-1)H(k)T[H(k)P(k|k-1)H(k)T+R(k)]-1
In the above formula, K(k) is the filter gain at time k; R(k) = diag([ε_vx ε_vy ε_vz]²) is the measurement noise at time k, where diag denotes matrix diagonalization and ε_vx, ε_vy, ε_vz are the noise of the carrier velocity components v_x^n(k), v_y^n(k), v_z^n(k) on the navigation-frame X, Y, Z axes; H(k) is the measurement matrix at time k, H(k) = [0_{3×3} I_{3×3} 0_{3×3}];
(C) calculating the Kalman filter state estimation value at time k:

X̂(k) = X(k|k−1) + K(k)·[ Y(k) − H(k)·X(k|k−1) ]

In the above formula, X̂(k) is the estimate of the state quantity at time k, and X(k|k−1) is the one-step prediction of the state quantity from time k−1 to time k; L, λ and h are the latitude, longitude and height, and θ, γ and ψ are the pitch angle, roll angle and yaw angle; Y(k) = [0 0 0]^T is the measured value at time k;
(D) calculating the estimated mean square error of the Kalman filter at time k:
P(k|k)=[I-K(k)H(k)]P(k|k-1)
in the above equation, P (k | k) is the estimated mean square error at time k, and I is the identity matrix.
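The one-step mean-square-error prediction of step (A) can be sketched as below; the block form of A(k, k−1) is a common simplified 9-state [attitude, velocity, position] error model, an assumption standing in for the patent's exact matrix:

```python
import numpy as np

def one_step_transition(f_n, dt):
    """Simplified one-step transfer matrix A(k, k-1) for a 9-state
    [attitude, velocity, position] error filter (an illustrative block form)."""
    I, Z = np.eye(3), np.zeros((3, 3))
    # skew-symmetric matrix of the specific force in the navigation frame
    fx = np.array([[0.0, -f_n[2], f_n[1]],
                   [f_n[2], 0.0, -f_n[0]],
                   [-f_n[1], f_n[0], 0.0]])
    return np.block([[I,       Z,      Z],
                     [fx * dt, I,      Z],
                     [Z,       I * dt, I]])

def predict_msq_error(P, A, G, Q):
    # step (A): P(k|k-1) = A P(k-1|k-1) A^T + G Q G^T,
    # with Q the covariance of the state noise W(k-1)
    return A @ P @ A.T + G @ Q @ G.T
```

Gyro and accelerometer noise enter through G into the attitude and velocity error states, which is why G has six input columns for the six components of W(k−1).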
6. The pedestrian navigation method based on intelligent identification of the building stair scene as claimed in claim 1, wherein the specific process of the step (5) is as follows:
(501) the scene association is performed using the following equation:
In the above formula, SR(k) is the scene recognition result of the kth gait: 0 indicates that the pedestrian's gait is flat-ground walking, 1 indicates ascending stairs, 2 indicates descending stairs, and "|" denotes "or";
acquiring the position coordinate in the database closest to the pedestrian's current position and selecting it as the associated stair position point;
(502) performing position correction by Kalman filtering:
(502a) calculating a one-step predicted mean square error:
P(k|k-1)=A(k,k-1)P(k-1|k-1)A(k,k-1)T+G(k-1)W(k-1)G(k-1)T
In the above formula, A(k, k−1) is the filter one-step transfer matrix from time k−1 to time k, built from 3×3 blocks in which I_{3×3} is a 3×3 identity matrix and 0_{3×3} is a 3×3 zero matrix; ΔT is the sampling period; a_x^n(k), a_y^n(k), a_z^n(k) are the components of the acceleration of the carrier relative to the navigation frame at time k on the navigation-frame X, Y, Z axes;
P(k−1|k−1) is the state estimation mean square error at time k−1, and P(k|k−1) is the one-step prediction mean square error from time k−1 to time k;
G(k−1) is the filter noise coefficient matrix at time k−1, built from the attitude transition matrix C_b^n(k−1); W(k−1) = [ε_rx ε_ry ε_rz ε_ax ε_ay ε_az]^T is the state noise at time k−1, where ε_rx, ε_ry and ε_rz are respectively the model noise of ω_x^b(k), ω_y^b(k) and ω_z^b(k), and ε_ax, ε_ay and ε_az are respectively the model noise of a_x^b(k), a_y^b(k) and a_z^b(k); ω_x^b(k), ω_y^b(k) and ω_z^b(k) are the components of the angular velocity of the body frame relative to the inertial frame at time k on the body-frame X, Y, Z axes, and a_x^b(k), a_y^b(k) and a_z^b(k) are the components of the acceleration of the body frame relative to the inertial frame at time k on the body-frame X, Y, Z axes;
(502b) calculating the filter gain of the Kalman filter at time k:
K(k)=P(k|k-1)H(k)T[H(k)P(k|k-1)H(k)T+R(k)]-1
In the above formula, K(k) is the filter gain at time k; R(k) = diag([ε_L ε_λ]²) is the measurement noise at time k, where diag denotes matrix diagonalization and ε_L, ε_λ are the noise in latitude and longitude respectively; H(k) is the measurement matrix at time k, H(k) = [I_{2×2} 0_{2×3} 0_{2×2} 0_{2×2}], where I_{2×2} is a 2×2 identity matrix, 0_{2×3} is a 2×3 zero matrix and 0_{2×2} is a 2×2 zero matrix;
(502c) calculating the estimated mean square error of the extended Kalman filter at time k:
P(k|k)=[I-K(k)H(k)]P(k|k-1)
in the above equation, P (k | k) is the estimated mean square error at time k, and I is the identity matrix.
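The stair-point association of step (501) reduces to a nearest-neighbour lookup in the stair database; a minimal sketch, in which the function name and the use of 2-D horizontal coordinates are assumptions:

```python
import numpy as np

def associate_stair(pos, stair_points):
    """Step (501): return the stair position in the map database closest to the
    pedestrian's current horizontal position (both as 2-D coordinates)."""
    d = np.linalg.norm(stair_points - pos, axis=1)  # distance to every stair point
    return stair_points[np.argmin(d)]
```

The returned coordinate then serves as the measurement Y(k) in the position-correction filter of step (502).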
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910422538.4A CN110207704B (en) | 2019-05-21 | 2019-05-21 | Pedestrian navigation method based on intelligent identification of building stair scene |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110207704A CN110207704A (en) | 2019-09-06 |
CN110207704B true CN110207704B (en) | 2021-07-13 |
Family
ID=67787920
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910422538.4A Active CN110207704B (en) | 2019-05-21 | 2019-05-21 | Pedestrian navigation method based on intelligent identification of building stair scene |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110207704B (en) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111076718B (en) * | 2019-12-18 | 2021-01-15 | 中铁电气化局集团有限公司 | Autonomous navigation positioning method for subway train |
CN111368749B (en) * | 2020-03-06 | 2023-06-13 | 创新奇智(广州)科技有限公司 | Automatic identification method and system for stair area |
CN111649742B (en) * | 2020-05-08 | 2022-02-08 | 北京航空航天大学 | Elevation estimation method based on ANFIS assistance |
CN111596325A (en) * | 2020-05-09 | 2020-08-28 | 北京东方海龙消防科技有限公司 | Positioning search and rescue system |
CN112418649B (en) * | 2020-11-19 | 2022-11-11 | 东南大学 | Building stair pedestrian flow estimation system based on multi-dimensional MEMS inertial sensor |
CN113295158B (en) * | 2021-05-14 | 2024-05-14 | 江苏大学 | Indoor positioning method integrating inertial data, map information and pedestrian motion state |
CN113720332B (en) * | 2021-06-30 | 2022-06-07 | 北京航空航天大学 | Floor autonomous identification method based on floor height model |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105628027A (en) * | 2016-02-19 | 2016-06-01 | 中国矿业大学 | Indoor environment precise real-time positioning method based on MEMS inertial device |
CN106017461A (en) * | 2016-05-19 | 2016-10-12 | 北京理工大学 | Pedestrian navigation system three-dimensional spatial positioning method based on human/environment constraints |
CN106482733A (en) * | 2016-09-23 | 2017-03-08 | 南昌大学 | Zero velocity update method based on plantar pressure detection in pedestrian navigation |
CN106908060A (en) * | 2017-02-15 | 2017-06-30 | 东南大学 | A kind of high accuracy indoor orientation method based on MEMS inertial sensor |
CN108759814A (en) * | 2018-04-13 | 2018-11-06 | 南京航空航天大学 | A kind of quadrotor roll axis angular rate and pitch axis Attitude rate estimator method |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102060723B1 (en) * | 2013-01-18 | 2019-12-31 | 삼성전자주식회사 | Apparatus for recognizing feature of user, method for establishing basis matrix based on orthogonal non-negative matrix factorization and method for establishing basis matrix based on orthogonal semi-supervised non-negative matrix factorization |
CN109298389B (en) * | 2018-08-29 | 2022-09-23 | 东南大学 | Indoor pedestrian combination pose estimation method based on multi-particle swarm optimization |
Non-Patent Citations (2)
Title |
---|
Li Chenxiang (李辰祥), "Research on Zero-Velocity Update Technology for MEMS-Based Pedestrian Inertial Navigation", China Master's Theses Full-text Database, Information Science and Technology, 2014, pp. I136-605. *
Zuo Desheng (左德胜), "Research on Pedestrian Autonomous Navigation Algorithms", China Master's Theses Full-text Database, Information Science and Technology, 2019-01-15, pp. I136-1955. *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110207704B (en) | Pedestrian navigation method based on intelligent identification of building stair scene | |
CN110118549B (en) | Multi-source information fusion positioning method and device | |
US10267646B2 (en) | Method and system for varying step length estimation using nonlinear system identification | |
CN110553643B (en) | Pedestrian self-adaptive zero-speed updating point selection method based on neural network | |
CN108225304A (en) | Based on method for rapidly positioning and system in Multiple Source Sensor room | |
CN104061934A (en) | Pedestrian indoor position tracking method based on inertial sensor | |
CN108537101B (en) | Pedestrian positioning method based on state recognition | |
JP6462692B2 (en) | Autonomous mobile device | |
CN103968827A (en) | Wearable human body gait detection self-localization method | |
CN108426582B (en) | Indoor three-dimensional map matching method for pedestrians | |
EP3908805A1 (en) | Method and system for tracking a mobile device | |
Deng et al. | Foot-mounted pedestrian navigation method based on gait classification for three-dimensional positioning | |
CN111174781A (en) | Inertial navigation positioning method based on wearable device combined target detection | |
CN109459028A (en) | A kind of adaptive step estimation method based on gradient decline | |
CN108180923A (en) | A kind of inertial navigation localization method based on human body odometer | |
CN113188557A (en) | Visual inertial integrated navigation method fusing semantic features | |
Xia et al. | Autonomous pedestrian altitude estimation inside a multi-story building assisted by motion recognition | |
CN116448103A (en) | Pedestrian foot binding type inertial navigation system error correction method based on UWB ranging assistance | |
CN113848878B (en) | Indoor and outdoor three-dimensional pedestrian road network construction method based on crowd source data | |
Lin et al. | LocMe: Human locomotion and map exploitation based indoor localization | |
CN112729282B (en) | Indoor positioning method integrating single anchor point ranging and pedestrian track calculation | |
CN114674317A (en) | Self-correcting dead reckoning system and method based on activity recognition and fusion filtering | |
CN110332936B (en) | Indoor motion trail navigation method based on multiple sensors | |
CN110766154B (en) | Pedestrian track inference method, device, equipment and storage medium | |
CN107958118A (en) | A kind of wireless signal acquiring method based on spatial relationship |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||