CN110207704B - Pedestrian navigation method based on intelligent identification of building stair scene - Google Patents

Pedestrian navigation method based on intelligent identification of building stair scene Download PDF

Info

Publication number
CN110207704B
CN110207704B CN201910422538.4A CN201910422538A CN110207704B CN 110207704 B CN110207704 B CN 110207704B CN 201910422538 A CN201910422538 A CN 201910422538A CN 110207704 B CN110207704 B CN 110207704B
Authority
CN
China
Prior art keywords
time
pedestrian
scene
moment
matrix
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910422538.4A
Other languages
Chinese (zh)
Other versions
CN110207704A (en
Inventor
朱超群
吕品
赖际舟
袁诚
叶素芬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Aeronautics and Astronautics
Original Assignee
Nanjing University of Aeronautics and Astronautics
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Aeronautics and Astronautics filed Critical Nanjing University of Aeronautics and Astronautics
Priority to CN201910422538.4A priority Critical patent/CN110207704B/en
Publication of CN110207704A publication Critical patent/CN110207704A/en
Application granted granted Critical
Publication of CN110207704B publication Critical patent/CN110207704B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • G01C21/206Instruments for performing navigational calculations specially adapted for indoor navigation

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)

Abstract

The invention discloses a pedestrian navigation method based on intelligent identification of a building stair scene. The method comprises the following steps: arranging an inertial sensor on the foot of a pedestrian, enabling the pedestrian to walk for multiple times in a scene in a building, wherein the scene comprises a flat ground walking scene, a stair climbing scene and a stair descending scene, and establishing a scene recognition model based on gait characteristics; when a pedestrian walks in a building, acquiring inertial sensing data comprising acceleration information and gyroscope information; predicting the walking posture, speed and position of the pedestrian; judging whether the foot is in a zero-speed state, and if so, correcting the predicted speed and position information through a Kalman filter; and judging the walking scene of the pedestrian in the building, and if the pedestrian is in the state of going up and down the stairs, correcting the position of the pedestrian through Kalman filtering based on the stair position in the building map. The invention improves the positioning precision of the pedestrian walking in the building with stairs.

Description

Pedestrian navigation method based on intelligent identification of building stair scene
Technical Field
The invention belongs to the technical field of pedestrian navigation, and particularly relates to a pedestrian navigation method based on a building stair scene.
Background
The pedestrian navigation system is an important branch in the navigation positioning field, has gained more and more attention of researchers in recent years, and is widely applied to the fields of rescue and emergency rescue, military operation and the like. Traditional pedestrian navigation mainly adopts GPS location technique, but the GPS signal has the signal loss phenomenon under indoor and urban environment, and civilian precision is relatively poor, can't satisfy people's indoor navigation demand. With the development of micro-electro-mechanical system (MEMS) technology, the advantages of small volume, low power consumption, light weight, portability and the like of the MEMS inertial measurement unit are gradually highlighted, and research on an indoor pedestrian navigation system based on the MEMS-IMU has also become a hotspot.
The inertial sensor has drift error accumulated by time, and is a main error source of course divergence of the pedestrian navigation position. Through the zero-speed correction algorithm, the speed error divergence can be restrained and the navigation precision can be improved. Due to poor observability of the position and course errors, the zero-speed correction algorithm cannot correct the position and course errors, and the track is diverged finally along with the accumulation of time.
Disclosure of Invention
In order to solve the technical problems mentioned in the background art, the invention provides a pedestrian navigation method based on intelligent identification of a building stair scene.
In order to achieve the technical purpose, the technical scheme of the invention is as follows:
a pedestrian navigation method based on intelligent identification of building stair scenes comprises the following steps:
(1) arranging an inertial sensor on the foot of a pedestrian, enabling the pedestrian to walk for multiple times in a scene in a building, wherein the scene comprises a flat ground walking scene, a stair climbing scene and a stair descending scene, and establishing a scene recognition model based on gait characteristics;
(2) when a pedestrian walks in a building, acquiring inertial sensing data comprising acceleration information and gyroscope information;
(3) predicting the walking posture, speed and position of the pedestrian based on an inertial navigation algorithm;
(4) judging whether the foot is in a zero-speed state or not based on the inertial sensing data, and if so, correcting the predicted speed and position information through a Kalman filter; otherwise, directly entering the step (5);
(5) judging the walking scene of the pedestrian in the building based on the inertial sensing data and the scene recognition model established in the step (1), and correcting the position of the pedestrian through Kalman filtering based on the stair position in the building map if the pedestrian is in the state of going up and down the stairs; otherwise, jumping to the step (2).
Further, in step (1), the specific process of establishing the gait feature-based scene recognition model is as follows:
(101) recording the j gait of the pedestrian as TjUsing the gait characteristics of the pedestrian to put the single gait TjDivided into active data segments Tj1And inactive data segment Tj2Taking Tj1As the characteristic value range of the gait;
(102) note the book
Figure BDA0002066475480000021
Acquiring inertia sample data for jth gait, wherein N is the number of samples, and acquiring a characteristic vector T ═ c (acc)xmin,accysk,acczmax,accmax,gyroxmean,gyroyvar,gyrozmax,gyrozmean,gyrozsk,gyrozvar,hdiff,Vmax) In the above formula:
Figure BDA0002066475480000022
Figure BDA0002066475480000023
Figure BDA0002066475480000031
Figure BDA0002066475480000032
Figure BDA0002066475480000033
wherein the content of the first and second substances,
Figure BDA0002066475480000034
the acceleration of the body system relative to the inertia system at the nth time in the jth gait is in the body system XY, Z components on the axis;
Figure BDA0002066475480000035
the amount of the acceleration of the body system relative to the inertia system on the axis of the body system X, Y, Z at the nth moment in the jth gait; vn(jn) is the speed of the navigation system at the nth time in the jth gait; n is 1,2, …, N; h (j) is the initial height of the jth gait in the navigation system; e represents a mathematical expectation;
(103) and training a scene recognition model by adopting a random forest algorithm.
Further, the specific process of step (103) is as follows:
(a) generating a training set for each decision tree by using bootstrap sampling:
the training set D is (X, Y), X is the acquired gait feature, Y is the label corresponding to each feature, and training samples (D) having the same size as D are randomly extracted from the data set D in a set-back manner (D, Y)1,D2,...,Dn) Using each training sample DiTraining and constructing a decision tree;
(b) constructing a decision tree:
for training sample DiExtracting L partial attributes from the total attribute S randomly and unreplaceably to form attribute set A of the decision treeiUsing the decision tree training model to train a decision tree;
(c) generating a random forest:
and (4) marking the walking scene of the flat ground as 0, marking the walking scene of the flat ground as 1, marking the walking scene of the flat ground as 2, repeating the step (b), and training to obtain a plurality of decision trees to obtain a scene recognition model.
Further, in step (3), the attitude is predicted using the following equation:
Figure BDA0002066475480000041
Figure BDA0002066475480000042
Figure BDA0002066475480000043
Figure BDA0002066475480000044
in the above formula:
Figure BDA0002066475480000045
Figure BDA0002066475480000046
Figure BDA0002066475480000047
wherein q is0(k)、q1(k)、q2(k)、q3(k) Is the attitude quaternion at time k, q0(k-1)、q1(k-1)、q2(k-1)、q3(k-1) is the attitude quaternion at the time of k-1,
Figure BDA0002066475480000048
the angular velocity of the machine relative to the inertial system at time k,
Figure BDA0002066475480000049
for the angular velocity of the machine system relative to the navigation system at time k
Figure BDA00020664754800000410
The component on the axis of the chassis X, Y, Z, at is the sample period,
Figure BDA00020664754800000411
is the attitude transition matrix at time k-1,
Figure BDA00020664754800000412
the component of the angular velocity of the navigation system with respect to the inertial system at time k-1 on the axis X, Y, Z of the navigation system,
Figure BDA00020664754800000413
is the component of the carrier velocity on the axis of the navigation system X, Y, Z at the moment k-1, L (k-1) and h (k-1) are the latitude and height of the carrier at the moment k-1, RM、RNRadius of meridian and unit of the earth, omegaieThe rotational angular velocity of the earth;
the velocity is predicted using the following equation:
Figure BDA0002066475480000051
Figure BDA0002066475480000052
Figure BDA0002066475480000053
in the above formula, the first step is,
Figure BDA0002066475480000054
respectively estimates of the components of the acceleration of the body frame relative to the inertial frame on the axis of the body frame X, Y, Z,
Figure BDA0002066475480000055
the component of the carrier velocity on the axis of the navigation system X, Y, Z at the moment k, and g is the gravity acceleration;
the position is predicted using the following formula:
Figure BDA0002066475480000056
Figure BDA0002066475480000057
Figure BDA0002066475480000058
in the above formula, λ (k), L (k), h (k) are longitude, latitude and altitude at time k, and λ (k-1), L (k-1) and h (k-1) are longitude, latitude and altitude at time k-1.
Further, in step (4), the method for determining whether the foot is in the zero velocity state is as follows:
and (3) judging the zero-speed moment by using a three-condition judgment method:
Figure BDA0002066475480000059
Figure BDA00020664754800000510
Figure BDA0002066475480000061
wherein the content of the first and second substances,
Figure BDA0002066475480000062
the component of the acceleration of the frame relative to the inertial frame at time k on the axis of the frame X, Y, Z,
Figure BDA0002066475480000063
is the component of the acceleration of the gantry relative to the inertial frame at time k on the axis of the gantry X, Y, Z;
Figure BDA0002066475480000064
in order to judge the threshold value, w is a sliding window, and the upper horizontal line represents the averaging operation;
the final zero-speed detection result is ZUPT (k) ═ C1&C2&C3,&Representation and operation, ZUPT (k)) 1 indicates a zero speed state, and zupt (k) 0 indicates a non-zero speed state.
Further, in step (4), the process of correcting the predicted speed and position information by the kalman filter is as follows:
(A) calculating a one-step predicted mean square error:
P(k|k-1)=A(k,k-1)P(k-1|k-1)A(k,k-1)T+G(k-1)W(k-1)G(k-1)Tin the above formula, A (k, k-1) is a filter one-step transfer matrix from the moment k-1 to the moment k of the filter;
Figure BDA0002066475480000065
Figure BDA0002066475480000066
I3×3is a 3 × 3 identity matrix, 03×3A zero matrix of 3 x 3, Δ T is the sampling period,
Figure BDA0002066475480000067
Figure BDA0002066475480000068
is the component of the acceleration of the gantry relative to the navigation system at time k on the axis of the gantry X, Y, Z;
p (k-1| k-1) is the state estimation mean square error at the moment k-1, and P (k | k-1) is the one-step prediction mean square error from the moment k-1 to the moment k;
g (k-1) is the filter noise coefficient matrix at the instant of filter k-1,
Figure BDA0002066475480000069
Figure BDA00020664754800000610
is an attitude transition matrix; w (k-1) [. epsilon ]rx εry εrz εax εay εaz]TIs the state noise at time k-1, epsilonrx、εryAnd εrzAre respectively as
Figure BDA0002066475480000071
And
Figure BDA0002066475480000072
model noise of (e ∈)ax、εayAnd εazAre respectively as
Figure BDA0002066475480000073
And
Figure BDA0002066475480000074
the noise of the model (2) is,
Figure BDA0002066475480000075
and
Figure BDA0002066475480000076
the amount of angular velocity of the frame relative to the inertial frame at time k on the axis of the frame X, Y, Z,
Figure BDA0002066475480000077
and
Figure BDA0002066475480000078
is the amount of acceleration of the frame relative to the inertial frame at time k on the axis of the frame X, Y, Z; superscript T represents matrix transposition;
(B) calculating the filtering gain of a Kalman filter at the k moment:
K(k)=P(k|k-1)H(k)T[H(k)P(k|k-1)H(k)T+R(k)]-1
in the above formula, k (k) is the filter gain at time k;
Figure BDA0002066475480000079
for the measurement noise at time k, diag represents the matrix diagonalization,
Figure BDA00020664754800000710
are respectively as
Figure BDA00020664754800000711
The noise of (2) is detected,
Figure BDA00020664754800000712
Figure BDA00020664754800000713
is the component of the carrier velocity on the axis of the navigation system X, Y, Z; h (k) is a k time measurement matrix, H (k) is [03×3 I3×3 03×3];
(C) Calculating a k-time Kalman filter state estimation value:
Figure BDA00020664754800000714
in the above formula, the first and second carbon atoms are,
Figure BDA00020664754800000715
is an estimate of the state quantity at time k,
Figure BDA00020664754800000716
the state variables from k-1 to k are predicted values in one step, L, lambda and h are longitude, latitude and height, and theta, gamma and psi are pitch angle, roll angle and yaw angle; y (k) ═ 000]TIs the measured value at the k moment;
(D) calculating an estimated mean square error of a Kalman filter at the k moment:
P(k|k)=[I-K(k)H(k)]P(k|k-1)
in the above equation, P (k | k) is the estimated mean square error at time k, and I is the identity matrix.
Further, the specific process of step (5) is as follows:
(501) the scene association is performed using the following equation:
Figure BDA00020664754800000717
in the above formula, sr (k) is a scene recognition result of the kth gait, 0 indicates that the gait of the pedestrian is walking on the flat ground, 1 indicates that the gait of the pedestrian is ascending stairs, 2 indicates that the gait is descending stairs, and "|" indicates "or";
acquiring a position coordinate closest to the current position of the pedestrian in a database, and selecting the point as a related stair position point;
(502) and (3) performing position correction by adopting Kalman filtering:
(502a) calculating a one-step predicted mean square error:
P(k|k-1)=A(k,k-1)P(k-1|k-1)A(k,k-1)T+G(k-1)W(k-1)G(k-1)Tin the above formula, A (k, k-1) is a filter one-step transfer matrix from the moment k-1 to the moment k of the filter;
Figure BDA0002066475480000081
Figure BDA0002066475480000082
I3×3is a 3 × 3 identity matrix, 03×3A zero matrix of 3 x 3, Δ T is the sampling period,
Figure BDA0002066475480000083
Figure BDA0002066475480000084
is the component of the acceleration of the gantry relative to the navigation system at time k on the axis of the gantry X, Y, Z;
p (k-1| k-1) is the state estimation mean square error at the moment k-1, and P (k | k-1) is the one-step prediction mean square error from the moment k-1 to the moment k;
g (k-1) is the filter noise coefficient matrix at the moment of filter k-1.
Figure BDA0002066475480000085
Figure BDA0002066475480000086
Is an attitude transition matrix; w (k-1) [. epsilon ]rx εry εrz εax εay εaz]TIs the state noise at time k-1, epsilonrx、εryAnd εrzAre respectively as
Figure BDA0002066475480000087
And
Figure BDA0002066475480000088
model noise of (e ∈)ax、εayAnd εazAre respectively as
Figure BDA0002066475480000089
And
Figure BDA00020664754800000810
the noise of the model (2) is,
Figure BDA00020664754800000811
and
Figure BDA00020664754800000812
the amount of angular velocity of the frame relative to the inertial frame at time k on the axis of the frame X, Y, Z,
Figure BDA00020664754800000813
and
Figure BDA00020664754800000814
is the amount of acceleration of the frame relative to the inertial frame at time k on the axis of the frame X, Y, Z;
(502b) calculating the filtering gain of a Kalman filter at the k moment:
K(k)=P(k|k-1)H(k)T[H(k)P(k|k-1)H(k)T+R(k)]-1
in the above formula, k (k) is the filter gain at time k; r (k) ═ diag ([ epsilon ]Lελ]2) For the measurement noise at time k, diag denotes the matrix diagonalization, εL、ελNoise in longitude and latitude, respectively; h (k) is a k time measurement matrix, H (k) is [ I ]2×2 02×302×2 02×2],I2×2Is a 2 × 2 identity matrix, 02×3Is a 2 × 3 zero matrix, 02×2Is a zero matrix of 2 x 2;
(502c) calculating an estimated mean square error of the k-time extended Kalman filter:
P(k|k)=[I-K(k)H(k)]P(k|k-1)
in the above equation, P (k | k) is the estimated mean square error at time k, and I is the identity matrix.
Adopt the beneficial effect that above-mentioned technical scheme brought:
the pedestrian walking scene identification method based on the established scene identification model identifies the pedestrian walking scene in real time, and if the pedestrian walking scene is in the stage of going up and down stairs, the position information of the pedestrian is corrected by using the known stair position. Compared with the traditional pedestrian navigation algorithm, the method can effectively improve the positioning precision of the pedestrian when the pedestrian walks in the building with stairs.
Drawings
FIG. 1 is a flow chart of a method of the present invention;
FIG. 2 is a diagram of a solution result obtained by a conventional method, including two sub-diagrams (a) and (b), which are a two-dimensional diagram and a three-dimensional diagram, respectively;
FIG. 3 is a diagram of the result of the solution obtained by the method of the present invention, which includes two sub-diagrams (a) and (b), respectively a two-dimensional diagram and a three-dimensional diagram.
Detailed Description
The technical scheme of the invention is explained in detail in the following with the accompanying drawings.
The invention designs a pedestrian navigation method based on intelligent identification of a building stair scene, which comprises the following steps as shown in figure 1.
Step 1: arranging an inertial sensor on the foot of a pedestrian, enabling the pedestrian to walk for multiple times in a scene in a building, establishing a gait feature library comprising scenes of walking on the flat ground, going upstairs and going downstairs, and establishing a scene recognition model based on gait features;
step 2: when a pedestrian walks in a building, acquiring inertial sensing data comprising acceleration information and gyroscope information;
and step 3: predicting the walking posture, speed and position of the pedestrian based on an inertial navigation algorithm;
and 4, step 4: judging whether the foot is in a zero-speed state or not based on the inertial sensing data, and if so, correcting the predicted speed and position information through a Kalman filter; otherwise, directly entering the step 5;
and 5: judging the walking scene of the pedestrian in the building based on the inertial sensing data and the scene recognition model established in the step 1, and correcting the position of the pedestrian through Kalman filtering based on the stair position in the building map if the pedestrian is in the state of going up and down the stairs; otherwise, jumping to step 2.
In this embodiment, step 1 is implemented by the following preferred scheme:
the specific process of establishing the gait feature-based scene recognition model is as follows:
101. data segmentation: recording the j gait of the pedestrian as TjUsing the gait characteristics of the pedestrian to put the single gait TjDivided into active data segments Tj1And inactive data segment Tj2Taking Tj1As the characteristic value range of the gait;
102. feature extraction: note the book
Figure BDA0002066475480000101
Acquiring inertia sample data for jth gait, wherein N is the number of samples, and acquiring a characteristic vector T ═ c (acc)xmin,accysk,acczmax,accmax,gyroxmean,gyroyvar,gyrozmax,gyrozmean,gyrozsk,gyrozvar,hdiff,Vmax) In the above formula:
Figure BDA0002066475480000102
Figure BDA0002066475480000111
Figure BDA0002066475480000112
Figure BDA0002066475480000113
Figure BDA0002066475480000114
wherein the content of the first and second substances,
Figure BDA0002066475480000115
is the component of the acceleration of the body system relative to the inertial system on the axis of the body system X, Y, Z at the nth time in the jth gait;
Figure BDA0002066475480000116
the amount of the acceleration of the body system relative to the inertia system on the axis of the body system X, Y, Z at the nth moment in the jth gait; vn(jn) is the speed of the navigation system at the nth time in the jth gait; n is 1,2, …, N; h (j) is the initial height of the jth gait in the navigation system; e represents a mathematical expectation;
(103) training a scene recognition model by adopting a random forest algorithm, and the specific process is as follows:
(a) generating a training set for each decision tree by using bootstrap sampling:
the training set D is (X, Y), X is the acquired gait feature, Y is the label corresponding to each feature, and training samples (D) having the same size as D are randomly extracted from the data set D in a set-back manner (D, Y)1,D2,...,Dn) Using each training sample DiTraining and constructing a decision tree;
(b) constructing a decision tree:
for training sample DiThe number L is randomly extracted from the total attribute S without being replaced (usually
Figure BDA0002066475480000117
) Part of the attributes of (2), the attribute set A forming the decision treeiUsing the decision tree training model to train a decision tree; when node splitting is carried out, the splitting rule adopted is the principle of minimum Gini value, and the calculation formula is as follows;
Figure BDA0002066475480000118
in the above formula, plIs the probability that the sample point belongs to class i;
(c) generating a random forest:
and (4) marking the walking scene of the flat ground as 0, marking the walking scene of the flat ground as 1, marking the walking scene of the flat ground as 2, repeating the step (b), and training to obtain a plurality of decision trees to obtain a scene recognition model.
In this embodiment, step 3 is implemented by the following preferred scheme:
attitude was predicted using the following equation:
Figure BDA0002066475480000121
Figure BDA0002066475480000122
Figure BDA0002066475480000123
Figure BDA0002066475480000124
in the above formula:
Figure BDA0002066475480000125
Figure BDA0002066475480000126
Figure BDA0002066475480000127
wherein q is0(k)、q1(k)、q2(k)、q3(k) Is the attitude quaternion at time k, q0(k-1)、q1(k-1)、q2(k-1)、q3(k-1) is the attitude quaternion at the time of k-1,
Figure BDA0002066475480000128
the angular velocity of the machine relative to the inertial system at time k,
Figure BDA0002066475480000129
for the angular velocity of the machine system relative to the navigation system at time k
Figure BDA00020664754800001210
The component on the axis of the chassis X, Y, Z, at is the sample period,
Figure BDA00020664754800001211
is the attitude transition matrix at time k-1,
Figure BDA00020664754800001212
the component of the angular velocity of the navigation system with respect to the inertial system at time k-1 on the axis X, Y, Z of the navigation system,
Figure BDA00020664754800001213
is the component of the carrier velocity on the axis of the navigation system X, Y, Z at the moment k-1, L (k-1) and h (k-1) are the latitude and height of the carrier at the moment k-1, RM、RNRadius of meridian and unit of the earth, omegaieThe rotational angular velocity of the earth;
the velocity is predicted using the following equation:
Figure BDA0002066475480000131
Figure BDA0002066475480000132
Figure BDA0002066475480000133
in the above formula, the first and second carbon atoms are,
Figure BDA0002066475480000134
respectively estimates of the components of the acceleration of the body frame relative to the inertial frame on the axis of the body frame X, Y, Z,
Figure BDA0002066475480000135
the component of the carrier velocity on the axis of the navigation system X, Y, Z at the moment k, and g is the gravity acceleration;
the position is predicted using the following formula:
Figure BDA0002066475480000136
Figure BDA0002066475480000137
Figure BDA0002066475480000138
in the above formula, λ (k), L (k), h (k) are longitude, latitude and altitude at time k, and λ (k-1), L (k-1) and h (k-1) are longitude, latitude and altitude at time k-1.
In this embodiment, step 4 is implemented by using the following preferred scheme:
and (3) judging the zero-speed moment by using a three-condition judgment method:
Figure BDA0002066475480000139
Figure BDA00020664754800001310
Figure BDA0002066475480000141
wherein the content of the first and second substances,
Figure BDA0002066475480000142
the component of the acceleration of the frame relative to the inertial frame at time k on the axis of the frame X, Y, Z,
Figure BDA0002066475480000143
is the component of the acceleration of the gantry relative to the inertial frame at time k on the axis of the gantry X, Y, Z;
Figure BDA0002066475480000144
in order to judge the threshold value, w is a sliding window, and the upper horizontal line represents the averaging operation;
the final zero-speed detection result is ZUPT (k) ═ C1&C2&C3,&And operation is shown, where zupt (k) 1 shows the zero speed state, and zupt (k) 0 shows the non-zero speed state.
The process of correcting the predicted speed and position information by the kalman filter is as follows:
(A) calculating a one-step predicted mean square error:
P(k|k-1)=A(k,k-1)P(k-1|k-1)A(k,k-1)T+G(k-1)W(k-1)G(k-1)Tin the above formula, A (k, k-1) is a filter one-step transfer matrix from the moment k-1 to the moment k of the filter;
Figure BDA0002066475480000145
Figure BDA0002066475480000146
I3×3is a 3 × 3 identity matrix, 03×3A zero matrix of 3 x 3, Δ T is the sampling period,
Figure BDA0002066475480000147
Figure BDA0002066475480000148
is the component of the acceleration of the gantry relative to the navigation system at time k on the axis of the gantry X, Y, Z;
p (k-1| k-1) is the state estimation mean square error at the moment k-1, and P (k | k-1) is the one-step prediction mean square error from the moment k-1 to the moment k;
g (k-1) is the filter noise coefficient matrix at the instant of filter k-1,
Figure BDA0002066475480000149
Figure BDA00020664754800001410
is an attitude transition matrix; w (k-1) [. epsilon ]rx εry εrz εax εay εaz]TIs the state noise at time k-1, epsilonrx、εryAnd εrzAre respectively as
Figure BDA00020664754800001411
And
Figure BDA00020664754800001412
model noise of (e ∈)ax、εayAnd εazAre respectively as
Figure BDA00020664754800001413
And
Figure BDA00020664754800001414
the noise of the model (2) is,
Figure BDA0002066475480000151
and
Figure BDA0002066475480000152
the amount of angular velocity of the frame relative to the inertial frame at time k on the axis of the frame X, Y, Z,
Figure BDA0002066475480000153
and
Figure BDA0002066475480000154
is the amount of acceleration of the frame relative to the inertial frame at time k on the axis of the frame X, Y, Z; superscript T represents matrix transposition;
(B) calculating the filtering gain of a Kalman filter at the k moment:
K(k)=P(k|k-1)H(k)T[H(k)P(k|k-1)H(k)T+R(k)]-1
in the above formula, k (k) is the filter gain at time k;
Figure BDA0002066475480000155
for the measurement noise at time k, diag represents the matrix diagonalization,
Figure BDA0002066475480000156
are respectively as
Figure BDA0002066475480000157
The noise of (2) is detected,
Figure BDA0002066475480000158
Figure BDA0002066475480000159
is the component of the carrier velocity on the axis of the navigation system X, Y, Z; h (k) is a k time measurement matrix, H (k) is [03×3 I3×3 03×3];
(C) Calculating a k-time Kalman filter state estimation value:
Figure BDA00020664754800001510
in the above formula, the first and second carbon atoms are,
Figure BDA00020664754800001511
is an estimate of the state quantity at time k,
Figure BDA00020664754800001512
the state variable from k-1 to k is predicted in one step, L, lambda and h are longitude, latitude and height, thetaGamma and psi are pitch angle, roll angle and yaw angle; y (k) ═ 000]TIs the measured value at the k moment;
(D) calculating an estimated mean square error of a Kalman filter at the k moment:
P(k|k)=[I-K(k)H(k)]P(k|k-1)
in the above equation, P (k | k) is the estimated mean square error at time k, and I is the identity matrix.
In this embodiment, step 5 is implemented by the following preferred scheme:
501. the scene association is performed using the following equation:
Figure BDA00020664754800001513
in the above formula, sr (k) is a scene recognition result of the kth gait, 0 indicates that the gait of the pedestrian is walking on the flat ground, 1 indicates that the gait of the pedestrian is ascending stairs, 2 indicates that the gait is descending stairs, and "|" indicates "or";
acquiring a position coordinate closest to the current position of the pedestrian in a database, and selecting the point as a related stair position point;
502. and (3) performing position correction by adopting Kalman filtering:
(502a) calculating a one-step predicted mean square error:
P(k|k-1)=A(k,k-1)P(k-1|k-1)A(k,k-1)T+G(k-1)W(k-1)G(k-1)Tin the above formula, A (k, k-1) is a filter one-step transfer matrix from the moment k-1 to the moment k of the filter;
Figure BDA0002066475480000161
Figure BDA0002066475480000162
I3×3is a 3 × 3 identity matrix, 03×3A zero matrix of 3 x 3, Δ T is the sampling period,
Figure BDA0002066475480000163
Figure BDA0002066475480000164
is the component of the acceleration of the gantry relative to the navigation system at time k on the axis of the gantry X, Y, Z;
p (k-1| k-1) is the state estimation mean square error at the moment k-1, and P (k | k-1) is the one-step prediction mean square error from the moment k-1 to the moment k;
g (k-1) is the filter noise coefficient matrix at the moment of filter k-1.
Figure BDA0002066475480000165
Figure BDA0002066475480000166
Is an attitude transition matrix; w (k-1) [. epsilon ]rx εry εrz εax εay εaz]TIs the state noise at time k-1, epsilonrx、εryAnd εrzAre respectively as
Figure BDA0002066475480000167
And
Figure BDA0002066475480000168
model noise of (e ∈)ax、εayAnd εazAre respectively as
Figure BDA0002066475480000169
And
Figure BDA00020664754800001610
the noise of the model (2) is,
Figure BDA00020664754800001611
and
Figure BDA00020664754800001612
the amount of angular velocity of the frame relative to the inertial frame at time k on the axis of the frame X, Y, Z,
Figure BDA00020664754800001613
and
Figure BDA00020664754800001614
is the amount of acceleration of the frame relative to the inertial frame at time k on the axis of the frame X, Y, Z;
(502b) calculating the filtering gain of a Kalman filter at the k moment:
K(k)=P(k|k-1)H(k)T[H(k)P(k|k-1)H(k)T+R(k)]-1
in the above formula, k (k) is the filter gain at time k; r (k) ═ diag ([ epsilon ]Lελ]2) For the measurement noise at time k, diag denotes the matrix diagonalization, εL、ελNoise in longitude and latitude, respectively; h (k) is a k time measurement matrix, H (k) is [ I ] 202×302×2 02×2],I2×2Is a 2 × 2 identity matrix, 02×3Is a 2 × 3 zero matrix, 02×2Is a zero matrix of 2 x 2;
(502c) calculating an estimated mean square error of the k-time extended Kalman filter:
P(k|k)=[I-K(k)H(k)]P(k|k-1)
in the above equation, P (k | k) is the estimated mean square error at time k, and I is the identity matrix.
In practical experiments, the model of the adopted inertia device is MTW awind a. Walk around the corridor in the experimental field, and walk along the flat ground, go up stairs and go down stairs. The total travel mileage is 1100 m. Under the same environment, the traditional navigation method and the navigation method are respectively adopted for resolving, the obtained results are respectively shown in fig. 2 and fig. 3, and the experimental precision is shown in table 1.
TABLE 1
Results of the experiment Conventional methods The method of the invention
Maximum position error 12m 1.9m
Course drift 13°
Positioning accuracy 1.1% 0.2%
The embodiments are only for illustrating the technical idea of the present invention, and the technical idea of the present invention is not limited thereto, and any modifications made on the basis of the technical scheme according to the technical idea of the present invention fall within the scope of the present invention.

Claims (6)

1. A pedestrian navigation method based on intelligent identification of a building stair scene is characterized by comprising the following steps:
(1) arranging an inertial sensor on the foot of a pedestrian, enabling the pedestrian to walk for multiple times in a scene in a building, wherein the scene comprises a flat ground walking scene, a stair climbing scene and a stair descending scene, and establishing a scene recognition model based on gait characteristics;
the specific process of establishing the gait feature-based scene recognition model is as follows:
(101) recording the j gait of the pedestrian as TjUsing the gait characteristics of the pedestrian to make the individual gait TjDivided into active data segments Tj1And inactive data segment Tj2Taking Tj1As the characteristic value range of the gait;
(102) note the book
Figure FDA0003005689670000011
Inertia for jth gait acquisitionSample data, where N is the number of samples, and then collecting a feature vector T ═ acxmin,accysk,acczmax,accmax,gyroxmean,gyroyvar,gyrozmax,gyrozmean,gyrozsk,gyrozvar,hdiff,Vmax) In the above formula:
Figure FDA0003005689670000012
Figure FDA0003005689670000013
Figure FDA0003005689670000014
Figure FDA0003005689670000015
Figure FDA0003005689670000016
hdiff=h(j+1)-h(j),
Figure FDA0003005689670000017
wherein the content of the first and second substances,
Figure FDA0003005689670000018
is the component of the acceleration of the body system relative to the inertial system on the axis of the body system X, Y, Z at the nth time in the jth gait;
Figure FDA0003005689670000019
the angular velocity of the body system relative to the inertia system at the nth time in the jth gait is on the X, Y, Z axisThe amount of (c); vn(jn) is the speed of the navigation system at the nth time in the jth gait; n is 1,2, …, N; h (j) is the initial height of the jth gait in the navigation system; e represents a mathematical expectation;
(103) training a scene recognition model by adopting a random forest algorithm;
(2) when a pedestrian walks in a building, acquiring inertial sensing data comprising acceleration information and gyroscope information;
(3) predicting the walking posture, speed and position of the pedestrian based on an inertial navigation algorithm;
(4) judging whether the foot is in a zero-speed state or not based on the inertial sensing data, and if so, correcting the predicted speed and position information through a Kalman filter; otherwise, directly entering the step (5);
(5) judging the walking scene of the pedestrian in the building based on the inertial sensing data and the scene recognition model established in the step (1), and correcting the position of the pedestrian through Kalman filtering based on the stair position in the building map if the pedestrian is in the state of going up and down the stairs; otherwise, jumping to the step (2).
2. The pedestrian navigation method based on intelligent identification of the building stair scene as claimed in claim 1, wherein the specific process of the step (103) is as follows:
(a) generating a training set for each decision tree by using bootstrap sampling:
memory training set D ═ X1,Y1),X1For the gait feature to be captured, Y1For each feature corresponding label, a training sample of the same size as D is randomly selected from the data set D (D)1,D2,...,Dn) Using each training sample DiTraining and constructing a decision tree;
(b) constructing a decision tree:
for training sample DiExtracting L partial attributes from the total attribute S randomly and unreplaceably to form attribute set A of the decision treeiUsing the decision tree training model to train a decision tree;
(c) generating a random forest:
and (4) marking the walking scene of the flat ground as 0, marking the walking scene of the flat ground as 1, marking the walking scene of the flat ground as 2, repeating the step (b), and training to obtain a plurality of decision trees to obtain a scene recognition model.
3. The pedestrian navigation method based on intelligent identification of the building stair scene in claim 1, wherein in the step (3), the posture is predicted by adopting the following formula:
Figure FDA0003005689670000031
Figure FDA0003005689670000032
Figure FDA0003005689670000033
Figure FDA0003005689670000034
in the above formula:
Figure FDA0003005689670000035
Figure FDA0003005689670000036
Figure FDA0003005689670000037
wherein q is0(k)、q1(k)、q2(k)、q3(k) Is the attitude quaternion at time k, q0(k-1)、q1(k-1)、q2(k-1)、q3(k-1) is the attitude quaternion at the time of k-1,
Figure FDA0003005689670000038
the angular velocity of the machine relative to the inertial system at time k,
Figure FDA0003005689670000039
for the angular velocity of the machine system relative to the navigation system at time k
Figure FDA00030056896700000310
The component on the axis of the chassis X, Y, Z, at is the sample period,
Figure FDA00030056896700000311
is the attitude transition matrix at time k-1,
Figure FDA00030056896700000312
the component of the angular velocity of the navigation system with respect to the inertial system at time k-1 on the axis X, Y, Z of the navigation system,
Figure FDA00030056896700000313
is the component of the carrier velocity on the axis of the navigation system X, Y, Z at the moment k-1, L (k-1) and h (k-1) are the latitude and height of the carrier at the moment k-1, RM、RNRadius of meridian and unit of the earth, omegaieThe rotational angular velocity of the earth;
the velocity is predicted using the following equation:
Figure FDA0003005689670000041
Figure FDA0003005689670000042
Figure FDA0003005689670000043
in the above formula, the first and second carbon atoms are,
Figure FDA0003005689670000044
respectively estimates of the components of the acceleration of the body frame relative to the inertial frame on the axis of the body frame X, Y, Z,
Figure FDA0003005689670000045
the component of the carrier velocity on the axis of the navigation system X, Y, Z at the moment k, and g is the gravity acceleration;
the position is predicted using the following formula:
Figure FDA0003005689670000046
Figure FDA0003005689670000047
Figure FDA0003005689670000048
in the above formula, λ (k), L (k), h (k) are longitude, latitude and altitude at time k, and λ (k-1), L (k-1) and h (k-1) are longitude, latitude and altitude at time k-1.
4. The pedestrian navigation method based on intelligent identification of the building stair scene in claim 1, wherein in the step (4), the method for judging whether the feet are in the zero-speed state is as follows:
and (3) judging the zero-speed moment by using a three-condition judgment method:
Figure FDA0003005689670000049
Figure FDA00030056896700000410
Figure FDA0003005689670000051
wherein the content of the first and second substances,
Figure FDA0003005689670000052
the component of the acceleration of the frame relative to the inertial frame at time k on the axis of the frame X, Y, Z,
Figure FDA0003005689670000053
is the component of the angular velocity of the frame relative to the inertial frame at time k on the axis of the frame X, Y, Z;
Figure FDA0003005689670000054
in order to judge the threshold value, w is a sliding window, and the upper horizontal line represents the averaging operation;
the final zero-speed detection result is ZUPT (k) ═ C1&C2&C3,&And operation is shown, where zupt (k) 1 shows the zero speed state, and zupt (k) 0 shows the non-zero speed state.
5. The pedestrian navigation method based on intelligent identification of the building stair scene in the claim 1, wherein in the step (4), the process of correcting the predicted speed and position information through the Kalman filter is as follows:
(A) calculating a one-step predicted mean square error:
P(k|k-1)=A(k,k-1)P(k-1|k-1)A(k,k-1)T+G(k-1)W(k-1)G(k-1)T
in the above formula, A (k, k-1) is a filter one-step transfer matrix from the moment k-1 to the moment k of the filter;
Figure FDA0003005689670000055
Figure FDA0003005689670000056
I3×3is a 3 × 3 identity matrix, 03×3A zero matrix of 3 x 3, Δ T is the sampling period,
Figure FDA0003005689670000057
Figure FDA0003005689670000058
is the component of the acceleration of the gantry relative to the navigation system at time k on the axis of the gantry X, Y, Z;
p (k-1| k-1) is the state estimation mean square error at the moment k-1, and P (k | k-1) is the one-step prediction mean square error from the moment k-1 to the moment k;
g (k-1) is the filter noise coefficient matrix at the instant of filter k-1,
Figure FDA0003005689670000059
Figure FDA00030056896700000510
is an attitude transition matrix; w (k-1) [. epsilon ]rx εry εrz εax εay εaz]TIs the state noise at time k-1, epsilonrx、εryAnd εrzAre respectively as
Figure FDA0003005689670000061
And
Figure FDA0003005689670000062
model noise of (e ∈)ax、εayAnd εazAre respectively as
Figure FDA0003005689670000063
And
Figure FDA0003005689670000064
the noise of the model (2) is,
Figure FDA0003005689670000065
and
Figure FDA0003005689670000066
the amount of angular velocity of the frame relative to the inertial frame at time k on the axis of the frame X, Y, Z,
Figure FDA0003005689670000067
and
Figure FDA0003005689670000068
is the amount of acceleration of the frame relative to the inertial frame at time k on the axis of the frame X, Y, Z; superscript T represents matrix transposition;
(B) calculating the filtering gain of a Kalman filter at the k moment:
K(k)=P(k|k-1)H(k)T[H(k)P(k|k-1)H(k)T+R(k)]-1
in the above formula, k (k) is the filter gain at time k;
Figure FDA0003005689670000069
for the measurement noise at time k, diag represents the matrix diagonalization,
Figure FDA00030056896700000610
are respectively as
Figure FDA00030056896700000611
The noise of (2) is detected,
Figure FDA00030056896700000612
Figure FDA00030056896700000613
for carrier speed on axis of navigation system X, Y, ZA component of (a); h (k) is a k time measurement matrix, H (k) is [03×3 I3×3 03×3];
(C) Calculating a k-time Kalman filter state estimation value:
Figure FDA00030056896700000614
in the above formula, the first and second carbon atoms are,
Figure FDA00030056896700000615
is an estimate of the state quantity at time k,
Figure FDA00030056896700000616
the state variables from k-1 to k are predicted values in one step, L, lambda and h are longitude, latitude and height, and theta, gamma and psi are pitch angle, roll angle and yaw angle; y (k) ═ 000]TIs the measured value at the k moment;
(D) calculating an estimated mean square error of a Kalman filter at the k moment:
P(k|k)=[I-K(k)H(k)]P(k|k-1)
in the above equation, P (k | k) is the estimated mean square error at time k, and I is the identity matrix.
6. The pedestrian navigation method based on intelligent identification of the building stair scene according to claim 1, wherein the specific process of the step (5) is as follows:
(501) the scene association is performed using the following equation:
Figure FDA0003005689670000071
in the above formula, sr (k) is a scene recognition result of the kth gait, 0 indicates that the gait of the pedestrian is walking on the flat ground, 1 indicates that the gait of the pedestrian is ascending stairs, 2 indicates that the gait is descending stairs, and "|" indicates "or";
acquiring a position coordinate closest to the current position of the pedestrian in a database, and selecting the position coordinate as a related stair position point;
(502) and (3) performing position correction by adopting Kalman filtering:
(502a) calculating a one-step predicted mean square error:
P(k|k-1)=A(k,k-1)P(k-1|k-1)A(k,k-1)T+G(k-1)W(k-1)G(k-1)T
in the above formula, A (k, k-1) is a filter one-step transfer matrix from the moment k-1 to the moment k of the filter;
Figure FDA0003005689670000072
Figure FDA0003005689670000073
I3×3is a 3 × 3 identity matrix, 03×3A zero matrix of 3 x 3, Δ T is the sampling period,
Figure FDA0003005689670000074
Figure FDA0003005689670000075
is the component of the acceleration of the gantry relative to the navigation system at time k on the axis of the gantry X, Y, Z;
p (k-1| k-1) is the state estimation mean square error at the moment k-1, and P (k | k-1) is the one-step prediction mean square error from the moment k-1 to the moment k;
g (k-1) is the filter noise coefficient matrix at the instant of filter k-1,
Figure FDA0003005689670000076
Figure FDA0003005689670000077
is an attitude transition matrix; w (k-1) [. epsilon ]rx εry εrz εax εay εaz]TIs the state noise at time k-1, epsilonrx、εryAnd εrzAre respectively as
Figure FDA0003005689670000078
And
Figure FDA0003005689670000079
model noise of (e ∈)ax、εayAnd εazAre respectively as
Figure FDA00030056896700000710
And
Figure FDA00030056896700000711
the noise of the model (2) is,
Figure FDA00030056896700000712
and
Figure FDA00030056896700000713
the amount of angular velocity of the frame relative to the inertial frame at time k on the axis of the frame X, Y, Z,
Figure FDA00030056896700000714
and
Figure FDA00030056896700000715
is the amount of acceleration of the frame relative to the inertial frame at time k on the axis of the frame X, Y, Z;
(502b) calculating the filtering gain of a Kalman filter at the k moment:
K(k)=P(k|k-1)H(k)T[H(k)P(k|k-1)H(k)T+R(k)]-1
in the above formula, k (k) is the filter gain at time k; r (k) ═ diag ([ epsilon ]L ελ]2) For the measurement noise at time k, diag denotes the matrix diagonalization, εL、ελNoise in longitude and latitude, respectively; h (k) is a k time measurement matrix, H (k) is [ I ]2×2 02×302×2 02×2],I2×2Is a 2 × 2 identity matrix, 02×3Is a 2 × 3 zero matrix, 02×2Is a zero matrix of 2 x 2;
(502c) calculating an estimated mean square error of the k-time extended Kalman filter:
P(k|k)=[I-K(k)H(k)]P(k|k-1)
in the above equation, P (k | k) is the estimated mean square error at time k, and I is the identity matrix.
CN201910422538.4A 2019-05-21 2019-05-21 Pedestrian navigation method based on intelligent identification of building stair scene Active CN110207704B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910422538.4A CN110207704B (en) 2019-05-21 2019-05-21 Pedestrian navigation method based on intelligent identification of building stair scene

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910422538.4A CN110207704B (en) 2019-05-21 2019-05-21 Pedestrian navigation method based on intelligent identification of building stair scene

Publications (2)

Publication Number Publication Date
CN110207704A CN110207704A (en) 2019-09-06
CN110207704B true CN110207704B (en) 2021-07-13

Family

ID=67787920

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910422538.4A Active CN110207704B (en) 2019-05-21 2019-05-21 Pedestrian navigation method based on intelligent identification of building stair scene

Country Status (1)

Country Link
CN (1) CN110207704B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111076718B (en) * 2019-12-18 2021-01-15 中铁电气化局集团有限公司 Autonomous navigation positioning method for subway train
CN111368749B (en) * 2020-03-06 2023-06-13 创新奇智(广州)科技有限公司 Automatic identification method and system for stair area
CN111649742B (en) * 2020-05-08 2022-02-08 北京航空航天大学 Elevation estimation method based on ANFIS assistance
CN111596325A (en) * 2020-05-09 2020-08-28 北京东方海龙消防科技有限公司 Positioning search and rescue system
CN112418649B (en) * 2020-11-19 2022-11-11 东南大学 Building stair pedestrian flow estimation system based on multi-dimensional MEMS inertial sensor
CN113295158B (en) * 2021-05-14 2024-05-14 江苏大学 Indoor positioning method integrating inertial data, map information and pedestrian motion state
CN113720332B (en) * 2021-06-30 2022-06-07 北京航空航天大学 Floor autonomous identification method based on floor height model

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105628027A (en) * 2016-02-19 2016-06-01 中国矿业大学 Indoor environment precise real-time positioning method based on MEMS inertial device
CN106017461A (en) * 2016-05-19 2016-10-12 北京理工大学 Pedestrian navigation system three-dimensional spatial positioning method based on human/environment constraints
CN106482733A (en) * 2016-09-23 2017-03-08 南昌大学 Zero velocity update method based on plantar pressure detection in pedestrian navigation
CN106908060A (en) * 2017-02-15 2017-06-30 东南大学 A kind of high accuracy indoor orientation method based on MEMS inertial sensor
CN108759814A (en) * 2018-04-13 2018-11-06 南京航空航天大学 A kind of quadrotor roll axis angular rate and pitch axis Attitude rate estimator method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102060723B1 (en) * 2013-01-18 2019-12-31 삼성전자주식회사 Apparatus for recognizing feature of user, method for establishing basis matrix based on orthogonal non-negative matrix factorization and method for establishing basis matrix based on orthogonal semi-supervised non-negative matrix factorization
CN109298389B (en) * 2018-08-29 2022-09-23 东南大学 Indoor pedestrian combination pose estimation method based on multi-particle swarm optimization

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105628027A (en) * 2016-02-19 2016-06-01 中国矿业大学 Indoor environment precise real-time positioning method based on MEMS inertial device
CN106017461A (en) * 2016-05-19 2016-10-12 北京理工大学 Pedestrian navigation system three-dimensional spatial positioning method based on human/environment constraints
CN106482733A (en) * 2016-09-23 2017-03-08 南昌大学 Zero velocity update method based on plantar pressure detection in pedestrian navigation
CN106908060A (en) * 2017-02-15 2017-06-30 东南大学 A kind of high accuracy indoor orientation method based on MEMS inertial sensor
CN108759814A (en) * 2018-04-13 2018-11-06 南京航空航天大学 A kind of quadrotor roll axis angular rate and pitch axis Attitude rate estimator method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
李辰祥.基于MEMS行人惯性导航的零速度修正技术研究 李辰祥 I136-605.《中国优秀硕士学位论文全文数据库 信息科技辑》.2014,第I136-605页. *
行人自主导航算法研究;左德胜;《中国优秀硕士学位论文全文数据库 信息科技辑》;20190115;第I136-1955页 *

Also Published As

Publication number Publication date
CN110207704A (en) 2019-09-06

Similar Documents

Publication Publication Date Title
CN110207704B (en) Pedestrian navigation method based on intelligent identification of building stair scene
CN110118549B (en) Multi-source information fusion positioning method and device
US10267646B2 (en) Method and system for varying step length estimation using nonlinear system identification
CN110553643B (en) Pedestrian self-adaptive zero-speed updating point selection method based on neural network
CN108225304A (en) Based on method for rapidly positioning and system in Multiple Source Sensor room
CN104061934A (en) Pedestrian indoor position tracking method based on inertial sensor
CN108537101B (en) Pedestrian positioning method based on state recognition
JP6462692B2 (en) Autonomous mobile device
CN103968827A (en) Wearable human body gait detection self-localization method
CN108426582B (en) Indoor three-dimensional map matching method for pedestrians
EP3908805A1 (en) Method and system for tracking a mobile device
Deng et al. Foot-mounted pedestrian navigation method based on gait classification for three-dimensional positioning
CN111174781A (en) Inertial navigation positioning method based on wearable device combined target detection
CN109459028A (en) A kind of adaptive step estimation method based on gradient decline
CN108180923A (en) A kind of inertial navigation localization method based on human body odometer
CN113188557A (en) Visual inertial integrated navigation method fusing semantic features
Xia et al. Autonomous pedestrian altitude estimation inside a multi-story building assisted by motion recognition
CN116448103A (en) Pedestrian foot binding type inertial navigation system error correction method based on UWB ranging assistance
CN113848878B (en) Indoor and outdoor three-dimensional pedestrian road network construction method based on crowd source data
Lin et al. LocMe: Human locomotion and map exploitation based indoor localization
CN112729282B (en) Indoor positioning method integrating single anchor point ranging and pedestrian track calculation
CN114674317A (en) Self-correcting dead reckoning system and method based on activity recognition and fusion filtering
CN110332936B (en) Indoor motion trail navigation method based on multiple sensors
CN110766154B (en) Pedestrian track inference method, device, equipment and storage medium
CN107958118A (en) A kind of wireless signal acquiring method based on spatial relationship

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant