CN113720332B - Floor autonomous identification method based on floor height model

Info

Publication number
CN113720332B
CN113720332B (application CN202110762713.1A)
Authority
CN
China
Prior art keywords
flr
str
sec
floor
total
Prior art date
Legal status: Active (assumption, not a legal conclusion)
Application number
CN202110762713.1A
Other languages
Chinese (zh)
Other versions
CN113720332A (en
Inventor
夏鸣
施闯
李团
Current Assignee
Guangzhou Geoelectron Co ltd
Original Assignee
Beihang University
Priority date
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CN202110762713.1A
Publication of CN113720332A
Application granted
Publication of CN113720332B

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20: Instruments for performing navigational calculations
    • G01C21/206: Instruments for performing navigational calculations specially adapted for indoor navigation
    • G01C21/10: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12: Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16: Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/18: Stabilised platforms, e.g. by gyroscope

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a floor autonomous identification method based on a floor height model. Using a shoe-embedded MEMS-IMU sensor, the method completes floor identification by constructing a floor height model. The method is autonomous: it is not limited by weather conditions, is not interfered with by external electromagnetic signals, and requires no extra hardware deployment or preliminary survey work, which makes it suitable for floor estimation in scenarios such as fire fighting and disaster relief; the floor heights of the building need not be acquired in advance. In addition, the algorithm has low complexity and is easy to implement in engineering.

Description

Floor autonomous identification method based on floor height model
Technical Field
The invention relates to the field of indoor positioning, in particular to a floor autonomous identification method based on a floor height model.
Background
In recent years, urban high-rise buildings and basements have become ubiquitous, and the indoor trajectory of a pedestrian includes not only horizontal movement but also vertical movement between floors. In emergency rescue situations such as building fires, information in the vertical direction is often the more important: a height error of a few meters can cause the floor to be misjudged, and a positioning error of one floor is far more serious than a horizontal positioning error of a few meters. Accurate floor information can narrow the spatial search domain, improve rescue efficiency, and help safeguard the lives of those being rescued.
Indoors, satellite positioning cannot provide effective altitude information because of signal occlusion. Barometric height measurement is affected by environmental factors such as indoor temperature and humidity, which degrades elevation accuracy. SLAM (Simultaneous Localization and Mapping) methods acquire indoor three-dimensional information by constructing a dense indoor 3D map, but they involve a large amount of computation and are affected by illumination. Other indoor positioning technologies such as WiFi and UWB (Ultra-Wideband) achieve high positioning accuracy, but facilities must be deployed or a database built in advance at considerable cost, and they are impractical in a fire scene with strong electromagnetic interference, severe occlusion, or even power failure.
With the continuous development of MEMS (Micro-Electro-Mechanical System) technology, positioning based on an IMU (Inertial Measurement Unit) requires no prior deployment of equipment and offers continuity and autonomy, and has therefore received extensive attention from researchers. However, MEMS-IMUs are typically used only for horizontal positioning, because the instability of the inertial vertical channel causes the vertical displacement estimate to diverge.
Disclosure of Invention
The invention aims to solve a problem of the prior art, namely the divergence of inertial height measurement in traditional inertial methods. It provides a floor autonomous identification method based on a floor height model that is not limited by meteorological conditions, is not interfered with by external electromagnetic signals, and requires no extra hardware deployment or preliminary survey work; based only on strapdown inertial positioning assisted by the floor model, it can accurately identify the floor while walking in a complex environment.
In order to achieve the technical purpose, the invention provides a floor autonomous identification method based on a floor height model, which mainly comprises the following steps:
1. an MEMS-IMU sensor is embedded at the heel position;
2. acquiring accelerometer and gyroscope data from the MEMS-IMU sensor, performing strapdown inertial solution according to formula (1), and obtaining the attitude, velocity and position information of the pedestrian;
dP^n/dt = V^n,  dV^n/dt = C_b^n f^b + g^n,  dC_b^n/dt = C_b^n Ω^b    (1)
where the superscripts n and b denote the navigation frame and the carrier (body) frame, respectively; V^n and dV^n/dt are the three-dimensional velocity in the navigation frame and its derivative; dP^n/dt is the derivative of the three-dimensional position in the navigation frame; C_b^n and dC_b^n/dt are the attitude matrix from the b frame to the n frame and its derivative; Ω^b is the skew-symmetric matrix formed from the gyroscope output angular rates; f^b is the specific force in the carrier frame; g^n is the Earth gravity vector;
3. based on the outputs of the accelerometer and gyroscope, zero-velocity update (ZUPT) detection is performed, i.e., the periodic ground contact of the sensor-equipped foot during walking is detected;
4. based on the zero-velocity update detection, the gait-cycle index is recorded as m and the static state within the m-th gait cycle as Stan_m. Within Stan_m the foot is on the ground and the actual walking velocity is zero, while the three-dimensional velocity V calculated in step 2 is not zero; the difference between them is the static velocity error:
dV(Stan_{m,i}) = V(Stan_{m,i}) − [0 0 0]^T    (2)
where Stan_{m,i} is the i-th sample of the static state in gait cycle m; i is an integer from 1 to last, the number of static samples in that gait cycle; the value of last differs between gait cycles;
5. taking the velocity error of step 4 as the observation, the three-dimensional velocity, position and attitude of the positioning result of step 2 are corrected with an extended Kalman filter (EKF), whose 15-dimensional error state vector is defined as:
δX = [δP^n  δV^n  δφ^n  ε^b  ∇^b]^T    (3)
where δP^n, δV^n, δφ^n, ε^b and ∇^b are the three-dimensional position error vector, velocity error vector, attitude error vector, gyro drift and accelerometer bias, respectively;
6. in the m-th gait cycle, the pedestrian heading is denoted yaw_m:
yaw_m = −atan2(C_1, C_2)    (4)
where atan2 is the four-quadrant arctangent with value range [−π, π], and C_1, C_2 are two elements of the attitude matrix C_b^n at the sampling instant Stan_{m,last};
7. according to the height change of each gait cycle, the pedestrian motion state is judged as ascending stairs, descending stairs or level walking, denoted +1, −1 and 0 respectively:
Cls_m = +1 if δP_3(Stan_{m,last}) > σ_th; −1 if δP_3(Stan_{m,last}) < −σ_th; 0 otherwise    (5)
where σ_th is the decision threshold for the motion state; Cls_m is the motion-state type of the m-th gait cycle; δP_3(Stan_{m,last}) is the height difference between the static states of two adjacent gait cycles;
8. the pedestrian height h_m at Stan_m is calculated as:
h_m = h_{m−1} + Cls_m · K    (6)
where K is the height of each step, obtained with a laser rangefinder or from drawing data;
9. the floor where the person is located is denoted flr, and the total number of floor changes made by the pedestrian is denoted flr_total; both are initialized to 1. The floor height is denoted flr_h(flr), with flr_h(1) = 0. The index of the current stair flight is denoted str_sec, initialized to 0; two to three stair flights usually lie between adjacent floors. The direction of a stair flight is denoted str_sec_dir(str_sec);
10. if Cls_m + Cls_{m−1} = 1 and h_m − flr_h(flr_total) > γ1, then:
str_sec = str_sec + 1    (7)
str_sec_dir(str_sec) = yaw_m    (8)
where γ1 is a height threshold greater than zero, m is the gait-cycle index, and yaw_m is the pedestrian heading;
11. if Cls_m + Cls_{m−1} = 1, h_m − flr_h(flr_total) > γ2, and α·π/180 < |str_sec_dir(str_sec) − str_sec_dir(str_sec−1)| < β·π/180, then:
flr = flr + 1    (9)
flr_total = flr_total + 1    (10)
flr_h(flr_total) = h_m    (11)
where γ2 is a height threshold greater than zero, and α and β are angle thresholds greater than zero.
12. If flr takes the value 0, then the value is increased by 1.
13. If Cls_m + Cls_{m−1} = −1 and h_m − flr_h(flr_total) < −γ1, then:
str_sec = str_sec + 1    (12)
str_sec_dir(str_sec) = yaw_m    (13)
where γ1 is the same height threshold greater than zero as defined in step 10.
14. If Cls_m + Cls_{m−1} = −1, h_m − flr_h(flr_total) < −γ2, and α·π/180 < |str_sec_dir(str_sec) − str_sec_dir(str_sec−1)| < β·π/180, then:
flr = flr − 1    (14)
flr_total = flr_total + 1    (15)
flr_h(flr_total) = h_m    (16)
where γ2 is a height threshold greater than zero, and α and β are angle thresholds greater than zero.
15. If flr takes the value 0, then the value is subtracted by 1.
16. Whether to finish is judged by whether the MEMS-IMU continues to collect data; if collection has stopped, the operation ends; if not, return to step 2.
The floor recognition method is autonomous: it is not limited by meteorological conditions, is not interfered with by external electromagnetic signals, and requires no extra hardware deployment or preliminary survey work. It is suitable for floor estimation in scenarios such as fire fighting and disaster relief, and the floor heights of the building need not be acquired in advance. The proposed algorithm also has low complexity and is easy to implement in engineering.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a general block diagram of the system.
Fig. 2 is a flow chart of floor autonomous identification based on floor height model assistance.
Fig. 3 is an example of floor identification.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the specification of the present invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
The invention aims to provide accurate floor information for firemen, combat soldiers and the like without depending on external infrastructure or under the conditions of being not influenced by electromagnetic interference, meteorological conditions, object shielding and the like. The present invention is described in further detail below with reference to the attached drawing figures. The MEMS-IMU sensor is arranged at the heel position of a user, firstly, strapdown Inertial solution is carried out on the acceleration of the MEMS-IMU sensor and the output of a gyroscope under the traditional IEZ (Inertial Navigation System-Extended Kalman Filter-Zero Velocity Update, INS-EKF-ZUPT) framework; secondly, the model established based on the direction of the stair and the vertical motion state can realize the autonomous identification of the floor. A block diagram of a floor autonomous identification system based on floor height model assistance is shown in fig. 1. Fig. 2 is a flow chart of floor autonomous identification based on floor height model assistance, and the flow chart comprises the following steps:
1. An MEMS-IMU sensor is embedded at the heel position;
2. Acquire accelerometer and gyroscope data from the MEMS-IMU sensor and perform strapdown inertial solution according to formula (1) to obtain the attitude, velocity and position information of the pedestrian.
dP^n/dt = V^n,  dV^n/dt = C_b^n f^b + g^n,  dC_b^n/dt = C_b^n Ω^b    (1)
where the superscripts n and b denote the navigation frame and the carrier (body) frame, respectively; V^n and dV^n/dt are the three-dimensional velocity in the navigation frame and its derivative; dP^n/dt is the derivative of the three-dimensional position in the navigation frame; C_b^n and dC_b^n/dt are the attitude matrix from the b frame to the n frame and its derivative; Ω^b is the skew-symmetric matrix formed from the gyroscope output angular rates; f^b is the specific force in the carrier frame; g^n is the Earth gravity vector.
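The strapdown equations (1) are integrated once per IMU sample. A minimal discrete-time sketch is given below for illustration only; the variable names, sample interval dt and gravity constant are assumptions, not part of the claimed method:

```python
import numpy as np

def strapdown_update(C, v, p, gyro, accel, dt, g=np.array([0.0, 0.0, -9.81])):
    """One discrete integration step of the strapdown equations (1).

    C     : 3x3 attitude matrix C_b^n (body frame -> navigation frame)
    v, p  : velocity and position in the navigation frame
    gyro  : angular rate omega^b from the gyroscope (rad/s)
    accel : specific force f^b from the accelerometer (m/s^2)
    """
    # Skew-symmetric matrix Omega^b built from the gyro output
    wx, wy, wz = gyro
    Omega = np.array([[0.0, -wz,  wy],
                      [ wz, 0.0, -wx],
                      [-wy,  wx, 0.0]])
    C_new = C @ (np.eye(3) + Omega * dt)   # dC/dt = C * Omega^b
    f_n = C @ accel                        # rotate specific force into n frame
    v_new = v + (f_n + g) * dt             # dV/dt = C f^b + g^n
    p_new = p + v * dt                     # dP/dt = V^n
    return C_new, v_new, p_new
```

With a stationary, level sensor (zero angular rate, specific force equal and opposite to gravity), the state remains unchanged, as expected.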
3. Based on the outputs of the accelerometer and gyroscope, ZUPT (Zero Velocity Update) detection is performed, i.e., the periodic ground contact of the sensor-equipped foot during walking is detected.
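One common stance detector for this step, shown here as an illustrative sketch only (the thresholds are assumptions, and the patent does not specify the detector it uses), flags samples whose specific-force magnitude is close to gravity and whose angular rate is small:

```python
import numpy as np

def detect_stance(accel, gyro, acc_tol=0.5, gyro_tol=0.5, g=9.81):
    """Flag samples where the foot is static (ZUPT candidates).

    A sample counts as stance when the specific-force magnitude is close
    to gravity and the angular rate is small.
    accel, gyro: arrays of shape (N, 3); returns a boolean array of length N.
    """
    acc_norm = np.linalg.norm(accel, axis=1)
    gyr_norm = np.linalg.norm(gyro, axis=1)
    return (np.abs(acc_norm - g) < acc_tol) & (gyr_norm < gyro_tol)
```

Runs of consecutive True samples then form the static phases Stan_m referenced in the following steps.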
4. Based on the ZUPT detection in the previous step, record the gait-cycle index as m and the static state within the m-th gait cycle as Stan_m. Within Stan_m the foot is on the ground and the actual walking velocity is zero, while the velocity calculated in step (2) is not zero; the difference between them is the velocity error in the static state:
dV(Stan_{m,i}) = V(Stan_{m,i}) − [0 0 0]^T    (2)
where Stan_{m,i} is the i-th sample of the static state in gait cycle m; i is an integer from 1 to last, the number of static samples; the value of last differs between gait cycles.
5. Taking the velocity error of step (4) as the observation, correct the positioning result of step (2) with an EKF (Extended Kalman Filter). The 15-dimensional error state vector in the EKF is defined as:
δX = [δP^n  δV^n  δφ^n  ε^b  ∇^b]^T    (3)
where δP^n, δV^n, δφ^n, ε^b and ∇^b are the three-dimensional position error vector, velocity error vector, attitude error vector, gyro drift and accelerometer bias, respectively.
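The EKF measurement update driven by the zero-velocity observation (2) can be sketched as follows. Only the update step is shown, not the full filter, and the measurement-noise value is an illustrative assumption:

```python
import numpy as np

def zupt_ekf_update(x, P, v_ins, R=None):
    """EKF measurement update using the zero-velocity observation (2).

    x : 15-dim error state [dP(3), dV(3), dPhi(3), gyro bias(3), accel bias(3)]
    P : 15x15 error covariance
    v_ins : velocity from the strapdown solution; during stance the observed
            velocity error is v_ins minus the true (zero) velocity.
    """
    H = np.zeros((3, 15))
    H[:, 3:6] = np.eye(3)            # observe only the velocity-error block
    if R is None:
        R = 0.01 * np.eye(3)         # illustrative measurement noise
    z = v_ins                        # dV observation: v_ins - [0 0 0]
    S = H @ P @ H.T + R              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x + K @ (z - H @ x)
    P_new = (np.eye(15) - K @ H) @ P
    return x_new, P_new
```

The estimated error state is then fed back to correct the velocity, position and attitude of the strapdown solution.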
6. In the m-th gait cycle, the pedestrian heading is denoted yaw_m:
yaw_m = −atan2(C_1, C_2)    (4)
where atan2 is the four-quadrant arctangent with value range [−π, π], and C_1, C_2 are two elements of the attitude matrix C_b^n at the sampling instant Stan_{m,last}.
7. According to the height change of each gait cycle, the pedestrian motion state {ascending stairs, descending stairs, level walking} is judged, denoted +1, −1 and 0 respectively:
Cls_m = +1 if δP_3(Stan_{m,last}) > σ_th; −1 if δP_3(Stan_{m,last}) < −σ_th; 0 otherwise    (5)
where σ_th is the decision threshold for the motion state; Cls_m is the motion-state type; δP_3(Stan_{m,last}) is the gait-cycle-to-gait-cycle height difference obtained from step (2).
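The classification in (5) is a simple threshold test on the per-cycle height change; a minimal sketch, with an illustrative threshold value in metres:

```python
def classify_step(dh, sigma_th=0.08):
    """Classify the motion state of one gait cycle from its height change (5).

    dh: height difference between the static phases of two adjacent gait
    cycles; sigma_th is an illustrative threshold (m).
    Returns +1 (upstairs), -1 (downstairs) or 0 (level walking).
    """
    if dh > sigma_th:
        return 1
    if dh < -sigma_th:
        return -1
    return 0
```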
8. The pedestrian height h_m at Stan_m is calculated as:
h_m = h_{m−1} + Cls_m · K    (6)
where K is the height of each step and is obtained, for example, with a laser rangefinder or from drawing data.
9. The floor where the person is located is denoted flr, and the total number of floor changes made by the pedestrian is denoted flr_total; both are initialized to 1. The floor height is denoted flr_h(flr), with flr_h(1) = 0. The index of the current stair flight is denoted str_sec, initialized to 0; two to three stair flights usually lie between adjacent floors. The direction of a stair flight is denoted str_sec_dir(str_sec).
10. If Cls_m + Cls_{m−1} = 1 and h_m − flr_h(flr_total) > γ1, then:
str_sec = str_sec + 1    (7)
str_sec_dir(str_sec) = yaw_m    (8)
where γ1 is a height threshold greater than zero.
11. If Cls_m + Cls_{m−1} = 1, h_m − flr_h(flr_total) > γ2, and α·π/180 < |str_sec_dir(str_sec) − str_sec_dir(str_sec−1)| < β·π/180, then:
flr = flr + 1    (9)
flr_total = flr_total + 1    (10)
flr_h(flr_total) = h_m    (11)
where γ2 is a height threshold greater than zero, and α and β are angle thresholds greater than zero.
12. If flr takes the value 0, then the value is increased by 1.
13. If Cls_m + Cls_{m−1} = −1 and h_m − flr_h(flr_total) < −γ1, then:
str_sec = str_sec + 1    (12)
str_sec_dir(str_sec) = yaw_m    (13)
where γ1 is a height threshold greater than zero.
14. If Cls_m + Cls_{m−1} = −1, h_m − flr_h(flr_total) < −γ2, and α·π/180 < |str_sec_dir(str_sec) − str_sec_dir(str_sec−1)| < β·π/180, then:
flr = flr − 1    (14)
flr_total = flr_total + 1    (15)
flr_h(flr_total) = h_m    (16)
where γ2 is a height threshold greater than zero, and α and β are angle thresholds greater than zero.
15. If flr takes the value 0, then the value is subtracted by 1.
16. Whether to finish is judged by whether the MEMS-IMU continues to collect data; if collection has stopped, the operation ends; if not, return to step 2.
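Steps 9 through 15 above amount to a small state machine evaluated once per gait cycle. The following is a minimal sketch under stated assumptions: the data layout, the threshold values γ1, γ2, α, β and the handling of the skipped floor number 0 are illustrative choices, not values fixed by the method:

```python
import math

def make_state():
    """Initial model state (step 9): flr = flr_total = 1, flr_h(1) = 0,
    str_sec = 0; index 0 of each list is a placeholder so that list indices
    match the 1-based indices in the text."""
    return {"flr": 1, "flr_total": 1, "flr_h": [None, 0.0],
            "str_sec": 0, "str_dir": [None]}

def update_floor(st, cls_m, cls_prev, h_m, yaw_m,
                 gamma1=0.5, gamma2=2.0, alpha=90.0, beta=270.0):
    """One gait-cycle update of the floor-height model (steps 10-15).

    cls_m, cls_prev: motion states Cls_m and Cls_{m-1}; h_m: height at Stan_m;
    yaw_m: heading (rad). gamma1/gamma2 (m) and alpha/beta (deg) are
    illustrative thresholds.
    """
    dh = h_m - st["flr_h"][st["flr_total"]]
    s = cls_m + cls_prev
    a, b = math.radians(alpha), math.radians(beta)

    def turned():
        # heading change between the two most recent stair flights
        if st["str_sec"] < 2:
            return False
        d = abs(st["str_dir"][st["str_sec"]] - st["str_dir"][st["str_sec"] - 1])
        return a < d < b

    if s == 1 and dh > gamma1:                  # step 10: new ascending flight
        st["str_sec"] += 1
        st["str_dir"].append(yaw_m)
    if s == 1 and dh > gamma2 and turned():     # step 11: climbed one floor
        st["flr"] += 1
        st["flr_total"] += 1
        st["flr_h"].append(h_m)
        if st["flr"] == 0:                      # step 12: no floor numbered 0
            st["flr"] += 1
    if s == -1 and dh < -gamma1:                # step 13: new descending flight
        st["str_sec"] += 1
        st["str_dir"].append(yaw_m)
    if s == -1 and dh < -gamma2 and turned():   # step 14: descended one floor
        st["flr"] -= 1
        st["flr_total"] += 1
        st["flr_h"].append(h_m)
        if st["flr"] == 0:                      # step 15: no floor numbered 0
            st["flr"] -= 1
    return st
```

Two successive ascending flights with an approximately 180-degree heading reversal between them, and sufficient height gained, then register as one floor climbed.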
Fig. 3 shows the vertical displacement and floor information of a user walking up and down six floors indoors. The user starts in the first-floor corridor and walks to the staircase, climbs to the sixth floor, and then returns to the starting point along the original track. There are 3 stair flights and 33 steps between the first and second floors; each of the remaining floor intervals contains 2 stair flights and 26 steps. A total of 148 gait cycles are spent on stairs going from the first floor to the sixth and back. The results show that the floor identification accuracy of the proposed method reaches 100%.
All or part of the flow of the methods of the embodiments may be implemented by a computer program that instructs related hardware; the program may be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments. The computer program comprises computer program code, which may be in source-code, object-code, executable-file or some intermediate form. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB disk, a removable hard disk, a magnetic disk, an optical disk, computer memory, read-only memory (ROM), random-access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content of the computer-readable medium may be suitably increased or decreased as required by legislation and patent practice in a jurisdiction; for example, in some jurisdictions, computer-readable media may not include electrical carrier signals and telecommunications signals.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the embodiments of the present invention, and they should be construed as being included therein.

Claims (7)

1. A floor autonomous identification method based on a floor height model is characterized by comprising the following method steps:
embedding an MEMS-IMU sensor in a heel position;
collecting accelerometer and gyroscope data from the MEMS-IMU sensor, performing strapdown inertial solution according to formula (1), and acquiring the attitude, velocity and position information of the pedestrian;
dP^n/dt = V^n,  dV^n/dt = C_b^n f^b + g^n,  dC_b^n/dt = C_b^n Ω^b    (1)
where the superscripts n and b denote the navigation frame and the carrier (body) frame, respectively; V^n and dV^n/dt are the three-dimensional velocity in the navigation frame and its derivative; dP^n/dt is the derivative of the three-dimensional position in the navigation frame; C_b^n and dC_b^n/dt are the attitude matrix from the b frame to the n frame and its derivative; Ω^b is the skew-symmetric matrix formed from the gyroscope output angular rates; f^b is the specific force in the carrier frame; g^n is the Earth gravity vector;
performing zero-velocity update detection based on the outputs of the accelerometer and the gyroscope, namely detecting the periodic ground contact of the sensor-equipped foot during walking;
step four, based on the zero-velocity update detection, recording the gait-cycle index as m and the static state within the m-th gait cycle as Stan_m; within Stan_m the foot is on the ground and the actual walking velocity is zero, while the three-dimensional velocity V calculated in step two is not zero, the difference between them being the static velocity error:
dV(Stan_{m,i}) = V(Stan_{m,i}) − [0 0 0]^T    (2)
where Stan_{m,i} is the i-th sample of the static state in gait cycle m; i is an integer from 1 to last, the number of static samples in that gait cycle; the value of last differs between gait cycles;
and step five, taking the speed error in the step four as an observed quantity, correcting the three-dimensional speed, the three-dimensional position and the three-dimensional attitude of the positioning result obtained in the step two by adopting extended Kalman filtering, wherein a 15-dimensional error state vector in the extended Kalman filtering is defined as follows:
δX = [δP^n  δV^n  δφ^n  ε^b  ∇^b]^T    (3)
where δP^n, δV^n, δφ^n, ε^b and ∇^b are the three-dimensional position error vector, velocity error vector, attitude error vector, gyro drift and accelerometer bias, respectively;
step six, in the m-th gait cycle, recording the pedestrian heading as yaw_m:
yaw_m = −atan2(C_1, C_2)    (4)
where atan2 is the four-quadrant arctangent with value range [−π, π], and C_1, C_2 are two elements of the attitude matrix C_b^n at the sampling instant Stan_{m,last};
step seven, according to the height change of each gait cycle, judging the pedestrian motion state as ascending stairs, descending stairs or level walking, denoted +1, −1 and 0 respectively:
Cls_m = +1 if δP_3(Stan_{m,last}) > σ_th; −1 if δP_3(Stan_{m,last}) < −σ_th; 0 otherwise    (5)
where σ_th is the decision threshold for the motion state; Cls_m is the motion-state type of the m-th gait cycle; δP_3(Stan_{m,last}) is the height difference between the static states of two adjacent gait cycles;
step eight, calculating the pedestrian height h_m at Stan_m as:
h_m = h_{m−1} + Cls_m · K    (6)
where K is the height of each step, obtained with a laser rangefinder or from drawing data;
step nine, denoting the floor where the person is located by flr and the total number of floor changes made by the pedestrian by flr_total, both initialized to 1; denoting the floor height by flr_h(flr), with flr_h(1) = 0; denoting the index of the current stair flight by str_sec, initialized to 0, two to three stair flights lying between adjacent floors; denoting the direction of a stair flight by str_sec_dir(str_sec);
step ten, if Cls_m + Cls_{m−1} = 1 and h_m − flr_h(flr_total) > γ1, then:
str_sec = str_sec + 1    (7)
str_sec_dir(str_sec) = yaw_m    (8)
where γ1 is a height threshold greater than zero, m is the gait-cycle index, and yaw_m is the pedestrian heading;
step eleven, if Cls_m + Cls_{m−1} = 1, h_m − flr_h(flr_total) > γ2, and α·π/180 < |str_sec_dir(str_sec) − str_sec_dir(str_sec−1)| < β·π/180, then:
flr = flr + 1    (9)
flr_total = flr_total + 1    (10)
flr_h(flr_total) = h_m    (11)
where γ2 is a height threshold greater than zero, and α and β are angle thresholds greater than zero.
2. The floor autonomous identification method based on a floor height model according to claim 1, wherein if the value of flr becomes 0, it is increased by 1.
3. The floor autonomous identification method based on a floor height model according to claim 2, wherein if Cls_m + Cls_{m−1} = −1 and h_m − flr_h(flr_total) < −γ1, then:
str_sec = str_sec + 1    (12)
str_sec_dir(str_sec) = yaw_m    (13)
where γ1 is a height threshold greater than zero.
4. The floor autonomous identification method based on a floor height model according to claim 3, wherein if Cls_m + Cls_{m−1} = −1, h_m − flr_h(flr_total) < −γ2, and α·π/180 < |str_sec_dir(str_sec) − str_sec_dir(str_sec−1)| < β·π/180, then:
flr = flr − 1    (14)
flr_total = flr_total + 1    (15)
flr_h(flr_total) = h_m    (16)
where γ2 is a height threshold greater than zero, and α and β are angle thresholds greater than zero.
5. The floor autonomous identification method of a floor height model according to claim 4, wherein if the value of flr is 0, 1 is subtracted from the value.
6. The floor autonomous identification method based on a floor height model according to claim 4, wherein whether to end is judged by whether the MEMS-IMU continues to collect data; if collection has stopped, the operation ends; if not, the method returns to step two.
7. A readable storage medium, characterized by comprising a program or instructions for performing the method of any of claims 1 to 6 when the program or instructions are run on a computer.
CN202110762713.1A 2021-06-30 2021-06-30 Floor autonomous identification method based on floor height model Active CN113720332B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110762713.1A CN113720332B (en) 2021-06-30 2021-06-30 Floor autonomous identification method based on floor height model

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110762713.1A CN113720332B (en) 2021-06-30 2021-06-30 Floor autonomous identification method based on floor height model

Publications (2)

Publication Number Publication Date
CN113720332A CN113720332A (en) 2021-11-30
CN113720332B true CN113720332B (en) 2022-06-07

Family

ID=78673018

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110762713.1A Active CN113720332B (en) 2021-06-30 2021-06-30 Floor autonomous identification method based on floor height model

Country Status (1)

Country Link
CN (1) CN113720332B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011175447A (en) * 2010-02-24 2011-09-08 Zenrin Datacom Co Ltd Apparatus, method and program for generating three-dimensional position information
CN105898863A (en) * 2016-05-12 2016-08-24 西北工业大学 Indoor floor positioning method based on TOA (Time Of Arrival)
CN108426582A (en) * 2018-03-03 2018-08-21 北京工业大学 Three-dimensional map matching process in pedestrian room
CN109579832A (en) * 2018-11-26 2019-04-05 重庆邮电大学 A kind of personnel's height autonomous positioning algorithm
CN110207704A (en) * 2019-05-21 2019-09-06 南京航空航天大学 A kind of pedestrian navigation method based on the identification of architectural stair scene intelligent
CN111337026A (en) * 2020-02-17 2020-06-26 安徽建筑大学 Indoor positioning system
CN111649742A (en) * 2020-05-08 2020-09-11 北京航空航天大学 Elevation estimation method based on ANFIS assistance
CN111765887A (en) * 2020-07-10 2020-10-13 北京航空航天大学 Indoor three-dimensional positioning method based on MEMS sensor and FM broadcast signal

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
Chuanhua Lu et al. "Indoor Positioning System Based on Chest-Mounted IMU". Sensors. 2019, Vol. 19, No. 2. *
Haque, F. et al. "A Sensor Fusion-Based Framework for Floor Localization". IEEE Sensors Journal. 2019, Vol. 19, No. 2. *
Li Cong et al. "A Practical Floor Localization Algorithm Based on Multifeature Motion Mode Recognition Utilizing FM Radio Signals and Inertial Sensors". 2020, Vol. 20, No. 15. *
Ming Xia et al. "Autonomous Pedestrian Altitude Estimation Inside a Multi-Story Building Assisted by Motion Recognition". IEEE Access. 2020, No. 8. *
Liu Yu et al. "Pedestrian indoor height positioning algorithm based on a combination of MEMS sensors". Piezoelectrics & Acoustooptics. 2019, Vol. 41, No. 5. *
Shi Chuang et al. "Research on fog positioning and its applications". GNSS World of China. 2019, Vol. 44, No. 5. *

Also Published As

Publication number Publication date
CN113720332A (en) 2021-11-30

Similar Documents

Publication Publication Date Title
Tong et al. A double-step unscented Kalman filter and HMM-based zero-velocity update for pedestrian dead reckoning using MEMS sensors
CN106017461B (en) Pedestrian navigation system three-dimensional fix method based on human body/environmental constraints
US9664521B2 (en) System and method for localizing a trackee at a location and mapping the location using signal-based features
Beauregard Omnidirectional pedestrian navigation for first responders
Romanovas et al. A study on indoor pedestrian localization algorithms with foot-mounted sensors
Ladetto et al. In step with INS navigation for the blind, tracking emergency crews
Ruppelt et al. High-precision and robust indoor localization based on foot-mounted inertial sensors
Zhang et al. Pedestrian motion based inertial sensor fusion by a modified complementary separate-bias Kalman filter
KR101642286B1 (en) Heading Orientation Estimation Method Using Pedestrian Characteristics in Indoor Environment
CN111649742B (en) Elevation estimation method based on ANFIS assistance
Ruppelt et al. A novel finite state machine based step detection technique for pedestrian navigation systems
Rydell et al. CHAMELEON: Visual-inertial indoor navigation
Xia et al. Autonomous pedestrian altitude estimation inside a multi-story building assisted by motion recognition
CN113720332B (en) Floor autonomous identification method based on floor height model
Zhang et al. Indoor localization using inertial sensors and ultrasonic rangefinder
Kim et al. Height estimation scheme of low-cost pedestrian dead-reckoning system using Kalman Filter and walk condition estimation algorithm
An et al. Three-dimensional indoor location estimation using single inertial navigation system with linear regression
CN113483753B (en) Inertial course error elimination method based on environmental constraint
Davidson Algorithms for autonomous personal navigation systems
Li et al. A robust humanoid robot navigation algorithm with ZUPT
Ascher et al. Using OrthoSLAM and aiding techniques for precise pedestrian indoor navigation
Hili et al. Pedestrian tracking through inertial measurements
Gui et al. Heading constraint algorithm for foot-mounted PNS using low-cost IMU
CN113483763B (en) Indoor personnel elevation estimation method with autonomy
Wang et al. 3D reconstruction of pedestrian trajectory with moving direction learning and optimal gait recognition

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20230223

Address after: 702, 704, No.7, Caibin Road, Science City, High-tech Industrial Development Zone, Guangzhou, Guangdong, 510700

Patentee after: GUANGZHOU GEOELECTRON Co.,Ltd.

Address before: No. 37, Xueyuan Road, Haidian District, Beijing, 100191

Patentee before: BEIHANG University
