WO2017215024A1 - Pedestrian navigation device and method based on a novel multi-sensor fusion technology - Google Patents


Info

Publication number
WO2017215024A1
WO2017215024A1
Authority
WO
WIPO (PCT)
Prior art keywords
module
imu
processing unit
raw data
heading angle
Prior art date
Application number
PCT/CN2016/087281
Other languages
English (en)
Chinese (zh)
Inventor
庄园
杨军
戚隆宁
李由
Original Assignee
东南大学
Priority date
Filing date
Publication date
Application filed by 东南大学
Publication of WO2017215024A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/10 - Navigation by using measurements of speed or acceleration
    • G01C21/12 - Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 - Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 - Inertial navigation combined with non-inertial navigation instruments
    • G01C21/005 - Navigation with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C21/26 - Navigation specially adapted for navigation in a road network
    • G01C21/28 - Navigation in a road network with correlation of data from several navigational instruments
    • G01D - MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D21/00 - Measuring or testing not otherwise provided for
    • G01D21/02 - Measuring two or more variables by means not covered by a single other subclass

Definitions

  • the invention relates to the field of multi-sensor fusion and pedestrian navigation, in particular to a pedestrian navigation device and method based on a novel multi-sensor fusion technology.
  • Pedestrian navigation technologies based on wireless systems such as WiFi and Bluetooth suffer from several defects: wireless signal strength fluctuates strongly in harsh environments; complete navigation information such as three-dimensional position, velocity, and attitude cannot be provided; system performance depends heavily on the distribution and number of transmitting devices; and the reported location is neither continuous nor smooth.
  • pedestrian navigation based on a micro inertial unit is accurate in the short term, but its navigation error accumulates quickly. Camera-based visual positioning suffers from slow calibration of the visual sensors, a high error rate in feature extraction, and slow computation of navigation information in complex environments. Multi-sensor fusion has therefore become the mainstream solution for pedestrian navigation.
  • the existing multi-sensor fusion technology generally includes the following steps: (1) compute the position, velocity, and attitude of the tracked object with the inertial mechanization algorithm from the measurements of the inertial unit (three-axis accelerometer and three-axis gyroscope); (2) establish the error model corresponding to the inertial mechanization algorithm and use it as the system model of the fusion filter; (3) establish other auxiliary systems (GPS, WiFi, Bluetooth, RFID, GNSS, etc.) as the observation model of the fusion filter; (4) estimate the system state errors through the prediction and update steps of the fusion filter; and (5) compensate the inertial unit errors and the position, velocity, and attitude computed by the inertial mechanization algorithm with the estimated state errors to obtain the final position, velocity, and attitude.
  • the existing multi-sensor fusion technology has the following two fatal disadvantages: (1) in the absence of other auxiliary systems, the navigation error accumulates rapidly; (2) when the inertial unit is not rigidly fixed to the carrier (for example, a mobile phone carried by a pedestrian), traditional multi-sensor fusion cannot correctly estimate the carrier's motion. Therefore, the existing multi-sensor fusion technology cannot provide accurate pedestrian navigation information in many scenarios.
  • the technical problem to be solved by the present invention is to provide a pedestrian navigation device and method based on a novel multi-sensor fusion technology, which improve the accuracy and usability of pedestrian navigation.
  • the present invention provides a pedestrian navigation device and method based on a novel multi-sensor fusion technology, including a handheld smart device platform, an observation processing unit, and a fusion filter. The handheld smart device uses its own hardware to acquire raw data from the Inertial Measurement Unit (IMU), magnetometer, pressure gauge, WiFi, Bluetooth Low Energy (BLE), and Global Navigation Satellite System (GNSS).
  • the observation processing unit processes the raw data provided by the smart device and provides position or velocity observations to the fusion filter.
  • the fusion filter uses a kinematic model as the system model and the observation processing unit's output as the observation model; after the fusion filter's processing, the pedestrian navigation result is finally obtained.
  • the handheld smart device platform comprises an IMU, a magnetometer, a pressure gauge, WiFi, Bluetooth Low Energy (BLE), and GNSS, which are common in existing smart devices;
  • the IMU provides raw data of acceleration and angular velocity;
  • the magnetometer provides raw data of geomagnetism;
  • the pressure gauge provides raw data of atmospheric pressure;
  • WiFi provides raw data of Received Signal Strength (RSS);
  • BLE provides raw data of BLE RSS;
  • the GNSS receiver provides raw GNSS data; any other sensor of the smart device that can provide observation information can be included in the proposed multi-sensor fusion algorithm.
  • the observation processing unit comprises an IMU processing unit, a magnetometer processing unit, a pressure gauge processing unit, a WiFi processing unit, a BLE processing unit, a GNSS processing unit, and so on. The IMU processing unit processes the raw acceleration and angular velocity data provided by the IMU to obtain IMU position information and transmits it to the fusion filter;
  • the magnetometer processing unit processes the raw geomagnetic data provided by the magnetometer to obtain geomagnetic position information and transmits it to the fusion filter;
  • the pressure gauge processing unit processes the raw atmospheric-pressure data provided by the pressure gauge to obtain elevation information and transmits it to the fusion filter;
  • the WiFi processing unit processes the raw RSS data provided by the WiFi to obtain WiFi location information and transmits it to the fusion filter;
  • the BLE processing unit processes the raw RSS data provided by the BLE to obtain BLE location information and transmits it to the fusion filter;
  • the GNSS processing unit processes the position and velocity information provided by the GNSS receiver and transmits it to the fusion filter.
  • the observation processing unit may also include other processing units that process other sensors of the smart device platform to obtain position or velocity information and transmit it to the fusion filter.
  • the fusion filter comprises a system model and an observation model. The system model uses the kinematic model to predict the position and velocity of the object to be tracked and transmits them to the observation model; the observation model combines the system model's predicted position and velocity with the position and velocity information from the IMU, magnetometer, pressure gauge, WiFi, BLE, and GNSS provided by the observation processing unit to update the final position and velocity of the target.
  • the IMU processing unit comprises a user-motion-mode and device-usage-mode recognition module, a heading angle deviation estimation module, and an improved dead reckoning algorithm module. The recognition module identifies user motion modes such as stationary, walking, and running, and device usage modes such as handheld, texting, phoning, navigation, pocket, and backpack, from the raw data provided by the IMU of the handheld smart device platform and other optional hardware.
  • the heading angle deviation estimation module estimates the heading angle deviation from the motion and usage modes output by the recognition module and the raw data provided by the IMU of the handheld smart device platform and other optional hardware (e.g., a magnetometer).
  • the improved dead reckoning algorithm module obtains the IMU position information from the heading angle deviation output by the heading angle deviation estimation module and the raw data provided by the IMU of the handheld smart device platform and other optional hardware (e.g., a magnetometer), and transmits it to the fusion filter.
  • the improved dead reckoning algorithm module comprises an attitude measuring system module, a heading angle deviation compensation module, a step detection module, a step length estimation module, and a dead reckoning algorithm module. The attitude measuring system module recognizes the attitude of the handheld smart device from the raw data provided by the platform's IMU and optional magnetometer; the heading angle deviation compensation module reads the heading angle deviation output by the heading angle deviation estimation module, compensates the pedestrian heading angle, and outputs it to the dead reckoning algorithm; the step detection module detects the pedestrian's steps from the raw IMU data of the handheld smart device platform and feeds the result to the step length estimation module; the step length estimation module estimates the pedestrian's step length from the step detection result and the raw IMU data and feeds it to the dead reckoning module; the dead reckoning module computes the IMU position observation from the step length information output by the step length estimation module and the pedestrian heading angle output by the heading angle deviation compensation module, and feeds it to the fusion filter.
  • the present invention also provides a pedestrian navigation method based on a novel multi-sensor fusion technology, comprising the following steps:
  • (1) the handheld smart device uses its own hardware to obtain raw data from the IMU, magnetometer, pressure gauge, WiFi, BLE, and GNSS; (2) the observation processing unit processes the raw data provided by the handheld smart device and provides position or velocity observations to the fusion filter; (3) the fusion filter uses the kinematic model as the system model and the observation processing unit's output to establish the observation model; after fusion filtering, the pedestrian navigation result is finally obtained.
  • the observation processing unit processes the raw data provided by the handheld smart device and provides position or velocity observations to the fusion filter through the following steps:
  • the IMU processing unit processes the raw acceleration and angular velocity data provided by the IMU to obtain IMU position information and transmits it to the fusion filter;
  • the magnetometer processing unit processes the raw geomagnetic data provided by the magnetometer to obtain geomagnetic position information and transmits it to the fusion filter;
  • the pressure gauge processing unit processes the raw atmospheric-pressure data provided by the pressure gauge to obtain elevation information and transmits it to the fusion filter;
  • the WiFi processing unit processes the raw RSS data provided by the WiFi to obtain WiFi location information and transmits it to the fusion filter;
  • the BLE processing unit processes the raw RSS data provided by the BLE to obtain BLE location information and transmits it to the fusion filter;
  • the GNSS processing unit processes the raw GNSS data provided by the GNSS chip to obtain GNSS position and velocity information and transmits it to the fusion filter.
  • the IMU processing unit processes the raw acceleration and angular velocity data provided by the IMU to obtain IMU position information and transmits it to the fusion filter through the following steps:
  • the user-motion-mode and device-usage-mode recognition module identifies user motion modes such as stationary, walking, and running, and device usage modes such as handheld, texting, phoning, navigation, pocket, and backpack, from the raw data provided by the IMU of the handheld smart device platform and other optional hardware (e.g., a magnetometer);
  • the heading angle deviation estimation module estimates the heading angle deviation from the motion and usage modes output by the recognition module and the raw data provided by the IMU of the handheld smart device platform and other optional hardware (e.g., a magnetometer);
  • the improved dead reckoning algorithm module obtains the IMU position information from the heading angle deviation output by the heading angle deviation estimation module and the raw data provided by the IMU of the handheld smart device platform and other optional hardware (e.g., a magnetometer), and transmits it to the fusion filter.
  • the improved dead reckoning algorithm module comprises the following steps:
  • the attitude measuring system module identifies the attitude of the handheld smart device from the raw data provided by the platform's IMU and optional magnetometer;
  • the heading angle deviation compensation module reads the heading angle deviation output by the heading angle deviation estimation module, compensates the pedestrian heading angle, and outputs it to the dead reckoning algorithm;
  • the step detection module detects the pedestrian's steps from the raw IMU data of the handheld smart device platform and feeds the result to the step length estimation module;
  • the step length estimation module estimates the pedestrian's step length from the step detection result and the raw IMU data of the handheld smart device platform, and feeds it to the dead reckoning module;
  • the dead reckoning module calculates the IMU position observation from the step length information output by the step length estimation module and the pedestrian heading angle information output by the heading angle deviation compensation module, and feeds it to the fusion filter.
  • the invention has the beneficial effect that it changes how the IMU is used in traditional pedestrian navigation, moving it from the system model of the fusion filter into the observation model, and thereby overcomes the disadvantage of traditional multi-sensor fusion that, without other auxiliary systems, navigation errors accumulate quickly.
  • the IMU processing module in the present invention considers the various usage modes of handheld smart devices in daily life and breaks through the limitation of traditional multi-sensor fusion that the IMU must be rigidly fixed to the carrier. The present invention therefore greatly improves the accuracy and usability of pedestrian navigation.
  • FIG. 1 is a schematic structural diagram of a pedestrian navigation device based on a novel multi-sensor fusion according to the present invention.
  • FIG. 2 is a schematic structural view of an inertial unit processing module according to the present invention.
  • FIG. 3 is a schematic diagram of a Gaussian kernel support vector machine nonlinear classifier according to the present invention.
  • FIG. 4 is a schematic diagram of a user motion mode and a smart device usage pattern recognition support vector machine in the present invention.
  • FIG. 5 is a schematic diagram of an improved pedestrian dead reckoning algorithm in the present invention.
  • a pedestrian navigation device based on the novel multi-sensor fusion includes: a handheld smart device platform 1, an observation processing unit 2, and a fusion filter 3.
  • the handheld smart device 1 uses its own hardware to acquire raw data from an Inertial Measurement Unit (IMU) 11, a magnetometer 12, a pressure gauge 13, a WiFi 14, a Bluetooth Low Energy (BLE) 15, and a Global Navigation Satellite System (GNSS) 16.
  • the observation processing unit 2 processes the raw data provided by the handheld smart device 1 and provides position or velocity observations to the fusion filter 3; the fusion filter 3 uses the kinematic model as the system model 31 and the observation processing unit's output to establish the observation model 32; after the fusion filter 3's processing, the pedestrian navigation result is finally obtained.
  • the above novel multi-sensor fusion pedestrian navigation device can be used for various handheld smart devices (including smart phones, tablet computers, smart watches, etc.), and the handheld smart device can be held by hand or fixed on a pedestrian.
  • the novel multi-sensor fusion pedestrian navigation system shown in Figure 1 changes how the IMU is used in traditional pedestrian navigation, moving it from the system model 31 of the fusion filter into the observation model 32, and thereby overcomes the disadvantage of traditional multi-sensor pedestrian navigation devices that, without other auxiliary systems, navigation errors accumulate rapidly.
  • the above-mentioned handheld smart device platform 1 includes an IMU 11, a magnetometer 12, a pressure gauge 13, a WiFi 14, a BLE 15, and a GNSS 16, which are common in existing smart devices. The IMU 11 provides raw acceleration and angular velocity data; the magnetometer 12 provides raw geomagnetic data; the GNSS 16 provides raw GNSS position and velocity data. Any other sensor of the handheld smart device 1 that can provide observation information can be included in the proposed multi-sensor fusion algorithm.
  • the above-described observation processing unit 2 includes an IMU processing unit 21, a magnetometer processing unit 22, a pressure gauge processing unit 23, a WiFi processing unit 24, a BLE processing unit 25, a GNSS processing unit 26, and the like.
  • the IMU processing unit 21 processes the raw acceleration and angular velocity data provided by the IMU 11 to obtain IMU position information and transmits it to the fusion filter 3;
  • the magnetometer processing unit 22 processes the raw geomagnetic data provided by the magnetometer 12 to obtain geomagnetic position information and transmits it to the fusion filter 3;
  • the pressure gauge processing unit 23 processes the raw atmospheric-pressure data provided by the pressure gauge 13 to obtain elevation information and transmits it to the fusion filter 3;
  • the WiFi processing unit 24 processes the raw RSS data provided by the WiFi 14 to obtain WiFi location information and transmits it to the fusion filter 3;
  • the BLE processing unit 25 processes the raw RSS data provided by the BLE 15 to obtain BLE location information and transmits it to the fusion filter 3;
  • the GNSS processing unit 26 processes the raw data from the GNSS 16 to obtain GNSS position information and transmits it to the fusion filter 3.
  • the observation processing unit 2 also includes other processing units to process other sensors of the handheld smart device platform 1 to obtain position or velocity information and transmit to the fusion filter 3.
  • the above-described fusion filter 3 includes a system model 31 and an observation model 32.
  • the system model 31 predicts the position and velocity information of the object to be measured using the kinematic model and transmits it to the observation model 32;
  • the observation model 32 combines the position and velocity predicted by the system model 31 with the position and velocity information from the IMU, magnetometer 12, pressure gauge 13, WiFi 14, BLE 15, and GNSS 16 provided by the observation processing unit to update the final position and velocity of the target.
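The predict/update cycle described above can be sketched as follows. This is a minimal illustration, assuming a one-dimensional Kalman filter with a constant-velocity kinematic system model and sensor-derived position fixes as observations; all noise values are illustrative, not taken from the patent.

```python
# Hedged sketch: kinematic (constant-velocity) system model + position
# observations, as in the fusion filter above. State: position x, velocity v,
# covariance P = (Pxx, Pxv, Pvv). q and r are illustrative noise values.

def predict(x, v, P, dt, q=0.1):
    """Kinematic prediction: advance position by v * dt, inflate covariance."""
    x = x + v * dt
    Pxx, Pxv, Pvv = P
    Pxx = Pxx + 2 * dt * Pxv + dt * dt * Pvv + q   # simplified process noise
    Pxv = Pxv + dt * Pvv
    return x, v, (Pxx, Pxv, Pvv)

def update(x, v, P, z, r):
    """Fuse a position observation z (from PDR, WiFi, BLE, GNSS, ...)."""
    Pxx, Pxv, Pvv = P
    s = Pxx + r                      # innovation variance
    kx, kv = Pxx / s, Pxv / s        # Kalman gains for position and velocity
    innov = z - x
    x, v = x + kx * innov, v + kv * innov
    P = ((1 - kx) * Pxx, (1 - kx) * Pxv, Pvv - kv * Pxv)
    return x, v, P

# One cycle: walker starts at 0 m; a position fix of 1.0 m arrives after 1 s.
x, v, P = predict(0.0, 0.0, (1.0, 0.0, 1.0), dt=1.0)
x, v, P = update(x, v, P, z=1.0, r=0.5)
```

Because the observation enters the filter as a position fix, any of the processing units above can feed the same `update` step, which is the point of moving the IMU into the observation model.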
  • the IMU processing unit 21 includes a user-motion-mode and device-usage-mode recognition module 211, a heading angle deviation estimation module 212, and an improved dead reckoning algorithm module 213. The recognition module 211 identifies user motion modes such as stationary, walking, and running, and device usage modes such as handheld, texting, phoning, navigation, pocket, and backpack, from the raw data provided by the IMU 11 of the handheld smart device platform and other optional hardware (such as the magnetometer 12).
  • the heading angle deviation estimation module 212 estimates the heading angle deviation from the motion and usage modes output by the recognition module 211 and the raw data provided by the IMU 11 and other optional hardware (e.g., magnetometer 12) of the handheld smart device platform.
  • the improved dead reckoning algorithm module 213 obtains the IMU position information from the heading angle deviation output by the heading angle deviation estimation module 212 and the raw data provided by the IMU 11 of the handheld smart device platform 1 and other optional hardware (e.g., magnetometer 12), and transmits it to the fusion filter 3.
  • the IMU processing unit in Figure 2 takes into account multiple pedestrian motion modes and multiple smart device usage modes, with IMU data processing methods designed for a variety of usage scenarios; it breaks through the limitation of traditional algorithms that the IMU must be fixed to the carrier and improves the usability of the pedestrian navigation system.
  • the user motion mode and device usage pattern recognition module 211 uses existing handheld smart device 1 related sensor outputs: IMU 11, magnetometer 12, ranging sensor (optional), light sensor (optional).
  • the IMU 11 and magnetometer 12 update at 50-200 Hz; the outputs of the latter two sensors are scalars updated when triggered by user behavior.
  • the user-motion-mode and device-usage-mode recognition algorithm extracts sensor statistics over a 1-3 second window to make a classification decision.
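The windowed-statistics step can be sketched as follows. This is a hedged illustration: the window length, sampling rate, and the particular statistics (mean and variance of the acceleration magnitude) are assumptions for the example, not the patent's exact feature set.

```python
import math

# Sketch: compute classification features over a short window of IMU data,
# as the recognition module above does before classifying. Window length
# and the chosen statistics are illustrative assumptions.

def window_features(acc_xyz, rate_hz=100, window_s=2.0):
    """Mean and variance of acceleration magnitude over the latest window."""
    n = int(rate_hz * window_s)
    window = acc_xyz[-n:]                                   # most recent window
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in window]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    return mean, var

# Stationary device: magnitude stays at gravity, variance is near zero,
# which is what lets the classifier separate "stationary" from "walking".
samples = [(0.0, 0.0, 9.81)] * 200
mean, var = window_features(samples)
```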
  • User motion patterns and device usage pattern recognition algorithms can be implemented in a variety of ways.
  • the present invention uses a Gaussian-kernel support vector machine in dual form as an example implementation.
  • the Gaussian-kernel support vector machine implicitly maps feature vectors into an infinite-dimensional linear space, achieving or exceeding the performance of traditional nonlinear classifiers (such as KNN).
  • the prototype of the 1-norm soft-margin support vector machine is as follows:
  • min over w, b, ξ of (1/2)‖w‖² + C Σᵢ ξᵢ, subject to yᵢ(wᵀφ(xᵢ) + b) ≥ 1 − ξᵢ and ξᵢ ≥ 0,
  • where w ∈ Rᵈ is the weight vector, C is a controllable regularization constant that balances the training error against the margin, and φ(·) is the feature-vector mapping function.
  • only the support vectors participate in online classification calculations. For example, to implement the nonlinear classifier on the randomly generated data shown in Figure 3, KNN needs to store and use all 1000 original training feature vectors, whereas the support vector machine needs only 142 support vectors. The requirements on processor power consumption and system memory are therefore greatly reduced, which suits the handheld smart device platform.
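The online classification step can be sketched as follows: once the support vectors are extracted offline, the Gaussian-kernel decision function needs only the stored support vectors, their coefficients, and the bias. The tiny hand-picked support set below is illustrative, not trained.

```python
import math

# Sketch of online Gaussian-kernel SVM classification. The support vectors,
# coefficients, bias, and gamma below are illustrative assumptions; in the
# system above they come from offline training.

def gaussian_kernel(a, b, gamma=0.5):
    """k(a, b) = exp(-gamma * ||a - b||^2)."""
    return math.exp(-gamma * sum((x - y) ** 2 for x, y in zip(a, b)))

def svm_decision(x, support_vectors, alphas, bias=0.0, gamma=0.5):
    """f(x) = sum_i alpha_i * k(s_i, x) + b; the sign gives the class."""
    return sum(a * gaussian_kernel(s, x, gamma)
               for s, a in zip(support_vectors, alphas)) + bias

# Two support vectors, one per class (alpha carries the label sign).
sv = [(0.0, 0.0), (2.0, 2.0)]
alpha = [1.0, -1.0]
near_positive = svm_decision((0.1, 0.0), sv, alpha)
near_negative = svm_decision((2.0, 1.9), sv, alpha)
```

The memory argument in the text follows directly: the decision function touches only the stored support vectors, not the full training set.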
  • the user motion mode and device usage pattern recognition module 211 classifies the pedestrian behavior patterns into five categories: 1. stationary; 2. walking; 3. running; 4. bicycle; 5. driving a car.
  • the identification of pedestrian behavior patterns can be used to apply zero-speed correction, to adjust the variance of the tracking filter process noise, and to adjust the correlation time of the dynamic system Markov process.
  • the user-motion-mode and device-usage-mode recognition module 211 classifies the device usage modes into four categories: 1. held flat in front; 2. held vertically at the ear; 3. backpack; 4. armband.
  • the identification of the device usage mode can be used to determine the heading direction (coordinate transformation) and to adjust the variance of the tracking filter process noise.
  • FIG. 4 is a schematic diagram of the support vector machine for user motion mode and device usage mode identification, using a two-stage classifier. It includes an acceleration statistics module 2111, an angular velocity statistics module 2112, a rotation angle and tilt angle statistics module 2113, a light and distance statistics module 2114, a speed feedback statistics module 2115, a feature normalization module 2116, a principal component analysis module 2117, a support vector machine module 2118, a user motion mode primary classifier 2119, and a device usage mode secondary classifier 2110.
  • the specific implementation steps include: collecting representative data sets offline; performing feature-vector normalization and principal component analysis; applying formulas (5) and (6) for training; extracting and storing the support vectors; computing the sensor output statistics online; performing feature-vector normalization and principal component extraction (with the same factors as in training); and applying the stored support vectors with formulas (5) and (7) for the two-stage classification that determines the user motion mode and device usage mode.
  • the heading angle deviation estimation module 212 in the present invention includes a number of different methods.
  • Principal Component Analysis (PCA): one characteristic of pedestrian movement is that acceleration and deceleration occur along the direction of travel. The accelerometer data can therefore be analyzed by PCA to obtain the pedestrian's travel direction.
  • GNSS 16: the pedestrian's travel direction can be calculated from the GNSS velocity.
  • magnetometer 12: the pedestrian's travel direction can also be calculated with the magnetometer 12.
  • the heading angle of the handheld smart device is derived from a nine-axis or six-axis fusion. The heading angle deviation is therefore obtained by subtracting the device heading angle from the converged pedestrian travel direction obtained by the various methods above, and is output to the heading angle deviation compensation module 2132.
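The PCA idea above can be sketched in two dimensions: the dominant principal axis of the horizontal (east-north) acceleration gives the travel direction, up to a 180-degree ambiguity. The synthetic data and the 2-D simplification are assumptions for illustration.

```python
import math

# Sketch: travel direction as the principal axis of de-meaned 2-D
# acceleration samples (east, north). Resolving the 180-degree ambiguity
# and the vertical axis are omitted for brevity.

def travel_direction(acc_en):
    """Angle (radians) of the dominant eigenvector of the 2x2 covariance."""
    n = len(acc_en)
    me = sum(e for e, _ in acc_en) / n
    mn = sum(v for _, v in acc_en) / n
    cee = sum((e - me) ** 2 for e, _ in acc_en) / n
    cnn = sum((v - mn) ** 2 for _, v in acc_en) / n
    cen = sum((e - me) * (v - mn) for e, v in acc_en) / n
    # orientation of the principal axis of [[cee, cen], [cen, cnn]]
    return 0.5 * math.atan2(2 * cen, cee - cnn)

# Accelerations oscillating along the 45-degree (north-east) diagonal.
samples = [(a, a) for a in (-1.0, -0.5, 0.5, 1.0, -1.0, 1.0)]
theta = travel_direction(samples)
```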
  • FIG. 5 is a schematic diagram of the improved dead reckoning algorithm module 213, including an attitude measuring system module 2131, a heading angle deviation compensation module 2132, a step detection module 2133, a step length estimation module 2134, and a dead reckoning algorithm module 2135.
  • the attitude measuring system module 2131 identifies the attitude information of the handheld smart device 1 from the raw data provided by the IMU 11 of the handheld smart device platform 1 and the optional magnetometer 12.
  • the heading angle deviation compensation module 2132 reads the heading angle deviation output by the heading angle deviation estimation module 212 and outputs the compensated pedestrian heading angle to the dead reckoning algorithm module 2135.
  • the step detection module 2133 detects the pedestrian's steps from the raw data of the IMU 11 of the handheld smart device platform and feeds the result to the step length estimation module 2134.
  • the step estimation module 2134 estimates the step size of the pedestrian based on the result of the step detection module and the original data of the IMU 11 of the handheld smart device platform 1 and feeds back to the dead reckoning module 2135.
  • the dead reckoning module 2135 calculates the IMU position observation based on the step information output by the step estimating module 2134 and the heading information output by the heading angle offset compensation module 2132, and outputs the IMU position observation to the fusion filter 3.
  • the attitude measurement system module 2131 identifies the attitude information of the handheld smart device 1 based on the raw data provided by the IMU 11 of the handheld smart device platform 1 and other optional magnetometers 12.
  • the attitude measurement system module 2131 selects a nine-axis or six-axis attitude determination algorithm according to whether geomagnetic information is available.
  • the attitude measuring system module 2131 outputs the heading angle of the smart device 1 to the heading angle deviation compensation module 2132.
  • the heading angle deviation compensation module 2132 reads the heading angle deviation output by the heading angle deviation estimation module 212, compensates the pedestrian heading angle, and outputs it to the dead reckoning algorithm module 2135.
  • the specific calculation formula is as follows:
  • ψ_p = ψ_d + ψ_offset . (1)
  • where ψ_p is the pedestrian heading angle, ψ_d is the heading angle of the device, and ψ_offset is the heading angle deviation.
  • the step detection module 2133 detects the pedestrian's steps from the raw data of the IMU 11 of the handheld smart device platform 1 and feeds the result to the step estimation module 2134.
  • step detection can be performed by means of peak detection, zero-crossing detection, correlation detection, or power spectrum detection.
  • the present invention contemplates a variety of user motion modes and device usage modes.
  • the step detection algorithm uses peak detection, applied simultaneously to the acceleration data and the gyroscope data of the IMU 11.
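A minimal sketch of peak-detection step counting on the total-acceleration magnitude (the threshold and minimum peak spacing are illustrative placeholders, not the patent's calibration; a full implementation would apply the same logic to the gyroscope channel as well):

```python
import math

def count_steps(acc, thresh=10.5, min_gap=3):
    """Count steps as local maxima of the total-acceleration magnitude
    that exceed `thresh` (m/s^2) and are at least `min_gap` samples apart.
    `acc` is a list of (ax, ay, az) samples."""
    mag = [math.sqrt(ax*ax + ay*ay + az*az) for ax, ay, az in acc]
    steps, last = 0, -min_gap
    for i in range(1, len(mag) - 1):
        if (mag[i] > thresh and mag[i] >= mag[i-1] and mag[i] > mag[i+1]
                and i - last >= min_gap):
            steps, last = steps + 1, i
    return steps
```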
  • the step estimation module 2134 estimates the step size of the pedestrian according to the result of the step detection module 2133 and the raw data of the IMU 11 of the handheld smart device platform 1, and outputs the step size to the dead reckoning module 2135.
  • the step size estimation can be calculated by different methods such as acceleration integration, the pendulum model, the linear model, and the empirical model.
  • the present invention contemplates a variety of user motion patterns and device usage patterns, and the step size estimation uses a linear model of the form:
  • s_{k-1,k} = A·(f_{k-1} + f_k)/2 + B·(σ²_{acc,k-1} + σ²_{acc,k})/2 + C . (2)
  • where A, B and C are constants, f_{k-1} and f_k are the step frequencies at times k-1 and k, and σ²_{acc,k-1} and σ²_{acc,k} are the variances of the accelerometer measurements at times k-1 and k.
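Assuming the linear model combines the mean step frequency and the mean accelerometer variance over the step interval, a step-length sketch might look as follows (the constants A, B, C and their default values here are placeholders; in practice they come from per-user calibration):

```python
def step_length(f_prev, f_k, var_prev, var_k, A=0.35, B=0.01, C=0.1):
    """Linear step-length model: combine the mean step frequency and the
    mean accelerometer variance over the interval [k-1, k].
    A, B, C are calibration constants (placeholder values)."""
    f_bar = 0.5 * (f_prev + f_k)        # mean step frequency (Hz)
    var_bar = 0.5 * (var_prev + var_k)  # mean accelerometer variance
    return A * f_bar + B * var_bar + C  # step length in metres
```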
  • the dead reckoning module 2135 derives the position [r_{e,k} r_{n,k}]^T at time k from the position [r_{e,k-1} r_{n,k-1}]^T at time k-1, the step information s_{k-1,k} output by the step estimation module 2134, and the heading angle ψ_{k-1} output by the heading angle deviation compensation module 2132.
  • the corresponding calculation formulas are as follows:
  • r_{e,k} = r_{e,k-1} + s_{k-1,k}·sin(ψ_{k-1})
  • r_{n,k} = r_{n,k-1} + s_{k-1,k}·cos(ψ_{k-1})
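The dead reckoning update can be sketched directly from the propagation formulas above, with the heading measured clockwise from north (east uses the sine, north the cosine):

```python
import math

def dr_update(r_e, r_n, step_len, heading_deg):
    """Propagate the 2-D position by one step of length `step_len`
    along heading `heading_deg` (degrees clockwise from north)."""
    psi = math.radians(heading_deg)
    return (r_e + step_len * math.sin(psi),
            r_n + step_len * math.cos(psi))
```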
  • the dead reckoning module 2135 outputs an IMU position measurement to the fusion filter 3.
  • the fusion filter 3 includes a system model 31 and an observation model 32.
  • the traditional multi-sensor fusion structure generally processes the IMU measurement data through an inertial mechanization algorithm and builds the fusion filter system model on it. Because inertial mechanization involves many integration operations, the positioning error of the conventional multi-sensor fusion structure accumulates rapidly when no external aiding system is available.
  • the present invention overcomes this shortcoming of the conventional multi-sensor fusion structure by using a pedestrian motion model as the system model, while the IMU-derived data, like that of the other systems, enters the observation model.
  • the fusion filter 3 can adopt a Kalman Filter (KF), an Adaptive Kalman Filter (AKF), an Unscented Kalman Filter (UKF), or a Particle Filter (PF). The present invention gives a design example for the KF; the other filters can follow the KF design.
  • the state vector used when the fusion filter 3 is implemented as a KF is defined as follows:
  • x = [r_e r_n r_u v_e v_n v_u]^T
  • where r_e, r_n and r_u are the three-dimensional position components (East-North-Up coordinate system), and v_e, v_n and v_u are the corresponding three-dimensional velocity components.
  • the KF system model 31 uses a classical kinematic model and is defined as follows:
  • x̂_{k+1} = Φ_{k,k+1} x_k + w_k
  • where x̂_{k+1} is the predicted state vector, x_k is the state vector at time k, and Φ_{k,k+1} is the 6×6 transition matrix
  • Φ_{k,k+1} = [ I_3  Δt·I_3 ; 0_3  I_3 ]
  • with Δt the time difference between the two epochs.
  • the process noise w_k is driven by Gaussian white noises n_e, n_n and n_u acting on the three velocity components; its covariance matrix is Q_k.
  • the measurement model 32 used when the fusion filter 3 is implemented as a KF is defined as follows:
  • z_k = H_k x_k + η_k
  • where z_k is the measurement vector, H_k is the design matrix, and η_k is the measurement noise modeled as Gaussian white noise with covariance matrix R_k; z_k and H_k vary depending on the measurement.
  • for a typical two-dimensional position observation, z_k = [r_{e,k} r_{n,k}]^T and H_k = [I_2 0_{2×4}].
  • the KF process has two phases: prediction and update.
  • in the prediction phase, the state vector and the covariance matrix are predicted based on the system model:
  • x̂_k = Φ_{k-1,k} x_{k-1}
  • P̂_k = Φ_{k-1,k} P_{k-1} Φ_{k-1,k}^T + Q_{k-1}
  • in the update phase, the state vector and covariance matrix are updated according to the measurement model:
  • K_k = P̂_k H_k^T (H_k P̂_k H_k^T + R_k)^{-1}
  • x_k = x̂_k + K_k (z_k − H_k x̂_k)
  • P_k = (I − K_k H_k) P̂_k
  • where K_k is called the Kalman gain.
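The prediction and update recursion above is the standard KF cycle; a minimal one-dimensional (position–velocity) sketch in pure Python with a position-only measurement illustrates it (all noise values are illustrative, and the 2-state case stands in for the patent's 6-state filter):

```python
def kf_step(x, P, z, dt=1.0, q=0.01, r=1.0):
    """One KF cycle for state x = [position, velocity] with a
    constant-velocity system model and a scalar position measurement z.
    P is the 2x2 covariance (list of lists); q and r are the process
    and measurement noise variances."""
    # Prediction: x' = Phi x,  P' = Phi P Phi^T + Q, with Phi = [[1, dt], [0, 1]]
    xp = [x[0] + dt * x[1], x[1]]
    Pp = [[P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + q * dt,
           P[0][1] + dt * P[1][1]],
          [P[1][0] + dt * P[1][1],
           P[1][1] + q]]
    # Update with H = [1, 0]: gain K = P' H^T / (H P' H^T + r)
    s = Pp[0][0] + r
    K = [Pp[0][0] / s, Pp[1][0] / s]
    innov = z - xp[0]                      # measurement innovation
    x_new = [xp[0] + K[0] * innov, xp[1] + K[1] * innov]
    P_new = [[(1 - K[0]) * Pp[0][0], (1 - K[0]) * Pp[0][1]],
             [Pp[1][0] - K[1] * Pp[0][0], Pp[1][1] - K[1] * Pp[0][1]]]
    return x_new, P_new
```

The update pulls the predicted position toward the measurement in proportion to the gain, and the posterior covariance shrinks relative to the prediction.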


Abstract

A pedestrian navigation apparatus and method based on a novel multi-sensor fusion technology. The apparatus comprises a handheld smart device (1), an observation processing unit (2) and a fusion filter (3). The method comprises the following steps: the handheld smart device (1) acquires, by means of the corresponding hardware, raw data from an IMU (11), a magnetometer (12), a barometer (13), WiFi (14), BLE (15) and a global navigation satellite system (GNSS) (16); the observation processing unit (2) processes the raw data provided by the handheld smart device (1) to supply position and velocity observations to the fusion filter (3); and the fusion filter (3) obtains the pedestrian navigation result by using a kinematic model as the system model, together with an observation model established from the processing results of the observation processing unit (2) and the processing of the fusion filter (3). Without the aid of additional systems, the method overcomes the problem of rapidly accumulating navigation errors. An IMU processing unit (21) takes multiple modes of the handheld smart device (1) into account and overcomes the limitation that conventional multi-sensor combinations with an IMU must be fixed to a carrier, thereby improving pedestrian navigation accuracy.
PCT/CN2016/087281 2016-06-16 2016-06-27 Dispositif de navigation pour piétons et procédé basé sur une nouvelle technologie de fusion de multiples capteurs WO2017215024A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610431107.0A CN106017454B (zh) 2016-06-16 2016-06-16 一种基于多传感器融合技术的行人导航装置和方法
CN201610431107.0 2016-06-16

Publications (1)

Publication Number Publication Date
WO2017215024A1 true WO2017215024A1 (fr) 2017-12-21

Family

ID=57089015

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/087281 WO2017215024A1 (fr) 2016-06-16 2016-06-27 Dispositif de navigation pour piétons et procédé basé sur une nouvelle technologie de fusion de multiples capteurs

Country Status (2)

Country Link
CN (1) CN106017454B (fr)
WO (1) WO2017215024A1 (fr)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106767784B (zh) * 2016-12-21 2019-11-08 上海网罗电子科技有限公司 一种蓝牙训练惯导的消防精确室内定位方法
CN107328406B (zh) * 2017-06-28 2020-10-16 中国矿业大学(北京) 一种基于多源传感器的矿井移动目标定位方法与系统
CN107990901B (zh) * 2017-11-28 2021-03-12 元力云网络有限公司 一种基于传感器的用户方向定位方法
CN107943042B (zh) * 2017-12-06 2021-04-27 东南大学 一种地磁指纹数据库自动化构建方法与装置
CN110118549B (zh) * 2018-02-06 2021-05-11 刘禹岐 一种多源信息融合定位方法和装置
CN108413968B (zh) * 2018-07-10 2018-10-09 上海奥孛睿斯科技有限公司 一种运动识别的方法和系统
CN111984853B (zh) * 2019-05-22 2024-03-22 北京车和家信息技术有限公司 试驾报告生成方法和云端服务器
CN110849392A (zh) * 2019-11-15 2020-02-28 上海有个机器人有限公司 一种机器人的里程计数据校正方法及机器人
CN110986941B (zh) * 2019-11-29 2021-09-24 武汉大学 一种手机安装角的估计方法
CN111174781B (zh) * 2019-12-31 2022-03-04 同济大学 一种基于可穿戴设备联合目标检测的惯导定位方法
CN111256709B (zh) * 2020-02-18 2021-11-02 北京九曜智能科技有限公司 基于编码器和陀螺仪的车辆航位推算定位方法及装置
CN112379395B (zh) * 2020-11-24 2023-09-05 中国人民解放军海军工程大学 一种定位导航授时系统

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103968827A (zh) * 2014-04-09 2014-08-06 北京信息科技大学 一种可穿戴式人体步态检测的自主定位方法
CN104931049A (zh) * 2015-06-05 2015-09-23 北京信息科技大学 一种基于运动分类的行人自主定位方法
WO2016042296A2 (fr) * 2014-09-15 2016-03-24 Isis Innovation Limited Détermination de la position d'un dispositif mobile dans une zone géographique
CN105433949A (zh) * 2014-09-23 2016-03-30 飞比特公司 混合角运动传感器
CN105588566A (zh) * 2016-01-08 2016-05-18 重庆邮电大学 一种基于蓝牙与mems融合的室内定位系统及方法

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102175463B (zh) * 2011-02-12 2012-08-22 东南大学 一种基于改进卡尔曼滤波的汽车路试制动性能检测方法
CN103759730B (zh) * 2014-01-16 2016-06-29 南京师范大学 一种基于导航信息双向融合的行人与智能移动载体的协同导航系统及其导航方法
CN104613963B (zh) * 2015-01-23 2017-10-10 南京师范大学 基于人体运动学模型的行人导航系统与导航定位方法

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
HUANG, CHENGKAI: "Application of Multiple Sensor Information Fusion Algorithms in Indoor Positioning", Electronic Technology & Information Science, China Master's Theses Full-text Database, 15 August 2015 (2015-08-15), ISSN: 1674-0246 *

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109682372A (zh) * 2018-12-17 2019-04-26 重庆邮电大学 一种结合建筑物结构信息与rfid标定的改进型pdr方法
CN109682372B (zh) * 2018-12-17 2022-10-18 重庆邮电大学 一种结合建筑物结构信息与rfid标定的改进型pdr方法
CN110132257A (zh) * 2019-05-15 2019-08-16 吉林大学 基于多传感器数据融合的人体行为预测方法
CN110427046B (zh) * 2019-07-26 2022-09-30 沈阳航空航天大学 一种三维平滑随机游走无人机群移动模型
CN110427046A (zh) * 2019-07-26 2019-11-08 沈阳航空航天大学 一种三维平滑随机游走无人机群移动模型
CN112747754A (zh) * 2019-10-30 2021-05-04 北京初速度科技有限公司 一种多传感器数据的融合方法、装置及系统
CN110764506A (zh) * 2019-11-05 2020-02-07 广东博智林机器人有限公司 移动机器人的航向角融合方法、装置和移动机器人
CN110764506B (zh) * 2019-11-05 2022-10-11 广东博智林机器人有限公司 移动机器人的航向角融合方法、装置和移动机器人
CN111174780A (zh) * 2019-12-31 2020-05-19 同济大学 盲人道路惯导定位系统
CN111811502A (zh) * 2020-07-10 2020-10-23 北京航空航天大学 一种运动载体多源信息融合导航方法及系统
CN112268557A (zh) * 2020-09-22 2021-01-26 宽凳(北京)科技有限公司 一种城市场景实时高精定位方法
CN112268557B (zh) * 2020-09-22 2024-03-05 宽凳(湖州)科技有限公司 一种城市场景实时高精定位方法
CN112556696A (zh) * 2020-12-03 2021-03-26 腾讯科技(深圳)有限公司 一种对象定位方法、装置、计算机设备以及存储介质
CN113008224A (zh) * 2021-03-04 2021-06-22 国电瑞源(西安)智能研究院有限公司 一种融合多传感器的室内外自适应导航系统及方法
CN113029153A (zh) * 2021-03-29 2021-06-25 浙江大学 基于智能手机多传感器融合和svm分类的多场景pdr定位方法
CN113029153B (zh) * 2021-03-29 2024-05-28 浙江大学 基于智能手机多传感器融合和svm分类的多场景pdr定位方法
CN113229804A (zh) * 2021-05-07 2021-08-10 陕西福音假肢有限责任公司 一种用于关节活动度的磁场数据融合电路及其方法
CN113790722A (zh) * 2021-08-20 2021-12-14 北京自动化控制设备研究所 一种基于惯性数据时频域特征提取的行人步长建模方法
CN113790722B (zh) * 2021-08-20 2023-09-12 北京自动化控制设备研究所 一种基于惯性数据时频域特征提取的行人步长建模方法
CN113655439A (zh) * 2021-08-31 2021-11-16 上海第二工业大学 一种改进粒子滤波的室内定位方法
WO2024082214A1 (fr) * 2022-10-20 2024-04-25 Telefonaktiebolaget Lm Ericsson (Publ) Positionnement de cible amélioré à l'aide de multiples équipements terminaux

Also Published As

Publication number Publication date
CN106017454A (zh) 2016-10-12
CN106017454B (zh) 2018-12-14

Similar Documents

Publication Publication Date Title
WO2017215024A1 (fr) Dispositif de navigation pour piétons et procédé basé sur une nouvelle technologie de fusion de multiples capteurs
EP2946167B1 (fr) Procédé et appareil de détermination de désalignement entre un dispositif et un piéton
Ban et al. Indoor positioning method integrating pedestrian Dead Reckoning with magnetic field and WiFi fingerprints
US11041725B2 (en) Systems and methods for estimating the motion of an object
US10652696B2 (en) Method and apparatus for categorizing device use case for on foot motion using motion sensor data
US10429196B2 (en) Method and apparatus for cart navigation
US20170176188A1 (en) Apparatus and methods for ultrasonic sensor navigation
EP3077992B1 (fr) Procédé et système pour déterminer la position d'un objet en fusionnant des caractéristiques de mouvement et des images de l'objet
Wang et al. Pedestrian dead reckoning based on walking pattern recognition and online magnetic fingerprint trajectory calibration
US8775128B2 (en) Selecting feature types to extract based on pre-classification of sensor measurements
US8473241B2 (en) Navigation trajectory matching
US8332180B2 (en) Determining user compass orientation from a portable device
US20160069690A1 (en) Method and apparatus for using map information aided enhanced portable navigation
US10302434B2 (en) Method and apparatus for determining walking direction for a pedestrian dead reckoning process
EP3814864A1 (fr) Systèmes et procédés de suivi de machine autonome et de localisation d'objets mobiles
US10837794B2 (en) Method and system for characterization of on foot motion with multiple sensor assemblies
US9818037B2 (en) Estimating heading misalignment between a device and a person using optical sensor
Filardo et al. C-IPS: A smartphone based indoor positioning system
KR20160004084A (ko) 실내 보행자 위치 추적 장치 및 방법
Saadatzadeh et al. Pedestrian dead reckoning using smartphones sensors: an efficient indoor positioning system in complex buildings of smart cities
Xie et al. Holding-manner-free heading change estimation for smartphone-based indoor positioning
Zeng et al. Method of Smartphone Navigation Heading Compensation Based on Gravimeter
Aljeroudi et al. MOBILITY DETERMINATION AND ESTIMATION BASED ON SMARTPHONES-REVIEW OF SENSING AND SYSTEMS
CN116724213A (zh) 基于脚步的定位

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16905138

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16905138

Country of ref document: EP

Kind code of ref document: A1