CN107421537B - Object motion attitude sensing method and system based on rigid body grid of inertial sensor - Google Patents

Object motion attitude sensing method and system based on rigid body grid of inertial sensor

Info

Publication number
CN107421537B
CN107421537B (application number CN201710827423.4A)
Authority
CN
China
Prior art keywords
inertial sensor
grid
rigid body
motion attitude
motion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710827423.4A
Other languages
Chinese (zh)
Other versions
CN107421537A (en)
Inventor
刘杰
林科
覃志松
张纪元
甘国宁
卢艳梅
姜虎成
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guilin University of Electronic Technology
Original Assignee
Guilin University of Electronic Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guilin University of Electronic Technology filed Critical Guilin University of Electronic Technology
Priority to CN201710827423.4A priority Critical patent/CN107421537B/en
Publication of CN107421537A publication Critical patent/CN107421537A/en
Application granted granted Critical
Publication of CN107421537B publication Critical patent/CN107421537B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00 Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G01C25/005 Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass initial alignment, calibration or starting-up of inertial devices

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Manufacturing & Machinery (AREA)
  • Automation & Control Theory (AREA)
  • Navigation (AREA)

Abstract

The invention discloses an object motion attitude sensing method and system based on a rigid body grid of inertial sensors. The method combines sensor rigid-body-grid positioning, adaptive weight distribution and multi-inertial-sensor information fusion: a peer-to-peer grid network formed by multiple inertial sensor nodes analyzes and fuses the attitude sensing data of those nodes, and the complete motion attitude of the object carrying the grid system is solved through rigid body grid position error correction and dynamic error correction of the grid node data. The invention improves the motion-parameter sensing accuracy of inertial sensors and enables accurate tracking of the real-time motion trajectory and motion attitude of a moving object. The key steps are rigid body grid position error correction and dynamic error correction of the grid node data.

Description

Object motion attitude sensing method and system based on rigid body grid of inertial sensor
Technical Field
The invention relates to the technical field of inertial sensors, and in particular to a method and system for sensing the motion attitude of an object based on a rigid body grid of inertial sensors.
Background
Inertial sensors are now widely used in mobile electronic devices. Computing an object's motion attitude and motion trajectory by fusing the data of an inertial sensor's accelerometer, gyroscope and magnetometer has become a major application of inertial sensors, with broad demand in indoor positioning, aircraft attitude sensing, object motion attitude detection and similar fields. However, limited by manufacturing process, cost, precision, interference, accumulated error and other factors, the motion attitude sensing accuracy of current inertial sensors still cannot meet the requirements of true attitude monitoring and trajectory tracking, which restricts the practicality and applicable scenarios of inertial-sensor attitude monitoring schemes.
Disclosure of Invention
The invention aims to solve the problem that the motion attitude sensing accuracy of existing inertial sensors cannot meet practical requirements, and provides an object motion attitude sensing method and system based on a rigid body grid of inertial sensors.
To solve this problem, the invention is realized by the following technical scheme:
the method for sensing the motion attitude of the object based on the rigid body grid of the inertial sensor comprises the following steps:
step 1, on an object to be measured, more than 2 inertial sensors with rigid spatial position relation are used as grid nodes, and an inertial sensor rigid body grid network with more than 2 nodes and a rigid spatial structure formed among the nodes is constructed;
step 2, correcting the position error of the rigid body grid:
step 2.1, the measured object is subjected to preset calibration movement;
step 2.2, by acquiring motion attitude data output by the inertial sensor when the measured object performs calibration motion and combining the known grid position relationship of the inertial sensor in the rigid grid network of the inertial sensor, performing grid position calibration and grid position relationship correction on the rigid grid network of the inertial sensor to obtain an accurate position relationship model of the rigid grid network of the inertial sensor;
step 3, when the measured object actually moves, each inertial sensor of the inertial sensor rigid body grid network on the measured object acquires and outputs corresponding motion attitude data in real time, and the motion attitude of the inertial sensor rigid body grid network preliminarily estimated by each inertial sensor is obtained by resolving the motion attitude data in real time;
step 4, correcting dynamic errors of the grid node data;
step 4.1, calculating information confidence weight of each inertial sensor node based on the accurate position relation model of the inertial sensor rigid body grid network obtained in the step 2 and the motion attitude of the inertial sensor rigid body grid network preliminarily estimated by each inertial sensor obtained in the step 3, and obtaining an information confidence model of the inertial sensor rigid body grid network under the motion attitude;
step 4.2, respectively calculating the motion attitude estimation of other inertial sensors according to the motion attitude data sensed by each inertial sensor of the rigid body grid network of the inertial sensor;
step 4.3, according to the information confidence model of the rigid body grid network of the inertial sensor obtained in the step 4.1, carrying out iterative computation among all grid nodes of the rigid body grid network of the inertial sensor, and obtaining accurate attitude data of all grid nodes of the rigid body grid network of the inertial sensor;
and 4.4, taking the accurate attitude data of the weight center particles of the rigid body grid network of the inertial sensor as the finally perceived motion attitude of the measured object.
In step 1, all the inertial sensor nodes form a peer-to-peer inertial sensor grid network.
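As a purely illustrative sketch of step 1 (the patent does not prescribe any data structure or programming language; the class names, field names and example geometry below are assumptions introduced here), a rigid body grid of inertial sensor nodes might be represented as follows:

```python
import numpy as np

class GridNode:
    """One inertial sensor node of the rigid body grid (step 1)."""
    def __init__(self, node_id, position, orientation=None):
        self.node_id = node_id
        # Nominal offset of the node from the grid reference point, in metres.
        self.position = np.asarray(position, dtype=float)
        # Rotation from the node's sensor frame to the grid frame.
        self.orientation = np.eye(3) if orientation is None else np.asarray(orientation, dtype=float)

class RigidBodyGrid:
    """Peer-to-peer grid network of inertial sensor nodes with a rigid spatial structure."""
    def __init__(self, nodes):
        if len(nodes) < 2:
            raise ValueError("the described grid uses multiple inertial sensor nodes")
        self.nodes = {n.node_id: n for n in nodes}

    def relative_position(self, i, j):
        """Known rigid offset of node j relative to node i in the grid frame."""
        return self.nodes[j].position - self.nodes[i].position

# Example: a square grid of four nodes, 0.1 m apart, mounted on the measured object.
grid = RigidBodyGrid([GridNode(k, p) for k, p in
                      enumerate([(0, 0, 0), (0.1, 0, 0), (0, 0.1, 0), (0.1, 0.1, 0)])])
```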
The object motion attitude sensing system based on a rigid body grid of inertial sensors for implementing the above method comprises a data processing unit and more than 2 inertial sensor nodes with rigid spatial position relations;
the inertial sensor nodes sense the motion attitude data and transmit it to the data processing unit through a communication network;
the data processing unit dynamically analyzes the motion attitude data of each inertial sensor node and, through rigid body grid position error correction and dynamic error correction of the grid node data, fuses and solves the accurate motion attitude data of each inertial sensor node.
In this scheme, all the inertial sensor nodes form a peer-to-peer inertial sensor rigid body grid network.
In this scheme, the inertial sensor nodes are connected to the data processing unit by wire and/or wirelessly.
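For illustration only, the motion attitude data that a node reports to the data processing unit over such a wired or wireless link could be packaged as sketched below; the field names and the JSON encoding are assumptions, since the patent only specifies that acceleration, angular velocity and magnetic field vectors are sensed and transmitted.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class ImuSample:
    node_id: int
    timestamp: float          # seconds
    acceleration: tuple       # (ax, ay, az) in m/s^2, sensor frame
    angular_velocity: tuple   # (wx, wy, wz) in rad/s, sensor frame
    magnetic_field: tuple     # (mx, my, mz) in uT, sensor frame

def encode_sample(sample: ImuSample) -> bytes:
    """Serialize one sample for a wired or wireless link to the data processing unit."""
    return json.dumps(asdict(sample)).encode("utf-8")

# Example: one sample from node 0, roughly at rest with gravity along the z axis.
pkt = encode_sample(ImuSample(0, time.time(), (0.0, 0.0, 9.81), (0.0, 0.0, 0.0), (25.0, 0.0, 40.0)))
```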
Compared with the prior art, the invention combines sensor rigid-body-grid positioning, adaptive weight distribution and multi-inertial-sensor information fusion, and has the following characteristics:
1. a peer-to-peer sensor grid network formed by multiple inertial sensor nodes analyzes and fuses the attitude sensing data of those nodes, so that the complete motion attitude of the object carrying the grid system can be solved;
2. rigid spatial position relations exist among the inertial sensor nodes and also enter the information confidence weights of the sensors, so the motion attitude of any other node can be estimated from the motion attitude data of a single sensor, providing a reference basis for error correction of the other nodes (see the illustrative sketch after this list);
3. a dynamic error-reference weight model is adopted: the sensor grid weight model has no fixed center, the inertial sensor nodes are peers, the error-reference weight of each node is determined by the motion attitude estimate of the rigid body grid system and the spatial position relations between nodes, and iterative correction is performed continuously until high-precision motion attitude data are obtained;
4. the rigid body grid of inertial sensors is position-calibrated by means of specific calibration motions of the object, and the grid position relations are error-corrected, providing accurate grid position relations for the subsequent iterative error-model computation.
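The kinematic fact behind characteristic 2 is that all points of a rigid body share the same angular velocity, while their linear accelerations differ in a way fixed by their relative positions: a_j = a_i + α × r_ij + ω × (ω × r_ij). A minimal sketch of this transfer follows; the function name and the use of NumPy are illustrative, not part of the patent.

```python
import numpy as np

def predict_acceleration_at(a_i, omega, alpha, r_ij):
    """Rigid-body transfer: acceleration expected at node j from node i's measurement.

    a_i   : acceleration at node i, grid frame (m/s^2)
    omega : angular velocity of the rigid body (rad/s)
    alpha : angular acceleration of the rigid body (rad/s^2)
    r_ij  : position of node j relative to node i in the grid frame (m)
    """
    a_i, omega, alpha, r_ij = map(np.asarray, (a_i, omega, alpha, r_ij))
    return a_i + np.cross(alpha, r_ij) + np.cross(omega, np.cross(omega, r_ij))

# A body spinning at 2 rad/s about z: a node 0.1 m out along x sees an extra
# centripetal acceleration of -0.4 m/s^2 along x on top of node i's acceleration.
print(predict_acceleration_at([0, 0, 9.81], [0, 0, 2.0], [0, 0, 0], [0.1, 0, 0]))
```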
Drawings
Fig. 1 is a flow chart of an object motion attitude sensing method based on an inertial sensor rigid body grid.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail below with reference to the accompanying drawings in conjunction with specific examples.
The object motion attitude sensing system based on a rigid body grid of inertial sensors consists of a data processing unit and more than 2 inertial sensor nodes. The inertial sensor nodes form a peer-to-peer rigid body sensor grid network: rigid spatial position relations exist among the nodes, and the grid nodes are in a centerless, peer-to-peer relation, so the motion attitude of any other node can be estimated from the motion attitude data of a single inertial sensor, providing a reference basis for error correction of the other nodes. The inertial sensor nodes sense motion attitude data (acceleration, angular velocity, magnetic field vector, etc.) and transmit it to the data processing unit through a communication network. The data processing unit dynamically analyzes the motion attitude data of each inertial sensor node and, through rigid body grid position error correction and dynamic error correction of the grid node data, fuses and solves the accurate motion attitude data of each node.
As shown in Fig. 1, the method for sensing the motion attitude of an object based on a rigid body grid of inertial sensors, implemented by the above system, comprises the following steps:
Step 1: on the measured object, use more than 2 inertial sensors with rigid spatial position relations as grid nodes to construct an inertial sensor rigid body grid network having more than 2 nodes and a rigid spatial structure formed among the nodes.
Step 2: rigid body grid position error correction.
Step 2.1: the measured object performs a preset calibration motion.
Step 2.2: acquire the motion attitude data output by the inertial sensors while the measured object performs the calibration motion and, combined with the known grid position relations of the inertial sensors in the rigid body grid network, calibrate the grid positions and correct the grid position relations to obtain an accurate position relation model of the inertial sensor rigid body grid network.
Step 3: when the measured object actually moves, each inertial sensor of the rigid body grid network on the measured object acquires and outputs its motion attitude data in real time, and the motion attitude of the grid network preliminarily estimated by each inertial sensor is obtained by solving these data in real time.
Step 4: dynamic error correction of the grid node data.
Step 4.1: based on the accurate position relation model of the inertial sensor rigid body grid network obtained in step 2 and the motion attitude of the grid network preliminarily estimated by each inertial sensor in step 3, compute the information confidence weight of each inertial sensor node to obtain the information confidence model of the grid network under the current motion attitude.
Step 4.2: from the motion attitude data sensed by each inertial sensor of the rigid body grid network, compute the motion attitude estimates of the other inertial sensors.
Step 4.3: according to the information confidence model obtained in step 4.1, perform iterative computation among all grid nodes of the rigid body grid network to obtain accurate attitude data for every grid node.
Step 4.4: take the accurate attitude data of the weighted center particle of the rigid body grid network as the finally sensed motion attitude of the measured object.
The invention uses multiple inertial sensors with rigid spatial position relations as grid nodes on a rigid object to construct a multi-node sensor grid system with a rigid spatial structure formed among the nodes, and on this basis performs accurate attitude computation for the rigid object. The data of the multiple inertial sensors are solved in real time to form the initial motion attitude estimate of the rigid grid system; the information confidence weight of each node is distributed dynamically according to the current initial motion attitude estimate, and a reference weight model of the attitude error is computed. Combining the known rigid body grid spatial positions, iterative error correction among the multiple sensors is performed through the error-reference weight models of the individual inertial sensors. In this way the motion attitude sensing data of every grid node are dynamically and accurately corrected, and the accurate attitude data of each grid node and of the grid system center are solved. The invention improves the motion-parameter sensing accuracy of inertial sensors and enables accurate tracking of the real-time motion trajectory and motion attitude of a moving object. The key steps, rigid body grid position error correction and dynamic error correction of the grid node data, are described below.
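As an illustrative sketch of how one node's raw data could be solved in real time into a preliminary attitude estimate (the patent does not prescribe an attitude algorithm; the complementary filter, gain value and sensor-axis convention below are assumptions introduced here):

```python
import numpy as np

def accel_roll_pitch(acc):
    """Roll and pitch from the gravity direction measured by the accelerometer.

    Assumes a sensor whose z axis reads about +9.81 m/s^2 when the node lies flat;
    yaw would be obtained from the magnetometer in a similar way and is omitted here.
    """
    ax, ay, az = acc
    roll = np.arctan2(ay, az)
    pitch = np.arctan2(-ax, np.hypot(ay, az))
    return np.array([roll, pitch])

def complementary_update(rp, gyro_xy, acc, dt, k=0.02):
    """One preliminary attitude update per node: blend integrated gyro rates with
    the accelerometer solution (a simple complementary filter with gain k)."""
    predicted = rp + np.asarray(gyro_xy, dtype=float) * dt   # small-angle integration
    measured = accel_roll_pitch(acc)
    return (1.0 - k) * predicted + k * measured

# Example: a node rotating slowly about x, sampled for 1 s at 100 Hz.
rp = np.zeros(2)
for _ in range(100):
    rp = complementary_update(rp, [0.05, 0.0], [0.0, 0.49, 9.80], dt=0.01)
```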
Rigid body grid position error correction: an inertial sensor grid system with rigid position relations is mounted on the measured object, and the object performs specific calibration motions, such as remaining horizontally static or undergoing free fall. By acquiring the motion attitude output of the inertial sensor grid nodes during these calibration motions and combining the known nominal grid position relations, the grid positions are calibrated and the grid position relations are corrected, yielding the accurate, error-corrected position relations of the inertial sensor rigid body grid and providing an accurate rigid-body position relation model for the subsequent attitude-data correction.
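One simple instance of this calibration idea is sketched below, assuming a horizontal static segment in which every node should sense gravity as (0, 0, g) in the grid frame. The function names, the use of Rodrigues' rotation formula and the treatment of free fall as a bias check are assumptions, since the patent describes the calibration only at the level of the motions used.

```python
import numpy as np

def rotation_aligning(u, v):
    """Rotation matrix taking direction u onto direction v (Rodrigues' formula)."""
    u = np.asarray(u, float)
    u = u / np.linalg.norm(u)
    v = np.asarray(v, float)
    v = v / np.linalg.norm(v)
    k = np.cross(u, v)
    s, c = np.linalg.norm(k), float(np.dot(u, v))
    if s < 1e-12:
        if c > 0:
            return np.eye(3)                      # already aligned
        # Opposite directions: rotate by pi about any axis perpendicular to u.
        e = np.array([1.0, 0.0, 0.0]) if abs(u[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
        p = np.cross(u, e)
        p = p / np.linalg.norm(p)
        return 2.0 * np.outer(p, p) - np.eye(3)
    K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + K + K @ K * ((1.0 - c) / (s * s))

def calibrate_orientations(static_samples, g=9.81):
    """Per-node orientation correction from a horizontal static calibration segment.

    static_samples: dict node_id -> (N, 3) array of accelerometer readings in each
    node's own sensor frame. While the grid lies horizontally at rest, every node
    should sense gravity as (0, 0, g) in the grid frame, so the rotation aligning
    its mean reading with (0, 0, g) corrects that node's mounting orientation.
    A free-fall segment, where every node should sense roughly zero specific force,
    can be used analogously to estimate accelerometer bias.
    """
    return {nid: rotation_aligning(np.mean(np.asarray(a, float), axis=0), [0.0, 0.0, g])
            for nid, a in static_samples.items()}
```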
Dynamic error correction of the grid node data: when the measured object moves, each sensor node of the rigid body grid acquires and outputs its motion attitude data (acceleration, angular velocity, magnetic field vector, etc.) in real time and transmits it, wired or wirelessly, to the data processing unit, which forms an overall preliminary estimate of the motion attitude of the rigid grid, i.e. overall motion data such as the direction of motion, velocity and rotation angle of the grid center point. Combining the accurate position relation model of the grid, the information confidence weight of each grid node is computed, giving the information confidence model of the grid nodes under the current motion attitude. On this basis, the motion attitude estimates of the other nodes are computed from the motion attitude sensing data of each inertial sensor grid node, and iterative error computation among all nodes is performed according to the information confidence model, yielding an error correction model for every grid node. According to this error correction model, error compensation is applied to the motion attitude data of each grid node to obtain its accurate motion attitude data, and the accurate attitude data of the mass point at the weighted center of the grid system is computed.
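A minimal sketch of one plausible realization of this dynamic correction follows, assuming the confidence weight of a node decreases with its disagreement from the consensus attitude and that the iteration relaxes each node toward the confidence-weighted grid attitude; the specific weighting rule, iteration count and relaxation gain are assumptions, not part of the patent.

```python
import numpy as np

def confidence_weights(node_attitudes, eps=1e-3):
    """Weight each node by its consistency with the current fused estimate.

    node_attitudes: dict node_id -> attitude vector (e.g. roll/pitch, rad).
    Nodes far from the consensus receive small weights.
    """
    ids = list(node_attitudes)
    att = np.array([node_attitudes[i] for i in ids], float)
    consensus = att.mean(axis=0)
    resid = np.linalg.norm(att - consensus, axis=1)
    w = 1.0 / (eps + resid)
    w /= w.sum()
    return dict(zip(ids, w)), consensus

def fuse_attitude(node_attitudes, n_iter=5, relax=0.5):
    """Iteratively pull each node toward the confidence-weighted grid attitude and
    return the weighted-center attitude plus the corrected per-node attitudes."""
    att = {i: np.asarray(a, float) for i, a in node_attitudes.items()}
    for _ in range(n_iter):
        w, _ = confidence_weights(att)
        center = sum(w[i] * att[i] for i in att)
        att = {i: att[i] + relax * w[i] * (center - att[i]) for i in att}
    w, _ = confidence_weights(att)
    return sum(w[i] * att[i] for i in att), att

# Example: three nodes whose preliminary roll/pitch estimates disagree slightly.
center, corrected = fuse_attitude({0: [0.050, 0.10], 1: [0.052, 0.11], 2: [0.070, 0.09]})
```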
It should be noted that although the above embodiments of the present invention are illustrative, the present invention is not limited to them. Other embodiments that those skilled in the art can derive from the teaching of the present invention without departing from its principles are likewise within the scope of the present invention.

Claims (5)

1. An object motion attitude sensing method based on a rigid body grid of inertial sensors, characterized by comprising the following steps:
Step 1: on the measured object, use more than 2 inertial sensors with rigid spatial position relations as grid nodes to construct an inertial sensor rigid body grid network having more than 2 nodes and a rigid spatial structure formed among the nodes;
Step 2: rigid body grid position error correction:
Step 2.1: the measured object performs a preset calibration motion;
Step 2.2: acquire the motion attitude data output by the inertial sensors while the measured object performs the calibration motion and, combined with the known grid position relations of the inertial sensors in the rigid body grid network, calibrate the grid positions and correct the grid position relations to obtain an accurate position relation model of the inertial sensor rigid body grid network;
Step 3: when the measured object actually moves, each inertial sensor of the rigid body grid network on the measured object acquires and outputs its motion attitude data in real time, and the motion attitude of the grid network preliminarily estimated by each inertial sensor is obtained by solving these data in real time;
Step 4: dynamic error correction of the grid node data:
Step 4.1: based on the accurate position relation model of the inertial sensor rigid body grid network obtained in step 2 and the motion attitude of the grid network preliminarily estimated by each inertial sensor in step 3, compute the information confidence weight of each inertial sensor node to obtain the information confidence model of the grid network under the current motion attitude;
Step 4.2: from the motion attitude data sensed by each inertial sensor of the rigid body grid network, compute the motion attitude estimates of the other inertial sensors;
Step 4.3: according to the information confidence model obtained in step 4.1, perform iterative computation among all grid nodes of the rigid body grid network to obtain accurate attitude data for every grid node;
Step 4.4: take the accurate attitude data of the weighted center particle of the rigid body grid network as the finally sensed motion attitude of the measured object.
2. The object motion attitude sensing method based on a rigid body grid of inertial sensors as claimed in claim 1, wherein in step 1 all the inertial sensor nodes form a peer-to-peer inertial sensor grid network.
3. An inertial sensor rigid body grid-based object motion attitude sensing system for implementing the method of claim 1, wherein the system is composed of a data processing unit and more than 2 inertial sensor nodes with rigid spatial position relationships;
the inertial sensor node is responsible for sensing the motion attitude data and transmitting the motion attitude data to the data processing unit through a communication network;
and the data processing unit is responsible for dynamically analyzing the motion attitude data of each inertial sensor node, and fusing and solving the accurate motion attitude data of each inertial sensor node through rigid body grid position error correction and grid node data dynamic error correction.
4. The inertial sensor rigid body mesh-based object motion pose sensing system of claim 3, wherein all inertial sensor nodes form a peer-to-peer inertial sensor rigid body mesh network.
5. The object motion attitude sensing system based on a rigid body grid of inertial sensors as claimed in claim 3, wherein the inertial sensor nodes are connected to the data processing unit by wire and/or wirelessly.
CN201710827423.4A 2017-09-14 2017-09-14 Object motion attitude sensing method and system based on rigid body grid of inertial sensor Active CN107421537B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710827423.4A CN107421537B (en) 2017-09-14 2017-09-14 Object motion attitude sensing method and system based on rigid body grid of inertial sensor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710827423.4A CN107421537B (en) 2017-09-14 2017-09-14 Object motion attitude sensing method and system based on rigid body grid of inertial sensor

Publications (2)

Publication Number Publication Date
CN107421537A CN107421537A (en) 2017-12-01
CN107421537B (en) 2020-07-17

Family

ID=60433359

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710827423.4A Active CN107421537B (en) 2017-09-14 2017-09-14 Object motion attitude sensing method and system based on rigid body grid of inertial sensor

Country Status (1)

Country Link
CN (1) CN107421537B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108627153B (en) * 2018-05-11 2022-06-07 南京大学 Rigid body motion tracking system based on inertial sensor and working method thereof
CN109001787B (en) * 2018-05-25 2022-10-21 北京大学深圳研究生院 Attitude angle resolving and positioning method and fusion sensor thereof
CN112050829B (en) * 2019-06-06 2023-04-07 华为技术有限公司 Motion state determination method and device
CN112147898B (en) * 2020-09-29 2022-05-31 陕西师范大学 Rigid system anti-interference control method and system only depending on control direction information
CN112729317B (en) * 2020-12-17 2023-09-19 大陆投资(中国)有限公司 Method for locating a vehicle and in-vehicle system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1818555A (en) * 2006-03-29 2006-08-16 北京航空航天大学 Microinertia measuring unit precisive calibration for installation fault angle and rating factor decoupling
CN102706347A (en) * 2012-05-17 2012-10-03 南京航空航天大学 Inertial sensor network node device and information fusion method thereof
CN103616710A (en) * 2013-12-17 2014-03-05 靳文瑞 Multi-sensor combined navigation time synchronizing system based on field programmable gate array (FPGA)
US8795078B1 (en) * 2006-07-14 2014-08-05 Ailive Inc. Method and system providing compatibility between two different controllers
CN106052719A (en) * 2016-08-01 2016-10-26 中科创达软件股份有限公司 Method and device for calibrating gyroscope
CN106979781A (en) * 2017-04-12 2017-07-25 南京航空航天大学 High-precision Transfer Alignment based on distributed inertance network

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150092048A1 (en) * 2013-09-27 2015-04-02 Qualcomm Incorporated Off-Target Tracking Using Feature Aiding in the Context of Inertial Navigation

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1818555A (en) * 2006-03-29 2006-08-16 北京航空航天大学 Microinertia measuring unit precisive calibration for installation fault angle and rating factor decoupling
US8795078B1 (en) * 2006-07-14 2014-08-05 Ailive Inc. Method and system providing compatibility between two different controllers
CN102706347A (en) * 2012-05-17 2012-10-03 南京航空航天大学 Inertial sensor network node device and information fusion method thereof
CN103616710A (en) * 2013-12-17 2014-03-05 靳文瑞 Multi-sensor combined navigation time synchronizing system based on field programmable gate array (FPGA)
CN106052719A (en) * 2016-08-01 2016-10-26 中科创达软件股份有限公司 Method and device for calibrating gyroscope
CN106979781A (en) * 2017-04-12 2017-07-25 南京航空航天大学 High-precision Transfer Alignment based on distributed inertance network

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
A new MEMS gyroscope temperature error modeling and compensation method; 孙田川, 刘洁瑜; 《压电与声光》 (Piezoelectrics & Acoustooptics); 2017-02-28; Vol. 39, No. 1; full text *
Design and implementation of an indoor positioning system based on multi-information fusion of WiFi and inertial sensors; 姚志锋; 《中国优秀硕士学位论文全文数据库 信息科技辑》 (China Masters' Theses Full-text Database, Information Science and Technology); 2017-02-15; full text *

Also Published As

Publication number Publication date
CN107421537A (en) 2017-12-01

Similar Documents

Publication Publication Date Title
CN107421537B (en) Object motion attitude sensing method and system based on rigid body grid of inertial sensor
CN105589064B (en) WLAN location fingerprint database is quickly established and dynamic update system and method
CN105588566B (en) A kind of indoor locating system merged based on bluetooth with MEMS and method
CN104884902B (en) For three axle magnetometer and the method and apparatus of the data fusion of three axis accelerometer
CN108981693B (en) VIO rapid joint initialization method based on monocular camera
CN109612471B (en) Moving body attitude calculation method based on multi-sensor fusion
CN113324544B (en) Indoor mobile robot co-location method based on UWB/IMU (ultra wide band/inertial measurement unit) of graph optimization
CN107941211B (en) Multi-element fusion positioning method and device based on second-order cascade and electronic equipment
CN107621266B (en) Space non-cooperative target relative navigation method based on feature point tracking
CN107270898B (en) Double particle filter navigation devices and method based on MEMS sensor and VLC positioning fusion
US20170205490A1 (en) Coverage optimization for sensor networks
CN111091587A (en) Low-cost motion capture method based on visual markers
CN109631894A (en) A kind of monocular vision inertia close coupling method based on sliding window
CN110561424A (en) online robot kinematic calibration method based on multi-sensor hybrid filter
Gong et al. Robust inertial motion tracking through deep sensor fusion across smart earbuds and smartphone
CN111983660A (en) System and method for positioning quad-rotor unmanned aerial vehicle in GNSS rejection environment
CN114111776B (en) Positioning method and related device
CN114046800B (en) High-precision mileage estimation method based on double-layer filtering frame
CN111932637B (en) Vehicle body camera external parameter self-adaptive calibration method and device
CN107246872A (en) Single-particle filtering guider and method based on MEMS sensor and VLC positioning fusions
Liu et al. An autonomous positioning method for fire robots with multi-source sensors
CN111912295A (en) Trajectory drop point prediction system
JP2007538231A (en) Interferometric sensing system
CN107888289B (en) Indoor positioning method and platform based on fusion of visible light communication and inertial sensor
CN112284381B (en) Visual inertia real-time initialization alignment method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant