CN104501814A - Attitude and position estimation method based on vision and inertia information - Google Patents

Attitude and position estimation method based on vision and inertia information

Info

Publication number
CN104501814A
CN104501814A (Application CN201410765687.8A)
Authority
CN
China
Prior art keywords
current
state vector
equipment
ekf
attitude
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410765687.8A
Other languages
Chinese (zh)
Other versions
CN104501814B (en)
Inventor
林城
王梁昊
李东晓
张明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU filed Critical Zhejiang University ZJU
Priority to CN201410765687.8A priority Critical patent/CN104501814B/en
Publication of CN104501814A publication Critical patent/CN104501814A/en
Application granted granted Critical
Publication of CN104501814B publication Critical patent/CN104501814B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/10: Navigation by using measurements of speed or acceleration
    • G01C21/12: Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16: Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165: Inertial navigation combined with non-inertial navigation instruments
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/20: Instruments for performing navigational calculations

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)
  • Gyroscopes (AREA)

Abstract

The invention discloses an attitude and position estimation method based on vision and inertia information. The method fuses image and inertial measurements, maintaining and updating the motion state with an extended Kalman filter so as to compute the device's current attitude and position accurately, and the device's motion trajectory over a period of time can be recovered. The method uses image and sensor information flexibly, achieving good robustness through their mutual complementation and avoiding loss of tracking. The extended Kalman filter adopted by the method computes iteratively; compared with batch computation, it does not need all observations before it can start, its computational cost is comparatively low, it adapts to the limited computing resources of terminal devices, and it can meet real-time requirements well.

Description

An attitude and position estimation method based on vision and inertia information
Technical field
The invention belongs to the field of device attitude and position tracking, and specifically relates to an attitude and position estimation method based on vision and inertia information.
Background technology
Two positioning approaches are common on current terminal devices. The first uses inertial elements: device position is obtained mainly from gyroscope, gravity-sensor and accelerometer data. The second processes visual information captured by a camera to obtain the device's attitude. The first approach is limited by sensor precision: errors accumulate and the position estimate drifts, all the more noticeably when the device moves slowly, and zero-velocity detection in particular remains a hard problem. The second approach uses the methods of multiple-view geometry to recover the position of the vision terminal relative to the scene. It does not suffer from error accumulation, but is constrained by the limits of visual information: the scale of the scene cannot be recovered, so only relative pose can be estimated, and the terminal must move slowly, since otherwise too few matched points between frames cause large computation errors or even loss of position.
Images and sensor data are the main sources of information from which a terminal device computes its position and attitude. As the computing power of smart devices has grown, methods fusing images and sensor data have appeared. A prominent family is based on the unscented Kalman filter (UKF, Unscented Kalman Filter) and is tightly coupled: the current position and attitude can only be computed when image data and sensor data are obtained at the same time. Since the camera and the sensors in general sample at different frequencies, this introduces, on the one hand, a certain time error from sampling the image and inertial information at the same instant; on the other hand, because the sensors sample much faster, part of their data must be discarded, wasting data and degrading the final precision. The multi-rate sampling problem is a major issue in image and sensor fusion algorithms. To synchronize image and sensor data, many current methods resort to a hardware solution: the sensor triggers the camera, so that images and sensor data are sampled at the same instant. These methods use a Kalman filter with the sensor data as input to predict the camera's motion; when image data arrive, the Kalman prediction is used for image matching and camera position estimation. Their main advantage is that the inertial sensor's motion prediction speeds up image feature matching, but they cannot improve the final accuracy much, and triggering image capture at each sensor sample also requires corresponding hardware changes.
Summary of the invention
To address the above technical problems of the prior art, the invention provides an attitude and position estimation method based on vision and inertia information that effectively fuses the data of vision and inertial sensors to obtain accurate position and attitude information for a terminal device.
An attitude and position estimation method based on vision and inertia information comprises the following steps:
First, jointly calibrate the camera and the sensors mounted on the device to obtain the camera's internal parameters and distortion parameters, and the positions of the camera and the sensors in the world coordinate system; the sensors comprise a gyroscope and an accelerometer.
Then, initialize the state vector of an EKF (extended Kalman filter); using the above internal parameters, distortion parameters and position information, update the EKF state vector with either the images captured by the camera or the data collected by the sensors, and extract the device's attitude and position from the state vector.
The detailed procedure for updating the state vector with an image is as follows:
1.1 Perform FAST feature extraction on the current frame to obtain the image's feature points;
1.2 Match the feature points of the current frame against those of the previous frame, use the matched points to compute the essential matrix between the two frames, and then obtain the current camera's rotation vector and motion vector by singular value decomposition (SVD) of the essential matrix;
1.3 Correct the motion vector using the velocity information in the current EKF state vector, then update the position information in the EKF state vector with the corrected motion vector, while updating the attitude information in the EKF state vector with the rotation vector.
In step 1.3 the motion vector is corrected according to the following formula:
s' = (0.5+α)λs + (0.5-α)(vt + 0.5at²)
Wherein: s and s' are the motion vector before and after correction, α is a weighting factor with α = 1/(v+2), λ is the scale factor between s and s', v and a are respectively the velocity and acceleration in the current EKF state vector, and t is the time interval from the last EKF state update to the current camera capture.
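As a concrete illustration, the correction of step 1.3 can be sketched in a few lines. This is a minimal sketch, not the patent's implementation: the function and variable names are assumptions, and the scalar speed is taken as the norm of the velocity when v is a vector.

```python
import numpy as np

def correct_displacement(s, v, a, t, scale):
    """Correction of step 1.3: blend the vision-derived displacement s
    (known only up to scale) with the displacement predicted from the
    inertial state. Illustrative names, not from the patent.
    s, v, a are 3-vectors; scale is the factor lambda kept in the filter."""
    s, v, a = (np.asarray(x, dtype=float) for x in (s, v, a))
    speed = np.linalg.norm(v)            # scalar speed used in the weight
    alpha = 1.0 / (speed + 2.0)          # weighting factor, in (0, 0.5]
    inertial = v * t + 0.5 * a * t**2    # dead-reckoned displacement
    return (0.5 + alpha) * scale * s + (0.5 - alpha) * inertial
```

At zero speed α = 0.5, so the vision term receives full weight; as speed grows, the inertial prediction's weight rises toward 0.5, matching the behaviour described in the embodiment.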
The detailed procedure for updating the state vector with sensor data is as follows:
2.1 Using the device's current attitude obtained from the gyroscope, project the current acceleration measured by the accelerometer into the world coordinate system;
2.2 Compute a weighted estimate of the device's current velocity by first-order interpolation from the device's current acceleration and the velocity information in the current EKF state vector;
2.3 Update the velocity information in the EKF state vector with the device's current acceleration and current velocity estimate, while updating the attitude information in the EKF state vector with the device's current attitude from the gyroscope.
In step 2.2 the device's current velocity estimate is computed according to the following formula:
v' = v + 0.5(a'+a)t'
Wherein: v' is the device's current velocity estimate, v and a are respectively the velocity and acceleration in the current EKF state vector, a' is the device's current acceleration, and t' is the time interval from the last EKF state update to the current sensor sample.
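The first-order update of step 2.2 is simple enough to state directly; the sketch below (with illustrative names) applies it componentwise to 3-vectors:

```python
def estimate_velocity(v, a_prev, a_curr, dt):
    """Trapezoidal velocity update of step 2.2: v' = v + 0.5*(a' + a)*t'.
    v is the velocity in the filter state, a_prev the acceleration stored
    there, a_curr the newly projected world-frame acceleration, and dt the
    time since the last state update."""
    return [vi + 0.5 * (ac + ap) * dt
            for vi, ap, ac in zip(v, a_prev, a_curr)]
```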
The invention uses the features extracted from images, together with the preprocessed sensor data, directly as input to the extended Kalman filter, whose state comprises the terminal device's current attitude and position. Unlike conventional methods, whenever either image features or sensor data arrive (since the two sample at different frequencies, they generally do not arrive at the same instant), the state can be updated immediately, without waiting for image features and sensor data to arrive together. This raises the state update rate and improves the accuracy of the result, and the repeated iterations avoid the global optimization step used in other methods.
The extended Kalman filter maintains a state vector representing the current state, comprising position (displacement relative to the initial position), velocity and acceleration; it also keeps map information, comprising the device's current rotation vector and a scene map built from the feature points of the image sequence. When a new image arrives, the device's rotation and translation relative to the map at the current instant are computed first, then the other state quantities are recomputed in turn and the state vector is updated. Likewise, when sensor data arrive, the corresponding parts of the Kalman filter's state vector are updated.
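The state the filter maintains, as just described, might be organized as follows. This is only a sketch: the field names are assumptions for illustration, not taken from the patent.

```python
from dataclasses import dataclass, field

import numpy as np

@dataclass
class FilterState:
    """Illustrative container for the filter state described above:
    position (displacement from the start), velocity, acceleration,
    the current rotation, the scale factor, and the scene map."""
    position: np.ndarray = field(default_factory=lambda: np.zeros(3))
    velocity: np.ndarray = field(default_factory=lambda: np.zeros(3))
    acceleration: np.ndarray = field(default_factory=lambda: np.zeros(3))
    rotation: np.ndarray = field(default_factory=lambda: np.eye(3))
    scale: float = 1.0                              # lambda, vision-to-metric scale
    map_points: list = field(default_factory=list)  # feature points seen so far
```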
The invention uses image and sensor information flexibly, achieving good robustness through their mutual complementation, so loss of tracking does not occur. The extended Kalman filter adopted by the invention computes iteratively; compared with batch methods (such as bundle adjustment), it does not need all observations before it can start to compute, and its computational cost is comparatively low, so it adapts to the limited computing resources of terminal devices and can meet real-time requirements well.
Brief description of the drawings
Fig. 1 is a flow diagram of the method of the invention.
Detailed description
To describe the invention more concretely, its technical scheme is described in detail below with reference to the drawings and a specific embodiment.
The method of the invention was implemented on a smartphone, using the phone's camera and sensor data to compute position and attitude. As shown in Fig. 1, the embodiment proceeds as follows:
A. Jointly calibrate the camera and the sensors on the device to obtain the camera's internal and distortion parameters as well as the positional relationship of the sensors and the camera in the world coordinate system, aligning the sensors and the camera in the world coordinate system.
B. Set the camera's initial position as the origin of the world coordinate system, and set the initial state vector of the extended Kalman filter (EKF) accordingly: the initial velocity and position are set to zero, and the initial position and orientation define the world coordinate system.
The state of the Kalman filter is updated on the arrival of either a new image frame or new sensor data. Processing of an arriving image frame is handled by step C; processing of arriving sensor data is handled by step D. The current position and attitude can be read from the Kalman filter's state at any time. The Kalman filter's state vector, the camera rotation and the feature points of each image are saved, building up a map of the whole scene.
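The event-driven update of steps C and D can be sketched as a dispatch on whichever measurement arrives first. This is a minimal sketch with illustrative dictionary keys; the real filter would also propagate the covariance and the scale factor.

```python
def process_measurement(state, kind, data, dt):
    """Dispatch an incoming measurement to the matching update: image
    frames refresh position/rotation (step C), inertial samples refresh
    velocity/acceleration (step D). Names are illustrative."""
    if kind == "image":
        # data: (rotation, displacement) recovered from the essential matrix
        rotation, displacement = data
        state["rotation"] = rotation
        state["position"] = [p + d for p, d in zip(state["position"], displacement)]
    elif kind == "imu":
        # data: world-frame acceleration; trapezoidal velocity update
        a_new = data
        state["velocity"] = [v + 0.5 * (an + ao) * dt
                             for v, an, ao in zip(state["velocity"], a_new,
                                                  state["acceleration"])]
        state["acceleration"] = list(a_new)
    return state
```

Because each branch touches only its own part of the state, neither measurement stream has to wait for the other, which is the point of the loosely coupled design described above.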
C. To keep the embodiment feasible on devices with limited computing power, FAST feature points are extracted from the input video sequence.
These feature points are matched against the FAST feature points of the previous frame, the essential matrix between the two matched sets is computed, and the camera's rotation and displacement are obtained by SVD. The displacement obtained from the visual component is only relative (up to scale), so the position information in the Kalman filter is used to correct this relative displacement, with the following correction formula:
s_i' = (0.5+α)λs_i + (0.5-α)(vt + 0.5at²)
Wherein: s_i' and s_i denote, respectively, the true displacement at the current instant and the relative displacement obtained by SVD; λ is the scale factor between relative and true displacement, saved and updated in the Kalman filter's state vector; v is the velocity in the Kalman filter; a is the acceleration in the Kalman filter; and t is the time interval from the last Kalman filter update to the current image capture. α is a weighting factor between 0 and 0.5, determined by α = 1/(v+2): as speed increases, the position estimated from the accelerometer becomes more and more accurate, so its weight in the position estimate grows accordingly.
The corrected displacement is then used as input to the Kalman filter to update its state vector.
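The SVD step in C, recovering the rotation and the up-to-scale translation from the essential matrix, can be sketched with NumPy as follows. The function name is an assumption; in practice the correct candidate among the four is selected by checking that triangulated points lie in front of both cameras.

```python
import numpy as np

def decompose_essential(E):
    """Decompose an essential matrix E into rotation/translation candidates
    via SVD. Returns four (R, t) pairs; t is known only up to scale."""
    U, _, Vt = np.linalg.svd(E)
    # Force proper rotations (determinant +1) before forming candidates.
    if np.linalg.det(U) < 0:
        U = -U
    if np.linalg.det(Vt) < 0:
        Vt = -Vt
    W = np.array([[0.0, -1.0, 0.0],
                  [1.0,  0.0, 0.0],
                  [0.0,  0.0, 1.0]])
    R1, R2 = U @ W @ Vt, U @ W.T @ Vt
    t = U[:, 2]          # third left singular vector; scale is unobservable
    return [(R1, t), (R1, -t), (R2, t), (R2, -t)]
```

The scale ambiguity of t is exactly why the correction formula above blends in the inertially predicted displacement and maintains the scale factor λ in the filter state.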
D. When sensor data arrive, the accelerometer data are projected into the world coordinate system using the gyroscope, and a new velocity estimate is computed by first-order interpolation combined with the velocity component of the Kalman filter's state vector; the first-order interpolation formula is:
v_i = v_{i-1} + 0.5(a_i + a_{i-1})t
The resulting velocity is used as input to the Kalman filter, and the new position is computed from the position information in the Kalman filter's state vector. The Kalman filter's state is represented as z_k = [a_k, v_k, s_k, λ_k], characterizing the device's acceleration, velocity and position together with the scale factor; the new state is obtained from the model z_{k+1} = F(z_k'), where F is the prediction step of the extended Kalman filter.
Here z_k' is obtained from z_k combined with the current input: if the new arrival is image data, the s_k component of z_k is updated and corrected; if it is sensor data, the a_k and v_k components of z_k are updated.
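The prediction model F for z_k = [a_k, v_k, s_k, λ_k] can be sketched as constant-acceleration kinematics. This is a sketch of the process model only, with illustrative names; a full EKF would also propagate the covariance using the Jacobian of F.

```python
import numpy as np

def predict(z, dt):
    """Constant-acceleration prediction for the state z = (a, v, s, lam):
    acceleration and scale are held constant, velocity and position
    integrate over the time step dt."""
    a = np.asarray(z[0], dtype=float)
    v = np.asarray(z[1], dtype=float)
    s = np.asarray(z[2], dtype=float)
    lam = z[3]
    v_next = v + a * dt
    s_next = s + v * dt + 0.5 * a * dt**2
    return (a, v_next, s_next, lam)
```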
This embodiment fuses the camera's image information, the accelerometer's acceleration data and the gyroscope's orientation data, and applies to smartphones and other devices equipped with a camera and sensors, yielding the phone's real-time position and attitude in the scene. On other devices equipped with a camera and the relevant sensors (such as autonomous aircraft or robots), the embodiment likewise yields the device's position and attitude, which can be used to guide the device's motion through the scene.
The above description of the embodiment is provided so that those skilled in the art can understand and apply the invention. Those skilled in the art can evidently make various modifications to the embodiment and apply the general principles described herein to other embodiments without creative effort. The invention is therefore not limited to the embodiment above; improvements and modifications made in accordance with this disclosure all fall within the scope of protection of the invention.

Claims (5)

1. An attitude and position estimation method based on vision and inertia information, comprising the following steps:
First, jointly calibrate the camera and the sensors mounted on the device to obtain the camera's internal parameters and distortion parameters, and the positions of the camera and the sensors in the world coordinate system; the sensors comprise a gyroscope and an accelerometer.
Then, initialize the state vector of an EKF; using the above internal parameters, distortion parameters and position information, update the EKF state vector with either the images captured by the camera or the data collected by the sensors, and extract the device's attitude and position from the state vector.
2. The attitude and position estimation method according to claim 1, characterized in that the detailed procedure for updating the state vector with an image is as follows:
1.1 Perform FAST feature extraction on the current frame to obtain the image's feature points;
1.2 Match the feature points of the current frame against those of the previous frame, use the matched points to compute the essential matrix between the two frames, and then obtain the current camera's rotation vector and motion vector by singular value decomposition of the essential matrix;
1.3 Correct the motion vector using the velocity information in the current EKF state vector, then update the position information in the EKF state vector with the corrected motion vector, while updating the attitude information in the EKF state vector with the rotation vector.
3. The attitude and position estimation method according to claim 2, characterized in that in step 1.3 the motion vector is corrected according to the following formula:
s' = (0.5+α)λs + (0.5-α)(vt + 0.5at²)
Wherein: s and s' are the motion vector before and after correction, α is a weighting factor with α = 1/(v+2), λ is the scale factor between s and s', v and a are respectively the velocity and acceleration in the current EKF state vector, and t is the time interval from the last EKF state update to the current camera capture.
4. The attitude and position estimation method according to claim 1, characterized in that the detailed procedure for updating the state vector with sensor data is as follows:
2.1 Using the device's current attitude obtained from the gyroscope, project the current acceleration measured by the accelerometer into the world coordinate system;
2.2 Compute a weighted estimate of the device's current velocity by first-order interpolation from the device's current acceleration and the velocity information in the current EKF state vector;
2.3 Update the velocity information in the EKF state vector with the device's current acceleration and current velocity estimate, while updating the attitude information in the EKF state vector with the device's current attitude from the gyroscope.
5. The attitude and position estimation method according to claim 4, characterized in that in step 2.2 the device's current velocity estimate is computed according to the following formula:
v' = v + 0.5(a'+a)t'
Wherein: v' is the device's current velocity estimate, v and a are respectively the velocity and acceleration in the current EKF state vector, a' is the device's current acceleration, and t' is the time interval from the last EKF state update to the current sensor sample.
CN201410765687.8A 2014-12-12 2014-12-12 Attitude and position estimation method based on vision and inertia information Active CN104501814B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410765687.8A CN104501814B (en) 2014-12-12 2014-12-12 Attitude and position estimation method based on vision and inertia information

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410765687.8A CN104501814B (en) 2014-12-12 2014-12-12 Attitude and position estimation method based on vision and inertia information

Publications (2)

Publication Number Publication Date
CN104501814A true CN104501814A (en) 2015-04-08
CN104501814B CN104501814B (en) 2017-05-10

Family

ID=52943240

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410765687.8A Active CN104501814B (en) 2014-12-12 2014-12-12 Attitude and position estimation method based on vision and inertia information

Country Status (1)

Country Link
CN (1) CN104501814B (en)

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105547326A (en) * 2015-12-08 2016-05-04 上海交通大学 Integrated calibration method for gyro and magnetic transducer
CN105953796A (en) * 2016-05-23 2016-09-21 北京暴风魔镜科技有限公司 Stable motion tracking method and stable motion tracking device based on integration of simple camera and IMU (inertial measurement unit) of smart cellphone
CN106020240A (en) * 2016-05-25 2016-10-12 南京安透可智能系统有限公司 Holder control system of autonomous homing calibration
CN106546238A (en) * 2016-10-26 2017-03-29 北京小鸟看看科技有限公司 Wearable device and the method that user's displacement is determined in wearable device
CN106595640A (en) * 2016-12-27 2017-04-26 天津大学 Moving-base-object relative attitude measuring method based on dual-IMU-and-visual fusion and system
CN106989773A (en) * 2017-04-07 2017-07-28 浙江大学 A kind of attitude transducer and posture renewal method
CN107389968A (en) * 2017-07-04 2017-11-24 武汉视览科技有限公司 A kind of unmanned plane fixed-point implementation method and apparatus based on light stream sensor and acceleration transducer
CN107491099A (en) * 2017-08-30 2017-12-19 浙江华飞智能科技有限公司 A kind of cloud platform control method and device of view-based access control model and gyroscope
CN107850901A (en) * 2015-05-23 2018-03-27 深圳市大疆创新科技有限公司 Merged using the sensor of inertial sensor and imaging sensor
CN107888828A (en) * 2017-11-22 2018-04-06 网易(杭州)网络有限公司 Space-location method and device, electronic equipment and storage medium
CN107941217A (en) * 2017-09-30 2018-04-20 杭州迦智科技有限公司 A kind of robot localization method, electronic equipment, storage medium, device
CN107941212A (en) * 2017-11-14 2018-04-20 杭州德泽机器人科技有限公司 A kind of vision and inertia joint positioning method
CN108537094A (en) * 2017-03-03 2018-09-14 株式会社理光 Image processing method, device and system
CN109631938A (en) * 2018-12-28 2019-04-16 湖南海迅自动化技术有限公司 Development machine autonomous positioning orientation system and method
CN110196047A (en) * 2019-06-20 2019-09-03 东北大学 Robot autonomous localization method of closing a position based on TOF depth camera and IMU
CN110992405A (en) * 2018-10-01 2020-04-10 三星电子株式会社 Method and apparatus for outputting attitude information
CN111417016A (en) * 2019-01-07 2020-07-14 中国移动通信有限公司研究院 Attitude estimation method, server and network equipment
WO2020258901A1 (en) * 2019-06-25 2020-12-30 上海商汤智能科技有限公司 Method and apparatus for processing data of sensor, electronic device, and system
CN112461237A (en) * 2020-11-26 2021-03-09 浙江同善人工智能技术有限公司 Multi-sensor fusion positioning algorithm applied to dynamic change scene
CN113158459A (en) * 2021-04-20 2021-07-23 浙江工业大学 Human body posture estimation method based on visual and inertial information fusion
CN116801303A (en) * 2023-07-27 2023-09-22 测速网技术(南京)有限公司 ARCore-based indoor signal strength detection method and device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102162738A (en) * 2010-12-08 2011-08-24 中国科学院自动化研究所 Calibration method of camera and inertial sensor integrated positioning and attitude determining system
CN102435188A (en) * 2011-09-15 2012-05-02 南京航空航天大学 Monocular vision/inertia autonomous navigation method for indoor environment
CN102538781A (en) * 2011-12-14 2012-07-04 浙江大学 Machine vision and inertial navigation fusion-based mobile robot motion attitude estimation method
CN103200358A (en) * 2012-01-06 2013-07-10 杭州普维光电技术有限公司 Coordinate transformation method and device between camera and goal scene

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102162738A (en) * 2010-12-08 2011-08-24 中国科学院自动化研究所 Calibration method of camera and inertial sensor integrated positioning and attitude determining system
CN102435188A (en) * 2011-09-15 2012-05-02 南京航空航天大学 Monocular vision/inertia autonomous navigation method for indoor environment
CN102538781A (en) * 2011-12-14 2012-07-04 浙江大学 Machine vision and inertial navigation fusion-based mobile robot motion attitude estimation method
CN103200358A (en) * 2012-01-06 2013-07-10 杭州普维光电技术有限公司 Coordinate transformation method and device between camera and goal scene

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
XIA Lingnan et al., "Robot localization based on inertial sensors and visual odometry", Chinese Journal of Scientific Instrument *
JIANG Hongxiang et al., "Vision-guided control and simulation of a novel unmanned helicopter", Information and Control *
ZHENG Xiangyang et al., "Navigation and localization technology for mobile robots", Journal of Mechanical & Electrical Engineering *

Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113093808A (en) * 2015-05-23 2021-07-09 深圳市大疆创新科技有限公司 Sensor fusion using inertial and image sensors
CN107850901A (en) * 2015-05-23 2018-03-27 深圳市大疆创新科技有限公司 Merged using the sensor of inertial sensor and imaging sensor
CN107850901B (en) * 2015-05-23 2021-04-16 深圳市大疆创新科技有限公司 Sensor fusion using inertial and image sensors
CN105547326A (en) * 2015-12-08 2016-05-04 上海交通大学 Integrated calibration method for gyro and magnetic transducer
CN105547326B (en) * 2015-12-08 2018-04-06 上海交通大学 Gyro and Magnetic Sensor combined calibrating method
CN105953796A (en) * 2016-05-23 2016-09-21 北京暴风魔镜科技有限公司 Stable motion tracking method and stable motion tracking device based on integration of simple camera and IMU (inertial measurement unit) of smart cellphone
CN106020240A (en) * 2016-05-25 2016-10-12 南京安透可智能系统有限公司 Holder control system of autonomous homing calibration
CN106546238A (en) * 2016-10-26 2017-03-29 北京小鸟看看科技有限公司 Wearable device and the method that user's displacement is determined in wearable device
CN106595640A (en) * 2016-12-27 2017-04-26 天津大学 Moving-base-object relative attitude measuring method based on dual-IMU-and-visual fusion and system
CN108537094A (en) * 2017-03-03 2018-09-14 株式会社理光 Image processing method, device and system
CN108537094B (en) * 2017-03-03 2022-11-22 株式会社理光 Image processing method, device and system
CN106989773A (en) * 2017-04-07 2017-07-28 浙江大学 A kind of attitude transducer and posture renewal method
CN106989773B (en) * 2017-04-07 2019-07-16 浙江大学 A kind of attitude transducer and posture renewal method
CN107389968A (en) * 2017-07-04 2017-11-24 武汉视览科技有限公司 A kind of unmanned plane fixed-point implementation method and apparatus based on light stream sensor and acceleration transducer
CN107491099A (en) * 2017-08-30 2017-12-19 浙江华飞智能科技有限公司 A kind of cloud platform control method and device of view-based access control model and gyroscope
CN107941217A (en) * 2017-09-30 2018-04-20 杭州迦智科技有限公司 A kind of robot localization method, electronic equipment, storage medium, device
CN107941217B (en) * 2017-09-30 2020-05-22 杭州迦智科技有限公司 Robot positioning method, electronic equipment, storage medium and device
CN107941212B (en) * 2017-11-14 2020-07-28 杭州德泽机器人科技有限公司 Vision and inertia combined positioning method
CN107941212A (en) * 2017-11-14 2018-04-20 杭州德泽机器人科技有限公司 A kind of vision and inertia joint positioning method
CN107888828B (en) * 2017-11-22 2020-02-21 杭州易现先进科技有限公司 Space positioning method and device, electronic device, and storage medium
CN107888828A (en) * 2017-11-22 2018-04-06 网易(杭州)网络有限公司 Space-location method and device, electronic equipment and storage medium
CN110992405A (en) * 2018-10-01 2020-04-10 三星电子株式会社 Method and apparatus for outputting attitude information
CN109631938A (en) * 2018-12-28 2019-04-16 湖南海迅自动化技术有限公司 Development machine autonomous positioning orientation system and method
CN111417016A (en) * 2019-01-07 2020-07-14 中国移动通信有限公司研究院 Attitude estimation method, server and network equipment
CN110196047A (en) * 2019-06-20 2019-09-03 东北大学 Robot autonomous localization method of closing a position based on TOF depth camera and IMU
WO2020258901A1 (en) * 2019-06-25 2020-12-30 上海商汤智能科技有限公司 Method and apparatus for processing data of sensor, electronic device, and system
CN112461237A (en) * 2020-11-26 2021-03-09 浙江同善人工智能技术有限公司 Multi-sensor fusion positioning algorithm applied to dynamic change scene
CN113158459A (en) * 2021-04-20 2021-07-23 浙江工业大学 Human body posture estimation method based on visual and inertial information fusion
CN116801303A (en) * 2023-07-27 2023-09-22 测速网技术(南京)有限公司 ARCore-based indoor signal strength detection method and device

Also Published As

Publication number Publication date
CN104501814B (en) 2017-05-10

Similar Documents

Publication Publication Date Title
CN104501814A (en) Attitude and position estimation method based on vision and inertia information
CN107888828B (en) Space positioning method and device, electronic device, and storage medium
CN110084832B (en) Method, device, system, equipment and storage medium for correcting camera pose
CN107748569B (en) Motion control method and device for unmanned aerial vehicle and unmanned aerial vehicle system
CN105953796A (en) Stable motion tracking method and stable motion tracking device based on integration of simple camera and IMU (inertial measurement unit) of smart cellphone
CN109506642B (en) Robot multi-camera visual inertia real-time positioning method and device
Li et al. Real-time motion tracking on a cellphone using inertial sensing and a rolling-shutter camera
US12062210B2 (en) Data processing method and apparatus
US11181379B2 (en) System and method for enhancing non-inertial tracking system with inertial constraints
CN110660098B (en) Positioning method and device based on monocular vision
KR101985344B1 (en) Sliding windows based structure-less localization method using inertial and single optical sensor, recording medium and device for performing the method
CN114013449B (en) Data processing method and device for automatic driving vehicle and automatic driving vehicle
CN110296702A (en) Visual sensor and the tightly coupled position and orientation estimation method of inertial navigation and device
CN110231028B (en) Aircraft navigation method, device and system
CN111665512B (en) Ranging and mapping based on fusion of 3D lidar and inertial measurement unit
CN112880687A (en) Indoor positioning method, device, equipment and computer readable storage medium
CN109059907A (en) Track data processing method, device, computer equipment and storage medium
CN108871311A (en) Pose determines method and apparatus
CN113066127B (en) Visual inertial odometer method and system for calibrating equipment parameters on line
CN112945227A (en) Positioning method and device
WO2020135183A1 (en) Method and apparatus for constructing point cloud map, computer device, and storage medium
JP2014186004A (en) Measurement device, method and program
CN113587934A (en) Robot, indoor positioning method and device and readable storage medium
CN115218906A (en) Indoor SLAM-oriented visual inertial fusion positioning method and system
CN112731503B (en) Pose estimation method and system based on front end tight coupling

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant