CN108318029A - Attitude Tracking and image superimposing method and display equipment - Google Patents


Info

Publication number
CN108318029A
CN108318029A (Application CN201711204062.4A)
Authority
CN
China
Prior art keywords
coordinate system
image
determining
situation
Kalman filter
Prior art date
Legal status
Pending
Application number
CN201711204062.4A
Other languages
Chinese (zh)
Inventor
贺长宇
张红
Current Assignee
China Electronics Technology Group Corp CETC
Original Assignee
China Electronics Technology Group Corp CETC
Priority date
Filing date
Publication date
Application filed by China Electronics Technology Group Corp CETC
Priority to CN201711204062.4A priority Critical patent/CN108318029A/en
Publication of CN108318029A publication Critical patent/CN108318029A/en
Pending

Links

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation by using measurements of speed or acceleration
    • G01C21/12 Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/18 Stabilised platforms, e.g. by gyroscope
    • G01C21/26 Navigation specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)
  • Navigation (AREA)

Abstract

The invention discloses an attitude tracking and image superposition method and a display device. The method includes: determining pose information of the display device through a preset extended Kalman filter; generating a situation image according to the pose information; determining an image conversion relation between the situation image and the real scene image according to a predetermined coordinate system conversion relation between the situation image model coordinate system and the display device coordinate system; and superimposing the generated situation image onto the real scene image of the display device according to the image conversion relation. The invention significantly improves the accuracy and safety of augmented reality display.

Description

Posture tracking and image superposition method and display equipment
Technical Field
The invention relates to the technical field of augmented reality, in particular to a posture tracking and image superposition method and a display device.
Background
The augmented reality display navigation technology provides invisible navigation or auxiliary image information for a user through the superposition of image information, and can improve the visualization effect and the application efficiency of various applications. The situation display and navigation system based on the augmented reality technology enables a user to observe a target and a navigation image in a real environment at the same time.
In the existing augmented reality navigation, the accuracy and safety of augmented reality display are low.
Disclosure of Invention
In order to overcome the above defects, the present invention provides a posture tracking and image superposition method and a display device, so as to improve the accuracy and safety of augmented reality display.
In order to solve the above technical problem, a method for tracking a pose and superimposing an image according to the present invention includes:
determining pose information of the display equipment through a preset extended Kalman filter;
generating a situation image according to the pose information;
determining an image conversion relation between the situation image and the real scene image according to a coordinate system conversion relation between a predetermined situation image model coordinate system and a display equipment coordinate system;
and according to the image conversion relation, superimposing the generated situation image to the real scene image of the display equipment.
In order to solve the above technical problem, a display device according to the present invention includes an extended Kalman filter, a memory, and a processor; the memory stores a computer program for attitude tracking and image superposition; the processor executes the computer program to implement the steps of the method described above.
The invention has the following beneficial effects:
according to the method, the position and the posture are tracked by the extended Kalman filter, the posture information of the display equipment is obtained, the posture image is generated, and the posture image is superposed into the real scene image according to the determined image conversion relation, so that the accuracy and the safety of augmented reality display are obviously improved.
Drawings
FIG. 1 is a flow chart of a method for pose tracking and image overlay in an embodiment of the present invention;
FIG. 2 is a flow chart of an alternative pose tracking and image overlay method in an embodiment of the present invention;
FIG. 3 is a flow chart of an extended Kalman filter tracking position and attitude in an embodiment of the present invention.
Detailed Description
In order to solve the problems of the prior art, the present invention provides a method for tracking a pose and superimposing an image and a display device, and the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and do not limit the invention.
Example one
The embodiment of the invention provides a method for tracking a posture and superposing an image, which comprises the following steps of:
s101, determining pose information of display equipment through a preset Extended Kalman Filter (EKF);
s102, generating a situation image according to the pose information;
s103, determining an image conversion relation between the situation image and the real scene image according to a predetermined coordinate system conversion relation between a situation image model coordinate system and a display equipment coordinate system;
and S104, superimposing the generated situation image to the real scene image of the display equipment according to the image conversion relation.
The method in the embodiment of the invention is executed in a display device, for example, a head-mounted display device.
The real scene image in the embodiment of the invention can be an image shot by the display device through the optical system directly.
According to the embodiment of the invention, the extended Kalman filter tracks the position and posture to obtain the pose information of the display device, the situation image is generated, and the situation image is superimposed onto the real scene image according to the determined image conversion relation, so the accuracy and safety of augmented reality display are significantly improved. In application, the method helps the user judge the public security situation and the positions of targets in the real environment, and improves the efficiency of situation perception.
In this embodiment of the present invention, optionally, the determining, by using an extended kalman filter, the pose information of the display device includes:
acquiring offset through a preset inertial sensor system;
determining the speed, the three-dimensional posture and the three-dimensional position of a tracked target through a preset optical system;
and determining the pose information through the extended Kalman filter according to the offset, the speed, the three-dimensional posture and the three-dimensional position of the target.
The pose information in the embodiment of the invention comprises a position and a posture; the inertial sensor system includes a plurality of types of inertial sensors; such as gyroscopes, accelerometers and magnetic sensors.
The embodiment of the invention fuses poses for augmented reality security situation display and navigation through hybrid tracking with the optical system and the inertial sensor system: extended Kalman filtering performs the data fusion, integrating the inertial tracking result, which updates faster, with the optical tracking result, which is more precise, while also suppressing errors caused by optical occlusion. This improves the refresh rate and robustness of the tracking system and significantly improves the accuracy and safety of augmented reality display.
Example two
An embodiment of the present invention provides an optional method for posture tracking and image superposition, as shown in fig. 2; the method includes:
step 1, modeling an inertial sensor error by using an FIR (Finite Impulse Response) filtering mode; the method specifically comprises the following steps:
determining the offset error of each inertial sensor in the inertial sensor system through a preset finite-length unit impulse response filter;
and correcting each inertial sensor according to the offset error.
In detail, the embodiment of the invention removes the noise in the bias error model of the inertial sensor by using an FIR filter.
The embodiment of the invention uses an FIR filter of order 20, which effectively balances computation rate and precision. The output of the filter is the offset of the inertial sensor; its inputs are the filter coefficients and the measured and ideal values of the inertial sensor:

B[n] = Σ (i = 0 … 20) b_i · (m[n−i] − x[n−i])

where b_i are the filter coefficients, m the measured values and x the ideal values. As determined by trial and error, the filter design parameters are: sampling frequency 400 Hz, cut-off frequency 50 Hz, order 20.
After these parameters are substituted into the filter to remove noise, the offset error in the measured values of each inertial sensor is obtained.
According to the embodiment of the invention, the influence of the inertial sensor error on the tracking precision can be reduced by using the finite-length unit impulse response filter, and the tracking precision is improved.
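The FIR denoising step above can be sketched in Python with the stated design parameters (order 20, cut-off frequency 50 Hz, sampling frequency 400 Hz). The windowed-sinc design with a Hamming window and all function names are illustrative assumptions; the patent does not give its actual coefficients.

```python
import math

def fir_lowpass(order, fc, fs):
    """Windowed-sinc low-pass FIR coefficients (Hamming window),
    normalized to unit DC gain. order + 1 taps are returned."""
    n = order
    taps = []
    for i in range(n + 1):
        m = i - n / 2.0
        # ideal low-pass impulse response (sinc), centered on the filter
        h = 2 * fc / fs if m == 0 else math.sin(2 * math.pi * fc * m / fs) / (math.pi * m)
        w = 0.54 - 0.46 * math.cos(2 * math.pi * i / n)  # Hamming window
        taps.append(h * w)
    s = sum(taps)
    return [t / s for t in taps]

def fir_filter(taps, x):
    """Causal FIR: y[n] = sum_i taps[i] * x[n - i]."""
    y = []
    for k in range(len(x)):
        acc = 0.0
        for i, b in enumerate(taps):
            if k - i >= 0:
                acc += b * x[k - i]
        y.append(acc)
    return y
```

Applying `fir_filter(fir_lowpass(20, 50.0, 400.0), samples)` to raw bias samples smooths measurement noise; the first 20 output samples are transient while the filter delay line fills.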
Step 2, tracking the attitude and the position based on the extended Kalman filter; determining pose information of the display equipment through a preset extended Kalman filter; and generating a situation image according to the pose information.
As shown in fig. 3, an embodiment of the present invention uses an extended kalman filter to track the pose and position of a target to determine pose information of a display device, including:
acquiring offset through a preset inertial sensor system;
determining the speed, the three-dimensional posture and the three-dimensional position of a tracked target through a preset optical system;
and determining the pose information through the extended Kalman filter according to the offset, the speed, the three-dimensional posture and the three-dimensional position of the target.
Determining the pose information through the extended Kalman filter according to the offset, the speed of the target, the three-dimensional posture and the three-dimensional position, wherein the determining comprises the following steps:
constructing a state variable of the extended Kalman filter according to the offset, the speed, the three-dimensional posture and the three-dimensional position of the target;
and in the filtering cycle of the extended Kalman filter, estimating and updating the state variable to obtain the pose information.
According to the embodiment of the invention, the inertial sensor system and the optical system are utilized to form the hybrid tracking system, so that the influence of the partial shielding and the short-time complete shielding of the mark point on the position tracking is effectively overcome, and the robustness of the tracking system is improved.
In detail, the state variables of the extended Kalman filter are estimated and updated in each filtering cycle. The state variable in the embodiment of the present invention consists of a quaternion representing the three-dimensional attitude of the target, the position of the target, the velocity of the target, and the offsets of the inertial sensors:

q = [q1 q2 q3 q4]^T
p = [X Y Z]^T
v = [vx vy vz]^T
B = [Bgx Bgy Bgz Bax Bay Baz Bmx Bmy Bmz]^T

where q represents the three-dimensional attitude of the target as a quaternion, q1 being the real (scalar) part and q2, q3 and q4 the imaginary components; p represents the three-dimensional position of the target, with X, Y and Z the components along the three axes of the reference coordinate system; v is the velocity of the tracked target; B is the offset vector, whose subscripts g, a and m denote the gyroscope, the accelerometer and the magnetic sensor respectively. The velocity and offset state variables are updated according to:

v_k = v_{k-1} + Δt · a_k
B_k = B_{k-1} + n_b

where a_k is the measured acceleration and n_b the offset process noise. The Kalman gain matrix is updated by:

K_k = P_k⁻ · H^T · (H · P_k⁻ · H^T + R)^{-1}

where R is the measurement noise covariance matrix, P_k⁻ is the prior error covariance matrix of the system, and H is the Jacobian matrix converting between the system measurement values and the state quantities. The measurement update equation of the system then follows:

x̂_k = x̂_k⁻ + K_k · (Z_k − H · x̂_k⁻)

where the measurement value Z is the position and attitude information acquired by the optical system.
The situation image in the embodiment of the invention can be a public security situation navigation image. Generating the situation image in the embodiment of the invention may include:
capturing the real-time positions and postures of the real scene, the user and the display device with the hybrid tracking system composed of the optical tracking system and the inertial tracking system, and generating the public security situation navigation image after the three-dimensional situation information model is registered with the situation information.
Step 3, registering the coordinate systems of the two navigation systems in space; this specifically includes:
registering a situation image model coordinate system, a reference coordinate system of a real scene image, an optical system coordinate system and the display equipment coordinate system;
and determining the coordinate system conversion relation according to the registration result.
Optionally, the registering the posture image model coordinate system, the reference coordinate system of the real scene image, the optical system coordinate system, and the display device coordinate system includes:
registering the situation image model coordinate system and the reference coordinate system;
registering the reference coordinate system and the optical system coordinate system;
registering the optical system coordinate system and the display device coordinate system.
Optionally, the registering comprises, for any two coordinate systems:
determining the optimal translation vectors of two coordinate systems;
and registering the two coordinate systems according to the optimal translation vector.
Wherein the determining the optimal translation vector for the two coordinate systems comprises:
selecting the corresponding points with the same quantity from the two coordinate systems to form two groups of point sets;
calculating the gravity centers of the two groups of point sets;
forming a covariance matrix of two groups of point sets according to the gravity center;
forming a symmetric matrix according to the covariance matrix;
calculating the eigenvalue and the eigenvector of the symmetric matrix to obtain the eigenvector corresponding to the maximum eigenvalue;
and determining the optimal translation vector according to a preset conversion formula of two coordinate systems and the obtained characteristic vector.
In detail, the transformation between two coordinate systems in space can be represented by the following conversion formula:

p_a = R_ab · p_b + T_ab

where p_a denotes the three-dimensional position coordinates of the point in coordinate system a and p_b those in coordinate system b; R_ab is the rotation matrix from coordinate system b to coordinate system a, which in the embodiment of the invention is the optimal rotation matrix; T_ab is the displacement vector from coordinate system b to coordinate system a, which in the embodiment of the invention is the optimal translation vector.
Matching of the two sets of point coordinates is performed by the ICP (Iterative Closest Point) algorithm.
During calculation, corresponding points of equal number are selected in the two coordinate systems and the centroids of the two point sets are calculated:

μ_a = (1/N) Σ a_i,  μ_b = (1/N) Σ b_i

where a and b represent the two point sets in the two coordinate systems. From them, the cross-covariance matrix of the two point sets is formed:

Σ_ab = (1/N) Σ (a_i − μ_a)(b_i − μ_b)^T

The covariance matrix then constitutes a 4×4 symmetric matrix:

Q(Σ_ab) = [ tr(Σ_ab)   Δ^T
            Δ          Σ_ab + Σ_ab^T − tr(Σ_ab)·I_3 ]

where tr(Σ_ab) is the trace of the matrix Σ_ab, I_3 is the 3×3 identity matrix, and

Δ = [A_23 A_31 A_12]^T,  A = Σ_ab − Σ_ab^T

The eigenvector of the matrix Q(Σ_ab) corresponding to the largest eigenvalue is the unit quaternion of the optimal rotation, from which the optimal rotation matrix Rot is obtained. This yields the optimal translation vector T:

T = μ_b − Rot·μ_a
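The centroid, covariance and eigenvector steps above follow the closed-form quaternion registration of Besl and McKay. The sketch below implements them in pure Python; the shifted power iteration used as eigen-solver and all names are implementation choices, not specified by the patent.

```python
import math

def centroid(pts):
    n = len(pts)
    return [sum(p[i] for p in pts) / n for i in range(3)]

def cross_cov(a, b, mu_a, mu_b):
    """3x3 cross-covariance of point sets a, b about their centroids."""
    S = [[0.0] * 3 for _ in range(3)]
    for p, q in zip(a, b):
        for i in range(3):
            for j in range(3):
                S[i][j] += (p[i] - mu_a[i]) * (q[j] - mu_b[j])
    n = len(a)
    return [[S[i][j] / n for j in range(3)] for i in range(3)]

def horn_registration(a, b):
    """Closed-form registration: returns (Rot, T) with Rot*a_i + T ~ b_i."""
    mu_a, mu_b = centroid(a), centroid(b)
    S = cross_cov(a, b, mu_a, mu_b)
    tr = S[0][0] + S[1][1] + S[2][2]
    A = [[S[i][j] - S[j][i] for j in range(3)] for i in range(3)]
    d = [A[1][2], A[2][0], A[0][1]]          # Delta = [A23 A31 A12]
    Q = [[tr, d[0], d[1], d[2]],             # 4x4 symmetric matrix Q(Sigma)
         [d[0], 0.0, 0.0, 0.0],
         [d[1], 0.0, 0.0, 0.0],
         [d[2], 0.0, 0.0, 0.0]]
    for i in range(3):
        for j in range(3):
            Q[i + 1][j + 1] = S[i][j] + S[j][i] - (tr if i == j else 0.0)
    # eigenvector of the largest eigenvalue via shifted power iteration
    shift = 1.0 + sum(abs(Q[i][j]) for i in range(4) for j in range(4))
    v = [1.0, 0.0, 0.0, 0.0]
    for _ in range(1000):
        w = [sum((Q[i][j] + (shift if i == j else 0.0)) * v[j] for j in range(4))
             for i in range(4)]
        nrm = math.sqrt(sum(x * x for x in w))
        v = [x / nrm for x in w]
    q0, q1, q2, q3 = v                        # unit quaternion of optimal rotation
    Rot = [[1 - 2 * (q2 * q2 + q3 * q3), 2 * (q1 * q2 - q0 * q3), 2 * (q1 * q3 + q0 * q2)],
           [2 * (q1 * q2 + q0 * q3), 1 - 2 * (q1 * q1 + q3 * q3), 2 * (q2 * q3 - q0 * q1)],
           [2 * (q1 * q3 - q0 * q2), 2 * (q2 * q3 + q0 * q1), 1 - 2 * (q1 * q1 + q2 * q2)]]
    T = [mu_b[i] - sum(Rot[i][j] * mu_a[j] for j in range(3)) for i in range(3)]
    return Rot, T
```

In the patent's setting, the two point sets are the corresponding marker points expressed in the two coordinate systems to be registered.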
The coordinate systems to be registered include: the three-dimensional situation information model coordinate system (IMG) corresponding to the situation information image, the reference coordinate system (REF) corresponding to the real scene position, the display device coordinate system (TOOL), the optical tracking system coordinate system (OTS) and the inertial tracking system coordinate system (IMU).
Step 4, converting among the three-dimensional situation information model coordinate system, the reference coordinate system of the real scene position, and the display device coordinate system; this specifically includes: determining the image conversion relation between the situation image and the real scene image according to the predetermined coordinate system conversion relation between the situation image model coordinate system and the display device coordinate system.
The situation image model in the embodiment of the invention can also be called as a situation information model, and comprises a three-dimensional situation information model.
In detail, after the real scene image is obtained, the position coordinates of the marker points in the real scene are extracted, and the conversion relation from the three-dimensional situation information model coordinate system (IMG) to the reference coordinate system (REF) of the real scene can be calculated.
The optical tracking system provides the conversion relation from the reference coordinate system (REF) to the optical tracking system (OTS), and the conversion relation from the optical tracking system (OTS) to the display device coordinate system (TOOL).
Further, by composing these conversion relations, the conversion relation from the augmented reality image to the real scene image, i.e. from IMG to TOOL, can be obtained.
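Composing the coordinate conversions IMG to REF, REF to OTS and OTS to TOOL can be sketched with 4x4 homogeneous transforms. The identity rotations and translation values below are placeholders for illustration, not values from the patent.

```python
def mat4(R, t):
    """Build a 4x4 homogeneous transform from 3x3 rotation R and translation t."""
    return [[R[0][0], R[0][1], R[0][2], t[0]],
            [R[1][0], R[1][1], R[1][2], t[1]],
            [R[2][0], R[2][1], R[2][2], t[2]],
            [0.0, 0.0, 0.0, 1.0]]

def compose(A, B):
    """Matrix product A*B: applies transform B first, then A."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)] for i in range(4)]

def apply(M, p):
    """Transform a 3D point p by homogeneous matrix M."""
    ph = [p[0], p[1], p[2], 1.0]
    return [sum(M[i][j] * ph[j] for j in range(4)) for i in range(3)]

I3 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
# placeholder transforms for the chain IMG -> REF -> OTS -> TOOL
T_ref_img = mat4(I3, [0.0, 0.0, 1.0])    # IMG to REF
T_ots_ref = mat4(I3, [0.0, 2.0, 0.0])    # REF to OTS
T_tool_ots = mat4(I3, [3.0, 0.0, 0.0])   # OTS to TOOL
# overall conversion from the situation image model to the display device
T_tool_img = compose(T_tool_ots, compose(T_ots_ref, T_ref_img))
```

Reading the product right to left matches the text: a point in IMG is first taken into REF, then into OTS, then into TOOL.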
And 5, displaying safety situation information under the assistance of real-time augmented reality navigation display.
In practical application, the augmented reality image is sent to the augmented reality display device, and the situation information navigation image is matched against the tracking error and displayed in real time through a pose tracking error visualization method. Assisted by the situation information navigation system, the user performs public security situation perception and auxiliary information acquisition, and judges the relative positions of the real scene and the public security situation or the objects of key attention through the navigation image.
According to the embodiment of the invention, the real-time positions and postures of the real scene, the user and the display device are captured by the hybrid tracking system formed by the optical and inertial tracking systems, and the public security situation navigation image can be generated after the three-dimensional situation information model is registered with the situation information.
EXAMPLE III
The embodiment of the invention provides display equipment, which comprises an extended Kalman filter, a memory and a processor; the memory stores a computer program for attitude tracking and image superposition; the processor executes the computer program to implement the steps of the method according to any one of the first or second embodiments.
Optionally, the apparatus further comprises an inertial sensor system and an optical system.
When the embodiment of the invention is specifically implemented, reference can be made to the first embodiment and the second embodiment, and corresponding technical effects are achieved.
The above-mentioned embodiments are intended to illustrate the objects, technical solutions and advantages of the present invention in further detail, and it should be understood that the above-mentioned embodiments are merely exemplary embodiments of the present invention, and are not intended to limit the scope of the present invention, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (10)

1. A method of pose tracking and image overlay, the method comprising:
determining pose information of the display equipment through a preset extended Kalman filter;
generating a situation image according to the pose information;
determining an image conversion relation between the situation image and the real scene image according to a predetermined coordinate system conversion relation between a situation image model coordinate system and a display equipment coordinate system;
and according to the image conversion relation, superimposing the generated situation image to the real scene image of the display equipment.
2. The method of claim 1, wherein determining pose information for the display device by the extended kalman filter comprises:
acquiring offset through a preset inertial sensor system;
determining the speed, the three-dimensional posture and the three-dimensional position of a tracked target through a preset optical system;
and determining the pose information through the extended Kalman filter according to the offset, the speed, the three-dimensional posture and the three-dimensional position of the target.
3. The method of claim 2, wherein the determining the pose information by the extended kalman filter according to the offset, the velocity, the three-dimensional attitude, and the three-dimensional position of the target comprises:
constructing a state variable of the extended Kalman filter according to the offset, the speed, the three-dimensional posture and the three-dimensional position of the target;
and in the filtering cycle of the extended Kalman filter, estimating and updating the state variable to obtain the pose information.
4. The method of claim 2, wherein prior to collecting the offset by the predetermined inertial sensor system, comprising:
determining the offset error of each inertial sensor in the inertial sensor system through a preset finite-length unit impulse response filter;
and correcting each inertial sensor according to the offset error.
5. The method of claim 2, wherein before determining the image transformation relationship between the posture image and the real scene image according to the predetermined coordinate system transformation relationship between the posture image model coordinate system and the display device coordinate system, the method comprises:
registering a situation image model coordinate system, a reference coordinate system of a real scene image, an optical system coordinate system and the display equipment coordinate system;
and determining the coordinate system conversion relation according to the registration result.
6. The method of claim 5, wherein the registering the posture image model coordinate system, the reference coordinate system of the real scene image, the optical system coordinate system, and the display device coordinate system comprises:
registering the situation image model coordinate system and the reference coordinate system;
registering the reference coordinate system and the optical system coordinate system;
registering the optical system coordinate system and the display device coordinate system.
7. The method of claim 6, wherein the registering comprises, for any two coordinate systems:
determining the optimal translation vectors of two coordinate systems;
and registering the two coordinate systems according to the optimal translation vector.
8. The method of claim 7, wherein determining the optimal translation vector for the two coordinate systems comprises:
selecting the corresponding points with the same quantity from the two coordinate systems to form two groups of point sets;
calculating the gravity centers of the two groups of point sets;
forming a covariance matrix of two groups of point sets according to the gravity center;
forming a symmetric matrix according to the covariance matrix;
calculating the eigenvalue and the eigenvector of the symmetric matrix to obtain the eigenvector corresponding to the maximum eigenvalue;
and determining the optimal translation vector according to a preset conversion formula of two coordinate systems and the obtained characteristic vector.
9. A display device, the device comprising an extended kalman filter, a memory, and a processor; the memory stores a computer program for attitude tracking and image superposition; the processor executes the computer program to implement the steps of the method according to any of claims 1-8.
10. The apparatus of claim 9, further comprising an inertial sensor system and an optical system.
CN201711204062.4A 2017-11-27 2017-11-27 Attitude Tracking and image superimposing method and display equipment Pending CN108318029A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711204062.4A CN108318029A (en) 2017-11-27 2017-11-27 Attitude Tracking and image superimposing method and display equipment


Publications (1)

Publication Number Publication Date
CN108318029A (en) 2018-07-24

Family

ID=62893101

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711204062.4A Pending CN108318029A (en) 2017-11-27 2017-11-27 Attitude Tracking and image superimposing method and display equipment



Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030210228A1 (en) * 2000-02-25 2003-11-13 Ebersole John Franklin Augmented reality situational awareness system and method
CN106569044A (en) * 2016-11-02 2017-04-19 西安电子科技大学 Immersive virtual reality system-based electromagnetic spectrum situation observation method
CN106846920A (en) * 2017-01-24 2017-06-13 南京航空航天大学 A kind of blank pipe aid decision-making method based on nature extraction of semantics
CN106909215A (en) * 2016-12-29 2017-06-30 深圳市皓华网络通讯股份有限公司 Based on the fire-fighting operation three-dimensional visualization command system being accurately positioned with augmented reality


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
贺长宇: "Research on hybrid pose tracking and visualization in augmented reality surgical navigation", China Doctoral Dissertations Full-text Database *
郭昌达: "Research on three-dimensional registration methods for augmented reality", China Masters' Theses Full-text Database *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109272454A (en) * 2018-07-27 2019-01-25 阿里巴巴集团控股有限公司 A kind of the coordinate system calibration method and device of augmented reality equipment
WO2020019962A1 (en) * 2018-07-27 2020-01-30 阿里巴巴集团控股有限公司 Coordinate system calibration method and device for augmented reality device
TWI712004B (en) * 2018-07-27 2020-12-01 開曼群島商創新先進技術有限公司 Coordinate system calibration method and device of augmented reality equipment
CN109815854A (en) * 2019-01-07 2019-05-28 亮风台(上海)信息科技有限公司 It is a kind of for the method and apparatus of the related information of icon to be presented on a user device
CN110879611A (en) * 2019-11-01 2020-03-13 中国电子科技集团公司电子科学研究院 Unmanned aerial vehicle cluster three-dimensional curve path tracking method and device


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (Application publication date: 20180724)