CN108318029A - Attitude tracking and image superposition method and display device - Google Patents

Attitude tracking and image superposition method and display device

Info

Publication number
CN108318029A
CN108318029A (application number CN201711204062.4A)
Authority
CN
China
Prior art keywords
image
coordinate systems
situation
coordinate system
kalman filter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201711204062.4A
Other languages
Chinese (zh)
Inventor
贺长宇
张红
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Electronics Technology Group Corp CETC
Electronic Science Research Institute of CETC
Original Assignee
China Electronics Technology Group Corp CETC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Electronics Technology Group Corp CETC filed Critical China Electronics Technology Group Corp CETC
Priority to CN201711204062.4A priority Critical patent/CN108318029A/en
Publication of CN108318029A publication Critical patent/CN108318029A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/18 Stabilised platforms, e.g. by gyroscope
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)
  • Navigation (AREA)

Abstract

The invention discloses an attitude tracking and image superposition method and a display device. The method includes: determining the pose information of the display device with a preset extended Kalman filter; generating a situation image according to the pose information; determining the image transformation relation between the situation image and the real-scene image according to a predetermined coordinate-system transformation relation between the situation-image model coordinate system and the display-device coordinate system; and superimposing the generated situation image onto the real-scene image of the display device according to the image transformation relation. The invention significantly improves the accuracy and safety of augmented reality display.

Description

Attitude tracking and image superposition method and display device
Technical field
The present invention relates to the field of augmented reality, and in particular to an attitude tracking and image superposition method and a display device.
Background technology
Augmented reality display and navigation technology provides users with navigation or auxiliary image information that is invisible to the naked eye by superimposing image information, and can improve the visualization effect and application efficiency of many kinds of applications. A situation display and navigation system based on augmented reality allows the user to observe targets in the real environment and the navigation image at the same time.
In existing augmented reality navigation, the accuracy and safety of augmented reality display are relatively low.
Summary of the invention
In order to overcome the above drawbacks, the technical problem to be solved by the present invention is to provide an attitude tracking and image superposition method and a display device that improve the accuracy and safety of augmented reality display.
In order to solve the above technical problem, an attitude tracking and image superposition method in the present invention includes:
determining the pose information of a display device with a preset extended Kalman filter;
generating a situation image according to the pose information;
determining the image transformation relation between the situation image and the real-scene image according to a predetermined coordinate-system transformation relation between the situation-image model coordinate system and the display-device coordinate system;
superimposing the generated situation image onto the real-scene image of the display device according to the image transformation relation.
In order to solve the above technical problem, a display device in the present invention includes an extended Kalman filter, a memory, and a processor; the memory stores an attitude tracking and image superposition computer program; the processor executes the computer program to realize the steps of the method described above.
The beneficial effects of the present invention are as follows:
The present invention uses an extended Kalman filter to track position and attitude, obtains the pose information of the display device, generates a situation image, and superimposes the situation image onto the real-scene image according to the determined image transformation relation, thereby significantly improving the accuracy and safety of augmented reality display.
Brief description of the drawings
Fig. 1 is a flow chart of an attitude tracking and image superposition method in an embodiment of the present invention;
Fig. 2 is a flow chart of an optional attitude tracking and image superposition method in an embodiment of the present invention;
Fig. 3 is a flow chart of position and attitude tracking with the extended Kalman filter in an embodiment of the present invention.
Detailed description of the embodiments
In order to solve the problems in the prior art, the present invention provides an attitude tracking and image superposition method and a display device. The present invention is described in further detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention and do not limit it.
Embodiment one
An embodiment of the present invention provides an attitude tracking and image superposition method. As shown in Fig. 1, the method includes:
S101: determining the pose information of the display device with a preset extended Kalman filter (EKF);
S102: generating a situation image according to the pose information;
S103: determining the image transformation relation between the situation image and the real-scene image according to a predetermined coordinate-system transformation relation between the situation-image model coordinate system and the display-device coordinate system;
S104: superimposing the generated situation image onto the real-scene image of the display device according to the image transformation relation.
In this embodiment the method is executed in a display device, for example a head-mounted display device.
In this embodiment the real-scene image can be an image shot directly by the optical system of the display device.
This embodiment uses the extended Kalman filter to track position and attitude, obtains the pose information of the display device, generates a situation image, and superimposes the situation image onto the real-scene image according to the determined image transformation relation, thereby significantly improving the accuracy and safety of augmented reality display. In application it can assist the user in judging the relative positions of the social security situation and targets in the real environment, improving the efficiency of social security situation awareness.
In this embodiment, optionally, determining the pose information of the display device with the extended Kalman filter includes:
acquiring bias values with a preset inertial sensor system;
determining the velocity, three-dimensional attitude, and three-dimensional position of the tracked target with a preset optical system;
determining the pose information with the extended Kalman filter according to the bias values and the velocity, three-dimensional attitude, and three-dimensional position of the target.
In this embodiment the pose information includes position and attitude; the inertial sensor system includes several types of inertial sensors, such as gyroscopes, accelerometers, and magnetic sensors.
By combining optical and inertial sensor tracking, this embodiment fuses the poses in augmented reality security situation display and navigation. Data fusion with the extended Kalman filter integrates the faster-updating inertial tracking result with the more precise optical tracking result, while suppressing errors caused by optical occlusion. This improves the refresh rate and robustness of the tracking system and significantly improves the accuracy and safety of augmented reality display.
Embodiment two
An embodiment of the present invention provides an optional attitude tracking and image superposition method. As shown in Fig. 2, the method includes:
Step 1: modeling inertial sensor errors with an FIR (Finite Impulse Response) filter. This specifically includes:
determining the bias error of each inertial sensor in the inertial sensor system with a preset finite impulse response filter;
correcting each inertial sensor according to its bias error.
Specifically, this embodiment uses the FIR filter to remove the noise in the inertial sensor bias-error model.
This embodiment uses an FIR filter of order 20 to balance computation rate and precision. Its output expression is:
y[n] = Σ_{k=0}^{20} b_k · x[n−k]
where the filter output y[n] is the bias estimate of the inertial sensor, the input x[n] is the deviation between the inertial sensor measurement and its ideal value, and b_k are the filter coefficients.
The filter design parameters, determined by repeated trials, are set as follows: sampling frequency 400 Hz, cutoff frequency 50 Hz, order 20.
Substituting these experimentally determined filter parameters into the filter yields the bias error of each inertial sensor measurement after noise removal.
This embodiment can use the finite impulse response filter to reduce the influence of inertial sensor errors on tracking precision, improving tracking accuracy.
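As an illustration of the bias-modeling step, the following sketch designs a 20th-order windowed-sinc low-pass FIR filter with the stated parameters (400 Hz sampling, 50 Hz cutoff) and applies it to a noisy bias signal. The Hamming window and the simulated drift signal are assumptions made for the example, not details taken from the patent:

```python
import numpy as np

def design_lowpass_fir(order=20, cutoff_hz=50.0, fs_hz=400.0):
    """Windowed-sinc low-pass FIR taps (order + 1 taps, Hamming window)."""
    n = np.arange(order + 1) - order / 2.0        # tap indices centered on 0
    fc = cutoff_hz / fs_hz                        # normalized cutoff (cycles/sample)
    taps = 2.0 * fc * np.sinc(2.0 * fc * n)       # ideal low-pass impulse response
    taps *= np.hamming(order + 1)                 # window to reduce ripple
    return taps / taps.sum()                      # normalize to unit DC gain

def estimate_bias(measured, taps):
    """Low-pass filter a bias signal to suppress wide-band sensor noise."""
    return np.convolve(measured, taps, mode="same")

# demo: recover a slowly drifting sensor bias buried in noise
fs = 400.0
t = np.arange(0, 2.0, 1.0 / fs)
true_bias = 0.05 + 0.01 * np.sin(2 * np.pi * 0.5 * t)   # slow drift
noisy = true_bias + 0.02 * np.random.default_rng(0).standard_normal(t.size)
est = estimate_bias(noisy, design_lowpass_fir())
```

Because the kernel is symmetric, the filter is zero-phase when applied with a centered convolution, so the bias estimate is not delayed relative to the measurements.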
Step 2: attitude and position tracking based on the extended Kalman filter, including determining the pose information of the display device with a preset extended Kalman filter, and generating a situation image according to the pose information.
As shown in Fig. 3, this embodiment uses the extended Kalman filter to track the attitude and position of the target and determine the pose information of the display device, including:
acquiring bias values with a preset inertial sensor system;
determining the velocity, three-dimensional attitude, and three-dimensional position of the tracked target with a preset optical system;
determining the pose information with the extended Kalman filter according to the bias values and the velocity, three-dimensional attitude, and three-dimensional position of the target.
Determining the pose information with the extended Kalman filter according to these quantities includes:
building the state variable of the extended Kalman filter from the bias values and the velocity, three-dimensional attitude, and three-dimensional position of the target;
in each filtering cycle of the extended Kalman filter, performing an estimation update on the state variable to obtain the pose information.
This embodiment forms a combined tracking system from the inertial sensor system and the optical system, effectively overcoming the influence of partial marker occlusion and brief complete occlusion on position tracking, and improving the robustness of the tracking system.
Specifically, the state variable of the extended Kalman filter is estimated and updated in each filtering cycle. In this embodiment the state variable consists of the quaternion representing the three-dimensional attitude of the target, the position of the target, the velocity of the target, and the biases of the inertial sensors:
q = [q1 q2 q3 q4]^T
P = [X Y Z]^T
v = [vx vy vz]^T
B = [Bgx Bgy Bgz Bax Bay Baz Bmx Bmy Bmz]^T
where q denotes the three-dimensional attitude of the target expressed as a quaternion, with q1 the scalar part and q2, q3, q4 the vector part. P denotes the three-dimensional position of the target, expressed as three-dimensional position coordinates; X, Y, and Z are its components along the three axes of the reference coordinate system. v is the velocity of the tracked target. B is the bias vector; the subscripts g, a, and m denote the gyroscope, the accelerometer, and the magnetic sensor respectively. The system state variables are updated according to the following equations:
q_k = q_{k-1} + (Δt/2)·Ω(ω_k)·q_{k-1}
p_k = p_{k-1} + Δt·v_k
v_k = v_{k-1} + Δt·a_k
B_k = B_{k-1} + n_b
where ω_k and a_k are the measured angular rate and acceleration, Ω(ω) is the quaternion rate matrix, and n_b is the bias process noise.
The Kalman gain matrix is updated by the following equation:
K_k = P_k⁻ · Hᵀ · (H · P_k⁻ · Hᵀ + R)⁻¹
where R is the measurement noise covariance matrix, P_k⁻ is the prior error covariance matrix of the system, and H is the Jacobian matrix relating the system state to the measurement. The measurement update equations of the system are then:
x̂_k = x̂_k⁻ + K_k · (Z − H · x̂_k⁻)
P_k = (I − K_k · H) · P_k⁻
where the measured value Z is the position and attitude information obtained by the optical system.
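The predict/gain/update cycle described above can be sketched in one dimension as follows. The state here is position, velocity, and accelerometer bias; the filter is linear in this reduced sketch (the patent's filter is extended because the quaternion attitude model is nonlinear), and all rates, noise covariances, and the simulated bias are illustrative assumptions, not the patent's parameters:

```python
import numpy as np

dt = 1.0 / 400.0                 # assumed 400 Hz inertial prediction rate
F = np.array([[1.0, dt, 0.0],    # p_k = p_{k-1} + dt * v_{k-1}
              [0.0, 1.0, -dt],   # v_k = v_{k-1} + dt * (a_meas - bias)
              [0.0, 0.0, 1.0]])  # bias_k = bias_{k-1} (random walk)
G = np.array([0.0, dt, 0.0])     # measured acceleration drives the v update
H = np.array([[1.0, 0.0, 0.0]])  # the optical system measures position only
Q = np.diag([1e-8, 1e-6, 1e-8])  # process noise covariance (assumed)
R = np.array([[1e-4]])           # optical measurement noise covariance (assumed)

def kf_step(x, P, a_meas, z=None):
    """One filtering cycle: inertial prediction plus optional optical update."""
    x = F @ x + G * a_meas                             # prediction
    P = F @ P @ F.T + Q
    if z is not None:                                  # an optical frame is available
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
        x = x + (K @ (z - H @ x)).ravel()              # measurement update
        P = (np.eye(3) - K @ H) @ P
    return x, P

# simulate a target with constant acceleration and a biased accelerometer
rng = np.random.default_rng(1)
x, P = np.zeros(3), np.eye(3)
p_true, v_true = 0.0, 0.0
for k in range(4000):                                  # 10 s of data
    v_true += dt * 1.0                                 # true acceleration 1 m/s^2
    p_true += dt * v_true
    a_meas = 1.0 + 0.2                                 # accelerometer bias 0.2 m/s^2
    z = None
    if k % 8 == 0:                                     # 50 Hz optical position fix
        z = np.array([p_true + 0.01 * rng.standard_normal()])
    x, P = kf_step(x, P, a_meas, z)
```

The fusion behavior matches the text: the inertial branch updates fast between optical frames, and the slower optical fixes both correct the pose and make the sensor bias observable.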
In this embodiment the situation image can be a social security situation navigation image. Generating the situation image can proceed as follows:
the combined tracking system, composed of the optical and inertial tracking systems, captures the real-time position and attitude of the real scene, the user, and the display device; after the three-dimensional situation information model is matched with the pose information, the social security situation navigation image can be generated.
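One way to sketch how a matched model and pose turn into an overlay is a pinhole projection of situation-model points into display pixel coordinates. The patent does not specify the imaging model; the intrinsics matrix K, the identity pose, and the sample points below are hypothetical values for illustration:

```python
import numpy as np

def project_points(model_pts, R, t, K):
    """Project 3-D situation-model points into display pixel coordinates.
    R, t: pose of the model frame in the display/camera frame (from tracking).
    K: 3x3 pinhole intrinsics of the display's optical system."""
    cam = model_pts @ R.T + t            # model frame -> camera frame
    uvw = cam @ K.T                      # perspective projection
    return uvw[:, :2] / uvw[:, 2:3]      # divide by depth -> pixels

# hypothetical intrinsics and pose for illustration
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.array([0.0, 0.0, 2.0])   # model 2 m in front of the display
pts = np.array([[0.0, 0.0, 0.0],              # model origin
                [0.1, 0.0, 0.0]])             # a point 10 cm to its right
px = project_points(pts, R, t, K)
```

Redrawing the projected overlay each time the tracker reports a new (R, t) is what keeps the situation image registered to the real scene.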
Step 3: registering the coordinate systems of the two navigation subsystems in space, including:
registering the situation-image model coordinate system, the reference coordinate system of the real-scene image, the optical system coordinate system, and the display-device coordinate system;
determining the coordinate-system transformation relation according to the registration result.
Optionally, registering the situation-image model coordinate system, the reference coordinate system of the real-scene image, the optical system coordinate system, and the display-device coordinate system includes:
registering the situation-image model coordinate system with the reference coordinate system;
registering the reference coordinate system with the optical system coordinate system;
registering the optical system coordinate system with the display-device coordinate system.
Optionally, for any two coordinate systems, the registration includes:
determining the best translation vector of the two coordinate systems;
registering the two coordinate systems according to the best translation vector.
Determining the best translation vector of the two coordinate systems includes:
choosing the same number of corresponding points in the two coordinate systems to form two groups of point sets;
calculating the centroids of the two point sets;
constructing the covariance matrix of the two point sets from the centroids;
constructing a symmetric matrix from the covariance matrix;
calculating the eigenvalues and eigenvectors of the symmetric matrix and taking the eigenvector corresponding to the maximum eigenvalue;
determining the best translation vector from the preset conversion formula of the two coordinate systems and the obtained eigenvector.
Specifically, the conversion between two coordinate systems in space can be expressed by the following conversion formula:
p_a = Rot_b^a · p_b + T_b^a
where the subscript a denotes the three-dimensional position coordinates of the target in coordinate system a, the subscript b denotes the three-dimensional position coordinates of the target in coordinate system b, Rot_b^a denotes the rotation matrix from coordinate system b to coordinate system a (in this embodiment the best rotation matrix), and T_b^a denotes the translation vector from coordinate system b to coordinate system a (in this embodiment the best translation vector).
The matching of the two groups of point coordinates is carried out by the ICP (Iterative Closest Point) algorithm.
When calculating, the same number of corresponding points are chosen in the two coordinate systems, and the centroids of the two point sets are calculated:
μ_a = (1/N)·Σ a_i,  μ_b = (1/N)·Σ b_i
where a and b represent the two point sets in the two coordinate systems. The covariance matrix is constructed from the two point sets:
Σ_{a,b} = (1/N)·Σ (a_i − μ_a)·(b_i − μ_b)ᵀ
This covariance matrix constitutes a 4×4 symmetric matrix:
Q(Σ_{a,b}) = [ tr(Σ_{a,b})   Δᵀ ; Δ   Σ_{a,b} + Σ_{a,b}ᵀ − tr(Σ_{a,b})·I₃ ]
where tr(Σ_{a,b}) is the trace of the matrix Σ_{a,b}, A = Σ_{a,b} − Σ_{a,b}ᵀ, and
Δ = [A₂₃ A₃₁ A₁₂]ᵀ
The eigenvalues and eigenvectors of the matrix Q(Σ_{a,b}) are calculated; the eigenvector corresponding to the maximum eigenvalue is the best rotation quaternion, from which the best rotation matrix Rot is obtained. The best translation vector T follows:
T = μ_b − Rot·μ_a
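The centroid/covariance/eigenvector procedure above is the closed-form quaternion registration step used inside ICP, and can be sketched as follows (the rotation, translation, and point sets in the test data are hypothetical):

```python
import numpy as np

def quat_to_rot(q):
    """Rotation matrix from a unit quaternion q = [w, x, y, z]."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]])

def register(points_a, points_b):
    """Closed-form registration: Rot, T such that points_b ~ Rot @ a_i + T."""
    mu_a, mu_b = points_a.mean(axis=0), points_b.mean(axis=0)
    # cross-covariance of the two centered point sets
    sigma = (points_a - mu_a).T @ (points_b - mu_b) / len(points_a)
    A = sigma - sigma.T
    delta = np.array([A[1, 2], A[2, 0], A[0, 1]])     # [A23, A31, A12]
    # 4x4 symmetric matrix whose top eigenvector is the rotation quaternion
    Qm = np.empty((4, 4))
    Qm[0, 0] = np.trace(sigma)
    Qm[0, 1:] = Qm[1:, 0] = delta
    Qm[1:, 1:] = sigma + sigma.T - np.trace(sigma) * np.eye(3)
    vals, vecs = np.linalg.eigh(Qm)                   # eigh: symmetric input
    q = vecs[:, np.argmax(vals)]                      # max-eigenvalue eigenvector
    rot = quat_to_rot(q)
    return rot, mu_b - rot @ mu_a                     # T = mu_b - Rot mu_a
```

Because the sign ambiguity of the eigenvector (q versus −q) maps to the same rotation matrix, no extra normalization step is needed.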
The coordinate systems registered in this way include: the three-dimensional situation information model coordinate system corresponding to the situation information image (IMG), the reference coordinate system corresponding to the real-scene position (REF), the display-device coordinate system (TOOL), the optical tracking system coordinate system (OTS), and the inertial tracking system coordinate system (IMU).
Step 4: converting among the three-dimensional situation information model coordinate system, the real-scene reference coordinate system, and the display-device coordinate system. This specifically includes: determining the image transformation relation between the situation image and the real-scene image according to the predetermined coordinate-system transformation relation between the situation-image model coordinate system and the display-device coordinate system.
In this embodiment the situation-image model is also called the situation information model, including the three-dimensional situation information model.
Specifically, after the real-scene image is obtained, the position coordinates of the marker points in the real scene are extracted, and the transformation relation between the reference coordinate system of the real scene and the three-dimensional situation information model coordinate system can be calculated.
The optical tracking system provides the transformation relation between the reference coordinate system and the optical tracking system, and the transformation relation between the display-device coordinates and the optical tracking system.
From these transformation relations, the transformation relation between the augmented reality image and the real-scene image can be obtained.
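The chaining of the transformation relations above can be sketched with 4×4 homogeneous transforms. The naming T_x_y (the transform that takes y-frame coordinates into the x frame) and all rotation/translation values below are hypothetical placeholders for illustration:

```python
import numpy as np

def make_T(R, t):
    """4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

def inv_T(T):
    """Inverse of a rigid transform without a general matrix inversion."""
    R, t = T[:3, :3], T[:3, 3]
    return make_T(R.T, -R.T @ t)

# hypothetical transforms between the coordinate systems named above
T_ref_img  = make_T(np.eye(3), np.array([0.0, 0.0, 1.0]))  # IMG -> REF
T_ots_ref  = make_T(np.eye(3), np.array([0.5, 0.0, 0.0]))  # REF -> OTS
T_ots_tool = make_T(np.eye(3), np.array([0.0, 0.2, 0.0]))  # TOOL -> OTS

# chain IMG -> REF -> OTS -> TOOL to place situation content on the display
T_tool_img = inv_T(T_ots_tool) @ T_ots_ref @ T_ref_img
p_tool = T_tool_img @ np.array([0.0, 0.0, 0.0, 1.0])       # model origin
```

Expressing each registration result as one homogeneous matrix makes the composed IMG-to-TOOL relation a single matrix product, which is convenient to refresh every frame as the tracker updates TOOL's pose.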
Step 5: displaying the security situation information of real-time augmented reality navigation with the assistance of the display.
In practical application the augmented reality image is sent to the augmented reality display device, and the situation information navigation image is error-matched and displayed in real time by the attitude-tracking-error visualization method. With the assistance of the situation information navigation system, the user performs social security situation awareness and obtains auxiliary information, judging from the navigation image the relative positions of the real scene and the social security situation or the objects of interest.
In this embodiment the combined tracking system composed of the optical and inertial tracking systems captures the real-time position and attitude of the real scene, the user, and the display device; after the three-dimensional situation information model is matched with the pose information, the social security situation navigation image can be generated. In application this method can assist the user in judging the positions of the social security situation and targets in the real environment, improving the efficiency of social security situation awareness.
Embodiment three
An embodiment of the present invention provides a display device. The device includes an extended Kalman filter, a memory, and a processor; the memory stores an attitude tracking and image superposition computer program; the processor executes the computer program to realize the steps of the method of any one of embodiment one or embodiment two.
Optionally, the device further includes an inertial sensor system and an optical system.
For the specific implementation of this embodiment, refer to embodiment one and embodiment two; it has the corresponding technical effects.
The specific embodiments described above further detail the objectives, technical solutions, and beneficial effects of the present invention. It should be understood that the above are only specific embodiments of the present invention and are not intended to limit its scope of protection. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall be included within the protection scope of the present invention.

Claims (10)

1. An attitude tracking and image superposition method, characterized in that the method includes:
determining the pose information of a display device with a preset extended Kalman filter;
generating a situation image according to the pose information;
determining the image transformation relation between the situation image and the real-scene image according to a predetermined coordinate-system transformation relation between the situation-image model coordinate system and the display-device coordinate system;
superimposing the generated situation image onto the real-scene image of the display device according to the image transformation relation.
2. The method of claim 1, characterized in that determining the pose information of the display device with the extended Kalman filter includes:
acquiring bias values with a preset inertial sensor system;
determining the velocity, three-dimensional attitude, and three-dimensional position of the tracked target with a preset optical system;
determining the pose information with the extended Kalman filter according to the bias values and the velocity, three-dimensional attitude, and three-dimensional position of the target.
3. The method of claim 2, characterized in that determining the pose information with the extended Kalman filter according to the bias values and the velocity, three-dimensional attitude, and three-dimensional position of the target includes:
building the state variable of the extended Kalman filter from the bias values and the velocity, three-dimensional attitude, and three-dimensional position of the target;
in each filtering cycle of the extended Kalman filter, performing an estimation update on the state variable to obtain the pose information.
4. The method of claim 2, characterized in that, before acquiring the bias values with the preset inertial sensor system, the method includes:
determining the bias error of each inertial sensor in the inertial sensor system with a preset finite impulse response filter;
correcting each inertial sensor according to its bias error.
5. The method of claim 2, characterized in that, before determining the image transformation relation between the situation image and the real-scene image according to the predetermined coordinate-system transformation relation between the situation-image model coordinate system and the display-device coordinate system, the method includes:
registering the situation-image model coordinate system, the reference coordinate system of the real-scene image, the optical system coordinate system, and the display-device coordinate system;
determining the coordinate-system transformation relation according to the registration result.
6. The method of claim 5, characterized in that registering the situation-image model coordinate system, the reference coordinate system of the real-scene image, the optical system coordinate system, and the display-device coordinate system includes:
registering the situation-image model coordinate system with the reference coordinate system;
registering the reference coordinate system with the optical system coordinate system;
registering the optical system coordinate system with the display-device coordinate system.
7. The method of claim 6, characterized in that, for any two coordinate systems, the registration includes:
determining the best translation vector of the two coordinate systems;
registering the two coordinate systems according to the best translation vector.
8. The method of claim 7, characterized in that determining the best translation vector of the two coordinate systems includes:
choosing the same number of corresponding points in the two coordinate systems to form two groups of point sets;
calculating the centroids of the two point sets;
constructing the covariance matrix of the two point sets from the centroids;
constructing a symmetric matrix from the covariance matrix;
calculating the eigenvalues and eigenvectors of the symmetric matrix and taking the eigenvector corresponding to the maximum eigenvalue;
determining the best translation vector from the preset conversion formula of the two coordinate systems and the obtained eigenvector.
9. A display device, characterized in that the device includes an extended Kalman filter, a memory, and a processor; the memory stores an attitude tracking and image superposition computer program; the processor executes the computer program to realize the steps of the method of any one of claims 1-8.
10. The device of claim 9, characterized in that the device further includes an inertial sensor system and an optical system.
CN201711204062.4A 2017-11-27 2017-11-27 Attitude Tracking and image superimposing method and display equipment Pending CN108318029A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711204062.4A CN108318029A (en) 2017-11-27 2017-11-27 Attitude Tracking and image superimposing method and display equipment


Publications (1)

Publication Number Publication Date
CN108318029A true CN108318029A (en) 2018-07-24

Family

ID=62893101

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711204062.4A Pending CN108318029A (en) 2017-11-27 2017-11-27 Attitude Tracking and image superimposing method and display equipment

Country Status (1)

Country Link
CN (1) CN108318029A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109272454A (en) * 2018-07-27 2019-01-25 阿里巴巴集团控股有限公司 A kind of the coordinate system calibration method and device of augmented reality equipment
CN109815854A (en) * 2019-01-07 2019-05-28 亮风台(上海)信息科技有限公司 It is a kind of for the method and apparatus of the related information of icon to be presented on a user device
CN110879611A (en) * 2019-11-01 2020-03-13 中国电子科技集团公司电子科学研究院 Unmanned aerial vehicle cluster three-dimensional curve path tracking method and device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030210228A1 (en) * 2000-02-25 2003-11-13 Ebersole John Franklin Augmented reality situational awareness system and method
CN106569044A (en) * 2016-11-02 2017-04-19 西安电子科技大学 Immersive virtual reality system-based electromagnetic spectrum situation observation method
CN106846920A (en) * 2017-01-24 2017-06-13 南京航空航天大学 A kind of blank pipe aid decision-making method based on nature extraction of semantics
CN106909215A (en) * 2016-12-29 2017-06-30 深圳市皓华网络通讯股份有限公司 Based on the fire-fighting operation three-dimensional visualization command system being accurately positioned with augmented reality


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
贺长宇 (He Changyu): "Hybrid pose tracking and visualization in augmented reality surgical navigation", China Doctoral Dissertations Full-text Database *
郭昌达 (Guo Changda): "Research on three-dimensional registration methods for augmented reality", China Master's Theses Full-text Database *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109272454A (en) * 2018-07-27 2019-01-25 阿里巴巴集团控股有限公司 A kind of the coordinate system calibration method and device of augmented reality equipment
WO2020019962A1 (en) * 2018-07-27 2020-01-30 阿里巴巴集团控股有限公司 Coordinate system calibration method and device for augmented reality device
TWI712004B (en) * 2018-07-27 2020-12-01 開曼群島商創新先進技術有限公司 Coordinate system calibration method and device of augmented reality equipment
CN109815854A (en) * 2019-01-07 2019-05-28 亮风台(上海)信息科技有限公司 It is a kind of for the method and apparatus of the related information of icon to be presented on a user device
CN110879611A (en) * 2019-11-01 2020-03-13 中国电子科技集团公司电子科学研究院 Unmanned aerial vehicle cluster three-dimensional curve path tracking method and device

Similar Documents

Publication Publication Date Title
Valenti et al. A linear Kalman filter for MARG orientation estimation using the algebraic quaternion algorithm
CN105953796A (en) Stable motion tracking method and stable motion tracking device based on integration of simple camera and IMU (inertial measurement unit) of smart cellphone
CN109084732A (en) Positioning and air navigation aid, device and processing equipment
Armesto et al. Multi-rate fusion with vision and inertial sensors
CN109141433A (en) A kind of robot indoor locating system and localization method
CN105180937B (en) A kind of MEMS IMU Initial Alignment Methods
CN105931275A (en) Monocular and IMU fused stable motion tracking method and device based on mobile terminal
Kneip et al. Closed-form solution for absolute scale velocity determination combining inertial measurements and a single feature correspondence
Hartmann et al. Indoor 3D position estimation using low-cost inertial sensors and marker-based video-tracking
JP2014089113A (en) Posture estimation device and program
CN108318029A (en) Attitude Tracking and image superimposing method and display equipment
WO2014111159A1 (en) Determining a speed of a multidimensional motion in a global coordinate system
Sun et al. Adaptive sensor data fusion in motion capture
Suh et al. Quaternion-based indirect Kalman filter discarding pitch and roll information contained in magnetic sensors
CN107728182A (en) Flexible more base line measurement method and apparatus based on camera auxiliary
CN109000633A (en) Human body attitude motion capture algorithm design based on isomeric data fusion
CN110455309A (en) The vision inertia odometer based on MSCKF for having line duration calibration
CN107782309A (en) Noninertial system vision and double tops instrument multi tate CKF fusion attitude measurement methods
CN110440797A (en) Vehicle attitude estimation method and system
CN108917755B (en) Imaging seeker line-of-sight angle zero error estimation method and device
CN109764870A (en) Carrier initial heading evaluation method based on transformation estimator modeling scheme
CN108534772A (en) Attitude angle acquisition methods and device
Truppa et al. An innovative sensor fusion algorithm for motion tracking with on-line bias compensation: Application to joint angles estimation in yoga
Llorach et al. Position estimation with a low-cost inertial measurement unit
JP2013185898A (en) State estimation device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20180724
