CN116058829A - System for displaying human lower limb gesture in real time based on IMU - Google Patents


Info

Publication number
CN116058829A
Authority
CN
China
Prior art keywords
imu
coordinate system
lower limb
human body
measurement unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211676106.4A
Other languages
Chinese (zh)
Inventor
刘银华
付铭
杨海强
王晓燕
Current Assignee
Qingdao University
Original Assignee
Qingdao University
Priority date
Filing date
Publication date
Application filed by Qingdao University filed Critical Qingdao University
Priority to CN202211676106.4A priority Critical patent/CN116058829A/en
Publication of CN116058829A publication Critical patent/CN116058829A/en
Pending legal-status Critical Current

Classifications

    • A - HUMAN NECESSITIES
        • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
            • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
                • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
                    • A61B 5/103 - Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
                        • A61B 5/11 - Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
                            • A61B 5/1116 - Determining posture transitions
                            • A61B 5/1126 - Measuring movement of the entire body or parts thereof using a particular sensing technique
                • A61B 2562/00 - Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
                    • A61B 2562/02 - Details of sensors specially adapted for in-vivo measurements
                        • A61B 2562/0219 - Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • G - PHYSICS
        • G01 - MEASURING; TESTING
            • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
                • G01C 21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
                    • G01C 21/10 - Navigation by using measurements of speed or acceleration
                        • G01C 21/12 - Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
                            • G01C 21/16 - Dead reckoning by integrating acceleration or speed, i.e. inertial navigation

Abstract

The invention relates to the technical field of human body posture analysis, in particular to a system for displaying the human lower limb posture in real time based on an IMU. The system comprises inertial measurement units (IMUs) in communication connection with an upper computer. In use, the IMUs are arranged above the two knee joints, above the ankle joints of the lower legs and at the sacrum of the human body to measure the lower limbs. The upper computer receives the information measured by the IMUs, performs calculations on it and displays the corresponding posture in visual form; a carrier coordinate system and a navigation coordinate system are defined on the upper computer. The system can thereby reproduce the human lower limb posture in real time with improved accuracy.

Description

System for displaying human lower limb gesture in real time based on IMU
Technical Field
The invention relates to the technical field of human body posture analysis, in particular to a system for displaying human body lower limb postures in real time based on an IMU.
Background
In fields such as correcting athletes' running posture and online rehabilitation diagnosis, the demand for accurate real-time estimation of the posture of the human lower limb is growing. Most mature schemes on the market are based on visual recognition, which requires demanding hardware and generally suffers from high recognition latency in machine vision. The IMU, by contrast, offers accurate measurement data, small device size and high cost-effectiveness, and is little affected by external interference; it is widely applied in unmanned aerial vehicle attitude recognition and vehicle navigation, but its application in wearable devices is still at a development stage. With a data processing technique based on Kalman filtering, the human lower limb posture can be reproduced in real time more accurately.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a system for displaying the human lower limb posture in real time based on an IMU that can accurately reproduce the human lower limb posture in real time.
The technical solution adopted by the invention to solve the above problem is as follows. The system for displaying the human lower limb posture in real time based on the IMU comprises inertial measurement units (IMUs) in communication connection with an upper computer. In use, the IMUs are respectively arranged above the two knee joints, above the ankle joints of the lower legs and at the sacrum of the human body to measure the lower limbs. The upper computer receives the information measured by the IMUs, performs calculations based on it and displays the corresponding posture in visual form; a carrier coordinate system and a navigation coordinate system are defined on the upper computer.
when the system for displaying the human lower limb gesture in real time based on the IMU operates, the method comprises the following steps:
step one: in the initial state, the human body is in a standing state, the upper computer establishes a corresponding human body model, and synchronous posture display is carried out based on the inertial measurement unit IMU;
step two: the upper computer receives data measured by the inertial measurement unit IMU, and calculates an initial quaternion and a quaternion form attitude matrix;
step three: estimating the human body posture based on a Kalman filtering algorithm;
step four: and synchronous pose display is carried out in the upper computer.
The inertial measurement unit IMU is a nine-axis inertial measurement unit.
The Y axis of the carrier coordinate system of the upper computer points in the direction of advance, the X axis points horizontally to the right, and the Z axis points vertically downward; the navigation coordinate system adopts a north-east coordinate system.
The posture angle is represented by the relative angular position relation between the carrier coordinate system and the navigation coordinate system, and Euler angles are adopted to describe the human posture.
The Euler angles comprise the pitch angle θ, the roll angle γ and the yaw angle ψ. The pitch angle θ is the angle by which the carrier coordinate system rotates about the Y axis, with a range of −90° to +90°; the roll angle γ is the angle by which the carrier coordinate system rotates about the X axis, with a range of −180° to +180°; and the yaw angle ψ is the angle by which the carrier coordinate system rotates about the Z axis, with a range of 0° to 360°.
In the second step, the interconversion between the carrier coordinate system and the navigation coordinate system is expressed by the attitude matrix C_b^n (from the carrier frame b to the navigation frame n), i.e.:

C_b^n =
[ cosψ·cosθ    cosψ·sinθ·sinγ − sinψ·cosγ    cosψ·sinθ·cosγ + sinψ·sinγ ]
[ sinψ·cosθ    sinψ·sinθ·sinγ + cosψ·cosγ    sinψ·sinθ·cosγ − cosψ·sinγ ]
[ −sinθ        cosθ·sinγ                     cosθ·cosγ                  ]

The above matrix is noted as:

C_b^n =
[ C11  C12  C13 ]
[ C21  C22  C23 ]
[ C31  C32  C33 ]
Based on the above attitude matrix, the pitch angle θ, the roll angle γ and the yaw angle ψ are calculated as follows:

γ = tan⁻¹(C32 / C33);
θ = sin⁻¹(−C31);
ψ = tan⁻¹(C21 / C11).
In the second step, the initial quaternion q0, q1, q2, q3 is:

q0 = cos(ψ/2)·cos(θ/2)·cos(γ/2) + sin(ψ/2)·sin(θ/2)·sin(γ/2)
q1 = cos(ψ/2)·cos(θ/2)·sin(γ/2) − sin(ψ/2)·sin(θ/2)·cos(γ/2)
q2 = cos(ψ/2)·sin(θ/2)·cos(γ/2) + sin(ψ/2)·cos(θ/2)·sin(γ/2)
q3 = sin(ψ/2)·cos(θ/2)·cos(γ/2) − cos(ψ/2)·sin(θ/2)·sin(γ/2)
The attitude matrix represented by the quaternion is:

C_b^n =
[ q0²+q1²−q2²−q3²   2(q1q2−q0q3)       2(q1q3+q0q2)     ]
[ 2(q1q2+q0q3)      q0²−q1²+q2²−q3²    2(q2q3−q0q1)     ]
[ 2(q1q3−q0q2)      2(q2q3+q0q1)       q0²−q1²−q2²+q3²  ]
the third step comprises the following substeps:
3-1: calculating an updated form of the quaternion based on the differential form of the quaternion;
3-2: obtaining the state matrix F_k based on the updated form of the quaternion;
3-3: calculating a Kalman gain M;
3-4: calculating the attitude quaternion from the data measured by the inertial measurement unit IMU.
The fourth step comprises the following substeps:
4-1: calculating the moving speed and the position of the human body according to the estimated human body posture;
4-2: and displaying the bound model in the front-end webpage in real time.
Compared with the prior art, the invention has the following beneficial effects:
the invention provides a system for displaying the human lower limb gesture in real time based on an IMU, which realizes reliable and rapid human lower limb motion capture by reproducing the human lower limb gesture in real time through a webpage end.
Drawings
Fig. 1 is a schematic diagram of the positional relationship between an inertial measurement unit IMU and a lower limb of a human body according to the present invention.
Fig. 2 is a schematic diagram of the moving process of the present invention.
Detailed Description
Embodiments of the invention are further described below with reference to the accompanying drawings:
examples
Referring to figs. 1-2, the system for displaying the human lower limb posture in real time based on the IMU comprises inertial measurement units (IMUs) in communication connection with an upper computer. In use, the IMUs are respectively arranged above the two knee joints, above the ankle joints of the lower legs and at the sacrum of the human body to measure the lower limbs. The upper computer receives the information measured by the IMUs, performs calculations based on it and displays the corresponding posture in visual form; a carrier coordinate system and a navigation coordinate system are defined on the upper computer. The inertial measurement unit IMU in this embodiment is a nine-axis inertial measurement unit. As shown in FIG. 2, IMU-1 is located at the sacrum, IMU-2 and IMU-3 above the two knee joints, and IMU-4 and IMU-5 above the ankle joints of the lower legs.
The Y axis of the carrier coordinate system (b frame) of the upper computer points in the direction of advance, the X axis points horizontally to the right, and the Z axis points vertically downward; the navigation coordinate system (n frame) adopts a north-east coordinate system.
The posture angle is represented by the relative angular position relation between the carrier coordinate system and the navigation coordinate system, and Euler angles are adopted to describe the human posture.
The Euler angles comprise the pitch angle θ, the roll angle γ and the yaw angle ψ. The pitch angle θ is the angle by which the carrier coordinate system rotates about the Y axis, with a range of −90° to +90°; the roll angle γ is the angle by which the carrier coordinate system rotates about the X axis, with a range of −180° to +180°; and the yaw angle ψ is the angle by which the carrier coordinate system rotates about the Z axis, with a range of 0° to 360°.
When the system for displaying the human lower limb gesture in real time based on the IMU operates, the method comprises the following steps:
step one: in the initial state, the human body is in a straight standing state, and when the human body model in the upper computer is initialized, the human body model is always in the straight standing state, and when the initialization is finished, the angle recognized at the moment is zeroized, namely, the initialized human body posture is set to be the straight standing posture. And binding each inertial measurement unit IMU with a corresponding model joint to realize synchronization of the model and the human body.
Step two: the upper computer receives data measured by the inertial measurement unit IMU, and calculates an initial quaternion and a quaternion form attitude matrix;
In the second step, the interconversion between the carrier coordinate system and the navigation coordinate system is expressed by the attitude matrix C_b^n, i.e.:

C_b^n =
[ cosψ·cosθ    cosψ·sinθ·sinγ − sinψ·cosγ    cosψ·sinθ·cosγ + sinψ·sinγ ]
[ sinψ·cosθ    sinψ·sinθ·sinγ + cosψ·cosγ    sinψ·sinθ·cosγ − cosψ·sinγ ]
[ −sinθ        cosθ·sinγ                     cosθ·cosγ                  ]

The above matrix is noted as:

C_b^n =
[ C11  C12  C13 ]
[ C21  C22  C23 ]
[ C31  C32  C33 ]
Based on the above attitude matrix, the pitch angle θ, the roll angle γ and the yaw angle ψ are calculated as follows:

γ = tan⁻¹(C32 / C33);
θ = sin⁻¹(−C31);
ψ = tan⁻¹(C21 / C11).
In the second step, the initial quaternion q0, q1, q2, q3 is:

q0 = cos(ψ/2)·cos(θ/2)·cos(γ/2) + sin(ψ/2)·sin(θ/2)·sin(γ/2)
q1 = cos(ψ/2)·cos(θ/2)·sin(γ/2) − sin(ψ/2)·sin(θ/2)·cos(γ/2)
q2 = cos(ψ/2)·sin(θ/2)·cos(γ/2) + sin(ψ/2)·cos(θ/2)·sin(γ/2)
q3 = sin(ψ/2)·cos(θ/2)·cos(γ/2) − cos(ψ/2)·sin(θ/2)·sin(γ/2)

The attitude matrix represented by the quaternion is:

C_b^n =
[ q0²+q1²−q2²−q3²   2(q1q2−q0q3)       2(q1q3+q0q2)     ]
[ 2(q1q2+q0q3)      q0²−q1²+q2²−q3²    2(q2q3−q0q1)     ]
[ 2(q1q3−q0q2)      2(q2q3+q0q1)       q0²−q1²−q2²+q3²  ]
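The initial-quaternion construction and the quaternion form of the attitude matrix can be sketched as follows. This is an illustration under the same assumed Z-Y-X rotation sequence; the function names are not from the patent:

```python
import numpy as np

def initial_quaternion(theta, gamma, psi):
    # Initial quaternion from pitch theta, roll gamma, yaw psi (radians),
    # scalar-first convention, Z(psi)-Y(theta)-X(gamma) rotation sequence.
    cp, sp = np.cos(psi / 2), np.sin(psi / 2)
    ct, st = np.cos(theta / 2), np.sin(theta / 2)
    cg, sg = np.cos(gamma / 2), np.sin(gamma / 2)
    q0 = cp*ct*cg + sp*st*sg
    q1 = cp*ct*sg - sp*st*cg
    q2 = cp*st*cg + sp*ct*sg
    q3 = sp*ct*cg - cp*st*sg
    return np.array([q0, q1, q2, q3])

def attitude_matrix_from_quaternion(q):
    # C_b^n expressed with the quaternion components, as in the text.
    q0, q1, q2, q3 = q
    return np.array([
        [q0*q0+q1*q1-q2*q2-q3*q3, 2*(q1*q2-q0*q3),         2*(q1*q3+q0*q2)],
        [2*(q1*q2+q0*q3),         q0*q0-q1*q1+q2*q2-q3*q3, 2*(q2*q3-q0*q1)],
        [2*(q1*q3-q0*q2),         2*(q2*q3+q0*q1),         q0*q0-q1*q1-q2*q2+q3*q3],
    ])
```

Consistency can be checked by verifying that the element C31 of the quaternion-built matrix equals −sinθ, matching the Euler extraction formulas above.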
Step three: estimating the human body posture based on a Kalman filtering algorithm. Based on the initialization position, the initial state estimate is set to the quaternion x̂0 = [q0, q1, q2, q3]ᵀ.
The third step comprises the following substeps:
3-1: calculating an updated form of the quaternion based on the differential form of the quaternion;
Specifically, the differential equation of the quaternion is:

q̇(t) = ½ · Ω(ω) · q(t), with

Ω(ω) =
[ 0    −ωx  −ωy  −ωz ]
[ ωx    0    ωz  −ωy ]
[ ωy   −ωz   0    ωx ]
[ ωz    ωy  −ωx   0  ]

where ωx, ωy and ωz are the three-axis angular velocities measured by the gyroscope in the IMU.
Solving the differential equation of the quaternion with the fourth-order Runge-Kutta method yields the updated form of the quaternion:

k1 = ½ Ω(ω) q(t)
k2 = ½ Ω(ω) (q(t) + (Δt/2)·k1)
k3 = ½ Ω(ω) (q(t) + (Δt/2)·k2)
k4 = ½ Ω(ω) (q(t) + Δt·k3)
q(t+Δt) = q(t) + (Δt/6)·(k1 + 2k2 + 2k3 + k4)

Here t denotes the current time, Δt the sampling interval, and t+Δt the next time instant.
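A minimal sketch of a fourth-order Runge-Kutta step for the quaternion kinematic equation q̇ = ½Ωq, assuming the angular rate is constant over one sampling interval (function names are illustrative, not from the patent):

```python
import numpy as np

def omega_matrix(w):
    # 4x4 matrix Omega(w) of the quaternion kinematic equation
    # q_dot = 0.5 * Omega(w) @ q, with w = (wx, wy, wz) in rad/s.
    wx, wy, wz = w
    return np.array([
        [0.0, -wx, -wy, -wz],
        [wx,  0.0,  wz, -wy],
        [wy, -wz,  0.0,  wx],
        [wz,  wy, -wx,  0.0],
    ])

def rk4_quaternion_step(q, w, dt):
    # One RK4 step, assuming w is constant over dt; the result is
    # renormalized to keep q a unit quaternion.
    f = lambda qq: 0.5 * omega_matrix(w) @ qq
    k1 = f(q)
    k2 = f(q + 0.5 * dt * k1)
    k3 = f(q + 0.5 * dt * k2)
    k4 = f(q + dt * k3)
    q_next = q + (dt / 6.0) * (k1 + 2*k2 + 2*k3 + k4)
    return q_next / np.linalg.norm(q_next)
```

For a pure rotation about the body Z axis at rate ωz, this integrates from the identity quaternion toward (cos(ωz·t/2), 0, 0, sin(ωz·t/2)), which is a convenient sanity check.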
3-2: obtaining the state matrix F_k based on the updated form of the quaternion. Specifically, discretizing the quaternion differential equation over one sampling interval gives

F_k = E + (Δt/2)·Ω(ω_k)

(E is an identity matrix), so that

q_k = F_k · q_{k−1},

where k denotes the k-th time instant.
3-3: calculating the Kalman gain M.
The prior estimate and its covariance take the form:

x̂_k⁻ = F_k · x̂_{k−1}
P_k⁻ = F_k · P_{k−1} · F_kᵀ + Q_k

where W_k is the process noise (set to a Gaussian distribution), Q_k is the process noise covariance matrix, P is the error covariance matrix, and H is the observation matrix.
The gain of the Kalman filtering algorithm is:

M_k = P_k⁻ · H_kᵀ · (H_k · P_k⁻ · H_kᵀ + R_k)⁻¹

where M is the Kalman gain, H_k is the observation matrix, set to an identity matrix, and R_k is the measurement noise covariance matrix.
3-4: calculating the attitude quaternion obtained from the data measured by the inertial measurement unit IMU.
Through the system observation Z_k (i.e. the quaternion obtained from the IMU measurement data), the final estimate is obtained:

x̂_k = x̂_k⁻ + M_k · (Z_k − H_k · x̂_k⁻)

which is the attitude quaternion derived from the data measured by the IMU. Z_k denotes the observation vector, which is known from the sensor, i.e. the IMU, measurements.
The error covariance matrix is updated as:

P_k = (I − M_k · H_k) · P_k⁻

where I is an identity matrix.
Step four: and synchronous pose display is carried out in the upper computer.
The fourth step comprises the following substeps:
4-1: calculating the moving speed and position of the human body from the estimation result. Specifically, the position of an IMU can be obtained by integrating the three-axis acceleration it measures. The gravity component is first removed from the measured acceleration; the accelerations at the current time t and at the next time t+1 are then averaged, and this mean value is taken as the average acceleration ā over the interval Δt. With the average acceleration and the velocity and position at the current time, the velocity and position at time t+1 are obtained approximately as:

V_{t+1} = V_t + ā·Δt
P_{t+1} = P_t + V_t·Δt + ½·ā·Δt²

where P is the IMU position matrix, V is the IMU velocity matrix, Δt is the time difference between time t+1 and time t, and ā is the average acceleration. Acceleration and velocity are directional, i.e. three-dimensional quantities in matrix form. Integrating the acceleration and velocity yields the three-dimensional position, and the quaternion yields the joint attitude; together they give the pose, on which the model in the upper computer is synchronized.
4-2: and displaying the bound model in the front-end webpage in real time.
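The velocity/position update of substep 4-1 can be sketched as a trapezoidal dead-reckoning step. This is an illustration only; it assumes the accelerations are already rotated into the navigation frame (Z down), so the true acceleration is the specific force plus the gravity vector g = (0, 0, 9.81), and the function name is not from the patent:

```python
import numpy as np

def integrate_position(P, V, f_t, f_t1, dt, g=np.array([0.0, 0.0, 9.81])):
    # P, V: position and velocity at time t (3-vectors, navigation frame).
    # f_t, f_t1: navigation-frame specific force at times t and t+1;
    # removing gravity gives the true acceleration a = f + g (Z-down frame).
    a_bar = 0.5 * ((f_t + g) + (f_t1 + g))   # average acceleration over dt
    P_next = P + V * dt + 0.5 * a_bar * dt**2
    V_next = V + a_bar * dt
    return P_next, V_next
```

At rest the specific force is (0, 0, −9.81), the gravity correction cancels it, and position and velocity stay put; a constant horizontal specific force integrates to the familiar ½at² displacement.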

Claims (8)

1. The system for displaying the human lower limb posture in real time based on the IMU, characterized by comprising inertial measurement units IMU in communication connection with an upper computer, wherein the inertial measurement units IMU are, in use, respectively arranged above the two knee joints, above the ankle joints of the lower legs and at the sacrum of the human body to measure the human lower limbs, the upper computer is used for receiving the information measured by the inertial measurement units IMU, performing calculations based on the measured information and displaying the corresponding posture in visual form, and a carrier coordinate system and a navigation coordinate system are defined on the upper computer;
when the system for displaying the human lower limb gesture in real time based on the IMU operates, the method comprises the following steps:
step one: in the initial state, the human body is in a standing state, the upper computer establishes a corresponding human body model, and synchronous posture display is carried out based on the inertial measurement unit IMU;
step two: the upper computer receives data measured by the inertial measurement unit IMU, and calculates an initial quaternion and a quaternion form attitude matrix;
step three: estimating the human body posture based on a Kalman filtering algorithm;
step four: and synchronous pose display is carried out in the upper computer.
2. The IMU-based system for real-time display of human lower limb gestures of claim 1, wherein the inertial measurement unit IMU is a nine-axis inertial measurement unit.
3. The IMU-based real-time human lower limb posture display system of claim 1, wherein the Y axis of the carrier coordinate system of the upper computer points in the forward direction, the X axis points horizontally to the right, and the Z axis points vertically downward; the navigation coordinate system adopts a north-east coordinate system.
4. The IMU-based real-time human lower limb posture display system of claim 3, wherein the posture angle is represented by the relative angular position relation between the carrier coordinate system and the navigation coordinate system, and Euler angles are adopted to describe the human posture;
the Euler angles comprise the pitch angle θ, the roll angle γ and the yaw angle ψ, wherein the pitch angle θ is the angle by which the carrier coordinate system rotates about the Y axis, with a range of −90° to +90°; the roll angle γ is the angle by which the carrier coordinate system rotates about the X axis, with a range of −180° to +180°; and the yaw angle ψ is the angle by which the carrier coordinate system rotates about the Z axis, with a range of 0° to 360°.
5. The IMU-based real-time human lower limb posture display system of claim 4, wherein in said step two the interconversion between the carrier coordinate system and the navigation coordinate system is expressed by the attitude matrix C_b^n, i.e.:

C_b^n =
[ cosψ·cosθ    cosψ·sinθ·sinγ − sinψ·cosγ    cosψ·sinθ·cosγ + sinψ·sinγ ]
[ sinψ·cosθ    sinψ·sinθ·sinγ + cosψ·cosγ    sinψ·sinθ·cosγ − cosψ·sinγ ]
[ −sinθ        cosθ·sinγ                     cosθ·cosγ                  ]

the above matrix being noted as:

C_b^n =
[ C11  C12  C13 ]
[ C21  C22  C23 ]
[ C31  C32  C33 ]
Based on the above attitude matrix, the pitch angle θ, the roll angle γ and the yaw angle ψ are calculated as follows:

γ = tan⁻¹(C32 / C33);
θ = sin⁻¹(−C31);
ψ = tan⁻¹(C21 / C11).
6. The IMU-based real-time human lower limb posture display system of claim 5, wherein in said step two the initial quaternion q0, q1, q2, q3 is:

q0 = cos(ψ/2)·cos(θ/2)·cos(γ/2) + sin(ψ/2)·sin(θ/2)·sin(γ/2)
q1 = cos(ψ/2)·cos(θ/2)·sin(γ/2) − sin(ψ/2)·sin(θ/2)·cos(γ/2)
q2 = cos(ψ/2)·sin(θ/2)·cos(γ/2) + sin(ψ/2)·cos(θ/2)·sin(γ/2)
q3 = sin(ψ/2)·cos(θ/2)·cos(γ/2) − cos(ψ/2)·sin(θ/2)·sin(γ/2)

and the attitude matrix represented by the quaternion is:

C_b^n =
[ q0²+q1²−q2²−q3²   2(q1q2−q0q3)       2(q1q3+q0q2)     ]
[ 2(q1q2+q0q3)      q0²−q1²+q2²−q3²    2(q2q3−q0q1)     ]
[ 2(q1q3−q0q2)      2(q2q3+q0q1)       q0²−q1²−q2²+q3²  ]
7. the IMU-based real-time human lower limb posture display system of claim 6, wherein said step three comprises the substeps of:
3-1: calculating an updated form of the quaternion based on the differential form of the quaternion;
3-2: obtaining a state matrix F_k based on the updated form of the quaternion;
3-3: calculating a Kalman gain M;
3-4: and calculating an attitude quaternion obtained from data measured by the Inertial Measurement Unit (IMU).
8. The IMU-based real-time human lower limb posture display system of claim 7, wherein said step four comprises the sub-steps of:
4-1: calculating the moving speed and the position of the human body according to the estimated human body posture;
4-2: and displaying the bound model in the front-end webpage in real time.
CN202211676106.4A 2022-12-26 2022-12-26 System for displaying human lower limb gesture in real time based on IMU Pending CN116058829A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211676106.4A CN116058829A (en) 2022-12-26 2022-12-26 System for displaying human lower limb gesture in real time based on IMU


Publications (1)

Publication Number Publication Date
CN116058829A true CN116058829A (en) 2023-05-05

Family

ID=86172622

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211676106.4A Pending CN116058829A (en) 2022-12-26 2022-12-26 System for displaying human lower limb gesture in real time based on IMU

Country Status (1)

Country Link
CN (1) CN116058829A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106500695A (en) * 2017-01-05 2017-03-15 大连理工大学 A kind of human posture recognition method based on adaptive extended kalman filtering
KR101751760B1 (en) * 2016-10-31 2017-06-28 주식회사 모셔넥스 Method for estimating gait parameter form low limb joint angles
CN107478223A (en) * 2016-06-08 2017-12-15 南京理工大学 A kind of human body attitude calculation method based on quaternary number and Kalman filtering
CN110705491A (en) * 2019-10-10 2020-01-17 青岛大学 Method and system for auxiliary operation of iron tower of electric power worker
CN112957033A (en) * 2021-02-01 2021-06-15 山东大学 Human body real-time indoor positioning and motion posture capturing method and system in man-machine cooperation
CN113793360A (en) * 2021-08-31 2021-12-14 大连理工大学 Three-dimensional human body reconstruction method based on inertial sensing technology
CN113892942A (en) * 2021-08-24 2022-01-07 重庆大学 Wearing equipment for tracking motion of lower limbs of human body in real time
CN114391831A (en) * 2022-01-17 2022-04-26 西北工业大学 Hip-knee joint angle and dynamic force line resolving system
CN114623826A (en) * 2022-02-25 2022-06-14 南京航空航天大学 Pedestrian inertial navigation positioning method based on human body lower limb DH model



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination