CN110567484A - method and device for calibrating IMU and rigid body posture and readable storage medium - Google Patents


Info

Publication number
CN110567484A
Authority
CN
China
Prior art keywords
imu
rigid body
data
pose
frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910674584.3A
Other languages
Chinese (zh)
Other versions
CN110567484B (en)
Inventor
吴迪云
许秋子
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Ruili Visual Multimedia Technology Co Ltd
Shenzhen Realis Multimedia Technology Co Ltd
Original Assignee
Shenzhen Ruili Visual Multimedia Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Ruili Visual Multimedia Technology Co Ltd
Priority to CN201910674584.3A
Publication of CN110567484A
Application granted
Publication of CN110567484B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00 Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/10012 Stereo images

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Automation & Control Theory (AREA)
  • Studio Devices (AREA)

Abstract

The application provides a method, a device and a readable storage medium for calibrating the poses of an IMU (Inertial Measurement Unit) and a rigid body, which fuse the pose of the rigid body with that of the IMU so as to improve the calculation accuracy of the rigid body pose. The method comprises the following steps: fixing the positional relation between the IMU and the rigid body; acquiring the original pose data of the IMU and the original pose data of the rigid body; preprocessing the two sets of original pose data to eliminate invalid data, the invalid data being pose data recorded when the IMU and the rigid body are not in motion; calculating, from the preprocessed pose data, the rotation matrix from IMU coordinates to rigid body coordinates and the IMU-to-world pose data; and converting the pose of the IMU into the pose of the rigid body according to the calculated data, thereby obtaining the pose of the rigid body.

Description

Method and device for calibrating IMU and rigid body posture and readable storage medium
Technical Field
The present application relates to the field of measurement, and in particular to a method and device for calibrating the poses of an IMU and a rigid body, and a readable storage medium.
Background
The present application concerns the application of optical motion capture in the fields of Virtual Reality (VR) and Mixed Reality (MR): the spatial motion of an object is captured by identifying and tracking its trajectory, which suits application scenarios requiring human-computer interaction.
Optical motion capture is realized by optical rigid-body pose tracking. Taking the large-space VR equipment of OptiTrack and Vicon as examples, the positions of reflective balls in space are captured from the viewpoints of multiple infrared cameras; once the reflective balls are fixed on a rigid body, the pose of the rigid body can be calculated from the positional relation of the balls, and motion capture is thereby completed.
The accuracy of motion capture is inseparable from accurate calculation of the rigid body pose, but in practical applications the calculated pose still contains large errors. How to improve the accuracy of rigid body pose calculation is therefore a key link in realizing high-accuracy motion capture.
Disclosure of Invention
The application provides a method, a device and a readable storage medium for calibrating the poses of an Inertial Measurement Unit (IMU) and a rigid body, which fuse the pose of the rigid body with that of the IMU so as to improve the calculation accuracy of the rigid body pose.
In a first aspect, the present application provides a method for calibrating IMU and rigid body pose, the method comprising:
fixing the positional relation between the IMU and the rigid body;
acquiring the original pose data of the IMU and the original pose data of the rigid body;
preprocessing the original pose data of the IMU and of the rigid body, the preprocessing eliminating invalid data, i.e. pose data recorded when the IMU and the rigid body are not in motion;
calculating, from the preprocessed pose data, the rotation matrix from IMU coordinates to rigid body coordinates and the IMU-to-world pose data;
and converting the pose of the IMU into the pose of the rigid body according to the calculated data, thereby obtaining the pose of the rigid body.
With reference to the first aspect of the present application, in a first possible implementation manner of the first aspect, calculating the rotation matrix from IMU coordinates to rigid body coordinates and the IMU-to-world pose data from the preprocessed pose data includes:
extracting the frame-0 IMU pose data and the frame-n IMU pose data from the preprocessed IMU pose data, and extracting the frame-0 and frame-n rigid-body-to-world pose data from the preprocessed rigid body pose data;
calculating the rotation matrix R_i^r of the IMU pose data from IMU coordinates to rigid body coordinates, and the frame-0 IMU-to-world pose data [R_i^w]_0, according to a relational equation of the form:

[R_r^w]_n = [R_i^w]_0 · ([R_i]_0)^(-1) · [R_i]_n · (R_i^r)^(-1)

where [R_r^w]_n denotes the frame-n rigid-body-to-world pose data, R_i^r the rotation matrix of the IMU pose data from IMU coordinates to rigid body coordinates, [R_i]_n the frame-n IMU pose data, [R_r^w]_0 the frame-0 rigid-body-to-world pose data, [R_i]_0 the frame-0 IMU pose data, [R_i^w]_0 the frame-0 IMU-to-world pose data, ([R_i]_0)^(-1) · [R_i]_n the frame-n IMU pose relative to the frame-0 IMU pose, and E the identity matrix appearing in the orthonormality constraint R_i^r · (R_i^r)^T = E.
With reference to the first possible implementation manner of the first aspect, in a second possible implementation manner of the first aspect, calculating the rotation matrix of the IMU pose data from IMU coordinates to rigid body coordinates and the frame-0 IMU-to-world pose data according to the relational equation includes:
substituting the frame-0 IMU pose data, the frame-n IMU pose data, the frame-0 rigid-body-to-world pose data and the frame-n rigid-body-to-world pose data into the relational equation, and obtaining R_i^r and [R_i^w]_0 by the least squares method, i.e. by minimizing the sum of squared differences between the two sides of the relational equation. The least squares cost Q takes the form:

Q = Σ_n ‖ [R_r^w]_n - [R_i^w]_0 · ([R_i]_0)^(-1) · [R_i]_n · (R_i^r)^(-1) ‖²
With reference to the first or second possible implementation manner of the first aspect, in a third possible implementation manner of the first aspect, converting the pose of the IMU into the pose of the rigid body according to the calculated data includes:
converting the pose of the IMU into the pose of the rigid body according to a pose conversion formula of the form:

[R_r^w]_n = [R_i^w]_0 · ([R_i]_0)^(-1) · [R_i]_n · (R_i^r)^(-1)

where (R_i^r)^(-1) is the inverse of the rotation matrix of the IMU pose data from IMU coordinates to rigid body coordinates, and [R_r^w]_n is the converted rigid body pose data.
With reference to the first possible implementation manner of the first aspect, in a fourth possible implementation manner of the first aspect, the frame-n IMU pose data and the frame-n rigid-body-to-world pose data each exhibit a change of at least 90 degrees in pitch angle, heading angle and roll angle.
With reference to the first possible implementation manner of the first aspect, in a fifth possible implementation manner of the first aspect, when the original pose data of the IMU and of the rigid body are each represented by quaternions, the frame-n IMU pose data and the frame-n rigid-body-to-world pose data comprise 8x sets of data, where x is a positive integer.
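Before quaternion-form pose samples can be substituted into the relational equation, each sample must be converted to a rotation matrix. Below is a minimal pure-Python sketch of that standard conversion; the function name and the (w, x, y, z) component ordering are assumptions for illustration, not specified by the patent:

```python
def quat_to_matrix(q):
    """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix."""
    w, x, y, z = q
    return [
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ]

# The identity quaternion maps to the identity matrix.
print(quat_to_matrix((1.0, 0.0, 0.0, 0.0)))
# -> [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
```

The formula assumes the quaternion is normalized; raw IMU output would typically be normalized first.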
With reference to the first aspect of the present application, in a sixth possible implementation manner of the first aspect, preprocessing the original pose data of the IMU and of the rigid body includes:
calculating, with a distance formula, the distance between two frames of data (adjacent frames and/or frames separated by an interval) in the original pose data of the IMU and of the rigid body, the distance formula being:

d = sqrt((x_m - x_n)^2 + (y_m - y_n)^2 + (z_m - z_n)^2)

where x_m, y_m and z_m are the three-axis coordinates of the frame-m IMU or rigid body pose data, x_n, y_n and z_n are the three-axis coordinates of the frame-n IMU or rigid body pose data, and d is the distance between the two;
and, when d is smaller than a distance threshold, judging that the IMU or the rigid body is not in motion, and rejecting the pose data not in a motion state to complete the preprocessing.
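The preprocessing described above can be sketched as follows. This is an illustrative filter over adjacent frames only, with positions as (x, y, z) tuples and a hypothetical threshold value; these specifics are assumptions, not taken from the patent:

```python
import math

def is_stationary(p_m, p_n, threshold=1e-3):
    # d = sqrt((x_m - x_n)^2 + (y_m - y_n)^2 + (z_m - z_n)^2)
    d = math.sqrt(sum((a - b) ** 2 for a, b in zip(p_m, p_n)))
    return d < threshold

def preprocess(frames, threshold=1e-3):
    """Keep frame 0; drop any frame whose distance to the previously kept
    frame falls below the threshold (i.e. the device did not move)."""
    if not frames:
        return []
    kept = [frames[0]]
    for frame in frames[1:]:
        if not is_stationary(kept[-1], frame, threshold):
            kept.append(frame)
    return kept

frames = [(0.0, 0.0, 0.0), (0.0, 0.0, 0.0), (0.1, 0.0, 0.0), (0.1, 0.0, 0.0)]
print(preprocess(frames))  # -> [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0)]
```

A fuller implementation would also compare frames separated by an interval, as the patent's "adjacent frames and/or interval frames" wording suggests.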
In a second aspect, the present application provides a device for calibrating IMU and rigid body pose, the device comprising:
a fixing unit, configured to fix the positional relation between the IMU and the rigid body;
an acquisition unit, configured to acquire the original pose data of the IMU and the original pose data of the rigid body;
a preprocessing unit, configured to preprocess the original pose data of the IMU and of the rigid body, the preprocessing eliminating invalid data, i.e. pose data recorded when the IMU and the rigid body are not in motion;
a calculation unit, configured to calculate, from the preprocessed pose data, the rotation matrix from IMU coordinates to rigid body coordinates and the IMU-to-world pose data;
and a conversion unit, configured to convert the pose of the IMU into the pose of the rigid body according to the calculated data and obtain the pose of the rigid body.
With reference to the second aspect of the present application, in a first possible implementation manner of the second aspect, the calculation unit is specifically configured to:
extract the frame-0 IMU pose data and the frame-n IMU pose data from the preprocessed IMU pose data, and extract the frame-0 and frame-n rigid-body-to-world pose data from the preprocessed rigid body pose data;
calculate the rotation matrix R_i^r of the IMU pose data from IMU coordinates to rigid body coordinates, and the frame-0 IMU-to-world pose data [R_i^w]_0, according to a relational equation of the form:

[R_r^w]_n = [R_i^w]_0 · ([R_i]_0)^(-1) · [R_i]_n · (R_i^r)^(-1)

where [R_r^w]_n denotes the frame-n rigid-body-to-world pose data, R_i^r the rotation matrix of the IMU pose data from IMU coordinates to rigid body coordinates, [R_i]_n the frame-n IMU pose data, [R_r^w]_0 the frame-0 rigid-body-to-world pose data, [R_i]_0 the frame-0 IMU pose data, [R_i^w]_0 the frame-0 IMU-to-world pose data, ([R_i]_0)^(-1) · [R_i]_n the frame-n IMU pose relative to the frame-0 IMU pose, and E the identity matrix appearing in the orthonormality constraint R_i^r · (R_i^r)^T = E.
With reference to the first possible implementation manner of the second aspect, in a second possible implementation manner of the second aspect, the calculation unit is specifically configured to:
substitute the frame-0 IMU pose data, the frame-n IMU pose data, the frame-0 rigid-body-to-world pose data and the frame-n rigid-body-to-world pose data into the relational equation, and obtain R_i^r and [R_i^w]_0 by the least squares method, i.e. by minimizing the sum of squared differences between the two sides of the relational equation, the least squares cost Q taking the form:

Q = Σ_n ‖ [R_r^w]_n - [R_i^w]_0 · ([R_i]_0)^(-1) · [R_i]_n · (R_i^r)^(-1) ‖²
With reference to the first or second possible implementation manner of the second aspect, in a third possible implementation manner of the second aspect, the conversion unit is specifically configured to:
convert the pose of the IMU into the pose of the rigid body according to a pose conversion formula of the form:

[R_r^w]_n = [R_i^w]_0 · ([R_i]_0)^(-1) · [R_i]_n · (R_i^r)^(-1)

where (R_i^r)^(-1) is the inverse of the rotation matrix of the IMU pose data from IMU coordinates to rigid body coordinates, and [R_r^w]_n is the converted rigid body pose data.
With reference to the first possible implementation manner of the second aspect, in a fourth possible implementation manner of the second aspect, the frame-n IMU pose data and the frame-n rigid-body-to-world pose data each exhibit a change of at least 90 degrees in pitch angle, heading angle and roll angle.
With reference to the first possible implementation manner of the second aspect, in a fifth possible implementation manner of the second aspect, when the original pose data of the IMU and of the rigid body are each represented by quaternions, the frame-n IMU pose data and the frame-n rigid-body-to-world pose data comprise 8x sets of data, where x is a positive integer.
With reference to the second aspect of the present application, in a sixth possible implementation manner of the second aspect, the preprocessing unit is specifically configured to:
calculate, with a distance formula, the distance between two frames of data (adjacent frames and/or frames separated by an interval) in the original pose data of the IMU and of the rigid body, the distance formula being:

d = sqrt((x_m - x_n)^2 + (y_m - y_n)^2 + (z_m - z_n)^2)

where x_m, y_m and z_m are the three-axis coordinates of the frame-m IMU or rigid body pose data, x_n, y_n and z_n are the three-axis coordinates of the frame-n IMU or rigid body pose data, and d is the distance between the two;
and, when d is smaller than a distance threshold, judge that the IMU or the rigid body is not in motion, and reject the pose data not in a motion state to complete the preprocessing.
In a third aspect, the present application provides a device for calibrating IMU and rigid body pose, the device comprising a memory, a processor, and a program for calibrating IMU and rigid body pose stored in the memory and executable on the processor; when executed by the processor, the program implements the method provided in the first aspect of the present application or in any possible implementation manner of the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium having stored thereon a program for calibrating IMU and rigid body pose which, when executed by a processor, implements the method provided in the first aspect of the present application or in any possible implementation manner of the first aspect.
As can be seen from the above, the technical solution has the following advantages: after the positional relation between the IMU and the rigid body is fixed, the original pose data measured by each of them are acquired; the two sets of original pose data are then preprocessed to eliminate invalid data, completing the data cleaning and improving the validity of the pose data; the rotation matrix from IMU coordinates to rigid body coordinates and the IMU-to-world pose data are calculated from the preprocessed pose data; and the pose of the IMU is converted into the pose of the rigid body according to these two calculated quantities, obtaining the pose of the rigid body. Fusion calibration of the IMU pose and the rigid body pose is thereby achieved, the accuracy of rigid body pose calculation is improved, and high-accuracy motion capture is promoted.
Drawings
FIG. 1 is a schematic flow chart of the method for calibrating IMU and rigid body pose according to the present application;
FIG. 2 is another schematic flow chart of the method for calibrating IMU and rigid body pose according to the present application;
FIG. 3 is a further schematic flow chart of the method for calibrating IMU and rigid body pose according to the present application;
FIG. 4 is a schematic structural diagram of the apparatus for calibrating the pose of an IMU and a rigid body according to the present application;
FIG. 5 is a schematic structural diagram of an apparatus for calibrating the pose of an IMU with a rigid body according to the present application.
Detailed Description
The application provides a method, a device and a readable storage medium for calibrating the poses of an IMU (Inertial Measurement Unit) and a rigid body, which fuse the pose of the rigid body with that of the IMU so as to improve the calculation accuracy of the rigid body pose.
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second", and the like in the description, the claims and the drawings of the present application are used to distinguish between similar elements and not necessarily to describe a particular sequential or chronological order. It will be appreciated that data so used may be interchanged under appropriate circumstances, so that the embodiments described herein may be practiced in orders other than those illustrated or described. Moreover, the terms "comprises", "comprising", and any variation thereof are intended to cover a non-exclusive inclusion, so that a process, method, system, article or apparatus comprising a list of steps or modules is not necessarily limited to the steps or modules explicitly listed, but may include other steps or modules not expressly listed or inherent to such a process, method, article or apparatus. The naming or numbering of steps in the present application does not mean that the steps must be executed in the chronological or logical order indicated; the named or numbered steps may be executed in a modified order depending on the technical purpose, as long as the same or similar technical effect is achieved.
The division of modules presented in this application is a logical division; in practical applications there may be other divisions, for example multiple modules may be combined or integrated into another system, or some features may be omitted or not executed. The shown or discussed coupling, direct coupling or communication connections between modules may be through interfaces, and indirect coupling or communication connections between modules may be electrical or of another similar form, which is not limited in this application. Modules or sub-modules described as separate components may or may not be physically separate, may or may not be physical modules, and may be distributed across multiple circuit modules; some or all of them may be selected according to actual needs to achieve the purpose of the solution of the present application.
First, before the present application is described, the camera device, the IMU, the rigid body and the device for calibrating the poses of the IMU and the rigid body involved in the present application are introduced.
The camera device is a device that may be involved in an optical motion capture process, and includes one or more cameras, such as an omni-directional camera or a plurality of motion capture cameras, and can capture an object in a scene where the camera device is located.
an IMU is a device that measures the three-axis attitude angles (or angular rates) and acceleration of an object. Typically, an IMU comprises three single-axis accelerometers and three single-axis gyroscopes, the accelerometers detecting acceleration signals of the object in three independent axes of the carrier coordinate system, and the gyroscopes detecting angular velocity signals of the carrier relative to the navigation coordinate system, and the IMU attitude data of the object including angular velocity and acceleration of the object in three-dimensional space can be measured.
A rigid body is an object whose shape, size and the relative positions of its internal points remain unchanged after motion and applied force.
During a shooting task, the camera device, whether in motion or temporarily stationary, shoots the objects of the scene. The IMU and the rigid body may be mounted on the camera device or arranged independently of it; when mounted, they move with the camera device and each measures its own original pose data. Of course, in practical applications the IMU and the rigid body may be independent of the camera device: even when the camera device is not in motion or is not involved at all, the IMU and the rigid body can be in motion and measure the corresponding original pose data.
The device for calibrating the poses of the IMU and the rigid body is a device with data processing capability, such as a host, a server or a User Equipment (UE), to which the method of the present application is applied.
The UE is a terminal device such as a desktop computer, a notebook computer, an all-in-one computer, a tablet computer or a smartphone.
Next, based on the above background, the method for calibrating the poses of the IMU and the rigid body according to the present application is described in detail.
Referring to fig. 1, fig. 1 shows a schematic flow chart of the method for calibrating the posture of the IMU and the rigid body according to the present application, and specifically, the method for calibrating the posture of the IMU and the rigid body according to the present application may include the following steps:
step S101, fixing the position relation between the IMU and the rigid body;
It can be understood that the IMU and the rigid body are hardware components; after the positional relation between them is fixed, the subsequent steps of the method for calibrating the poses of the IMU and the rigid body can be performed.
Step S102, acquiring original attitude data corresponding to the IMU and original attitude data corresponding to the rigid body;
It can be understood that the device for calibrating the poses of the IMU and the rigid body can be connected to the IMU and the rigid body, so that the original pose data measured by each can be received in real time, achieving real-time application of the method.
Alternatively, the device may extract the original pose data measured by the IMU and by the rigid body from the stored information of an image-capturing task or a pose data acquisition task, decoupling the triggering time of the method from those tasks, so that the method of the present application can be applied more flexibly.
For example, if a user initiates a task request through a device such as a UE, requesting that the method of the present application be applied, the device for calibrating the poses of the IMU and the rigid body can extract the task ID of the target task from a preset field of the task request, then extract the corresponding original pose data from the stored information of the target task, and continue with the subsequent processing of the method.
Step S103, preprocessing the original pose data of the IMU and the original pose data of the rigid body;
The preprocessing removes invalid data from the two sets of original pose data, the invalid data being pose data recorded when the IMU and the rigid body are not in motion.
It can be understood that when the IMU and the rigid body are not moving, or not moving significantly, the original pose data obtained are unchanged and may be considered invalid, whereas the original pose data obtained while the IMU and the rigid body are in motion are valid.
Therefore, after obtaining the two sets of original pose data, the device for calibrating the poses of the IMU and the rigid body can clean the data and remove the invalid entries, so as to improve the validity of the pose data and thus the accuracy of the subsequent rigid body pose calculation.
Step S104, calculating a rotation matrix from an IMU coordinate to a rigid body coordinate and attitude data from the IMU to the world according to the preprocessed attitude data;
After the two kinds of original attitude data are preprocessed, the equipment for calibrating the IMU and the rigid body attitude can calculate the two kinds of data according to the preprocessed attitude data, namely a rotation matrix from an IMU coordinate to a rigid body coordinate and attitude data from the IMU to the world.
And S105, converting the posture of the IMU into the posture of the rigid body according to the calculation data, and obtaining the posture of the rigid body.
After the rotation matrix from IMU coordinates to rigid body coordinates and the IMU-to-world pose data are obtained, the device for calibrating the poses of the IMU and the rigid body can convert the IMU pose into the coordinate format of the rigid body pose. This aligns the IMU pose and the rigid body pose in coordinate format, so that a more accurate rigid body pose can be obtained more quickly when the two poses are fused and calibrated.
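Step S105 can be carried out with plain 3x3 matrix operations. The sketch below assumes one plausible reading of the conversion: the frame-0 IMU-to-world pose composed with the relative IMU rotation and the inverse of the fixed IMU-to-rigid rotation. The function names and the exact convention are illustrative, not taken verbatim from the patent:

```python
def matmul(a, b):
    """3x3 matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def transpose(a):
    """For a rotation matrix, the transpose equals the inverse."""
    return [list(row) for row in zip(*a)]

def imu_pose_to_rigid_pose(T_iw0, R_i0, R_in, R_i2r):
    """Rigid-to-world pose at frame n from the frame-0 IMU-to-world pose
    T_iw0, the frame-0 and frame-n IMU poses, and the fixed IMU-to-rigid
    rotation R_i2r (coordinate conventions assumed)."""
    rel = matmul(transpose(R_i0), R_in)        # relative IMU rotation
    return matmul(matmul(T_iw0, rel), transpose(R_i2r))
```

Because rotation matrices are orthonormal, the transpose stands in for the inverse throughout, which avoids a general matrix inversion.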
As can be seen from the above, the method for calibrating the poses of the IMU and the rigid body provided by the application first fixes the positional relation between the IMU and the rigid body and acquires the original pose data measured by each; it then preprocesses the two sets of original pose data, eliminating invalid data, completing the data cleaning and improving the validity of the pose data; it calculates the rotation matrix from IMU coordinates to rigid body coordinates and the IMU-to-world pose data from the preprocessed pose data; and it converts the pose of the IMU into the pose of the rigid body according to these two calculated quantities, obtaining the pose of the rigid body. Fusion calibration of the IMU pose and the rigid body pose is thereby realized, the accuracy of rigid body pose calculation is improved, and high-accuracy motion capture is promoted.
In an embodiment, with continued reference to fig. 2, which shows another schematic flow chart of the method for calibrating IMU and rigid body poses of the present application, step S104 may further include the following steps:
Step S201, extracting the 0th-frame IMU pose data and the nth-frame IMU pose data from the preprocessed original pose data corresponding to the IMU, and extracting the 0th-frame rigid-body-to-world pose data and the nth-frame rigid-body-to-world pose data from the preprocessed original pose data corresponding to the rigid body;
It can be understood that the original pose data of the IMU includes pose data of a plurality of frames, wherein the 0th-frame IMU pose data represents the pose initially measured by the IMU; similarly, the 0th-frame rigid-body-to-world pose data represents the initially measured pose of the rigid body.
Step S202, calculating, according to a relation equation, the rotation matrix of the IMU pose data from the IMU coordinates to the rigid body coordinates, and calculating the 0th-frame IMU-to-world pose data, wherein the relation equation is:

[R_b^w]_n = [R_i^w]_0 · ([R_i]_0)^(-1) · [R_i]_n · (R_i^b)^(-1)

wherein [R_b^w]_n indicates the nth-frame rigid-body-to-world pose data; R_i^b indicates the rotation matrix of the IMU pose data from the IMU coordinates to the rigid body coordinates; [R_i]_n indicates the nth-frame IMU pose data; [R_b^w]_0 indicates the 0th-frame rigid-body-to-world pose data; [R_i]_0 indicates the 0th-frame IMU pose data; [R_i^w]_0 indicates the 0th-frame IMU-to-world pose data; ([R_i]_0)^(-1) · [R_i]_n indicates the rotation of the nth-frame IMU pose data relative to the 0th-frame IMU pose data; and E indicates the identity matrix, to which this relative rotation reduces when n = 0.
After the corresponding data is extracted in step S201, the device for calibrating the IMU and rigid body pose may substitute the data into the above equation, and calculate the rotation matrix of the IMU pose data from the IMU coordinate to the rigid body coordinate and the pose data from the IMU of frame 0 to the world.
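As an illustrative numerical sketch (not part of the patent itself), the roles of the two unknowns can be checked on synthetic data. The sketch below, in Python with NumPy, assumes the relation equation takes the form [R_b^w]_n = [R_i^w]_0 · ([R_i]_0)^(-1) · [R_i]_n · (R_i^b)^(-1) and that all pose data are rotation matrices; all names (rotmat, R_ib, R_iw0, ...) are illustrative:

```python
import numpy as np

def rotmat(v):
    """Rodrigues formula: rotation matrix from an axis-angle vector (helper)."""
    angle = np.linalg.norm(v)
    if angle < 1e-12:
        return np.eye(3)
    k = v / angle
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

# Synthetic ground truth for the two unknowns of step S202.
R_ib = rotmat(np.array([0.2, -0.4, 0.3]))    # rotation from IMU coordinates to rigid body coordinates
R_iw0 = rotmat(np.array([0.1, 0.5, -0.2]))   # 0th-frame IMU-to-world pose

# Simulated IMU readings at frame 0 and frame n, plus the rigid-body world
# poses that the assumed relation equation predicts for those frames.
R_i0 = rotmat(np.array([0.05, 0.1, -0.3]))
R_in = rotmat(np.array([0.7, -0.2, 0.4]))
R_bw0 = R_iw0 @ R_ib.T                       # at n = 0 the relative IMU rotation is the identity E
R_bwn = R_iw0 @ R_i0.T @ R_in @ R_ib.T
```

Eliminating [R_i^w]_0 between frames 0 and n gives the conjugation identity ([R_b^w]_0)^(-1) · [R_b^w]_n = R_i^b · (([R_i]_0)^(-1) · [R_i]_n) · (R_i^b)^(-1), which is why pairs of frames with distinct rotation axes determine the mounting rotation.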
Next, in step S105 of the embodiment corresponding to fig. 1, the following steps may be further included:
Converting the pose of the IMU into the pose of the rigid body according to a pose conversion formula, wherein the pose conversion formula is:

[R_b^w]_n = [R_i^w]_0 · ([R_i]_0)^(-1) · [R_i]_n · (R_i^b)^(-1)

wherein (R_i^b)^(-1) is the inverse of the rotation matrix of the IMU pose data from the IMU coordinates to the rigid body coordinates, and [R_b^w]_n is the converted pose data of the rigid body.
After the rotation matrix of the IMU pose data from the IMU coordinates to the rigid body coordinates and the 0th-frame IMU-to-world pose data are obtained through calculation, they can be substituted into the pose conversion formula, which determines the conversion applied whenever IMU pose data are converted from the IMU coordinates to the rigid body coordinates.
It can be understood that the pose conversion formula is determined at the moment the method for calibrating IMU and rigid body poses of the present application is triggered. This ensures that the coordinate conversion relationship used when converting the pose data of the IMU from the IMU coordinates to the rigid body coordinates matches the actual operating state of the IMU and the rigid body throughout the process, and avoids the unexpected situation in which the positional relationship between the IMU and the rigid body changes due to abnormal conditions such as displacement or vibration during operation. On the premise of ensuring high data accuracy, the method thus overcomes the actual errors caused by the differing actual conditions of different IMUs and/or rigid bodies, and improves stability and compatibility.
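A minimal sketch of this conversion step, assuming rotation-matrix pose data and the reconstructed form of the conversion formula (the function name and arguments are illustrative, not from the patent):

```python
import numpy as np

def imu_to_rigid_pose(R_imu_n, R_imu_0, R_imu_world_0, R_imu_to_body):
    """Convert the nth-frame IMU orientation into a rigid-body world orientation.

    R_imu_world_0 @ R_imu_0.T re-expresses the IMU reading in the world frame;
    the final R_imu_to_body.T (inverse mounting rotation) maps it onto the body.
    For rotation matrices the inverse is the transpose.
    """
    return R_imu_world_0 @ R_imu_0.T @ R_imu_n @ R_imu_to_body.T
```

Because the positional relationship is fixed in step S101, the two calibrated quantities are computed once and then reused for every incoming IMU frame.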
In still another embodiment, it can be understood that, in the above step S202, the more sets of nth-frame IMU pose data and nth-frame rigid body pose data are substituted, the closer the calculation result is to the actual situation, i.e. the higher the accuracy.
correspondingly, in practical applications, in the step S202, specifically, the method may further include:
Substituting the 0th-frame IMU pose data, the nth-frame IMU pose data, the 0th-frame rigid-body-to-world pose data and the nth-frame rigid-body-to-world pose data into the relation equation, and obtaining, by the least square method, the R_i^b and [R_i^w]_0 for which the sum of the squares of the differences between the data on the two sides of the equal sign of the relation equation is minimum, the least squares formula Q being:

Q = Σ_n || [R_b^w]_n − [R_i^w]_0 · ([R_i]_0)^(-1) · [R_i]_n · (R_i^b)^(-1) ||^2
It can be understood that the least square method readily yields the optimal solution with a relatively small computational workload, so the calculation time can be relatively short, which improves the calculation efficiency of the device for calibrating IMU and rigid body poses in step S202.
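The patent does not spell out the solver, but for rotation-only pose data the least-squares step admits a classical closed form: eliminating the 0th-frame IMU-to-world pose between frame pairs leaves relative rotations in which the rigid-body rotation axis is the IMU rotation axis turned by R_i^b, so aligning the axis pairs in the least-squares sense is a Kabsch/SVD problem. A hedged sketch under those assumptions (rotvec, rotmat and solve_imu_to_body are illustrative names, and the SVD stands in for the patent's unspecified least-squares solver):

```python
import numpy as np

def rotvec(R):
    """Axis-angle vector of a rotation matrix (valid for angles strictly between 0 and pi)."""
    w = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    s = np.linalg.norm(w) / 2.0           # sin(angle)
    c = (np.trace(R) - 1.0) / 2.0         # cos(angle)
    angle = np.arctan2(s, c)
    return angle * w / (2.0 * s)

def rotmat(v):
    """Rodrigues formula: rotation matrix from an axis-angle vector."""
    angle = np.linalg.norm(v)
    if angle < 1e-12:
        return np.eye(3)
    k = v / angle
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def solve_imu_to_body(rel_body, rel_imu):
    """Least-squares rotation R with a_k ≈ R b_k, where a_k and b_k are the
    axis-angle vectors of the relative body and IMU rotations (Kabsch/SVD)."""
    A = np.stack([rotvec(R) for R in rel_body])   # (K, 3)
    B = np.stack([rotvec(R) for R in rel_imu])    # (K, 3)
    U, _, Vt = np.linalg.svd(B.T @ A)             # H = sum_k b_k a_k^T
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # guard against reflections
    return Vt.T @ D @ U.T
```

At least two frame pairs with non-parallel rotation axes are needed for a unique solution, which is consistent with the requirement of large angular excursions in the calibration data discussed below.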
In yet another embodiment, in order to further improve the accuracy of the calibration, the nth-frame IMU pose data and the nth-frame rigid-body-to-world pose data extracted in step S201 each have a change of at least 90 degrees in the pitch angle, the heading angle and the yaw angle respectively, so as to avoid possible data noise and overcome the errors caused by small per-frame rotation angles during high-speed acquisition, thereby improving the quality of the data.
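This excursion requirement can be checked automatically before running step S202. A small sketch, assuming poses are rotation matrices and that the three angles are extracted in the Z-Y-X (yaw/pitch/roll) convention; the function names are illustrative:

```python
import numpy as np

def euler_zyx(R):
    """Yaw, pitch, roll (Z-Y-X convention) extracted from a rotation matrix."""
    yaw = np.arctan2(R[1, 0], R[0, 0])
    pitch = -np.arcsin(np.clip(R[2, 0], -1.0, 1.0))
    roll = np.arctan2(R[2, 1], R[2, 2])
    return np.array([yaw, pitch, roll])

def sufficient_excursion(frames, min_span=np.pi / 2):
    """True when yaw, pitch and roll each sweep at least min_span radians
    (90 degrees by default) over the collected frames."""
    angles = np.stack([euler_zyx(R) for R in frames])
    spans = angles.max(axis=0) - angles.min(axis=0)
    return bool(np.all(spans >= min_span))
```

A data set failing the check would simply be rejected, prompting re-acquisition with larger rotations.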
In yet another embodiment, when the raw pose data of the IMU and the raw pose data of the rigid body are respectively represented by quaternions, the nth frame IMU pose data and the nth frame rigid body to world pose data respectively have a data amount of 8x sets, x being a positive integer.
Preferably, x ranges from 10 to 15. In the quaternion representation, the quantities to be solved in step S202, namely the rotation matrix from the IMU coordinates to the rigid body coordinates and the 0th-frame IMU-to-world pose data, comprise 8 variables in total, so each variable may have 80 to 120 sets of free test data. This is not only sufficient to obtain a result, but also keeps the data processing amount low, achieving a good balance between accuracy and data processing amount.
in another embodiment, with reference to still another flow chart of the method for calibrating the IMU and the rigid body posture shown in fig. 3, the preprocessing mentioned in step S103 may specifically include the following steps:
Step S301, calculating the distance between 2 frames of data in adjacent frames and/or interval frames in the original pose data corresponding to the IMU and the original pose data corresponding to the rigid body according to a distance calculation formula, where the distance calculation formula is:

d = √((x_m − x_n)^2 + (y_m − y_n)^2 + (z_m − z_n)^2)

wherein x_m, y_m and z_m represent the three-axis coordinates of the mth-frame IMU or rigid body pose data, x_n, y_n and z_n represent the three-axis coordinates of the nth-frame IMU or rigid body pose data, and d represents the distance between the mth-frame IMU or rigid body pose data and the nth-frame IMU or rigid body pose data;
It is understood that invalid data can be filtered based on the distance between frames of data. The device for calibrating IMU and rigid body poses determines, through the distance calculation formula, the distance between adjacent frames and/or interval frames, which represents the corresponding change in position between those frames of data.
Step S302, when d is smaller than a distance threshold, determining that the IMU or the rigid body is not in a motion state, removing the pose data that is not in a motion state, and completing the preprocessing,
wherein the distance threshold may be any value from 3 cm to 5 cm.
When the distance between the 2 frames of data is smaller than the distance threshold, the IMU or the rigid body has not moved, or has not moved appreciably, between the 2 frames. The later of the 2 frames is then identified as invalid data and deleted or removed, so that it is not used as input data for the subsequent rigid body pose calculation, which improves the validity and accuracy of the pose data.
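A minimal sketch of this distance-based filter, assuming positions are 3-D coordinates in metres and using a 4 cm threshold from the 3-5 cm range given above (the function name is illustrative, and comparing each frame against the most recently kept frame is one possible reading of "adjacent and/or interval frames"):

```python
import numpy as np

def filter_static_frames(positions, threshold=0.04):
    """Return the indices of frames kept after removing static data.

    A frame is dropped when its Euclidean distance to the most recently kept
    frame is below `threshold`, i.e. the IMU or rigid body is deemed not to
    be in a motion state between the two frames.
    """
    kept = [0]
    for n in range(1, len(positions)):
        d = np.linalg.norm(np.asarray(positions[n], dtype=float)
                           - np.asarray(positions[kept[-1]], dtype=float))
        if d >= threshold:
            kept.append(n)
    return kept
```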
The above is the introduction of the method for calibrating IMU and rigid body poses of the present application; the following introduces the apparatus for calibrating IMU and rigid body poses of the present application.
Referring to fig. 4, fig. 4 shows a schematic structural diagram of the apparatus for calibrating an IMU and a rigid body posture according to the present application, specifically, the apparatus for calibrating an IMU and a rigid body posture may include the following structure:
A fixing unit 401 for fixing a positional relationship between the IMU and the rigid body;
an obtaining unit 402, configured to obtain original pose data corresponding to the IMU and original pose data corresponding to the rigid body;
A preprocessing unit 403, configured to preprocess original pose data corresponding to the IMU and original pose data corresponding to the rigid body, where the preprocessing is used to remove invalid data, where the invalid data is pose data when the IMU and the rigid body are not in a motion state;
A calculating unit 404, configured to calculate a rotation matrix from the IMU coordinate to the rigid body coordinate and pose data from the IMU to the world according to the preprocessed pose data;
And a conversion unit 405, configured to convert the pose of the IMU into a pose of the rigid body according to the calculation data, and obtain a pose of the rigid body.
In an embodiment, the calculating unit 404 is specifically configured to:
extracting the 0th-frame IMU pose data and the nth-frame IMU pose data from the preprocessed IMU pose data, and extracting the 0th-frame rigid-body-to-world pose data and the nth-frame rigid-body-to-world pose data from the preprocessed rigid body pose data;
calculating, according to a relation equation, the rotation matrix of the IMU pose data from the IMU coordinates to the rigid body coordinates, and calculating the 0th-frame IMU-to-world pose data, the relation equation being:

[R_b^w]_n = [R_i^w]_0 · ([R_i]_0)^(-1) · [R_i]_n · (R_i^b)^(-1)

wherein [R_b^w]_n indicates the nth-frame rigid-body-to-world pose data; R_i^b indicates the rotation matrix of the IMU pose data from the IMU coordinates to the rigid body coordinates; [R_i]_n indicates the nth-frame IMU pose data; [R_b^w]_0 indicates the 0th-frame rigid-body-to-world pose data; [R_i]_0 indicates the 0th-frame IMU pose data; [R_i^w]_0 indicates the 0th-frame IMU-to-world pose data; ([R_i]_0)^(-1) · [R_i]_n indicates the rotation of the nth-frame IMU pose data relative to the 0th-frame IMU pose data; and E indicates the identity matrix.
In another embodiment, the calculating unit 404 is specifically configured to:
substituting the 0th-frame IMU pose data, the nth-frame IMU pose data, the 0th-frame rigid-body-to-world pose data and the nth-frame rigid-body-to-world pose data into the relation equation, and obtaining, by the least square method, the R_i^b and [R_i^w]_0 for which the sum of the squares of the differences between the data on the two sides of the equal sign of the relation equation is minimum, the least squares formula Q being:

Q = Σ_n || [R_b^w]_n − [R_i^w]_0 · ([R_i]_0)^(-1) · [R_i]_n · (R_i^b)^(-1) ||^2
In another embodiment, the conversion unit 405 is specifically configured to:
convert the pose of the IMU into the pose of the rigid body according to a pose conversion formula, the pose conversion formula being:

[R_b^w]_n = [R_i^w]_0 · ([R_i]_0)^(-1) · [R_i]_n · (R_i^b)^(-1)

wherein (R_i^b)^(-1) is the inverse of the rotation matrix of the IMU pose data from the IMU coordinates to the rigid body coordinates, and [R_b^w]_n is the converted pose data of the rigid body.
in yet another embodiment, the nth frame IMU attitude data and the nth frame rigid body to world attitude data each have a change of at least 90 degrees in pitch angle, heading angle, and yaw angle, respectively.
In yet another embodiment, when the raw attitude data corresponding to the IMU and the raw attitude data corresponding to the rigid body are respectively represented by quaternions, the IMU attitude data of the nth frame and the rigid body-to-world attitude data of the nth frame have data amounts of 8x groups, respectively, x being a positive integer.
in another embodiment, the preprocessing unit 403 is specifically configured to:
calculating the distance between 2 frames of data in adjacent frames and/or interval frames in the original pose data corresponding to the IMU and the original pose data corresponding to the rigid body by a distance calculation formula, the distance calculation formula being:

d = √((x_m − x_n)^2 + (y_m − y_n)^2 + (z_m − z_n)^2)

wherein x_m, y_m and z_m represent the three-axis coordinates of the mth-frame IMU or rigid body pose data, x_n, y_n and z_n represent the three-axis coordinates of the nth-frame IMU or rigid body pose data, and d represents the distance between the mth-frame IMU or rigid body pose data and the nth-frame IMU or rigid body pose data;
And when d is smaller than the distance threshold, judging that the IMU or the rigid body is not in the motion state, and rejecting attitude data which is not in the motion state to finish preprocessing.
referring to fig. 5, fig. 5 shows another schematic structural diagram of the apparatus for calibrating an IMU and a rigid body posture according to the present application, specifically, the apparatus for calibrating an IMU and a rigid body posture according to the present application includes a processor 501, where the processor 501 is configured to execute a program for calibrating an IMU and a rigid body posture stored in a memory 502 to implement steps of a method for calibrating an IMU and a rigid body posture according to any embodiment corresponding to fig. 1 to 3; alternatively, the processor 501 is configured to implement the functions of the units in the corresponding embodiment of fig. 4 when executing the calibration IMU and rigid body posture program stored in the memory 502.
Illustratively, the program for calibrating IMU and rigid body poses may be partitioned into one or more modules/units, which are stored in the memory 502 and executed by the processor 501 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, the instruction segments being used to describe the execution of the computer program in the computer device.
The apparatus for calibrating IMU and rigid body poses may include, but is not limited to, the processor 501 and the memory 502. Those skilled in the art will appreciate that the illustration is merely an example of a computer device and does not constitute a limitation on the apparatus, which may include more or fewer components than shown, combine certain components, or use different components; for example, the apparatus may also include input/output devices, network access devices, buses, etc., with the processor 501, the memory 502, the input/output devices and the network access devices coupled via the buses.
The processor 501 may be a Central Processing Unit (CPU), another general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. The general purpose processor may be a microprocessor, or the processor may be any conventional processor; the processor is the control center of the apparatus for calibrating IMU and rigid body poses, and connects the various parts of the entire apparatus through various interfaces and lines.
The memory 502 may be used to store computer programs and/or modules, and the processor 501 implements the various functions of the computer device by running or executing the computer programs and/or modules stored in the memory 502 and invoking the data stored in the memory 502. The memory 502 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system and the application programs required by at least one function (such as a sound playing function, an image playing function, etc.), and the data storage area may store data created according to use of the device (such as audio data, video data, etc.). In addition, the memory may include high-speed random access memory, and may also include non-volatile memory, such as a hard disk, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a Flash Card, at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage device.
the present application further provides a readable storage medium having a program stored thereon for calibrating IMU and rigid body pose, the program, when executed by a processor, implementing a method for calibrating IMU and rigid body pose as in any of the embodiments corresponding to fig. 1-3.
It will be appreciated that the integrated unit, if implemented as a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be substantially implemented or contributed to by the prior art, or all or part of the technical solution may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the above-described specific working processes of the apparatus for calibrating the postures of the IMU and the rigid body and the units thereof may refer to the description of the method for calibrating the postures of the IMU and the rigid body in the embodiments corresponding to fig. 1 to 3, and details are not repeated herein.
To sum up, the method, apparatus and readable storage medium for calibrating IMU and rigid body poses provided by the present application first fix the positional relationship between the IMU and the rigid body and acquire the original pose data measured by each of them. The two kinds of original pose data are then preprocessed to remove invalid data, which cleans the data and improves the validity of the pose data. From the preprocessed pose data, the rotation matrix from the IMU coordinates to the rigid body coordinates and the IMU-to-world pose data are calculated, and the pose of the IMU is converted into the pose of the rigid body according to these two calculated quantities, obtaining the pose of the rigid body. This realizes the fusion calibration of the IMU pose and the rigid body pose, improves the accuracy of the rigid body pose calculation, and promotes high-accuracy motion capture.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus for calibrating the orientations of the IMU and the rigid body and the units thereof may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (10)

1. A method for calibrating IMU and rigid body poses, characterized by comprising the following steps:
Fixing the position relation between the IMU and the rigid body;
Acquiring original attitude data corresponding to the IMU and original attitude data corresponding to the rigid body;
Preprocessing original attitude data corresponding to the IMU and original attitude data corresponding to the rigid body, wherein the preprocessing is used for eliminating invalid data, and the invalid data are attitude data when the IMU and the rigid body are not in a motion state;
Calculating a rotation matrix from an IMU coordinate to a rigid body coordinate and attitude data from the IMU to the world according to the preprocessed attitude data;
and converting the posture of the IMU into the posture of the rigid body according to the calculation data, and obtaining the posture of the rigid body.
2. The method of claim 1, wherein said computing an IMU coordinate-to-rigid body coordinate rotation matrix and IMU-to-world pose data from said preprocessed pose data comprises:
extracting IMU pose data of the 0th frame and IMU pose data of the nth frame from the preprocessed IMU pose data, and extracting rigid-body-to-world pose data of the 0th frame and rigid-body-to-world pose data of the nth frame from the preprocessed rigid body pose data;
calculating, according to a relational equation, a rotation matrix of the IMU pose data from IMU coordinates to rigid body coordinates and calculating the 0th-frame IMU-to-world pose data, the relational equation being:

[R_b^w]_n = [R_i^w]_0 · ([R_i]_0)^(-1) · [R_i]_n · (R_i^b)^(-1)

wherein the [R_b^w]_n is used for indicating the nth-frame rigid-body-to-world pose data, the R_i^b is used for indicating the rotation matrix of the IMU pose data from the IMU coordinates to the rigid body coordinates, the [R_i]_n is used for indicating the nth-frame IMU pose data, the [R_b^w]_0 is used for indicating the 0th-frame rigid-body-to-world pose data, the [R_i]_0 is used for indicating the 0th-frame IMU pose data, the [R_i^w]_0 is used for indicating the 0th-frame IMU-to-world pose data, the ([R_i]_0)^(-1) · [R_i]_n is used for indicating the rotation of the nth-frame IMU pose data relative to the 0th-frame IMU pose data, and the E is used for indicating an identity matrix.
3. The method of claim 2, wherein the calculating a rotation matrix of the IMU pose data from IMU coordinates to rigid body coordinates and the calculating the 0th-frame IMU-to-world pose data according to the relational equation comprises:
substituting the 0th-frame IMU pose data, the nth-frame IMU pose data, the 0th-frame rigid-body-to-world pose data and the nth-frame rigid-body-to-world pose data into the relational equation, and obtaining, by a least square method, the R_i^b and [R_i^w]_0 for which the sum of the squares of the differences between the data on the two sides of the equal sign of the relational equation is minimum, the least squares formula Q comprising:

Q = Σ_n || [R_b^w]_n − [R_i^w]_0 · ([R_i]_0)^(-1) · [R_i]_n · (R_i^b)^(-1) ||^2
4. the method of claim 2 or 3, wherein said converting the pose of the IMU to the pose of the rigid body according to the computed data comprises:
converting the pose of the IMU into the pose of the rigid body according to a pose conversion formula, the pose conversion formula being:

[R_b^w]_n = [R_i^w]_0 · ([R_i]_0)^(-1) · [R_i]_n · (R_i^b)^(-1)

wherein the (R_i^b)^(-1) is the inverse of the rotation matrix of the IMU pose data from the IMU coordinates to the rigid body coordinates, and the [R_b^w]_n is the converted pose data of the rigid body.
5. The method according to any of claims 1-4, wherein when the raw pose data corresponding to the IMU and the raw pose data corresponding to the rigid body are represented by quaternions, at least 8x sets of data are acquired, wherein x is a positive integer.
6. the method of claim 5, wherein the acquired 8x sets of data have a change of at least 90 degrees in each of pitch angle, heading angle, and yaw angle.
7. The method of claim 1, wherein preprocessing the raw pose data corresponding to the IMU and the raw pose data corresponding to the rigid body comprises:
calculating the distance between 2 frames of data in adjacent frames and/or interval frames in the original pose data corresponding to the IMU and the original pose data corresponding to the rigid body by a distance calculation formula, the distance calculation formula being:

d = √((x_m − x_n)^2 + (y_m − y_n)^2 + (z_m − z_n)^2)

wherein the x_m, y_m and z_m are three-axis coordinates for representing the mth frame of IMU or rigid body pose data, the x_n, y_n and z_n are three-axis coordinates for representing the nth frame of IMU or rigid body pose data, and the d represents the distance between the mth frame of IMU or rigid body pose data and the nth frame of IMU or rigid body pose data;
And when the d is smaller than a distance threshold value, judging that the IMU or the rigid body is not in a motion state, and rejecting the attitude data which is not in the motion state to finish the preprocessing.
8. An apparatus for calibrating IMU and rigid body pose, the apparatus comprising:
the fixing unit is used for fixing the position relation between the IMU and the rigid body;
An obtaining unit, configured to obtain original pose data corresponding to the IMU and original pose data corresponding to the rigid body;
the preprocessing unit is used for preprocessing original attitude data corresponding to the IMU and original attitude data corresponding to the rigid body, eliminating invalid data, and enabling the invalid data to be attitude data when the IMU and the rigid body are not in a motion state;
The calculation unit is used for calculating a rotation matrix from an IMU coordinate to a rigid body coordinate and attitude data from the IMU to the world according to the preprocessed attitude data;
And the conversion unit is used for converting the posture of the IMU into the posture of the rigid body according to the calculation data and obtaining the posture of the rigid body.
9. an apparatus for calibrating IMU and rigid body pose, the apparatus comprising a memory, a processor, and a calibration IMU and rigid body pose program stored on the memory and executable on the processor, the calibration IMU and rigid body pose program when executed by the processor implementing the method of any of claims 1 to 7.
10. A computer readable storage medium having stored thereon a calibration IMU and rigid body pose program that, when executed by a processor, implements a method as recited in any of claims 1-7.
CN201910674584.3A 2019-07-25 2019-07-25 Method and device for calibrating IMU and rigid body posture and readable storage medium Active CN110567484B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910674584.3A CN110567484B (en) 2019-07-25 2019-07-25 Method and device for calibrating IMU and rigid body posture and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910674584.3A CN110567484B (en) 2019-07-25 2019-07-25 Method and device for calibrating IMU and rigid body posture and readable storage medium

Publications (2)

Publication Number Publication Date
CN110567484A true CN110567484A (en) 2019-12-13
CN110567484B CN110567484B (en) 2021-07-02

Family

ID=68773126

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910674584.3A Active CN110567484B (en) 2019-07-25 2019-07-25 Method and device for calibrating IMU and rigid body posture and readable storage medium

Country Status (1)

Country Link
CN (1) CN110567484B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111504314A (en) * 2020-04-30 2020-08-07 深圳市瑞立视多媒体科技有限公司 IMU and rigid body pose fusion method, device, equipment and storage medium
CN112215955A (en) * 2020-09-27 2021-01-12 深圳市瑞立视多媒体科技有限公司 Rigid body mark point screening method, device, system, equipment and storage medium
CN112344960A (en) * 2020-10-23 2021-02-09 上海拿森汽车电子有限公司 IMU signal verification method and device and vehicle
CN112781589A (en) * 2021-01-05 2021-05-11 北京诺亦腾科技有限公司 Position tracking equipment and method based on optical data and inertial data

Citations (5)

Publication number Priority date Publication date Assignee Title
CN106251305A (en) * 2016-07-29 2016-12-21 长春理工大学 A kind of realtime electronic image stabilizing method based on Inertial Measurement Unit IMU
CN106953553A (en) * 2017-03-12 2017-07-14 纳恩博(北京)科技有限公司 The control method and device of a kind of head and horizontal stage electric machine
CN107102735A (en) * 2017-04-24 2017-08-29 广东虚拟现实科技有限公司 A kind of alignment schemes and alignment means
CN108846867A (en) * 2018-08-29 2018-11-20 安徽云能天智能科技有限责任公司 A kind of SLAM system based on more mesh panorama inertial navigations
CN109976344A (en) * 2019-03-30 2019-07-05 南京理工大学 Crusing robot posture antidote

Patent Citations (5)

Publication number Priority date Publication date Assignee Title
CN106251305A (en) * 2016-07-29 2016-12-21 长春理工大学 A kind of realtime electronic image stabilizing method based on Inertial Measurement Unit IMU
CN106953553A (en) * 2017-03-12 2017-07-14 纳恩博(北京)科技有限公司 The control method and device of a kind of head and horizontal stage electric machine
CN107102735A (en) * 2017-04-24 2017-08-29 广东虚拟现实科技有限公司 A kind of alignment schemes and alignment means
CN108846867A (en) * 2018-08-29 2018-11-20 安徽云能天智能科技有限责任公司 A kind of SLAM system based on more mesh panorama inertial navigations
CN109976344A (en) * 2019-03-30 2019-07-05 南京理工大学 Crusing robot posture antidote

Non-Patent Citations (1)

Title
FANG Ming et al.: "Robust electronic image stabilization method based on IMU-Camera calibration", Information and Control *

Cited By (8)

Publication number Priority date Publication date Assignee Title
CN111504314A (en) * 2020-04-30 2020-08-07 Shenzhen Realis Multimedia Technology Co., Ltd. IMU and rigid body pose fusion method, device, equipment and storage medium
WO2021218731A1 (en) * 2020-04-30 2021-11-04 Shenzhen Realis Multimedia Technology Co., Ltd. Method and apparatus for position-attitude fusion of IMU and rigid body, device, and storage medium
CN111504314B (en) * 2020-04-30 2021-11-12 Shenzhen Realis Multimedia Technology Co., Ltd. IMU and rigid body pose fusion method, device, equipment and storage medium
CN113984051A (en) * 2020-04-30 2022-01-28 Shenzhen Realis Multimedia Technology Co., Ltd. IMU and rigid body pose fusion method, device, equipment and storage medium
CN112215955A (en) * 2020-09-27 2021-01-12 Shenzhen Realis Multimedia Technology Co., Ltd. Rigid body mark point screening method, device, system, equipment and storage medium
CN112344960A (en) * 2020-10-23 2021-02-09 Shanghai Nasn Automotive Electronics Co., Ltd. IMU signal verification method and device and vehicle
CN112344960B (en) * 2020-10-23 2022-12-13 Shanghai Nasn Automotive Electronics Co., Ltd. IMU signal verification method and device and vehicle
CN112781589A (en) * 2021-01-05 2021-05-11 Beijing Noitom Technology Ltd. Position tracking device and method based on optical data and inertial data

Also Published As

Publication number Publication date
CN110567484B (en) 2021-07-02

Similar Documents

Publication Publication Date Title
CN110567484B (en) Method and device for calibrating IMU and rigid body posture and readable storage medium
CN111354042B (en) Feature extraction method and device of robot visual image, robot and medium
CN108765498B (en) Monocular vision tracking method, device and storage medium
CN107888828B (en) Space positioning method and device, electronic device, and storage medium
CN106503671B (en) Method and apparatus for determining human face posture
CN110246147B (en) Visual inertial odometer method, visual inertial odometer device and mobile equipment
US9161027B2 (en) Method and apparatus for providing camera calibration
JP2018523865A (en) Information processing method, device, and terminal
WO2019019248A1 (en) Virtual reality interaction method, device and system
JP2023502635A (en) CALIBRATION METHOD AND APPARATUS, PROCESSOR, ELECTRONICS, STORAGE MEDIUM
CN110276774B (en) Object drawing method, device, terminal and computer-readable storage medium
CN110956666B (en) Motion data calibration method and device, terminal equipment and storage medium
CN108154533A (en) Position and attitude determination method, apparatus and electronic equipment
CN109767470B (en) Tracking system initialization method and terminal equipment
CN112880687A (en) Indoor positioning method, device, equipment and computer readable storage medium
CN113587934B (en) Robot, indoor positioning method and device and readable storage medium
CN110068326B (en) Attitude calculation method and apparatus, electronic device, and storage medium
CN111791235B (en) Robot multi-camera visual inertia point-line characteristic positioning method and device
JP2007004578A (en) Method and device for acquiring three-dimensional shape and recording medium for program
CN109040525A (en) Image processing method, device, computer-readable medium and electronic equipment
CN116443028A (en) Head posture data acquisition system and method
CN112945231A (en) IMU and rigid body posture alignment method, device, equipment and readable storage medium
He et al. Three-point-based solution for automated motion parameter estimation of a multi-camera indoor mapping system with planar motion constraint
CN113052915A (en) Camera external parameter calibration method and device, augmented reality system, terminal device and storage medium
CN110991085B (en) Method, medium, terminal and device for constructing robot image simulation data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant