CN207923150U - A kind of calibration system of depth camera and Inertial Measurement Unit relative attitude - Google Patents
- Publication number
- CN207923150U (application CN201720973935.7U)
- Authority
- CN
- China
- Prior art keywords
- measurement unit
- inertial measurement
- depth camera
- relative attitude
- displacement information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- Length Measuring Devices By Optical Means (AREA)
Abstract
The utility model embodiment discloses a calibration system for the relative attitude of a depth camera and an inertial measurement unit (IMU). The depth camera acquires three-dimensional spatial information of a target object; it is fixed at one side of the application scene to be calibrated and is connected to a displacement-information acquisition module by wire. The inertial measurement unit measures the angular velocity and acceleration of the target object in three-dimensional space, moves freely relative to the depth camera, and is likewise connected to the displacement-information acquisition module. The acquisition module, which is connected to a relative-attitude calibration module, obtains the first displacement information collected by the depth camera while the target object moves, and the second displacement information collected by the inertial measurement unit over the corresponding period. From these displacements the relative-attitude calibration module computes, according to the calibration principle, the relative-attitude rotation matrix between the depth camera and the inertial measurement unit. The calibration process is easy to operate, requires no additional calibration equipment, is contactless, and improves the accuracy of the calibrated relative attitude.
Description
Technical field
The utility model embodiment relates to the technical field of somatosensory (motion-sensing) interaction, and in particular to a calibration system for the relative attitude of a depth camera and an inertial measurement unit.
Background technology
With the rapid development of computer and Internet technology, somatosensory interaction is widely used across many industries, for example in motion-sensing games and in interaction between people and robots. In somatosensory interaction, accurately capturing human posture and actions and describing them in a unified view coordinate system is the key to other upper-layer applications.

Multi-sensor data fusion theory and technology is an effective way to improve indices such as the range and precision of human posture and motion capture, and relative-pose calibration between sensors is the foundation of multi-sensor data fusion. When a depth camera and an inertial measurement unit are used together to capture human actions, for instance to remotely control a robot, the relative attitude between the two sensors' coordinate systems must first be calibrated before their data can be fused. That is, when a depth camera is combined with an inertial measurement unit to capture human motion information, the problem is how to obtain, by a simple method, an accurate description of the relative attitude between the depth camera and the inertial measurement unit.
Utility model content
The purpose of the utility model embodiment is to provide a calibration system for the relative attitude of a depth camera and an inertial measurement unit, so as to improve the accuracy with which that relative attitude is calibrated.

To solve the above technical problem, the utility model embodiment provides the following technical scheme:

The utility model embodiment provides a calibration system for the relative attitude of a depth camera and an inertial measurement unit, comprising:
a depth camera, an inertial measurement unit, a displacement-information acquisition module and a relative-attitude calibration module;
wherein the depth camera is used to obtain three-dimensional spatial information of a target object, is fixed at one side of the application scene to be calibrated, and is connected to the displacement-information acquisition module by wire;

the inertial measurement unit is used to measure the angular velocity and acceleration of the target object in three-dimensional space, moves freely relative to the depth camera, and is connected to the displacement-information acquisition module;

the displacement-information acquisition module is connected to the relative-attitude calibration module and is used to obtain the first displacement information collected by the depth camera while the target object performs a displacement movement in any direction according to a preset action sequence, and the second displacement information collected from the target object by the inertial measurement unit over the corresponding period;

the relative-attitude calibration module is used to compute, from the first displacement information and the second displacement information and according to the calibration principle, the relative-attitude rotation matrix of the depth camera and the inertial measurement unit.
Optionally, the inertial measurement unit is bound to a moving part of the target object, so that it moves together with the target object.

Optionally, the inertial measurement unit integrates a multi-axis accelerometer, a multi-axis gyroscope or a magnetometer.

Optionally, the inertial measurement unit is connected to the displacement-information acquisition module wirelessly.

Optionally, the system further includes:

a relative-attitude representation module, connected to the relative-attitude calibration module, for converting the relative attitude of the depth camera and the inertial measurement unit into a preset attitude representation according to the relative-attitude rotation matrix.

Optionally, the system further includes:

a display, connected to the relative-attitude calibration module, for displaying the relative attitude of the depth camera and the inertial measurement unit in the preset attitude representation.
The utility model embodiment provides a calibration system for the relative attitude of a depth camera and an inertial measurement unit, comprising a depth camera, an inertial measurement unit, a displacement-information acquisition module and a relative-attitude calibration module. The depth camera obtains three-dimensional spatial information of the target object, is fixed at one side of the application scene to be calibrated, and is connected to the displacement-information acquisition module by wire. The inertial measurement unit measures the angular velocity and acceleration of the target object in three-dimensional space, moves freely relative to the depth camera, and is connected to the displacement-information acquisition module. The acquisition module, connected to the relative-attitude calibration module, obtains the first displacement information collected by the depth camera while the target object moves and the second displacement information collected by the inertial measurement unit over the corresponding period. The relative-attitude calibration module computes the relative-attitude rotation matrix of the depth camera and the inertial measurement unit from these displacements according to the calibration principle.

The advantage of the technical scheme provided by the present application is that the displacement of the target object moving in three-dimensional space is recorded by both the fixed depth camera and the freely movable inertial measurement unit, and the relative-attitude rotation matrix between them is computed from this displacement information according to the calibration principle. The whole relative-attitude calibration process is easy to operate and highly precise, requires no additional calibration equipment, and is contactless. This helps improve the efficiency and accuracy of the relative-attitude calibration of depth camera and inertial measurement unit, providing inspiration and accurate calibration information for somatosensory-interaction fields such as remote robot manipulation and motion-sensing game device alignment.
Description of the drawings
In order to explain the technical schemes of the utility model embodiments or of the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the utility model; for those of ordinary skill in the art, other drawings can be obtained from them without creative effort.

Fig. 1 is a block schematic diagram of an illustrative example provided by the utility model embodiment;

Fig. 2 is a structural schematic diagram of the calibration system of depth camera and inertial measurement unit relative attitude provided by the utility model embodiment under one specific implementation;

Fig. 3 is a schematic diagram of the different Cartesian measurement coordinate systems provided by the utility model embodiment;

Fig. 4 is a structural schematic diagram of the calibration system of depth camera and inertial measurement unit relative attitude provided by the utility model embodiment under another specific implementation;

Fig. 5 is a schematic diagram of the curves of another illustrative example provided by the utility model embodiment under a specific application scenario.
Specific implementation mode
In order to enable those skilled in the art to better understand the utility model, it is described in further detail below with reference to the drawings and specific implementations. Obviously, the described embodiments are only some, not all, of the embodiments of the utility model. All other embodiments obtained by those of ordinary skill in the art from these embodiments without creative effort shall fall within the protection scope of the utility model.

The terms "first", "second", "third", "fourth" and so on in the description, claims and drawings of this application are used to distinguish different objects, not to describe a specific order. In addition, the terms "comprising" and "having" and any variants of them are intended to cover non-exclusive inclusion: a process, method, system, product or device that contains a series of steps or units is not limited to the listed steps or units, but may include steps or units that are not listed.
The present inventor found through study that, when a depth camera is combined with an inertial measurement unit to capture human motion information, current technical schemes cannot meet the practical need for a simple and effective way to obtain an accurate description of the relative attitude between the depth camera and the inertial measurement unit.

Three technical schemes exist in the prior art. The first calibrates the relative pose of a general camera and an inertial measurement unit; it can obtain a suboptimal solution but needs the assistance of a calibration board. The second applies intelligent algorithms such as the extended Kalman filter to let the relative pose of the two sensors' coordinate systems converge to the actual value; however, since the positional relationship between depth camera and inertial measurement unit is fixed — the two sensors are mounted on the same platform — it is more suitable for robot localization and navigation. In the third, the positional relationship between depth camera and inertial measurement unit is not fixed, i.e. the two are located on carriers that move relative to each other, and their relative attitude is calibrated by direct manual observation and estimation; the inevitable drawbacks of manual operation make the precision hard to guarantee.
In view of this, the application obtains the first displacement information collected, while the target object performs a rigid motion, by the depth camera fixed at one side of the application scene to be calibrated; obtains the angular velocity and acceleration collected over the same period by the inertial measurement unit, which moves freely relative to the depth camera, while the target object performs the rigid motion, so as to obtain the target object's second displacement information in three-dimensional space; and computes, from the first and second displacement information according to the calibration principle, the relative-attitude rotation matrix of the depth camera and the inertial measurement unit, thereby calibrating their relative attitude.
Based on the above technical scheme of the utility model embodiment, some possible application scenarios involved are first introduced by example with reference to Fig. 1, which is a block schematic diagram of an illustrative example provided by the utility model embodiment.

As shown in Fig. 1, a calibration system of depth camera and inertial measurement unit relative attitude includes a depth camera, an inertial measurement unit (IMU) and a computer. The depth camera is fixed at one side of the application scene and obtains three-dimensional spatial information of the moving human body part. The inertial measurement unit is bundled to the moving part of the human body and measures the angular velocity and acceleration of that body part during the same period; integrating the angular velocity and acceleration yields the current attitude information. The computer is connected to the depth camera and the inertial measurement unit respectively; it obtains the first displacement information collected by the depth camera while the target object performs a displacement movement in any direction according to a preset action sequence, and the second displacement information collected from the target object by the inertial measurement unit over the corresponding period, and then computes the relative-attitude rotation matrix of the depth camera and the inertial measurement unit from the first and second displacement information according to the calibration principle.

It should be noted that the above application scenario is shown merely to ease understanding of the idea and principle of the application; the embodiments of the application are unrestricted in this regard and can be applied to any applicable scenario.
Having described the technical scheme of the utility model embodiment, the various non-restrictive implementations of the application are described in detail below.

Referring first to Fig. 2, a structural schematic diagram of the calibration system of depth camera and inertial measurement unit relative attitude under one specific implementation, the utility model embodiment may include the following:

a depth camera 101, an inertial measurement unit 102, a displacement-information acquisition module 103 and a relative-attitude calibration module 104.
The depth camera 101 obtains three-dimensional spatial information of the target object, is fixed at one side of the application scene to be calibrated, and is connected to the displacement-information acquisition module 103 by wire.

The depth camera 101 is fixed at one side of the current application scene to be calibrated; for example, placed statically in a scene of human–robot interaction, its fixed position can comprehensively capture the whole action of the current person.

The depth camera 101 is an optical sensor that captures three-dimensional spatial information of the target object; unlike a common camera, it can obtain the depth information of the target object. "Depth camera" refers to, but is not limited to, optical sensors that obtain three-dimensional spatial information of an object based on binocular vision, structured light, or time of flight.
The target object is an object that satisfies rigid motion. Rigid motion can be understood as an affine transformation that keeps length, angle, area and so on unchanged, i.e. preserves inner products and measures; in terms of coordinate transformations it corresponds to rotation by an orthogonal matrix whose determinant is 1. For example, the movement of each individual part of the human body is a rigid motion, whereas the movement of a person as a whole is non-rigid. In addition, quantities with physical significance, such as gradient, divergence and curl, all remain unchanged under rigid-body transformation.
For example, the depth camera 101 can obtain the spatial position of the palm-center joint in real time; subtracting the averaged positions of the static phases before and after the movement yields the motion vector, i.e. the first displacement information.
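The averaging-and-subtracting step just described can be sketched as follows. This is a minimal illustration under our own naming; the function, its arguments, and the fixed-size static windows are assumptions for exposition, not from the patent.

```python
# Hypothetical sketch: derive the depth camera's motion ("first displacement")
# vector from per-frame palm-joint positions by averaging the positions over
# the static phases before and after the movement, then subtracting the means.

def motion_vector(positions, pre_static, post_static):
    """positions: list of (x, y, z) samples; pre_static / post_static:
    (start, end) index ranges covering the two static phases."""
    def mean_of(rng):
        s, e = rng
        window = positions[s:e]
        n = len(window)
        return tuple(sum(p[i] for p in window) / n for i in range(3))

    p0 = mean_of(pre_static)    # averaged position before the movement
    p1 = mean_of(post_static)   # averaged position after the movement
    return tuple(b - a for a, b in zip(p0, p1))  # first displacement info
```

Averaging over the static windows suppresses per-frame jitter in the joint estimate, which is why the patent's action sequence keeps the hand still before and after each motion.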
The inertial measurement unit 102 measures the angular velocity and acceleration of the target object in three-dimensional space, can move freely relative to the depth camera 101, and is connected to the displacement-information acquisition module 103, either by wire or wirelessly.

The inertial measurement unit 102 collects the angular velocity and acceleration of the target object performing the rigid motion over the same period, so as to obtain the target object's second displacement information in three-dimensional space.

That the inertial measurement unit 102 moves freely relative to the depth camera 101 means that while the depth camera is placed statically in the application scene, the inertial measurement unit is not fixed and can move freely. Optionally, the inertial measurement unit can be bound to the target object so that it moves as the target object moves; for example, when the target object is a person's hand, the inertial measurement unit can be bound to the hand and move with it in space.
"Inertial measurement unit" refers to, but is not limited to, a sensor integrating a multi-axis accelerometer, a multi-axis gyroscope and a magnetometer that can directly measure the angular velocity and acceleration of an object in three-dimensional space. Integrating the angular velocity and acceleration yields the object's attitude information in three-dimensional space, from which the displacement information of the target object can be obtained.
The displacement-information acquisition module 103 is connected to the relative-attitude calibration module 104 and obtains the first displacement information collected by the depth camera while the target object performs a displacement movement in any direction according to a preset action sequence, and the second displacement information collected from the target object by the inertial measurement unit over the corresponding period.

The first and second displacement information are acquired at the same time by different sampling instruments while the same target object moves according to the same action sequence: the first displacement information is the information captured by the depth camera, the second displacement information is the information captured by the inertial measurement unit.
For example, from the acceleration information of the palm-center movement and its relative attitude with respect to the geomagnetic coordinate system, the motion vector collected by the inertial measurement unit can be constructed. First the acceleration is mapped from the inertial measurement unit's coordinate system I to the geomagnetic coordinate system G:

A^G = R_I^G · A^I − g

where A^G is the acceleration vector in G; R_I^G is the relative-attitude rotation matrix between the inertial-measurement-unit coordinate system and the geomagnetic coordinate system, which can be read directly from the inertial measurement unit or obtained by integrating the angular-velocity components; A^I is the acceleration vector read from the inertial measurement unit; and g is the gravity in G, taken as 9.8 m/s² and written (suitably scaled to the sensor's units) as the vector g = [0, 0, 9.8]^T. The resultant acceleration in G is then compared with a given threshold to obtain the start and stop points of the motion segment, and the acceleration of each moment within the segment is iterated to obtain the displacement:

V_i^G = V_{i−1}^G + A_i^G · T_i,   D^G = Σ_i V_i^G · T_i

where V_i^G denotes the velocity vector at moment i and T_i the time interval from moment i−1 to moment i (the time at which the system reads data in each cycle may drift). The velocity at each moment is first obtained by iterating the acceleration values, and the motion vector up to moment i is then accumulated; iterating in this way to the end of the motion segment yields the motion vector recorded by the inertial measurement unit in G. If detection of the inertial measurement unit's motion segment fails, so that motion-vector extraction fails, the first displacement information and second displacement information are re-acquired.
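The mapping and double-integration steps above can be sketched as follows. This is a minimal illustration under stated assumptions — the names are ours, and the segment is assumed to start from rest (the static phase), which the patent implies but does not state explicitly.

```python
import numpy as np

# Hedged sketch: rotate each body-frame acceleration sample into the
# geomagnetic frame G with the IMU-reported attitude R_I^G, remove gravity
# g = [0, 0, 9.8]^T, then iterate velocity and displacement over the segment.

G_VEC = np.array([0.0, 0.0, 9.8])  # gravity in frame G, m/s^2

def integrate_segment(acc_body, rotations, dt):
    """acc_body: (N, 3) accelerations in IMU frame I; rotations: list of N
    3x3 matrices R_I^G; dt: (N,) time intervals T_i (may drift per cycle)."""
    v = np.zeros(3)       # V_0^G: the segment starts from rest
    disp = np.zeros(3)    # accumulated motion vector D^G
    for a_i, r_i, t_i in zip(acc_body, rotations, dt):
        a_g = r_i @ a_i - G_VEC    # A^G = R_I^G * A^I - g
        v = v + a_g * t_i          # V_i^G = V_{i-1}^G + A_i^G * T_i
        disp = disp + v * t_i      # D^G accumulates V_i^G * T_i
    return disp
```

Because each step multiplies accumulated error by another T_i, this integration is only usable over the short, threshold-delimited motion segments the patent describes, not for long-term dead reckoning.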
The relative-attitude calibration module 104 computes, from the first displacement information and the second displacement information and according to the calibration principle, the relative-attitude rotation matrix of the depth camera and the inertial measurement unit.

The calibration is based on the invariance of a rigid-motion description under different Cartesian measurement coordinate systems: the depth camera establishes a three-dimensional coordinate system C, the inertial measurement unit establishes a coordinate system I, and the geomagnetic field gives a coordinate system G (see Fig. 3); the description of the rigid motion is identical in each.

The calibration target is to determine the attitude transformation R_C^I between C and I. The magnetometer in a 9-axis inertial measurement unit can automatically calibrate the relative attitude R_I^G between its coordinate system I and the geomagnetic coordinate system G when no external magnetic field interferes, so R_I^G is a known quantity. From the coordinate-transformation relationship R_C^I = (R_I^G)^(−1) · R_C^G it follows that solving for R_C^I can be converted into solving for R_C^G, i.e. determining the attitude description of the depth camera in the geomagnetic coordinate system.
A calibration solving model is constructed from the first displacement information and the second displacement information and solved by least squares to obtain the relative-attitude rotation matrix of the depth camera and the inertial measurement unit. Specifically, this may be:

constructing a first displacement-vector matrix from the motion vectors corresponding to multiple groups of first displacement information, and a second displacement-vector matrix from the motion vectors corresponding to multiple groups of second displacement information;

constructing the calibration solving model from the first and second displacement-vector matrices, based on the invariance of the rigid-motion description under different Cartesian measurement coordinate systems;

solving the calibration solving model by a preset algorithm to obtain the relative-attitude rotation matrix of the depth camera and the inertial measurement unit.

The multiple groups of first displacement information are collected repeatedly by the depth camera while the target object performs displacement movements in any direction according to the preset action sequence; the multiple groups of second displacement information are collected from the target object by the inertial measurement unit over the corresponding periods.
The action sequence is the trajectory of the target object's movement, with a specific starting point and direction of motion; it needs to be able to yield one segment of displacement information in any direction in space together with the end-position information. It can be a linear movement with displacement, or a curvilinear movement — this does not affect the realization of the application. For example: static 2 s — move — static 2 s, where the hand is kept still long enough for the sensors to determine the start and end positions of the hand movement.
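The "static — move — static" pattern lets the motion segment be found by thresholding the gravity-compensated acceleration magnitude, as the integration example above assumed. A minimal sketch follows; the threshold value and names are our assumptions, not the patent's.

```python
import numpy as np

# Illustrative start/stop detection for the "static 2 s -- move -- static 2 s"
# action sequence: threshold the magnitude of the gravity-compensated
# acceleration in frame G to find the first contiguous motion span.

def motion_segment(acc_g, thresh=0.2):
    """acc_g: (N, 3) gravity-compensated accelerations in frame G.
    Returns (start, end) sample indices of the motion, or None on failure."""
    mag = np.linalg.norm(acc_g, axis=1)
    active = np.where(mag > thresh)[0]
    if active.size == 0:
        return None  # detection failed -> re-acquire both displacement infos
    return int(active[0]), int(active[-1]) + 1
```

Returning None here corresponds to the failure case in the text: when no motion segment is detected, the first and second displacement information are simply re-acquired.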
For example, the hand moves along an arbitrary direction in space according to a certain action sequence as the input of the system, and does so three times. Upper-computer software records the acceleration and Euler-angle information produced by the inertial measurement unit and the hand-position information from the depth camera, together with the timestamps of the data; the depth camera and the inertial measurement unit thus simultaneously obtain the motion vectors of the same hand movement. With the three groups of motion vectors obtained, the rotation matrix can be solved from:

M_G = R_C^G · M_C,   i.e.   R_C^G = M_G · (M_C)^(−1)

where M_G denotes the 3×3 matrix formed by any 3 motion vectors in the geomagnetic coordinate system, of the form M_G = [D_1^G, D_2^G, D_3^G], each D_i^G being a 3×1 displacement column vector in the geomagnetic coordinate system; similarly, M_C is the displacement matrix in the depth camera's coordinate system, represented the same way as M_G; and R_C^G is the rotation matrix between the depth camera coordinate system C and G. But since the sensors' measurement data always carry deviations, the rotation matrix solved from the above formula sometimes deviates greatly.
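The direct three-vector solution just quoted can be sketched in a few lines. It is exact only for noise-free data, which is precisely the weakness the text points out; function and variable names are illustrative.

```python
import numpy as np

# Minimal sketch of the direct solution: stack three matched displacement
# vectors as columns of M_G and M_C, then R_C^G = M_G * M_C^{-1}.
# Requires the three displacements to be linearly independent.

def direct_rotation(disp_g, disp_c):
    """disp_g, disp_c: three matched 3-vectors in frames G and C."""
    M_G = np.column_stack(disp_g)    # each displacement is one column
    M_C = np.column_stack(disp_c)
    return M_G @ np.linalg.inv(M_C)  # from M_G = R_C^G * M_C
```

With measurement noise, the result of this inversion is generally not even orthogonal, which motivates the least-squares formulation that follows.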
Instead, the calibration solving model M_G = R_C^G · M_C is constructed based on the invariance of the rigid-motion description under different Cartesian measurement coordinate systems, and the relative-attitude rotation matrix R_C^G between the depth camera and the geomagnetic coordinate system is computed by least squares.

From the coordinate-transformation relationship shown in Fig. 3, the attitude of the inertial-measurement-unit coordinate system in the depth camera coordinate system is described as:

R_I^C = (R_C^G)^(−1) · R_I^G

where R_I^G is the relative-attitude rotation matrix between the coordinate system on the inertial measurement unit's board and the geomagnetic coordinate system; the relative-attitude matrix between the depth camera and the inertial measurement unit can thus be obtained from the above formula.
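The patent specifies a least-squares solve but not a particular algorithm. One standard realization of the best-fit rotation between two sets of matched displacement vectors is the SVD-based Kabsch/Wahba construction, sketched below under our own naming; it is offered as an assumption-laden illustration, not as the patent's method.

```python
import numpy as np

# Hedged sketch of the least-squares step: given N > 3 matched displacement
# pairs, find the proper rotation R minimizing sum ||d_g - R d_c||^2 via SVD.

def fit_rotation(disp_c, disp_g):
    """disp_c, disp_g: (N, 3) matched displacements in frames C and G.
    Returns the proper rotation R_C^G with disp_g ~= disp_c @ R.T."""
    H = np.asarray(disp_c).T @ np.asarray(disp_g)  # 3x3 correlation matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflection
    return Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
```

Unlike the direct inversion, this always returns an orthogonal matrix with determinant 1, and averaging over many displacement pairs suppresses the per-measurement deviations the text warns about.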
It should be noted that the displacement-information acquisition module and the relative-attitude calibration module can be executed by a computer, which refers to, but is not limited to, a personal computer or embedded system with computing capability. The depth camera is wired to the computer and can therefore transmit data to it in real time; the inertial measurement unit is connected to the computer by wire or wirelessly and can likewise transmit data in real time; no direct connection or data sharing is needed between the depth camera and the inertial measurement unit. The computer runs independently developed upper-computer software responsible for obtaining and processing the data of the depth camera and the inertial measurement unit, and for solving their relative-attitude rotation matrix according to the calibration principle.
In the technical scheme provided by the utility model embodiment, the displacement of the target object moving in three-dimensional space is recorded by the fixed depth camera and by the freely movable inertial measurement unit, and the relative-attitude rotation matrix of the depth camera and the inertial measurement unit is computed from this displacement information according to the calibration principle. The whole relative-attitude calibration process is easy to operate and highly precise, requires no additional calibration equipment, and is contactless, which helps improve the efficiency and accuracy of the relative-attitude calibration of depth camera and inertial measurement unit, providing inspiration and accurate calibration information for somatosensory-interaction fields such as remote robot manipulation and motion-sensing game device alignment.
In one specific implementation, the presentation form of the relative attitude of depth camera and inertial measurement unit differs between application scenarios. In view of this, building on the above embodiment and referring to Fig. 4, the present application also provides another embodiment, which may further include:

a relative-attitude representation module 105, connected to the relative-attitude calibration module 104, for converting the relative attitude of the depth camera and the inertial measurement unit into a preset attitude representation according to the relative-attitude rotation matrix.
The attitude representation depends on the current application scenario or on the user's demand; the application places no restriction on it. For example, according to the relative-attitude rotation matrix, the relative attitude of the depth camera and the inertial measurement unit can be converted into the Euler-angle representation, so that roll angle, pitch angle and yaw angle express the relative attitude of the depth camera and the inertial measurement unit.
As in the example above, after the calibration process is completed, the output is the 3 parameters of the Euler-angle representation obtained from the solved relative-attitude rotation matrix, i.e. roll angle, pitch angle and yaw angle.
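The matrix-to-Euler-angle conversion can be sketched as follows. The patent does not fix an angle convention, so this illustration assumes the common Z-Y-X (yaw–pitch–roll) factorization; the function name is ours.

```python
import math

# Illustrative conversion of the calibrated rotation matrix into the three
# Euler-angle parameters (roll, pitch, yaw), assuming the Z-Y-X convention
# R = Rz(yaw) * Ry(pitch) * Rx(roll).

def euler_from_matrix(R):
    """R: 3x3 rotation as nested lists. Returns (roll, pitch, yaw) in radians."""
    pitch = math.asin(max(-1.0, min(1.0, -R[2][0])))  # clamp for round-off
    roll = math.atan2(R[2][1], R[2][2])
    yaw = math.atan2(R[1][0], R[0][0])
    return roll, pitch, yaw
```

Near pitch = ±90° this factorization loses one degree of freedom (gimbal lock), so a display module would normally special-case that region.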
Converting between different presentation forms improves the applicability of the technical scheme and helps improve the user experience.
In one specific implementation, referring to Fig. 4, the present application also provides another embodiment, which may further include:

a display 106, connected to the relative-attitude calibration module 104, for displaying the relative attitude of the depth camera and the inertial measurement unit in the preset attitude representation.
Of course, when the system includes the relative-attitude representation module 105, the display 106 can also be connected to the relative-attitude representation module 105.

With the display added, the user can view the description of the relative attitude of depth camera and inertial measurement unit more intuitively, which benefits the user experience.
The technical solution that the application is understood in order to which those skilled in the art are clearer additionally provides specific example, tool
Body is:
The operator stands in front of the depth camera and performs the following action sequence: the hand hangs down naturally; the hand is raised forward with the palm facing forward and the arm kept as parallel to the ground as possible; one step is taken forward; and the hand is lowered naturally. The data of the depth camera and the inertial measurement unit are transmitted to a host computer, which is responsible for coordinating the operation of each sensor and recording the sensor data at each moment. The curve shown in Fig. 5 indicates the spatial position information along the Z-direction of the geomagnetic coordinate system; this information is obtained by converting the spatial position of the palm measured by the depth camera using the relative attitude transition matrix obtained above. Stages 1, 2 and 3 in Fig. 5 correspond to raising the hand, stepping forward and lowering the hand, respectively; the other periods are quiescent stages. The position to which the hand is raised takes a marker on the wall as the reference point, the marker being set in advance according to the arm length of the experimenter. The length from the experimenter's palm to the shoulder is about 63 cm, and when the arm is raised as parallel to the ground as possible with the palm facing forward, the palm is about 5 cm higher than the shoulder; therefore the vertical height between the marker position with the hand hanging down and the marker position after the hand is raised is 68 cm. It can be seen that the solid line, obtained by converting the position information from the depth-camera coordinate system into the geomagnetic coordinate system according to the calibrated rotation matrix, shows a position difference between after and before raising the hand of 68 to 70 cm, and a position difference between after lowering the hand and before raising the hand of 0.81 cm, which demonstrates that the calibration result is feasible. By contrast, the dotted line is processed using an uncalibrated, manually read transition matrix: its vertical drop is between 78 and 79 cm, and the position deviation before and after lowering the hand is 5.42 cm. The calibrated result is clearly better than the uncalibrated one, highlighting the practical value of the utility model.
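The example above pairs displacements seen by the depth camera with displacements integrated from the inertial measurement unit and solves for the relative attitude rotation matrix. The patent does not spell out the "calibration principle" in this excerpt; one standard way to recover a rotation from such paired displacement vectors is the SVD-based (Kabsch) method, sketched below under that assumption, with synthetic data standing in for real sensor measurements.

```python
import numpy as np

def estimate_relative_rotation(disp_cam, disp_imu):
    """Estimate the rotation R that best maps IMU-frame displacements onto
    camera-frame displacements (disp_cam[i] ~= R @ disp_imu[i]) via the
    SVD-based Kabsch method. Inputs are (N, 3) arrays of paired vectors."""
    H = disp_imu.T @ disp_cam            # 3x3 cross-covariance of the pairs
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])           # guard against an improper reflection
    return Vt.T @ D @ U.T

# Synthetic check: displacements related by a known 90-degree yaw rotation
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])
rng = np.random.default_rng(0)
d_imu = rng.normal(size=(20, 3))         # stand-in for IMU displacements
d_cam = d_imu @ R_true.T                 # stand-in for depth-camera displacements
R_est = estimate_relative_rotation(d_cam, d_imu)
print(np.allclose(R_est, R_true))  # True
```

With real data the displacements are noisy, so this least-squares formulation (rather than solving from a single pair) is what makes a motion sequence like the one above useful for calibration.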
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for identical or similar parts the embodiments may be referred to one another. Since the apparatus disclosed in an embodiment corresponds to the method disclosed in that embodiment, its description is relatively brief, and relevant details may be found in the description of the method.
Those skilled in the art will further appreciate that the units and algorithm steps described in connection with the embodiments disclosed herein can be implemented in electronic hardware, in computer software, or in a combination of the two. In order to clearly illustrate the interchangeability of hardware and software, the composition and steps of each example have been described above generally in terms of their functions. Whether these functions are implemented in hardware or in software depends on the specific application and the design constraints of the technical solution. Skilled artisans may use different methods to implement the described functions for each particular application, but such implementations should not be considered to be beyond the scope of the utility model.
The steps of the methods or algorithms described in connection with the embodiments disclosed herein may be implemented directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in random access memory (RAM), memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
A calibration system for the relative attitude of a depth camera and an inertial measurement unit provided by the utility model has been described in detail above. Specific examples have been used herein to explain the principles and embodiments of the utility model; the above description of the embodiments is merely intended to help understand the method of the utility model and its core idea. It should be noted that those of ordinary skill in the art may also make several improvements and modifications to the utility model without departing from its principles, and such improvements and modifications also fall within the protection scope of the claims of the utility model.
Claims (6)
1. A calibration system for the relative attitude of a depth camera and an inertial measurement unit, characterized by comprising:
a depth camera, an inertial measurement unit, a displacement information acquisition module and a relative attitude calibration module;
wherein the depth camera is used to obtain three-dimensional spatial information of a target object, is fixed at one side of the application scenario to be calibrated, and is connected with the displacement information acquisition module by a wired connection;
the inertial measurement unit is used to measure the angular velocity and acceleration of the target object in three-dimensional space, moves freely relative to the depth camera, and is connected with the displacement information acquisition module;
the displacement information acquisition module is connected with the relative attitude calibration module and is used to obtain first displacement information acquired by the depth camera when the target object moves with a displacement in any direction according to a preset action sequence, and second displacement information acquired for the target object by the inertial measurement unit in the corresponding period;
the relative attitude calibration module is used to calculate the relative attitude rotation matrix of the depth camera and the inertial measurement unit according to the first displacement information and the second displacement information, using a calibration principle.
2. The calibration system for the relative attitude of a depth camera and an inertial measurement unit according to claim 1, characterized in that the inertial measurement unit is bound to a moving part of the target object, so as to move as the target object moves.
3. The calibration system for the relative attitude of a depth camera and an inertial measurement unit according to claim 2, characterized in that the inertial measurement unit integrates a multi-axis accelerometer, a multi-axis gyroscope or a magnetometer.
4. The calibration system for the relative attitude of a depth camera and an inertial measurement unit according to claim 3, characterized in that the inertial measurement unit is connected with the displacement information acquisition module by a wireless connection.
5. The calibration system for the relative attitude of a depth camera and an inertial measurement unit according to any one of claims 1-4, characterized by further comprising:
a relative attitude representation module, connected with the relative attitude calibration module, for converting the relative attitude of the depth camera and the inertial measurement unit according to a preset attitude representation on the basis of the relative attitude rotation matrix.
6. The calibration system for the relative attitude of a depth camera and an inertial measurement unit according to claim 5, characterized by further comprising:
a display, connected with the relative attitude calibration module, for displaying the relative attitude of the depth camera and the inertial measurement unit according to a preset attitude representation.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201720973935.7U CN207923150U (en) | 2017-08-04 | 2017-08-04 | A kind of calibration system of depth camera and Inertial Measurement Unit relative attitude |
Publications (1)
Publication Number | Publication Date |
---|---|
CN207923150U true CN207923150U (en) | 2018-09-28 |
Family
ID=63611543
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201720973935.7U Active CN207923150U (en) | 2017-08-04 | 2017-08-04 | A kind of calibration system of depth camera and Inertial Measurement Unit relative attitude |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN207923150U (en) |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109685852A (en) * | 2018-11-22 | 2019-04-26 | 上海肇观电子科技有限公司 | The scaling method of camera and inertial sensor, system, equipment and storage medium |
CN109798891A (en) * | 2019-01-25 | 2019-05-24 | 上海交通大学 | Inertial Measurement Unit calibration system based on high-precision motion capture system |
CN111750850B (en) * | 2019-03-27 | 2021-12-14 | 杭州海康威视数字技术股份有限公司 | Angle information acquisition method, device and system |
CN111750850A (en) * | 2019-03-27 | 2020-10-09 | 杭州海康威视数字技术股份有限公司 | Angle information acquisition method, device and system |
CN110928432A (en) * | 2019-10-24 | 2020-03-27 | 中国人民解放军军事科学院国防科技创新研究院 | Ring mouse, mouse control device and mouse control system |
CN110928432B (en) * | 2019-10-24 | 2023-06-23 | 中国人民解放军军事科学院国防科技创新研究院 | Finger ring mouse, mouse control device and mouse control system |
CN112272757A (en) * | 2019-11-22 | 2021-01-26 | 深圳市大疆创新科技有限公司 | External parameter calibration method and device for detection device and movable platform |
CN111060138A (en) * | 2019-12-31 | 2020-04-24 | 上海商汤智能科技有限公司 | Calibration method and device, processor, electronic equipment and storage medium |
WO2021134960A1 (en) * | 2019-12-31 | 2021-07-08 | 上海商汤智能科技有限公司 | Calibration method and apparatus, processor, electronic device, and storage medium |
CN111060138B (en) * | 2019-12-31 | 2022-01-28 | 上海商汤智能科技有限公司 | Calibration method and device, processor, electronic equipment and storage medium |
CN111240469A (en) * | 2019-12-31 | 2020-06-05 | 北京诺亦腾科技有限公司 | Calibration method and device for hand motion capture, electronic device and storage medium |
CN112577518A (en) * | 2020-11-19 | 2021-03-30 | 北京华捷艾米科技有限公司 | Inertial measurement unit calibration method and device |
CN113776556A (en) * | 2021-05-30 | 2021-12-10 | 南京理工大学 | Data fusion-based gyroscope and camera relative position matrix calibration method |
CN113776556B (en) * | 2021-05-30 | 2024-05-07 | 南京理工大学 | Gyroscope and camera relative position matrix calibration method based on data fusion |
CN113392909A (en) * | 2021-06-17 | 2021-09-14 | 深圳市睿联技术股份有限公司 | Data processing method, data processing device, terminal and readable storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN207923150U (en) | A kind of calibration system of depth camera and Inertial Measurement Unit relative attitude | |
CN107314778A (en) | A kind of scaling method of relative attitude, apparatus and system | |
Panahandeh et al. | Vision-aided inertial navigation based on ground plane feature detection | |
CN104748751B (en) | The calculation method of attitude matrix and positioning navigation method based on attitude matrix | |
CN104658012B (en) | Motion capture method based on inertia and optical measurement fusion | |
CN109313417A (en) | Help robot localization | |
JP4989660B2 (en) | Motion capture device and method related thereto | |
CN102023700B (en) | Three-dimensional man-machine interaction system | |
Roth et al. | Moving Volume KinectFusion. | |
CN201514612U (en) | Three-dimensional dynamic positioning equipment | |
CN111462231B (en) | Positioning method based on RGBD sensor and IMU sensor | |
Tian et al. | Accurate human navigation using wearable monocular visual and inertial sensors | |
CN105608421B (en) | A kind of recognition methods of human action and device | |
CN104834917A (en) | Mixed motion capturing system and mixed motion capturing method | |
JP4743818B2 (en) | Image processing apparatus, image processing method, and computer program | |
CN108846857A (en) | The measurement method and visual odometry of visual odometry | |
CN103314274A (en) | Method and system for estimating a path of a mobile element or body | |
CN109461208A (en) | Three-dimensional map processing method, device, medium and calculating equipment | |
CN109752003A (en) | A kind of robot vision inertia dotted line characteristic positioning method and device | |
CN110388919B (en) | Three-dimensional model positioning method based on feature map and inertial measurement in augmented reality | |
CN110533719B (en) | Augmented reality positioning method and device based on environment visual feature point identification technology | |
CN106574836A (en) | A method for localizing a robot in a localization plane | |
CN107014377A (en) | A kind of multifunction shoe pads based on inertial positioning | |
CN109669533A (en) | A kind of motion capture method, the apparatus and system of view-based access control model and inertia | |
CN107782309A (en) | Noninertial system vision and double tops instrument multi tate CKF fusion attitude measurement methods |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
GR01 | Patent grant | ||