CN113267178A - Model pose measurement system and method based on multi-sensor fusion - Google Patents

Model pose measurement system and method based on multi-sensor fusion

Info

Publication number
CN113267178A
CN113267178A
Authority
CN
China
Prior art keywords
laser
model
scanning device
inertial navigation
navigation element
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110320356.3A
Other languages
Chinese (zh)
Other versions
CN113267178B (en)
Inventor
周一览 (Zhou Yilan)
赵帅 (Zhao Shuai)
舒晓武 (Shu Xiaowu)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU
Priority to CN202110320356.3A
Publication of CN113267178A
Application granted
Publication of CN113267178B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T90/00 Enabling technologies or technologies with a potential or indirect contribution to GHG emissions mitigation

Abstract

The invention discloses a model pose measurement system and method based on multi-sensor fusion, belonging to the field of attitude measurement. Compared with a traditional laser scanning device, the optical structure of the laser scanning device provided by the invention is optimized for the actual environment, so that it can be compactly mounted on the external glass. By establishing a model pose state equation, model pose information is calculated separately from the outputs of the inertial navigation element and the laser scanning device, and the difference between the two is used as the measurement value of an iterative extended Kalman filter to calculate the final model pose. By combining the advantages of the inertial navigation element and the laser scanning device, the invention overcomes the limitations of a single sensor and realizes high-precision model pose measurement over long working periods.

Description

Model pose measurement system and method based on multi-sensor fusion
Technical Field
The invention relates to the field of attitude measurement, in particular to a model pose measurement system and method based on multi-sensor fusion.
Background Art
The traditional model pose measurement method is contact measurement: inertial navigation elements, including an accelerometer and a high-precision gyroscope, are placed inside the measured object to realize high-speed dynamic measurement, and high-precision pose measurement can be achieved over a short time, but the result drifts as errors accumulate. Stereoscopic vision schemes have broad application prospects in three-dimensional reconstruction and measurement of space objects and have been widely applied in experiments abroad, but they generally require marker points to be attached to the surface of the measured object, offer low angle measurement accuracy, and suffer from difficult on-site system installation and on-line calibration. Therefore, a model pose measurement system and method that maintains high accuracy over long working periods, providing in real time the three-dimensional displacement and three-dimensional rotation angle of the target relative to its initial position, is urgently needed.
Accordingly, the invention provides a model pose measurement system and method based on multi-sensor fusion, addressing the problems that, in the actual application scene, the laser scanning device cannot be placed inside the room housing the model mounting table and can only be placed outside its observation window, and that a traditional laser scanning device cannot work normally there because the beam is severely attenuated when the device is placed away from the glass.
Disclosure of Invention
In order to overcome the defects of prior-art pose measurement methods, the invention aims to provide a model pose measurement system and method based on multi-sensor fusion that improves model pose measurement accuracy under long-time working conditions.
By using an inertial navigation element together with a laser scanning device, the invention combines the high short-term position and attitude measurement precision of the inertial navigation element with the marker-free operation and drift-free long-term measurement results of the laser scanning device. The inertial navigation element is placed inside the model to be tested; the laser scanning device is compactly mounted in the support structure of the external glass. A wireless communication module built into the inertial navigation element transmits data to the host computer in real time without additional cabling. An iterative extended Kalman filter fuses the inertial navigation element and the laser scanning device, improving the pose measurement precision.
The invention provides the following technical scheme:
the invention aims to provide a model pose measurement system based on multi-sensor fusion, which comprises a model mounting platform, an inertial navigation element, a laser scanning device and a data processing module;
the inertial navigation element is fixed in the model mounting table, a laser emergent port of the laser scanning device is right opposite to the model mounting table, and the inertial navigation element and the laser scanning device are in communication connection with the data processing module;
the inertial navigation element comprises a three-axis accelerometer, a three-axis fiber-optic gyroscope and a wireless communication module, and the wireless communication module is wirelessly connected with the data processing module;
the laser scanning device comprises an optical array, a collimating lens, a deflection prism and a scanning galvanometer, wherein a plurality of laser transmitting/receiving units are uniformly arranged on the optical array, laser beams emitted by the laser transmitting/receiving units sequentially pass through the collimating lens, the deflection prism and the scanning galvanometer, the laser beams are emitted to a model to be detected on a model mounting table perpendicular to a laser emitting port, and reflected light beams are received by a data processing module after returning to original paths.
The invention also aims to provide a model pose measuring method based on multi-sensor fusion, which comprises the following steps:
1) fixing a model to be tested on a model mounting table, and debugging a three-axis accelerometer, a three-axis fiber-optic gyroscope and a wireless communication module;
2) aligning a laser emergent port of the laser scanning device to a model to be tested on a model mounting table, starting the laser scanning device, emitting a laser beam by a laser emitting module, reflecting the laser beam by a beam splitter prism, and then emitting the laser beam perpendicularly to the optical array; the emergent laser beam is collimated by a collimating lens, then enters a scanning galvanometer after passing through a deflection prism, and the scanning galvanometer is debugged to enable the laser beam to be emitted to a model to be measured on a model mounting table in a way of being vertical to a laser emergent port;
3) the emitted laser is reflected by the model to be measured, returns to an optical array in the laser scanning device, and is received by the laser receiving module after passing through the light splitting prism;
4) transmitting the acquired measurement data of the inertial navigation element and the point cloud data received by the laser receiving module to the data processing module, and calculating the final pose result by combining iterative extended Kalman filtering.
Compared with the prior art, the beneficial effects of the invention are: the traditional laser scanning device is improved, the optical array maximizes the laser receiving efficiency, and high-precision, high-frame-rate close-range point cloud data can be acquired. Multi-sensor fusion with the iterative extended Kalman filter combines the advantages of the inertial navigation element and the laser scanning device, realizing high-precision pose measurement during long-term operation and providing in real time the three-dimensional displacement and three-dimensional rotation angle of the target to be measured relative to its initial position.
Drawings
FIG. 1 is a structural diagram of a model pose measurement system based on multi-sensor fusion in the embodiment;
FIG. 2 is a schematic flow chart of a model pose measurement method based on multi-sensor fusion in the embodiment;
In the figures: 1 - model mounting table, 2 - inertial navigation element, 3 - laser scanning device, 4 - data processing module, 5 - external glass, 6 - scanning galvanometer, 7 - deflection prism, 8 - collimating lens, 9 - optical array, 10 - laser emitting module, 11 - laser receiving module, 12 - beam splitter prism.
Detailed Description
The technical solution of the present invention is further described in detail below with reference to the accompanying drawings.
The invention overcomes the limitation of a single sensor by combining the advantages of the inertial navigation element and the laser scanning device, and realizes the high-precision model pose measurement under the long-time condition.
As shown in fig. 1, the model pose measurement system based on multi-sensor fusion provided by the invention mainly comprises a model mounting table 1, an inertial navigation element 2, a laser scanning device 3 and a data processing module 4. Compared with a traditional laser scanning device, the optical structure of the laser scanning device of the invention is optimized for the actual environment, so that it can be compactly mounted on the external glass.
The inertial navigation element 2 is fixed inside the model mounting table 1, a laser exit port of the laser scanning device 3 is right opposite to the model mounting table 1, and the inertial navigation element 2 and the laser scanning device 3 are in communication connection with the data processing module 4;
the inertial navigation element 2 comprises a three-axis accelerometer, a three-axis fiber-optic gyroscope and a wireless communication module, and the wireless communication module is wirelessly connected with the data processing module; in this embodiment, a triaxial quartz flexible accelerometer is used as the triaxial accelerometer.
The laser scanning device 3 comprises an optical array 9, a collimating lens 8, a deflection prism 7 and a scanning galvanometer 6. A plurality of laser emitting/receiving units are evenly arranged on the optical array; the laser beams emitted by the laser emitting/receiving units pass sequentially through the collimating lens 8, the deflection prism 7 and the scanning galvanometer 6, are emitted perpendicular to the laser exit port onto the model to be tested on the model mounting table 1, and the reflected beams, after returning along the original path, are received and processed by the data processing module 4.
In this embodiment, the laser emitting/receiving unit includes a laser emitting module 10, a laser receiving module 11, and a beam splitter prism 12; the beam splitter prism is positioned on a 45-degree bisector of two optical axes of the laser emitting module and the laser receiving module, a laser beam emitted by the laser emitting module is reflected by the beam splitter prism and then is emitted out in a direction perpendicular to the optical array, and light returned by an original path after being reflected by the model to be detected is received by the laser receiving module after penetrating through the beam splitter prism.
The laser scanning device 3 is fixed in the support structure of the external glass 5 and transmits data to the data processing module 4 through a USB 3.1 interface. The collimating lens 8 collimates the laser beam emitted by the laser emitting module 10 into a parallel beam and converges the reflected beam onto the laser receiving module 11. The deflection prism 7 deflects the collimated parallel beam by 90 degrees, reducing the optical dimension in the direction perpendicular to the external glass 5, so that the laser scanning device 3 can be compactly installed in the support structure of the external glass 5.
The laser emitting module 10 employs a vertical cavity surface emitting laser.
The laser receiving module 11 adopts a sub avalanche diode.
The scanning galvanometer 6 is a MEMS galvanometer, which reflects the deflected parallel beam into three-dimensional space and receives the beam reflected by the target.
The invention improves on the traditional laser scanning device: the optical array maximizes the laser receiving efficiency and enables acquisition of high-precision, high-frame-rate close-range point cloud data.
The specific working mode of the system is as follows:
1) fixing a model to be tested on a model mounting table, and debugging a three-axis accelerometer, a three-axis fiber-optic gyroscope and a wireless communication module;
2) aligning a laser emergent port of the laser scanning device to a model to be tested on a model mounting table, starting the laser scanning device, emitting a laser beam by a laser emitting module, reflecting the laser beam by a beam splitter prism, and then emitting the laser beam perpendicularly to the optical array; the emergent laser beam is collimated by a collimating lens, then enters a scanning galvanometer after passing through a deflection prism, and the scanning galvanometer is debugged to enable the laser beam to be emitted to a model to be measured on a model mounting table in a way of being vertical to a laser emergent port;
3) the emitted laser is reflected by the model to be measured, returns to an optical array in the laser scanning device, and is received by the laser receiving module after passing through the light splitting prism;
4) transmitting the acquired measurement data of the inertial navigation element and the point cloud data received by the laser receiving module to the data processing module, and calculating the final pose result by combining iterative extended Kalman filtering.
In this embodiment, an iterative extended Kalman filter is further adopted to realize multi-sensor fusion. By establishing a model pose state equation, model pose information is calculated separately from the outputs of the inertial navigation element and the laser scanning device, and their difference is used as the measurement value of the iterative extended Kalman filter to calculate the final model pose. This realizes high-precision pose measurement during long-term operation and provides in real time the three-dimensional displacement and three-dimensional rotation angle of the target to be measured relative to its initial position. The specific process of multi-sensor fusion with the iterative extended Kalman filter comprises the following steps:
a) establishing the system state equation by combining the measurement error of the inertial navigation element:

X(k) = F(k-1)X(k-1) + G(k-1)W(k-1)

X(k) = [δL_X, δL_Y, δL_Z, δα_X, δβ_Y, δγ_Z]^T

wherein X(k) is the six-dimensional state vector of the system at time k, F(k-1) is the system state transition matrix at time k-1, W(k-1) is the system process noise sequence at time k-1, and G(k-1) is the noise transition matrix at time k-1; δL_X, δL_Y, δL_Z respectively represent the three-axis position errors of the system, and δα_X, δβ_Y, δγ_Z the three-axis attitude errors; ε_AX, ε_AY, ε_AZ respectively represent the constant drift errors of the three-axis accelerometer in the inertial navigation element, and ε_GX, ε_GY, ε_GZ the constant drift errors of the three-axis fiber-optic gyroscope in the inertial navigation element.
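The propagation step above can be sketched numerically. The following is a minimal illustration, not the patent's implementation: F and G are placeholder near-identity matrices scaled by a time step dt (the patent does not disclose their closed form), and the drift terms are folded into the process noise vector W.

```python
import numpy as np

# Hedged sketch of the discrete error-state propagation
# X(k) = F(k-1) X(k-1) + G(k-1) W(k-1) with a 6-state error vector
# [dLx, dLy, dLz, d_alpha, d_beta, d_gamma]. F and G below are
# illustrative placeholders, not the patent's actual matrices.

def propagate_error_state(X, W, dt=0.01):
    F = np.eye(6)            # nominal: errors carry over one step unchanged
    G = np.eye(6) * dt       # drift/noise enters scaled by the time step
    return F @ X + G @ W

X = np.zeros(6)              # start with zero error
W = np.array([0.0, 0.0, 0.0, 1e-4, 1e-4, 1e-4])  # gyro-drift-like terms
for _ in range(100):         # 1 s of propagation at dt = 0.01 s
    X = propagate_error_state(X, W)
```

With constant drift-like noise, the attitude error grows linearly with time, which is exactly the long-term weakness of the inertial branch that the lidar branch corrects.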
b) taking the difference between the information solved by the inertial navigation element and by the laser scanning device as the observation vector, the observation equation is expressed as:

Z(k) = [L_INS − L_Lidar; α_INS − α_Lidar] = H(k)X(k) + V(k)

wherein Z(k) is the six-dimensional observation vector of the system at time k, V(k) is the observation noise sequence of the system at time k, and H(k) is the system observation transfer matrix at time k; L_INS and L_Lidar respectively represent the three-axis position information calculated by the inertial navigation element and by the laser scanning device; α_INS and α_Lidar respectively represent the three-axis attitude information calculated by the inertial navigation element and by the laser scanning device.
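Forming the six-dimensional observation can be illustrated with a short sketch; the numeric poses below are made-up values, and the helper name `make_observation` is ours, not the patent's.

```python
import numpy as np

# Hedged sketch: the observation Z(k) is the difference between the
# INS-derived and lidar-derived pose, stacked as [position; attitude],
# per the observation equation Z(k) = H(k) X(k) + V(k).

def make_observation(L_ins, alpha_ins, L_lidar, alpha_lidar):
    # stack [three-axis position difference; three-axis attitude difference]
    return np.concatenate([np.asarray(L_ins) - np.asarray(L_lidar),
                           np.asarray(alpha_ins) - np.asarray(alpha_lidar)])

Z = make_observation([1.00, 2.00, 0.50], [0.10, 0.02, 0.00],
                     [1.02, 1.98, 0.49], [0.09, 0.02, 0.01])
# Z is a six-dimensional vector
```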
c) obtaining the measurement data of the inertial navigation element, and updating the attitude through the equivalent rotation vector to obtain the model pose information, including the three-axis position information L_INS and the three-axis attitude information α_INS.
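As an illustration of an attitude update driven by an equivalent rotation vector, the sketch below applies the Rodrigues formula to a single gyro angular increment. This is a simplified single-sample form under our own assumptions, not the patent's full mechanization, which would also include multi-sample coning compensation.

```python
import numpy as np

# Hedged sketch: convert an equivalent rotation vector phi (the gyro
# angular increment over one sample interval) to a rotation matrix via
# the Rodrigues formula, then compose it with the current attitude.

def rotvec_to_matrix(phi):
    theta = np.linalg.norm(phi)
    if theta < 1e-12:
        return np.eye(3)
    k = phi / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])   # skew-symmetric cross-product matrix
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def update_attitude(C, phi):
    # right-multiply: rotate the body frame by the increment
    return C @ rotvec_to_matrix(phi)

C = np.eye(3)
C = update_attitude(C, np.array([0.0, 0.0, np.pi / 2]))  # 90 deg about z
```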
The three-axis position information L_Lidar and the three-axis attitude information α_Lidar are calculated by registering adjacent point-cloud frames:

(R, T) = argmin over (R, T) of Σ_{i=1}^{n} ||R·p_i + T − p'_i||²

wherein R is the rigid body transformation rotation matrix and T the translation vector; L_Lidar is taken from T and α_Lidar is extracted from R. In this embodiment, the ICP-pt2pl (point-to-plane ICP) algorithm is used as the optimization solving method of the above formula.
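For the known-correspondence case, the registration objective above has a closed-form SVD (Kabsch) solution, sketched below. The embodiment's ICP-pt2pl algorithm instead iterates a point-to-plane linearization with surface normals, so this is an illustrative stand-in rather than the patent's solver.

```python
import numpy as np

# Hedged sketch: closed-form rigid registration between two frames of
# corresponding points, returning R and T such that p'_i ~ R p_i + T.

def register_rigid(P, P2):
    cp, cq = P.mean(axis=0), P2.mean(axis=0)
    H = (P - cp).T @ (P2 - cq)            # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                    # D guards against a reflection
    T = cq - R @ cp
    return R, T

rng = np.random.default_rng(0)
P = rng.standard_normal((50, 3))          # synthetic previous frame
ang = np.pi / 6                           # known 30-degree rotation about z
R_true = np.array([[np.cos(ang), -np.sin(ang), 0.0],
                   [np.sin(ang),  np.cos(ang), 0.0],
                   [0.0, 0.0, 1.0]])
T_true = np.array([0.5, -0.2, 1.0])
P2 = P @ R_true.T + T_true                # synthetic subsequent frame
R_est, T_est = register_rigid(P, P2)
```

The translation T_est plays the role of L_Lidar and the Euler angles of R_est the role of α_Lidar in the formulas above.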
d) acquiring the point cloud data collected by the laser scanning device, respectively representing two adjacent frames of point cloud data as P = {p_1, p_2, …, p_n} and P' = {p'_1, p'_2, …, p'_n}, and registering the point clouds to obtain the model pose information; wherein p_i is the i-th point in the previous frame of point cloud data, p'_i is the i-th point in the subsequent frame of point cloud data, and n is the number of points in each frame of point cloud data.
e) solving the final pose result by iterative computation using iterative extended Kalman filtering.
The iterative calculation process of the iterative extended Kalman filter is as follows:

Initialization: let X̂^0(k) = X̂(k|k-1), the one-step prediction of the state from time k-1 to time k.

Taking the six-dimensional state vector X(k) of the system at time k as the state vector and the six-dimensional observation vector Z(k) as the observation vector, the i-th iteration computes:

H^i(k) = ∂h_k/∂X evaluated at X̂^i(k)

K^i(k) = P(k|k-1) H^i(k)^T [H^i(k) P(k|k-1) H^i(k)^T + R(k)]^(-1)

X̂^(i+1)(k) = X̂(k|k-1) + K^i(k) [Z(k) − h_k(X̂^i(k)) − H^i(k) (X̂(k|k-1) − X̂^i(k))]

wherein H^i(k) is the Jacobian matrix of the function h_k at the point X̂^i(k), h_k(·) is the locally linearized observation function, K^i(k) is the Kalman gain at time k in the i-th iteration, P(k|k-1) is the one-step prediction error covariance matrix from time k-1 to time k, the superscript T represents the transpose, R(k) represents the covariance matrix of the white measurement noise sequence, which is subject to a Gaussian distribution, X̂(k|k-1) is the one-step estimate of the state variable from time k-1 to time k, Z(k) is the measured variable value at time k, and X̂^i(k) is the state variable estimate at time k in the i-th iteration.

The nonlinear optimization is performed by the Gauss-Newton method; when ||X̂^(i+1)(k) − X̂^i(k)|| falls below the preset threshold, the iteration stops and the last (N-th) iteration result is taken as the output value:

X̂(k|k) = X̂^N(k)

P(k|k) = [I − K^N(k) H^N(k)] P(k|k-1)

wherein X̂(k|k) is the state variable estimate at time k and P(k|k) represents the error covariance matrix at time k.
Therefore, the scope of the present invention should not be limited to the disclosed embodiments, but encompasses alternatives and modifications that do not depart from the scope of the invention as defined by the claims of the present application.

Claims (10)

1. A model pose measurement system based on multi-sensor fusion is characterized by comprising a model installation platform (1), an inertial navigation element (2), a laser scanning device (3) and a data processing module (4);
the inertial navigation element (2) is fixed in the model mounting table (1), a laser emergent port of the laser scanning device (3) is right opposite to the model mounting table (1), and the inertial navigation element (2) and the laser scanning device (3) are in communication connection with the data processing module (4);
the inertial navigation element (2) comprises a three-axis accelerometer, a three-axis fiber-optic gyroscope and a wireless communication module, and the wireless communication module is wirelessly connected with the data processing module;
The laser scanning device (3) comprises an optical array (9), a collimating lens (8), a deflection prism (7) and a scanning galvanometer (6); a plurality of laser emitting/receiving units are evenly arranged on the optical array; the laser beam emitted by each laser emitting/receiving unit passes sequentially through the collimating lens (8), the deflection prism (7) and the scanning galvanometer (6), and is emitted perpendicular to the laser exit port onto the model to be tested on the model mounting table (1); the reflected beam, after returning along the original path, is received and processed by the data processing module (4).
2. The model pose measurement system based on multi-sensor fusion of claim 1, wherein the laser emitting/receiving unit comprises a laser emitting module (10), a laser receiving module (11) and a beam splitter prism (12); the beam splitter prism is positioned on a 45-degree bisector of two optical axes of the laser emitting module and the laser receiving module, a laser beam emitted by the laser emitting module is reflected by the beam splitter prism and then is emitted out in a direction perpendicular to the optical array, and light returned by an original path after being reflected by the model to be detected is received by the laser receiving module after penetrating through the beam splitter prism.
3. The multi-sensor fusion-based model pose measurement system according to claim 2, wherein the laser emitting module (10) adopts a vertical cavity surface emitting laser.
4. The multi-sensor fusion-based model pose measurement system according to claim 2, wherein the laser receiving module (11) employs a sub avalanche diode.
5. The multi-sensor fusion-based model pose measurement system of claim 1, wherein said three-axis accelerometer is a three-axis quartz flexible accelerometer.
6. The model pose measurement system based on multi-sensor fusion of claim 1, characterized in that the scanning galvanometer (6) is a MEMS galvanometer.
7. A multi-sensor fusion model pose measurement method based on the system of claim 1 is characterized by comprising the following steps:
1) fixing a model to be tested on a model mounting table (1), and debugging a three-axis accelerometer, a three-axis fiber-optic gyroscope and a wireless communication module;
2) aligning a laser emergent port of a laser scanning device (3) to a model to be tested on a model mounting table (1), starting the laser scanning device, emitting a laser beam by a laser emitting module (10), reflecting the laser beam by a beam splitter prism (12) and then emitting the laser beam perpendicularly to an optical array (9); the emergent laser beam is collimated by a collimating lens (8), then enters a scanning galvanometer (6) after passing through a deflection prism (7), and the scanning galvanometer is debugged to enable the laser beam to be emitted to a model to be measured on a model mounting table (1) in a way of being vertical to a laser emergent port;
3) the emitted laser is reflected by the model to be measured, returns to an optical array (9) in the laser scanning device (3) on the original path, and is received by a laser receiving module (11) after passing through a beam splitter prism (12);
4) transmitting the acquired measurement data of the inertial navigation element and the point cloud data received by the laser receiving module (11) to the data processing module (4), and calculating the final pose result by combining iterative extended Kalman filtering.
8. The multi-sensor fused model pose measurement method according to claim 7, wherein the step 4) comprises:
4.1) combining the measurement error of the inertial navigation element to establish the system state equation:
X(k) = F(k-1)X(k-1) + G(k-1)W(k-1)
X(k) = [δL_X, δL_Y, δL_Z, δα_X, δβ_Y, δγ_Z]^T
wherein X(k) is the six-dimensional state vector of the system at time k, F(k-1) is the system state transition matrix at time k-1, W(k-1) is the system process noise sequence at time k-1, and G(k-1) is the noise transition matrix at time k-1; δL_X, δL_Y, δL_Z respectively represent the three-axis position errors of the system, and δα_X, δβ_Y, δγ_Z the three-axis attitude errors; ε_AX, ε_AY, ε_AZ respectively represent the constant drift errors of the three-axis accelerometer in the inertial navigation element, and ε_GX, ε_GY, ε_GZ the constant drift errors of the three-axis fiber-optic gyroscope in the inertial navigation element;
4.2) taking the difference between the information solved by the inertial navigation element and by the laser scanning device as the observation vector, the observation equation being expressed as:
Z(k) = [L_INS − L_Lidar; α_INS − α_Lidar] = H(k)X(k) + V(k)
wherein Z(k) is the six-dimensional observation vector of the system at time k, V(k) is the observation noise sequence of the system at time k, and H(k) is the system observation transfer matrix at time k; L_INS and L_Lidar respectively represent the three-axis position information calculated by the inertial navigation element and by the laser scanning device; α_INS and α_Lidar respectively represent the three-axis attitude information solved by the inertial navigation element and by the laser scanning device;
4.3) obtaining the measurement data of the inertial navigation element, and updating the attitude through the equivalent rotation vector to obtain the model pose information, including the three-axis position information L_INS and the three-axis attitude information α_INS;
4.4) acquiring the point cloud data collected by the laser scanning device, respectively representing two adjacent frames of point cloud data as P = {p_1, p_2, …, p_n} and P' = {p'_1, p'_2, …, p'_n}, and registering the point clouds to obtain the model pose information; wherein p_i is the i-th point in the previous frame of point cloud data, p'_i is the i-th point in the subsequent frame of point cloud data, and n is the number of points in each frame of point cloud data;
4.5) solving the final pose result by iterative computation using the iterative extended Kalman filter.
9. The multi-sensor fusion model pose measurement method according to claim 8, wherein the three-axis position information L_Lidar and the three-axis attitude information α_Lidar in step 4.3) are calculated as:
(R, T) = argmin over (R, T) of Σ_{i=1}^{n} ||R·p_i + T − p'_i||²
wherein R is the rigid body transformation rotation matrix and T the translation vector; L_Lidar is taken from T and α_Lidar is extracted from R.
10. The multi-sensor fusion model pose measurement method according to claim 8, wherein the iterative computation process of the iterated extended Kalman filter in step 4.5) is as follows:

initialization: let

\hat{X}_k^{(0)} = \hat{X}_{k|k-1}

Taking the six-dimensional state vector X(k) of the system at time k as the state vector and the six-dimensional observation vector Z(k) of the system at time k as the observation vector, the following iterative computation is performed:

H_k^{(i)} = \left. \frac{\partial h_k(X)}{\partial X} \right|_{X = \hat{X}_k^{(i)}}

K_k^{(i)} = P_{k|k-1} (H_k^{(i)})^T \left[ H_k^{(i)} P_{k|k-1} (H_k^{(i)})^T + R_v \right]^{-1}

\hat{X}_k^{(i+1)} = \hat{X}_{k|k-1} + K_k^{(i)} \left[ Z_k - h_k(\hat{X}_k^{(i)}) - H_k^{(i)} (\hat{X}_{k|k-1} - \hat{X}_k^{(i)}) \right]

P_{k|k} = \left( I - K_k^{(i)} H_k^{(i)} \right) P_{k|k-1}

wherein H_k^{(i)} is the Jacobian matrix of the function h_k evaluated at the point \hat{X}_k^{(i)}, h_k(·) is the local linearization function, K_k^{(i)} is the Kalman gain value at time k in the ith iteration, P_{k|k-1} is the one-step prediction error covariance matrix from time k-1 to time k, the superscript T denotes the transpose, R_v denotes the covariance matrix of the white-noise sequence and v denotes the measurement noise subject to a Gaussian distribution, \hat{X}_{k|k-1} denotes the one-step estimate of the state variable from time k-1 to time k, Z_k denotes the measured variable value at time k, and \hat{X}_k^{(i)} denotes the state variable estimate at time k in the ith iteration;

nonlinear optimization is performed by the Gauss-Newton method, and when \| \hat{X}_k^{(i+1)} - \hat{X}_k^{(i)} \| falls below a preset threshold, the iteration stops and the last iteration result is taken as the output value.
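The iteration described in claim 10 matches the standard iterated-EKF measurement update: relinearize the measurement function at each iterate and stop when successive iterates agree. A minimal sketch under that reading (the function names and the toy range measurement are illustrative, not from the patent):

```python
import numpy as np

def iekf_update(x_pred, P_pred, z, h, H_jac, R_cov, tol=1e-9, max_iter=20):
    """Iterated-EKF measurement update: relinearize h at each iterate x_i,
    stop when the change between successive iterates falls below tol."""
    x_i = x_pred.copy()
    for _ in range(max_iter):
        H = H_jac(x_i)                                 # Jacobian at current iterate
        S = H @ P_pred @ H.T + R_cov                   # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)            # Kalman gain
        x_next = x_pred + K @ (z - h(x_i) - H @ (x_pred - x_i))
        done = np.linalg.norm(x_next - x_i) < tol
        x_i = x_next
        if done:
            break
    P_upd = (np.eye(len(x_pred)) - K @ H) @ P_pred     # updated covariance
    return x_i, P_upd

# toy nonlinear range measurement of a 2D position
h = lambda x: np.array([np.hypot(x[0], x[1])])
H_jac = lambda x: np.array([[x[0], x[1]]]) / np.hypot(x[0], x[1])
x_pred = np.array([1.0, 1.0])
P_pred = np.eye(2) * 0.1
z = np.array([np.sqrt(2.0)])                           # measurement consistent with x_pred
x_upd, P_upd = iekf_update(x_pred, P_pred, z, h, H_jac, np.array([[0.01]]))
```

When the measurement is strongly nonlinear, the extra inner iterations reduce linearization error compared with a single EKF update; with a measurement already consistent with the prediction, as above, the update leaves the state essentially unchanged.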
CN202110320356.3A 2021-03-25 2021-03-25 Model pose measurement system and method based on multi-sensor fusion Active CN113267178B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110320356.3A CN113267178B (en) 2021-03-25 2021-03-25 Model pose measurement system and method based on multi-sensor fusion


Publications (2)

Publication Number Publication Date
CN113267178A true CN113267178A (en) 2021-08-17
CN113267178B CN113267178B (en) 2023-07-07

Family

ID=77227901

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110320356.3A Active CN113267178B (en) 2021-03-25 2021-03-25 Model pose measurement system and method based on multi-sensor fusion

Country Status (1)

Country Link
CN (1) CN113267178B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115127548A (en) * 2022-06-28 2022-09-30 苏州精源创智能科技有限公司 Navigation system integrating inertial navigation and laser dot matrix
CN117724114A (en) * 2024-02-09 2024-03-19 深圳市奇航疆域技术有限公司 Three-dimensional laser scanning device and method based on laser range finder

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130179119A1 (en) * 2011-12-21 2013-07-11 Robotics Paradigm Systems, LLC Data collection and point cloud generation system and method
US20140253689A1 (en) * 2013-03-08 2014-09-11 Kabushiki Kaisha Topcon Measuring Instrument
CN104807479A (en) * 2015-05-20 2015-07-29 江苏华豪航海电器有限公司 Inertial navigation alignment performance evaluation method based on main inertial navigation attitude variation quantity assistance
CN105547288A (en) * 2015-12-08 2016-05-04 华中科技大学 Self-localization method and system for mobile device in underground coal mine
CN105628026A (en) * 2016-03-04 2016-06-01 深圳大学 Positioning and posture determining method and system of mobile object
CN106918830A (en) * 2017-03-23 2017-07-04 安科机器人有限公司 A kind of localization method and mobile robot based on many navigation modules
CN107167148A (en) * 2017-05-24 2017-09-15 安科机器人有限公司 Synchronous superposition method and apparatus
CN107289933A (en) * 2017-06-28 2017-10-24 东南大学 Double card Kalman Filtering guider and method based on MEMS sensor and VLC positioning fusions
US20190154449A1 (en) * 2016-07-22 2019-05-23 Regents Of The University Of Minnesota Square-root multi-state constraint kalman filter for vision-aided inertial navigation system
CN111258313A (en) * 2020-01-20 2020-06-09 深圳市普渡科技有限公司 Multi-sensor fusion SLAM system and robot
CN111367138A (en) * 2020-04-14 2020-07-03 长春理工大学 Novel laser scanning projection device
CN112083725A (en) * 2020-09-04 2020-12-15 湖南大学 Structure-shared multi-sensor fusion positioning system for automatic driving vehicle
US20210072404A1 (en) * 2019-09-05 2021-03-11 Beijing Baidu Netcom Science And Technology Co., Ltd. Method And Apparatus For Evaluating Data, Device, And Computer-Readable Storage Medium
CN112489176A (en) * 2020-11-26 2021-03-12 香港理工大学深圳研究院 Tightly-coupled graph building method fusing ESKF, g2o and point cloud matching

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
WANG GUANBEI et al.: "LIDAR/IMU calibration based on ego-motion estimation" *
ZHANG Jinglin: "Research on SLAM Technology of Service Robots Based on Vision Sensors", China Excellent Master's Theses Full-text Database, Information Science and Technology Series *



Similar Documents

Publication Publication Date Title
CN107340522B (en) Laser radar positioning method, device and system
US11782141B2 (en) Method and apparatus for automatic calibration of mobile LiDAR systems
US11035660B2 (en) Inertial dimensional metrology
EP2405284B1 (en) Self-compensating laser tracker
KR100421428B1 (en) Measurement of 6-degree-of-freedom motions of rigid bodies using a 3-facet mirror
CN108089196B (en) Optics is initiative and is fused non-cooperative target position appearance measuring device passively
JP7025156B2 (en) Data processing equipment, data processing method and data processing program
CN113267178B (en) Model pose measurement system and method based on multi-sensor fusion
Tanenhaus et al. Miniature IMU/INS with optimally fused low drift MEMS gyro and accelerometers for applications in GPS-denied environments
EP2372390B1 (en) Telescope Based Calibration of a Three Dimensional Optical Scanner
EP2353026B1 (en) Telescope based calibration of a three dimensional optical scanner
CN107490391B (en) Space-based detection reference calibration method based on high-bandwidth gyroscope measurement
RU2620288C1 (en) Method and device for determining orientation of space or air crafts
Vaidis et al. Uncertainty analysis for accurate ground truth trajectories with robotic total stations
CN117724114B (en) Three-dimensional laser scanning device and method based on laser range finder
CN114964596B (en) Multi-dimensional force sensor based on expansion optimal precision space and distributed force measuring system
CN111623775A (en) Vehicle attitude measurement system, method, device, and storage medium
WO2019085526A1 (en) Three-dimensional space-oriented positioning correcting method, combined positioning method and device
CN110954101A (en) Debugging method for laser positioning of unmanned aerial vehicle by using Vicon
CN116413010A (en) Real-time monitoring system for on-orbit visual axis change of space remote sensing camera and application method thereof
CN114646312A (en) Astronomical analysis positioning method based on starlight refraction information
CN116124112A (en) Three-dimensional gyroscope based on atomic interference and laser interference
CN117804448A (en) Autonomous system positioning method, device, computer equipment and storage medium
CN116413706A (en) Method for simultaneously establishing graph and calibrating internal reference of laser radar on mobile carrier
CN111238412A (en) Measuring method, system and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant