CN113237475A - Attitude estimation method and device based on vision and inertia measurement unit - Google Patents

Attitude estimation method and device based on vision and inertia measurement unit

Info

Publication number
CN113237475A
Authority
CN
China
Prior art keywords
imu
measurement data
data
image data
coordinate system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110509653.2A
Other languages
Chinese (zh)
Inventor
Inventor not announced
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zhongke Shenzhi Technology Co ltd
Original Assignee
Beijing Zhongke Shenzhi Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zhongke Shenzhi Technology Co ltd
Priority to CN202110509653.2A
Publication of CN113237475A
Legal status: Pending (current)

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a method and a device for attitude estimation based on a vision and inertial measurement unit, wherein the method comprises the following steps: acquiring image data of a camera and measurement data of an IMU, wherein the measurement data of the IMU comprise measurement data of an IMU arranged on the camera and measurement data of an IMU arranged on a target object; and estimating the attitude of the target object in the camera coordinate system by fusing the image data with the measurement data of the IMU. The invention simplifies the algorithm and improves the accuracy of the attitude estimation.

Description

Attitude estimation method and device based on vision and inertia measurement unit
Technical Field
The invention relates to the technical field of vision, and in particular to a method and a device for attitude estimation based on a vision and inertial measurement unit.
Background
Perspective-n-Point (PnP), also called pose estimation, addresses the problem of estimating a pose given the coordinates of n three-dimensional space points and their two-dimensional projection positions. Studies have shown that when n < 3 the PnP problem has infinitely many solutions, so research on the PnP problem is mainly directed to the case of more than three feature points.
At present, solutions to the PnP problem use only the two-dimensional pixel coordinates and the three-dimensional coordinates of the feature points, and in practice such solving algorithms have proven to be complex, low in precision, and poor in robustness.
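For reference, the projection model underlying the PnP problem can be written in standard pinhole-camera notation (the symbols below follow convention and are not taken from the original text):

$$ s_i \begin{bmatrix} u_i \\ v_i \\ 1 \end{bmatrix} = K \left( R P_i + t \right), \qquad i = 1, \dots, n $$

where \(P_i\) is the i-th feature point in the object coordinate system, \((u_i, v_i)\) is its pixel projection, \(K\) is the camera intrinsic matrix, \(s_i\) is a depth-dependent scale factor, and \(R\) and \(t\) are the rotation and translation to be estimated.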
Disclosure of Invention
The invention aims to provide a method and a device for attitude estimation based on a vision and inertial measurement unit, so as to solve the above technical problems.
To achieve this aim, the invention adopts the following technical solution:
A method for attitude estimation based on a vision and inertial measurement unit is provided, comprising:
acquiring image data of a camera and measurement data of an IMU, wherein the measurement data of the IMU comprise measurement data of an IMU arranged on the camera and measurement data of an IMU arranged on a target object;
and estimating the attitude of the target object in the camera coordinate system by fusing the image data with the measurement data of the IMU.
Further, estimating the attitude of the target object in the camera coordinate system by fusing the image data with the measurement data of the IMU includes:
acquiring the attitude of the target object in the camera coordinate system by calculating a transformation matrix from the object coordinate system to the camera coordinate system.
Further, the transformation matrix is defined as:

$$ T = \begin{bmatrix} R & t \\ \mathbf{0}^{T} & 1 \end{bmatrix} $$

in the above formula, R represents a rotation matrix;
t denotes a translation vector.
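As a one-line illustration of how this matrix is used, a point \(P_o\) expressed in the object coordinate system maps to the camera coordinate system via

$$ \begin{bmatrix} P_c \\ 1 \end{bmatrix} = T \begin{bmatrix} P_o \\ 1 \end{bmatrix}, \qquad \text{i.e.,} \quad P_c = R P_o + t. $$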
Further, estimating the attitude of the target object in the camera coordinate system by fusing the image data with the measurement data of the IMU further includes:
calibrating the image data and the measurement data of the IMU so as to improve their fusion precision.
Further, the calibration method comprises the following steps:
judging whether the image data are valid through a preset threshold;
and, when the image data are valid, fusing the image data with the measurement data of the IMU through iterated unscented Kalman filtering.
Further, the method further comprises:
acquiring angle data, wherein the angle data comprise measurement data of a tilt sensor arranged on the camera and measurement data of a tilt sensor arranged on the target object;
and providing additional constraints through the angle data to improve the accuracy of the attitude estimation.
To achieve this aim, the invention also adopts the following technical solution:
An attitude estimation device based on a vision and inertial measurement unit is provided, comprising:
an acquisition module, used for acquiring image data of a camera and measurement data of an IMU, wherein the measurement data of the IMU comprise measurement data of an IMU arranged on the camera and measurement data of an IMU arranged on a target object;
and an estimation module, used for estimating the attitude of the target object in the camera coordinate system by fusing the image data with the measurement data of the IMU.
Further, the estimation module is further configured to:
acquire the attitude of the target object in the camera coordinate system by calculating a transformation matrix from the object coordinate system to the camera coordinate system.
Further, the estimation module is further configured to calibrate the image data and the measurement data of the IMU so as to improve their fusion precision.
Further, the device further comprises:
a calibration module, used for acquiring angle data, wherein the angle data comprise measurement data of a tilt sensor arranged on the camera and measurement data of a tilt sensor arranged on the target object;
and for providing additional constraints through the angle data to improve the accuracy of the attitude estimation.
According to the invention, the camera is combined with an inertial measurement unit (IMU); because the IMU's accelerometer offers particularly high resolution, the algorithm is kept simple while the accuracy of the attitude estimation is improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required by the embodiments are briefly described below. Obviously, the drawings described below show only some embodiments of the invention, and a person skilled in the art can derive other drawings from them without inventive effort.
FIG. 1 is a flow chart of a method for estimating a pose based on a vision and inertial measurement unit according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of an attitude estimation apparatus based on a vision and inertia measurement unit according to an embodiment of the present invention.
Detailed Description
The technical solution of the invention is further explained below through specific embodiments in combination with the accompanying drawings.
The drawings are schematic, serve illustration only, and do not depict actual form; they should not be construed as limiting the present patent. To better illustrate the embodiments, some parts in the drawings may be omitted, enlarged, or reduced, and do not represent the size of an actual product. Those skilled in the art will understand that certain well-known structures and their descriptions may be omitted from the drawings.
An attitude estimation method based on a vision and inertial measurement unit provided by an embodiment of the present invention, as shown in FIG. 1, includes the following steps:
acquiring image data of a camera and measurement data of an IMU, wherein the measurement data of the IMU comprise measurement data of an IMU arranged on the camera and measurement data of an IMU arranged on a target object; the target object comprises n feature points whose positions in the object coordinate system are known, with n greater than 3;
and estimating the attitude of the target object in the camera coordinate system by fusing the image data with the measurement data of the IMU.
By combining the camera with an inertial measurement unit (IMU), whose accelerometer offers particularly high resolution, the algorithm can be kept simple and the accuracy of the attitude estimation improved.
In some embodiments, estimating the attitude of the target object in the camera coordinate system by fusing the image data with the measurement data of the IMU includes:
acquiring the attitude of the target object in the camera coordinate system by calculating a transformation matrix from the object coordinate system to the camera coordinate system, wherein the transformation matrix is defined as:

$$ T = \begin{bmatrix} R & t \\ \mathbf{0}^{T} & 1 \end{bmatrix} $$

in the above formula, R represents a rotation matrix;
t represents a translation vector, \( t = [t_x, t_y, t_z]^T \).
In this embodiment, the complete rotation matrix from the object coordinate system to the camera coordinate system can be obtained directly from the measurement data of the IMU; therefore, the attitude of the target object in the camera coordinate system can be obtained by calculating only the translation vector t.
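A minimal sketch of this step is given below. It assumes, since the text does not spell this out, that each IMU reports its orientation in a shared world frame (so the object-to-camera rotation is R = R_wc^T R_wo) and that the translation t is then recovered linearly from the n feature points; all function and variable names are hypothetical:

```python
import numpy as np

def relative_rotation(R_wc, R_wo):
    """Object-to-camera rotation, assuming both IMUs report their
    orientation in a common world frame (an assumption; the text only
    states that R is obtained from the IMU measurement data)."""
    return R_wc.T @ R_wo

def solve_translation(K, R, P_obj, pix):
    """Recover t from s_i * K^-1 * [u_i, v_i, 1]^T = R P_i + t.
    K: 3x3 camera intrinsics; R: 3x3 rotation from relative_rotation();
    P_obj: (n, 3) feature points in the object frame (n > 3 per the text);
    pix: (n, 2) pixel coordinates of their projections."""
    n = P_obj.shape[0]
    rays = (np.linalg.inv(K) @ np.column_stack([pix, np.ones(n)]).T).T  # (n, 3)
    q = (R @ P_obj.T).T                                                 # (n, 3)
    # Unknowns x = [t (3 entries), s_1..s_n]; each point gives t - s_i*ray_i = -q_i.
    A = np.zeros((3 * n, 3 + n))
    b = np.zeros(3 * n)
    for i in range(n):
        A[3 * i:3 * i + 3, 0:3] = np.eye(3)
        A[3 * i:3 * i + 3, 3 + i] = -rays[i]
        b[3 * i:3 * i + 3] = -q[i]
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x[:3]  # t = [tx, ty, tz]^T
```

With n greater than 3 the stacked system has 3n equations in n + 3 unknowns, so it is comfortably overdetermined and the least-squares solve also averages out pixel noise.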
In some embodiments, estimating the attitude of the target object in the camera coordinate system by fusing the image data with the measurement data of the IMU further comprises:
calibrating the image data and the measurement data of the IMU to improve their fusion precision and avoid divergence of the fusion process. The calibration method comprises the following steps:
judging whether the image data are valid through a preset threshold;
and, when the image data are valid, fusing the image data with the measurement data of the IMU through iterated unscented Kalman filtering, as sketched below.
Through this scheme, the accuracy of the calibration data can be effectively improved, which in turn improves the accuracy of the attitude estimation.
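The gating and fusion steps can be sketched as follows. The validity test is shown as a reprojection-error gate, and the fusion as one unscented measurement update; the iterated variant repeats such an update around the refreshed estimate. Since the exact state vector, threshold, and noise models are not specified in the text, everything here is an illustrative assumption:

```python
import numpy as np

def image_data_valid(pix, pix_pred, thresh_px=5.0):
    """Preset-threshold gate: accept the frame only if the mean reprojection
    error is below thresh_px (the threshold value is assumed)."""
    return float(np.mean(np.linalg.norm(pix - pix_pred, axis=1))) < thresh_px

def ukf_measurement_update(x, P, z, h, Rn, alpha=1e-3, kappa=0.0, beta=2.0):
    """One unscented measurement update of state mean x (n,) and covariance
    P (n, n) with measurement z, measurement function h, and noise cov Rn."""
    n = x.size
    lam = alpha ** 2 * (n + kappa) - n
    Wm = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))
    Wm[0] = lam / (n + lam)
    Wc = Wm.copy()
    Wc[0] += 1.0 - alpha ** 2 + beta
    L = np.linalg.cholesky((n + lam) * P)
    sigmas = np.vstack([x, x + L.T, x - L.T])   # 2n+1 sigma points
    Z = np.array([h(s) for s in sigmas])        # propagate through h
    z_mean = Wm @ Z
    Pzz = Rn + sum(Wc[i] * np.outer(Z[i] - z_mean, Z[i] - z_mean)
                   for i in range(2 * n + 1))
    Pxz = sum(Wc[i] * np.outer(sigmas[i] - x, Z[i] - z_mean)
              for i in range(2 * n + 1))
    Kg = Pxz @ np.linalg.inv(Pzz)
    return x + Kg @ (z - z_mean), P - Kg @ Pzz @ Kg.T
```

Only frames that pass image_data_valid would trigger the update; between valid frames the IMU measurements alone propagate the state.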
In some embodiments, the attitude estimation method based on the vision and inertial measurement unit further comprises:
acquiring angle data, wherein the angle data comprise measurement data of a tilt sensor arranged on the camera and measurement data of a tilt sensor arranged on the target object;
and providing additional constraints through the angle data to improve the accuracy of the attitude estimation.
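One way such angle data can constrain the estimate, sketched under our own assumptions (the text only states that the angles provide additional constraints), is to stack a weighted tilt residual onto the reprojection residuals of whatever estimator is used:

```python
import numpy as np

def stacked_residual(t, R, K, P_obj, pix, rel_tilt_meas, tilt_of, w=1.0):
    """Reprojection residuals plus a tilt residual. tilt_of is a hypothetical
    function mapping a rotation matrix to the relative inclination angles the
    two tilt sensors imply; rel_tilt_meas is their measured difference and w
    is an assumed weight."""
    proj = (K @ (R @ P_obj.T + t.reshape(3, 1))).T          # (n, 3)
    r_pix = (proj[:, :2] / proj[:, 2:3] - pix).ravel()      # pixel errors
    r_tilt = w * (np.atleast_1d(tilt_of(R)) - np.atleast_1d(rel_tilt_meas))
    return np.concatenate([r_pix, r_tilt])
```

A least-squares or filtering back end can then minimize this stacked residual, so poses that disagree with the inclinometer readings are penalized.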
Based on the same inventive concept, an embodiment of the present invention further provides an attitude estimation device based on a vision and inertial measurement unit, as shown in FIG. 2, including:
an acquisition module 1, used for acquiring image data of a camera and measurement data of an IMU, wherein the measurement data of the IMU comprise measurement data of an IMU arranged on the camera and measurement data of an IMU arranged on a target object; the target object comprises n feature points whose positions in the object coordinate system are known, with n greater than 3;
and an estimation module 2, used for estimating the attitude of the target object in the camera coordinate system by fusing the image data with the measurement data of the IMU.
By combining the camera with an inertial measurement unit (IMU), whose accelerometer offers particularly high resolution, the algorithm can be kept simple and the accuracy of the attitude estimation improved.
In some embodiments, the estimation module is further configured to:
acquire the attitude of the target object in the camera coordinate system by calculating a transformation matrix from the object coordinate system to the camera coordinate system, wherein the transformation matrix is defined as:

$$ T = \begin{bmatrix} R & t \\ \mathbf{0}^{T} & 1 \end{bmatrix} $$

in the above formula, R represents a rotation matrix;
t represents a translation vector, \( t = [t_x, t_y, t_z]^T \).
In this embodiment, the complete rotation matrix from the object coordinate system to the camera coordinate system can be obtained directly from the measurement data of the IMU; therefore, the attitude of the target object in the camera coordinate system can be obtained by calculating only the translation vector t.
In some embodiments, the estimation module is further configured to:
calibrate the image data and the measurement data of the IMU to improve their fusion precision and avoid divergence of the fusion process. The calibration method comprises the following steps:
judging whether the image data are valid through a preset threshold;
and, when the image data are valid, fusing the image data with the measurement data of the IMU through iterated unscented Kalman filtering.
Through this scheme, the accuracy of the calibration data can be effectively improved, which in turn improves the accuracy of the attitude estimation.
In some embodiments, the attitude estimation device based on the vision and inertial measurement unit further comprises:
a calibration module 3, used for acquiring angle data, wherein the angle data comprise measurement data of a tilt sensor arranged on the camera and measurement data of a tilt sensor arranged on the target object;
and for providing additional constraints through the angle data to improve the accuracy of the attitude estimation.
It should be understood that the above-described embodiments are merely preferred embodiments of the invention and illustrations of the technical principles applied. Those skilled in the art will understand that various modifications, equivalents, and changes may be made to the invention; such variations fall within the scope of the invention as long as they do not depart from its spirit. In addition, certain terms used in the specification and claims of the present application are not limiting but are used merely for convenience of description.

Claims (10)

1. A method for attitude estimation based on a vision and inertial measurement unit, comprising:
acquiring image data of a camera and measurement data of an IMU, wherein the measurement data of the IMU comprise measurement data of an IMU arranged on the camera and measurement data of an IMU arranged on a target object;
and estimating the attitude of the target object in the camera coordinate system by fusing the image data with the measurement data of the IMU.
2. The method of claim 1, wherein estimating the attitude of the target object in the camera coordinate system by fusing the image data with the measurement data of the IMU comprises:
acquiring the attitude of the target object in the camera coordinate system by calculating a transformation matrix from the object coordinate system to the camera coordinate system.
3. The method of claim 2, wherein the transformation matrix is defined as:

$$ T = \begin{bmatrix} R & t \\ \mathbf{0}^{T} & 1 \end{bmatrix} $$

in the above formula, R represents a rotation matrix;
t denotes a translation vector.
4. The method of claim 1, wherein estimating the attitude of the target object in the camera coordinate system by fusing the image data with the measurement data of the IMU further comprises:
calibrating the image data and the measurement data of the IMU so as to improve their fusion precision.
5. The method of claim 4, wherein the calibration method comprises:
judging whether the image data are valid through a preset threshold;
and, when the image data are valid, fusing the image data with the measurement data of the IMU through iterated unscented Kalman filtering.
6. The method of claim 4, further comprising:
acquiring angle data, wherein the angle data comprise measurement data of a tilt sensor arranged on the camera and measurement data of a tilt sensor arranged on the target object;
and providing additional constraints through the angle data to improve the accuracy of the attitude estimation.
7. An attitude estimation device based on a vision and inertial measurement unit, comprising:
an acquisition module, used for acquiring image data of a camera and measurement data of an IMU, wherein the measurement data of the IMU comprise measurement data of an IMU arranged on the camera and measurement data of an IMU arranged on a target object;
and an estimation module, used for estimating the attitude of the target object in the camera coordinate system by fusing the image data with the measurement data of the IMU.
8. The apparatus of claim 7, wherein the estimation module is further configured to:
acquire the attitude of the target object in the camera coordinate system by calculating a transformation matrix from the object coordinate system to the camera coordinate system.
9. The apparatus of claim 7, wherein the estimation module is further configured to: calibrate the image data and the measurement data of the IMU so as to improve their fusion precision.
10. The apparatus of claim 7, further comprising:
a calibration module, used for acquiring angle data, wherein the angle data comprise measurement data of a tilt sensor arranged on the camera and measurement data of a tilt sensor arranged on the target object;
and for providing additional constraints through the angle data to improve the accuracy of the attitude estimation.
CN202110509653.2A, filed 2021-05-13 (priority date 2021-05-13): Attitude estimation method and device based on vision and inertia measurement unit. Published as CN113237475A; status: Pending.

Priority Applications (1)

Application Number: CN202110509653.2A
Priority Date / Filing Date: 2021-05-13
Title: Attitude estimation method and device based on vision and inertia measurement unit

Applications Claiming Priority (1)

Application Number: CN202110509653.2A
Priority Date / Filing Date: 2021-05-13
Title: Attitude estimation method and device based on vision and inertia measurement unit

Publications (1)

Publication Number: CN113237475A
Publication Date: 2021-08-10

Family

ID=77133419

Family Applications (1)

Application Number: CN202110509653.2A (Pending)
Priority Date / Filing Date: 2021-05-13
Title: Attitude estimation method and device based on vision and inertia measurement unit

Country Status (1)

Country: CN
Publication: CN113237475A

Similar Documents

Publication | Title
CN110582798B (en) System and method for virtual enhanced vision simultaneous localization and mapping
CN106643699B (en) Space positioning device and positioning method in virtual reality system
JP4814669B2 (en) 3D coordinate acquisition device
JP6338021B2 (en) Image processing apparatus, image processing method, and image processing program
CN109155066B (en) Method for motion estimation between two images of an environmental region of a motor vehicle, computing device, driver assistance system and motor vehicle
CN108038886B (en) Binocular camera system calibration method and device and automobile
CN107833237B (en) Method and apparatus for blurring virtual objects in video
CN112083403B (en) Positioning tracking error correction method and system for virtual scene
JP6894707B2 (en) Information processing device and its control method, program
JP6061770B2 (en) Camera posture estimation apparatus and program thereof
CN111279354A (en) Image processing method, apparatus and computer-readable storage medium
JP2006234703A (en) Image processing device, three-dimensional measuring device, and program for image processing device
JP2013097675A (en) Gradient estimation device, gradient estimation method, and gradient estimation program
JP6947066B2 (en) Posture estimator
CN112017236A (en) Method and device for calculating position of target object based on monocular camera
CN109474817B (en) Optical sensing device, method and optical detection module
JP6922348B2 (en) Information processing equipment, methods, and programs
Satoh et al. A head tracking method using bird's-eye view camera and gyroscope
CN113345032B (en) Initialization map building method and system based on wide-angle camera large distortion map
CN111105467A (en) Image calibration method and device and electronic equipment
JP2019045989A (en) Information processing apparatus, information processing method and computer program
CN111145267B (en) 360-degree panoramic view multi-camera calibration method based on IMU assistance
CN113237475A (en) Attitude estimation method and device based on vision and inertia measurement unit
US11069121B2 (en) Methods, devices and computer program products for creating textured 3D images
JP2020035158A (en) Attitude estimation device and calibration system

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination