US11274788B2 - Gimbal pose correction method and device - Google Patents

Gimbal pose correction method and device

Info

Publication number
US11274788B2
US11274788B2 (Application No. US17/075,034; US202017075034A)
Authority
US
United States
Prior art keywords
pose
gimbal
compensation device
velocity
attitude
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US17/075,034
Other versions
US20210033242A1 (en)
Inventor
Xiang Zhang
Bing Li
You Zhou
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd filed Critical SZ DJI Technology Co Ltd
Assigned to SZ DJI Technology Co., Ltd. Assignment of assignors interest (see document for details). Assignors: LI, BING; ZHOU, YOU; ZHANG, XIANG
Publication of US20210033242A1
Application granted
Publication of US11274788B2
Legal status: Active

Classifications

    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F16: ENGINEERING ELEMENTS AND UNITS; GENERAL MEASURES FOR PRODUCING AND MAINTAINING EFFECTIVE FUNCTIONING OF MACHINES OR INSTALLATIONS; THERMAL INSULATION IN GENERAL
    • F16M: FRAMES, CASINGS OR BEDS OF ENGINES, MACHINES OR APPARATUS, NOT SPECIFIC TO ENGINES, MACHINES OR APPARATUS PROVIDED FOR ELSEWHERE; STANDS; SUPPORTS
    • F16M11/00: Stands or trestles as supports for apparatus or articles placed thereon; Stands for scientific apparatus such as gravitational force meters
    • F16M11/02: Heads
    • F16M11/18: Heads with mechanism for moving the apparatus relatively to the stand
    • F16M11/04: Means for attachment of apparatus; Means allowing adjustment of the apparatus relatively to the stand
    • F16M11/043: Allowing translations
    • F16M11/046: Allowing translations adapted to upward-downward translation movement
    • F16M11/06: Means allowing adjustment of the apparatus relatively to the stand, allowing pivoting
    • F16M11/10: Allowing pivoting around a horizontal axis
    • F16M11/12: Allowing pivoting in more than one direction
    • F16M2200/00: Details of stands or supports
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10: Navigation by using measurements of speed or acceleration
    • G01C21/12: Measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16: Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165: Inertial navigation combined with non-inertial navigation instruments
    • G01C21/1656: Inertial navigation combined with passive imaging devices, e.g. cameras
    • G03: PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B: APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00: Details of cameras or camera bodies; Accessories therefor
    • G03B17/56: Accessories
    • G03B17/561: Support related camera accessories
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/0094: Control involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
    • G05D1/02: Control of position or course in two dimensions
    • G05D1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268: Control specially adapted to land vehicles using internal positioning means
    • G05D1/027: Internal positioning means comprising inertial navigation means, e.g. azimuth detector
    • G05D3/00: Control of position or direction
    • G05D3/12: Control of position or direction using feedback

Definitions

  • the present disclosure relates to the technical field of gimbal control and, more particularly, to a gimbal pose correction method and device.
  • Pose estimation is one of the key problems to be solved in robot control.
  • the pose estimation refers to obtaining position, velocity, attitude, and heading information that satisfies requirements of control bandwidth, dynamic performance, stability, and accuracy according to data from various motion state sensors.
  • a system providing instant pose information is called a navigation system.
  • the navigation system generally includes an inertial navigation system, a global navigation satellite system (GNSS), a Doppler navigation system, a visual navigation system, and the like.
  • Integrated navigation technology uses a plurality of different navigation systems to measure the same information source, and extracts and corrects the errors of each navigation system using the measured values.
  • Integrated navigation technology is one of the important applications in the field of multi-sensor information fusion state estimation.
  • Inertial-GNSS integrated navigation is one of the commonly used integrated navigations.
  • Conventional inertial-GNSS integrated navigations use the north-east-down (NED) coordinate system as a navigation coordinate system, and hence need a north-pointing heading observation and generally use a geomagnetic sensor to provide a reference heading.
  • the geomagnetic sensor is susceptible to interference from electric current and magnetic field.
  • the conventional inertial-GNSS integrated navigations use latitude and longitude to represent the position, such that the GNSS needs to provide position measurement in the form of latitude and longitude. Therefore, the GNSS navigation cannot work in an indoor environment.
  • Conventional single-point GNSS receivers generally have position and velocity measurement errors at the meter level, but some applications require millimeter-level velocity control accuracy; hence, inertial-GNSS integrated navigation cannot satisfy the accuracy requirements.
  • a gimbal pose correction method including obtaining a first pose of a gimbal based on an Inertial Measurement Unit (IMU) arranged at a vertical compensation device configured to be coupled to the gimbal and compensate for movement of the gimbal in a vertical direction, obtaining a second pose of the vertical compensation device based on a vision device arranged at the vertical compensation device, and correcting the first pose according to the second pose.
  • a gimbal pose correction device including a vertical compensation device, and a vision device and an Inertial Measurement Unit (IMU) both arranged at and electrically coupled to the vertical compensation device.
  • the vertical compensation device is configured to be connected to a gimbal and compensate for movement of the gimbal in a vertical direction.
  • the vertical compensation device is further configured to obtain a first pose of the gimbal based on the IMU, obtain a second pose of the vertical compensation device based on the vision device, and correct the first pose according to the second pose.
  • FIG. 1 is a schematic structural diagram of a gimbal pose correction device consistent with embodiments of the disclosure.
  • FIG. 2 is a schematic structural diagram of another gimbal pose correction device consistent with embodiments of the disclosure.
  • FIG. 3 is a schematic structural diagram of another gimbal pose correction device consistent with embodiments of the disclosure.
  • FIG. 4 is a schematic flow chart of a gimbal pose correction method consistent with embodiments of the disclosure.
  • FIG. 5 is a schematic flow chart of another gimbal pose correction method consistent with embodiments of the disclosure.
  • Reference numerals: 1 Inertial measurement unit (IMU); 2 Vision device; 3 Main body; 31 Body; 32 Base; 33 Handheld member; 4 Axis arm; 5 Motor; 6 Angular velocity sensor
  • FIG. 1 is a schematic structural diagram of an example gimbal pose correction device consistent with the disclosure.
  • a gimbal is connected to a vertical compensation device, and the vertical compensation device can compensate for a movement of the gimbal in a vertical direction.
  • the gimbal can be mounted on a movable object, e.g., a user, an unmanned aerial vehicle (UAV), or a robot, through the vertical compensation device.
  • UAV unmanned aerial vehicle
  • the vertical compensation device can be used to compensate for the movement of the gimbal in the vertical direction.
  • the vertical movement of the gimbal can be reduced, thereby ensuring the smooth images of the camera.
  • the vertical compensation device includes a vision device 2 and an inertial measurement unit (IMU) 1 .
  • FIGS. 2 and 3 are schematic structural diagrams of other examples of the gimbal pose correction device consistent with the disclosure.
  • the vertical compensation device includes a main body 3 and an axis arm 4 connected to the gimbal.
  • the axis arm 4 can rotate to compensate for the movement of the gimbal in the vertical direction.
  • a motor 5 is arranged at a main body 3 , and the motor 5 can be configured to drive the axis arm 4 to rotate.
  • the axis arm 4 can also be driven to rotate by other driving devices.
  • the IMU 1 can be arranged at the axis arm 4 , and the IMU 1 can be arranged at an end of the axis arm 4 connected to the gimbal. In some embodiments, the IMU 1 can be arranged at any other position of the axis arm 4 .
  • the vision device 2 can be arranged at the main body 3 , and a detection direction of the vision device 2 can be upward or downward.
  • a detection direction of the vision device 2 can be upward or downward.
  • the detection direction of the vision device 2 can be approximately parallel to the vertical direction, and the detection direction of the vision device 2 can have a small tilt angle relative to the vertical direction (an angle range of the tilt angle can be set according to empirical values).
  • the vision device 2 can monitor vertically upwards or vertically downwards.
  • the main body 3 includes a body 31 and a base 32 fixedly connected to the body 31 , and the vision device 2 is arranged at the base 32 .
  • the gimbal can be mounted at a UAV, a mobile robot, or another movable device through the base 32 .
  • the vertical compensation device can compensate for the movement in the vertical direction to offset an influence of the movement in the vertical direction on the images of the camera.
  • the vertical compensation device may be configured as a handheld device and may include a handheld member 33 fixedly connected to the body 31.
  • the user can hold the handheld member 33 to drive the vertical compensation device to move as a whole.
  • the movement in the vertical direction along with a stride frequency can affect the images of the camera, and the vertical compensation device can compensate for the vertical movement to offset the influence of the movement in the vertical direction on the images of the camera.
  • a body coordinate system {b}-O_b x_b y_b z_b can be defined as follows.
  • An origin of the coordinate system, O_b, can be a geometric center of a plane at which the axis arm 4 is connected to an end of the gimbal corresponding to an axis.
  • x_b axis can be in a vertical symmetry plane of the body 31 and parallel to a bottom surface of the base 32, and can point forward.
  • y_b axis can be perpendicular to the vertical symmetry plane of the body 31 and can point to a right side of the body 31.
  • z_b axis can be in the vertical symmetry plane of the body 31 and perpendicular to the x_b axis, and can point below the body 31.
  • a base coordinate system {p}-O_p x_p y_p z_p of the base 32 can be defined as follows.
  • An origin of the coordinate system, O_p, can be a center of the axis arm 4, i.e., an intersection of a rotation center line of the axis arm 4 and the vertical symmetry plane of the body 31.
  • x_p axis can be in the vertical symmetry plane of the body 31 and parallel to the bottom surface of the base 32, and can point forward.
  • y_p axis can be perpendicular to the vertical symmetry plane of the body 31 and can point to the right side of the body 31.
  • z_p axis can be in the vertical symmetry plane of the body 31 and perpendicular to the x_p axis, and can point below the body 31.
  • a camera coordinate system can be denoted by {c}-O_c x_c y_c z_c, and a navigation coordinate system can be denoted by {n}-O_n x_n y_n z_n.
  • An origin of the navigation coordinate system, O_n, can be determined by a vertical projection of an origin of the camera coordinate system, O_c, on the ground when the system starts to work.
  • Coordinate axes of the navigation coordinate system can be determined by an output of the vision device 2.
  • the vision device 2 can output a pose of the camera coordinate system {c} relative to the navigation coordinate system {n}.
  • the vision device 2 can output a reference position P_nc^n, a reference velocity V_nc^n, and a reference attitude q_c^n of the vertical compensation device. In some other embodiments, the vision device 2 can output the reference position P_nc^n and the reference velocity V_nc^n of the vertical compensation device.
  • FIG. 4 is a schematic flow chart of an example gimbal pose correction method consistent with the disclosure.
  • An execution entity of the method may include a processor of the vertical compensation device, or an independent control device communicatively connected to the processor of the vertical compensation device.
  • a first pose of the gimbal is obtained based on the IMU 1 .
  • the first pose may include the velocity, position, and attitude of the gimbal.
  • the IMU 1 may include a gyroscope and an accelerometer.
  • the gyroscope can include a three-axis gyroscope
  • the accelerometer can include a three-axis accelerometer.
  • the process at S 401 can include obtaining an angular velocity of the gimbal based on the gyroscope, obtaining a specific force of the gimbal based on the accelerometer, and then calculating the attitude, velocity, and position of the gimbal based on the angular velocity and the specific force.
  • the method further includes updating the attitude of the gimbal. Updating the attitude of the gimbal can include designing an attitude update formula according to the angular velocity and the specific force, and updating the attitude of the gimbal according to the attitude update formula.
  • a design process for the attitude update formula can be as follows.
  • An ideal output of the gyroscope, denoted as ω_ib^b, can include a projection of a rotation angular rate of the body coordinate system {b} relative to an inertial system {i} in the {b} system, and an actual output of the gyroscope is denoted as ω̃_ib^b.
  • An ideal output of the accelerometer, denoted as f^b, can include a projection of the specific force in the {b} system, and the actual output of the accelerometer can be denoted as f̃^b.
  • the direction cosine matrix in Formula (2) can be determined by the latest updated attitude value; ω_ie^n and ω_en^n are the earth's rotation angular rate and the position angular rate, respectively.
  • the method consistent with the disclosure is suitable for low-velocity, short-distance moving shots near the ground, such that ω_ie^n and ω_en^n can be approximately ignored, and thus ω_nb^b ≈ ω_ib^b.
  • an attitude matrix Ĉ_n^b can be obtained by updating the attitude quaternion, which in effect establishes the mathematical platform of strapdown inertial navigation.
  • the method further includes updating the velocity of the gimbal.
  • Updating the velocity of the gimbal can include designing a velocity update formula according to the angular velocity and the specific force, and updating the velocity of the gimbal according to the velocity update formula.
  • the method further includes updating the position of the gimbal.
  • Updating the position of the gimbal can include designing a position update formula according to the angular velocity and the specific force, and updating the position of the gimbal according to the position update formula.
  • the attitude update formula, velocity update formula, and position update formula described above are merely examples, which are not limited herein.
  • a second pose of the vertical compensation device is obtained based on the vision device 2 .
  • the vision device 2 may include a visual odometer or a visual inertial odometer.
  • the vision device 2 can include the visual odometer, and the second pose can include the velocity and position of the vertical compensation device.
  • FIG. 5 is a schematic flow chart of another example gimbal pose correction method consistent with the disclosure.
  • the vision device 2 includes the visual inertial odometer (VIO), and the second pose can include the velocity, position, and attitude of the vertical compensation device.
  • the vertical compensation device may further include a Time of Flight (TOF) measurement device.
  • a detection result of the vision device 2 can be corrected by a detection result of the TOF measurement device.
  • the vertical compensation device can obtain the position of the vertical compensation device through a detection of the TOF measurement device, and correct the position of the compensation device obtained by the vision device 2 to obtain an accurate position of the vertical compensation device.
  • an Ultra-Wideband (UWB) positioning device can be used instead of the vision device 2 , and the pose of the vertical compensation device can be measured by the UWB positioning device.
  • An inertial-UWB integrated navigation method consistent with the disclosure is not subject to interference from electric currents and magnetic fields, and can be suitable for various indoor and outdoor environments.
  • the vision device 2 can be fixed on the base 32 , the coordinate system of the reference velocity and position of the vertical compensation device directly output by the vision device 2 can be different from the coordinate system of the first pose.
  • the reference velocity and reference position of the vertical compensation device directly output by the vision device 2 cannot be used as the reference of the first pose to correct the first pose.
  • coordinate conversion can be performed on the reference velocity and the reference position of the vertical compensation device directly output by the vision device 2 to obtain the second pose being in the same coordinate system as the first pose.
  • an angular velocity sensor 6 can be arranged at the axis arm 4 and can be configured to obtain a joint angle of the axis arm 4 .
  • the method may further include obtaining the joint angle of the axis arm 4 based on the angular velocity sensor 6 .
  • the joint angle of the axis arm 4 can be determined based on a joint angle of the motor 5 that drives the axis arm 4 to rotate.
  • a type of the angular velocity sensor 6 is not limited herein, and any suitable angular velocity sensor 6 can be selected.
  • the process at S 402 can further include, according to the joint angle, performing the coordinate conversion on the reference velocity of the vertical compensation device output by the vision device 2, obtaining the converted velocity of the vertical compensation device, and correcting the velocity of the gimbal according to the converted velocity of the vertical compensation device.
  • the process at S 402 can further include, according to the joint angle, performing the coordinate conversion on the reference position of the vertical compensation device output by the vision device 2, obtaining the converted position of the vertical compensation device, and correcting the position of the gimbal according to the converted position of the vertical compensation device.
  • the process at S 402 can further include constructing a reference direction cosine matrix of the reference attitude based on the reference attitude output by the visual inertial odometer, and according to the reference direction cosine matrix, obtaining the attitude of the vertical compensation device.
  • Obtaining the attitude of the vertical compensation device according to the direction cosine matrix can include obtaining an attitude correction value of the vertical compensation device according to the reference direction cosine matrix, and obtaining the attitude of the vertical compensation device according to the attitude correction value. Therefore, the attitude of the gimbal can be corrected by the attitude of the vertical compensation device.
  • the first pose is corrected according to the second pose.
  • the first pose obtained at S 401 can be corrected to obtain a pose estimation value of the gimbal.
  • the pose of the gimbal can be controlled according to the pose estimation value to ensure the accuracy of the gimbal pose.
  • correcting the first pose or the gimbal pose can refer to correcting the pose of the gimbal in the direction of the yaw axis, pitch axis, and/or roll axis.
  • a loop feedback, an optimal estimation, or another algorithm may be used at S 403 to fuse the first pose and the second pose to realize the inertial-visual integrated navigation.
  • Hereinafter, an implementation process of fusing the first pose and the second pose using a Kalman filter (an optimal estimation algorithm) will be described.
  • the process at S 401 can further include obtaining the angular velocity of the gimbal based on the gyroscope, obtaining the specific force of the gimbal based on the accelerometer, and calculating the error of the first pose according to the angular velocity and the specific force.
  • calculating the error of the first pose according to the angular velocity and the specific force can include, according to the angular velocity and the specific force, constructing an attitude error, a velocity error, and a position error of the first pose, and calculating the error of the first pose according to the attitude error, velocity error and position error.
  • the process at S 403 can include approximating the error of the first pose to obtain the Kalman filter, obtaining a correction value through the Kalman filter by using the second pose as an observation value, and correcting the first pose according to the correction value to realize the correction of the pose of the gimbal in the vertical direction.
  • approximating the error of the first pose can refer to removing an error term that has a small impact in the error of the first pose.
  • the gimbal consistent with the disclosure is suitable for low-velocity, short-distance moving shots near the ground.
  • a calculation process of an attitude error formula can be as follows.
  • Formula (10) can be written as:
  • δq̇_b′^b = ½ (ω̃_ib^b ⊗ δq_b′^b - δq_b′^b ⊗ ω̃_ib^b) - ½ (ε + n_r) ⊗ δq_b′^b ≈ ½ [0, 0_{1×3}; 0_{3×1}, -2(ω̃_ib^b ×)] δq_b′^b - ½ [0; ε + n_r]
  • the attitude angle offset of the {b′} system relative to the {b} system is denoted as φ.
  • the state equation of attitude error can be:
  • [φ̇; ε̇] = [-(ω̃_ib^b ×), -I_{3×3}; 0_{3×3}, 0_{3×3}] [φ; ε] + [-I_{3×3}, 0_{3×3}; 0_{3×3}, I_{3×3}] [n_r; n_w]  (13)
  • a calculation process of the velocity error can be as follows.
  • the gimbal consistent with the disclosure is suitable for low-velocity, short-distance moving shots near the ground.
  • the error of the first pose, i.e., the error formula of the integrated navigation system, can thus be constructed from the attitude error, velocity error, and position error described above.
  • a state transition matrix F can be:
  • a noise distribution matrix G can be:
  • the vision device 2 can include the visual inertial odometer.
  • the observation value of the Kalman filter described above can be designed according to an output result of the visual inertial odometer. The specific design process can be as follows.
  • the reference attitude output by the visual inertial odometer is denoted as q_c^n, and the corresponding reference direction cosine matrix is denoted as C_n^c.
  • a heading reference output by the visual inertial odometer can be used as a heading observation of the integrated navigation system, and it is considered that the {b} system and the {c} system are completely aligned.
  • the unit reference vectors in the {n} system can be taken as v_x^n = [1 0 0]^T, v_y^n = [0 1 0]^T, and v_z^n = [0 0 1]^T.
  • the reference vector in the x direction of the {b} system can then be obtained as v_x^b = C_n^c v_x^n  (17)
  • a unit projection of the gravity reference vector in the {b} system (e.g., the reference vector in the z direction of the {b} system), when the gimbal is completely still, can be obtained accordingly.
  • the reference vector v_y^b in the y direction of the {b} system can be obtained from v_z^b and v_x^b.
  • the reference attitude quaternion q_n^b can be obtained from C_n^b, and the attitude correction quaternion can be obtained accordingly.
  • q̂_b^n in Formula (20) is a latest estimation of the attitude quaternion.
  • the attitude correction value φ̂ output by the Kalman filter can be used to correct the updated attitude value of the gimbal obtained by Formula (4), and the corrected attitude output can be obtained to realize the correction of the attitude of the gimbal.
  • a velocity and position vector [V_nc^n P_nc^n]^T output by the visual inertial odometer can include the velocity and position of the camera coordinate system {c} relative to the {n} system, and a velocity observation and a position observation of the {b} system need to be obtained. Mechanical errors are not considered herein.
  • a parallelogram mechanism of the axis arm 4 can ensure that an end plane of the axis arm is always parallel to the bottom surface of the base 32. Therefore, there is only translational motion between the {b} system and the {p} system.
  • the reference velocity vector V_r^n and the reference position vector P_r^n of the {b} system can be obtained from V_nc^n and P_nc^n using C_p^n, ΔV^p, and ΔP^p, where:
  • C_p^n denotes the direction cosine matrix from the {p} system to the {n} system,
  • ΔP^p is a projection of a relative position vector from O_b to O_c in the {p} system, and
  • ΔV^p is a projection of a relative velocity vector from O_b to O_c in the {p} system.
  • After the reference velocity vector V_r^n and the reference position vector P_r^n are obtained, the velocity observation formula and the position observation formula of the integrated navigation system can be obtained, with H_V = [0_{3×3} I_{3×3} 0_{3×9}] and v_V = [v_Vx v_Vy v_Vz]^T being the velocity observation matrix and the velocity observation noise, and H_P = [0_{3×6} I_{3×3} 0_{3×6}] and v_P = [v_Px v_Py v_Pz]^T being the position observation matrix and the position observation noise, respectively (a minimal numerical sketch of this observation update is provided after this list).
  • Formula (27) can be used as the velocity observation formula of the Kalman filter, a velocity correction value can be output through the Kalman filter, and the updated velocity value obtained by Formula (5) can be corrected using the velocity correction value to obtain a corrected velocity output, thereby realizing the correction of the velocity of the gimbal.
  • Formula (28) can be used as the position observation formula of the Kalman filter, a position correction value can be output through the Kalman filter, and the updated position value obtained by Formula (6) can be corrected using the position correction value to obtain a corrected position output, thereby realizing the correction of the position of the gimbal.
  • the method consistent with the disclosure can adopt the inertial-visual integrated navigation mode, and correct the first pose obtained by the IMU 1 based on the second pose obtained by the vision device 2 to obtain a pose satisfying the requirements of the control bandwidth and accuracy.
  • the inertial-visual integrated navigation mode consistent with the present disclosure is not subject to interference from electric currents and magnetic fields, and can be suitable for various indoor and outdoor environments.
  • the present disclosure further provides the gimbal pose correction device.
  • the device may include the vertical compensation device connected to the gimbal, the vision device 2 arranged at the vertical compensation device, and the IMU 1 arranged at the vertical compensation device.
  • the vertical compensation device can be configured to compensate for the movement of the gimbal in the vertical direction, and the vision device 2 and the IMU 1 can be electrically connected to the vertical compensation device.
  • the vertical compensation device can be configured to obtain the first pose of the gimbal based on the IMU 1 , obtain the second pose of the vertical compensation device based on the vision device 2 , and correct the first pose according to the second pose.
  • the vertical compensation device further includes the main body 3 and the axis arm 4 connected to the gimbal.
  • the axis arm 4 can rotate to compensate for the movement of the gimbal in the vertical direction.
  • the IMU 1 can be arranged at the axis arm 4
  • the vision device 2 can be arranged at the main body 3 .
  • the vision device 2 can include the visual odometer, and the second pose can include the velocity and position of the vertical compensation device.
  • the vision device 2 can include the visual inertial odometer, and the second pose can include the velocity, position, and attitude of the vertical compensation device.
  • the vertical compensation device can include the axis arm 4 connected to the gimbal.
  • the axis arm 4 can rotate to compensate for the movement of the gimbal in the vertical direction.
  • the angular velocity sensor 6 can be arranged at the axis arm 4 .
  • the vertical compensation device can be configured to obtain the joint angle of the axis arm 4 based on the angular velocity sensor 6 .
  • the first pose can include the velocity of the gimbal.
  • the vertical compensation device can be configured to perform the coordinate conversion on the reference velocity of the vertical compensation device output by the vision device 2 according to the joint angle, and obtain the velocity of the vertical compensation device.
  • the first pose can include the position of the gimbal.
  • the vertical compensation device can be configured to perform the coordinate conversion on the reference position of the vertical compensation device output by the vision device 2 according to the joint angle, and obtain the position of the vertical compensation device.
  • the vertical compensation device can be configured to construct the reference direction cosine matrix of the reference attitude based on the reference attitude output by the visual inertial odometer, and obtain the attitude of the vertical compensation device according to the reference direction cosine matrix.
  • the vertical compensation device can be configured to obtain the attitude correction value of the vertical compensation device according to the reference direction cosine matrix, and obtain the attitude of the vertical compensation device according to the attitude correction value.
  • the first pose can include the velocity, position, and attitude of the gimbal.
  • the IMU 1 can include the gyroscope and the accelerometer.
  • the vertical compensation device can be configured to obtain the angular velocity of the gimbal based on the gyroscope, obtain the specific force of the gimbal based on the accelerometer, and calculate the attitude, velocity, and position of the gimbal according to the angular velocity and the specific force.
  • the vertical compensation device can be configured to design the attitude update formula according to the angular velocity and the specific force, and update the attitude of the gimbal according to the attitude update formula.
  • the vertical compensation device can be configured to design the velocity update formula according to the angular velocity and the specific force, and update the velocity of the gimbal according to the velocity update formula.
  • the vertical compensation device can be configured to design the position update formula according to the angular velocity and the specific force and update the position of the gimbal according to the position update formula.
  • the IMU 1 can include the gyroscope and the accelerometer.
  • the vertical compensation device can be configured to obtain the angular velocity of the gimbal based on the gyroscope, obtain the specific force of the gimbal based on the accelerometer, and calculate the error of the first pose according to the angular velocity and the specific force.
  • the vertical compensation device can be configured to construct the attitude error, velocity error, and position error of the first pose according to the angular velocity and the specific force, and calculate the error of the first pose according to the attitude error, velocity error, and position error.
  • the vertical compensation device can be configured to approximate the error of the first pose to obtain the Kalman filter, obtain the correction value through the Kalman filter by using the second pose as the observation value, and correct the first pose according to the correction value.
  • the devices described above are merely exemplary.
  • the units described as separate components may or may not be physically separate, and a component shown as a unit may or may not be a physical unit. That is, the units may be located in one place or may be distributed over a plurality of network elements. Some or all of the components may be selected according to the actual needs to achieve the object of the present disclosure. Those skilled in the art can understand and implement without creative work.
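As noted in the discussion of the velocity and position observations above, the following is a minimal, illustrative Python sketch of the Kalman filter measurement update that fuses the inertially propagated first pose with the velocity and position references from the vision device. The 15-dimensional error-state ordering [attitude error, velocity error, position error, gyroscope bias, accelerometer bias], the noise values, the residual definitions, and all function and variable names are assumptions made for illustration; they are consistent with the shapes of H_V and H_P given above but are not a verbatim reproduction of the patent's formulas.

```python
import numpy as np

# Assumed 15-dimensional error state: [phi(3), dV(3), dP(3), eps(3), nabla(3)].
# This ordering is an assumption consistent with H_V = [0_{3x3} I_{3x3} 0_{3x9}]
# and H_P = [0_{3x6} I_{3x3} 0_{3x6}] given above.
H_V = np.hstack([np.zeros((3, 3)), np.eye(3), np.zeros((3, 9))])
H_P = np.hstack([np.zeros((3, 6)), np.eye(3), np.zeros((3, 6))])


def kalman_update(x, P, z, H, R):
    """Standard Kalman measurement update: correct the error state x and its
    covariance P with an observation z, observation matrix H, and noise R."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_new = x + K @ (z - H @ x)
    P_new = (np.eye(P.shape[0]) - K @ H) @ P
    return x_new, P_new


# Illustrative usage with made-up numbers: the velocity residual is taken as the
# difference between the inertially propagated velocity and the reference
# velocity from the vision device (the exact residual definition is assumed).
x = np.zeros(15)
P = np.eye(15) * 1e-2
v_hat_n = np.array([0.10, 0.00, 0.02])   # propagated velocity (example values)
v_r_n = np.array([0.09, 0.00, 0.02])     # reference velocity from the vision device
x, P = kalman_update(x, P, v_hat_n - v_r_n, H_V, np.eye(3) * 1e-4)

p_hat_n = np.array([1.00, 0.50, -0.20])  # propagated position (example values)
p_r_n = np.array([1.01, 0.50, -0.21])    # reference position from the vision device
x, P = kalman_update(x, P, p_hat_n - p_r_n, H_P, np.eye(3) * 1e-3)
# x[0:3], x[3:6], and x[6:9] would then be fed back as the attitude, velocity,
# and position correction values, as described in the bullets above.
```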

Abstract

A gimbal pose correction method includes obtaining a first pose of a gimbal based on an Inertial Measurement Unit (IMU) arranged at a vertical compensation device configured to be coupled to the gimbal and compensate for movement of the gimbal in a vertical direction, obtaining a second pose of the vertical compensation device based on a vision device arranged at the vertical compensation device, and correcting the first pose according to the second pose.

Description

CROSS-REFERENCE TO RELATED APPLICATION
This application is a continuation of International Application No. PCT/CN2018/084499, filed on Apr. 25, 2018, the entire content of which is incorporated herein by reference.
TECHNICAL FIELD
The present disclosure relates to the technical field of gimbal control and, more particularly, to a gimbal pose correction method and device.
BACKGROUND
Pose estimation is one of the key problems to be solved in robot control. Pose estimation refers to obtaining position, velocity, attitude, and heading information that satisfies requirements of control bandwidth, dynamic performance, stability, and accuracy according to data from various motion state sensors. A system providing instant pose information is called a navigation system. Navigation systems generally include an inertial navigation system, a global navigation satellite system (GNSS), a Doppler navigation system, a visual navigation system, and the like. Integrated navigation technology uses a plurality of different navigation systems to measure the same information source, and extracts and corrects the errors of each navigation system using the measured values. Integrated navigation technology is one of the important applications in the field of multi-sensor information fusion state estimation.
Inertial-GNSS integrated navigation is one of the most commonly used integrated navigation schemes. Conventional inertial-GNSS integrated navigation uses the north-east-down (NED) coordinate system as the navigation coordinate system, and hence needs a north-pointing heading observation, which is generally provided as a reference heading by a geomagnetic sensor. However, the geomagnetic sensor is susceptible to interference from electric currents and magnetic fields. In addition, conventional inertial-GNSS integrated navigation uses latitude and longitude to represent position, such that the GNSS needs to provide position measurements in the form of latitude and longitude; therefore, GNSS navigation cannot work in an indoor environment. Conventional single-point GNSS receivers generally have position and velocity measurement errors at the meter level, but some applications require millimeter-level velocity control accuracy; hence, inertial-GNSS integrated navigation cannot satisfy the accuracy requirements.
SUMMARY
In accordance with the disclosure, there is provided a gimbal pose correction method including obtaining a first pose of a gimbal based on an Inertial Measurement Unit (IMU) arranged at a vertical compensation device configured to be coupled to the gimbal and compensate for movement of the gimbal in a vertical direction, obtaining a second pose of the vertical compensation device based on a vision device arranged at the vertical compensation device, and correcting the first pose according to the second pose.
Also in accordance with the disclosure, there is provided a gimbal pose correction device including a vertical compensation device, and a vision device and an Inertial Measurement Unit (IMU) both arranged at and electrically coupled to the vertical compensation device. The vertical compensation device is configured to be connected to a gimbal and compensate for movement of the gimbal in a vertical direction. The vertical compensation device is further configured to obtain a first pose of the gimbal based on the IMU, obtain a second pose of the vertical compensation device based on the vision device, and correct the first pose according to the second pose.
BRIEF DESCRIPTION OF THE DRAWINGS
In order to provide a clearer illustration of technical solutions of disclosed embodiments, the drawings used in the description of the disclosed embodiments are briefly described below. It will be appreciated that the disclosed drawings are merely examples and other drawings conceived by those having ordinary skills in the art on the basis of the described drawings without inventive efforts should fall within the scope of the present disclosure.
FIG. 1 is a schematic structural diagram of a gimbal pose correction device consistent with embodiments of the disclosure.
FIG. 2 is a schematic structural diagram of another gimbal pose correction device consistent with embodiments of the disclosure.
FIG. 3 is a schematic structural diagram of another gimbal pose correction device consistent with embodiments of the disclosure.
FIG. 4 is a schematic flow chart of a gimbal pose correction method consistent with embodiments of the disclosure.
FIG. 5 is a schematic flow chart of another gimbal pose correction method consistent with embodiments of the disclosure.
Description of Reference Numerals
1 Inertial measurement unit (IMU); 2 Vision device; 3 Main body; 31 Body; 32 Base; 33 Handheld member; 4 Axis arm; 5 Motor; 6 Angular velocity sensor
DETAILED DESCRIPTION OF THE EMBODIMENTS
In order to provide a clearer illustration of technical solutions of disclosed embodiments, example embodiments will be described with reference to the accompanying drawings. It will be appreciated that the described embodiments are some rather than all of the embodiments of the present disclosure. Other embodiments conceived by those having ordinary skills in the art on the basis of the described embodiments without inventive efforts should fall within the scope of the present disclosure.
Hereinafter, the gimbal pose correction method and device consistent with the disclosure will be described in detail with reference to the accompanying drawings. Unless conflicting, the exemplary embodiments and features in the exemplary embodiments can be combined with each other.
FIG. 1 is a schematic structural diagram of an example gimbal pose correction device consistent with the disclosure. As shown in FIG. 1, a gimbal is connected to a vertical compensation device, and the vertical compensation device can compensate for a movement of the gimbal in a vertical direction. In some embodiments, the gimbal can be mounted on a movable object, e.g., a user, an unmanned aerial vehicle (UAV), or a robot, through the vertical compensation device. When the movable object moves, there is a movement in the vertical direction, and the movement in the vertical direction can result in unstable images of a camera on the gimbal. Therefore, the vertical compensation device can be used to compensate for the movement of the gimbal in the vertical direction. Compared with a configuration in which the gimbal is directly mounted on the movable object and moves with the movable object, the vertical movement of the gimbal can be reduced, thereby ensuring smooth camera images.
The vertical compensation device includes a vision device 2 and an inertial measurement unit (IMU) 1. FIGS. 2 and 3 are schematic structural diagrams of other examples of the gimbal pose correction device consistent with the disclosure. As shown in FIGS. 2 and 3, the vertical compensation device includes a main body 3 and an axis arm 4 connected to the gimbal. The axis arm 4 can rotate to compensate for the movement of the gimbal in the vertical direction. In some embodiments, a motor 5 is arranged at the main body 3, and the motor 5 can be configured to drive the axis arm 4 to rotate. In some other embodiments, the axis arm 4 can also be driven to rotate by other driving devices. The IMU 1 can be arranged at the axis arm 4, for example, at an end of the axis arm 4 connected to the gimbal. In some embodiments, the IMU 1 can be arranged at any other position of the axis arm 4.
The vision device 2 can be arranged at the main body 3, and a detection direction of the vision device 2 can be upward or downward. For example, when a gimbal pose correction system is located in an outdoor environment, the vision device 2 can face downward, and when the gimbal pose correction system is located indoors, the vision device 2 can face upward or downward. In some embodiments, the detection direction of the vision device 2 can be approximately parallel to the vertical direction, and the detection direction of the vision device 2 can have a small tilt angle relative to the vertical direction (an angle range of the tilt angle can be set according to empirical values). In some embodiments, the vision device 2 can monitor vertically upwards or vertically downwards. As shown in FIGS. 2 and 3, the main body 3 includes a body 31 and a base 32 fixedly connected to the body 31, and the vision device 2 is arranged at the base 32.
In some embodiments, the gimbal can be mounted at a UAV, a mobile robot, or another movable device through the base 32. During the movement of the UAV, mobile robot, or other movable devices, there is the movement in the vertical direction that affects the images of the camera (i.e., the camera on the gimbal). The vertical compensation device can compensate for the movement in the vertical direction to offset an influence of the movement in the vertical direction on the images of the camera.
In some embodiments, as shown in FIGS. 2 and 3, the vertical compensation device may be configured as a handheld device and may include a handheld member 33 fixedly connected to the body 31. The user can hold the handheld member 33 to drive the vertical compensation device to move as a whole. When the user is walking, the vertical movement at the stride frequency can affect the images of the camera, and the vertical compensation device can compensate for the vertical movement to offset the influence of the movement in the vertical direction on the images of the camera.
Hereinafter, a body coordinate system {b}-O_b x_b y_b z_b can be defined as follows. An origin O_b of the coordinate system can be a geometric center of a plane at which the axis arm 4 is connected to an end of the gimbal corresponding to an axis. The x_b axis can be in a vertical symmetry plane of the body 31 and parallel to a bottom surface of the base 32, and can point forward. The y_b axis can be perpendicular to the vertical symmetry plane of the body 31 and can point to a right side of the body 31. The z_b axis can be in the vertical symmetry plane of the body 31 and perpendicular to the x_b axis, and can point below the body 31.
A base coordinate system {p}-O_p x_p y_p z_p of the base 32 can be defined as follows. An origin O_p of the coordinate system can be a center of the axis arm 4, i.e., an intersection of a rotation center line of the axis arm 4 and the vertical symmetry plane of the body 31. The x_p axis can be in the vertical symmetry plane of the body 31 and parallel to the bottom surface of the base 32, and can point forward. The y_p axis can be perpendicular to the vertical symmetry plane of the body 31 and can point to the right side of the body 31. The z_p axis can be in the vertical symmetry plane of the body 31 and perpendicular to the x_p axis, and can point below the body 31.
A camera coordinate system can be denoted by {c}-O_c x_c y_c z_c, and a navigation coordinate system can be denoted by {n}-O_n x_n y_n z_n. An origin O_n of the navigation coordinate system can be determined by a vertical projection of the origin O_c of the camera coordinate system on the ground when the system starts to work. The coordinate axes of the navigation coordinate system can be determined by an output of the vision device 2. The vision device 2 can output a pose of the camera coordinate system {c} relative to the navigation coordinate system {n}. In some embodiments, the vision device 2 can output a reference position P_nc^n, a reference velocity V_nc^n, and a reference attitude q_c^n of the vertical compensation device. In some other embodiments, the vision device 2 can output the reference position P_nc^n and the reference velocity V_nc^n of the vertical compensation device.
FIG. 4 is a schematic flow chart of an example gimbal pose correction method consistent with the disclosure. An execution entity of the method may include a processor of the vertical compensation device, or an independent control device communicatively connected to the processor of the vertical compensation device.
As shown in FIG. 4, at S401, a first pose of the gimbal is obtained based on the IMU 1. The first pose may include the velocity, position, and attitude of the gimbal.
The IMU 1 may include a gyroscope and an accelerometer. In some embodiments, the gyroscope can include a three-axis gyroscope, and the accelerometer can include a three-axis accelerometer. The process at S401 can include obtaining an angular velocity of the gimbal based on the gyroscope, obtaining a specific force of the gimbal based on the accelerometer, and then calculating the attitude, velocity, and position of the gimbal based on the angular velocity and the specific force.
In some embodiments, the method further includes updating the attitude of the gimbal. Updating the attitude of the gimbal can include designing an attitude update formula according to the angular velocity and the specific force, and updating the attitude of the gimbal according to the attitude update formula.
In some embodiments, a design process for the attitude update formula can be as follows. An ideal output of the gyroscope, denoted as ω_ib^b, can include a projection of a rotation angular rate of the body coordinate system {b} relative to an inertial system {i} in the {b} system, and an actual output of the gyroscope is denoted as ω̃_ib^b. An ideal output of the accelerometer, denoted as f^b, can include a projection of the specific force in the {b} system, and the actual output of the accelerometer can be denoted as f̃^b.
A quaternion q_b^n can be used as a representation of the attitude of the {n} system relative to the {b} system, and an error-free ideal quaternion differential formula can be determined by the following formula:
q̇_n^b = ½ ω_nb^b ⊗ q_n^b = ½ Ω(ω_nb^b) q_n^b  (1)
The attitude angular rate in Formula (1) can be determined by the following formula:
ω_nb^b = ω_ib^b - ω_in^b = ω_ib^b - C_n^b (ω_ie^n + ω_en^n)  (2)
The direction cosine matrix in Formula (2) can be determined by the latest updated attitude value; ω_ie^n and ω_en^n are the earth's rotation angular rate and the position angular rate, respectively. The method consistent with the disclosure is suitable for low-velocity, short-distance moving shots near the ground, such that ω_ie^n and ω_en^n can be approximately ignored, and thus ω_nb^b ≈ ω_ib^b. In an actual system, due to the existence of gyroscope measurement errors and navigation solution errors, the actual solution of the quaternion differential formula can be performed by the following formula:
d q̂_n^b / dt = ½ ω̂_nb^b ⊗ q̂_n^b ≈ ½ Ω(ω̂_ib^b) q̂_n^b  (3)
An actual body coordinate system determined by q̂_b^n can be denoted as {b′}. Discretizing the quaternion differential formula shown in Formula (1) and using the first-order approximation, the quaternion update formula shown below can be obtained:
q̂_b^n(t_{k+1}) = q̂_b^n(t_k) + ½ Ω(ω̂_ib^b(t_{k+1})) Δt · q̂_b^n(t_k)  (4)
According to Formula (4), an attitude matrix Ĉ_n^b can be obtained by updating the attitude quaternion, which in effect establishes the mathematical platform of strapdown inertial navigation.
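As an illustration only, below is a minimal Python sketch of the first-order quaternion update of Formula (4). The scalar-first quaternion convention, the renormalization step, and all function names are assumptions added for the example; Ω(ω) is built so that Ω(ω)q equals the quaternion product of the pure quaternion (0, ω) with q, matching the form ½ ω ⊗ q used in Formula (1).

```python
import numpy as np

def skew(w):
    """3x3 cross-product (skew-symmetric) matrix of a 3-vector."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def omega_matrix(w):
    """4x4 matrix Omega(w) such that Omega(w) @ q equals the quaternion product
    (0, w) ⊗ q for a scalar-first quaternion q = [q0, qx, qy, qz]."""
    O = np.zeros((4, 4))
    O[0, 1:] = -w
    O[1:, 0] = w
    O[1:, 1:] = skew(w)
    return O

def update_attitude(q, gyro, dt):
    """First-order quaternion update in the spirit of Formula (4):
    q(t_{k+1}) = q(t_k) + 0.5 * Omega(omega_ib^b(t_{k+1})) * dt * q(t_k).
    The renormalization is added as common practice and is not part of
    Formula (4) itself."""
    q_next = q + 0.5 * (omega_matrix(gyro) @ q) * dt
    return q_next / np.linalg.norm(q_next)

# Example: integrate a constant yaw rate of 0.1 rad/s for one second at 200 Hz.
q = np.array([1.0, 0.0, 0.0, 0.0])
for _ in range(200):
    q = update_attitude(q, np.array([0.0, 0.0, 0.1]), 1.0 / 200.0)
```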
In some embodiments, the method further includes updating the velocity of the gimbal. Updating the velocity of the gimbal can include designing a velocity update formula according to the angular velocity and the specific force, and updating the velocity of the gimbal according to the velocity update formula. For example, the following formula can be used as the approximate velocity update formula:
$\hat{V}^n(t_{k+1}) = \hat{V}^n(t_k) + C_b^n\,\hat{f}^b(t_{k+1}) \cdot \Delta t \quad (5)$
In some embodiments, the method further includes updating the position of the gimbal. Updating the position of the gimbal can include designing a position update formula according to the angular velocity and the specific force, and updating the position of the gimbal according to the position update formula. For example, the following formula can be used as the approximate position update formula:
$\hat{P}^n(t_{k+1}) = \hat{P}^n(t_k) + \hat{V}^n(t_{k+1}) \cdot \Delta t + \tfrac{1}{2}\,C_b^n\,\hat{f}^b(t_{k+1}) \cdot \Delta t^2 \quad (6)$
It can be appreciated that the attitude update formula, velocity update formula, and position update formula described above are merely examples, which are not limited herein.
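As a further illustration, the following sketch propagates velocity and position in the form of Formulas (5) and (6). How gravity is compensated in the specific force is an assumption left to the caller here, and the function name is hypothetical.

```python
import numpy as np

def propagate_velocity_position(v_n, p_n, C_bn, f_b, dt):
    """One propagation step following the form of Formulas (5) and (6).
    v_n, p_n: velocity and position in the navigation frame
    C_bn: body-to-navigation direction cosine matrix
    f_b: accelerometer specific force in the body frame (any required gravity
         compensation is assumed to be applied to f_b upstream)."""
    a_n = C_bn @ f_b                                  # specific force mapped to {n}
    v_new = v_n + a_n * dt                            # Formula (5)
    p_new = p_n + v_new * dt + 0.5 * a_n * dt**2      # Formula (6)
    return v_new, p_new

v1, p1 = propagate_velocity_position(np.zeros(3), np.zeros(3),
                                     np.eye(3), np.array([0.0, 0.0, 0.1]), dt=0.005)
```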
At S402, a second pose of the vertical compensation device is obtained based on the vision device 2.
The vision device 2 may include a visual odometer or a visual inertial odometer. For example, the vision device 2 can include the visual odometer, and the second pose can include the velocity and position of the vertical compensation device. FIG. 5 is a schematic flow chart of another example gimbal pose correction method consistent with the disclosure. As another example, as shown in FIG. 5, the vision device 2 includes the visual inertial odometer (VIO), and the second pose can include the velocity, position, and attitude of the vertical compensation device.
In some embodiments, the vertical compensation device may further include a Time of Flight (TOF) measurement device. In some embodiments, a detection result of the vision device 2 can be corrected by a detection result of the TOF measurement device. For example, the vertical compensation device can obtain the position of the vertical compensation device through a detection of the TOF measurement device, and correct the position of the vertical compensation device obtained by the vision device 2 to obtain an accurate position of the vertical compensation device.
In some embodiments, an Ultra-Wideband (UWB) positioning device can be used instead of the vision device 2, and the pose of the vertical compensation device can be measured by the UWB positioning device. An inertial-UWB integrated navigation method consistent with the disclosure is not subject to interference from electric currents and magnetic fields, and can be suitable for various indoor and outdoor environments.
Since the vision device 2 can be fixed on the base 32, the coordinate system of the reference velocity and reference position of the vertical compensation device directly output by the vision device 2 can be different from the coordinate system of the first pose. Therefore, the reference velocity and reference position of the vertical compensation device directly output by the vision device 2 cannot be used directly as a reference to correct the first pose. In some embodiments, coordinate conversion can be performed on the reference velocity and the reference position of the vertical compensation device directly output by the vision device 2 to obtain the second pose in the same coordinate system as the first pose.
In some embodiments, an angular velocity sensor 6 can be arranged at the axis arm 4 and can be configured to obtain a joint angle of the axis arm 4. Before the process at S402, the method may further include obtaining the joint angle of the axis arm 4 based on the angular velocity sensor 6. In some other embodiments, the joint angle of the axis arm 4 can be determined based on a joint angle of the motor 5 that drives the axis arm 4 to rotate. A type of the angular velocity sensor 6 is not limited herein, and any suitable angular velocity sensor 6 can be selected.
In some embodiments, the process at S402 can further include, according to the joint angle, performing the coordinate conversion on the reference velocity of the vertical compensation device output by the vision device 2, obtaining the converted velocity of the vertical compensation device, and correcting the velocity of the gimbal according to the converted velocity of the vertical compensation device. In some embodiments, the process at S402 can further include, according to the joint angle, performing the coordinate conversion on the reference position of the vertical compensation device output by the vision device 2, obtaining the converted position of the vertical compensation device, and correcting the position of the gimbal according to the converted position of the vertical compensation device.
When the vision device 2 includes the visual inertial odometer, the process at S402 can further include constructing a reference direction cosine matrix of the reference attitude based on the reference attitude output by the visual inertial odometer, and, according to the reference direction cosine matrix, obtaining the attitude of the vertical compensation device. Obtaining the attitude of the vertical compensation device according to the reference direction cosine matrix can include obtaining an attitude correction value of the vertical compensation device according to the reference direction cosine matrix, and obtaining the attitude of the vertical compensation device according to the attitude correction value. Therefore, the attitude of the gimbal can be corrected by the attitude of the vertical compensation device.
At S403, the first pose is corrected according to the second pose.
According to the second pose obtained at S402, the first pose obtained at S401 can be corrected to obtain a pose estimation value of the gimbal. The pose of the gimbal can be controlled according to the pose estimation value to ensure the accuracy of the gimbal pose. In some embodiments, for a three-axis (yaw axis, pitch axis, roll axis) gimbal, correcting the first pose or the gimbal pose can refer to correcting the pose of the gimbal in the direction of the yaw axis, pitch axis, and/or roll axis.
A loop feedback algorithm, an optimal estimation algorithm, or another algorithm may be used at S403 to fuse the first pose and the second pose to realize the inertial-visual integrated navigation. In some embodiments, a Kalman filter (an optimal estimation algorithm) can be used to fuse the first pose and the second pose. Hereinafter, an implementation process of fusing the first pose and the second pose using the Kalman filter will be described.
In some embodiments, the process at S401 can further include obtaining the angular velocity of the gimbal based on the gyroscope, obtaining the specific force of the gimbal based on the accelerometer, and calculating the error of the first pose according to the angular velocity and the specific force. For example, calculating the error of the first pose according to the angular velocity and the specific force can include, according to the angular velocity and the specific force, constructing an attitude error, a velocity error, and a position error of the first pose, and calculating the error of the first pose according to the attitude error, velocity error and position error.
The process at S403 can include approximating the error of the first pose to obtain the Kalman filter, obtaining a correction value through the Kalman filter by using the second pose as an observation value, and correcting the first pose according to the correction value to realize the correction of the pose of the gimbal in the vertical direction. Herein, approximating the error of the first pose can refer to removing error terms that have a small impact on the error of the first pose.
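The overall fusion flow described above can be pictured with the following minimal sketch, in which propagate and correct are caller-supplied stand-ins for the inertial propagation and Kalman correction steps detailed below; the function and type names are hypothetical and only the loop structure is illustrated.

```python
from typing import Callable, Iterable, Optional, Sequence

State = Sequence[float]  # placeholder for (attitude, velocity, position, biases)

def fuse(imu_samples: Iterable[dict],
         vision_samples: Iterable[Optional[dict]],
         propagate: Callable[[State, dict], State],
         correct: Callable[[State, dict], State],
         state: State) -> State:
    """Loop structure only: every IMU sample propagates the first pose at a high
    rate; whenever a vision sample (second pose) is available, it is used as the
    observation for a Kalman correction of the first pose."""
    for imu, vis in zip(imu_samples, vision_samples):
        state = propagate(state, imu)      # high-rate inertial prediction
        if vis is not None:
            state = correct(state, vis)    # low-rate visual correction
    return state

# Trivial call with identity stand-ins, only to show the call shape.
final_state = fuse([{}, {}], [None, {}],
                   propagate=lambda s, imu: s,
                   correct=lambda s, vis: s,
                   state=(0.0,))
```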
In some embodiments, as shown in FIG. 5, the gimbal consistent with the disclosure is suitable for moving shooting at low velocity, over a short distance, and near the ground. A measurement error model of the gyroscope can be:
$\tilde{\omega}_{ib}^b = \omega_{ib}^b + b + n_r \quad (7)$
wherein $n_r$ denotes the measurement noise of the gyroscope and is assumed to be Gaussian white noise, $b$ denotes the zero bias of the gyroscope and is assumed to be a random walk process in the form of $\dot{b} = n_w$, where $n_w$ denotes Gaussian white noise, and $\hat{b}$ denotes the zero-bias estimate of the gyroscope. If $\hat{b}$ is a constant zero-bias estimate, then $\dot{\hat{b}} = 0$. According to the measurement error model of the gyroscope, it can be obtained that $\omega_{ib}^b = \tilde{\omega}_{ib}^b - b - n_r$ and $\hat{\omega}_{ib}^b = \tilde{\omega}_{ib}^b - \hat{b}$.
The zero bias error of the gyroscope can be defined as:
$\varepsilon = b - \hat{b} \quad (8)$
Thus, $\dot{\varepsilon} = \dot{b} - \dot{\hat{b}} = n_w$.
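For illustration, a small simulation of the measurement model of Formula (7), with the zero bias performing a random walk, could look as follows; the noise magnitudes are arbitrary illustrative values and the function name is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_gyro_sample(true_rate, bias, dt, sigma_r=1e-3, sigma_w=1e-5):
    """One sample of Formula (7): measured = true rate + zero bias + white noise,
    with the zero bias performing a discrete random walk (b_dot = n_w)."""
    bias = bias + rng.normal(0.0, sigma_w * np.sqrt(dt), size=3)  # random-walk increment
    n_r = rng.normal(0.0, sigma_r, size=3)                        # measurement white noise
    measured = true_rate + bias + n_r
    return measured, bias

measured, bias = simulate_gyro_sample(true_rate=np.zeros(3), bias=np.zeros(3), dt=0.005)
```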
A state quantity of the attitude solution can be defined as $x = \begin{bmatrix} q_n^b & b \end{bmatrix}^T$. According to the quaternion differential formula and the measurement error model of the gyroscope, it can be obtained that:
$\begin{bmatrix} \dot{q}_n^b \\ \dot{b} \end{bmatrix} = \begin{bmatrix} \tfrac{1}{2}\Omega\!\left(\tilde{\omega}_{ib}^b - b - n_r\right)\hat{q}_n^b \\ 0 \end{bmatrix} + \begin{bmatrix} 0 \\ n_w \end{bmatrix}$
For the state estimator:
$\begin{bmatrix} \dot{\hat{q}}_n^b \\ \dot{\hat{b}} \end{bmatrix} = \begin{bmatrix} \tfrac{1}{2}\Omega\!\left(\tilde{\omega}_{ib}^b - \hat{b}\right)\hat{q}_n^b \\ 0 \end{bmatrix}$
Combining the above formulas, a calculation process of the attitude error formula can be as follows. An error quaternion caused by $\hat{q}_b^n$ can be denoted by $\delta q_{b'}^b$, and according to the quaternion multiplication, it can be obtained that:
$q_n^b = \delta q_{b'}^b \otimes \hat{q}_n^b \quad (9)$
Taking the time derivative of Formula (9), the attitude state equation of the system can be calculated as:
$\delta\dot{q}_{b'}^b \approx \tfrac{1}{2}\left(\omega_{ib}^b \otimes \delta q_{b'}^b - \delta q_{b'}^b \otimes \hat{\omega}_{ib}^b\right) \quad (10)$
Considering the measurement error model (7) of the gyroscope, Formula (10) can be written as:
$\delta\dot{q}_{b'}^b = \tfrac{1}{2}\left(\hat{\omega}_{ib}^b \otimes \delta q_{b'}^b - \delta q_{b'}^b \otimes \hat{\omega}_{ib}^b\right) - \tfrac{1}{2}\left(\varepsilon + n_r\right) \otimes \delta q_{b'}^b \approx \tfrac{1}{2}\begin{bmatrix} 0 & 0_{1\times 3} \\ 0_{3\times 1} & -2\lfloor\hat{\omega}_{ib}^b\times\rfloor \end{bmatrix}\delta q_{b'}^b - \tfrac{1}{2}\begin{bmatrix} 0 \\ \varepsilon + n_r \end{bmatrix} \quad (11)$
The attitude angle offset of the {b′} system relative to the {b} system is denoted as $\phi$. Considering $\phi$ to be a small angle, the approximate expression of $\delta q_{b'}^b$ can be $\delta q_{b'}^b = \begin{bmatrix} 1 & \tfrac{\phi}{2} \end{bmatrix}^T$, which can be inserted into Formula (11) to obtain:
$\dot{\phi} = -\hat{\omega}_{ib}^b \times \phi - \varepsilon - n_r \quad (12)$
The state equation of attitude error can be:
$\begin{bmatrix} \dot{\phi} \\ \dot{\varepsilon} \end{bmatrix} = \begin{bmatrix} -\lfloor\hat{\omega}_{ib}^b\times\rfloor & -I_{3\times 3} \\ 0_{3\times 3} & 0_{3\times 3} \end{bmatrix}\begin{bmatrix} \phi \\ \varepsilon \end{bmatrix} + \begin{bmatrix} -I_{3\times 3} & 0_{3\times 3} \\ 0_{3\times 3} & I_{3\times 3} \end{bmatrix}\begin{bmatrix} n_r \\ n_w \end{bmatrix} \quad (13)$
A calculation process of the velocity error can be as follows. According to the specific force formula, an error-free ideal velocity value can be determined according to the following differential formula:
$\dot{V}^n = C_b^n f^b - \left(2\omega_{ie}^n + \omega_{en}^n\right) \times V^n + g^n$
where $g^n$ represents the acceleration of gravity in the navigation coordinate system. The gimbal consistent with the disclosure is suitable for moving shooting at low velocity, over a short distance, and near the ground. Thus, $\omega_{ie}^n$ and $\omega_{en}^n$ can be approximately ignored, such that the approximate velocity error calculation formula can be as follows:
$\delta\dot{V}^n = -\phi^n \times \hat{f}^n + \nabla^n \quad (14)$
wherein $\nabla^n$ denotes the projection of the accelerometer zero offset in the navigation coordinate system.
A calculation process of the position error can be as follows. Unlike conventional integrated navigation, which uses latitude and longitude to represent the position, the method consistent with the disclosure can use visual navigation for position measurement and is suitable for moving shooting at low velocity, over a short distance, and near the ground. Therefore, the position error formula, expressed in the form of distance, can be written as Formula (15):
$\delta\dot{P}^n = \delta V^n \quad (15)$
Combining the calculation formulas of the attitude error, velocity error, and position error, the error of the first pose (i.e., the error formula of the integrated navigation system) can be obtained as:
$\dot{X} = FX + Gw \quad (16)$
The system state quantity X can be:
$X = \begin{bmatrix} \phi_x^b & \phi_y^b & \phi_z^b & \delta V_x^n & \delta V_y^n & \delta V_z^n & \delta P_x^n & \delta P_y^n & \delta P_z^n & \varepsilon_x & \varepsilon_y & \varepsilon_z & \nabla_x & \nabla_y & \nabla_z \end{bmatrix}^T$
A state transition matrix F can be:
$F = \begin{bmatrix} -\lfloor\hat{\omega}_{ib}^b\times\rfloor & 0_{3\times 3} & 0_{3\times 3} & -I_{3\times 3} & 0_{3\times 3} \\ \lfloor\hat{f}^n\times\rfloor & 0_{3\times 3} & 0_{3\times 3} & 0_{3\times 3} & C_b^n \\ 0_{3\times 3} & I_{3\times 3} & 0_{3\times 3} & 0_{3\times 3} & 0_{3\times 3} \\ 0_{3\times 3} & 0_{3\times 3} & 0_{3\times 3} & 0_{3\times 3} & 0_{3\times 3} \\ 0_{3\times 3} & 0_{3\times 3} & 0_{3\times 3} & 0_{3\times 3} & 0_{3\times 3} \end{bmatrix}$
where $\lfloor\hat{\omega}_{ib}^b\times\rfloor$ is the anti-symmetric matrix of $\hat{\omega}_{ib}^b$, and $\lfloor\hat{f}^n\times\rfloor$ is the anti-symmetric matrix of $\hat{f}^n$.
A system noise vector w can be:
$w = \begin{bmatrix} n_r & n_w & n_a \end{bmatrix}^T$
where $n_r$ denotes the measurement noise of the gyroscope, $n_w$ denotes the random walk noise of the gyroscope, and $n_a$ denotes the noise of the accelerometer.
A noise distribution matrix G can be:
$G = \begin{bmatrix} I_{3\times 3} & 0_{3\times 3} & I_{3\times 3} \\ 0_{3\times 3} & 0_{3\times 3} & C_b^n \\ 0_{3\times 3} & 0_{3\times 3} & 0_{3\times 3} \\ 0_{3\times 3} & I_{3\times 3} & 0_{3\times 3} \\ 0_{3\times 3} & 0_{3\times 3} & I_{3\times 3} \end{bmatrix}$
Perform discretization and first-order approximation on Formula (16) to obtain a discretized error calculation formula for the first pose, and design the Kalman filter using the discretized error calculation formula for the first pose.
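To make the structure of Formulas (13) to (16) concrete, the following numpy sketch assembles a 15-state error transition matrix from the attitude, velocity, and position error equations above, applies the first-order discretization, and performs one Kalman prediction step. The block layout follows the error equations rather than reproducing the patent's matrices verbatim, and the process-noise values are arbitrary placeholders used in place of the full $GwG^T$ mapping.

```python
import numpy as np

def skew(v):
    """Anti-symmetric (cross-product) matrix of a 3-vector."""
    x, y, z = v
    return np.array([[0.0, -z, y],
                     [z, 0.0, -x],
                     [-y, x, 0.0]])

def build_F(omega_b, f_n, C_bn):
    """15x15 error-state matrix with block rows
    [attitude, velocity, position, gyro bias, accel bias],
    following the form of Formulas (12), (14), and (15)."""
    Z = np.zeros((3, 3))
    I = np.eye(3)
    return np.block([
        [-skew(omega_b), Z, Z, -I, Z   ],   # phi_dot   = -w x phi - eps
        [ skew(f_n),     Z, Z,  Z, C_bn],   # dV_dot    =  f x phi + C * accel bias
        [ Z,             I, Z,  Z, Z   ],   # dP_dot    =  dV
        [ Z,             Z, Z,  Z, Z   ],   # eps_dot   =  driven by noise only
        [ Z,             Z, Z,  Z, Z   ],   # nabla_dot =  assumed constant here
    ])

def kf_predict(x, P, F, Q, dt):
    """First-order discretization (Phi ~ I + F*dt) and covariance propagation."""
    Phi = np.eye(F.shape[0]) + F * dt
    x = Phi @ x
    P = Phi @ P @ Phi.T + Q * dt
    return x, P

# Illustrative single prediction step with placeholder values.
F = build_F(omega_b=np.array([0.0, 0.0, 0.01]),
            f_n=np.array([0.0, 0.0, 9.8]),
            C_bn=np.eye(3))
x0 = np.zeros(15)
P0 = np.eye(15) * 1e-4
Q = np.eye(15) * 1e-8          # crude process-noise placeholder
x1, P1 = kf_predict(x0, P0, F, Q, dt=0.005)
```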
In some embodiments, the vision device 2 can include the visual inertial odometer. In some embodiments, the observation value of the Kalman filter described above can be designed according to an output result of the visual inertial odometer. The specific design process can be as follows.
The reference attitude output by the visual inertial odometer is denoted as $\bar{q}_c^n$, and the corresponding reference direction cosine matrix is denoted as $\bar{C}_n^c$. In some embodiments, a heading reference output by the visual inertial odometer can be used as a heading observation of the integrated navigation system, and it is considered that the {b} system and the {c} system are completely aligned.
Assume positive unit vectors of the three axes of the navigation coordinate system {n} are:
$v_x^n = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}, \quad v_y^n = \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}, \quad v_z^n = \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}$
The projection of a heading reference vector in the {b} system (e.g., a reference vector in x direction of the {b} system) can be:
$v_x^b = \bar{C}_n^c\,v_x^n \quad (17)$
According to the specific force formula, a unit projection of the gravity reference vector in the {b} system (e.g., the reference vector in z direction of the {b} system), when the gimbal is completely still, can be:
$v_z^b = \dfrac{\hat{f}^b}{\|\hat{f}^b\|} \quad (18)$
According to the orthogonal relationship of the coordinate system, the reference vector $v_y^b$ in the y direction of the {b} system can be obtained from $v_z^b$ and $v_x^b$ (e.g., via the cross product $v_y^b = v_z^b \times v_x^b$). The direction cosine matrix of the reference attitude constructed by $v_x^b$, $v_y^b$, and $v_z^b$ can be as follows:
$\bar{C}_n^b = \begin{bmatrix} v_x^b & v_y^b & v_z^b \end{bmatrix} \quad (19)$
The reference attitude quaternion $\bar{q}_n^b$ can be obtained from $\bar{C}_n^b$, and the attitude correction quaternion can be:
$\delta\bar{q} = \bar{q}_n^b \otimes \hat{q}_b^n = \begin{bmatrix} \cos\dfrac{\bar{\phi}}{2} & \dfrac{\bar{\phi}}{\|\bar{\phi}\|}\sin\dfrac{\bar{\phi}}{2} \end{bmatrix}^T \quad (20)$
$\hat{q}_b^n$ in Formula (20) is the latest estimate of the attitude quaternion. Under the small-angle condition, according to the above formula, the observation of the attitude correction can be obtained as follows:
$\bar{\phi} = 2\begin{bmatrix} \delta\bar{q}_1 & \delta\bar{q}_2 & \delta\bar{q}_3 \end{bmatrix}^T \quad (21)$
where $\delta\bar{q}_1$, $\delta\bar{q}_2$, and $\delta\bar{q}_3$ are the vector components of the error quaternion $\delta\bar{q}$.
The observation formula for attitude correction can be:
$Z_\phi = \bar{\phi} = H_\phi X + v_\phi \quad (22)$
where $v_\phi$ denotes the attitude observation noise, $H_\phi = \begin{bmatrix} I_{3\times 3} & 0_{3\times 12} \end{bmatrix}$, and $v_\phi = \begin{bmatrix} v_{\phi x} & v_{\phi y} & v_{\phi z} \end{bmatrix}^T$.
Using Formula (22) as the attitude observation formula of the Kalman filter, the attitude correction value $\hat{\phi}$ output by the Kalman filter can be used to correct the updated attitude value of the gimbal obtained by Formula (4), and the corrected attitude output can be obtained to realize the correction of the attitude of the gimbal.
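The construction of Formulas (17) to (21) can be sketched as follows; the quaternion conventions, the DCM-to-quaternion extraction, and the helper names are assumptions of this illustration and not the patented implementation.

```python
import numpy as np

def dcm_to_quat(C):
    """Scalar-first quaternion from a direction cosine matrix (assumes trace > -1)."""
    q0 = 0.5 * np.sqrt(max(1.0 + np.trace(C), 1e-12))
    q1 = (C[2, 1] - C[1, 2]) / (4.0 * q0)
    q2 = (C[0, 2] - C[2, 0]) / (4.0 * q0)
    q3 = (C[1, 0] - C[0, 1]) / (4.0 * q0)
    return np.array([q0, q1, q2, q3])

def quat_mult(p, q):
    """Hamilton product of scalar-first quaternions."""
    p0, pv = p[0], p[1:]
    q0, qv = q[0], q[1:]
    return np.concatenate(([p0 * q0 - pv @ qv],
                           p0 * qv + q0 * pv + np.cross(pv, qv)))

def attitude_observation(C_nc_ref, f_b, q_hat_bn):
    """Formulas (17)-(21): build the reference DCM from the heading and gravity
    references, convert it to a quaternion, and extract the small-angle correction."""
    v_x_b = C_nc_ref @ np.array([1.0, 0.0, 0.0])        # heading reference in {b}   (17)
    v_z_b = f_b / np.linalg.norm(f_b)                   # gravity reference in {b}   (18)
    v_y_b = np.cross(v_z_b, v_x_b)                      # orthogonal completion
    C_nb_ref = np.column_stack((v_x_b, v_y_b, v_z_b))   # reference DCM              (19)
    q_nb_ref = dcm_to_quat(C_nb_ref)
    dq = quat_mult(q_nb_ref, q_hat_bn)                  # attitude correction quat   (20)
    return 2.0 * dq[1:]                                 # small-angle observation    (21)

phi_obs = attitude_observation(np.eye(3), np.array([0.0, 0.0, 9.8]),
                               np.array([1.0, 0.0, 0.0, 0.0]))
```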
The velocity and position vector $\begin{bmatrix} V_{nc}^n & P_{nc}^n \end{bmatrix}^T$ output by the visual inertial odometer can include the velocity and position of the camera coordinate system {c} relative to the {n} system, while a velocity observation and a position observation of the {b} system need to be obtained. Mechanical errors are not considered herein. During the rotation of the axis arm 4, a parallelogram mechanism of the axis arm 4 can ensure that an end plane of the axis arm is always parallel to the bottom surface of the base 32. Therefore, there is only translational motion between the {b} system and the {p} system.
According to the output of the visual inertial odometer and a geometric and dynamic transmission relationship of the mechanical structure, the reference velocity $V_r^n$ and reference position $P_r^n$ of the axis arm 4 can be solved as follows:
$V_r^n = C_p^n V_r^p = C_p^n\left(V_{nc}^p + \Delta V^p\right) = V_{nc}^n + C_p^n \Delta V^p \quad (23)$
$P_r^n = C_p^n P_r^p = C_p^n\left(P_{nc}^p + \Delta P^p\right) = P_{nc}^n + C_p^n \Delta P^p \quad (24)$
where $C_p^n$ denotes the direction cosine matrix from the {p} system to the {n} system, $\Delta P^p$ is a projection of the relative position vector from $O_b$ to $O_c$ in the {p} system, and $\Delta V^p$ is a projection of the relative velocity vector from $O_b$ to $O_c$ in the {p} system.
Denote $\begin{bmatrix} O_x & O_y & O_z \end{bmatrix}^T$ as the position offset vector from $O_c$ to $O_p$ in the {p} system. Denote the joint angle of the axis arm 4 as $\alpha$, where $\alpha = 0$ when the axis arm 4 is parallel to the base 32, and define the counterclockwise direction as the positive direction. Define the length L of the axis arm 4 as the length from the rotation center line of the axis arm 4 to the end of the axis arm (i.e., the end of the axis arm 4 connected to the gimbal). $\Delta P^p$ can be calculated according to the following formula:
$\Delta P^p = \begin{bmatrix} \Delta P_x^p \\ \Delta P_y^p \\ \Delta P_z^p \end{bmatrix} = \begin{bmatrix} -O_x + L\cos\alpha \\ O_y \\ O_z + L\sin\alpha \end{bmatrix} \quad (25)$
$\Delta V^p$ can be calculated according to the following formula:
$\Delta V^p = \begin{bmatrix} \Delta V_x^p \\ \Delta V_y^p \\ \Delta V_z^p \end{bmatrix} = \begin{bmatrix} L\dot{\alpha}\cos\alpha \\ 0 \\ -L\dot{\alpha}\sin\alpha \end{bmatrix} \quad (26)$
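A possible implementation of the conversion in Formulas (23) to (26), assuming the offset $\begin{bmatrix} O_x & O_y & O_z \end{bmatrix}^T$ and the arm length L are known calibration values, is sketched below; the function name and argument layout are assumptions of this illustration.

```python
import numpy as np

def arm_reference(V_nc_n, P_nc_n, C_pn, alpha, alpha_dot, L, O):
    """Translate the VIO velocity/position of the camera frame into the
    axis-arm reference following Formulas (23)-(26).
    O = [Ox, Oy, Oz] is the position offset in {p}; alpha is the joint angle."""
    dP_p = np.array([-O[0] + L * np.cos(alpha),
                     O[1],
                     O[2] + L * np.sin(alpha)])              # Formula (25)
    dV_p = np.array([L * alpha_dot * np.cos(alpha),
                     0.0,
                     -L * alpha_dot * np.sin(alpha)])        # Formula (26), as given
    V_r_n = V_nc_n + C_pn @ dV_p                             # Formula (23)
    P_r_n = P_nc_n + C_pn @ dP_p                             # Formula (24)
    return V_r_n, P_r_n

V_r, P_r = arm_reference(np.zeros(3), np.zeros(3), np.eye(3),
                         alpha=0.1, alpha_dot=0.0, L=0.2, O=np.zeros(3))
```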
According to Formulas (23) and (24), the reference velocity vector $V_r^n$ and the reference position vector $P_r^n$ can be obtained, and the velocity observation formula and position observation formula of the integrated navigation system can be obtained as:
$Z_V = \hat{V}^n - V_r^n = H_V X + v_V \quad (27)$
$Z_P = \hat{P}^n - P_r^n = H_P X + v_P \quad (28)$
where $H_V = \begin{bmatrix} 0_{3\times 3} & I_{3\times 3} & 0_{3\times 9} \end{bmatrix}$, $v_V = \begin{bmatrix} v_{Vx} & v_{Vy} & v_{Vz} \end{bmatrix}^T$, $H_P = \begin{bmatrix} 0_{3\times 6} & I_{3\times 3} & 0_{3\times 6} \end{bmatrix}$, and $v_P = \begin{bmatrix} v_{Px} & v_{Py} & v_{Pz} \end{bmatrix}^T$; $v_V$ is the velocity observation noise, and $v_P$ is the position observation noise.
Formula (27) can be used as the velocity observation formula of the Kalman filter, a velocity correction value can be output through the Kalman filter, and the updated velocity value obtained by Formula (5) can be corrected using the velocity correction value to obtain a corrected velocity output, thereby realizing the correction of the velocity of the gimbal. Similarly, Formula (28) can be used as the position observation formula of the Kalman filter, a position correction value can be output through the Kalman filter, and the updated position value obtained by Formula (6) can be corrected using the position correction value to obtain a corrected position output, thereby realizing the correction of the position of the gimbal.
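For illustration, a standard Kalman measurement update using the velocity observation of Formula (27) could look as follows; the observation-noise covariance and the numeric values are placeholders, and feeding the estimated velocity error back into the velocity obtained from Formula (5) follows the feedback scheme described above.

```python
import numpy as np

def kf_update(x, P, z, H, R):
    """Standard Kalman measurement update for an observation model z = H x + v."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(P.shape[0]) - K @ H) @ P
    return x, P

# Velocity observation of Formula (27): Z_V = V_hat - V_r = H_V X + v_V.
H_V = np.hstack((np.zeros((3, 3)), np.eye(3), np.zeros((3, 9))))
R_V = np.eye(3) * 1e-3                      # placeholder observation-noise covariance
x = np.zeros(15)
P = np.eye(15) * 1e-2
V_hat = np.array([0.10, 0.00, 0.02])        # inertially propagated velocity
V_ref = np.array([0.08, 0.01, 0.00])        # converted reference from the vision device
x, P = kf_update(x, P, V_hat - V_ref, H_V, R_V)
# x[3:6] now holds the estimated velocity error, which is fed back to correct
# the updated velocity value from Formula (5).
```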
The method consistent with the disclosure can adopt the inertial-visual integrated navigation mode, and correct the first pose obtained by the IMU 1 based on the second pose obtained by the vision device 2 to obtain a pose satisfying the requirements of the control bandwidth and accuracy. The inertial-visual integrated navigation mode consistent with the present disclosure is not subject to interference from electric currents and magnetic fields, and can be suitable for various indoor and outdoor environments.
Referring again to FIGS. 1 to 3, the present disclosure further provides the gimbal pose correction device. The device may include the vertical compensation device connected to the gimbal, the vision device 2 arranged at the vertical compensation device, and the IMU 1 arranged at the vertical compensation device. The vertical compensation device can be configured to compensate for the movement of the gimbal in the vertical direction, and the vision device 2 and the IMU 1 can be electrically connected to the vertical compensation device.
The vertical compensation device can be configured to obtain the first pose of the gimbal based on the IMU 1, obtain the second pose of the vertical compensation device based on the vision device 2, and correct the first pose according to the second pose.
The vertical compensation device further includes the main body 3 and the axis arm 4 connected to the gimbal. The axis arm 4 can rotate to compensate for the movement of the gimbal in the vertical direction. The IMU 1 can be arranged at the axis arm 4, and the vision device 2 can be arranged at the main body 3.
The vision device 2 can include the visual odometer, and the second pose can include the velocity and position of the vertical compensation device. Alternatively, the vision device 2 can include the visual inertial odometer, and the second pose can include the velocity, position, and attitude of the vertical compensation device.
The vertical compensation device can include the axis arm 4 connected to the gimbal. The axis arm 4 can rotate to compensate for the movement of the gimbal in the vertical direction. The angular velocity sensor 6 can be arranged at the axis arm 4. The vertical compensation device can be configured to obtain the joint angle of the axis arm 4 based on the angular velocity sensor 6.
The first pose can include the velocity of the gimbal. The vertical compensation device can be configured to perform the coordinate conversion on the reference velocity of the vertical compensation device output by the vision device 2 according to the joint angle, and obtain the velocity of the vertical compensation device.
The first pose can include the position of the gimbal. The vertical compensation device can be configured to perform the coordinate conversion on the reference position of the vertical compensation device output by the vision device 2 according to the joint angle, and obtain the position of the vertical compensation device.
The vertical compensation device can be configured to construct the reference direction cosine matrix of the reference attitude based on the reference attitude output by the visual inertial odometer, and obtain the attitude of the vertical compensation device according to the reference direction cosine matrix.
The vertical compensation device can be configured to obtain the attitude correction value of the vertical compensation device according to the reference direction cosine matrix, and obtain the attitude of the vertical compensation device according to the attitude correction value.
The first pose can include the velocity, position, and attitude of the gimbal.
The IMU 1 can include the gyroscope and the accelerometer. The vertical compensation device can be configured to obtain the angular velocity of the gimbal based on the gyroscope, obtain the specific force of the gimbal based on the accelerometer, and calculate the attitude, velocity, and position of the gimbal according to the angular velocity and the specific force.
The vertical compensation device can be configured to design the attitude update formula according to the angular velocity and the specific force, and update the attitude of the gimbal according to the attitude update formula.
The vertical compensation device can be configured to design the velocity update formula according to the angular velocity and the specific force, and update the velocity of the gimbal according to the velocity update formula.
The vertical compensation device can be configured to design the position update formula according to the angular velocity and the specific force and update the position of the gimbal according to the position update formula.
The IMU 1 can include the gyroscope and the accelerometer. The vertical compensation device can be configured to obtain the angular velocity of the gimbal based on the gyroscope, obtain the specific force of the gimbal based on the accelerometer, and calculate the error of the first pose according to the angular velocity and the specific force.
The vertical compensation device can be configured to construct the attitude error, velocity error, and position error of the first pose according to the angular velocity and the specific force, and calculate the error of the first pose according to the attitude error, velocity error, and position error.
The vertical compensation device can be configured to approximate the error of the first pose to obtain the Kalman filter, obtain the correction value through the Kalman filter by using the second pose as the observation value, and correct the first pose according to the correction value.
For simplification purposes, detailed descriptions of the operations of exemplary devices are omitted and references can be made to the descriptions of the exemplary methods. The devices described above are merely exemplary. The units described as separate components may or may not be physically separate, and a component shown as a unit may or may not be a physical unit. That is, the units may be located in one place or may be distributed over a plurality of network elements. Some or all of the components may be selected according to the actual needs to achieve the object of the present disclosure. Those skilled in the art can understand and implement without creative work.
The terms “first,” “second,” or the like in the specification, claims, and the drawings of the present disclosure are merely used to distinguish similar elements, and are not intended to describe a specified order or a sequence. In addition, the terms “including,” “comprising,” and variations thereof herein are open, non-limiting terminologies, which are meant to encompass a series of steps of processes and methods, or a series of units of systems, apparatuses, or devices listed thereafter and equivalents thereof as well as additional steps of the processes and methods or units of the systems, apparatuses, or devices.
The gimbal pose correction method and device consistent with the disclosure are described in detail above. It is intended that the disclosed embodiments be considered as exemplary only and not to limit the scope of the disclosure. Changes, modifications, alterations, and variations of the above-described embodiments may be made by those skilled in the art within the scope of the disclosure.

Claims (20)

What is claimed is:
1. A gimbal pose correction method comprising:
obtaining a first pose of a gimbal based on an Inertial Measurement Unit (IMU) arranged at a vertical compensation device, the vertical compensation device being configured to be coupled to the gimbal and compensate for movement of the gimbal in a vertical direction;
obtaining a second pose of the vertical compensation device based on a vision device arranged at the vertical compensation device; and
correcting the first pose according to the second pose.
2. The method of claim 1, wherein:
the vertical compensation device includes a main body and an axis arm configured to be connected to the gimbal;
the IMU is arranged at the axis arm; and
the vision device is arranged at the main body.
3. The method of claim 1, wherein:
the vision device includes a visual odometer; and
the second pose includes a velocity and a position of the vertical compensation device.
4. The method of claim 3,
wherein:
the vertical compensation device includes an axis arm configured to be connected to the gimbal and compensate for the movement of the gimbal in the vertical direction via rotation; and
an angular velocity sensor is arranged at the axis arm;
the method further comprising, before obtaining the second pose:
obtaining a joint angle of the axis arm based on the angular velocity sensor.
5. The method of claim 4, wherein:
the first pose includes a velocity of the gimbal; and
obtaining the second pose includes performing coordinate conversion on a reference velocity of the vertical compensation device output by the vision device according to the joint angle to obtain a velocity of the vertical compensation device.
6. The method of claim 4, wherein:
the first pose includes a position of the gimbal; and
obtaining the second pose includes performing coordinate conversion on a reference position of the vertical compensation device output by the vision device according to the joint angle to obtain a position of the vertical compensation device.
7. The method of claim 1, wherein:
the vision device includes a visual inertial odometer; and
the second pose includes a velocity, a position, and an attitude of the vertical compensation device.
8. The method of claim 7, wherein obtaining the second pose includes:
based on a reference attitude output by the visual inertial odometer, constructing a reference direction cosine matrix of the reference attitude; and
obtaining an attitude of the vertical compensation device according to the reference direction cosine matrix.
9. The method of claim 8, wherein obtaining the attitude of the vertical compensation device includes:
obtaining an attitude correction value of the vertical compensation device according to the reference direction cosine matrix; and
obtaining the attitude of the vertical compensation device according to the attitude correction value.
10. The method of claim 1, wherein the first pose includes a velocity, a position, and an attitude of the gimbal.
11. The method of claim 10, wherein:
the IMU includes a gyroscope and an accelerometer;
obtaining the first pose includes:
obtaining an angular velocity of the gimbal based on the gyroscope;
obtaining a specific force of the gimbal based on the accelerometer; and
calculating the attitude, the velocity, and the position of the gimbal according to the angular velocity and the specific force.
12. The method of claim 11, wherein calculating the attitude, the velocity, and the position of the gimbal includes:
designing an attitude update formula according to the angular velocity and the specific force; and
updating the attitude of the gimbal according to the attitude update formula.
13. The method of claim 11, wherein calculating the attitude, the velocity, and the position of the gimbal includes:
designing a velocity update formula according to the angular velocity and the specific force; and
updating the velocity of the gimbal according to the velocity update formula.
14. The method of claim 11, wherein calculating the attitude, the velocity, and the position of the gimbal includes:
designing a position update formula according to the angular velocity and the specific force; and
updating the position of the gimbal according to the position update formula.
15. The method of claim 10, wherein:
the IMU includes a gyroscope and an accelerometer;
obtaining the first pose of the gimbal includes:
obtaining an angular velocity of the gimbal based on the gyroscope;
obtaining a specific force of the gimbal based on the accelerometer; and
calculating an error of the first pose according to the angular velocity and the specific force.
16. The method of claim 15, wherein calculating the error of the first pose includes:
constructing an attitude error, a velocity error, and a position error of the first pose according to the angular velocity and the specific force; and
calculating the error of the first pose according to the attitude error, the velocity error, and the position error.
17. The method of claim 15, wherein correcting the first pose according to the second pose includes:
approximating the error of the first pose to obtain a Kalman filter;
obtaining a correction value through the Kalman filter by using the second pose as an observation value; and
correcting the first pose according to the correction value.
18. A gimbal pose correction device comprising:
a vertical compensation device configured to be connected to a gimbal and compensate for movement of the gimbal in a vertical direction;
a vision device arranged at and electrically coupled to the vertical compensation device; and
an Inertial Measurement Unit (IMU) arranged at and electrically coupled to the vertical compensation device;
wherein the vertical compensation device is further configured to:
obtain a first pose of the gimbal based on the IMU;
obtain a second pose of the vertical compensation device based on the vision device; and
correct the first pose according to the second pose.
19. The device of claim 18, wherein:
the vertical compensation device includes a main body and an axis arm configured to be connected to the gimbal, the axis arm being configured to rotate to compensate for the movement of the gimbal in the vertical direction;
the IMU is arranged at the axis arm; and
the vision device is arranged at the main body.
20. The device of claim 18, wherein:
the vision device includes a visual odometer; and
the second pose includes a velocity and a position of the vertical compensation device.
US17/075,034 2018-04-25 2020-10-20 Gimbal pose correction method and device Active US11274788B2 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/084499 WO2019205034A1 (en) 2018-04-25 2018-04-25 Camera stabilizer position correction method and device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/084499 Continuation WO2019205034A1 (en) 2018-04-25 2018-04-25 Camera stabilizer position correction method and device

Publications (2)

Publication Number Publication Date
US20210033242A1 US20210033242A1 (en) 2021-02-04
US11274788B2 true US11274788B2 (en) 2022-03-15

Family

ID=68112759

Family Applications (1)

Application Number Title Priority Date Filing Date
US17/075,034 Active US11274788B2 (en) 2018-04-25 2020-10-20 Gimbal pose correction method and device

Country Status (4)

Country Link
US (1) US11274788B2 (en)
EP (1) EP3786757B1 (en)
CN (1) CN110325822B (en)
WO (1) WO2019205034A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020020630A (en) * 2018-07-31 2020-02-06 セイコーエプソン株式会社 Attitude estimation method, attitude estimation device, and moving object
CN111399476B (en) * 2020-03-13 2023-01-10 江西憶源多媒体科技有限公司 Real-time detection method for monitoring out-of-control holder based on image alignment
CN113841025A (en) * 2020-10-14 2021-12-24 深圳市大疆创新科技有限公司 Position and attitude determination method for movable platform, related device and system
CN113406964B (en) * 2021-05-19 2022-11-18 浙江华飞智能科技有限公司 Motion parameter adjusting method and device, storage medium and electronic device
CN117474906B (en) * 2023-12-26 2024-03-26 合肥吉麦智能装备有限公司 Intraoperative X-ray machine resetting method based on spine X-ray image matching

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6965397B1 (en) 1999-11-22 2005-11-15 Sportvision, Inc. Measuring camera attitude
CN102355574A (en) 2011-10-17 2012-02-15 上海大学 Image stabilizing method of airborne tripod head moving target autonomous tracking system
CN102707734A (en) 2012-06-19 2012-10-03 上海大学 Self-stabilizing cloud deck based on inertia attitude sensor
CN102768042A (en) 2012-07-11 2012-11-07 清华大学 Visual-inertial combined navigation method
CN103900473A (en) 2014-03-31 2014-07-02 浙江大学 Intelligent mobile device six-degree-of-freedom fused pose estimation method based on camera and gravity inductor
US20140267778A1 (en) 2013-03-15 2014-09-18 Freefly Systems, Inc. Apparatuses and methods for controlling a gimbal and other displacement systems
CN104698485A (en) 2015-01-09 2015-06-10 中国电子科技集团公司第三十八研究所 BD, GPS and MEMS based integrated navigation system and method
CN104833352A (en) 2015-01-29 2015-08-12 西北工业大学 Multi-medium complex-environment high-precision vision/inertia combination navigation method
US20160273921A1 (en) 2013-12-10 2016-09-22 SZ DJI Technology Co., Ltd. Sensor fusion
WO2017011945A1 (en) 2015-07-17 2017-01-26 深圳市尚腾影科技有限公司 Attitude data input apparatus and method, and cradle head control apparatus and method
CN107278246A (en) 2016-02-01 2017-10-20 深圳市大疆灵眸科技有限公司 Vertical Zeng Wen mechanisms, cradle head device and capture apparatus

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
SE530384C2 (en) * 2006-02-17 2008-05-20 Totalfoersvarets Forskningsins Method for remote control of an unmanned ground vehicle with mobile camera and such ground vehicle
CN201750495U (en) * 2010-07-05 2011-02-16 杭州晨安机电技术有限公司 Posture self-correcting vehicle-mounted camera
US9561870B2 (en) * 2013-12-10 2017-02-07 SZ DJI Technology Co., Ltd. Carrier having non-orthogonal axes
WO2015101822A1 (en) * 2014-01-02 2015-07-09 Mastortech Limited Camera stabilisation mounting
CN104301715A (en) * 2014-10-15 2015-01-21 天津市亚安科技股份有限公司 Method and device for achieving 3D preset accurate linkage between camera and pan-tilt
EP3782912A1 (en) * 2014-12-23 2021-02-24 SZ DJI Osmo Technology Co., Ltd. Uav panoramic imaging
DE102015002595A1 (en) * 2015-02-28 2016-09-01 Audi Ag Method for compensating vertical movements
CN205534966U (en) * 2016-02-01 2016-08-31 深圳市大疆创新科技有限公司 Vertical steady mechanism, cradle head device and shooting equipment of increasing
CN107466378B (en) * 2016-04-14 2019-12-10 深圳市大疆灵眸科技有限公司 Vertical stability augmentation mechanism, holder device, supporting device and shooting equipment
CN107079103B (en) * 2016-05-31 2019-04-02 深圳市大疆灵眸科技有限公司 Cloud platform control method, device and holder
WO2018023492A1 (en) * 2016-08-03 2018-02-08 深圳市大疆灵眸科技有限公司 Mount control method and system
CN106231192B (en) * 2016-08-04 2019-05-07 北京二郎神科技有限公司 A kind of image-pickup method and device
CN106200693B (en) * 2016-08-12 2019-05-21 东南大学 The holder real-time control system and control method of land investigation small drone
CN106292741A (en) * 2016-09-27 2017-01-04 成都普诺思博科技有限公司 A kind of mobile machine user tripod head system based on brushless electric machine
WO2018120059A1 (en) * 2016-12-30 2018-07-05 深圳市大疆灵眸科技有限公司 Control method and system for cradle head, cradle head, and unmanned aerial vehicle
CN206417213U (en) * 2016-12-30 2017-08-18 深圳市大疆灵眸科技有限公司 Head and unmanned vehicle
WO2018120012A1 (en) * 2016-12-30 2018-07-05 深圳市大疆灵眸科技有限公司 Method and device for controlling cradle head, and cradle head
CN106953553A (en) * 2017-03-12 2017-07-14 纳恩博(北京)科技有限公司 The control method and device of a kind of head and horizontal stage electric machine
CN107065926A (en) * 2017-04-12 2017-08-18 普宙飞行器科技(深圳)有限公司 Omnidirectional's obstacle avoidance apparatus, head, the control method of head and avoidance obstacle method
CN107235013A (en) * 2017-07-28 2017-10-10 深圳普思英察科技有限公司 Automotive positioning pan and tilt head
CN207191468U (en) * 2017-09-11 2018-04-06 深圳市大疆创新科技有限公司 Head camera and the unmanned plane with the head camera
CN107741748A (en) * 2017-10-13 2018-02-27 杭州数尔安防科技股份有限公司 A kind of device of two-axis position automatic straightening

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
The World Intellectual Property Organization (WIPO) International Search Report for PCT/CN2018/084499 dated Sep. 4, 2018 4 Pages (including translation).

Also Published As

Publication number Publication date
EP3786757A1 (en) 2021-03-03
WO2019205034A1 (en) 2019-10-31
US20210033242A1 (en) 2021-02-04
CN110325822B (en) 2023-06-27
EP3786757B1 (en) 2022-09-21
EP3786757A4 (en) 2021-12-08
CN110325822A (en) 2019-10-11
