US20190219392A1 - Measuring camera to body alignment for an imager mounted within a structural body


Info

Publication number: US20190219392A1
Application number: US15/873,036
Granted as: US10458793B2
Inventors: Daniel P. Everson, James M. Maley
Applicant/Assignee: The United States of America as represented by the Secretary of the Army (US Army Research Laboratory)
Legal status: Granted; Active

Classifications

    • G01C 11/02: Picture taking arrangements specially adapted for photogrammetry or photographic surveying
    • G01C 11/06: Interpretation of pictures by comparison of two or more pictures of the same area
    • G01C 1/00: Measuring angles
    • G01C 21/20: Instruments for performing navigational calculations
    • G01B 11/27: Measuring arrangements using optical means for testing the alignment of axes
    • G01B 5/0025: Measuring of vehicle parts by mechanical means
    • G02B 7/003: Alignment of optical elements
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters (camera calibration)
    • G06T 7/85: Stereo camera calibration
    • G06T 2207/10016: Video; image sequence
    • H04N 13/246: Calibration of cameras (stereoscopic image cameras)
    • H04N 13/0246 (legacy code)
    • H04N 17/002: Diagnosis, testing or measuring for television cameras

Abstract

A technique is provided to measure the alignment between an imaging device mounted in a guided projectile and the projectile body with respect to a world coordinate frame. The technique includes producing a pure rolling motion of the body of the projectile, capturing a series of images of an imaging device calibration target over a range of body roll angles of the rolling body, measuring the roll angles of the rolling body with respect to a world coordinate frame as defined by the imaging device calibration target, simultaneously estimating alignment angles of the imaging device and misalignment angles associated with an orientation of the body, and estimating a rotational transform between an imaging device coordinate frame and a body coordinate frame based on the estimated alignment angles and misalignment angles.

Description

    GOVERNMENT INTEREST
  • The embodiments described herein may be manufactured, used, and/or licensed by or for the United States Government without the payment of royalties thereon.
  • BACKGROUND
  • Technical Field
  • The embodiments herein generally relate to navigation systems, and more particularly to navigation systems using images to estimate positions and orientations of a structural body or vehicle.
  • Description of the Related Art
  • Vision based navigation systems utilize information from a digital imaging system to estimate navigation states of a vehicle. A camera intrinsics model developed through calibration of an imager describes the projective transformation from the observed world scene into the camera coordinate frame. This transformation allows for pixel locations identified by computer vision algorithms to be used in estimating navigation states. However, this process relates the world scene to the camera coordinate frame and an additional transformation is needed to provide estimates in the vehicle's body coordinate frame as is necessary in order to be useful for vehicle guidance. Accordingly, measurement of the rotation transform between the camera coordinate frame and the body coordinate frame is a necessary element of implementing a vision based navigation system.
  • The conventional solutions generally require direct measurement of the orientation of the body and the reference objects in the Earth/world coordinate frame using precision instruments, sensing the motion of the body with respect to a stationary set of reference objects, or measuring the relative orientation of the body with respect to the world coordinate frame using precision alignment devices. However, these approaches are generally costly and/or imprecise.
  • SUMMARY
  • In view of the foregoing, an embodiment herein provides a method for measuring an alignment of an imaging device, the method comprising producing a pure rolling motion of a body; capturing a series of images of an imaging device calibration target over a range of body roll angles of the rolling body; measuring the roll angles of the rolling body with respect to a world coordinate frame as defined by the imaging device calibration target; simultaneously estimating alignment angles of the imaging device and misalignment angles associated with an orientation of the body; and estimating a rotational transform between an imaging device coordinate frame and a body coordinate frame based on the estimated alignment angles and misalignment angles. The alignment angles may comprise three rotation angles representing a transform between the body coordinate frame and the imaging device coordinate frame, and the misalignment angles may comprise two rotation angles representing a transform between a roll axis of the body and the world coordinate frame. The body may comprise a cylindrical body. The pure rolling motion of the body may be produced by a pair of support mechanisms that mechanically constrain the body. The support mechanisms may comprise v-block mechanisms. The method may comprise using an imaging device extrinsics estimate to define a world-to-camera coordinate frame rotation for each image in the series of images. The method may comprise using a numerical optimization process to estimate the alignment angles of the imaging device and misalignment angles associated with an orientation of the body. The body may comprise a guided projectile.
  • Another embodiment provides a system for measuring an alignment of an imager, the system comprising a pair of support mechanisms to produce a pure rolling motion of a body; an imager rigidly affixed to the body to capture a series of images of an imager calibration target over a range of body roll angles of the rolling body; a sensor to measure the roll angles of the rolling body with respect to a world coordinate frame as defined by the imager calibration target; and a processor to simultaneously estimate alignment angles of the imager and misalignment angles associated with an orientation of the body; and estimate a rotational transform between an imager coordinate frame and a body coordinate frame based on the estimated alignment angles and misalignment angles, wherein the alignment angles comprise three rotation angles representing a transform between the body coordinate frame and the imager coordinate frame, and wherein the misalignment angles comprise two rotation angles representing a transform between a roll axis of the body and the world coordinate frame. The imager calibration target may comprise a checkerboard camera calibration target. The sensor may comprise any of an inclinometer and an inertial measurement unit device. The processor may use an imager extrinsics estimate to define a world-to-camera coordinate frame rotation for each image in the series of images. The processor may use a numerical optimization process to estimate the alignment angles of the imager and misalignment angles associated with an orientation of the body. The body may comprise a guided projectile containing the imager.
  • Another embodiment provides a non-transitory computer readable medium comprising instructions that when executed cause a processor of a computing device to detect a pure rolling motion of a body; process a series of images of an imaging device calibration target over a range of body roll angles of the rolling body; measure the roll angles of the rolling body with respect to a world coordinate frame as defined by the imaging device calibration target; simultaneously estimate alignment angles of the imaging device and misalignment angles associated with an orientation of the body; and estimate a rotational transform between an imaging device coordinate frame and a body coordinate frame based on the estimated alignment angles and misalignment angles. The alignment angles may comprise three rotation angles representing a transform between the body coordinate frame and the imaging device coordinate frame. The misalignment angles may comprise two rotation angles representing a transform between a roll axis of the body and the world coordinate frame. The imaging device calibration target may comprise a checkerboard camera calibration target. The processor may use an imaging device extrinsics estimate to define a world-to-camera coordinate frame rotation for each image in the series of images. The processor may use a numerical optimization process to estimate the alignment angles of the imaging device and misalignment angles associated with an orientation of the body.
  • These and other aspects of the embodiments herein will be better appreciated and understood when considered in conjunction with the following description and the accompanying drawings. It should be understood, however, that the following descriptions, while indicating preferred embodiments and numerous specific details thereof, are given by way of illustration and not of limitation. Many changes and modifications may be made within the scope of the embodiments herein without departing from the spirit thereof, and the embodiments herein include all such modifications.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The embodiments herein will be better understood from the following detailed description with reference to the drawings, in which:
  • FIG. 1 is a block diagram illustrating a system, according to an embodiment herein;
  • FIG. 2 is a schematic diagram of a structural body, according to an embodiment herein;
  • FIG. 3 is a schematic diagram of a calibration target, according to an embodiment herein;
  • FIG. 4 is a schematic diagram of v-block mechanisms, according to an embodiment herein;
  • FIG. 5 is a schematic diagram of a structural body set in v-block mechanisms, according to an embodiment herein;
  • FIG. 6 is an illustrative example of performing a camera alignment process, according to an embodiment herein; and
  • FIG. 7 is a block diagram of a computing device, according to an embodiment herein.
  • DETAILED DESCRIPTION
  • The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
  • The embodiments herein provide a method for measuring the camera to body rotational transformation for an imager mounted within a cylindrical body, as would typically be found for an imager mounted on the nose of a projectile. The method measures the transform without requiring a precise geometrical configuration of measurement hardware and without relying on lasers or other expensive alignment devices. The embodiments herein may be used to identify the camera-to-body rotational transform that is necessary to convert computer vision results, such as navigation states, from the camera coordinate frame to the body coordinate frame. Referring now to the drawings, and more particularly to FIGS. 1 through 7, there are shown exemplary embodiments.
  • FIG. 1 is a block diagram of a system 100 for measuring an alignment of an imaging device 5. The imaging device 5 may comprise a camera, smartphone, computing device, or any other electronic device configured to take and/or process images and/or video. For the purposes of the descriptions herein, the terms imaging device, imager, and camera are used interchangeably, without implication of any specific type of imaging device. In an example, the system 100 comprises support mechanisms 30 to produce a pure rolling motion of a body 10. In an example, the support mechanisms 30 may comprise v-block support mechanisms 30. However, the support mechanisms 30 may include any suitable structure for mechanically constraining the body 10 to produce a pure rolling motion (i.e., all of the points of the body 10 which lie on the longitudinal axis of the body 10 are stationary as the body 10 rotates about this axis). The v-block configuration, while described as an exemplary embodiment, is not restrictive, and other configurations and types of support mechanisms 30 may be used in accordance with the embodiments herein. Furthermore, the body 10 may be a structural body and configured in any suitable shape including, without limitation, cylindrical, conical, or elongated, among other shapes and configurations capable of having a pure rolling motion. In an example, the body 10 may comprise a guided projectile containing the imaging device 5. The imaging device 5, which may be rigidly affixed to the body 10, captures a series of images 50 of an imaging device calibration target 20 over a range of body roll angles of the rolling body 10. A sensor 25 is used to measure the roll angles of the rolling body 10 with respect to a world coordinate frame as defined by the imaging device calibration target 20.
The system 100 also includes a processor 55 to simultaneously estimate alignment angles φC, θC, ϕC of the imaging device 5 and misalignment angles φVB, θVB associated with the alignment of the roll axis of the body 10, wherein the alignment angles φC, θC, ϕC comprise three rotation angles representing a transform between the body coordinate frame and the imaging device coordinate frame, and wherein the misalignment angles φVB, θVB comprise two rotation angles representing a transform between the roll axis of the body 10 and the world coordinate frame as constrained by the orientation of the support mechanisms 30. The processor 55 also estimates a rotational transform between the imaging device coordinate frame and the body coordinate frame. The imaging device calibration target 20 may comprise a checkerboard camera calibration target, according to one example; other types of calibration targets may also be used in accordance with the embodiments herein. The sensor 25 may comprise any of an inclinometer and an inertial measurement unit device. The processor 55 is to use an imaging device extrinsics estimate to define a world-to-camera coordinate frame rotation for each image in the series of images 50. The processor 55 is to use a numerical optimization process to estimate the alignment angles φC, θC, ϕC and the misalignment angles φVB, θVB.
  • The embodiments herein utilize a previously-generated pinhole camera model and lens distortion model to describe the projective transformation between the world coordinate frame and the camera coordinate frame of the imager (e.g., imaging device 5) installed within a body 10, such as shown in FIG. 2, with reference to FIG. 1. In an example, the body 10 may be approximately 83 mm, although other configurations are possible in accordance with the embodiments herein. These models are generated using a planar checkerboard camera calibration technique, according to an example, which relies on multiple images of a planar calibration target 20, such as shown in FIG. 3, with reference to FIGS. 1 and 2, to identify world coordinate frame data points located at the corner points (x0,y0; x0,y1; x1,y0; x1,y1) of each block of the calibration target 20. Accordingly, as used herein, “corner points” refers to all of the intersections of the black and white squares on the calibration target 20, and in the example of FIG. 3, there are 171 such corner points, with only one set labeled for illustration purposes. Other configurations of the calibration target 20 may render any number of corner points, which may be used in accordance with the embodiments herein. The processor 55 that estimates the extrinsics for each image uses many corner points (for example, >50) to estimate (via numerical optimization) the position and orientation of the imaging device 5 with respect to the calibration target 20. These world points are then used to perform a numerical optimization that estimates the parameters of the camera pinhole model and lens distortion model.
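The projective transformation described by the pinhole model can be sketched as follows (an illustrative Python/NumPy sketch; the focal length and principal point values are assumed for the example and are not the patent's calibrated parameters):

```python
import numpy as np

# Illustrative pinhole intrinsics (assumed values, not the patent's):
# focal lengths fx = fy = 800 px, principal point at (320, 240).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project(points_cam, K):
    """Project Nx3 points, expressed in the camera frame, to pixel coordinates."""
    uvw = (K @ points_cam.T).T          # homogeneous image coordinates
    return uvw[:, :2] / uvw[:, 2:3]     # perspective divide

# A point on the optical axis projects to the principal point.
print(project(np.array([[0.0, 0.0, 2.0]]), K))   # [[320. 240.]]
```

In practice the projection also includes the lens distortion model estimated during calibration; the sketch above omits it for brevity.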
  • The camera model obtained through camera calibration may be used to estimate the position and orientation of the imager (camera extrinsics) for a given image with respect to the world coordinate frame defined by the position and orientation of the planar calibration target 20. If the orientation of the body 10 with respect to the calibration target 20 is also known over a series of images this information can be used to determine the rotational transformation between the camera coordinate frame and the body coordinate frame. However, it is a difficult process to accurately measure the orientation of the body 10 with respect to the calibration target 20. To do so would typically require the use of precision staging and mechanical or optical alignment devices such as lasers or theodolites. The method for measuring the camera to body rotational transformation provided by the embodiments herein estimates the camera to body rotational transformation without the need for these devices.
  • As mentioned, the support mechanisms 30 may be configured as v-blocks, as shown in FIG. 4, with reference to FIGS. 1 through 3, and may be used in accordance with the method described below. Because the body 10 within which the imager (e.g., imaging device 5) is mounted is cylindrical in one example, v-block support mechanisms 30, as shown in FIG. 5, with reference to FIGS. 1 through 4, may be used to provide a mechanism to easily generate a pure rolling motion of the body 10. The method provided by the embodiments herein uses the pure rolling motion provided by the support mechanisms 30 to extract the body to camera rotation transformation while requiring only limited additional physical constraints on the calibration setup, such that the roll angle of the body at the moment each image in the series of images 50 is captured is measured with respect to the world frame as defined by the orientation of the calibration target 20. Preferably, the calibration target coordinate system x-axis is leveled, i.e., set perpendicular to the Earth's gravity vector. This may be achieved using a precise bubble level or digital inclinometer. Additionally, the roll angle of the body 10 with respect to the Earth's gravity vector should be measured for each image that is collected. This may be achieved with a center finding inclinometer, digital inclinometer, or a digital IMU installed within the body 10. Neither of these constraints necessitates the use of expensive mechanical or optical alignment devices.
  • According to the embodiments herein, a series of images of a calibration target 20 is captured over a range of body roll angles. The previously calculated camera intrinsics and distortion models are used to estimate the camera position and orientation (camera extrinsics) with respect to the world coordinate frame as defined by the calibration target 20 for each image in the series. Using the camera pose for each image, non-linear least squares numerical optimization is used to solve for the three rotation angles that represent the transform between the body coordinate frame and the camera coordinate frame as well as the two rotation angles that represent the transform between the body roll axis (as represented by the v-block support mechanisms 30) and the world coordinate frame. By including the rotation angles for the world to v-block transformation in the optimization solution the method provided by the embodiments herein alleviates any need to precisely align the v-block support mechanisms 30 with the calibration target 20 so long as the calibration target x-axis is leveled.
  • The measurement method utilizes rotation matrices to describe the transformation between a series of reference coordinate frames. Each of these rotation matrices is a direction cosine matrix (DCM) which can be represented by a combination of three Euler rotation angles about the three axes of a given coordinate frame.
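A DCM of this kind can be built from three Euler angles as sketched below (Python/NumPy; the 3-2-1 yaw-pitch-roll rotation order is an assumption consistent with the yaw, pitch, and roll terms used in the following sections):

```python
import numpy as np

def euler_to_dcm(yaw, pitch, roll):
    """Direction cosine matrix for a 3-2-1 (yaw-pitch-roll) frame rotation, angles in radians."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[ cy,  sy, 0.0], [-sy,  cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[ cp, 0.0, -sp], [0.0, 1.0, 0.0], [ sp, 0.0,  cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0,  cr,  sr], [0.0, -sr,  cr]])
    return Rx @ Ry @ Rz   # rotate about z, then y, then x
```

Any DCM built this way is orthonormal, so its inverse is simply its transpose, a property the transform chain below relies on.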
  • Rotation from the world coordinate frame to the nominal orientation of the v-block support mechanisms 30 is given by the 3×3 rotation matrix R_W^VB~. Using this nominal rotation allows for the world coordinate frame and the v-block support mechanisms 30 to use axis conventions that are not aligned while allowing the optimization calculation to operate on relatively small rotation angles. Typical axis convention for the world coordinate frame is x-axis to the right, y-axis down, z-axis into the calibration target 20. Typical axis convention for the v-block support mechanisms 30 is x-axis parallel to the v-notch, y-axis to the right, z-axis down. This results in a situation where the z-axis of the world coordinate frame is nominally aligned with the x-axis of the v-block support mechanisms 30. Thus, the rotation matrix R_W^VB~ comprises a series of 90° rotations resulting in the rotation matrix below:
  • R_W^VB~ = [0 0 1; 1 0 0; 0 1 0]
  • Rotation from the nominal orientation of the v-block support mechanisms 30 to the actual orientation of the v-block support mechanisms 30, accounting for misalignment between the calibration target 20 and the v-block support mechanisms 30, is given by the 3×3 rotation matrix R_VB~^VB. This rotation matrix is constructed from only the yaw and pitch Euler angle terms, with the roll term set equal to 0: the calibration target 20 is leveled as part of the measurement procedure, and the physical nature of the v-block support mechanisms 30 dictates that they impose no influence on the roll angle of the body 10. Accordingly, R_VB~^VB = f(φVB, θVB), where the angles φVB, θVB are the Euler rotation angles between the nominal v-block coordinate frame and the actual v-block coordinate frame accounting for misalignment.
  • Rotation from the actual orientation of the v-block to the coordinate frame of the body is given by the 3×3 matrix R_VB^B. This matrix is composed of only the roll Euler angle term, as rotating the body 10 on the v-block support mechanisms 30 produces a pure rolling motion. R_VB^B = f(ϕB), where the angle ϕB is the Euler roll angle of the body 10 on the v-block support mechanisms 30 as measured by the center finding inclinometer or other means. The body roll angle is accurately referenced to the world coordinate frame because the leveled calibration target 20 and the center finding inclinometer use the Earth's gravity vector as a common reference. The measured sequence of roll angles for the captured series of images is an input to the optimization calculation.
  • Rotation from the body 10 to the nominal camera coordinate frame is given by the 3×3 rotation matrix R_B^C~. This rotation allows for the body 10 and imaging device 5 to use axis conventions that are not aligned while allowing the optimization calculation to operate on relatively small angles. Typical axis convention for the body is x-axis out the nose, y-axis out the right wing, z-axis down. Typical axis convention for the camera is x-axis to the right of the image, y-axis down, z-axis out the lens. This results in a situation where the x-axis of the body 10 is nominally aligned with the z-axis of the camera. Thus, the rotation matrix R_B^C~ comprises a series of 90° rotations resulting in the rotation matrix below:
  • R_B^C~ = [0 1 0; 0 0 1; 1 0 0]
  • Rotation from the nominal camera coordinate frame to the actual camera coordinate frame is given by the 3×3 rotation matrix R_C~^C. This rotation matrix describes the misalignment between the camera projection axis (camera z-axis) and the body longitudinal axis (body x-axis) as well as the rotation of the image coordinate frame with respect to the body roll orientation (camera yaw, rotation about z-axis). R_C~^C = f(φC, θC, ϕC), where the angles φC, θC, ϕC are the Euler rotation angles between the nominal camera coordinate frame and the actual camera coordinate frame.
  • Multiplication of rotation matrices for intermediate coordinate frames allows for the rotation from the world coordinate frame to the actual camera coordinate frame as shown in FIG. 6. This series of rotations can be expressed as:

  • R_W^C = R_C~^C R_B^C~ R_VB^B R_VB~^VB R_W^VB~
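The composition above can be sketched as follows (Python/NumPy; the two fixed 90° matrices are taken from the text, while the helper functions and the function name are illustrative):

```python
import numpy as np

def Rx(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, s], [0.0, -s, c]])

def Ry(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, -s], [0.0, 1.0, 0.0], [s, 0.0, c]])

def Rz(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, s, 0.0], [-s, c, 0.0], [0.0, 0.0, 1.0]])

# Fixed 90-degree axis-convention rotations given in the text.
R_W_VBn = np.array([[0.0, 0.0, 1.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])  # world -> nominal v-block
R_B_Cn  = np.array([[0.0, 1.0, 0.0], [0.0, 0.0, 1.0], [1.0, 0.0, 0.0]])  # body  -> nominal camera

def world_to_camera(psi_vb, th_vb, phi_b, psi_c, th_c, phi_c):
    """R_W^C = R_C~^C R_B^C~ R_VB^B R_VB~^VB R_W^VB~ (all angles in radians)."""
    R_VBn_VB = Ry(th_vb) @ Rz(psi_vb)             # yaw and pitch only; roll fixed at 0
    R_VB_B   = Rx(phi_b)                          # pure roll on the v-blocks
    R_Cn_C   = Rx(phi_c) @ Ry(th_c) @ Rz(psi_c)   # camera misalignment
    return R_Cn_C @ R_B_Cn @ R_VB_B @ R_VBn_VB @ R_W_VBn
```

With all angles set to zero the chain reduces to R_B^C~ R_W^VB~, which for the axis conventions in the text is the identity, i.e., a perfectly aligned setup maps world axes directly onto camera axes.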
  • Using the above definitions for rotation matrices, the world to misaligned camera rotation transformation R_W^C is a function of five unknown variables φVB, θVB, φC, θC, ϕC and one measured variable ϕB, which is recorded at the time the series of images is captured.
  • The series of images captured at different body roll angles provides a dataset that may be used to perform a non-linear least squares numerical optimization to estimate the five unknowns. The numerical optimization requires the calculation of an error residual that can be minimized to arrive at an optimization solution. This error residual is calculated by comparing the rotation transformation R_W^C = f(φC, θC, ϕC, φVB, θVB, ϕB) to the world to camera transformation provided by the extrinsics estimate for each captured image. This comparison results in a residual rotation matrix given by R_residual = R_W^C (R_W^C′)ᵀ, where (R_W^C′)ᵀ is the transpose of the measured world to camera rotation matrix R_W^C′ provided by the extrinsics estimate. R_residual is converted into three Euler angles, which form the error residual between the numerical optimization solution and the measured world to camera transformation from the extrinsics estimates for each image. The residual vector r is a column vector (of dimension three times the number of images by one), and the least squares cost function becomes:

  • ](φCCϕCVBVBB)=r′r
  • Once the numerical optimization has solved for an estimate of the nominal camera coordinate frame to actual camera coordinate frame rotation matrix R_C~^C, the camera to body rotation matrix R_C^B, needed to produce image based navigation states in the body coordinate frame, may be generated using the following equations:

  • R_B^C = R_C~^C R_B^C~

  • R_C^B = (R_B^C)⁻¹
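This final conversion can be sketched as follows (Python/NumPy; R_Cn_C below is a hypothetical placeholder for the optimized nominal-to-actual camera rotation):

```python
import numpy as np

# Body -> nominal camera axis-convention rotation from the text.
R_B_Cn = np.array([[0.0, 1.0, 0.0], [0.0, 0.0, 1.0], [1.0, 0.0, 0.0]])

# Placeholder: a small optimized nominal -> actual camera rotation (assumed angle).
a = 0.01
R_Cn_C = np.array([[np.cos(a), np.sin(a), 0.0],
                   [-np.sin(a), np.cos(a), 0.0],
                   [0.0, 0.0, 1.0]])

R_B_C = R_Cn_C @ R_B_Cn   # body -> actual camera
R_C_B = R_B_C.T           # the inverse of a rotation matrix is its transpose
```

Using the transpose in place of a general matrix inverse is both cheaper and numerically safer, since R_B^C is orthonormal by construction.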
  • The procedure described above may be used to estimate the camera to body transformation for various imagers containing a lens mounted within a body 10.
  • Experimentally, the method described herein was demonstrated with an imager 5 mounted within an 83 mm cylindrical body 10. A camera intrinsics and distortion model for the imaging device 5 was previously calculated in MATLAB® software using a refined version of the camera calibration toolbox utilities available from the California Institute of Technology and a series of images of a checkerboard calibration target 20. The demonstration experiment used the same checkerboard calibration target used for estimation of the camera intrinsics. A pair of v-block support mechanisms 30 was used to capture a series of nineteen images of this calibration target over the sequence of body roll angles (in degrees) listed below:

  • [0, 9.95, 20.3, 29.6, 40.4, 50.6, 59.8, 69.7, 89.8, 0, −10.6, −20.3, −30.3, −40.2, −50.2, −60, −70.7, −89.2, 0.04]
  • MATLAB® Computer Vision System Toolbox was used to estimate the extrinsics for each image in the roll sequence. These extrinsics estimates and the roll values measured by a digital inclinometer were used as inputs to a non-linear least squares optimization to solve for the five variables φVB, θVB, φC, θC, ϕC using the MATLAB® software function lsqnonlin. Initial estimates for each unknown were set equal to 0°. The Jacobian (H) of the cost function returned by the MATLAB® software function was used to determine the estimation quality by forming an estimate of the Fisher information matrix:

  • F=H′H
  • The information matrix was full rank with a condition number of 5.41. This shows that the five parameters are identifiable from the data, although there is some cross correlation between the variables. To further evaluate the optimization accuracy, the information matrix was inverted to estimate the covariance of the estimated quantities, with the diagonal entries used to calculate the standard deviations. The optimization returned the results presented in Table 1 below:
  • TABLE 1

        Variable    Mean (degrees)    Standard Deviation (degrees)
        φVB         −6.667            0.229
        θVB         −0.276            0.316
        φC          −61.073           0.316
        θC          −0.176            0.316
        ϕC          −0.667            0.316
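The estimation-quality computation described above (Fisher information F = H′H, its condition number, and standard deviations like those in Table 1) follows mechanically from the Jacobian. A sketch, assuming H is the Jacobian of the residual vector as returned by the optimizer:

```python
import numpy as np

def estimation_quality(H):
    """Form the Fisher information F = H'H, its condition number, and the
    per-parameter standard deviations obtained by inverting F (the
    covariance estimate) and taking square roots of its diagonal."""
    F = H.T @ H
    cond = np.linalg.cond(F)
    sigmas = np.sqrt(np.diag(np.linalg.inv(F)))
    return F, cond, sigmas
```

A condition number near one indicates well-separated parameters; large off-diagonal covariance entries would reflect the cross correlation between variables noted above.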
  • Mean residual for the optimization result was 0.034°, indicating that the rotation error between the optimization solution and the extrinsics estimate calculated for each image was small. Investigation into the Euler angle components of the residual error revealed that the mean error for the camera yaw axis was 0.057° while the mean error for the camera pitch and roll axes was 0.024° and 0.019°, respectively. The larger mean error value for the camera yaw axis may be attributed to the precision of the digital inclinometer that was used to measure the body roll angle for each image, which is limited to 0.1° precision for angles of magnitude greater than 10°. A more precise measurement could potentially further reduce the residual produced by the optimization solution. These results demonstrate the ability of the method provided by the embodiments herein to accurately measure the angles which characterize the imaging device to body rotation transformation (accuracy <0.1° camera yaw and <0.05° camera pitch and roll).
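The per-axis breakdown of the residual reported above is a reshape-and-average over the stacked residual vector; a minimal sketch:

```python
import numpy as np

def residual_axis_means(r):
    """Mean absolute residual per camera axis. r is the stacked residual
    vector with three entries (yaw, pitch, roll) per image."""
    return np.abs(np.reshape(r, (-1, 3))).mean(axis=0)
```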
  • To demonstrate that the method provided by the embodiments herein is suitable for producing repeatable results, as well as suitable for differentiating between world to v-block misalignment and camera to body alignment, the process was repeated for three sets of images. For the first set, the calibration target 20 was placed approximately perpendicular to the z-axis of the imaging device 5. For the second set, the calibration target 20 was tilted by approximately 10° (measured by a digital inclinometer). For the third set, the calibration target 20 was panned by a few degrees (not measured). The result of the optimization calculation for these three image sets is provided in Table 2 below. A center finding inclinometer, which is precise to only 1°, was used to measure the body roll angle for this experiment, resulting in reduced accuracy in estimating the camera yaw angle when compared to the results presented above, which used a digital inclinometer to measure the body roll angle. The results presented in Table 2 demonstrate the ability of the method provided by the embodiments herein to produce repeatable results even if the body roll axis, as constrained by the v-block support mechanisms 30, is misaligned with the calibration target 20.
  • TABLE 2

                        C~ to C Euler Angles (degrees)    VB~ to VB Euler Angles (degrees)    Mean Residual (degrees)
                        Yaw        Pitch      Roll        Yaw        Pitch
        Original Fit    −16.3022   −0.3845    −0.0346     1.9017     0.1454                   0.2462
        Board Tilt      −15.8626   −0.3854    −0.0337     1.9293     10.522                   0.1244
        Board Pan       −16.2485   −0.3636    −0.0438     −3.4806    0.0179                   0.2401
  • The method according to the embodiments herein provides a unique approach to measuring the camera to body rotational transform without the need for a precise physical configuration of the measurement hardware. Effective measurement may be accomplished with only a digital inclinometer to level the checkerboard calibration target 20 and a center finding inclinometer or digital inclinometer to measure the roll angle of the body 10. Because the method solves for misalignment between the calibration target 20 and the v-block support mechanisms 30 during numerical optimization, expensive mechanical or optical alignment devices are not necessary for effective implementation of the method provided by the embodiments herein.
  • Measurement results for a calibrated imager (e.g., imaging device 5) mounted within a body 10 demonstrate the ability of the method provided by the embodiments herein to generate repeatable measurements of the camera to body rotational transformation even if the checkerboard calibration target 20 and v-block support mechanisms 30 are misaligned.
  • Various examples described herein may include both hardware and software elements. The examples that are implemented in software may include firmware, resident software, microcode, etc. Other examples may include a computer program product configured to include a pre-configured set of instructions, which when performed, may result in actions as stated in conjunction with the methods described above. In an example, the preconfigured set of instructions may be stored on a tangible non-transitory computer readable medium or a program storage device containing software code. In the software embodiments, instructions may be provided to a computing device 70 by the processor 55 linked to the computing device 70.
  • FIG. 7 with reference to FIGS. 1 through 6, is a block diagram of computing device 70 comprising the processor 55 as described above and a machine-readable storage medium 75. Processor 55 may include a central processing unit, microprocessors, and/or other hardware devices suitable for retrieval and execution of instructions stored in a machine-readable storage medium 75. Processor 55 may fetch, decode, and execute computer-executable instructions 81, 83, 85, 87, and 89 to enable execution of locally-hosted or remotely-hosted applications for controlling action of the computing device 70. As an alternative or in addition to retrieving and executing instructions, processor 55 may include one or more electronic circuits including a number of electronic components for performing the functionality of one or more of instructions 81, 83, 85, 87, and 89.
  • The machine-readable storage medium 75 may be any electronic, magnetic, optical, or other physical storage device that stores executable instructions. Thus, the machine-readable storage medium 75 may be, for example, Read-Only Memory, an Electrically-Erasable Programmable Read-Only Memory, a storage drive, an optical disc, and the like. In one example, the machine-readable storage medium 75 may include a non-transitory computer-readable storage medium.
  • In an example, the processor 55 executes computer readable instructions 81-89. For example, computer-executable detecting instructions 81 may detect a pure rolling motion of a body 10. Computer-executable processing instructions 83 may process a series of images 50 of an imaging device calibration target 20 over a range of body roll angles of the rolling body 10. For example, the images 50 are processed to estimate the extrinsics for each image. Computer-executable measuring instructions 85 may measure the roll angles of the rolling body 10 with respect to a world coordinate frame as defined by the imaging device calibration target 20. Computer-executable estimating instructions 87 may simultaneously estimate alignment angles φC, θC, ϕC of the imaging device 5 and misalignment angles φVB, θVB associated with an orientation of the body 10. Computer-executable estimating instructions 89 may estimate a rotational transform between an imaging device coordinate frame and a body coordinate frame based on the estimated alignment angles φC, θC, ϕC and misalignment angles φVB, θVB.
  • The processor 55 is to use an imaging device extrinsics estimate to define a world-to-camera coordinate frame rotation for each image in the series of images 50. The processor 55 is to use a numerical optimization process to estimate the alignment angles φC, θC, ϕC of the imaging device 5 and misalignment angles φVB, θVB associated with an orientation of the body 10.
  • According to the embodiments herein, v-block support mechanisms 30 may be used to produce pure rolling motion of a body 10. A series of images 50 of a checkerboard camera calibration target 20 is captured over a range of body roll angles. Numerical optimization may be used to estimate the rotational transform between the camera coordinate frame and the body coordinate frame and the misalignment between the v-block support mechanisms 30 and the calibration target 20. The embodiments herein avoid the requirement of directly measuring the orientation of the body 10 with respect to the calibration target 20 and accordingly do not require expensive mechanical or optical alignment devices.
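Putting the steps above together, the five-parameter solve can be sketched with scipy.optimize.least_squares standing in for MATLAB®'s lsqnonlin. The rotation chain and axis conventions below are assumptions for illustration; the patent defines its own frame conventions:

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def predict_world_to_camera(params, roll_deg):
    """Model world-to-camera rotation for one image: world to v-block
    (two misalignment angles), measured roll about the v-block axis, then
    nominal-to-actual camera (three alignment angles). Axis order is an
    illustrative assumption."""
    yaw_vb, pitch_vb, yaw_c, pitch_c, roll_c = np.deg2rad(params)
    R_w_vb = Rotation.from_euler('zy', [yaw_vb, pitch_vb]).as_matrix()
    R_roll = Rotation.from_euler('x', np.deg2rad(roll_deg)).as_matrix()
    R_cn_c = Rotation.from_euler('zyx', [yaw_c, pitch_c, roll_c]).as_matrix()
    return R_cn_c @ R_roll @ R_w_vb

def residuals(params, rolls_deg, measured):
    """Stacked Euler-angle residuals against the per-image extrinsics
    estimates (measured world-to-camera rotation matrices)."""
    r = []
    for roll, R_meas in zip(rolls_deg, measured):
        R_res = predict_world_to_camera(params, roll) @ R_meas.T
        r.extend(Rotation.from_matrix(R_res).as_euler('zyx'))
    return np.asarray(r)

# In practice rolls_deg would come from the inclinometer and `measured`
# from the per-image extrinsics estimates; initial estimates of 0 degrees
# for all five unknowns, as in the experiment:
# sol = least_squares(residuals, np.zeros(5), args=(rolls_deg, measured))
```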
  • The embodiments herein may be used to measure the alignment of a camera with the body 10 of a guided projectile, such as for use with a vision-based navigation system. In some embodiments, the system 100 may be used in the implementation of vision based navigation systems for robots, unmanned aerial vehicles (UAVs), self-driving cars, as well as photogrammetry applications such as aerial mapping.
  • The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the appended claims.

Claims (5)

What is claimed is:
1: A method for measuring an alignment of an imaging device, the method comprising:
producing a pure rolling motion of a body;
capturing a series of images of an imaging device calibration target over a range of body roll angles of the rolling body;
measuring the roll angles of the rolling body with respect to a world coordinate frame as defined by the imaging device calibration target;
simultaneously estimating alignment angles of the imaging device and misalignment angles associated with an orientation of the body; and
estimating a rotational transform between an imaging device coordinate frame and a body coordinate frame based on the estimated alignment angles and misalignment angles,
wherein the producing of the pure rolling motion of the body occurs by a pair of support mechanisms that mechanically constrain the body to produce the pure rolling motion,
wherein the support mechanisms comprise v-block mechanisms.
2-7. (canceled)
8: The method of claim 1, wherein the body comprises a guided projectile.
9: A system for measuring an alignment of an imager, the system comprising:
a pair of support mechanisms to produce a pure rolling motion of a body;
an imager rigidly affixed to the body to capture a series of images of an imager calibration target over a range of body roll angles of the rolling body;
a sensor to measure the roll angles of the rolling body with respect to a world coordinate frame as defined by the imager calibration target; and
a processor to:
simultaneously estimate alignment angles of the imager and misalignment angles associated with an orientation of the body; and
estimate a rotational transform between an imager coordinate frame and a body coordinate frame based on the estimated alignment angles and misalignment angles,
wherein the alignment angles comprise three rotation angles representing a transform between the body coordinate frame and the imager coordinate frame, and wherein the misalignment angles comprise two rotation angles representing a transform between a roll axis of the body and the world coordinate frame,
wherein the body comprises a guided projectile containing the imager.
10-20. (canceled)
US15/873,036 2018-01-17 2018-01-17 Measuring camera to body alignment for an imager mounted within a structural body Active US10458793B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/873,036 US10458793B2 (en) 2018-01-17 2018-01-17 Measuring camera to body alignment for an imager mounted within a structural body


Publications (2)

Publication Number Publication Date
US20190219392A1 (en) 2019-07-18
US10458793B2 (en) 2019-10-29

Family

ID=67212793



Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6121999A (en) * 1997-06-09 2000-09-19 Schaack; David F. Eliminating routine alignment calibrations in perspective dimensional measurements
US20030103651A1 (en) * 2001-12-03 2003-06-05 Kurt Novak Photogrammetric apparatus
US6965397B1 (en) * 1999-11-22 2005-11-15 Sportvision, Inc. Measuring camera attitude
US20060152589A1 (en) * 2002-09-25 2006-07-13 Steven Morrison Imaging and measurement system
US8320709B2 (en) * 2006-06-23 2012-11-27 Canon Kabushiki Kaisha Information processing method and apparatus for calculating information regarding measurement target on the basis of captured images
US8619144B1 (en) * 2012-03-14 2013-12-31 Rawles Llc Automatic camera calibration
US20160005164A1 (en) * 2013-02-21 2016-01-07 Regents Of The University Of Minnesota Extrinsic parameter calibration of a vision-aided inertial navigation system
US20160225191A1 (en) * 2015-02-02 2016-08-04 Daqri, Llc Head mounted display calibration
US20170277197A1 (en) * 2016-03-22 2017-09-28 Sharp Laboratories Of America, Inc. Autonomous Navigation using Visual Odometry
US20170280135A1 (en) * 2016-03-22 2017-09-28 The Lightco Inc. Camera calibration apparatus and methods
US20170287166A1 (en) * 2016-03-29 2017-10-05 Institut National D'optique Camera calibration method using a calibration target
US20180238682A1 (en) * 2015-01-07 2018-08-23 Snap-On Incorporated Rolling virtual wheel spindle calibration
US20190015988A1 (en) * 2017-07-11 2019-01-17 Seiko Epson Corporation Robot control device, robot, robot system, and calibration method of camera for robot




Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

AS Assignment

Owner name: THE UNITED STATES OF AMERICA AS REPRESENTED BY THE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:EVERSON, DANIEL P;MALEY, JAMES M;REEL/FRAME:045139/0551

Effective date: 20180111

STPP Information on status: patent application and granting procedure in general

Free format text: AWAITING TC RESP., ISSUE FEE NOT PAID

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE