GB2368740A - Self-calibration of sensors - Google Patents

Self-calibration of sensors

Info

Publication number
GB2368740A
Authority
GB
United Kingdom
Prior art keywords
sensors
sensor
moving object
image
attitude
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB0108482A
Other versions
GB0108482D0 (en)
GB2368740B (en)
Inventor
Edmund Peter Sparks
Christopher John Gillham
Christopher George Harris
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Roke Manor Research Ltd
Original Assignee
Roke Manor Research Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed (Darts-ip)
Application filed by Roke Manor Research Ltd
Priority to US10/257,449 (US20030152248A1)
Priority to PCT/EP2001/004097 (WO2001077704A2)
Priority to AU2001268965A (AU2001268965A1)
Publication of GB0108482D0
Publication of GB2368740A
Application granted
Publication of GB2368740B
Anticipated expiration
Expired - Fee Related

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/16 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
    • G01S5/163 Determination of attitude
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S3/00 Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
    • G01S3/78 Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using electromagnetic waves other than radio waves
    • G01S3/7803 Means for monitoring or calibrating

Landscapes

  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Navigation (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

A method of calibrating the position and/or attitude of one or more image sensors comprises capturing the image of a moving object at one or more locations, determining the corresponding 2-d position of the object on said image sensors and, from the determined positions, calculating the position and/or attitude of the sensors. If the 3-d position of the moving object is known, e.g. from GPS or radar measurements, a single sensor can view the object at three locations to determine its position and orientation. If more sensors are used, fewer locations are needed. If the object's position is not known, it is viewed by at least two sensors whose position and orientation are found iteratively as the object moves around; at least five object locations are needed. The object may be an aeroplane or helicopter.

Description

METHOD OF SELF-CALIBRATION OF SENSORS

This invention relates to a method of self-calibration of imaging sensors. Imaging sensors (e.g. a camera) are used to passively monitor detectable objects, such as aeroplanes, for example by 'hot-spot' or motion detection. It is envisaged that a self-calibrating array of imaging sensors could be used for a warning system in an air defence role. Radar systems suffer from the disadvantage of being active (they transmit signals) and thus make themselves targets; consequently, to preserve such a system, it may be necessary to turn it off. Acoustic systems can provide no advance warning of objects travelling at supersonic speeds. Imaging sensors, being passive, do not give away their position in operation.
In known systems, image sensors and processing modules perform object detection, for instance using the motion of the object or the presence of the hot exhaust (for infra-red imaging sensors). This information can be transmitted (for example using a land line or directional radio communication) to a central point where the detections from a number of image sensors are correlated and the position and track of the object are calculated. However, a single sensor will not give a good indication of range, speed and direction of flight. The object must be observed by two or more sensors, allowing triangulation to be performed. The attitude of each sensor must be known to a sufficient accuracy. The position and attitude of a sensor are called its calibration. This calibration could be achieved by surveying the sensors, but under adverse deployment conditions (e.g. in enemy territory, or for hasty deployment) adequate surveying may not be practicable.
In combat scenarios such imaging sensors may be dropped remotely by parachute, placed by personnel on the ground, or deployed by other suitable means.
It is an object of the invention to provide a method for the image sensors to calibrate themselves.
The invention comprises a method of calibrating one or more image sensors in terms of position and/or attitude comprising: a) capturing the image of a moving object at one or more locations; b) determining the corresponding 2-d position on said image sensors; c) from the data obtained in steps a) and b), calculating the position and/or attitude of the sensors.
In this way the invention uses a moving object of opportunity, e.g. an aircraft, to calibrate the image sensors.
If possible, it is preferable that the 3-d position of the moving object at the locations is known. This may be achieved by the aircraft relaying its position to the image sensors, if it is not a hostile aircraft (most aircraft have GPS receivers which enable them to determine their own position). Alternatively the 3-d co-ordinates, or estimates thereof, may be determined indirectly by a radar system which communicates these data to the sensors.
Where a single sensor calibrates itself and no other data are available, in step a) there needs to be a minimum of three locations, and the aircraft's position needs to be known at these locations too.
The number of locations of capture can be reduced to one or two if ancillary sensor information is also known.
The ancillary sensor information may be sensor position or attitude, or an estimate of one or both of attitude and position. Alternatively the ancillary sensor information is obtained by capturing the 2-d position on said image sensor of a fixed known reference point.
The invention is also applicable to the case where the position of the moving object is not known. Normally, to calibrate a single image sensor, the moving object needs to be captured at at least 5 locations. Again, ancillary sensor information will help improve the accuracy of the calibration and reduce the number of said locations of capture.
It is also advantageous for there to be a plurality of sensors working together to calibrate themselves. Under these circumstances the moving object is captured on the image sensors at the same time, i.e. corresponding to the same location. One or more sensors of such a system may have their location and/or attitude already known or determined. If both the location and attitude of a sensor in such a system are known it obviously does not need calibrating, but it assists in calibrating the other sensors. Alternatively, only one of either attitude or position of one or more or all of the sensors is not known, or is only estimated.
Example 1 - known moving object location. Consider a plurality of imaging sensors that have at least partially overlapping fields of view. Each sensor is self-calibrated independently, so one need only consider a single sensor. To perform self-calibration, the sensor will require a number of views of a target whose 3-d position is known. The target may be a co-operating aircraft whose location is known, for example from an on-board GPS receiver, or any target whose location is determined using, for example, radar.
Consider n (at least 3) observations being taken of the target. To start, select 3 observations that are not on a 3-d straight line and, using these, apply a closed-form technique (known to those skilled in the art; for example, one technique requires solving a quartic equation) to determine the sensor calibration. This will not in general result in a very accurate calibration, but it can be improved by incorporating the remaining n-3 observations. For example, this can be performed using an extended Kalman filter initialised with the closed-form solution. The parameters of the Kalman filter will be the sensor attitude (for example roll, pitch and yaw) and sensor location (for example elevation, latitude and longitude). It is at this point that the sensor elevation may be constrained to lie on the ground surface as specified by the terrain map. The closed-form solution may be omitted if an adequate initial estimate of the calibration is available, and the observations incorporated directly into the Kalman filter.
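By way of illustration only (the patent does not prescribe an implementation), the following minimal Python sketch shows the refinement step described above: a single extended Kalman filter measurement update over a six-element state holding the sensor attitude (roll, pitch, yaw) and location. The pinhole camera model, the focal length F, the noise values and all names are assumptions made for this sketch.

    import numpy as np

    F = 1000.0  # assumed focal length, in pixels

    def rotation(roll, pitch, yaw):
        # sensor-to-world rotation from roll, pitch, yaw (radians)
        cr, sr = np.cos(roll), np.sin(roll)
        cp, sp = np.cos(pitch), np.sin(pitch)
        cy, sy = np.cos(yaw), np.sin(yaw)
        Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
        Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
        Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
        return Rz @ Ry @ Rx

    def project(state, target):
        # pinhole projection of a known 3-d target into the image;
        # state = [roll, pitch, yaw, x, y, z] of the sensor
        R = rotation(*state[:3])
        p = R.T @ (target - state[3:])   # target in the sensor frame
        return F * p[:2] / p[2]          # 2-d image coordinates

    def ekf_update(state, P, z, target, R_meas=np.eye(2) * 4.0):
        # one EKF measurement update; the Jacobian is taken numerically
        h0 = project(state, target)
        H = np.zeros((2, 6))
        eps = 1e-6
        for i in range(6):               # numerical Jacobian dh/dstate
            d = np.zeros(6)
            d[i] = eps
            H[:, i] = (project(state + d, target) - h0) / eps
        S = H @ P @ H.T + R_meas         # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
        state = state + K @ (z - h0)     # correct attitude and location
        P = (np.eye(6) - K @ H) @ P
        return state, P

Starting from the closed-form solution (or any adequate initial estimate) with a suitably large initial covariance P, each of the remaining n-3 observations would be incorporated by one call to ekf_update.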
Example 2.
In the following example there are two image sensors (or cameras), 1 and 2, whose exact position and orientation are unknown. The cameras are self-calibrated according to the accurately known (i.e. calculated) position of an object, for example a co-operating aircraft flying along a flight path which can determine its own location by some method, e.g. it may have a GPS receiver.
At known position 'A', having 3-d co-ordinates XA, YA, ZA, the aircraft is observed at image point (x1A, y1A) on 2-dimensional image sensor 1 and at image point (x2A, y2A) on image sensor 2. The aircraft is observed at two further locations (B and C) and the values of X, Y, Z, x and y are determined for each sensor at each location. Thus for each location and each sensor the variables X, Y, Z, x, y are known.
The variables which are unknown and which require to be determined are, for each of the two sensors, α and β (the effective x, y co-ordinates of the sensor, i.e. its 2-dimensional location on a map) and χ, θ and λ (the effective pitch, roll and yaw values of the sensor, i.e. its orientation). When there are two sensors and three measured points, α, β, χ, θ and λ for each sensor can be determined from the 3 sets of values X, Y, Z, x, y, where A, B and C refer to the positions of the object aircraft and 1 and 2 refer to the sensor number.
XA, YA, ZA; XB, YB, ZB; XC, YC, ZC
x1A, y1A; x1B, y1B; x1C, y1C
x2A, y2A; x2B, y2B; x2C, y2C

The above known variables (21 in total) are used to solve for the unknowns α, β, χ, θ and λ for each sensor (10 unknowns). Suitable mathematical techniques to solve this would be clear to the person skilled in the art, and include determining the exact closed-form solutions for the sensor calibration (of which there may be two). For each solution, for example, a Kalman filter for the sensor calibration can be initialised, all the additional observations added in sequentially, and the sensor calibration refined.
Preferably the observations are not bunched together or on a straight line. It is not necessary that the aircraft is friendly, as long as its position at a given time is known. Its position may, e.g., be determined by radar.
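As an illustration of how the 21 known values can determine the 10 unknowns, the following Python sketch poses the two-sensor, three-location problem as a nonlinear least-squares fit. Note that this substitutes a generic iterative solver for the closed-form solution plus Kalman refinement described above, and assumes a pinhole model with focal length F and sensor altitudes fixed at zero, matching the five unknowns per sensor.

    import numpy as np
    from scipy.optimize import least_squares
    from scipy.spatial.transform import Rotation

    F = 1000.0  # assumed focal length, in pixels

    def project(pose, target):
        # pose = [pitch, roll, yaw, x, y]; sensor altitude fixed at zero
        R = Rotation.from_euler('xyz', pose[:3]).as_matrix()
        p = R.T @ (target - np.array([pose[3], pose[4], 0.0]))
        return F * p[:2] / p[2]

    def residuals(params, targets, obs):
        # params: 10 unknowns (5 per sensor); targets: the 3 known
        # positions A, B, C; obs[s][k]: 2-d image point of target k
        # observed by sensor s
        r = []
        for s in range(2):
            pose = params[5 * s: 5 * s + 5]
            for k in range(3):
                r.extend(project(pose, targets[k]) - obs[s][k])
        return np.asarray(r)             # 12 residuals constrain 10 unknowns

    # given rough initial guesses x0 (shape (10,)):
    # fit = least_squares(residuals, x0, args=(targets, obs))

The 12 scalar equations (2 image coordinates x 2 sensors x 3 locations) slightly over-determine the 10 unknowns, which is why well-spread observations improve the conditioning of the solution.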
Calibration can still be achieved even if a known object is not available, provided that at least approximate sensor calibrations are available. For example, sensor location may be known approximately (or accurately) by use of on-board GPS receivers. Sensor attitude may be approximately known due to the method of deployment (e.g. a self-righting unit, so the sensor always points roughly vertically) or by using additional instrumentation, e.g. a compass (for azimuth) and tilt meters (for elevation). A moving object such as an aircraft, assumed to be the same object, may additionally be observed by a sensor whose position and orientation are known. This would yield information allowing the estimate of position and orientation of the imaging sensor whose calibration is unknown to be improved. Even where both imaging sensors have errors in an assumed attitude and/or position, it is still possible to improve their estimates. In general, any errors generated would then be compared to those generated by assuming various positions and attitudes; as a result of the comparison, the optimum estimate of actual location may be determined, where the errors are iterated to zero or a minimum. This is described in the following example.
Example 3 - unknown moving object locations. Even if the moving object's 3-d locations are not known, a calibration can still be performed provided that there is sufficient overlap in the sensor fields of view. Assume to start with that a moving object seen in 2 or more sensors is correctly identified, that is, there is no confusion between different targets. The determination of the sensor calibrations is then equivalent to bundle adjustment in photogrammetry. This requires the construction of a mathematical model of all the sensor calibrations and all target 3-d locations. By projecting the targets into each sensor, and iteratively minimising the differences from the observations, an optimal solution can be found. This technique is known to those skilled in the art.
It is most useful for the technique to have initial estimates for the sensor calibrations to start the iterative minimisation. Without the use of additional measurements, only relative sensor calibrations can be obtained; for example, shifting all the sensors by an identical amount in any direction will give an equally valid solution. This is an example of the so-called speed-scale ambiguity. This ambiguity can be resolved by use of the terrain map and the assumption that all the sensors are on the ground, provided that the sensor altitudes are sufficiently diverse.
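The following Python sketch illustrates the bundle-adjustment-style minimisation just described: sensor poses and the unknown target positions are optimised jointly by minimising reprojection error, with each sensor's altitude pinned to the terrain (taken as flat here). The pinhole model and all names are assumptions of this sketch; as noted above, further measurements or constraints may be needed to remove the residual ambiguity.

    import numpy as np
    from scipy.optimize import least_squares
    from scipy.spatial.transform import Rotation

    F = 1000.0  # assumed focal length, in pixels

    def project(pose, target):
        # pose = [roll, pitch, yaw, x, y]; altitude fixed by the terrain
        R = Rotation.from_euler('xyz', pose[:3]).as_matrix()
        p = R.T @ (target - np.array([pose[3], pose[4], 0.0]))
        return F * p[:2] / p[2]

    def residuals(v, n_sensors, n_targets, obs):
        # v packs all sensor poses (5 values each), then all target
        # positions (3 values each); obs[(s, k)] is the observed image
        # point of target k in sensor s
        poses = v[:5 * n_sensors].reshape(n_sensors, 5)
        points = v[5 * n_sensors:].reshape(n_targets, 3)
        return np.concatenate([project(poses[s], points[k]) - z
                               for (s, k), z in sorted(obs.items())])

    # v0 = approximate sensor calibrations plus rough target estimates:
    # fit = least_squares(residuals, v0, args=(n_sensors, n_targets, obs))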
There remains the problem of resolving confusion between moving objects. It shall be assumed that each sensor has accurate knowledge of time, by use of an on-board clock or a GPS clock. Only targets seen simultaneously in 2 or more sensors will normally provide useful calibration data.
One simple method is to use occasions when at most a single moving object is observed in each sensor. If this is due to the presence of a single moving object in the monitored space, then the target will indeed be correctly identified. The occurrence of one or more of such single-moving-object events may enable calibration to be performed, depending on the sinuosity of the target flight-path. It may be that more than one moving object is present in some of these events, so that incorrect identification occurs, leading to an inconsistent calibration. This problem could be overcome by employing a RANSAC algorithm to work with subsets of these events.
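A generic RANSAC loop of the kind suggested above might look as follows in Python; fit_calibration and reprojection_error are hypothetical stand-ins for a calibration solver (such as the sketches above), the sample size of 5 echoes the minimum number of locations mentioned earlier, and the trial count and pixel tolerance are arbitrary assumptions.

    import random

    def ransac_calibrate(events, fit_calibration, reprojection_error,
                         sample_size=5, n_trials=200, tol=3.0):
        # fit from random subsets of single-object events and keep the
        # hypothesis consistent with the most events
        best_inliers = []
        for _ in range(n_trials):
            subset = random.sample(events, sample_size)
            calib = fit_calibration(subset)   # may be fooled by bad events
            if calib is None:
                continue
            inliers = [e for e in events
                       if reprojection_error(calib, e) < tol]  # pixels
            if len(inliers) > len(best_inliers):
                best_inliers = inliers
        # refit on all consistent events for the final calibration
        return fit_calibration(best_inliers) if best_inliers else None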
The resolution of confusion between moving objects is aided by forming target tracks in each sensor. Provided these tracks do not cross, all observations along a track should originate from the same target (though at different times). Even when tracks cross, it may be possible to correctly identify them.
The shapes of these tracks in the image may provide disambiguating information. For example, an aircraft flying at constant velocity will form a straight track, which should not be matched to a distinctly curved track seen in another sensor.
It may be that the target is not observed as a simple point event, but has useful identifying attributes. For example, in an infra-red sensor, the intensity of a jet aircraft may change suddenly as afterburners are turned on. Identification of this same track attribute in different sensors would be evidence of track matching.
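As a simple illustration of matching such a time-stamped attribute across sensors, the following sketch flags two tracks as a candidate match when both show a sudden intensity jump at nearly the same time; the jump factor and time tolerance are assumptions, and each track is taken to be a list of (time, intensity) pairs.

    import numpy as np

    def jump_times(track, factor=2.0):
        # times at which intensity jumps by more than factor between
        # consecutive observations (e.g. afterburner ignition)
        t = np.array([time for time, _ in track])
        i = np.array([intensity for _, intensity in track])
        return t[1:][i[1:] > factor * i[:-1]]

    def attribute_match(track_a, track_b, max_dt=0.5):
        # True if both tracks show a jump at (nearly) the same time
        return any(abs(x - y) < max_dt
                   for x in jump_times(track_a) for y in jump_times(track_b))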
Prior estimates of the sensor calibration may be used to disambiguate moving objects. A prior calibration estimate for a sensor may act to localise a moving object in a volume of space, so that if these volumes do not overlap between sensors, then the moving object cannot be in common. For tracks, an overlap region must exist at all times for correct matching.
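A minimal sketch of that volume test, simplifying each localisation volume to an axis-aligned bounding box (a deliberate simplification; real localisation volumes would be cone- or wedge-shaped):

    def volumes_overlap(box_a, box_b):
        # box = (min_xyz, max_xyz), each a 3-sequence; True if the two
        # localisation volumes intersect, i.e. the object could be common
        return all(box_a[0][i] <= box_b[1][i] and box_b[0][i] <= box_a[1][i]
                   for i in range(3))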
In some instances additional information may be utilised to improve the accuracy of the estimation. This may include observation by the image sensor of fixed reference points such as mountain peaks, stars, etc.
Self-calibration in general can be performed using a number of objects of opportunity seen by the sensors. To be of use, each object should preferably be seen by at least 2 sensors, and be correctly identified in each sensor as the same object. A filter (e.g. a Kalman filter) can be constructed for both the sensor calibrations and a general object position. The filters are initialised to the approximate sensor calibrations. Each set of object observations is first used to estimate the object position, and then used to refine the (linearised) filter.

Claims (12)

  1. A method of calibrating one or more image sensors in terms of position and/or attitude comprising: a) capturing the image of a moving object at one or more locations; b) determining the corresponding 2-d position on said image sensors; c) from the data obtained in steps a) and b), calculating the position and/or attitude of the one or more sensors.
  2. A method as claimed in claim 1, wherein in step a) the 3-d position of the moving object at at least one location is known.
  3. A method as claimed in claim 1 or 2 wherein the method is used to calibrate one image sensor and in step a) the number of locations of capture is at least three.
  4. A method as claimed in claim 1 or 2 wherein in step a) the number of locations of capture is one or two and in step c) ancillary sensor information is also known and used in said calculation.
  5. A method as claimed in claim 1 or 2 wherein at least 2 image sensors are used in the calibration and in step c) ancillary sensor information is also known and used in said calculation.
  6. A method as claimed in any of claims 2 to 5 wherein said moving object transmits positional data directly to said image sensor.
  7. A method as claimed in any of claims 2 to 5 wherein said positional data of the moving object is determined indirectly by a unit which transmits data to said imaging sensor.
  8. A method as claimed in claim 1 wherein the position of the moving object is not known.
  9. A method as claimed in claim 8 wherein the method is used to calibrate a single image sensor and the moving object is captured at at least 5 locations.
  10. A method as claimed in claims 4 to 7 or 9 wherein in step c) ancillary sensor information is also known and used in said calculations.
  11. A method as claimed in claims 4 to 7, 9 or 10 wherein said ancillary sensor information is position or attitude, or an estimate of one or both of attitude and position, of the single sensor or of at least one of the plurality of sensors.
  12. A method as claimed in claims 4 to 7 or 9 to 11 wherein said ancillary sensor information is obtained by capturing the 2-d position on said image sensor of a fixed known reference point.
  13. A method as claimed in any preceding claim wherein said moving object is a helicopter or aircraft.
  14. An image sensor adapted for self-calibration according to the method of any preceding claim.
GB0108482A 2000-04-11 2001-03-30 Method of self-calibration of sensors Expired - Fee Related GB2368740B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US10/257,449 US20030152248A1 (en) 2000-04-11 2001-04-09 Self calibration of an array of imaging sensors
PCT/EP2001/004097 WO2001077704A2 (en) 2000-04-11 2001-04-09 Self-calibration of an array of imaging sensors
AU2001268965A AU2001268965A1 (en) 2000-04-11 2001-04-09 Self-calibration of an array of imaging sensors

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0008739A GB0008739D0 (en) 2000-04-11 2000-04-11 Self-Calibration of an Array of Imaging Sensors

Publications (3)

Publication Number Publication Date
GB0108482D0 GB0108482D0 (en) 2001-08-15
GB2368740A true GB2368740A (en) 2002-05-08
GB2368740B GB2368740B (en) 2005-01-12

Family

ID=9889552

Family Applications (2)

Application Number Title Priority Date Filing Date
GB0008739A Pending GB0008739D0 (en) 2000-04-11 2000-04-11 Self-Calibration of an Array of Imaging Sensors
GB0108482A Expired - Fee Related GB2368740B (en) 2000-04-11 2001-03-30 Method of self-calibration of sensors

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GB0008739A Pending GB0008739D0 (en) 2000-04-11 2000-04-11 Self-Calibration of an Array of Imaging Sensors

Country Status (1)

Country Link
GB (2) GB0008739D0 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09285906A (en) * 1996-04-19 1997-11-04 The Japan Steel Works Ltd Camera type reference hole drilling machine correcting method and camera type reference hole drilling machine
US5692070A (en) * 1994-03-15 1997-11-25 Fujitsu Limited Calibration of semiconductor pattern inspection device and a fabrication process of a semiconductor device using such an inspection device
WO1998005922A1 (en) * 1996-08-07 1998-02-12 Komatsu Ltd. Calibration method
US5960125A (en) * 1996-11-21 1999-09-28 Cognex Corporation Nonfeedback-based machine vision method for determining a calibration relationship between a camera and a moveable object
JP2000218448A (en) * 1999-01-29 2000-08-08 Hitachi Ltd High accuracy positioning device
US6101455A (en) * 1998-05-14 2000-08-08 Davis; Michael S. Automatic calibration of cameras and structured light sources

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5692070A (en) * 1994-03-15 1997-11-25 Fujitsu Limited Calibration of semiconductor pattern inspection device and a fabrication process of a semiconductor device using such an inspection device
US5840595A (en) * 1994-03-15 1998-11-24 Fujitsu Limited Calibration of semiconductor pattern inspection device and a fabrication process of a semiconductor device using such an inspection device
JPH09285906A (en) * 1996-04-19 1997-11-04 The Japan Steel Works Ltd Camera type reference hole drilling machine correcting method and camera type reference hole drilling machine
WO1998005922A1 (en) * 1996-08-07 1998-02-12 Komatsu Ltd. Calibration method
US5960125A (en) * 1996-11-21 1999-09-28 Cognex Corporation Nonfeedback-based machine vision method for determining a calibration relationship between a camera and a moveable object
US6101455A (en) * 1998-05-14 2000-08-08 Davis; Michael S. Automatic calibration of cameras and structured light sources
JP2000218448A (en) * 1999-01-29 2000-08-08 Hitachi Ltd High accuracy positioning device

Also Published As

Publication number Publication date
GB0108482D0 (en) 2001-08-15
GB0008739D0 (en) 2001-11-21
GB2368740B (en) 2005-01-12

Similar Documents

Publication Publication Date Title
US5969676A (en) Radio frequency interferometer and laser rangefinder/destination base targeting system
US6489922B1 (en) Passive/ranging/tracking processing method for collision avoidance guidance and control
US8296056B2 (en) Enhanced vision system for precision navigation in low visibility or global positioning system (GPS) denied conditions
EP3617749B1 (en) Method and arrangement for sourcing of location information, generating and updating maps representing the location
EP2877904B1 (en) Method for the acquisition and processing of geographical information of a path
IL238877A (en) Kalman filtering with indirect noise measurements
US20030152248A1 (en) Self calibration of an array of imaging sensors
CN114502465B (en) Determination of attitude by pulsed beacons and low cost inertial measurement units
EP3911968B1 (en) Locating system
KR20110080775A (en) Apparatus and method for height measurement
KR101764222B1 (en) System and method for high precise positioning
CA2908754C (en) Navigation system with rapid gnss and inertial initialization
CN105467366A (en) Mobile platform cooperative locating device and mobile platform cooperative locating system
KR100963680B1 (en) Apparatus and method for measuring remote target's axis using gps
JP2000193741A (en) Target tracking device
US8933836B1 (en) High speed angle-to-target estimation for a multiple antenna system and method
WO2007063537A1 (en) A method and system for locating an unknown emitter
US6664917B2 (en) Synthetic aperture, interferometric, down-looking, imaging, radar system
US12000946B2 (en) System and method for maintaining cooperative precision navigation and timing (PNT) across networked platforms in contested environments
US9134403B1 (en) System and method for relative localization
GB2368740A (en) Self-calibration of sensors
RU2483324C1 (en) Method for aircraft navigation on radar images of earth's surface
Evans et al. Fusion of reference-aided GPS, imagery, and inertial information for airborne geolocation
US20230243623A1 (en) System and method for navigation and targeting in gps-challenged environments using factor graph optimization
US20240128993A1 (en) Coordinate Frame Projection Using Multiple Unique Signals Transmitted from a Localized Array of Spatially Distributed Antennas

Legal Events

Date Code Title Description
AT Applications terminated before publication under section 16(1)
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20060330

710B Request of alter time limits

Free format text: NOTICE OF APPEAL AGAINST THE DECISION OF THE COMPTROLLER DATED 20080201 LODGED WITH THE PATENTS COURT ON 20080213 (ACTION NO. CH2008APP0147)

APTC Appeals to the court

Free format text: APPEAL REFUSED; APPEAL DATED 28 JANUARY 2009 (CH/2008/APP/0147), THE APPLICATION FOR PERMISSION TO APPEAL WAS REFUSED.