WO2003034750A1 - Image capture apparatus - Google Patents

Image capture apparatus

Info

Publication number
WO2003034750A1
Authority
WO
WIPO (PCT)
Prior art keywords
lens distortion
distortion parameter
images
epipolar
parameter value
Prior art date
Application number
PCT/GB2002/004517
Other languages
French (fr)
Inventor
Andrew Fitzgibbon
Original Assignee
Isis Innovation Limited
Priority date
Filing date
Publication date
Application filed by Isis Innovation Limited filed Critical Isis Innovation Limited
Publication of WO2003034750A1 publication Critical patent/WO2003034750A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/80Geometric correction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • G06T7/579Depth or shape recovery from multiple images from motion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • G06V10/12Details of acquisition arrangements; Constructional details thereof
    • G06V10/14Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/147Details of sensors, e.g. sensor lenses
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/98Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00Diagnosis, testing or measuring for television systems or their details

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Quality & Reliability (AREA)
  • Vascular Medicine (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Biomedical Technology (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)

Abstract

Image data processing apparatus, including means for computing and accounting for lens distortion given two or more images of a scene and no other information. In particular, the image data processing apparatus of the invention produces one or more estimated lens distortion parameters for insertion in an epipolar constraint equation, the apparatus being arranged to adjust point correspondences between two or more captured images such that they lie on substantially the same epipolar curve.

Description

IMAGE CAPTURE APPARATUS
This invention relates generally to image capture apparatus and, more particularly, to image capture apparatus including means for estimating lens distortion parameters from two or more images captured by a camera.
There are many circumstances, for example within the film industry, in which it may be required to reconstruct a three-dimensional scene which is observed in one or more captured images. For instance, it may be required to combine a three-dimensional video sequence of a scene captured within the real world with one or more three dimensional virtual buildings or other objects incorporated therein.
The lens of an image capturing device is generally curved and, as such, an image captured therewith will exhibit a certain amount of "curvature" caused by lens distortion, which curvature does not substantially adversely affect the quality of the captured image itself, but causes the image not to obey the laws of perspective projection, and therefore needs to be accounted for. In order to achieve this, it is known in the art to calibrate the camera capturing the images of the scene so as to apply one or more lens distortion terms to the captured images and thus substantially eliminate, or at least reduce, the above-mentioned curvature therefrom. The result of this (if viewed in terms of a single frame) is the production of an essentially "perspective" image of the same scene in which the corners of the original image have been effectively "pushed out" or "pushed in", as illustrated in Figure 1 of the drawings.
The amount of curvature created in an image because of lens distortion may typically be represented by a single value, and the addition of such a single distortion term within the image processing function is known to significantly improve the results of scene reconstruction, particularly over long video sequences.
It should be noted that the largest body of work on the correction of lens distortion deals with camera precalibration, i.e. where the camera is calibrated offline (before the image sequence is captured therewith). However, the present invention is concerned primarily with the situation where the original camera lens is not available, for example in the case of archive footage, or when using variable lens geometries. Thus, the present invention is concerned with the problem of nonlinear lens distortion in the context of camera self-calibration and structure from motion; in particular, with the recovery of three-dimensional camera motion from two-dimensional point tracks where there is moderate to severe lens distortion.
Known techniques concerned with online estimation of lens distortion can be divided into two strategies. The first, known as the plumb-line method, uses straight lines in a scene to provide constraints on the distortion parameters. However, straight lines are not always available in a scene, and when present are not necessarily trivial to detect. As a result, extreme care must often be taken to ensure that real-world curves are not confused with distorted lines.
It is known in the art that lens distortion can be computed given two or more images of the same scene, captured from two or more different respective angles, and no other information, and the second method, known as bundle adjustment, involves the computation of the fundamental matrix of a pair of cameras. An example of this method is given in Proceedings of the International Conference on Pattern Recognition (Proc. ICPR), 1996, "On the Epipolar Geometry Between Two Images With Lens Distortion", Z. Zhang, in which lens distortion is considered as an integral part of the camera and the rigidity constraints or assumptions required to compute the fundamental matrix are extended to include the parameters of the distortion model. The epipolar geometry between two images with lens distortion is described, and it is established that for a point in one image, its corresponding point in the other image should lie on the so-called epipolar curve. The paper then goes on to investigate the possibility of estimating the distortion parameters and the fundamental matrix based on this generalised epipolar constraint. However, experimental results with computer simulation have shown that the distortion parameters can only be estimated correctly using the disclosed techniques if the noise in image points is low and the lens distortion is severe. Otherwise it is considered to be better to treat the camera(s) as being free of distortion.
Further, the above-described technique relies on iterative methods to find the distortion parameters. As is usual with such iterative methods, their convergence is not guaranteed, initial estimates must be found, and - although fast within this class of nonlinear techniques - they remain too slow to place in the inner loop of any hypothesise-and-test architecture.
Thus, some known systems are only able to estimate lens distortion parameters satisfactorily in the extreme conditions whereby noise in image points is low and lens distortion is severe; otherwise lens distortion is ignored. However, in many cases, ignoring the lens distortion parameters produces unsatisfactory results and, if accurate camera information is required, there is considered to be no recourse but to bundle adjustment (also described in the chapter 'Bundle Adjustment: A Modern Synthesis', by B. Triggs, P. McLauchlan, R. Hartley and A. Fitzgibbon in Vision Algorithms: Theory and Practice, LNCS, Springer Verlag, 2000), initialised with reasonable estimates of camera geometry in the presence of lens distortion. Until now, such estimates have necessarily been partially estimated manually by an expert before being inserted into one or more iterative algorithms for a more accurate calculation of these parameters. As a result, currently-known bundle adjustment techniques cannot be used to compute lens distortion parameters simultaneously with unknown camera motion and unknown scene geometry; the distortion parameters must instead be set by guesswork, which is slow, requires expert manual input and frequently fails altogether. We have now devised an arrangement which overcomes the problems outlined above.
In accordance with the present invention, there is provided image data processing apparatus, comprising means for receiving two or more images of a scene captured by one or more image capturing devices, means for identifying point correspondences between said captured images, means for determining a lens distortion parameter value corresponding to lens distortion exhibited in said captured images and adjusting said point correspondences according to said determined lens distortion parameter value such that said point correspondences lie on substantially the same epipolar curve, said lens distortion parameter value being determined so as to satisfy an epipolar constraint equation defining said epipolar curve, the lens distortion parameter comprising an element of said epipolar constraint equation, the apparatus further comprising means for computing one or more suitable estimated lens distortion parameter values for insertion in said epipolar constraint equation.
In a preferred embodiment of the invention, the means for computing one or more suitable estimated lens distortion parameter values is arranged to compute said values using one or more design matrices, said one or more design matrices preferably being derived from two- dimensional point coordinates, beneficially obtained directly from the captured images.
Each point correspondence obtained directly from the captured images comprises a pair of image points (xᵢ, yᵢ) and (xᵢ', yᵢ'), where i ranges from 1 to the number of point correspondences, m. Define rᵢ² := xᵢ² + yᵢ² and rᵢ'² := xᵢ'² + yᵢ'².
In one preferred embodiment, from each such point correspondence, three design matrix rows may be defined as follows:
Rᵢ(1) = [ xᵢ'xᵢ  xᵢ'yᵢ  xᵢ'  yᵢ'xᵢ  yᵢ'yᵢ  yᵢ'  xᵢ  yᵢ  1 ]
Rᵢ(2) = [ 0  0  xᵢ'rᵢ²  0  0  yᵢ'rᵢ²  xᵢrᵢ'²  yᵢrᵢ'²  rᵢ² + rᵢ'² ]
Rᵢ(3) = [ 0  0  0  0  0  0  0  0  rᵢ'²rᵢ² ]
These rows are assembled into m × 9 design matrices D₁, D₂, D₃ (i.e. Dα, α ∈ {1, 2, 3}) as follows:
Dα is formed by stacking the rows R₁(α), R₂(α), …, Rₘ(α); that is, the i-th row of Dα is Rᵢ(α), for α = 1, 2, 3.
The three design matrices are inserted into a quadratic eigenvalue problem of the form:
(D₁ + λD₂ + λ²D₃) f = 0
where λ denotes the lens distortion parameter and f denotes a 9-vector containing the elements of the fundamental matrix. It is known how to solve such problems.
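Purely by way of illustration, the assembly of the three design matrices described above may be sketched as follows in Python with NumPy. The choice of language and library, and the function and variable names, are assumptions of this example rather than features of the invention; the correspondences are assumed to be expressed relative to the distortion centre.

```python
import numpy as np

def build_design_matrices(x, x_prime):
    """Assemble D1, D2, D3 (each m x 9) from distorted point correspondences.

    x, x_prime : (m, 2) arrays of image points (x_i, y_i) and (x_i', y_i'),
                 expressed relative to the assumed distortion centre.
    Returns (D1, D2, D3) such that (D1 + lam*D2 + lam**2*D3) @ f = 0 for the
    true distortion parameter lam and the 9-vector f of the fundamental matrix.
    """
    x = np.asarray(x, dtype=float)
    xp = np.asarray(x_prime, dtype=float)
    xi, yi = x[:, 0], x[:, 1]
    xpi, ypi = xp[:, 0], xp[:, 1]
    r2 = xi**2 + yi**2            # r_i^2
    rp2 = xpi**2 + ypi**2         # r_i'^2
    zeros, ones = np.zeros_like(xi), np.ones_like(xi)

    # R_i^(1): the ordinary epipolar-constraint row.
    D1 = np.column_stack([xpi*xi, xpi*yi, xpi, ypi*xi, ypi*yi, ypi, xi, yi, ones])
    # R_i^(2): terms linear in the distortion parameter.
    D2 = np.column_stack([zeros, zeros, xpi*r2, zeros, zeros, ypi*r2,
                          xi*rp2, yi*rp2, r2 + rp2])
    # R_i^(3): terms quadratic in the distortion parameter.
    D3 = np.column_stack([zeros]*8 + [rp2*r2])
    return D1, D2, D3
```

Any solver for quadratic eigenvalue problems may then be applied to the resulting matrices; one possible linearisation is sketched later in the detailed description.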
In a second embodiment, from each such point correspondence, three pairs of design matrix rows may be defined as follows:
Dᵢ(1) = [ 0  0  0  -xᵢ'  -yᵢ'  -1  yᵢxᵢ'  yᵢyᵢ'  yᵢ ]
        [ xᵢ'  yᵢ'  1  0  0  0  -xᵢxᵢ'  -xᵢyᵢ'  -xᵢ ]
Dᵢ(2) = [ 0  0  0  -rᵢ²xᵢ'  -rᵢ²yᵢ'  -(rᵢ'² + rᵢ²)  0  0  yᵢrᵢ'² ]
        [ rᵢ²xᵢ'  rᵢ²yᵢ'  rᵢ'² + rᵢ²  0  0  0  0  0  -xᵢrᵢ'² ]
Dᵢ(3) = [ 0  0  0  0  0  -rᵢ²rᵢ'²  0  0  0 ]
        [ 0  0  rᵢ²rᵢ'²  0  0  0  0  0  0 ]
These pairs of rows are assembled into three design matrices D₁, D₂, D₃, each of size 2m × 9, as above. The three design matrices are inserted into a quadratic eigenvalue problem of the form:
(D₁ + λD₂ + λ²D₃) h = 0
where λ denotes the lens distortion parameter and h denotes a 9-vector containing the elements of the planar homography. An embodiment of the present invention will now be described by way of example only and with reference to the accompanying drawings, in which:
Figure 1 is a schematic diagram illustrating the effect of applying lens distortion correction to a captured image. It will be shown that distortion can be computed given two or more images of the scene and no other information.
The following description of an exemplary embodiment of the present invention relates primarily to allowing the matching of image pairs via interest-point correspondences. As explained above, the most successful prior art techniques for matching interest points are based on the geometric constraints offered by multiple-view geometry. These tend to be effective because fast linear algorithms exist for computing these relationships, allowing their computation to form the kernel of RANSAC-based matching algorithms. However, when images exhibit strong lens distortion, these constraints cannot be applied because the two-view relationships (fundamental matrix, planar homography) are not accurate in the image periphery.
Thus, an object of the present invention is to develop a model for the between-view relations which incorporates lens distortion. In particular, a model is required which admits a direct solution, i.e. computation from point correspondences via well understood, fast, globally convergent numerical algorithms such as singular value decomposition (SVD) or eigenvalue extraction, and the present invention is concerned with the calculation of one or more realistic estimates of camera geometry and correspondences for use in bundle adjustment techniques generally.
In the following, 2D points in non-homogeneous coordinates will be denoted by x = (x, y), and x̃ will denote the corresponding vector in homogeneous coordinates, x̃ = (x, y, 1). The data used in the algorithm employed in this exemplary embodiment of the invention comprises point correspondences between lens-distorted images. As the following deals almost entirely with two-view geometry, primes will be used to indicate a corresponding point in the second view. Thus, as input we have a set of two-view point correspondences, denoted x ↔ x'.
The image points observed will be distorted functions of some perspective pinhole points, which shall be denoted by p, so the image point x is the distorted version of the perfect point p. The present invention is only concerned with radial distortion, so that the relationship between x and p is dependent on their distances from the image centre. Throughout this description, all these points are expressed in a 2D coordinate system with origin at the distortion centre. In the absence of any other information, one would fix the distortion centre at the centre of the image, which is considered to be a reasonable approximation.
Given that the distortion centre may be assumed known, it is known in the art how to write the distortion correction (e.g. C. Slama, "Manual of Photogrammetry, 4th Edition", American Society of Photogrammetry, Falls Church, VA, USA, 1980) in one of several ways.
The distortion model used in this description of an exemplary embodiment of the present invention is:
p = x / ( 1 + λ|x|² )    (1)
and:
p = x ( 1 + κ|x|² )    (2)
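As an informal illustration of how a model such as equation (1) is applied, the following sketch (in Python with NumPy; the function name and the assumption that the distortion centre is the image centre are choices of this example, not requirements of the invention) maps observed, distorted points to approximately perspective points:

```python
import numpy as np

def undistort_division_model(points, lam, centre):
    """Apply the division model p = x / (1 + lam*|x|^2) of equation (1).

    points : (N, 2) array of observed (distorted) pixel coordinates.
    lam    : scalar lens distortion parameter (lambda).
    centre : (2,) assumed distortion centre (here, the image centre).
    Returns an (N, 2) array of approximately perspective points in the
    same pixel frame.
    """
    c = np.asarray(centre, dtype=float)
    x = np.asarray(points, dtype=float) - c        # coordinates relative to centre
    r2 = np.sum(x * x, axis=1, keepdims=True)      # squared radius |x|^2
    return x / (1.0 + lam * r2) + c                # equation (1), shifted back
```

For example, for a 640 × 480 image one would take the centre to be (320, 240) in the absence of any other information.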
In order to compute the fundamental matrix from perfect point correspondences p ↔ p', an algorithm may be derived and used, which can be modified to include λ, the distortion parameter, such that the fundamental matrix F may be computed from distorted, measurable points x ↔ x', as follows. A point correspondence in pinhole coordinates p ↔ p' which corresponds to a real 3D point which has been imaged by a pair of cameras will satisfy the epipolar constraint. This is embodied in the fundamental matrix, F, for the pair of cameras:
p'ᵀ F p = 0    (3)
It is the task of the apparatus of this exemplary embodiment of the present invention to recover F from point correspondences. Writing p = (p, q, 1), and concatenating the rows of F into a nine-vector f, we may rewrite the above constraint as:
[ p'p  p'q  p'  q'p  q'q  q'  p  q  1 ] f = 0
Collecting eight such rows into a design matrix D, we obtain an estimate for f by solving Df = 0. This estimate will be greatly improved by truncating the resulting matrix to rank 2.
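This linear step may be sketched as follows (Python/NumPy, with at least eight correspondences and illustrative function names assumed; this is merely one conventional way of carrying out the step just described):

```python
import numpy as np

def fundamental_from_pinhole_points(p, p_prime):
    """Linear estimate of F from perfect (pinhole) correspondences p <-> p'.

    p, p_prime : (N, 2) arrays with N >= 8, rows (p_i, q_i) and (p_i', q_i').
    Returns the 3x3 fundamental matrix, truncated to rank 2.
    """
    p = np.asarray(p, dtype=float)
    pp = np.asarray(p_prime, dtype=float)
    ones = np.ones(len(p))
    # One row [p'p, p'q, p', q'p, q'q, q', p, q, 1] per correspondence.
    D = np.column_stack([
        pp[:, 0] * p[:, 0], pp[:, 0] * p[:, 1], pp[:, 0],
        pp[:, 1] * p[:, 0], pp[:, 1] * p[:, 1], pp[:, 1],
        p[:, 0],            p[:, 1],            ones,
    ])
    # f is the right singular vector of D with the smallest singular value.
    _, _, vt = np.linalg.svd(D)
    F = vt[-1].reshape(3, 3)
    # Enforce rank 2 by zeroing the smallest singular value of F.
    u, s, vtf = np.linalg.svd(F)
    s[2] = 0.0
    return u @ np.diag(s) @ vtf
```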
In order to compute F from the known image coordinates x, we must express equation (3) in terms of x. Writing the distortion equation (1) projectively, we obtain:
p ~ ( x, y, 1 + λ(x² + y²) ) = x̃ + λz
where x̃ = (x, y, 1) and z = (0, 0, x² + y²), or, writing equation (2) projectively:
p ~ ( x(1 + κ(x² + y²)), y(1 + κ(x² + y²)), 1 ) = x̃ + κz
where, in this case, z = ( x(x² + y²), y(x² + y²), 0 ).
In either case both x̃ and z are known (i.e. can be computed from image coordinates alone). Then the epipolar constraint is:
(x̃' + λz')ᵀ F (x̃ + λz) = 0
x̃'ᵀFx̃ + λ( z'ᵀFx̃ + x̃'ᵀFz ) + λ² z'ᵀFz = 0
or, for the model of equation (2):
(x̃' + κz')ᵀ F (x̃ + κz) = 0
x̃'ᵀFx̃ + κ( z'ᵀFx̃ + x̃'ᵀFz ) + κ² z'ᵀFz = 0
which is quadratic in λ (or κ) and linear in F. Indeed, expanding everything out, we obtain (with r = ||x|| and r' = ||x'||):
[ x'x  x'y  x'  y'x  y'y  y'  x  y  1 ] f
+ λ [ 0  0  x'r²  0  0  y'r²  xr'²  yr'²  r² + r'² ] f
+ λ² [ 0  0  0  0  0  0  0  0  r'²r² ] f = 0
Gathering the three row vectors into three design matrices, we obtain the following quadratic eigenvalue problem (QEP):
(D₁ + λD₂ + λ²D₃) f = 0    (4)
Such problems are analogous to standard second order ordinary differential equations (ODEs) (replace λ with derivative operators), and efficient numerical algorithms are already available in the art for their solution. Equation (4) has a maximum of 18 solutions which can be used as the distortion parameters in the bundle adjustment estimation described above, with one of said solutions providing the closest approximation of lens distortion. However, it has been found that there are at most only 10 solutions, and in practice probably no more than 6 solutions, that are real.
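By way of a hedged sketch only, one way of solving the QEP (4) numerically is to reduce the rectangular design matrices to square 9 × 9 matrices and then linearise the problem into an ordinary generalised eigenvalue problem. The reduction used below (premultiplication by D₁ᵀ, which preserves exact solutions of the rectangular QEP) and the use of SciPy are assumptions of this example rather than features prescribed by the invention:

```python
import numpy as np
from scipy.linalg import eig

def solve_distortion_qep(D1, D2, D3):
    """Solve (D1 + lam*D2 + lam**2*D3) f = 0 for candidate (lam, F) pairs.

    D1, D2, D3 : m x 9 design matrices built from distorted correspondences.
    Returns a list of (lam, F) candidates with real lam.
    """
    # Assumed reduction to square 9 x 9 matrices; exact solutions of the
    # rectangular QEP remain solutions of the reduced problem.
    A0, A1, A2 = D1.T @ D1, D1.T @ D2, D1.T @ D3

    # Companion linearisation: with v = [f, lam*f],
    # (A0 + lam*A1 + lam**2*A2) f = 0  becomes  A v = lam * B v.
    I9, Z9 = np.eye(9), np.zeros((9, 9))
    A = np.block([[Z9, I9], [-A0, -A1]])
    B = np.block([[I9, Z9], [Z9, A2]])

    eigvals, eigvecs = eig(A, B)
    candidates = []
    for lam, v in zip(eigvals, eigvecs.T):
        if not np.isfinite(lam) or abs(lam.imag) > 1e-8:
            continue                       # keep only finite, real solutions
        f = np.real(v[:9])
        F = (f / np.linalg.norm(f)).reshape(3, 3)
        candidates.append((lam.real, F))
    return candidates
```

In practice, of the real candidates returned, the one giving the smallest epipolar residual over all correspondences would be retained as the estimate of λ and F.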
The preceding analysis applies also to the estimation of a plane projective transformation between the images. In this case, each point correspondence adds two rows to the design matrices:
D₁ rows:  [ 0  0  0  -x'  -y'  -1  yx'  yy'  y ]
          [ x'  y'  1  0  0  0  -xx'  -xy'  -x ]
D₂ rows:  [ 0  0  0  -r²x'  -r²y'  -(r'² + r²)  0  0  yr'² ]
          [ r²x'  r²y'  r'² + r²  0  0  0  0  0  -xr'² ]
D₃ rows:  [ 0  0  0  0  0  -r²r'²  0  0  0 ]
          [ 0  0  r²r'²  0  0  0  0  0  0 ]
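For completeness, the corresponding assembly for the plane projective (homography) case may be sketched as follows (again Python/NumPy with illustrative names, and not a prescribed implementation); the resulting 2m × 9 matrices can be passed to the same quadratic eigenvalue solver sketched above, with the recovered 9-vector now interpreted as the homography h:

```python
import numpy as np

def build_homography_design_matrices(x, x_prime):
    """Assemble D1, D2, D3 (each 2m x 9) for the plane projective case.

    x, x_prime : (m, 2) arrays of distorted points x <-> x', expressed
    relative to the distortion centre.  (D1 + lam*D2 + lam**2*D3) @ h = 0
    holds for the true distortion parameter lam and homography vector h.
    """
    x = np.asarray(x, dtype=float)
    xp = np.asarray(x_prime, dtype=float)
    rows1, rows2, rows3 = [], [], []
    for (xi, yi), (xpi, ypi) in zip(x, xp):
        r2, rp2 = xi*xi + yi*yi, xpi*xpi + ypi*ypi
        rows1 += [[0, 0, 0, -xpi, -ypi, -1, yi*xpi, yi*ypi, yi],
                  [xpi, ypi, 1, 0, 0, 0, -xi*xpi, -xi*ypi, -xi]]
        rows2 += [[0, 0, 0, -r2*xpi, -r2*ypi, -(rp2 + r2), 0, 0, yi*rp2],
                  [r2*xpi, r2*ypi, rp2 + r2, 0, 0, 0, 0, 0, -xi*rp2]]
        rows3 += [[0, 0, 0, 0, 0, -r2*rp2, 0, 0, 0],
                  [0, 0, r2*rp2, 0, 0, 0, 0, 0, 0]]
    return np.array(rows1), np.array(rows2), np.array(rows3)
```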
The analogous computation for the trifocal tensor leads to a cubic eigenvalue problem, which is again readily solved.
Thus, in summary, the present invention extends the uncalibrated estimation of geometry from multiple images to include a correction for lens distortion. Its main contribution is considered to be a linear algorithm for the simultaneous estimation of this model and the fundamental matrix. All conventional algorithms for this purpose are iterative.
The present invention enables images which exhibit lens distortion to be matched with the same ease as those which accurately fit the pinhole model. Furthermore, it is possible to use the distortion-aware model to match even low distortion images without overfitting.
A specific embodiment of the present invention has been described herein by way of example only and it will be apparent to a person skilled in the art that modifications and variations may be made to the described embodiment without departing from the scope of the invention.

Claims

CLAIMS:
1. Image data processing apparatus, comprising means for receiving two or more images of a scene captured by one or more image capturing devices, means for identifying point correspondences between said captured images, means for determining a lens distortion parameter value corresponding to lens distortion exhibited in said captured images and adjusting said point correspondences according to said determined lens distortion parameter value such that said point correspondences lie on substantially the same epipolar curve, said lens distortion parameter value being determined so as to satisfy an epipolar constraint equation defining said epipolar curve, the lens distortion parameter comprising an element of said epipolar constraint equation, the apparatus further comprising means for computing one or more suitable estimated lens distortion parameter values for insertion in said epipolar constraint equation.
2. Apparatus according to claim 1, wherein the lens distortion parameter value is computed using two-dimensional point correspondences or coordinates obtained directly from said captured images.
3. Apparatus according to claim 2, wherein said lens distortion parameter value is computed using one or more design matrices.
4. Apparatus according to claim 3, wherein three or more design matrices are generated and inserted into a polynomial eigenvalue problem of the form:
(D₁ + λD₂ + λ²D₃) f = 0
where λ denotes the lens distortion parameter and f denotes a 9-vector containing the elements of the fundamental matrix.
5. Apparatus according to claim 3, wherein three or more design matrices are generated and inserted into a polynomial eigenvalue problem of the form:
(D₁ + λD₂ + λ²D₃) h = 0
where λ denotes the lens distortion parameter and h denotes a 9-vector containing the elements of the planar homography.
6. Image data processing apparatus substantially as herein described with reference to the accompanying drawings.
PCT/GB2002/004517 2001-10-13 2002-10-03 Image capture apparatus WO2003034750A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB0124608.1 2001-10-13
GB0124608A GB2380887A (en) 2001-10-13 2001-10-13 Lens distortion correction using correspondence points within images which are constrained to lie on the same epipolar curve

Publications (1)

Publication Number Publication Date
WO2003034750A1 true WO2003034750A1 (en) 2003-04-24

Family

ID=9923769

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2002/004517 WO2003034750A1 (en) 2001-10-13 2002-10-03 Image capture apparatus

Country Status (2)

Country Link
GB (1) GB2380887A (en)
WO (1) WO2003034750A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1320813C (en) * 2003-06-20 2007-06-06 北京中星微电子有限公司 A distortion correction method for lens imaging
WO2009000906A1 (en) 2007-06-26 2008-12-31 Dublin City University A method for high precision lens distortion calibration and removal
US9438897B2 (en) 2011-07-25 2016-09-06 Universidade De Coimbra Method and apparatus for automatic camera calibration using one or more images of a checkerboard pattern

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0680014A2 (en) * 1994-04-25 1995-11-02 Canon Kabushiki Kaisha Image processing method and apparatus
EP0895189A1 (en) * 1997-07-28 1999-02-03 Digital Equipment Corporation Method for recovering radial distortion parameters from a single camera image
EP0998139A2 (en) * 1998-10-30 2000-05-03 Hewlett-Packard Company Correcting distorsion in an imaging system
US6137491A (en) * 1998-06-05 2000-10-24 Microsoft Corporation Method and apparatus for reconstructing geometry using geometrically constrained structure from motion with points on planes
US6137902A (en) * 1997-07-22 2000-10-24 Atr Human Information Processing Research Laboratories Linear estimation method for three-dimensional position with affine camera correction
US6173087B1 (en) * 1996-11-13 2001-01-09 Sarnoff Corporation Multi-view image registration with application to mosaicing and lens distortion correction

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2801123B1 (en) * 1999-11-12 2002-04-05 Bertrand Aube METHOD FOR THE AUTOMATIC CREATION OF A DIGITAL MODEL FROM COUPLES OF STEREOSCOPIC IMAGES

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0680014A2 (en) * 1994-04-25 1995-11-02 Canon Kabushiki Kaisha Image processing method and apparatus
US6173087B1 (en) * 1996-11-13 2001-01-09 Sarnoff Corporation Multi-view image registration with application to mosaicing and lens distortion correction
US6137902A (en) * 1997-07-22 2000-10-24 Atr Human Information Processing Research Laboratories Linear estimation method for three-dimensional position with affine camera correction
EP0895189A1 (en) * 1997-07-28 1999-02-03 Digital Equipment Corporation Method for recovering radial distortion parameters from a single camera image
US6137491A (en) * 1998-06-05 2000-10-24 Microsoft Corporation Method and apparatus for reconstructing geometry using geometrically constrained structure from motion with points on planes
EP0998139A2 (en) * 1998-10-30 2000-05-03 Hewlett-Packard Company Correcting distorsion in an imaging system

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
MOHR R. ET AL: "RELATIVE 3D RECONSTRUCTION USING MULTIPLE UNCALIBRATED IMAGES", INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH , SAGE SCIENCE PRESS, THOUSAND OAKS, US, vol. 14, no. 6, 1 December 1995 (1995-12-01), pages 619 - 632, XP000558746 *
REMAGNINO P. ET AL: "Correlation techniques in adaptive template with uncalibrated cameras", PROCEEDINGS OF THE SPIE, 2 November 1994 (1994-11-02), SPIE, BELLINGHAM, VA, US, pages 252 - 263, XP000647033 *
STEIN G P: "Lens distortion calibration using point correspondences", PROCEEDINGS OF THE 1997 IEEE COMPUTER SOCIETY CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, SAN JUAN, PUERTO RICO, 17-19 JUNE 1997, pages 602 - 608, XP002168147 *

Also Published As

Publication number Publication date
GB0124608D0 (en) 2001-12-05
GB2380887A (en) 2003-04-16

Similar Documents

Publication Publication Date Title
Ha et al. High-quality depth from uncalibrated small motion clip
US6353678B1 (en) Method and apparatus for detecting independent motion in three-dimensional scenes
US8385630B2 (en) System and method of processing stereo images
CN111897349B (en) Autonomous obstacle avoidance method for underwater robot based on binocular vision
US7565029B2 (en) Method for determining camera position from two-dimensional images that form a panorama
Yang et al. Polarimetric dense monocular slam
US7113632B2 (en) Method of and apparatus for rectifying a stereoscopic image
JP5285619B2 (en) Camera system calibration
US6995762B1 (en) Measurement of dimensions of solid objects from two-dimensional image(s)
US9338437B2 (en) Apparatus and method for reconstructing high density three-dimensional image
WO2018171008A1 (en) Specular highlight area restoration method based on light field image
US9025862B2 (en) Range image pixel matching method
CN102289803A (en) Image Processing Apparatus, Image Processing Method, and Program
US20170213324A1 (en) Image deblurring method and apparatus
JP4887376B2 (en) A method for obtaining a dense parallax field in stereo vision
CN112884682A (en) Stereo image color correction method and system based on matching and fusion
KR100951309B1 (en) New Calibration Method of Multi-view Camera for a Optical Motion Capture System
EP3166074A1 (en) Method of camera calibration for a multi-camera system and apparatus performing the same
WO2014002521A1 (en) Image processing device and image processing method
Eichhardt et al. Affine correspondences between central cameras for rapid relative pose estimation
Svoboda et al. Matching in catadioptric images with appropriate windows, and outliers removal
US20170116739A1 (en) Apparatus and method for raw-cost calculation using adaptive window mask
WO2019058487A1 (en) Three-dimensional reconstructed image processing device, three-dimensional reconstructed image processing method, and computer-readable storage medium having three-dimensional reconstructed image processing program stored thereon
WO2003034750A1 (en) Image capture apparatus
JP2000353244A (en) Method for obtaining basic matrix, method for restoring euclidean three-dimensional information and device therefor

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BY BZ CA CH CN CO CR CU CZ DE DM DZ EC EE ES FI GB GD GE GH HR HU ID IL IN IS JP KE KG KP KR LC LK LR LS LT LU LV MA MD MG MN MW MX MZ NO NZ OM PH PL PT RU SD SE SG SI SK SL TJ TM TN TR TZ UA UG US UZ VN YU ZA ZM

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ UG ZM ZW AM AZ BY KG KZ RU TJ TM AT BE BG CH CY CZ DK EE ES FI FR GB GR IE IT LU MC PT SE SK TR BF BJ CF CG CI GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP