WO2004088245A1 - Method of solving the correspondence problem in convergent stereophotogrammetry - Google Patents

Method of solving the correspondence problem in convergent stereophotogrammetry Download PDF

Info

Publication number
WO2004088245A1
Authority
WO
WIPO (PCT)
Prior art keywords
epipolar
central
scan line
convergence
video image
Prior art date
Application number
PCT/US2004/009372
Other languages
French (fr)
Inventor
Pieter O. Zanen
Original Assignee
Zanen Pieter O
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zanen Pieter O
Publication of WO2004088245A1 publication Critical patent/WO2004088245A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04 Interpretation of pictures
    • G01C11/06 Interpretation of pictures by comparison of two or more pictures of the same area
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85 Stereo camera calibration
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/207 Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • H04N13/221 Image signal generators using stereoscopic image cameras using a single 2D image sensor using the relative movement between cameras and objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/10012 Stereo images
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N2013/0074 Stereoscopic image analysis
    • H04N2013/0081 Depth or disparity estimation from stereoscopic image signals

Abstract

The present invention provides a method for solving the problem of correspondence associated with stereo cameras. The method includes the steps of aligning a central horizontal epipolar line parallel to the scan line of the CCD; choosing at least one scan line of the CCD that adheres to epipolar geometry; conducting a disparity estimation on the chosen scan line; self-calibrating the camera; and rectifying the images. In an alternative embodiment of the present invention, a laser pointer aimed within the central epipolar plane and perpendicular to the base line helps aim the device at the object enclosed by the measurement volume, and allows the camera to adjust itself.

Description

METHOD OF SOLVING THE CORRESPONDENCE PROBLEM IN CONVERGENT STEREOPHOTOGRAMMETRY
REFERENCE TO RELATED APPLICATIONS
This application claims an invention which was disclosed in Provisional Application Number 60/458,074, filed March 27, 2003, entitled "METHOD OF PHOTOGRAMMETRY USING CENTRAL SCAN LINE FOR SOLVING CORRESPONDENCE PROBLEM" and Provisional Application Number 60/529,964, filed December 16, 2003, entitled "EXTRACTING SCALE AND CONVERGENCE OF
A SINGLE CAMERA 3-D ZOOM ADAPTER WITH STRUCTURED LIGHT". The benefit under 35 USC §119(e) of the United States provisional applications is hereby claimed, and the aforementioned applications are hereby incorporated herein by reference.
BACKGROUND OF THE INVENTION
FIELD OF THE INVENTION
The invention pertains to the field of stereographic photography. More particularly, the invention pertains to a stereographic method for extracting information from an image and finding matching points in multiple images.
DESCRIPTION OF RELATED ART
Stereographic photography is the method of producing images, which are apparently three-dimensional, by recording separate left- and right-eye images. The viewer reconstructs the 3-D image by viewing the two separate 2-D images simultaneously. Stereographic photography has been known since at least the mid-19th century, when stereo viewers were a popular parlor accessory. Such stereo views have historically been created with two lenses on a single camera, spaced apart by approximately the inter-ocular distance of a human head. Photogrammetry is the process of making measurements through the use of photographs. Stereo photogrammetry uses multiple images (usually left/right images from stereo pairs) to measure objects and positions in three dimensions. To do so, it is necessary to locate the same ("homologous") point in each of the multiple images. The problem of finding homologous image points in multiple images is known as the "problem of correspondence". As the camera or lens separation increases, so do the differences in the scene as recorded by each camera or lens, thus making it difficult to match corresponding points in the two images.
All of the multiple-lens or multiple-camera systems in the prior art have severe drawbacks, in the added complexity and cost of duplicating the complete camera system and the synchronization of the two separate images (this is especially a problem in film (non-video) applications). In addition, the use of two separate lenses (whether on one camera or two) introduces problems of synchronizing focus, view, and the images themselves.
In order to minimize these drawbacks, the inventor of the present invention has developed methods and apparatus for using converging mirrors to enable multiple images to be obtained by a single lens, and has received the following U.S. patents: "SINGLE LENS APPARATUS FOR THREE-DIMENSIONAL IMAGING HAVING FOCUS-RELATED CONVERGENCE COMPENSATION", US Patent No. 5,532,777; "METHOD AND APPARATUS FOR THREE DIMENSIONAL MEASUREMENT AND
IMAGING HAVING FOCUS-RELATED CONVERGENCE COMPENSATION", US Patent No. 5,828,913; and "APPARATUS FOR THREE-DIMENSIONAL MEASUREMENT AND IMAGING HAVING FOCUS-RELATED CONVERGENCE COMPENSATION", US Patent No. 5,883,662. These patents are herein incorporated by reference.
SUMMARY OF THE INVENTION
One embodiment of the present invention is a method for solving the problem of correspondence associated with convergent 3D cameras. The method includes the steps of aligning a central horizontal epipolar line parallel to the scan line of a charge-coupled device (CCD), choosing at least one scan line of the CCD that adheres to epipolar geometry, conducting a disparity estimation on the chosen scan line, self-calibrating the camera, and rectifying the images. In an alternative embodiment of the present invention, a laser pointer is aimed within the central epipolar plane and perpendicular to the base line to help aim the device at the object enclosed by the measurement volume, and allow the camera to adjust itself.
Another embodiment of the present invention is a method of extracting the settings of a 3D adapter with variable convergence from an image, using structured light. In an alternative embodiment, the magnification settings can be obtained from the image.
This method includes the steps of aligning a central horizontal epipolar line parallel to the scan line of the CCD; choosing at least one scan line of the CCD that adheres to epipolar geometry; illuminating the scene with a laser beam positioned within the central epipolar plane, halfway along and perpendicular to the base line; obtaining an independent range estimate with an optical linear position sensor aimed at an angle of 90 degrees or less to the laser beam thrown on the scene by the above-described laser pointer; conducting a disparity estimation of the laser dot on the chosen scan line; and reconstructing the convergence from the range and the disparity.
In an alternative embodiment, the independent means for range estimation is a laser range finder using the 'time of flight' principle with its laser beam positioned within the central epipolar plane and positioned halfway and perpendicular to the base line.
In another embodiment, a second laser range finder, positioned parallel to the first beam and halfway and perpendicular to the base line, throws a laser dot on the scene so that the scale can be extracted from the image of the scene.
BRIEF DESCRIPTION OF THE DRAWINGS
Figs. 1A and 1B show how correspondences are constrained to conjugate epipolar lines in a parallel camera setup and a convergent camera setup, respectively.
Fig. 2 shows the general concept of epipolar geometry.
Fig. 3 shows convergence extraction with a linear PSD in a method of the present invention.
Fig. 4 shows convergence extraction with a laser range finder in a method of the present invention.
Fig. 5 shows a structured light set up to extract convergence and scale in a method of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
The present invention relies on the fundamentals of epipolar geometry to solve the stereo correspondence problem of finding homologous image points in multiple images. Epipolar geometry is a property specific to stereo cameras (see Fig. 1A).
Epipolar geometry is a geometric relationship between two perspective cameras.
The epipole is the point of intersection of the line joining the optical centres with the image plane. This is the image in one camera of the optical centre of the other camera. The epipolar plane is a plane defined by a 3D point and the optical centres. It is the plane defined by an image point and the optical centres. The epipolar line is a straight line of intersection between the epipolar plane and the image plane. It is the image in one camera of a ray through the optical centre and image point in the other camera. Epipolar lines all intersect at the epipole.
Regarding correspondences between images, a point in one image creates a line in the other image on which its corresponding point lies. Therefore, the search for correspondences is reduced from a region to a line. This creates an epipolar constraint, which arises because, for image points corresponding to the same 3D point, there is coplanarity between the image points, 3D point and the optical centres.
Figure 2 shows a stereo (pinhole) camera. The optical centers (the centers of the lenses of actual cameras, OLFL and OLFR) are the origins of the left and right lens reference frames. The line through these two optical centers is called the baseline (21). Any plane (20) in 3-D space that contains the baseline is called an epipolar plane. All scene points in such a plane are projected on a line in each of the images (24), (25). These lines are the epipolar lines. A pair of epipolar lines (26), (27) that share the same epipolar plane are called conjugate epipolar lines. If two points from the image pair correspond, they must lie on conjugate epipolar lines. This is called the epipolar constraint. It reduces the set of possible correspondence candidates for a point in the left image from all points in the right image to only those on the conjugate epipolar line in the right image. For pinhole cameras, the epipolar lines are straight. Due to lens distortion, the epipolar lines may become curved.
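The epipolar constraint described above is commonly written with a fundamental matrix, a formulation not used in this patent but standard for pinhole stereo pairs. The short sketch below is illustrative only: it assumes a known 3x3 fundamental matrix F (the example matrix and points are hypothetical) and shows how a left-image point maps to the conjugate epipolar line in the right image, so that the candidate search collapses to that line.

```python
# Illustrative sketch only: the epipolar constraint via an assumed fundamental matrix F.
import numpy as np

def epipolar_line(F: np.ndarray, x_left: np.ndarray) -> np.ndarray:
    """Conjugate epipolar line (a, b, c) in the right image for a homogeneous left point."""
    line = F @ x_left
    return line / np.hypot(line[0], line[1])  # normalize so (a, b) is a unit normal

def point_line_distance(line: np.ndarray, x: np.ndarray) -> float:
    """Pixel distance from a homogeneous image point to a normalized line."""
    return float(abs(line @ x))

# Hypothetical fundamental matrix and image points.
F = np.array([[0.0, -1e-4, 0.02],
              [1e-4, 0.0, -0.03],
              [-0.02, 0.03, 1.0]])
x_left = np.array([320.0, 240.0, 1.0])      # a point in the left image
l_right = epipolar_line(F, x_left)
candidate = np.array([470.0, 41.0, 1.0])    # a candidate match in the right image
# A small residual means the candidate lies on (or near) the conjugate epipolar
# line, so only such points need to be considered as correspondences.
print("residual (px):", point_line_distance(l_right, candidate))
```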
Epipolar geometry is also disrupted as soon as the two cameras deviate from a parallel stereo configuration, i.e. when the optical axes of the two cameras are no longer parallel, as shown in Fig. 1B. If the convergence of both cameras is changed about one, and only one, common axis of rotation, there will always be one plane for which the epipolar constraint remains valid. This epipolar plane (the "central epipolar plane") (30) passes through the baseline and is normal to the common axis of rotation about which the cameras converge.
This concept is illustrated in Figure 1B. The images shown are made with a setup using convergent left and right views of the kind used in the patents cited above. Charge-coupled devices (CCDs), image sensors that separate the color spectrum into red, green, and blue, or a video camera using another technology, are preferably utilized in the present invention. In one example, CCDs are used in conjunction with each of the lenses (24), (25). In this example, the CCD is an area CCD, which is square or rectangular in shape and can capture an entire image at once. Alternatively, another type of single video camera is used in conjunction with the single camera 3D adapter to obtain a left and right view of the scene (24), (25). Herein, the terms CCD and video camera are used interchangeably to mean an image forming device.
The present invention aligns a central horizontal epipolar line (10) parallel to the central scan line of the video camera, so that at least one of the scan lines from the video camera adheres to epipolar geometry. The scan lines preferably adhere to epipolar geometry for all convergence settings. The epipolar line (10) is preferably perpendicular to the single axis of rotation for convergence adjustment. Therefore, if at least one of the scan lines from the video camera has scene points from each of the images on one line, a pair of such scan lines would share the same epipolar plane, and if the two points from the image pair were to correspond, they would lie on conjugate epipolar lines. A method that searches for disparity in the central epipolar plane starts its search at the CCD scan line closest to the central epipolar plane, but must expand the search symmetrically, within adjustable limits, to adjacent CCD scan lines until it solves the correspondence for homologous points in the left and right views of the scene.
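A minimal sketch of the scan-line-first disparity search just described follows; it is not the patent's implementation. It assumes grayscale left and right views as arrays, uses a simple sum-of-absolute-differences block match (the window size, disparity range, and cost threshold are hypothetical parameters), starts on the row nearest the central epipolar plane, and widens the search symmetrically to adjacent rows within an adjustable limit.

```python
import numpy as np

def match_on_row(left, right, row, col, half_win=5, max_disp=64):
    """Best horizontal disparity for left[row, col] by sum of absolute differences."""
    h, w = left.shape
    lo, hi = col - half_win, col + half_win + 1
    if row < half_win or row >= h - half_win or lo < 0 or hi > w:
        return None, np.inf
    patch = left[row - half_win:row + half_win + 1, lo:hi].astype(np.float32)
    best_d, best_cost = None, np.inf
    for d in range(0, max_disp + 1):
        if lo - d < 0:
            break
        cand = right[row - half_win:row + half_win + 1, lo - d:hi - d].astype(np.float32)
        cost = float(np.abs(patch - cand).sum())
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d, best_cost

def central_line_disparity(left, right, central_row, col,
                           max_row_offset=3, cost_threshold=500.0):
    """Search the central scan line first, then rows +/-1, +/-2, ... up to the limit."""
    for offset in range(0, max_row_offset + 1):
        for row in sorted({central_row - offset, central_row + offset}):
            d, cost = match_on_row(left, right, row, col)
            if d is not None and cost < cost_threshold:
                return row, d
    return None  # correspondence not resolved within the adjustable limits

# Hypothetical usage with a synthetic 8-pixel disparity.
rng = np.random.default_rng(0)
left = rng.integers(0, 255, (120, 160), dtype=np.uint8)
right = np.roll(left, -8, axis=1)
print(central_line_disparity(left, right, central_row=60, col=80))  # -> (60, 8)
```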
With convergent three-dimensional imagers one can resurrect epipolar geometry for points in 3-D space that do not lie in the above-defined central epipolar plane by rectifying the left and right views. Rectification is only possible when both views are calibrated. U.S. Patent No. 5,532,777 describes an adapter that generates a left and right view with variable convergence from a single lens, in which the convergence is coupled mechanically to the zoom or focus setting of the lens. U.S. Patent No. 5,828,913 describes a method to dynamically calibrate such a system by selecting one calibration map, matched to a particular convergence setting, from multiple calibration maps for multiple convergence settings. Mechanical solutions are very expensive to make when they have to adhere to the tight tolerances required by the solution proposed in the preceding patents.
In the co-pending patent application "ACQUISITION OF 3-D SCENES WITH A SINGLE HAND HELD CAMERA", U.S. Patent Application number 09/595,402, herein incorporated by reference, a method for self calibration was proposed that took advantage of the fact that the mechanism used in adapters defined in U.S. Patent No. 5,532,777 has a predictable effect on the disparity among homologous points. However, the method described is slow, in part because it relies on two-dimensional correspondence solving.
The present invention does not rely on a mechanical coupling to select the correct calibration map. Instead, it extracts the convergence of the system from the scene by scanning for the disparity between the left and right views of homologous points, or of a red aiming dot thrown on the scene by a laser beam positioned in the central epipolar plane, halfway along and perpendicular to the baseline (the line that connects the centers of the left and right views). Because such points lie on conjugate epipolar lines, the correspondence problem reduces to a one-dimensional search, which is simpler and therefore faster. Referring to Fig. 3, once the correspondence between the left and right views of the aiming dot (or of homologous points) is known, the disparity is determined. Using the range of the laser dot projected on the scene, obtained independently with a linear position sensor (31), the convergence can be derived by comparing the disparity with the observed range. Next, a calibration map matching the current convergence is selected, the 3-D imager is calibrated, and the left and right views are rectified. Once both views are rectified, all points in the left and right views lie on conjugate epipolar lines, and solving the correspondence for homologous points in the left and right views is relatively simple and therefore fast.
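The relation between range, disparity and convergence is not written out as a formula in the text, so the following sketch fills it in under stated assumptions: symmetric pinhole views each toed in by a half-angle theta, a common focal length expressed in pixels, and an aiming dot lying on the mid-perpendicular of the baseline (where the laser is placed). Under that model the convergence can be recovered from the measured range and disparity and the nearest pre-stored calibration map selected; all numeric values are hypothetical.

```python
# Hedged sketch: under the assumed symmetric geometry, a dot at range R gives
#     disparity d = 2 * f_px * tan( atan(b / (2R)) - theta )
# which can be inverted for the convergence half-angle theta.
import math

def convergence_from_range_and_disparity(range_m, disparity_px, baseline_m, focal_px):
    """Convergence half-angle (radians) of each view, under the model above."""
    return math.atan(baseline_m / (2.0 * range_m)) - math.atan(disparity_px / (2.0 * focal_px))

def select_calibration_map(theta, calibration_maps):
    """Pick the pre-computed calibration map whose convergence is closest to theta."""
    return min(calibration_maps.items(), key=lambda kv: abs(kv[0] - theta))[1]

# Hypothetical numbers: 0.1 m baseline, 800 px focal length, dot at 1.5 m
# observed with a 20 px disparity.
theta = convergence_from_range_and_disparity(1.5, 20.0, 0.1, 800.0)
maps = {math.radians(a): f"map_{a}deg" for a in (0.5, 1.0, 1.5, 2.0, 2.5)}
print(math.degrees(theta), select_calibration_map(theta, maps))
```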
As shown in Fig. 4, in an alternative embodiment, the range of the aim dot is obtained with a laser range finder (40) that works according to the 'time of flight' principle.
Some design requirements of the 3D imager to make this scheme work are:
Change of convergence for both views must be over one, and only one, common axis of rotation.
The CCD scan lines must be exactly parallel to the central epipolar plane (30) of the imager. The CCD scan lines are also preferably perpendicular to the common axis of rotation for the convergence adjustment.
To aid in detecting the disparity of homologous points in low contrast scenes, a laser beam aimed at the scene and positioned in the central epipolar plane, preferably halfway along and perpendicular to the baseline, is used. A laser range finder (40), whether it uses a linear optical position sensor (31) or a time of flight sensor, needs its laser beam aimed exactly as defined for the aiming laser beam described in the previous paragraph.
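How a linear optical position sensor yields an independent range is not detailed in the text; the sketch below uses one common laser-triangulation arrangement (assumed here, not taken from the patent): the sensor's lens sits a known distance from the laser with its axis parallel to the beam, so the spot position on the sensor encodes range. All numbers are hypothetical.

```python
# Assumed triangulation geometry: laser along the z axis, PSD lens offset by
# `separation_m` with its axis parallel to the beam, so
#     x_sensor = focal_m * separation_m / range_m.
def range_from_psd(spot_pos_m, separation_m, focal_m):
    """Range to the laser dot (metres) from the spot position on the linear PSD."""
    if spot_pos_m <= 0.0:
        raise ValueError("spot position must be positive in this geometry")
    return focal_m * separation_m / spot_pos_m

# Hypothetical values: 16 mm lens, 50 mm laser-to-lens separation, 0.53 mm spot offset.
print(range_from_psd(0.00053, 0.05, 0.016))   # ~1.5 m
```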
In a third embodiment, shown in Fig. 5, both convergence and scale are extracted from the image. The system works as follows:
Convergence extraction:
With a laser range finder (LRF) the convergence can be reconstructed using the range c, the known baseline z, and the directions of the line of sight to the dot in the left view [xL, yL] and the right view [xR, yR]. Once the convergence is known, the system can be calibrated with a static calibration map matching the current convergence angle and the image can be rectified.
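The geometry relating c, z and the per-view line-of-sight directions is only named above, not derived, so the following sketch spells out one consistent reading: the dot lies on the mid-perpendicular of the baseline at range c, the optical centres sit z/2 either side of the beam, image x-coordinates are measured from each view's principal point (positive toward the scene centre), and both views share a focal length in pixels. These conventions and the numbers are assumptions for illustration.

```python
# Hedged sketch: each view's toe-in is the true bearing to the LRF dot minus
# the bearing observed in that view,
#     theta = atan(z / (2c)) - atan(x / f_px)
import math

def view_convergence(range_c, baseline_z, x_img_px, focal_px):
    """Toe-in angle (radians) of one view from the LRF range and the dot's image x."""
    return math.atan(baseline_z / (2.0 * range_c)) - math.atan(x_img_px / focal_px)

# Hypothetical example: 1.5 m range, 0.1 m baseline, 800 px focal length,
# dot seen at x = +12 px in the left view and x = +14 px in the right view.
theta_left = view_convergence(1.5, 0.1, 12.0, 800.0)
theta_right = view_convergence(1.5, 0.1, 14.0, 800.0)
print(math.degrees(theta_left), math.degrees(theta_right))
# With these angles the matching static calibration map can be selected and
# the left and right views rectified.
```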
Scale extraction:
Any time the lens zoom setting is changed, a second laser beam of a laser range finder positioned at an offset d and parallel to the LRF illuminates the target with a dot.
Because the system is rectified, it is possible to obtain a non-scaled length estimate of the distance f. The offset d is known, and the ranges c and b allow a direct, scaled estimate of f. Comparing the estimated and direct measurements leads to a scaling correction so that the rectified system can measure in known units; this correction depends on the focal length of the lens used with the system.
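The scale correction itself is not written out as a formula; the sketch below follows one straightforward reading of the text: with parallel beams a known offset d apart striking the target at ranges c and b, the true separation of the two dots is sqrt(d^2 + (c - b)^2), and dividing by the unscaled separation f recovered from the rectified views gives the correction factor. The quantities used are hypothetical.

```python
import math

def scale_correction(offset_d, range_c, range_b, unscaled_f):
    """Factor that converts the rectified system's unscaled lengths into metres."""
    direct_f = math.hypot(offset_d, range_c - range_b)  # true dot separation
    return direct_f / unscaled_f

# Hypothetical usage: 0.06 m beam offset, ranges 1.50 m and 1.48 m,
# and an unscaled dot separation of 3.2 units from the rectified views.
print(scale_correction(0.06, 1.50, 1.48, 3.2))   # metres per unscaled unit
```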
Accordingly, it is to be understood that the embodiments of the invention herein described are merely illustrative of the application of the principles of the invention. Reference herein to details of the illustrated embodiments is not intended to limit the scope of the claims, which themselves recite those features regarded as essential to the invention.

Claims

What is claimed is:
1. A method of solving a correspondence problem associated with a convergent 3D imaging system comprising the steps of:
a) aligning a horizontal epipolar line parallel to a central scan line of a video image sensor such that at least one scan line of the video image sensor adheres to epipolar geometry; and
b) extracting a convergence of the system from a scene.
2. The method of claim 1, wherein, in step b), a disparity search in a central epipolar plane begins at a scan line closest to a central epipolar plane, and expands symmetrically within adjustable limits to adjacent scan lines until it solves a correspondence for homologous points in a left view and a right view of a scene.
3. The method of claim 1, wherein step b) is accomplished by projecting an aiming dot illuminated by a laser beam within a central epipolar plane.
4. The method of claim 1, wherein the horizontal epipolar line is perpendicular to a single axis of rotation.
5. The method of claim 1, wherein the central scan line comprises a scene point from each image on an epipolar line.
6. The method of claim 1, wherein the video image sensor is a charge-coupled device.
7. The method of claim 1, further comprising the step of self calibrating the video image sensor.
8. The method of claim 1, further comprising the step of rectifying at least one image produced by the video image sensor.
9. A method of extracting at least one setting of a 3D adapter with variable convergence from an image, comprising the steps of: a) aligning a horizontal epipolar line parallel to a central scan line of a video image sensor such that at least one scan line of the video image sensor adheres to epipolar geometry;
b) illuminating the image with a first laser beam aimed within a central epipolar plane;
c) measuring a range with a sensor;
d) conducting a disparity estimation of a laser dot on the central scan line; and
e) reconstructing a convergence from the range and the disparity estimation.
10. The method of claim 9, wherein, in step d), searching for the disparity in a central epipolar plane begins at a scan line closest to a central epipolar plane, and expands symmetrically within adjustable limits to adjacent scan lines until it solves a correspondence for homologous points in a left view and a right view of a scene.
11. The method of claim 9, wherein step d) is accomplished by projecting an aiming dot illuminated by a laser beam within a central epipolar plane.
12. The method of claim 9, wherein step e) is accomplished by the substeps of:
i) comparing the disparity with the range;
ii) selecting a calibration map matching the convergence;
iii) calibrating the video image sensor;
iv) rectifying a left view and a right view; and
v) solving a correspondence in the left view and the right view for homologous points.
13. The method of claim 9, wherein step c) is performed by measuring a time of flight on the first laser beam.
14. The method of claim 9, further comprising the steps of:
f) positioning a second laser beam at an offset d and parallel to the first laser beam of step b);
g) obtaining a non-scaled length estimate of a distance f;
h) deriving a direct scaled estimate of f from the offset d and the range; and
i) deriving a scaling correction by comparing an estimated measurement and a direct measurement.
15. The method of claim 9, wherein, in step a), the scan line is also perpendicular to a common axis of convergence adjustment.
16. The method of claim 9, wherein a change of convergence for both views of the image is over one common axis of rotation.
17. The method of claim 9, wherein the sensor is a time of flight sensor.
18. The method of claim 9, wherein the sensor is an optical position sensor.
19. The method of claim 9, wherein the video image sensor is a charge-coupled device.
20. The method of claim 9, wherein a setting extracted by the method is a magnification setting.
PCT/US2004/009372 2003-03-27 2004-03-26 Method of solving the correspondence problem in convergent stereophotogrammetry WO2004088245A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US45807403P 2003-03-27 2003-03-27
US60/458,074 2003-03-27
US52996403P 2003-12-16 2003-12-16
US60/529,964 2003-12-16

Publications (1)

Publication Number Publication Date
WO2004088245A1 (en) 2004-10-14

Family

ID=33135086

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2004/009372 WO2004088245A1 (en) 2003-03-27 2004-03-26 Method of solving the correspondence problem in convergent stereophotogrammetry

Country Status (1)

Country Link
WO (1) WO2004088245A1 (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2891451A1 (en) * 2005-10-04 2007-04-06 Amplitude Soc Par Actions Simp Navigation system for orthopedic surgery, has operational area observation unit provided as box connected to telescopic column via ball and socket joint type articulation and oriented with respect to column using adjusting handles
CN101853528A (en) * 2010-05-10 2010-10-06 沈阳雅克科技有限公司 Hand-held three-dimensional surface information extraction method and extractor thereof
CN102519436A (en) * 2011-12-28 2012-06-27 武汉大学 Chang'e-1 (CE-1) stereo camera and laser altimeter data combined adjustment method
CN101563709B (en) * 2006-12-18 2013-07-31 皇家飞利浦电子股份有限公司 Calibrating a camera system
CN104603672A (en) * 2012-03-30 2015-05-06 汤姆逊许可公司 Laser projector system with graphical pointer
WO2015118467A1 (en) * 2014-02-05 2015-08-13 Creaform Inc. Structured light matching of a set of curves from two cameras
US9129378B2 (en) 2011-09-07 2015-09-08 Thomson Licensing Method and apparatus for recovering a component of a distortion field and for determining a disparity field
CN104897176A (en) * 2015-06-29 2015-09-09 北京建筑大学 Multicore parallel photogrammetry block adjustment method
DE102018115176B3 (en) 2018-06-25 2019-08-01 Sick Ag Stereo camera and alignment method
CN110148177A (en) * 2018-02-11 2019-08-20 百度在线网络技术(北京)有限公司 For determining the method, apparatus of the attitude angle of camera, calculating equipment, computer readable storage medium and acquisition entity
JP2019190962A (en) * 2018-04-24 2019-10-31 大成建設株式会社 Image horizontal adjusting device and program of the same, and drawing generating system
US10643343B2 (en) 2014-02-05 2020-05-05 Creaform Inc. Structured light matching of a set of curves from three cameras

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
GOSHTASBY A ET AL: "DESIGN OF A SINGLE-LENS STEREO CAMERA SYSTEM", PATTERN RECOGNITION, PERGAMON PRESS INC. ELMSFORD, N.Y, US, vol. 26, no. 6, 1 June 1993 (1993-06-01), pages 923 - 937, XP000385020, ISSN: 0031-3203 *

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2891451A1 (en) * 2005-10-04 2007-04-06 Amplitude Soc Par Actions Simp Navigation system for orthopedic surgery, has operational area observation unit provided as box connected to telescopic column via ball and socket joint type articulation and oriented with respect to column using adjusting handles
CN101563709B (en) * 2006-12-18 2013-07-31 皇家飞利浦电子股份有限公司 Calibrating a camera system
CN101853528A (en) * 2010-05-10 2010-10-06 沈阳雅克科技有限公司 Hand-held three-dimensional surface information extraction method and extractor thereof
US9129378B2 (en) 2011-09-07 2015-09-08 Thomson Licensing Method and apparatus for recovering a component of a distortion field and for determining a disparity field
CN102519436A (en) * 2011-12-28 2012-06-27 武汉大学 Chang'e-1 (CE-1) stereo camera and laser altimeter data combined adjustment method
CN104603672A (en) * 2012-03-30 2015-05-06 汤姆逊许可公司 Laser projector system with graphical pointer
WO2015118467A1 (en) * 2014-02-05 2015-08-13 Creaform Inc. Structured light matching of a set of curves from two cameras
US10271039B2 (en) 2014-02-05 2019-04-23 Creaform Inc. Structured light matching of a set of curves from two cameras
US10643343B2 (en) 2014-02-05 2020-05-05 Creaform Inc. Structured light matching of a set of curves from three cameras
CN104897176A (en) * 2015-06-29 2015-09-09 北京建筑大学 Multicore parallel photogrammetry block adjustment method
CN110148177A (en) * 2018-02-11 2019-08-20 百度在线网络技术(北京)有限公司 For determining the method, apparatus of the attitude angle of camera, calculating equipment, computer readable storage medium and acquisition entity
JP2019190962A (en) * 2018-04-24 2019-10-31 大成建設株式会社 Image horizontal adjusting device and program of the same, and drawing generating system
JP7076097B2 (en) 2018-04-24 2022-05-27 大成建設株式会社 Image leveling device and its program, and drawing generation system
DE102018115176B3 (en) 2018-06-25 2019-08-01 Sick Ag Stereo camera and alignment method

Similar Documents

Publication Publication Date Title
TWI791728B (en) Augmented reality display with active alignment
US6643396B1 (en) Acquisition of 3-D scenes with a single hand held camera
US10869024B2 (en) Augmented reality displays with active alignment and corresponding methods
CN113256730B (en) System and method for dynamic calibration of an array camera
JP5014979B2 (en) 3D information acquisition and display system for personal electronic devices
EP0811876B1 (en) Method and apparatus for three-dimensional measurement and imaging having focus-related convergence compensation
US6512892B1 (en) 3D camera
JP4115801B2 (en) 3D imaging device
EP1087336A3 (en) Apparatus and method for stereoscopic image processing
GB2354391A (en) 3D camera having maximum parallax warning.
GB2388896A (en) An apparatus for and method of aligning a structure
WO2004088245A1 (en) Method of solving the correspondence problem in convergent stereophotogrammetry
CN109668509A (en) Based on biprism single camera three-dimensional measurement industrial endoscope system and measurement method
JP4432462B2 (en) Imaging apparatus and method, imaging system
WO2011134215A1 (en) Stereoscopic camera device
US20160241841A1 (en) A stereoscopic assembly and method for manufacturing same
CN108413941A (en) A kind of simple and efficient distance measuring method based on cheap binocular camera
Montgomery et al. Stereoscopic camera design
JP4020911B2 (en) 3D imaging device
Ariff et al. Stereo-image capture tool for forensic mapping of surveillance video footage
Berestova Stereo radiography: distortion correction and perception improvement

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase