WO1993003579A1 - Three-dimensional vision system - Google Patents

Three-dimensional vision system

Info

Publication number
WO1993003579A1
Authority
WO
WIPO (PCT)
Application number
PCT/GB1991/001638
Other languages
French (fr)
Inventor
Andrew Blake
Original Assignee
Isis Innovation Limited
Application filed by Isis Innovation Limited filed Critical Isis Innovation Limited
Publication of WO1993003579A1 publication Critical patent/WO1993003579A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2513 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with several lines being projected in more than one direction, e.g. grids, patterns


Abstract

A three-dimensional vision system for optical inspection and robotics. One or two projectors establish structured light in an object field comprising two sets of parallel stripes on a ground plane, one set having a periodicity on the ground plane of 1/rA and making a first angle with epipolar lines of the system and the other set having a periodicity on the ground plane of 1/rB and making a second, much larger angle with the epipolar lines.

Description

THREE-DIMENSIONAL VISION SYSTEM
The invention relates to a three-dimensional vision system for optical inspection and robotics. In particular the invention concerns an active vision system where the object field to be viewed is illuminated in a particular manner and the perceived image is analysed having regard to the specific manner of illumination.

One kind of active vision system uses structured light, the object field being illuminated in a predetermined pattern. This may be achieved by a mechanical scanning device or by a projector system which, by the use of optical masking, projects a static pattern of light on to the object field. Such a system is proposed by Hu and Stockman ("3D surface solution using structured light and constraint propagation", IEEE Trans. PAMI, Vol. 11, No. 4, pages 390-402, 1989), who suggested the use of a grid of light from a single projector to illuminate the scene, providing easily-matched artificial "surface features". However, the system was incapable of unambiguous operation: for each imaged grid point, several candidate 3D points were generated. Proposed solutions to the ambiguity problem have involved coding stripes by colour, thickness, or pattern. Colour coding cannot comfortably be combined with narrow-band optical filtering for daylight operation. Thickness coding may be deceived when the thickness of a stripe is modified by surface effects. Space coding relies on projected codes remaining substantially undamaged by surface relief.

The present invention provides a solution to the problem of resolving ambiguities in such a system. According to the invention there is provided a three-dimensional vision system which comprises:
(a) an object field having a ground plane;
(b) a projector system for illuminating the object field with structured light, the projector system having a focal length f', a projector plane and a centre of projection at a height H above the ground plane;
(c) a camera for viewing the object field, the camera having an optical centre; and
(d) a base line of length t constituted by a straight line joining the centre of projection to the camera optical centre, epipolar lines in the projector plane being defined as lines co-planar with the base line, characterised in that the structured light comprises two sets of parallel stripes on the ground plane, one set having a periodicity on the ground plane of 1/rA and making a first angle with the epipolar lines and the other set having a periodicity on the ground plane of 1/rB and making a second, much larger angle with the epipolar lines.
In a preferred embodiment of the invention the object field is restricted by a plane parallel to the ground plane at a height h, h being given by the equation reproduced as Figure imgf000004_0001 (of the form h = H multiplied by a dimensionless factor), where ep is the system measurement error referred to the projector plane.
Angling the stripes and restricting the working volume in the manner described ensures that ambiguities are avoided.
It can be arranged, as in the Hu and Stockman system, for example, that both stripe sets emanate from the same optical aperture. However, this requires corner detection in the analysis of the received image and it is found that edge detection can be more accurate. Accordingly, it is preferred to use separate projectors for the two sets of stripes, and the system can be regarded as trinocular, having two projectors and a camera. In order to establish a centre of projection and a projector plane for a system employing two projectors, it is arranged that lines common to projected planes of light from each projector intersect, thereby giving a virtual centre of projection and a virtual projector plane.
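The alignment condition just described, that every projected plane of light passes through a single common point, lends itself to a simple numerical check. The following is a minimal Python sketch, assuming each stripe's plane of light has been calibrated in the form n.x = d; the least-squares point and its worst residual indicate how nearly the planes are concurrent. All names and data here are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def virtual_centre(normals, offsets):
    """Least-squares point minimising distance to planes n_i . x = d_i.

    If the two projectors are correctly aligned, every plane of light
    from projectors A and B passes (to within calibration error) through
    this single point, the virtual centre of projection.
    """
    N = np.asarray(normals, dtype=float)   # (k, 3) unit plane normals
    d = np.asarray(offsets, dtype=float)   # (k,) plane offsets
    x, *_ = np.linalg.lstsq(N, d, rcond=None)
    return x, np.abs(N @ x - d).max()      # point, worst plane distance

# Illustrative planes, all constructed to contain the point (0, 0, 2):
rng = np.random.default_rng(0)
true_centre = np.array([0.0, 0.0, 2.0])
normals = rng.normal(size=(12, 3))
normals /= np.linalg.norm(normals, axis=1, keepdims=True)
centre, worst = virtual_centre(normals, normals @ true_centre)
print(centre, worst)   # recovers ~ (0, 0, 2); worst residual ~ 0
```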
The invention will further be described with reference to the accompanying drawings, of which:-
Figure 1 is a diagram of a three-dimensional vision system in accordance with the invention with two projectors;
Figure 2 is a diagram illustrating the virtual centre of projection and the virtual projector plane for the two-projector system;
Figure 3 is a schematic diagram of a single projector system;
Figure 4 is a diagram of the projector plane of the Figure 1 arrangement;
Figure 5 is a diagram illustrating the selection of the skew angle of the grid lines with respect to the epipolar lines of the system; and
Figure 6 is a diagram illustrating the selection of the working volume for the system.
Referring to Figure 1 the system comprises two projectors A and B with masks 3 and 4 which define respective sets of parallel lines. The sets are crossed. Light-planes 5 are shown emanating from projector A and a light plane 6 is shown emanating from projector B. The light is directed to an object field having a ground plane 7. A video camera 8 is provided and this has an image plane 9.
Figure 2 shows that when the two projectors are in a suitable mutual alignment there is then a single virtual centre of projection 12 - the point through which all planes of light from projectors A and B pass. A fine height adjustment ensures, as part of the stripe calibration process, that all planes pass through a common point. A virtual projection plane 13 is chosen parallel to the planes of both projectors so that stripes are parallel on the virtual plane. Then a fine adjustment for the camera ensures that epipolar lines are also parallel on this projection plane.
Figure 3 illustrates a single projector system to which the two-projector system is equivalent under the conditions of Figure 2. It will be seen that the object 10 being viewed has crossed lines projected on to it and the image of these as detected by the camera is computer analysed. The purpose of using two projectors rather than one is so that each can carry its own set of parallel stripes, and can be switched on or off independently. Thus the two sets of stripes are imaged separately in two frames, and only overlaid computationally in the image plane to form a grid. This means that grid points in the image, which are now virtual intersections of stripe-pairs, can be localised as finely as the accuracy of the edge-detector will allow. In practice this means that an order of magnitude of sub-pixel resolution is possible, which is better than can be achieved by a corner detector. The disadvantage is that motion is no longer frozen if the two frames are captured sequentially. However, the two frames could be captured simultaneously using a standard 3-colour CCD camera, leaving the third colour channel free as an intensity reference, providing some degree of rejection of surface markings and discontinuities.
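The computational overlay of the two stripe sets can be sketched as follows: each stripe, imaged in its own frame, is fitted with a line through subpixel edge points, and a virtual grid point is the intersection of an A-line with a B-line. This is a hypothetical Python sketch; the fitting method and the sample data are illustrative assumptions, not the patent's procedure.

```python
import numpy as np

def fit_stripe_line(points):
    """Fit a 2D line n . x = c (unit normal n) to subpixel edge points
    from one stripe, imaged in its own frame."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)   # least-variance direction
    n = vt[-1]                                 # is the line normal
    return n, float(n @ centroid)

def virtual_grid_point(line_a, line_b):
    """Intersect an A stripe with a B stripe in the image plane.

    The crossing exists only computationally (the stripes were captured
    in separate frames), so it is localised to edge-detector accuracy."""
    (na, ca), (nb, cb) = line_a, line_b
    return np.linalg.solve(np.stack([na, nb]), np.array([ca, cb]))

# Illustrative edge samples from one A stripe and one B stripe:
a = fit_stripe_line([(0.0, 0.02), (1.0, 1.01), (2.0, 1.99)])  # ~ y = x
b = fit_stripe_line([(0.0, 2.00), (1.0, 1.00), (2.0, 0.01)])  # ~ y = 2 - x
print(virtual_grid_point(a, b))   # ~ (1.0, 1.0)
```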
In the projector plane 13 there is a grid of lines constituted by intersecting parallel sets A and B. The straight line joining the centre of projection 12 with the optical centre 15 of the camera is a base line shown at 16, having a length t. A target point T on the object 10 defines with the base line 16 a plane, the intersection of which with the projector plane 13 is an epipolar line 17. It is clear that a stripe intersection corresponding to the position of target T must lie in the projector plane on the epipolar line 17. Given a stripe-intersection at P in the image, it is required to identify which pair (a,b) of stripes in the projector plane generated it. The epipolar constraint (figure 3) greatly limits candidate solutions for (a,b) but there may still be some ambiguity. It is clear that reducing system error and increasing stripe spacing would reduce ambiguity. However, ambiguity can be eliminated if a "near-degenerate" arrangement is used, as in figure 4. By arranging for epipolar lines to be almost parallel to one set of stripes in the grid - say the "A" set - the number of candidate solutions for intersections of stripes with a given epipolar line is greatly reduced. If, further, the intersection can be guaranteed to lie within a certain window W in the projector plane then just one unique solution remains. This can be achieved by restricting the working volume over which the range-sensor operates. Provided working volume limits are compatible with measurement tolerance, ambiguity is avoided. If, for example, measurement error bounds are deliberately slackened by a factor of 2, the 3D positions of about half of the grid-nodes become ambiguous. Hence performance is dependent on careful analysis of error and epipolar geometry.
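A minimal sketch of this matching step, assuming the epipolar line induced by an image point P is available in the projector plane as n.x = c: candidates (a,b) are the grid nodes lying within the error-thickened epipolar line and inside the working-volume window W. The function and data names are hypothetical.

```python
import numpy as np

def candidate_pairs(epipolar, nodes, e_p, in_window):
    """Candidate stripe pairs (a, b) for one image grid point.

    epipolar  : (n, c), the epipolar line n . x = c in the projector
                plane induced by the image point P (unit normal n).
    nodes     : dict mapping (a, b) -> grid-node position in the
                projector plane.
    e_p       : measurement error referred to the projector plane;
                the epipolar line is "thickened" by this amount.
    in_window : predicate implementing the working-volume window W.
    """
    n, c = epipolar
    return [ab for ab, x in nodes.items()
            if abs(n @ x - c) <= e_p and in_window(x)]

# Illustrative 50 x 50 grid and a nearly stripe-aligned ("near-degenerate")
# epipolar line:
nodes = {(a, b): np.array([2.0 * a, 1.0 * b])
         for a in range(50) for b in range(50)}
n = np.array([0.02, 1.0]); n /= np.linalg.norm(n)
hits = candidate_pairs((n, 0.5), nodes, e_p=0.1,
                       in_window=lambda x: x[0] <= 20.0)
print(hits)   # a handful of survivors out of 2500 nodes
```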
In order to achieve optimal suppression of ambiguity, the "A" stripes must be rotated to a near-degenerate alignment. Let us denote by pA, pB the periodicity, on the projection plane, of stripes from projectors A and B respectively. The angle δ between the epipolar line through a particular grid crossing and the A stripe should be just large enough that the nodes immediately adjacent on the A stripe clear the thickened epipolar line (figure 5). This is achieved by choosing

sin δ > ep / pB

where ep is the system error parameter whereby epipolar lines are effectively thickened. This parameter can be measured experimentally. If an arrangement is used in which the baseline is parallel to the projector plane, epipolar lines are mutually parallel on the projector plane, so that the rotational alignment can be achieved simultaneously throughout the plane. Now the bound for δ can be expressed in terms of the density rB (in mm⁻¹) of stripes on the ground-plane (Figure imgf000008_0001):

sin δ > ep rB H / f'

where f' is the projector's focal length and H is the height of its centre of projection above the ground-plane. If, for some reason, it were essential to use a non-parallel baseline then a matching non-parallel stripe set could be used in the projector to maintain simultaneous alignment.
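As a worked example of the skew-angle condition (using the bound sin δ > ep/pB as reconstructed above, with illustrative numbers that are not values from the patent):

```python
import math

# Near-degeneracy condition: sin(delta) > ep / pB.
ep = 0.05   # system error referred to the projector plane, mm (assumed)
pB = 2.0    # periodicity of the B stripes on the projector plane, mm (assumed)

delta_min = math.degrees(math.asin(ep / pB))
print(f"rotate the A stripes at least {delta_min:.2f} deg off the epipolars")
# -> about 1.43 deg; in ground-plane terms the same bound reads
#    sin(delta) > ep * rB * H / f'
```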
Assuming near-degenerate alignment as specified above, how much must the working volume be restricted in order to guarantee a unique solution for the stripe intersection? The answer can be expressed as a design rule which represents a trade-off between the stripe densities rA, rB on the ground-plane, for projectors A and B respectively, the measurement error ep referred to the projector plane, and the height h of the working volume above the ground plane. The design rule stated here applies to the case in which the projector-camera baseline is parallel to the ground plane, as in figure 6. In that case, for a given error ep, the working volume is bounded above by a plane parallel to the ground plane, at a height given by the equation reproduced as Figure imgf000009_0001 (again of the form h = H multiplied by a dimensionless factor), where H is the height of the centre of projection above the ground, f' is its focal length and t is the length of the camera-projector baseline.
The implications of this rule for system design are most easily appreciated by looking at the approximation in the limit that h << H (the top of the working volume is well below the centre of projection) and that ep << pA (measurement error much smaller than stripe spacing). Furthermore, we take the projector plane error ep to be the bounding value. In that case, the design rule approximates to the expression reproduced as Figure imgf000009_0002 (which carries a denominator f' cos δ), where Λ = 1/(rArB) is the area of a grid square on the baseplate and eI is the error e referred to the image plane.
The principal design trade-offs are then as follows. Overall system performance is limited by the angular measurement accuracy. Working-volume height h is traded off against the size of the smallest resolvable area Λ. Resolvable area and/or working volume constraints can be improved by enlarging the baseline t, which also improves depth resolution, but at the expense of an increased incidence of occlusion of the projected pattern.
The design rule depends to some extent on the angle δ being small, certainly in relation to the angle 90° − δ which the B stripes make with the epipolar lines.
Although the above description has been made with reference to thin stripes, these are an inefficient use of illuminant power. In practice, thick stripes are used, preferably with a 1:1 mark-to-space ratio, and positive and negative light/dark transitions are treated as "mathematical" stripes. They can be precisely localised by a directional edge-detector. Furthermore, positive and negative stripes can be distinguished by their orientation in the image-plane, hence effectively doubling the achievable stripe density as specified by the above design rule. Of course the orientation n of a stripe as measured in the image plane (n is the image-stripe normal, pointing towards the light side of the stripe) depends on the orientation of the object surface. Notwithstanding, the following result can be proved which provides a robust constraint which immediately distinguishes positive and negative stripes:
An image stripe with normal n is distinguished as positive or negative according to the sign of the scalar product n.e, where e is the direction of an epipolar line that cuts the stripe.
Equivalently, no matter how the orientation of the object surface varies, an image stripe never rotates through an epipolar line.
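This polarity test is a one-line computation; a minimal sketch in Python, with illustrative vectors:

```python
import numpy as np

def stripe_polarity(n, e):
    """Positive or negative stripe from the sign of n . e.

    n : image-stripe normal, pointing towards the light side.
    e : direction of an epipolar line that cuts the stripe.
    The sign is invariant to object surface orientation, because an
    image stripe never rotates through an epipolar line.
    """
    return "positive" if float(np.dot(n, e)) > 0.0 else "negative"

print(stripe_polarity(np.array([0.2, 0.98]), np.array([1.0, 0.1])))  # positive
```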
The grid-point matching process can be quickened considerably by computing certain tables offline, that is, at calibration time.
Stripe-range table - ranges of the indices a, b indicating, at each image point P, which stripes could possibly be imaged at the point P, given the working volume constraints. In practice this cuts down enormously the number of grid-nodes a, b that need be tested online. In our system, the stripe-range table, together with the stripe-polarity test above, cuts down the number of grid-nodes to be tested from 2500 to less than 10.
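A hypothetical sketch of the offline construction, in which candidates_at(u, v) stands for the calibration-specific enumeration of feasible stripe pairs at an image point (assumed here, not given by the patent):

```python
import numpy as np

def build_stripe_range_table(image_shape, candidates_at):
    """Offline (calibration-time) stripe-range table.

    For each image point, store the ranges of the indices a, b that
    could possibly be imaged there given the working-volume constraints.
    candidates_at(u, v) enumerates feasible (a, b) pairs, e.g. by
    ray-casting through the working volume; its implementation is
    calibration-specific and assumed rather than shown.
    """
    h, w = image_shape
    table = np.empty((h, w), dtype=object)
    for v in range(h):
        for u in range(w):
            pairs = list(candidates_at(u, v))
            a_vals = [a for a, _ in pairs]
            b_vals = [b for _, b in pairs]
            table[v, u] = (min(a_vals), max(a_vals),
                           min(b_vals), max(b_vals))
    return table

# Online, a lookup in this table plus the polarity test above reduces
# the grid nodes to be tested from 2500 to fewer than 10.
```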
Node-epipolar table - a table containing the coefficients of the equation of the epipolar line in the image plane for each grid-node is constructed off-line. Then, in place of the test in the projector plane for intersection between the epipolar line and a grid-node, the dual test is carried out. The dual of the grid-node in the projector plane is its corresponding epipolar line in the image plane. The dual of the epipolar in the projector plane is the image point P. The node-epipolar table requires only a modest amount of storage space (2500 nodes in our system) and makes the intersection test very rapid. The original test in the projector plane would require either on-line generation of epipolars, which is slow, or storage of epipolar lines for all possible image positions (say 10⁷ of them), which is impractical.
Testing against a "thick" epipolar line, to allow for image measurement error, is done by ensuring that node-epipolars are stored as suitably normalised vectors c; the test for intersection is then -eI < c.P < eI, where eI is the image measurement error of the system. The current system achieves an accuracy of about 0.2mm over a working volume of 50mm³. It runs at a rate of around 10 ms per node processed, on a SUN 4/260. Most of the computation time is spent in edge-detection, applying Gaussian masks, searching for maxima and interpolating between pixels. Current efforts to reduce this by using fixed-point framestore convolution hardware to implement a preliminary pass of low-precision edge-detection will leave the computationally costly, subpixel edge-localisation to be done only at the grid points themselves.
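The node-epipolar table and the thick-line test above can be sketched as follows; epipolar_of stands for the calibration-specific mapping from a grid node to its image-plane epipolar line, and is assumed rather than given by the patent.

```python
import numpy as np

def build_node_epipolar_table(nodes, epipolar_of):
    """Offline: for each grid-node, the homogeneous coefficients
    c = (c1, c2, c3) of its epipolar line in the image plane,
    normalised so that c . (u, v, 1) is a signed distance."""
    table = {}
    for ab, node in nodes.items():
        c = np.asarray(epipolar_of(node), dtype=float)  # assumed mapping
        table[ab] = c / np.hypot(c[0], c[1])            # unit line normal
    return table

def on_thick_epipolar(c, P, e_i):
    """Dual intersection test: -eI < c . P < eI with P = (u, v, 1)."""
    u, v = P
    return abs(float(c @ np.array([u, v, 1.0]))) < e_i
```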

Claims

1. A three-dimensional vision system which comprises: (a) an object field having a ground plane;
(b) a projector system for illuminating the object field with structured light, the projector system having a focal length f', a projector plane and a centre of projection at a height H above the ground plane;
(c) a camera for viewing the object field, the camera having an optical centre; and
(d) a base line of length t constituted by a straight line joining the centre of projection to the camera optical centre, epipolar lines in the projector plane being defined as lines co-planar with the base line, characterised in that the structured light comprises two sets of parallel stripes on the ground plane, one set having a periodicity on the ground plane of 1/rA and making a first angle with the epipolar lines and the other set having a periodicity on the ground plane of 1/rB and making a second, much larger angle with the epipolar lines.
2. A three-dimensional vision system as claimed in Claim 1 further characterised in that, in a preferred embodiment of the invention, the object field is restricted by a plane parallel to the ground plane at a height h, h being given by the equation reproduced as Figure imgf000013_0001 (an expression in H, rA, rB, ep and f'), where ep is the system measurement error referred to the projector plane.
3. A three-dimensional vision system as claimed in Claim 1 or Claim 2 wherein there are provided separate projectors for the two sets of stripes, the system thus being trinocular, having two projectors and a camera.
4. A three-dimensional vision system as claimed in Claim 3 wherein it is arranged that lines common to projected planes of light from each projector intersect, thereby giving a virtual centre of projection and a virtual projector plane.
PCT/GB1991/001638 1991-07-26 1991-09-24 Three-dimensional vision system WO1993003579A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB9116151.3 1991-07-26
GB919116151A GB9116151D0 (en) 1991-07-26 1991-07-26 Three-dimensional vision system

Publications (1)

Publication Number Publication Date
WO1993003579A1 true WO1993003579A1 (en) 1993-02-18

Family

ID=10699015

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB1991/001638 WO1993003579A1 (en) 1991-07-26 1991-09-24 Three-dimensional vision system

Country Status (2)

Country Link
GB (1) GB9116151D0 (en)
WO (1) WO1993003579A1 (en)

Cited By (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003078920A2 (en) * 2002-03-20 2003-09-25 Steinbichler Optotechnik Gmbh Method and device for determining the absolute coordinates of an object
WO2004076970A1 (en) * 2003-02-27 2004-09-10 Storz Endoskop Produktions Gmbh Method and optical system for measuring the topography of a test object
GB2410794A (en) * 2004-02-05 2005-08-10 Univ Sheffield Hallam Apparatus and methods for three dimensional scanning
DE102004044695A1 (en) * 2004-09-15 2006-03-30 Sick Ag Object distance measuring process, involves temporally illuminating object twice one after the other from different directions, and calculating distance of object on basis of received images from local misalignment of illuminating structure
EP1777485A1 (en) * 2004-08-03 2007-04-25 Techno Dream 21 Co., Ltd. Three-dimensional shape measuring method and apparatus for the same
GB2447060A (en) * 2007-03-01 2008-09-03 Magiqads Sdn Bhd Image creating method of a virtual three dimensional image reproduced on a planar surface
US7433024B2 (en) 2006-02-27 2008-10-07 Prime Sense Ltd. Range mapping using speckle decorrelation
WO2009063088A2 (en) * 2007-11-15 2009-05-22 Sirona Dental Systems Gmbh Method for optical measurement of objects using a triangulation method
WO2009063087A2 (en) * 2007-11-15 2009-05-22 Sirona Dental Systems Gmbh Method for optical measurement of the three dimensional geometry of objects
WO2010140059A3 (en) * 2009-06-01 2011-01-27 Gerd Hausler Method and device for three-dimensional surface detection with a dynamic reference frame
US8050461B2 (en) 2005-10-11 2011-11-01 Primesense Ltd. Depth-varying light fields for three dimensional sensing
US8090194B2 (en) 2006-11-21 2012-01-03 Mantis Vision Ltd. 3D geometric modeling and motion capture using both single and dual imaging
US8150142B2 (en) 2007-04-02 2012-04-03 Prime Sense Ltd. Depth mapping using projected patterns
CN102589571A (en) * 2012-01-18 2012-07-18 西安交通大学 Spatial three-dimensional vision-computing verification method
US8339616B2 (en) 2009-03-31 2012-12-25 Micrometric Vision Technologies Method and apparatus for high-speed unconstrained three-dimensional digitalization
US8350847B2 (en) 2007-01-21 2013-01-08 Primesense Ltd Depth mapping using multi-beam illumination
US8374397B2 (en) 2005-10-11 2013-02-12 Primesense Ltd Depth-varying light fields for three dimensional sensing
US8390821B2 (en) 2005-10-11 2013-03-05 Primesense Ltd. Three-dimensional sensing using speckle patterns
US8400494B2 (en) 2005-10-11 2013-03-19 Primesense Ltd. Method and system for object reconstruction
US8456517B2 (en) 2008-07-09 2013-06-04 Primesense Ltd. Integrated processor for 3D mapping
US8462207B2 (en) 2009-02-12 2013-06-11 Primesense Ltd. Depth ranging with Moiré patterns
US8494252B2 (en) 2007-06-19 2013-07-23 Primesense Ltd. Depth mapping using optical elements having non-uniform focal characteristics
US8538166B2 (en) 2006-11-21 2013-09-17 Mantisvision Ltd. 3D geometric modeling and 3D video content creation
US8649025B2 (en) 2010-03-27 2014-02-11 Micrometric Vision Technologies Methods and apparatus for real-time digitization of three-dimensional scenes
US8717417B2 (en) 2009-04-16 2014-05-06 Primesense Ltd. Three-dimensional mapping and imaging
US8786682B2 (en) 2009-03-05 2014-07-22 Primesense Ltd. Reference image techniques for three-dimensional sensing
US8830227B2 (en) 2009-12-06 2014-09-09 Primesense Ltd. Depth-based gain control
US8982182B2 (en) 2010-03-01 2015-03-17 Apple Inc. Non-uniform spatial resource allocation for depth mapping
US9030528B2 (en) 2011-04-04 2015-05-12 Apple Inc. Multi-zone imaging sensor and lens array
US9066087B2 (en) 2010-11-19 2015-06-23 Apple Inc. Depth mapping using time-coded illumination
US9098931B2 (en) 2010-08-11 2015-08-04 Apple Inc. Scanning projectors and image capture modules for 3D mapping
US9131136B2 (en) 2010-12-06 2015-09-08 Apple Inc. Lens arrays for pattern projection and imaging
US9157790B2 (en) 2012-02-15 2015-10-13 Apple Inc. Integrated optoelectronic modules with transmitter, receiver and beam-combining optics for aligning a beam axis with a collection axis
US9330324B2 (en) 2005-10-11 2016-05-03 Apple Inc. Error compensation in three-dimensional mapping
CN105783770A (en) * 2016-01-22 2016-07-20 西南科技大学 Method for measuring ice shaped contour based on line structured light
US9582889B2 (en) 2009-07-30 2017-02-28 Apple Inc. Depth mapping based on pattern matching and stereoscopic information
CN111630342A (en) * 2018-08-29 2020-09-04 深圳配天智能技术研究院有限公司 Gap detection method and system for visual welding system
CN116664408A (en) * 2023-07-31 2023-08-29 北京朗视仪器股份有限公司 Point cloud up-sampling method and device for color structured light

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4146926A (en) * 1976-09-02 1979-03-27 Iria Institut De Recherche D'informatique Et D'automatique Process and apparatus for optically exploring the surface of a body
US4802759A (en) * 1986-08-11 1989-02-07 Goro Matsumoto Three-dimensional shape measuring apparatus

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4146926A (en) * 1976-09-02 1979-03-27 Iria Institut De Recherche D'informatique Et D'automatique Process and apparatus for optically exploring the surface of a body
US4802759A (en) * 1986-08-11 1989-02-07 Goro Matsumoto Three-dimensional shape measuring apparatus

Cited By (59)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003078920A3 (en) * 2002-03-20 2004-02-05 Steinbichler Optotechnik Gmbh Method and device for determining the absolute coordinates of an object
US6876458B2 (en) 2002-03-20 2005-04-05 Steinbichler Optotechnik Gmbh Method and device for determining the absolute coordinates of an object
WO2003078920A2 (en) * 2002-03-20 2003-09-25 Steinbichler Optotechnik Gmbh Method and device for determining the absolute coordinates of an object
US7486805B2 (en) 2003-02-27 2009-02-03 Storz Endoskop Produktions Gmbh Method and optical system for measuring the topography of a test object
WO2004076970A1 (en) * 2003-02-27 2004-09-10 Storz Endoskop Produktions Gmbh Method and optical system for measuring the topography of a test object
GB2410794A (en) * 2004-02-05 2005-08-10 Univ Sheffield Hallam Apparatus and methods for three dimensional scanning
EP1777485A1 (en) * 2004-08-03 2007-04-25 Techno Dream 21 Co., Ltd. Three-dimensional shape measuring method and apparatus for the same
EP1777485A4 (en) * 2004-08-03 2012-09-19 Techno Dream 21 Co Ltd Three-dimensional shape measuring method and apparatus for the same
DE102004044695A1 (en) * 2004-09-15 2006-03-30 Sick Ag Object distance measuring process, involves temporally illuminating object twice one after the other from different directions, and calculating distance of object on basis of received images from local misalignment of illuminating structure
US8400494B2 (en) 2005-10-11 2013-03-19 Primesense Ltd. Method and system for object reconstruction
US8390821B2 (en) 2005-10-11 2013-03-05 Primesense Ltd. Three-dimensional sensing using speckle patterns
US9066084B2 (en) 2005-10-11 2015-06-23 Apple Inc. Method and system for object reconstruction
US8050461B2 (en) 2005-10-11 2011-11-01 Primesense Ltd. Depth-varying light fields for three dimensional sensing
US8374397B2 (en) 2005-10-11 2013-02-12 Primesense Ltd Depth-varying light fields for three dimensional sensing
US9330324B2 (en) 2005-10-11 2016-05-03 Apple Inc. Error compensation in three-dimensional mapping
US7433024B2 (en) 2006-02-27 2008-10-07 Prime Sense Ltd. Range mapping using speckle decorrelation
US8090194B2 (en) 2006-11-21 2012-01-03 Mantis Vision Ltd. 3D geometric modeling and motion capture using both single and dual imaging
US9367952B2 (en) 2006-11-21 2016-06-14 Mantisvision Ltd. 3D geometric modeling and 3D video content creation
US8538166B2 (en) 2006-11-21 2013-09-17 Mantisvision Ltd. 3D geometric modeling and 3D video content creation
US8208719B2 (en) 2006-11-21 2012-06-26 Mantis Vision Ltd. 3D geometric modeling and motion capture using both single and dual imaging
US10140753B2 (en) 2006-11-21 2018-11-27 Mantis Vision Ltd. 3D geometric modeling and 3D video content creation
US10902668B2 (en) 2006-11-21 2021-01-26 Mantisvision Ltd. 3D geometric modeling and 3D video content creation
US8350847B2 (en) 2007-01-21 2013-01-08 Primesense Ltd Depth mapping using multi-beam illumination
GB2447060B (en) * 2007-03-01 2009-08-05 Magiqads Sdn Bhd Method of creation of a virtual three dimensional image to enable its reproduction on planar substrates
GB2447060A (en) * 2007-03-01 2008-09-03 Magiqads Sdn Bhd Image creating method of a virtual three dimensional image reproduced on a planar surface
US8150142B2 (en) 2007-04-02 2012-04-03 Prime Sense Ltd. Depth mapping using projected patterns
US8494252B2 (en) 2007-06-19 2013-07-23 Primesense Ltd. Depth mapping using optical elements having non-uniform focal characteristics
WO2009063087A3 (en) * 2007-11-15 2009-09-24 Sirona Dental Systems Gmbh Method for optical measurement of the three dimensional geometry of objects
WO2009063087A2 (en) * 2007-11-15 2009-05-22 Sirona Dental Systems Gmbh Method for optical measurement of the three dimensional geometry of objects
US8280152B2 (en) 2007-11-15 2012-10-02 Sirona Dental Systems Gmbh Method for optical measurement of the three dimensional geometry of objects
EP2469224A3 (en) * 2007-11-15 2012-08-01 Sirona Dental Systems GmbH Method for intra-oral optical measurement of objects using a triangulation method
WO2009063088A3 (en) * 2007-11-15 2009-09-17 Sirona Dental Systems Gmbh Method for optical measurement of objects using a triangulation method
WO2009063088A2 (en) * 2007-11-15 2009-05-22 Sirona Dental Systems Gmbh Method for optical measurement of objects using a triangulation method
US8160334B2 (en) 2007-11-15 2012-04-17 Sirona Dental Systems Gmbh Method for optical measurement of objects using a triangulation method
JP2011504230A (en) * 2007-11-15 2011-02-03 シロナ・デンタル・システムズ・ゲゼルシャフト・ミット・ベシュレンクテル・ハフツング Optical measurement method of objects using trigonometry
US8456517B2 (en) 2008-07-09 2013-06-04 Primesense Ltd. Integrated processor for 3D mapping
US8462207B2 (en) 2009-02-12 2013-06-11 Primesense Ltd. Depth ranging with Moiré patterns
US8786682B2 (en) 2009-03-05 2014-07-22 Primesense Ltd. Reference image techniques for three-dimensional sensing
US8339616B2 (en) 2009-03-31 2012-12-25 Micrometric Vision Technologies Method and apparatus for high-speed unconstrained three-dimensional digitalization
US8717417B2 (en) 2009-04-16 2014-05-06 Primesense Ltd. Three-dimensional mapping and imaging
US9091536B2 (en) 2009-06-01 2015-07-28 Dentsply International Inc. Method and device for three-dimensional surface detection with a dynamic reference frame
WO2010140059A3 (en) * 2009-06-01 2011-01-27 Gerd Hausler Method and device for three-dimensional surface detection with a dynamic reference frame
US9582889B2 (en) 2009-07-30 2017-02-28 Apple Inc. Depth mapping based on pattern matching and stereoscopic information
US8830227B2 (en) 2009-12-06 2014-09-09 Primesense Ltd. Depth-based gain control
US8982182B2 (en) 2010-03-01 2015-03-17 Apple Inc. Non-uniform spatial resource allocation for depth mapping
US8649025B2 (en) 2010-03-27 2014-02-11 Micrometric Vision Technologies Methods and apparatus for real-time digitization of three-dimensional scenes
US9098931B2 (en) 2010-08-11 2015-08-04 Apple Inc. Scanning projectors and image capture modules for 3D mapping
US9066087B2 (en) 2010-11-19 2015-06-23 Apple Inc. Depth mapping using time-coded illumination
US9167138B2 (en) 2010-12-06 2015-10-20 Apple Inc. Pattern projection and imaging using lens arrays
US9131136B2 (en) 2010-12-06 2015-09-08 Apple Inc. Lens arrays for pattern projection and imaging
US9030528B2 (en) 2011-04-04 2015-05-12 Apple Inc. Multi-zone imaging sensor and lens array
CN102589571A (en) * 2012-01-18 2012-07-18 西安交通大学 Spatial three-dimensional vision-computing verification method
US9651417B2 (en) 2012-02-15 2017-05-16 Apple Inc. Scanning depth engine
US9157790B2 (en) 2012-02-15 2015-10-13 Apple Inc. Integrated optoelectronic modules with transmitter, receiver and beam-combining optics for aligning a beam axis with a collection axis
CN105783770A (en) * 2016-01-22 2016-07-20 西南科技大学 Method for measuring ice shaped contour based on line structured light
CN111630342A (en) * 2018-08-29 2020-09-04 深圳配天智能技术研究院有限公司 Gap detection method and system for visual welding system
CN111630342B (en) * 2018-08-29 2022-04-15 深圳配天智能技术研究院有限公司 Gap detection method and system for visual welding system
CN116664408A (en) * 2023-07-31 2023-08-29 北京朗视仪器股份有限公司 Point cloud up-sampling method and device for color structured light
CN116664408B (en) * 2023-07-31 2023-10-13 北京朗视仪器股份有限公司 Point cloud up-sampling method and device for color structured light

Also Published As

Publication number Publication date
GB9116151D0 (en) 1991-09-11

Similar Documents

Publication Publication Date Title
WO1993003579A1 (en) Three-dimensional vision system
US6611344B1 (en) Apparatus and method to measure three dimensional data
Woodham Analysing images of curved surfaces
Gennery Modelling the environment of an exploring vehicle by means of stereo vision
Blake et al. Trinocular active range-sensing
JPH07509782A (en) Validation method for optical distance measurement of target surfaces in turbulent environments
JP2002509259A (en) Method and apparatus for three-dimensional inspection of electronic components
US6411327B1 (en) Stereo camera system for obtaining a stereo image of an object, and system and method for measuring distance between the stereo camera system and the object using the stereo image
EP1240540B1 (en) Rectified catadioptric stereo sensors
Wolff Surface orientation from two camera stereo with polarizers
CN115239801B (en) Object positioning method and device
TW517268B (en) Three dimensional lead inspection system
JP2009097941A (en) Shape measuring device and surface state measuring device
CN112212845A (en) Two-dimensional coordinate measuring instrument for vertical line
US4213684A (en) System for forming a quadrifid image comprising angularly related fields of view of a three dimensional object
US5317374A (en) Method and apparatus for measuring a distance to an object from dual two dimensional images thereof
CN213120563U (en) Two-dimensional coordinate measuring instrument for vertical line
Yang Shape from darkness under error
JP2006078291A (en) Omnidirectional three-dimensional measuring apparatus
Wang et al. Ccd camera calibration for underwater laser scanning system
JPH0367566B2 (en)
JPH0432567Y2 (en)
US4390271A (en) Process for producing orthographic projection image of complex terrain/targets from an automatic system using two photographs
RU1793221C (en) Method for terrain shaping determining
Jost et al. Color digitizing and modeling of free-form 3D objects

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): CA US

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FR GB GR IT LU NL SE

NENP Non-entry into the national phase

Ref country code: CA