WO2005090905A1 - Optical profilometer apparatus and method - Google Patents

Optical profilometer apparatus and method

Info

Publication number
WO2005090905A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
pattern
image data
light
illuminating arrangement
Prior art date
Application number
PCT/GB2004/001103
Other languages
French (fr)
Inventor
John Edley Wilson
Matthew Gerard Reed
Original Assignee
Spiral Scratch Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Spiral Scratch Limited filed Critical Spiral Scratch Limited
Priority to PCT/GB2004/001103 priority Critical patent/WO2005090905A1/en
Publication of WO2005090905A1 publication Critical patent/WO2005090905A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/55 Depth or shape recovery from multiple images
    • G06T7/571 Depth or shape recovery from multiple images from focus
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/521 Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/207 Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/254 Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/10012 Stereo images

Definitions

  • Figure 9 is a sub-flow diagram for Step 4 for a fully three-dimensional imaging operation.
  • The procedure is as in Step 4 for the angular-composite image, with the additional step of moving the object relative to the camera about the other axis, axis 42.
  • Figure 10 is a sub-flow diagram for Step 6 for the single image, single grid method, Route 1 of the sub-flow diagram of Figure 7.
  • The single image is taken from the general image processing step, Step 5, and the pixel brightness values are read into an image array, on which further signal processing may be carried out if desired.
  • The array dimensions are calculated, as are the length and number of periods of the pattern.
  • The processing may be carried out on a period or pixel basis. On a period basis, the maximum, minimum and mean pixel brightness values are calculated for each period in each line of the array. In pixel-based processing, the pixel phase and amplitude are calculated for each line of the array. Colour is derived from the maximum of the period signal, i.e. where the colour is not affected by the grid pattern.
  • The relative depth of each image portion is calculated from the modulation contrast derived from either of the previous calculations.
  • The actual depth is then calculated from a look-up table obtained in a calibration step, which is simply an imaging operation as just described, compared with actual measurements of the distance of various portions of a test object from the imaging lens; the period-based route is sketched after this list.
  • Figures 14, 15, 16 and 17 are flow charts for exemplary imaging methods selected from the more generalised flow charts of the preceding figures.
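To make the period-based route just described concrete, the following sketch walks one scan line of a captured image: it finds the maximum and minimum brightness in each grating period, converts them to modulation contrast, and turns contrast into depth through a calibration function such as the look-up table mentioned above. It is a minimal illustration only; the grating period in pixels and the calibration function are assumed inputs, not values fixed by the patent.

```python
import numpy as np

def depth_from_scan_line(line, period_px, contrast_to_depth):
    """Period-based depth recovery along a single scan line.

    line              -- 1D array of pixel brightness values
    period_px         -- grating period on the sensor, in whole pixels (assumed)
    contrast_to_depth -- callable mapping modulation contrast to depth,
                         e.g. built from a calibration look-up table
    """
    depths = []
    for start in range(0, line.size - period_px + 1, period_px):
        seg = line[start:start + period_px]
        i_max, i_min = seg.max(), seg.min()
        contrast = (i_max - i_min) / (i_max + i_min + 1e-9)  # modulation contrast
        depths.append(contrast_to_depth(contrast))           # one depth per period
    return np.array(depths)
```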

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Electromagnetism (AREA)
  • Optics & Photonics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A method for making an image of an object including depth information comprising the steps of: illuminating the object with a periodic pattern of light from an illuminating arrangement; the illuminating arrangement being such that the pattern is in focus in a focal plane and defocuses progressively away from said focal plane; the object being placed such that different parts of it are at different distances from the focal plane; capturing image data from the thus-illuminated object; analysing the captured image data to extract depth information based on the extent of defocussing of the pattern; and displaying an image of the object without the pattern and with depth information. Apparatus for carrying out the said method.

Description

OPTICAL PROFILOMETER APPARATUS AND METHOD
This invention relates to making images including depth information, which is to say, primarily, the production of an image of an object which includes information about the distance of parts of the imaged object from the viewer.
Images including depth information include: mask images, produced from a single viewpoint; angular-composite images, produced from two or more viewpoints differing in angular orientation of the object about a single axis; fully three dimensional images, produced from three or more viewpoints differing in angular orientation of the object about at least two orthogonal axes.
A three-dimensional representation of any of those images of, say, a human head, could be, for example, a sculpture, or a rendering in glass or clear plastic of the shape of the head by laser-produced point strains, visible as bright points under illumination. However, a two-dimensional representation of any of those images, for example, one displayed on a video screen, can have image depth information which can be perceived, as by manipulating the image, e.g. by rotation, or if it can be viewed by an arrangement such as a decoding screen, in the case of integral imaging, or by separating two two-dimensional images taken from adjacent vantage points, one into each eye, simulating binocular vision.
The term "depth imaging", as used herein, means the production of an image with depth information, whether or not actually displayed, but at least with the potential of being displayed or used to produce something that can be viewed as a two-dimensional or three-dimensional representation of an object, and includes, therefore, the process of capturing information, including depth information, about the object, and the processing of that information to the point where it can be used to produce an image.
One method for depth imaging, disclosed in US-A-4657394, involves illuminating an object with a beam of light having a sinusoidally varying intensity pattern, produced by a grating. This throws a pattern of parallel light and dark stripes on to the object. When viewed from an offset position, the stripes are deformed. A series of images is formed, using a linear array camera, as the object is rotated. Each image will be different, and from the different images, the position, in three dimensions, of each point on the surface of the object is calculated by triangulation, according to an algorithm programmed into a computer.
Other methods for depth determination using triangulation from multiple images are disclosed in DE-A- 19515949, DE-A-4416108, JP-A-4416108 and US-A-5085502. Such methods involve expensive equipment, are difficult to carry out and take a long time - usually about an hour.
The present invention provides methods that are much faster and which use less expensive equipment, and which, in particular, are capable of being used in connection with personal computers as a desktop depth imaging facility.
The invention comprises a method for making an image of an object including depth information comprising the steps of: • illuminating the object with a periodic pattern of light from an illuminating arrangement; • the illuminating arrangement being such that the pattern is in focus in a focal plane and defocuses progressively away from said focal plane; • the object being placed such that different parts of it are at different distances from the focal plane; • capturing image data from the thus-illuminated object; • analysing the captured image data to extract depth information based on the extent of defocussing of the pattern; and displaying an image of the object without the pattern and with depth information.
The image may be a mask image. The image data may be captured in a single image. The image may be an angular-composite image, and the data may then be captured in at least two mask images differing in the angular orientation of the object about a single axis orthogonal to a line between the object and the illuminating arrangement.
The image may be a 3D image. The image data may then be captured in at least three mask images differing in the angular orientation of the object about at least two axes orthogonal to a line joining the object and the illuminating arrangement.
The object may be placed such that it does not intersect the focal plane, and may be placed such that it is in a region in which the rate of change of defocussing with distance from the illuminating arrangement is greatest, and/or a region in which the rate of change of defocussing with distance from the illuminating arrangement is reasonably constant.
The pattern may be removed from the image by capturing image data corresponding to out-of-phase light patterns on the object and image data from the object illuminated without the pattern. The pattern may be of alternating bright and dark lines; it is desirable that no region of the pattern on the object is completely unilluminated - essentially no information can be gathered from unilluminated regions - and, of course, it is desirable that no substantial part of the object should be totally absorbing.
The pattern may be generated by a grating, which may be of equally spaced light and dark parallel lines.
The concept of projecting an image of a grating onto a 3D object to produce a composite image is known in the field of 3D measurement using structured light. Here the shape of the 3D object deforms the grating in such a way that the shape may be calculated using triangulation methods (for example - WO 00/70303). Such methods require the imaging device to be positioned at an angle to the projection device. In such measurements, the deformation of the grating makes grating removal difficult, as a loss in the periodicity of the grating has occurred. Thus depth is recovered but texture mapping requires an image without the grating present.
The projection of a grid image onto an object is known also in the art of confocal microscopy. Here the grating has only a narrow depth of focus and the presence of the grating image serves to locate the depth of those parts of the object which lie in the same focal plane as the grating image (for example - WO 98/45745). Here the grid is removed by a phase stepping method. In brief, the technique requires at least three phase-stepped composite images and the mathematical treatment is simplified if the phase stepping is set at 120 degrees. A second example (DE 199 30 816) uses a similar phase stepping method; in this case four steps are used at 90-degree intervals. In practice it is possible to perform an approximate phase stepping method using just two steps. In this case parts of the grating image in parts of the composite image may not be removed completely.
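By way of illustration only, the three-step demodulation of the kind used in the structured-illumination microscopy cited above can be written compactly. The sketch below assumes three registered images of the same scene with the projected grating phase-stepped by 120 degrees between captures; it is not taken from the patent itself.

```python
import numpy as np

def three_step_demodulation(i1, i2, i3):
    """Recover a pattern-free (wide-field) image and the local modulation
    amplitude from three images with the grating phase stepped by 120 degrees.

    i1, i2, i3 -- 2D float arrays of identical shape (registered captures)
    """
    widefield = (i1 + i2 + i3) / 3.0              # the grating averages out
    modulation = (np.sqrt(2.0) / 3.0) * np.sqrt(  # local pattern amplitude
        (i1 - i2) ** 2 + (i2 - i3) ** 2 + (i3 - i1) ** 2
    )
    return widefield, modulation
```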
In addition to phase-stepping, correlation methods may be used to subtract a grating image from a composite image. The use of correlation functions in the statistical analysis of signals and images is widespread. The exact nature of the correlation analysis is dependent on the image data available, in particular: 1. knowledge of the form of the grating image, e.g. a sine wave; 2. knowledge of the period and amplitude of the grating image; 3. knowledge of the position of the function in the composite image; 4. knowledge of the wide-field image, i.e. the image in the absence of the grating.
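Where the period of the projected grating is known (case 2 above), one correlation-based possibility is to correlate each scan line against sine and cosine references of that period and take the quadrature sum as the local pattern amplitude. The sketch below is one such estimate under those assumptions, not the specific analysis used by the inventors; the period in pixels is an assumed input.

```python
import numpy as np

def local_modulation_by_correlation(line, period_px):
    """Estimate the local amplitude of a projected grating along one scan line
    by quadrature correlation with sine and cosine references of known period.

    line      -- 1D array of pixel brightness values
    period_px -- grating period on the sensor, in pixels (assumed known)
    """
    n = int(round(period_px))
    x = np.arange(line.size)
    ref_sin = np.sin(2 * np.pi * x / period_px)
    ref_cos = np.cos(2 * np.pi * x / period_px)
    boxcar = np.ones(n) / n                   # one-period moving average
    ac = line - line.mean()                   # crude removal of the wide-field offset
    s = np.convolve(ac * ref_sin, boxcar, mode="same")
    c = np.convolve(ac * ref_cos, boxcar, mode="same")
    return 2.0 * np.hypot(s, c)               # local pattern amplitude
```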
Where both grating and wide field images are known, the grating may be removed completely and depth information may be gained at the pixel level. Where less information is available, it may be necessary to recover depth and texture information at the period level.
The extent of defocussing may be calculated on the basis of the width of a line of the pattern or on the basis of the modulation contrast of the pattern.
The frequency response of a defocused optical system was first described by H.H. Hopkins (Proc. Roy. Soc. A 231, 3, 1955). Here a description is given of the defocus function and its dependence on the image and optical properties. In brief, the distribution of intensity in the image plane is found by integrating the intensity distributions in the diffraction images associated with each point in the object. For a simple object (a lined grating) the defocus function (D) (also termed the optical transfer function and the modulation transfer function) may be calculated analytically and is often expressed in terms of a universal frequency function (s). By definition, s is inversely proportional to the aperture of the lens and proportional to the spacing of the grating. In practice this is seen as fine structure exhibiting only a short depth of focus, whereas small apertures give a large depth of focus. With knowledge of the basic optical parameters, D(s) versus s may be plotted for individual optical systems. The function is seen to display a largely linear region between the values 0.8 and 0.2. This is advantageous when depth distance is to be calculated from the defocus function. A further description of the defocus function is given by P.A. Stokseth (J. Opt. Soc. Am. 59, No. 10, 1314, 1969). Here the defocus function is calculated analytically using both diffraction and geometrical optics theories. In addition, an empirical treatment is given. The defocus function is shown to be asymmetrical either side of the focal plane (sphere), with a longer depth of defocus being observed behind the plane of focus.
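As a concrete statement of the quantity involved, the modulation contrast of the pattern over one period, and the way a depth offset can be read from it within the roughly linear part of the defocus function noted above (D(s) between about 0.8 and 0.2), can be written as below; the slope k is an assumed calibration constant, obtained for example from the look-up table described later, and the linear form is only an approximation valid in that region.

```latex
m = \frac{I_{\max} - I_{\min}}{I_{\max} + I_{\min}},
\qquad
\Delta z \approx \frac{m_{0} - m}{k}
\quad \text{for } 0.2 \lesssim D(s) \lesssim 0.8,
```

where m_0 is the contrast measured at (or extrapolated to) the focal plane and Delta z is the distance of the surface element from that plane.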
The image may be scanned over parallel scan lines, parallel to or angled with respect to the lines of the pattern; the parallel scan lines may be at right angles to the lines of the pattern. The mask image data may comprise pixel image data, which may be analysed on a pixel by pixel basis.
Image capture may be by a line scan camera or by an area scan camera, and may be in monochrome or colour. The captured image data may be analysed to calculate colour information from the brightest parts of the image, namely from the brightness peaks of the pattern.
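As an illustration of taking colour from the brightness peaks of the pattern, the sketch below picks, for each grating period along a colour scan line, the pixel where the projected pattern is brightest and uses its value for that period; the period length in whole pixels is an assumed input.

```python
import numpy as np

def colour_from_peaks(rgb_line, period_px):
    """Take one colour sample per grating period, at the brightest pixel of
    that period, where the projected pattern least disturbs the colour.

    rgb_line  -- array of shape (N, 3), one scan line of colour pixels
    period_px -- grating period in whole pixels (assumed known)
    """
    brightness = rgb_line.sum(axis=1)
    colours = []
    for start in range(0, rgb_line.shape[0] - period_px + 1, period_px):
        peak = start + int(np.argmax(brightness[start:start + period_px]))
        colours.append(rgb_line[peak])
    return np.array(colours)                 # one colour sample per period
```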
Calculated depth information may be adjusted using a calibration, as by a calibration look-up table, which may be generated by comparing calculated with actual depth measurements on a specimen object.
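A calibration of the kind described can be as simple as a monotonic table of (calculated, measured) depth pairs obtained from a specimen object, interpolated at run time. The sketch below assumes such pairs are already available and uses linear interpolation between them; it is illustrative only.

```python
import numpy as np

def build_depth_calibration(calculated, measured):
    """Return a function mapping depths calculated from modulation contrast
    to actual depths, using measurements made on a specimen object."""
    calculated = np.asarray(calculated, dtype=float)
    measured = np.asarray(measured, dtype=float)
    order = np.argsort(calculated)            # np.interp needs ascending x
    return lambda d: np.interp(d, calculated[order], measured[order])

# Usage sketch:
#   calibrate = build_depth_calibration(calculated_depths, measured_depths)
#   depth_mm = calibrate(raw_depth_estimate)
```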
The image may be formatted for display using any preferred display system, such, for example, as a video screen driven by software simulating and manipulating 3D images, or as an integral or multiview image which can be viewed using a decoding screen.
The invention also comprises imaging apparatus for making an image of an object including depth information, comprising: • an illuminating arrangement adapted to illuminate the object with a periodic pattern of light; • the illuminating arrangement being such that the pattern is in focus in a focal plane and defocuses progressively away from said focal plane; • the object being locatable with respect to the illuminating arrangement such that different parts of it are at different distances from the focal plane; • image data capturing means adapted to capture image data from the thus illuminated object; • depth analysis means adapted to analyse captured image data to extract depth information based on the extent of defocussing of the pattern; and • image display means for displaying an image of the object without the pattern and with depth information.
The image data capturing means may capture a mask image, and may comprise a one-dimensional or a two-dimensional array of detectors. Such may comprise a monochrome or colour CCD or CMOS camera. The illuminating arrangement may comprise a light source, focussing means and a grating.
The light source may comprise a source of incoherent light, such as an incandescent filament lamp, a quartz-halogen lamp, a fluorescent lamp or a light-emitting diode. The light source may, however, be a source of coherent light, such as a laser.
The focussing means may comprise a lens or a mirror, and may comprise a cylindrical, spherical or parabolic focussing arrangement. The imaging apparatus may comprise a support for an object to be imaged. The support may also support the illuminating arrangement in such relationship that the object is supported so that the focal plane does not intersect the object, and desirably in a region in which the rate of change of defocussing with distance from the illuminating arrangement is reasonably constant.
The support may also permit relative adjustment between the object and the illuminating arrangement, and may comprise a turntable.
The apparatus may also comprise means adapted to vary the periodic pattern of light, which may comprise means adapted to alter the orientation of a grating producing a periodic pattern of light.
The image display means may comprise a video screen driven by software capable of simulating and manipulating a 3D image. Embodiments of imaging apparatus and methods of imaging according to the invention will now be described with reference to the accompanying drawings, in which:
Figure 1 shows (a) a mask image view of an object O from a single viewpoint; (b) a peripheral view such as will, when integrated, give rise to an angular-composite image; and (c) a fully three-dimensional view in which the object is rotated with respect to the viewer about two orthogonal axes;
Figure 2 illustrates the underlying principle of progressive defocussing with depth;
Figure 3 is a view of a first embodiment of apparatus, for mask or angular-composite imaging;
Figure 4 is a view of a second embodiment of apparatus, for fully three-dimensional imaging;
Figure 5 illustrates four embodiments (a) - (d) of an illuminating arrangement;
Figure 6 is a flow diagram showing an overview of the imaging method;
Figure 7 is a flow diagram showing in detail one embodiment of one step in the flow diagram of Figure 6;
Figure 8 is a flow diagram showing in detail another embodiment of the step of Figure 7;
Figure 9 is a flow diagram showing in detail yet another embodiment of the step of Figure 7;
Figure 10 is a flow diagram showing in detail one embodiment of another step in the flow diagram of Figure 6;
Figure 11 is a flow diagram showing in detail another embodiment of the step of Figure 10;
Figure 12 is a flow diagram showing in detail yet another embodiment of the step of Figure 10;
Figure 13 is a flow diagram showing a generalisation of the detail of Figure 12;
Figure 14 is a flow diagram showing one complete measurement method;
Figure 15 is a flow diagram showing another complete measurement method;
Figure 16 is a flow diagram showing another complete measurement method; and Figure 17 is a flow diagram showing a fourth complete measurement method.
The drawings illustrate an imaging apparatus for making an image of an object O including depth information, comprising: • an illuminating arrangement 11 adapted to illuminate the object O with a periodic pattern 12 of light; • the illuminating arrangement 11 being such that the pattern 12 is in focus in a focal plane 13 and defocuses progressively away from said focal plane 13; • the object O being locatable with respect to the illuminating arrangement 11 such that different parts of it are at different distances from the focal plane 13; • image data capturing means 14 adapted to capture image data from the thus illuminated object O; depth analysis means 15 adapted to analyse captured image data to extract depth information based on the extent of defocussing of the pattern 12; and image display means 16 for displaying an image 17 of the object O without the pattern 12 and with depth information.
Figure 1 illustrates three different methods of imaging that can yield depth information about an object O. In Figure 1(a), the object is viewed from a single viewpoint. This is not usually conducive to capturing depth information, but, using the present invention, depth information can be extracted from such a view. An image thus formed is termed a mask image. In Figure 1(b), the object O is viewed from more than one viewpoint. In human binocular vision, and in binocular or multiview photography, depth information is gleaned from differences in the images. In integral imaging, a single viewpoint is apparently used, but a wide 'taking' aperture and integral optics afford many different viewpoints within the taking aperture. While such measures will serve to give depth information which can make an image appear to be three-dimensional, this will only apply to such regions of the object as are visible from the viewing position or positions. In order to acquire information about the back of the object, it is necessary to view from at least two, preferably more, different directions. Such an image taken from two or more viewpoints as the object is rotated relative to a single taking position is termed an angular-composite image. If the top and bottom of the object are to be imaged, it is necessary to have further viewpoints, with the object rotated, relative to the taking position, about two axes A, B each orthogonal to a line X joining the object O and the viewing position P, as shown in Figure 1(c). An image incorporating such information can be termed a fully three dimensional image. By and large, objects stand on the ground or a base, and so an underview is unnecessary, and sufficient information can be gleaned from an angular-composite image, which corresponds to human binocular vision, but which can contain more information if the back of the object is taken into account.
Using methods as herein described, simple mask images, angular-composite images and fully three dimensional images can be made, each with depth information sufficient to produce a final image with the appearance of depth. Figure 2 illustrates the underlying principle. A light source L casts a pattern of light and dark lines from a grating M1 by means of a lens F1. The pattern is in focus at a focal position f distant d from the lens F1. Were the pattern to be cast on a screen closer than the distance d, the pattern would be out of focus, and is shown diagrammatically as being more out of focus the closer the screen approaches the lens F1. Contrast between the light and dark lines of the pattern is greatest at the focal distance d, and falls off towards the lens F1. The measured modulation depth of the pattern gives an indication of the distance of the screen from the focal position f.
If, instead of a flat screen, the pattern falls on a shaped object, the pattern will be more or less out of focus at different positions on the object, and the modulation depth would be correspondingly different. The distance of each point of the object from the focal position can be calculated as a function of the measured modulation depth at that point. This will be termed "structured modulation imaging" (SMI). The method differs from triangulation methods, in that imaging and viewing can take place from a single position, and the pattern defocuses over the depth of the object, whereas in triangulation, sharp focus over the whole object is preferred.
The modulation depth as a function of distance from a focal plane of a lens system is discussed in WO-A-98/45745 and DE 199 30 816 A1.
In those publications, which are concerned with microscopy, it is taught that the grid may be displaced so that the pattern moves into discrete positions across the object displaced by fractions of the grating constant, and an image of the pattern's projection on the object is recorded for each position of the grating. Only the in-focus parts of each image are used; they are assembled into a single image. The modulation depth information is used to remove the pattern from the image mathematically.
In contrast, the method according to the invention is concerned with macroscopic imaging, and does not depend on such displacement of the grid.
The method comprises the steps of:
• illuminating the object O with a periodic pattern 12 of light from an illuminating arrangement 11 ; • the illuminating arrangement 11 being such that the pattern 12 is in focus in a focal plane 13 and defocuses progressively away from said focal plane 13; • the object O being placed such that different parts of it are at different distances from the focal plane 13; • capturing image data from the thus-illuminated object O; • analysing the captured image data to extract depth information based on the extent of defocussing of the pattern 12; and displaying an image 17 of the object without the pattern 12 and with depth information.
The image may be a mask image, in which the captured image data are captured in a single image, or it may be an angular-composite image, in which the image data are captured in at least two mask images differing in the angular orientation of the object O about a single axis orthogonal to a line between the object O and the illuminating arrangement 11. Or the image may be a 3D image, in which the image data are captured in at least three mask images differing in the angular orientation of the object about at least two axes orthogonal to a line joining the object O and the illuminating arrangement 11. The method will be described in these three aspects with reference to the flow diagrams of Figures 6 to 17, and Figures 3, 4 and 5.
Figure 3 shows apparatus for carrying out mask or angular-composite imaging, comprising an illuminating arrangement 11, and a turntable 31 on which the object O is placed. The turntable 31 is rotated by an electric motor 32 about an axis 33 which is orthogonal to the optical axis 34 of the illuminating arrangement 11. The motor 32 is controlled by a computer 35 to rotate the turntable stepwise through selected angular amounts. Figure 4 shows apparatus for carrying out fully three-dimensional imaging, as well, of course, as mask and angular-composite imaging. Similar to the embodiment of Figure 3, it has, however, a support 41 on the turntable supporting the object O on an axis 42 about which it can be rotated by a second electric motor 43, also controlled by the computer 35, also in desired angular steps.
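The control loop implied by Figures 3 and 4 is simple to state; the sketch below steps the turntable through a chosen number of positions and captures an image at each, repeating with the object flipped about the second axis when fully three-dimensional data are wanted. The turntable, flip_axis and camera objects and their method names are hypothetical placeholders, not part of the disclosure.

```python
def capture_rotation_series(camera, turntable, steps_per_rotation, flip_axis=None):
    """Capture one image per turntable step about axis 33; optionally repeat
    with the object turned 90 degrees about axis 42 for fully 3D imaging.
    All device objects and method names here are hypothetical placeholders.
    """
    images = []
    tilts = (None,) if flip_axis is None else (0.0, 90.0)
    for tilt in tilts:
        if tilt is not None:
            flip_axis.move_to(tilt)                              # axis 42
        for step in range(steps_per_rotation):
            turntable.move_to(step * 360.0 / steps_per_rotation) # axis 33
            images.append(camera.capture())
    return images
```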
In the apparatus of both Figure 3 and Figure 4 is an image capture arrangement 36, which may comprise an area scan or a line scan digital camera arrangement. A keyboard 37 is used to input instructions into the computer 35, and a VDU 38 displays the image. Figure 5 shows four different embodiments of the illuminating arrangement 11. Figure 5(a) shows a light source L such as an incandescent filament lamp, illuminating a parallel line grating M1 with a focussing arrangement F1, such as a convex lens forming a virtual image of the grating in a focal plane P. The grating M1 can be mounted on a carriage (not shown), which would also be controlled by the computer 35 of Figures 3 or 4, to move in the direction of Arrow A perpendicular to the rulings of the grating M1.
Figure 5(b) shows a slit D interposed between the grating M1 and focussing means F1 of Figure 5(a). The grating M1 can be moved, again by the carriage, not shown, angularly with respect to the slit D and also perpendicularly to the lines of the grating M1, Arrows B and A. These movements alter the spatial frequency of the illumination pattern, allowing altered modulation contrast characteristics for a fixed focussing means F1.
Figure 5(c) shows a helical grating M3 and a slit D placed between the light source L and the focussing means F1. The light source L here can be a fluorescent tube. Rotation of the helical grating about its axis moves the pattern projected on the object O.
Figure 5(d) shows a collimated, controlled intensity light source L projecting on to a scanning mirror 51 which, at any one position, projects a strip of illumination on to the object O. If the intensity of the light source is synchronised with the scan, any desired light intensity pattern can be displayed on the object O.
Figure 6 is a flow diagram generic to all methods for forming and displaying images with depth information. To begin the process at Step 1, the object O is placed in the apparatus, on the turntable 31, and illuminated with whichever pattern is desired for the image in question.
The object can be of any shape, size (so long as it fits into the apparatus) and colour, the only limitation being that it must reflect light at least to some extent, so it cannot be black or totally absorbing over its entire surface. It should also preferably not be totally transparent. Objects with black regions or of glass or transparent plastics materials will give poor depth resolution. Objects up to 150mm long can be imaged in an apparatus with a paper size A4 footprint, which will conveniently fit on a desktop. The software provides at Step 2 an option to customise the measurement parameters and set the customised parameters before capturing the image in the camera 36. Such customisation can include selection of: • colour, monochrome or sepia • grid defocus over radius or diameter of turntable • grid frequency • lamp intensity • colour and polarising filters • camera lens aperture setting • automatic gain control (AGC) on camera • gamma setting on camera • brightness on camera • contrast on camera • use of RGB channels separately or combined in depth calculation • number of pixels, horizontal and vertical, used on camera • number of steps per rotation (for angular-composite and 3D images) • number of rotations of turntable • number of steps per period, i.e. how many grids are to be used in the algorithm • grid divergence corrections • averaging algorithms, and at which stage in the calculations they are used • smoothing algorithms, and at which stage in the calculations they are used • texture map algorithm • geometry transformation algorithm • 3D viewer. After the image is captured at Step 3, it is subjected, at Step 5, to general image processing, involving, for example, the use of smoothing algorithms and cut and reassembly operations.
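The customisable measurement parameters listed above map naturally onto a small settings record such as the sketch below, which groups a representative subset; the default values are placeholders chosen for illustration and are not prescribed anywhere in the text.

```python
from dataclasses import dataclass

@dataclass
class MeasurementSettings:
    """A representative subset of the Step 2 customisation options.
    Defaults are illustrative placeholders only."""
    colour_mode: str = "colour"        # "colour", "monochrome" or "sepia"
    grid_frequency: float = 2.0        # grating frequency, lines per mm (assumed unit)
    lamp_intensity: float = 1.0        # relative lamp drive level
    lens_aperture_f: float = 8.0       # camera lens aperture setting
    use_agc: bool = False              # automatic gain control on camera
    pixels_h: int = 640                # horizontal pixels used on camera
    pixels_v: int = 480                # vertical pixels used on camera
    steps_per_rotation: int = 36       # for angular-composite and 3D images
    turntable_rotations: int = 1       # number of rotations of turntable
    steps_per_period: int = 3          # how many grid positions per period
```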
The processed image is then further processed at Step 6 to extract the depth information. This will be dealt with in detail below.
The image information yielded by Step 6 is then further processed at Step 7 to add colour and/or texture, as will, again, be further discussed below. At Step 8, geometrical mapping is performed, which might involve changing the coordinate system from Cartesian coordinates, in which the initial measurement might have been made, to cylindrical coordinates, in which the final image might be displayed.
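A minimal sketch of such a mapping, assuming (purely for illustration) that the turntable axis is vertical and passes through an assumed position (axis_x, axis_z), and that y is the vertical height:

    import numpy as np

    def cartesian_to_cylindrical(x, y, z, axis_x=0.0, axis_z=0.0):
        """Re-express Cartesian points as cylindrical (r, theta, height) about a
        vertical axis through (axis_x, axis_z); y is taken as the vertical height."""
        dx = np.asarray(x, dtype=np.float64) - axis_x
        dz = np.asarray(z, dtype=np.float64) - axis_z
        r = np.hypot(dx, dz)                   # radial distance from the axis
        theta = np.arctan2(dz, dx)             # angle about the axis, in radians
        return r, theta, np.asarray(y, dtype=np.float64)

    r, theta, h = cartesian_to_cylindrical([10.0, 0.0], [5.0, 5.0], [0.0, 10.0])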
Finally, at Step 9, the image is displayed on whatever display arrangement has been selected to display it. This might be a computer monitor screen, which will, of course, display only a 2D image, but such an image can be manipulated, by rotating it, for example, to show it from different aspects, and even show the back of the imaged object. Or it might be a monitor screen with a decoding screen, the image on the screen having been processed into the format of an integral image such that, viewed through the decoding screen, the image appears to have depth appropriate to binocular vision. Or the image information might be used to generate a true 3D set of coordinates used to drive a laser to write a 3D image in a glass or transparent plastic block.
In Step 4, as seen in Figure 6, the object is moved, unless a single mask image is to be made. The movement will be, in the case of an angular-composite image, a rotation about the axis 33 of the turntable. In this case, the illumination, and the image, will be of a vertical strip, as seen in Figure 5(d), and the turntable will be stepped around so that the entire object (or so much of it as may be desired to image) is imaged in vertical strips. Such strips are 'welded' together in the general image processing step, Step 5. If a fully 3D image is required, rotation about the axis 42 of the turntable 31 is also effected. Possibly, the object O is first imaged as an angular-composite image when it is the right way up, then it is flipped through 90° about axis 42 and another set of images made.
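As a minimal sketch of the 'welding' operation, assuming each strip arrives as a narrow column of pixels captured at a known turntable step (the strip width and step count below are assumptions):

    import numpy as np

    def weld_strips(strips):
        """Place vertical strip images side by side, in turntable-step order,
        to build one angular-composite image; each strip is a 2-D array of
        pixel values sharing the same pixel height."""
        if len({s.shape[0] for s in strips}) != 1:
            raise ValueError("all strips must have the same pixel height")
        return np.hstack(strips)

    # e.g. 360 one-degree steps, each captured as a 480 x 4 pixel strip,
    # give a 480 x 1440 angular-composite image
    composite = weld_strips([np.zeros((480, 4)) for _ in range(360)])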
Figure 7 is a sub-flow diagram of the operation of making a mask image, i.e. one made as from a single viewpoint without rotation of the object. The whole of the object area facing the imaging apparatus is illuminated with the pattern.
There are four possible routes through this sub-flow diagram. Route 1 is the simplest. First, the image is captured; this may be repeated one or more times, to gain better resolution from averaging multiple images. The single, or averaged, image is then sent straight to Step 5 for general image processing. The image will, of course, contain depth information, in the form of the extent of defocussing of the pattern at different locations on the image, manifest as modulation contrast. In the subsequent image processing, this information is extracted and the pattern removed by appropriate algorithms.
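A minimal sketch of the repeated-capture averaging, assuming a hypothetical capture_frame() callable that returns one frame as a 2-D array (the camera interface itself is not specified here):

    import numpy as np

    def averaged_capture(capture_frame, repeats=4):
        """Average several captures of the same scene pixel-wise to reduce
        random noise before depth information is extracted."""
        frames = [np.asarray(capture_frame(), dtype=np.float64) for _ in range(repeats)]
        return np.mean(frames, axis=0)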
On Route 2, a first image is made with the grid pattern in place, then a second image is made with the grid moved out of the way. Both first and second images, of course, may be made more than once and averaged. Both images are sent for further processing, depth information being extracted from the first image and transferred to the second image, which does not, of course, have the pattern, so there is now no need of a pattern removal operation.
On Route 3, the grid, and on Route 4, the object (which amounts to the same thing), is moved a known fraction of a grid period, and a second image taken (a minimal sketch of decoding such a pair is given below). These two images are then sent for processing to extract depth information and remove the pattern for the final image processing steps.
Figure 8 is a sub-flow diagram for Step 4 for an angular-composite image. A first image is captured and, if desired, as before, one or more repeat captures made. The object is rotated a known angular extent, and another image is made. This is repeated until the whole object, or such part of it as is required, has been imaged in vertical strips, as explained above. A composite image is built up from the multiple strip images at the general image processing step, Step 5. In this operation, the pattern may be shifted, either to take it away completely, or to move it, or the object, a fraction of a grid period, as before, for each strip image.
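The decoding of such a pair of images could, for instance, run as follows; the half-period shift, the scan lines crossing the grid lines along the second array axis, and the per-period peak estimate are all assumptions of this sketch, not prescriptions of the method:

    import numpy as np

    def two_step_decode(img_a, img_b, period_px):
        """Combine two images whose projected grid is shifted by half a period
        between captures: the sum cancels the bright/dark lines, while the
        per-period peak of half the difference estimates the local modulation,
        which falls off as the grid defocuses and so carries depth information."""
        a = np.asarray(img_a, dtype=np.float64)
        b = np.asarray(img_b, dtype=np.float64)
        pattern_free = (a + b) / 2.0
        residual = np.abs(a - b) / 2.0

        h, w = residual.shape
        n = w // period_px                      # whole grid periods per scan line
        res_blocks = residual[:, :n * period_px].reshape(h, n, period_px)
        mean_blocks = pattern_free[:, :n * period_px].reshape(h, n, period_px)
        # modulation contrast per period: amplitude / local mean brightness
        modulation = res_blocks.max(axis=2) / np.maximum(mean_blocks.mean(axis=2), 1e-9)
        return pattern_free, modulation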
Figure 9 is a sub-flow diagram for Step 4 for a fully three-dimensional imaging operation. The procedure is as in Step 4 for the angular-composite image, with the additional step of moving the object relative to the camera about the other axis, axis 42.
Figure 10 is a sub-flow diagram for Step 6 for the single image, single grid method, Route 1 of the sub-flow diagram of Figure 7. The single image is taken from the general image processing step, Step 5, and the pixel brightness values are read into an image array, on which further signal processing may be carried out if desired. The array dimensions are calculated, and the length and number of periods of the pattern are calculated. The processing may be carried out on a period or a pixel basis. On a period basis, the maximum, minimum and mean pixel brightness values are calculated for each period in each line of the array. In pixel-based processing, the pixel phase and amplitude are calculated for each line of the array. Colour is derived from the maximum of the period signal, i.e. where the colour is not affected by the grid pattern. The relative depth of each image portion is calculated from the modulation contrast derived from either of the previous calculations. The actual depth is then calculated from a look-up table obtained in a calibration step, which is simply an imaging operation as just described, compared with actual measurements of the distance of various portions of a test object from the imaging lens.
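A minimal sketch of the period-based route, assuming the grid lines run vertically so each scan line (image row) crosses them, and assuming a hypothetical calibration look-up table of contrast values (sorted ascending) against measured distances:

    import numpy as np

    def depth_from_modulation(image, period_px, lut_contrast, lut_depth_mm):
        """For each whole grid period on each scan line, compute the modulation
        contrast (max - min) / (max + min) and convert it to an actual depth by
        interpolating a calibration look-up table."""
        img = np.asarray(image, dtype=np.float64)
        h, w = img.shape
        n = w // period_px                              # whole periods per line
        blocks = img[:, :n * period_px].reshape(h, n, period_px)
        pmax = blocks.max(axis=2)                       # per-period maximum (least affected by the grid)
        pmin = blocks.min(axis=2)                       # per-period minimum
        contrast = (pmax - pmin) / np.maximum(pmax + pmin, 1e-9)
        depth_mm = np.interp(contrast, lut_contrast, lut_depth_mm)
        return contrast, depth_mm

    # Hypothetical calibration table: in the working region beyond the focal
    # plane, higher contrast corresponds to a shorter distance from the lens.
    lut_contrast = np.array([0.05, 0.2, 0.5, 0.9])
    lut_depth_mm = np.array([80.0, 55.0, 30.0, 10.0])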
Where more than one image, and/or more than one grid position, are involved, these calculations are made for each image and grid position, as will be seen from the sub-flow diagrams for Step 6 shown in Figures 11, 12 and 13. Figure 13 has an option to use single-grid or n-grid depth extraction algorithms.
Figures 14, 15, 16 and 17 are flow charts for exemplary imaging methods selected from the more generalised flow charts of the preceding figures.
Many variations are possible within the context of the invention. Different methods may be used for illuminating the object, including filament lamps, fluorescent lamps, lasers and so on. It is possible to use single-wavelength light, or even infrared or ultraviolet light, if colour is not required and appropriate imaging devices are used. Instead of a 'mechanical' grating, an electronic grating can be used, which can be controlled as to frequency and position. And different arrangements may be used for displaying and manipulating the final image, including a laser writing arrangement to a glass or plastic block, or a computer-assisted manufacturing arrangement, which may involve spark erosion or other shaping technology, for rapid prototyping.

Claims

1 A method for making an image of an object including depth information comprising the steps of: • illuminating the object with a periodic pattern of light from an illuminating arrangement; • the illuminating arrangement being such that the pattern is in focus in a focal plane and defocuses progressively away from said focal plane; • the object being placed such that different parts of it are at different distances from the focal plane; • capturing image data from the thus-illuminated object; • analysing the captured image data to extract depth information based on the extent of defocussing of the pattern; and • displaying an image of the object without the pattern and with depth information.
2 A method according to claim 1, in which the image is a mask image.
3 A method according to claim 2, in which the captured image data are captured in a single image.
4 A method according to claim 1, in which the image is an angular-composite image.
5 A method according to claim 4, in which the image data are captured in at least two mask images differing in the angular orientation of the object about a single axis orthogonal to a line between the object and the illuminating arrangement.
6 A method according to claim 1, in which the image is a 3D image.
7 A method according to claim 6, in which the image data are captured in at least three mask images differing in the angular orientation of the object about at least two axes orthogonal to a line joining the object and the illuminating arrangement.
8 A method according to any one of claims 1 to 7, in which the object is placed such that it does not intersect the focal plane.
9 A method according to claim 8, in which the object is placed such that it is in a region in which the rate of change of defocusing with distance from the illuminating arrangement is greatest.
10 A method according to claim 8 or claim 9, in which the object is placed such that it is in a region in which the rate of change of defocusing with distance from the illuminating arrangement is reasonably constant.
11 A method according to any one of claims 1 to 10, in which the pattern is removed from the image by capturing image data corresponding to out-of-phase light patterns on the object and image data from the object illuminated without the pattern.
12 A method according to any one of claims 1 to 11, in which the pattern is of alternating bright and dark lines.
13 A method according to claim 12, in which no region of the pattern on the object is completely unilluminated.
14 A method according to any one of claims 1 to 13, in which the pattern is generated by a grating.
15 A method according to claim 14, in which the grating is of equally spaced light and dark parallel lines.
16 A method according to any one of claims 1 to 15, in which the extent of defocussing is calculated on the basis of the width of a line of the pattern.
17 A method according to any one of claims 1 to 16, in which the extent of defocussing is calculated on the basis of the modulation contrast of the pattern.
18 A method according to any one of claims 1 to 17, in which the image is scanned over parallel scan lines angled with respect to the lines of the pattern.
19 A method according to claim 18, in which the parallel scan lines are at right angles to the parallel lines of the pattern.
20 A method according to any one of claims 1 to 19, in which the mask image data comprise pixel image data.
21 A method according to claim 20, in which the image data are analysed on a pixel-by-pixel basis.
22 A method according to any one of claims 1 to 21, in which image capture is by a line scan camera.
23 A method according to any one of claims 1 to 21, in which image capture is by an area scan camera.
24 A method according to any one of claims 1 to 23, in which image capture is in colour.
25 A method according to claim 24, in which the captured image data is analysed to calculate colour from the brightest parts of the image.
26 A method according to any one of claims 1 to 25, in which calculated depth information is adjusted using a calibration.
27 A method according to claim 26, in which the adjustment is effected using a calibration look-up table.
28 A method according to any one of claims 1 to 27, in which the image is formatted for display using a preferred display system.
29 Imaging apparatus for making an image of an object including depth information, comprising: • an illuminating arrangement adapted to illuminate the object with a periodic pattern of light; • the illuminating arrangement being such that the pattern is in focus in a focal plane and defocuses progressively away from said focal plane; • the object being locatable with respect to the illuminating arrangement such that different parts of it are at different distances from the focal plane; • image data capturing means adapted to capture image data from the thus illuminated object; • data analysis means adapted to analyse captured image data to extract depth information based on the extent of defocussing of the pattern; and • image display means for displaying an image of the object without the pattern and with depth information.
30 Apparatus according to claim 29, in which the image data capturing means capture a mask image.
31 Apparatus according to claim 29 or claim 30, in which the image data capturing means comprise a one-dimensional array of detectors.
32 Apparatus according to claim 29 or claim 30, in which the image data capturing means comprise a two-dimensional array of detectors.
33 Apparatus according to claim 31 or claim 32, being a monochrome camera.
34 Apparatus according to claim 31 or claim 32, being a colour camera.
35 Apparatus according to claim 33 or claim 34, being a CCD camera.
36 Apparatus according to claim 33 or claim 34, being a CMOS camera.
37 Apparatus according to any one of claims 29 to 36, in which the illuminating arrangement comprises a light source, focussing means and a grating.
38 Apparatus according to claim 37, in which the light source comprises a source of incoherent light.
39 Apparatus according to claim 38, in which the light source comprises an incandescent filament lamp.
40 Apparatus according to claim 38, in which the light source comprises a quartz- halogen lamp.
41 Apparatus according to claim 38, in which the light source comprises a fluorescent lamp.
42 Apparatus according to claim 38, in which the light source comprises a light emitting diode.
43 Apparatus according to claim 37, in which the light source is a source of coherent light.
44 Apparatus according to claim 43, in which the light source comprises a laser.
45 Apparatus according to any one of claims 37 to 44, in which the focussing means comprise a lens.
46 Apparatus according to any one of claims 37 to 45, in which the focussing means comprise a mirror.
47 Apparatus according to claim 45 or claim 46, in which the focussing means comprise a cylindrical focussing arrangement.
48 Apparatus according to claim 45 or claim 46, in which the focussing means comprise a spherical or parabolic focussing arrangement.
49 Apparatus according to any one of claims 29 to 48, comprising a support for an object to be imaged.
50 Apparatus according to claim 49, the support also supporting the illuminating arrangement in such relationship that the object is supported so that the focal plane does not intersect the object.
51 Apparatus according to claim 49 or claim 50, the support also supporting the illuminating arrangement in such relationship that the object is in a region in which the rate of change of defocussing with distance from the illuminating arrangement is reasonably constant.
52 Apparatus according to any one of claims 49 to 51, in which the support permits relative adjustment between the object and the illuminating arrangement.
53 Apparatus according to any one of claims 49 to 52, in which the support comprises a turntable.
54 Apparatus according to any one of claims 29 to 53, comprising means adapted to vary the periodic pattern of light.
55 Apparatus according to claim 54, comprising means adapted to alter the orientation of a grating producing a periodic pattern of light.
56 Apparatus according to any one of claims 29 to 55, in which the image display means comprise a video screen driven by software capable of simulating and manipulating a 3D image.
57 Apparatus according to any one of claims 29 to 56, substantially as hereinbefore described with reference to any one or more of the accompanying drawings.
PCT/GB2004/001103 2004-03-16 2004-03-16 Optical profilometer apparatus and method WO2005090905A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/GB2004/001103 WO2005090905A1 (en) 2004-03-16 2004-03-16 Optical profilometer apparatus and method

Publications (1)

Publication Number Publication Date
WO2005090905A1 true WO2005090905A1 (en) 2005-09-29

Family

ID=34957095

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2004/001103 WO2005090905A1 (en) 2004-03-16 2004-03-16 Optical profilometer apparatus and method

Country Status (1)

Country Link
WO (1) WO2005090905A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2202994A1 (en) * 2008-12-23 2010-06-30 Sick Ag 3D camera for monitoring an area
GB2476738B (en) * 2007-07-18 2012-08-15 Iatia Imaging Pty Ltd Method and apparatus for determining the surface profile of an object
DE102010030833B4 (en) * 2009-07-03 2014-07-03 Koh Young Technology Inc. Device for measuring a three-dimensional shape
CN105403170A (en) * 2015-12-11 2016-03-16 华侨大学 Microscopic 3D morphology measurement method and apparatus
CN109445229A (en) * 2018-12-12 2019-03-08 华中科技大学 A method of obtaining the zoom camera focal length containing first order radial distortion

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5003166A (en) * 1989-11-07 1991-03-26 Massachusetts Institute Of Technology Multidimensional range mapping with pattern projection and cross correlation
US5878152A (en) * 1997-05-21 1999-03-02 Cognex Corporation Depth from focal gradient analysis using object texture removal by albedo normalization
US6229913B1 (en) * 1995-06-07 2001-05-08 The Trustees Of Columbia University In The City Of New York Apparatus and methods for determining the three-dimensional shape of an object using active illumination and relative blurring in two-images due to defocus

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Country of ref document: DE

122 Ep: pct application non-entry in european phase