WO1996006325A1 - Scanning arrangement and method - Google Patents
Scanning arrangement and method
- Publication number: WO1996006325A1 (PCT/GB1995/001994)
- Authority: WIPO (PCT)
Classifications
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
- G01B11/08—Measuring arrangements characterised by the use of optical techniques for measuring diameters
- G01C3/08—Use of electric radiation detectors (optical rangefinders)
- G01S17/42—Simultaneous measurement of distance and other co-ordinates
- G06T7/521—Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
- G01S7/4817—Constructional features, e.g. arrangements of optical elements relating to scanning
Definitions
- the present invention relates to a scanning arrangement and method for determining the shape, size or other three-dimensional surface characteristics of an object, such as colour for example.
- the Schulz system requires an externally generated coordinate system defined by an array of photodetectors which detects an array of pilot lights on the scanner. Hence the scanner can only be used within this coordinate system and furthermore the array of pilot lights must be kept in view by the array of photodetectors, which further restricts the mobility of the scanner. This is a serious disadvantage because it is normally necessary to scan the object of interest from all sides in order to build up a complete picture of its surface.
- An object of the present invention is to provide a scanning arrangement and method in which the scanner can be moved freely without reference to the position of its mounting.
- the invention provides a scanning arrangement for determining the shape or other three-dimensional surface characteristics of an object, the arrangement comprising a scanning device which is freely movable relative to said object, the device comprising: a) an optical projector for projecting a predetermined pattern onto a region of the surface of the object, and b) an optical detector for detecting the coordinates or other surface characteristics of said region and for generating output signals representative of such coordinates or other surface characteristics, the arrangement further including: c) processing means coupled to said detector for generating a set of output data representing said surface characteristics of a scanned portion of the surface of said object, d) combining means coupled to said processing means for combining sets of such output data derived from overlapping scans of said surface into a common set of output data by appropriate rotations and translations, said combining means optionally including further processing means for calculating said rotations and translations from subsets of respective sets of such output data
- sensing means for detecting movement of said scanning device relative to said object and generating output signals representative of such movement
- correcting means for correcting for movement of said scanning device relative to said object between successive scans, said correcting means being responsive to at least one of:
- the scanning device carries inertial sensing means for sensing its linear acceleration and rotation sensing means for sensing its rate of rotation
- the correcting means effectively consists of the combining means, which in this embodiment comprises further processing means for determining the translations and/or rotations required to combine the data from successive scans.
- the method comprising: i) projecting from the scanning device a predetermined optical pattern onto a region of the surface of the object, ii) optically detecting the coordinates or other surface characteristics of said region with an optical detector mounted on said scanning device, iii) deriving a set of output data representing said coordinates or other surface characteristics of said region, iv) repeatedly scanning said optical pattern over said object in overlapping fashion and deriving further sets of output data from overlapping scans, and
- Figure 1 is a schematic perspective view of one scanning device (utilising a linear scanning pattern) for use in a scanning arrangement in accordance with the invention;
- Figure 2 is a block diagram of the scanning arrangement utilising the scanning device of Figure 1;
- Figure 3 is a plan view showing the optics of another scanning device (utilising a two-dimensional scanning pattern) for use in a scanning arrangement in accordance with the invention;
- Figure 4 is a flow diagram showing the image processing carried out by the arrangement of Figure 2;
- Figure 5 is a sketch perspective view illustrating the projection of overlapping scanning patterns by the scanning device of Figure 3, and
- Figure 6 is a sketch perspective view illustrating a method of registering overlapping scanned surface portions.
- the generally T-shaped scanning device 10 shown schematically in Figure 1 is hand held and comprises a projector which generates a vertical fan-shaped beam of light 3 onto an object 1 of unknown shape and a lens 33 which collects diffusely reflected light.
- the beam 3 is generated by a laser 29 (which is preferably a semiconductor laser with beam-shaping optics but may be a He-Ne laser for example) and a cylindrical lens.
- the scanning device can be generally C-shaped, having the beam generator in one limb, the photodetector in the other and a grip between the two limbs, and can be used with the grip approximately vertical.
- the photodetector 34 is inclined in the horizontal (z-x) plane at an angle to the x-axis.
- the scanning device 10 is used to sweep the stripe 13 over the entire surface of object 1. During this process, the changes in angular orientation of the scanning device are detected by a vibratory gyroscope 51 which generates output signals.
- Miniature vibratory gyroscopes are commercially available at low cost and rely on Coriolis forces set up in a vibrating shell or other structure.
- the acceleration of the scanning device along the x, y and z axes is detected by an accelerometer 50 which generates acceleration signals ax, ay and az respectively.
- the sensitive axes of the gyroscope 51 are close to the centre of photodetector 34 in order to simplify subsequent computation but this is not essential.
- the scanning device also includes a trigger switch T which is operable by the user to start and stop the acquisition of data from the photodetector.
- This trigger switch T outputs control signals c.
- These output signals may be conveyed to the circuitry by a flexible cable or wireless link for example or may for example be digitised and stored in local memory such as a high capacity miniature hard disc (not shown) in the scanning device, enabling the scanning device to be used remotely from its processing circuitry and enabling the processing to be carried out after scanning has been completed.
- At least some of the circuitry of Figure 2 may be located in the scanning device.
- the generation and/or acquisition of data from the photodetector 34 is gated by signals from the gyroscope 51 and/or accelerometer 50 which prevent the determination of position and/or attitude of the scanner under conditions in which the angular rotation or acceleration are too great or too small (e.g. as a result of noise or drift) to enable accurate calculations to be made.
- the accelerometer is shown as three blocks 50a to 50c which output signals indicative of the acceleration along the x, y and z axes respectively and the gyroscope is similarly represented as blocks 51a to 51c which output signals indicative of the rate of rotation about axes x, y and z respectively.
- the acceleration signals are amplified by conventional low noise and drift preamplifier circuits and fed via sample and hold circuitry 45 to a 14-bit analogue-to-digital converter 20.
- the rotation signals are amplified by conventional low noise and drift preamplifier circuits and fed via sample and hold circuitry 56 to a 14-bit analogue-to-digital converter 55.
- the sample-and-hold circuitry 45, 55 samples at a rate of 5,000 samples per second.
- the digitised signals from analogue-to-digital converters 20 and 55 are fed to signal conditioning circuits 21 and 52 respectively which include digital filters for filtering out noise, correct for non-linearity in the accelerometer or gyroscope outputs (e.g. by employing look-up tables) and correct for drift due to temperature variations.
- the resulting data is fed to a digital signal processor 23 where corrections are made for the offsets (if any) of the sensitive axes of the gyroscope 51 from the photodetector 34 and for the distance between the accelerometer 50 and the gyroscope 51.
- Such corrections are necessary because, for example, a sudden rotation centred on accelerometer 50 would cause a significant acceleration of photodetector 34 but would not be detected as such by the accelerometer.
- the resulting rate of change of the rate of rotation detected by gyroscope 51 could be used, in conjunction with the known separation between the accelerometer and gyroscope and the output of the accelerometer, to calculate the acceleration of the photodetector, as will be apparent to persons skilled in the art.
- the rate of rotation and acceleration of the photodetector could be calculated from the outputs of two spaced apart gyroscopes (each sensitive about all three axes) or two spaced apart accelerometers (each sensitive along all three axes) provided that their positions relative to the photodetector were known.
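The lever-arm correction described above follows from rigid-body kinematics: the acceleration at the photodetector differs from the accelerometer reading by a tangential term (angular acceleration crossed with the lever arm) and a centripetal term. A minimal sketch, with illustrative numbers (the 0.1 m separation and 2 rad/s rate are assumptions, not values from the text):

```python
import numpy as np

def detector_acceleration(a_meas, omega, alpha, r):
    """Transfer a measured acceleration from the accelerometer location to
    the photodetector location on the same rigid body.

    a_meas: acceleration at the accelerometer (m/s^2)
    omega:  angular rate of the body (rad/s)
    alpha:  angular acceleration of the body (rad/s^2)
    r:      lever arm from accelerometer to photodetector (m)
    """
    a_meas, omega, alpha, r = map(np.asarray, (a_meas, omega, alpha, r))
    # Rigid-body kinematics: tangential term + centripetal term.
    return a_meas + np.cross(alpha, r) + np.cross(omega, np.cross(omega, r))

# A steady rotation centred on the accelerometer (a_meas = 0) still
# produces a centripetal acceleration at a detector 0.1 m away.
a = detector_acceleration([0, 0, 0], omega=[0, 0, 2.0], alpha=[0, 0, 0], r=[0.1, 0, 0])
```

This illustrates why a rotation centred on accelerometer 50 accelerates photodetector 34 without the accelerometer registering it directly.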
- the processor 23 double integrates successive values of the corrected photodetector acceleration (along the x, y and z axes) by means of a double integration algorithm to determine the instantaneous position of the photodetector (relative to an arbitrary starting position) throughout the scan. Since the sample and hold circuits 45 and 56 are clocked at a rapid rate (e.g. 5 kHz) by clocking circuitry 57, this position sensing takes place essentially in real time.
- the processor 23 integrates successive values of the corrected photodetector rate of rotation (about the x, y and z axes) by means of an integration algorithm to determine the instantaneous orientation of the photodetector throughout the scan, again essentially in real time.
- the above integrations are performed on groups of 100 samples to update position and attitude 50 times per second, which matches the sampling rate of photodetector 34.
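The integration scheme above (5 kHz samples, grouped in hundreds to give 50 pose updates per second) can be sketched as a simple Euler dead-reckoner; the specific integration algorithm used by processor 23 is not given in the text, so this is an illustrative assumption:

```python
import numpy as np

def integrate_pose(accel, rate, dt=1 / 5000, group=100):
    """Dead-reckon position and attitude from sampled acceleration and
    angular rate: samples arrive at 5 kHz and are integrated in groups of
    100 to give 50 updates per second. accel, rate: (N, 3) arrays.
    Returns one position and one accumulated-angle estimate per group.
    (Accumulating rates into angles is only valid for small rotations.)
    """
    accel = np.asarray(accel, float).reshape(-1, group, 3)
    rate = np.asarray(rate, float).reshape(-1, group, 3)
    positions, angles = [], []
    vel, pos, ang = np.zeros(3), np.zeros(3), np.zeros(3)
    for a_grp, w_grp in zip(accel, rate):
        for a, w in zip(a_grp, w_grp):
            vel = vel + a * dt    # first integration: velocity
            pos = pos + vel * dt  # second integration: position
            ang = ang + w * dt    # single integration: attitude
        positions.append(pos.copy())
        angles.append(ang.copy())
    return np.array(positions), np.array(angles)

# Constant 1 m/s^2 acceleration along x for 0.1 s (5 groups of 100 samples):
acc = np.tile([1.0, 0, 0], (500, 1))
gyr = np.zeros((500, 3))
pos, ang = integrate_pose(acc, gyr)
```

After 0.1 s the x-position is close to the analytic value of half a t squared, i.e. about 5 mm, confirming the double integration behaves as expected.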
- correction circuitry 28 corrects for any distortion in the optics of the scanning device 10 (particularly distortion by the lens 33), derives the true profile of projected stripe 13 (as viewed at right angles to the plane of beam 3) from the image 13' by applying appropriate corrections (e.g.
- the scan data processor 25 receives signals representative of the true profile of the object 1 (as viewed at right angles to the plane of beam 3) as well as signals representative of the attitude and position of the photodetector 34 (which bears a fixed relationship to the position and attitude of the region of beam 3 which intersects the surface of the object 1), all in real time.
- Such signals can for example either be used to drive an indicator such as an LED to indicate to the operator that the acceleration and/or rate of rotation are within an acceptable range for data acquisition or can be used as gating signals to block the acquisition of data e.g. by disabling the clocking signals from circuitry 57 to sample and hold circuit 26.
- the processor 25 only applies the rotations and translations corresponding to a group of successive profiles to generate a surface description of a small region of the scanned surface and stops generating this surface description (which will be a cloud of points in a common coordinate system) under the control of the operator and/or under the control of the above gating signals.
- a further surface description of another surface portion will then be generated in a similar manner e.g. in response to further operation of the switch or release of the gating signals. In this manner, successive scan files each comprising a cloud of points representative of a different but overlapping surface portion are generated.
- the position and attitude of the scanning device 10 is monitored throughout the scanning and accordingly the position and attitude of the photodetector 34 at the beginning of the generation of each scan file is known (albeit possibly with some accumulated errors) and is associated with the three-dimensional coordinate data of that scan file.
- each scan file consists of a three-dimensional cloud of points describing the surface coordinates of a region of the scanned surface, in association with data representative of the position and orientation of the photodetector 34 during the acquisition of each profile of that set.
- Successive scan files are output from processor 25 to a computer 47, which is provided with a display 131 and a keyboard 130.
- a semiconductor laser 29' transmits a laser beam to an optical arrangement 300 which may either include a cylindrical lens arranged to form a fan-shaped beam oriented perpendicular to the plane of the drawing or a scanner such as an oscillating mirror arrangement arranged to oscillate the beam perpendicular to the plane of the drawing to form a corresponding fan-shaped envelope.
- the resulting fan-shaped beam or envelope is then directed by a fixed mirror onto a 24 facet polygonal mirror 40 rotating at 20,000 r.p.m. which scans a beam or envelope 3' in the plane of the drawing as shown.
- the profile (represented as an arrow a or b) defined by the intersection of this beam or envelope with the surface of the object is imaged by a lens 33 onto an inclined photodetector 34' and forms a corresponding image a' or b'.
- detector 34' satisfies the Scheimpflug condition in order to maximise the depth of field.
- the profile a or b of the surface can be determined from the corresponding image a' or b'.
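The recovery of a surface profile from its image rests on triangulation between the projector and the lens. The patent's exact Scheimpflug geometry is more involved, so the sketch below uses the simplest textbook configuration as an assumption: a beam parallel to the lens axis, offset by a baseline, so that range follows from similar triangles; the baseline and focal length values are purely illustrative.

```python
def triangulated_range(x_img, baseline=0.05, focal=0.02):
    """Minimal triangulation sketch (not the patent's exact geometry):
    a laser ray parallel to the lens axis, offset sideways by `baseline`,
    images a surface point at offset x_img = focal * baseline / z on the
    detector, so the range z follows by similar triangles.
    All parameter values here are illustrative assumptions.
    """
    return focal * baseline / x_img

z = triangulated_range(0.001)  # a 1 mm image offset corresponds to 1 m range
```

Note the inverse relationship: nearer surface points produce larger image offsets, which is why an inclined (Scheimpflug) detector is used to keep the whole depth range in focus.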
- processor 42 samples the output of photodetector 34' in synchronism with the position signal from sensor 41 and each resulting profile calculated by processor 42 is aligned relative to a common coordinate system according to the angular position signal.
- the output of processor 42 is a cloud of points in a three-dimensional coordinate system which defines the surface portion scanned by beam 3'.
- Preferably arrangement 300 includes a scanner which oscillates the laser beam rapidly perpendicular to the plane of the drawing (e.g. at 8 kHz), relative to the rate of oscillation of the resulting envelope 3' in the plane of the drawing, and the photodetector 34' is a two-dimensional lateral effect photodiode.
- Such an arrangement has a very short response time and can generate an effectively instantaneous "snapshot" of the scanned portion of the surface before the scanning device has moved appreciably.
- the term 'effectively instantaneous' acquisition of data is used to describe an acquisition of many surface points in a two-dimensional matrix or raster projected onto the surface of the object in a period which is so short that the maximum movement expected of the scanning device 10 in this time will be less than the required accuracy of the surface profile data.
- each acquisition will contain at least several hundred points.
- acousto-optic deflector and electro-optic deflector systems which may optionally be combined with a resonant line scanner. All of these systems are capable of projecting a spot onto the surface at a variable angle to the scanning device in two dimensions, and can control the angle of projection at high speed.
- acousto-optic deflectors can currently be obtained which alter the refractive index of a crystal to divert the beam into any one of up to 1000 linearly spaced unique angles.
- Certain acousto-optic devices may be coupled to provide XY random access deflection.
- the rotating polygonal mirror system described above requires a medium-speed rotating polygonal mirror 40 to provide the line scan in each line of the raster, and a low-speed scanner such as a further rotating polygonal mirror in optical arrangement 300 to provide the vertical raster movement repositioning the start of each line.
- polygonal mirror 40 provides a scan rate of 8,000 lines/sec if a 24 facet polygon is used (15 degree angle of scan), and scans 32 lines in 4 ms.
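The figures quoted here are mutually consistent, which a two-line check confirms using only numbers stated in the text:

```python
# A 24-facet polygonal mirror at 20,000 r.p.m.: each facet sweeps one line,
# so the line rate is 20000 rev/min * 24 facets / 60 s = 8,000 lines/s,
# and a 32-line raster takes 32 / 8000 s = 4 ms, as stated.
facets = 24
rpm = 20_000
lines_per_s = rpm * facets / 60
raster_ms = 32 / lines_per_s * 1000
```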
- the preferred photodetector 34' is a lateral effect two-dimensional silicon photodetector which generates an output indicating the offset of the centroid of the incident light relative to the centre of the photodetector in continuous analogue form.
- Such photodetectors are capable of resolving optical changes at several MHz.
- the processor 42 in the embodiment of Figure 3 preferably includes high speed flash analogue-to-digital converters which digitise the X and Y centroid signals from the photodetector, which correspond to the surface profile at the specific angle of projection of beam 3' at the time of measurement.
- the digitised points are then converted to points in a three-dimensional coordinate system which is common to the entire scan performed by polygonal mirror 40 with the aid of signals from sensor 41, which may be a Hall effect sensor for example. These signals indicate the instantaneous orientation of beam 3'. If an acousto-optic device is used for scanning, the required information on the beam orientation can be derived from the drive signals to the device.
- each successive scan file can contain a very high proportion (e.g. 50% or more) of the data in the preceding scan, thus making overlapping almost complete. This enables the surface portions corresponding to successive scan files to be combined relatively easily to obtain a complete surface description of the object 1.
- although the embodiment of Figure 3 is shown with an accelerometer 50 and a gyroscope 51 which are similar to the accelerometer and gyroscope of Figure 1, it is not essential for the output of processor 42 to be combined with data derived from such an accelerometer and a gyroscope in order to give a succession of scan files including the required position and attitude data as generated by circuitry 25 in Figure 1.
- the scanned surface portions can be combined by a computer without the assistance of such data, as will become apparent from the subsequent description.
- the signals from the accelerometer and gyroscope can be used a) to derive the change in position and attitude of the scanning device between successive scan files so that successive pairs (or larger groups) of scan files can be combined to form composite scan files which require less processing to be fitted together to form the complete surface description and b) to provide position and attitude signals to the subsequent processing circuitry which indicate to that circuitry where regions of overlap between successive scan files may be found, thereby simplifying processing.
- the use of such signals will be described subsequently with reference to Figure 5.
- the following sections detail the data and processes described in the diagram.
- scan file of raw 3D coordinate data: this is a file of data captured from a single scan of the scanning device 10 (represented as a CLOUD OF POINTS). It will contain a variable number of acquisitions of surface profile 3D co-ordinates.
- a scan commences. When the trigger switch T is released, or the speed in any direction falls below a second (lower) threshold (for hysteresis), the scan is terminated.
- Each profile acquisition will contain a number of distance samples along the length of the profile.
- a number of extra data items may be included in each scan file in order to assist various subsequent processes in performing error correction, such as a scan file time stamp to indicate the time of the start of the scan, scanning device position and attitude, and surface colour in another embodiment.
- the scan file consists of essentially a set of randomly placed and unassociated points on which any further processing is extremely difficult, as the points are unordered and in an unsuitable format for later post-processing stages.
- This process fits a mathematical model to the surface described by the randomly ordered points in the scan file which is in a form suitable for later post-processing operations. It should be noted that this mathematical model is not necessarily an expression for the surface in terms of x, y and z coordinates but is merely an ordered form of the raw data which enables features of interest such as discontinuities to be extracted in subsequent processes.
- the local neighbourhood growth algorithms operate by selecting one or more seed points (usually at discontinuities) and growing the points into regions (influenced by the unstructured point data) until the regions meet and completely describe the set of unstructured points in a spatially coherent manner suitable for further processing.
- a Voronoi diagram is built up by defining a plane associated with each detected surface point such that each point within that plane is nearer to that detected surface point than to any of the other detected surface points.
- a polyhedron is built up whose polygonal faces are centred on respective detected surface points, the edges of the polygonal faces being midway between neighbouring detected surface points.
- the resulting partial polyhedron is then simplified by defining a triangular face between three detected surface points and measuring the perpendicular distance between it and the intermediate detected surface points lying above it.
- the data resulting from process 62 will be a surface model in a form suitable for the extraction of points of interest.
- This surface model can be in the form of a set of splines, a binary tree, or a set of connected triangles or other polygons depending on the precise method used in process 62.
- points of interest are regions where any derivative of the local surface (particularly the gradient, i.e. the first derivative) passes through zero, and include edges, corners, peaks, troughs and saddle regions.
- the methods for detecting these features include i) plane slicing and ii) surface curvature analysis. In some instances where surfaces are essentially featureless, small and easily removable markers of known shape and size may be placed on the object to be scanned in order to generate artificial "points of interest". These may be removed from the scan file at the end of the processing by a process of subtracting the appropriate 3-D model. i) Plane Slicing
- the process of plane slicing involves taking successive plane slices at regular intervals along an axis that traverses the approximate centre of the scan data.
- This will provide a set of contour maps of the surface at successive depths into the range data.
- This process can be iterative, in that initially a coarse spacing between planes can be used, followed by successively finer spacing between planes until points of interest are identified.
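The coarse-to-fine slicing described above can be sketched by bucketing the cloud of points into slabs along one axis; the axis choice and plane count below are illustrative assumptions, and refinement is simply a matter of calling the function again with more planes:

```python
import numpy as np

def plane_slices(points, axis=2, n_planes=5):
    """Plane slicing sketch: bucket the cloud of points into slabs at
    regular intervals along one axis through the data, giving a set of
    contour cross-sections. n_planes can be increased iteratively for
    successively finer spacing until points of interest emerge.
    """
    points = np.asarray(points, float)
    coords = points[:, axis]
    edges = np.linspace(coords.min(), coords.max(), n_planes + 1)
    # np.digitize assigns each point to the slab between successive planes.
    idx = np.clip(np.digitize(coords, edges) - 1, 0, n_planes - 1)
    return [points[idx == k] for k in range(n_planes)]

cloud = np.array([[0, 0, z / 10] for z in range(10)])
slabs = plane_slices(cloud, n_planes=5)
```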
- surface curvature analysis can be used to find discontinuities in a surface by detecting points with diverging splines.
- This method can also be used to detect surface features that will not be detected by other means, in that a smooth parabolic surface can be completely described mathematically and used as a point of interest (its gradient passes through zero) for scan registration, rather than surface point features as in the other methods. For this reason, this method is important as a tool for a generic surface integration package, as it will allow the registration of scan files which contain few if any discontinuities.
- K: Gaussian curvature
- H: mean curvature
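The text names the two curvatures K and H but not how they yield feature labels; the conventional sign-based classification (an assumption here, following the usual K-H table from surface analysis) can be sketched as:

```python
def classify_point(K, H, eps=1e-9):
    """Classify a surface point from the signs of Gaussian curvature K and
    mean curvature H. This is the standard K-H sign table, assumed here
    since the text only names the two curvatures.
    """
    if K > eps:
        return "peak" if H < 0 else "pit"     # elliptic: bump or hollow
    if K < -eps:
        return "saddle"                        # hyperbolic
    if abs(H) > eps:
        return "ridge" if H < 0 else "valley"  # parabolic
    return "flat"                              # planar

labels = [classify_point(1.0, -1.0), classify_point(-0.5, 0.2), classify_point(0.0, 0.0)]
```

This matches the feature vocabulary used later in the matching step (peak, saddle, trough, ridge, pit).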
- the data flow from process 63 comprises "points of interest" associated with each scan file and can be represented by a list (e.g. peak, saddle, saddle, trough) linked to a further list of spatial relationships between each of the points of interest.
- Each point of interest in the first list can be described by the following set of data or some subset:
- a search is performed for similar point of interest types, resulting in a list of common point of interest types between the pair of scan files.
- the relative spatial relationships between the common points of interest in the list for each scan can be compared by first matching distances between pairs of similar types to produce a subset of the original list where the spatial relationship between pairs of points of interest is matched.
- the relative spatial relationship of other neighbouring points of interest to the matching pair from each scan is then checked for a match containing three points of interest to produce a subset of the matching triplets of points of interest. This is the minimal requirement for determining that any set of features is common between scans.
- the process is continued until no further matches are found.
- the larger the number of matched points of interest the greater the confidence in a correct matching between overlapping surface features between scans.
- Each distance in scan file 1 is compared with corresponding distances (i.e. distances between corresponding groups of features, e.g. peak-ridge) in scan file 2. It may be found that peak-ridge distance a of scan file 1 matches the peak-ridge distance of scan file 2, that the peak-pit distances a of the respective scan files match and that the ridge-pit distance of scan file 1 matches the ridge-pit distance a of scan file 2, for example. These matching distances are then processed to find the list of sets of three distances between the possible combinations of the three different features. The resulting lists in this case each comprise one set of distances between a peak, a ridge and a pit:
- Scan file 1: peak-ridge distance, peak-pit distance, ridge-pit distance
- Scan file 2: peak-ridge distance, peak-pit distance, ridge-pit distance. If these sets of distances match, then the peak, ridge and pit are assumed to be common to scan files 1 and 2.
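The triplet test above works because the three inter-feature distances are invariant under any rotation and translation between the two scan files' coordinate systems. A minimal sketch, assuming features are stored as (type, coordinates) pairs (a representation chosen for illustration; the tolerance is likewise assumed):

```python
import itertools
import math

def match_triplets(feats1, feats2, tol=1e-3):
    """Triplet test sketch: a triple of points of interest is taken as
    common to two scan files when the types correspond and the three
    inter-feature distances agree within tol, since those distances are
    unchanged by the unknown rotation and translation between the files.
    Each feature is (type, (x, y, z)) in its own file's coordinate frame.
    """
    matches = []
    for t1 in itertools.combinations(feats1, 3):
        for t2 in itertools.combinations(feats2, 3):
            if [f[0] for f in t1] != [f[0] for f in t2]:
                continue  # feature types must correspond pairwise
            pairs = ((0, 1), (0, 2), (1, 2))
            d1 = [math.dist(t1[i][1], t1[j][1]) for i, j in pairs]
            d2 = [math.dist(t2[i][1], t2[j][1]) for i, j in pairs]
            if all(abs(a - b) < tol for a, b in zip(d1, d2)):
                matches.append((t1, t2))
    return matches

scan1 = [("peak", (0, 0, 0)), ("ridge", (1, 0, 0)), ("pit", (0, 1, 0))]
scan2 = [("peak", (5, 5, 0)), ("ridge", (6, 5, 0)), ("pit", (5, 6, 0))]  # translated copy
m = match_triplets(scan1, scan2)
```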
- the three rotations (about the x, y and z axes of the coordinate system of one scan file) and the three translations (along those x, y and z axes) required to superimpose the common points of interest of one scan file onto another are determined.
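One standard way to recover these three rotations and three translations from matched points is the SVD-based least-squares fit (the Kabsch method), sketched below; the text does not name a specific algorithm, so this is an assumed but conventional choice:

```python
import numpy as np

def fit_rigid_transform(P, Q):
    """Least-squares rotation R and translation t with R @ p + t ~ q for
    matched points of interest P -> Q (SVD/Kabsch method; one standard
    choice, not necessarily the patent's own procedure).
    """
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)  # cross-covariance of the centred point sets
    U, _, Vt = np.linalg.svd(H)
    # Force a proper rotation (det = +1), excluding reflections.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t

P = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
t_true = np.array([2.0, -1.0, 0.5])
R_fit, t_fit = fit_rigid_transform(P, P + t_true)  # pure-translation case
```

At least three non-collinear matched points are needed, which is exactly why the triplet of common features described above is the minimal requirement.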
- Process 65 scan file rotation & translation
- the above rotations and translations are applied to the clouds of points of all the overlapping scan files, resulting in a single cloud of points defining the entire surface.
- the resulting data is of an identical type to the original scan file cloud of points or surface model, but translated to the common coordinate system.
- a. the nearest surface neighbours of a point p can be estimated from its n nearest 3D neighbours, b. data density is relatively uniform over the surface of the object to be modelled, c. points are measured with the same accuracy.
- This method yields a more accurate surface model than 1.) when the registration error is small compared to the data acquisition error.
- the object cannot have a hole, and the projection of the cylindrical or
- This method places no restrictions on the topology of the object to be modelled and is the single most useful method, although all three methods are available within process 66 and can be selected by the user by entering appropriate commands at the keyboard 130 of the computer 47 (Fig. 2).
- the Venn diagram is obtained by calculating the common surface segments bet een all possible pairs of range view s. Each area of overlap of surface features between two or more range views becomes a canonical subset of the Venn diagram.
- the procedure to obtain the Venn diagram from the set of range views consists of finding 15 the intersections between all possible pairs of range views, and once computed, it is possible to determine for each point in each view, w hich other v iew s hav e sampled an element on the surface at that point. The contents of the canonical subsets of the Venn diagram are thus implicitly available.
- the Spatial Neighbourhood Test (SNT) is used to check whether the Euclidean distance between a point in one view and a surface patch in another view is small enough, in relation to the measurement error, to determine that they belong to the same surface.
- the definition of the surface patch is local: it is defined as the plane formed by the three nearest neighbours of the point once transformed into the parametric grid of the other view.
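A minimal sketch of the Spatial Neighbourhood Test as described above. The k-sigma acceptance threshold is an assumption for illustration; the patent does not give the exact acceptance criterion.

```python
import numpy as np

def spatial_neighbourhood_test(point, patch, sigma, k=3.0):
    """Sketch of the SNT.  `patch` is a (3, 3) array holding the point's
    three nearest neighbours in the other view, which define a local
    plane.  The point is accepted as sampling the same surface when its
    distance to that plane is small relative to the measurement error
    `sigma` (the k-sigma threshold is an assumed convention)."""
    a, b, c = patch
    n = np.cross(b - a, c - a)
    n = n / np.linalg.norm(n)            # unit normal of the local plane
    dist = abs(np.dot(point - a, n))     # point-to-plane distance
    return dist <= k * sigma, dist
```

A point lying in the plane of the patch passes at any positive `sigma`; a point displaced well beyond the measurement error is rejected as belonging to a different surface region.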
- the resulting model, in a custom format, is output from process 66 to process 67.
- the complete integrated triangulation model is used as the input data to a file format converter to produce a file describing the object in a standard file format suitable for display, manipulation and storage by a large number of 3D software packages, including but not limited to CAD, on a wide variety of platforms.
- Common 3D file formats which can be supported by the file format conversion utility include: Bezier Curves, B-Spline Curves and Surfaces, B-Rep models, Constructive Solid Geometry (CSG) & CSG Tree, DXF Geometry Exchange Format, Initial Graphics Exchange Standard (IGES) (and variants - VDA-IS / VDA-FS), Non-Uniform Rational B-Splines (NURBS), Octree Model, and STEP/PDES.
- This data is output to a CAD software package 68.
- the computer 47 is suitably a graphics workstation or a high-speed personal computer.
- an object 1' (a paper cup) is shown being scanned by a scanning device 10, which is assumed to incorporate the preferred optical arrangement and accelerometer and gyroscope of Figure 3.
- p,q,r,s: the corners of the first scan, i.e. the area covered by one horizontal sweep of beam 3' due to the rotation of the polygonal mirror 40, which overlaps substantially with the next scan t,u,v,w.
- a second row of two further scans is shown on the surface of the cup, and the operator holding the scanning device 10 ensures that the two rows overlap.
- the overlap between successive scans pqrs and tuvw is due to the rapid scanning rate of polygonal mirror 40 in relation to the rate at which the projected beam from the scanning device is swept across the surface of the scanned object 1'. Accordingly it is inevitable that the edge region of each scan (e.g. the region of scan pqrs shown hatched) will overlap the edge region of the next scan. For example, the edge region between corners r and s of the first scan shown in Figure 5 will overlap the edge region between corners t and u of the second scan.
- COMMON FEATURE DETECTOR process 64 in Figure 4 can be guided to look in the edge region of each scan file for overlap, preferably in real time as the scan files are generated by scanning device 10. In other words, instead of waiting until all of the scan files have been generated and then looking for overlap between every possible pair of scan files, the process 64 can look for overlap between regions (particularly edge regions) of successively generated scan files.
- inertial sensing means such as accelerometer 50 and gyroscope 51 are carried by scanning device 10.
- the outputs of such inertial sensing means can be processed to determine the velocity and direction of movement of the scans over the surface of the object 1' and hence to predict the precise regions of overlap of successive scan files.
- the velocity and/or direction of movement between the first two scans can be derived from the location of the common features in their scan files and can be used to predict, to a first approximation, the region in which the next two scans will overlap, so that COMMON FEATURE DETECTOR process 64 will look only in this region for common features.
- the processing needed to identify common points in successive scan files is much reduced and should be easily achievable in real time.
- a docking station 100 is provided, to which the scanning device may be returned periodically during scanning. If each sweep is begun by taking the scanning device from the docking station, the accumulated errors from the accelerometer and gyroscope can be reduced. Furthermore, a reference object 200 of known size and shape may be scanned before the object 1' is scanned and used to calibrate the system.
- a file defining the entire surface of object 1' could be built up, provided that the object is kept in view by the photodetector 34' of the scanning device 10 whilst sweeping the entire surface of the object with the projected pattern from the scanning device.
- this may not always be practicable, and in general it may be easier to combine successive scan files defining part of the surface of the object to form a composite scan file, to repeat this process to form further composite scan files, and then to combine the composite scan files at the end of the scanning process by means of processes 64 and 65 of Figure 4 to form a complete surface description.
- the SURFACE FIT process 62 of Figure 4 can be simplified if the coordinate data of all the points is read out from the processor 47 (Figure 3) in serial fashion in a predetermined order (such as the order of acquisition, for example) or in some other manner which preserves the information on their location within the scan.
- the nearest neighbours of each point can be identified from such information without requiring the calculation of the distances between all pairs of points, and hence a surface can be built up by joining each point to its nearest neighbours.
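When the read-out order preserves each point's location within the scan, joining points to their nearest neighbours needs no distance calculations at all: the neighbours are simply the adjacent grid positions. A sketch, assuming a row-major point list and the conventional two-triangles-per-cell split:

```python
import numpy as np

def grid_triangles(rows, cols):
    """Triangulate an ordered rows x cols scan grid without any
    nearest-neighbour search: each point's neighbours are the adjacent
    grid cells, so every grid cell yields two triangles.  Returns an
    (n, 3) array of vertex indices into the row-major point list."""
    tris = []
    for r in range(rows - 1):
        for c in range(cols - 1):
            i = r * cols + c
            tris.append((i, i + 1, i + cols))             # upper-left triangle
            tris.append((i + 1, i + cols + 1, i + cols))  # lower-right triangle
    return np.array(tris)
```

For a rows x cols grid this produces 2 x (rows-1) x (cols-1) triangles in O(rows x cols) time, versus the O(N²) pairwise-distance search the surrounding text says is avoided.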
- provided the angle of sweep of beam 3' in the scanning arrangement of Figure 4 is relatively small, the resulting surface will approximate to a projection on a rectangular grid.
- Figure 6 shows two surfaces S1 and S2 in the form of projections on a rectangular grid which have been derived from successive scan files output from the scanning arrangement of Figure 4.
- the normals N defined by (say) the four adjacent points of one surface S1 are calculated, at least for the overlapping edge portions of the surface, and then a) the distance along each normal at which it cuts surface S2 is calculated and b) the three rotations and translations needed to minimise the sum of these distances of intersection (corresponding to the best overlap) are determined by the genetic or other iterative algorithm.
- Such a technique does not require the detection of discontinuities or other "points of interest".
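The normal-distance cost of steps a) and b) can be sketched as follows. For brevity, S2 is locally approximated here by a single plane; the full method would intersect each normal with the triangulated surface S2, and the iterative or genetic algorithm would minimise this cost over the six pose parameters.

```python
import numpy as np

def grid_normals(P):
    """Unit normals of a rows x cols x 3 grid of points, from central
    differences of the (up to) four adjacent points."""
    du = np.gradient(P, axis=0)   # surface derivative along rows
    dv = np.gradient(P, axis=1)   # surface derivative along columns
    n = np.cross(du, dv)
    return n / np.linalg.norm(n, axis=2, keepdims=True)

def overlap_cost(P1, N1, plane_point, plane_normal):
    """Sum over the points of S1 of the distance along each normal to
    its intersection with S2 -- here S2 is locally approximated by one
    plane (point + unit normal), an assumption for illustration."""
    # The ray p + s*n meets the plane when (p + s*n - q) . m = 0.
    denom = N1.reshape(-1, 3) @ plane_normal
    num = (plane_point - P1.reshape(-1, 3)) @ plane_normal
    s = num / denom               # signed distance along each normal
    return np.abs(s).sum()
```

A perfectly aligned pose drives the cost towards zero, which is what "best overlap" means in the description above.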
- a further embodiment of the present invention could provide simultaneous capture of a colour and texture overlay with surface position data, by the use of a colour CCD or similar two-dimensional colour photodetector device instead of, or in addition to, the present photodetector, while strobing the laser or using a broadband light source.
- a still further embodiment of the present invention may retain the inertial position and attitude measurement system as described in connection with Figures 1 and 2 but may use any other means for determining the surface coordinates relative to the scanning device.
- the position and attitude measurement system may be mounted on the object to be measured and the object can be moved around the fixed scanning device, or a further position and attitude measurement system may be provided on the object being scanned in addition to that on the scanning device, enabling both the scanned object and the scanning device to be moved freely.
- the laser 28 used as the light source may be replaced by a light-emitting diode, and other wavelengths besides optical wavelengths can be used, particularly I.R. wavelengths. Accordingly the term "optical" is to be construed broadly to include any radiation or arrangement which obeys the laws of optics.
- a still further embodiment of the present invention could utilise a controllable lens system with variable magnification in order to provide increased dynamic range of depth measurement.
- a still further embodiment of the present invention could include a density, mass and volume calculator in the computer (47).
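One conventional way such a volume calculator could work on the integrated triangulation model is the divergence-theorem sum of signed tetrahedra over a closed mesh; mass then follows from an assumed density. This is an illustrative sketch, not the patent's specification.

```python
import numpy as np

def mesh_volume(vertices, triangles):
    """Volume enclosed by a closed, consistently oriented triangle mesh,
    computed as the sum of signed tetrahedra spanned by each face and
    the origin.  `vertices` is (V, 3); `triangles` is (T, 3) indices."""
    v = vertices[triangles]                        # (T, 3, 3) face corners
    signed = np.einsum("ij,ij->i", v[:, 0], np.cross(v[:, 1], v[:, 2]))
    return abs(signed.sum()) / 6.0

def mesh_mass(vertices, triangles, density):
    """Mass of the scanned object, given a (user-supplied) uniform density."""
    return density * mesh_volume(vertices, triangles)
```

The signed contributions of faces on opposite sides of the origin cancel correctly, so the result is independent of where the mesh sits relative to the origin as long as it is closed.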
- the application of inert fine powder, for example fingerprinting dust, may be helpful.
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Electromagnetism (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Computer Networks & Wireless Communication (AREA)
- Optics & Photonics (AREA)
- Theoretical Computer Science (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Investigating Or Analyzing Materials By The Use Of Ultrasonic Waves (AREA)
- Container Filling Or Packaging Operations (AREA)
- Vehicle Body Suspensions (AREA)
- Radar Systems Or Details Thereof (AREA)
- Eye Examination Apparatus (AREA)
- Position Input By Displaying (AREA)
- Length Measuring Devices With Unspecified Measuring Means (AREA)
Priority Applications (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP8507885A JPH10510352A (en) | 1994-08-24 | 1995-08-22 | Scanning device and scanning method |
NZ291492A NZ291492A (en) | 1994-08-24 | 1995-08-22 | Optical scanning to characterise surface of 3-d object: overlapping scans combined |
KR1019970701182A KR970705737A (en) | 1994-08-24 | 1995-08-22 | SCANNING ARRANGEMENT AND METHOD |
AU32636/95A AU715218B2 (en) | 1994-08-24 | 1995-08-22 | Scanning arrangement and method |
AT95929175T ATE209334T1 (en) | 1994-08-24 | 1995-08-22 | SCANNING DEVICE AND METHOD |
DE69524122T DE69524122D1 (en) | 1994-08-24 | 1995-08-22 | SCAN AND DEVICE |
EP95929175A EP0805948B1 (en) | 1994-08-24 | 1995-08-22 | Scanning arrangement and method |
US09/307,804 US6128086A (en) | 1994-08-24 | 1999-05-10 | Scanning arrangement and method |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB9417108A GB9417108D0 (en) | 1994-08-24 | 1994-08-24 | Device for measurement of three dimensional object features |
GB9515247.6 | 1995-07-25 | ||
GB9417108.9 | 1995-07-25 | ||
GB9515247A GB2292605B (en) | 1994-08-24 | 1995-07-25 | Scanning arrangement and method |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US08/804,920 Continuation US5850289A (en) | 1994-08-24 | 1997-02-24 | Scanning arrangement and method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO1996006325A1 true WO1996006325A1 (en) | 1996-02-29 |
Family
ID=26305511
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/GB1995/001994 WO1996006325A1 (en) | 1994-08-24 | 1995-08-22 | Scanning arrangement and method |
Country Status (12)
Country | Link |
---|---|
US (3) | US5850289A (en) |
EP (1) | EP0805948B1 (en) |
JP (1) | JPH10510352A (en) |
KR (1) | KR970705737A (en) |
CN (1) | CN1066259C (en) |
AT (1) | ATE209334T1 (en) |
AU (1) | AU715218B2 (en) |
CA (1) | CA2198124A1 (en) |
DE (1) | DE69524122D1 (en) |
GB (1) | GB2292605B (en) |
NZ (1) | NZ291492A (en) |
WO (1) | WO1996006325A1 (en) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5870220A (en) * | 1996-07-12 | 1999-02-09 | Real-Time Geometry Corporation | Portable 3-D scanning system and method for rapid shape digitizing and adaptive mesh generation |
US5886702A (en) * | 1996-10-16 | 1999-03-23 | Real-Time Geometry Corporation | System and method for computer modeling of 3D objects or surfaces by mesh constructions having optimal quality characteristics and dynamic resolution capabilities |
US5945996A (en) * | 1996-10-16 | 1999-08-31 | Real-Time Geometry Corporation | System and method for rapidly generating an optimal mesh model of a 3D object or surface |
US6044170A (en) * | 1996-03-21 | 2000-03-28 | Real-Time Geometry Corporation | System and method for rapid shape digitizing and adaptive mesh generation |
US6356263B2 (en) | 1999-01-27 | 2002-03-12 | Viewpoint Corporation | Adaptive subdivision of mesh models |
EP0840880B1 (en) * | 1995-07-26 | 2002-03-13 | Stephen James Crampton | Scanning apparatus and method |
US6549288B1 (en) | 1998-05-14 | 2003-04-15 | Viewpoint Corp. | Structured-light, triangulation-based three-dimensional digitizer |
US7065242B2 (en) | 2000-03-28 | 2006-06-20 | Viewpoint Corporation | System and method of three-dimensional image capture and modeling |
US9628779B2 (en) | 2011-05-19 | 2017-04-18 | Hexagon Technology Center Gmbh | Optical measurement method and measurement system for determining 3D coordinates on a measurement object surface |
Families Citing this family (91)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE29522352U1 (en) * | 1995-12-12 | 2002-07-18 | Busch Dieter & Co Prueftech | Position measuring probe for the mutual alignment of bodies |
US5988862A (en) | 1996-04-24 | 1999-11-23 | Cyra Technologies, Inc. | Integrated system for quickly and accurately imaging and modeling three dimensional objects |
US6420698B1 (en) * | 1997-04-24 | 2002-07-16 | Cyra Technologies, Inc. | Integrated system for quickly and accurately imaging and modeling three-dimensional objects |
GB9716240D0 (en) * | 1997-07-31 | 1997-10-08 | Tricorder Technology Plc | Scanning apparatus and methods |
US6006604A (en) * | 1997-12-23 | 1999-12-28 | Simmonds Precision Products, Inc. | Probe placement using genetic algorithm analysis |
GB9810553D0 (en) * | 1998-05-15 | 1998-07-15 | Tricorder Technology Plc | Method and apparatus for 3D representation |
EP2416198B1 (en) * | 1998-05-25 | 2013-05-01 | Panasonic Corporation | Range finder device and camera |
US6445974B1 (en) * | 1998-12-30 | 2002-09-03 | Intergraph Corporation | CAD-neutral application programming interface |
JP3417377B2 (en) * | 1999-04-30 | 2003-06-16 | 日本電気株式会社 | Three-dimensional shape measuring method and apparatus, and recording medium |
NO313113B1 (en) * | 1999-07-13 | 2002-08-12 | Metronor Asa | System for scanning large geometry of objects |
US6441908B1 (en) * | 1999-08-06 | 2002-08-27 | Metron Systems, Inc. | Profiling of a component having reduced sensitivity to anomalous off-axis reflections |
DE19952592C1 (en) | 1999-11-02 | 2001-05-10 | Hommelwerke Gmbh | Workpiece surface contour scanning sensor, has movement of sensor point contacting workpiece surface converted into acceleration data via acceleration transducer |
CN1115546C (en) * | 1999-12-29 | 2003-07-23 | 宝山钢铁股份有限公司 | Surface three-dimensional appearance testing method and equipment |
US6505140B1 (en) | 2000-01-18 | 2003-01-07 | Intelligent Automation, Inc. | Computerized system and method for bullet ballistic analysis |
US6785634B2 (en) * | 2000-01-18 | 2004-08-31 | Intelligent Automation, Inc. | Computerized system and methods of ballistic analysis for gun identifiability and bullet-to-gun classifications |
US6600168B1 (en) * | 2000-02-03 | 2003-07-29 | Genex Technologies, Inc. | High speed laser three-dimensional imager |
WO2001069172A1 (en) * | 2000-03-10 | 2001-09-20 | Perceptron, Inc. | A non-contact measurement device |
US7084869B2 (en) * | 2000-03-31 | 2006-08-01 | Massachusetts Institute Of Technology | Methods and apparatus for detecting and correcting penetration between objects |
US6804380B1 (en) * | 2000-05-18 | 2004-10-12 | Leica Geosystems Hds, Inc. | System and method for acquiring tie-point location information on a structure |
US6771840B1 (en) * | 2000-05-18 | 2004-08-03 | Leica Geosystems Hds, Inc. | Apparatus and method for identifying the points that lie on a surface of interest |
US6689998B1 (en) * | 2000-07-05 | 2004-02-10 | Psc Scanning, Inc. | Apparatus for optical distancing autofocus and imaging and method of using the same |
WO2002003858A1 (en) * | 2000-07-12 | 2002-01-17 | Cornell Research Foundation, Inc. | Method and system for analyzing multi-variate data using canonical decomposition |
US7359748B1 (en) | 2000-07-26 | 2008-04-15 | Rhett Drugge | Apparatus for total immersion photography |
US6724489B2 (en) * | 2000-09-22 | 2004-04-20 | Daniel Freifeld | Three dimensional scanning camera |
JP2002118730A (en) * | 2000-10-06 | 2002-04-19 | Citizen Electronics Co Ltd | Handy scanner |
EP1211481A3 (en) * | 2000-11-29 | 2004-05-19 | microSystems GmbH | Control device to check the contours and/or the position of constructional elements |
US6678062B2 (en) * | 2000-12-08 | 2004-01-13 | Cyberoptics Corporation | Automated system with improved height sensing |
US6621063B2 (en) * | 2001-06-21 | 2003-09-16 | Psc Scanning, Inc. | Omni-directional optical code reader using scheimpflug optics |
AUPR631601A0 (en) * | 2001-07-11 | 2001-08-02 | Commonwealth Scientific And Industrial Research Organisation | Biotechnology array analysis |
AU2002332967B2 (en) * | 2001-10-17 | 2008-07-17 | Commonwealth Scientific And Industrial Research Organisation | Method and apparatus for identifying diagnostic components of a system |
JP2003207324A (en) * | 2002-01-16 | 2003-07-25 | Olympus Optical Co Ltd | Method and device for acquiring three-dimensional information |
JP4043258B2 (en) * | 2002-03-13 | 2008-02-06 | オリンパス株式会社 | 3D imaging device |
ITVR20020065A1 (en) * | 2002-06-12 | 2003-12-12 | Roncari S R L | FORCES CONTROL AND COMMAND DEVICE FOR THE TIGHTENING OF LOADS TO BE TRANSPORTED BY LIFT TRUCKS OR SIMILAR. |
US7324132B2 (en) * | 2003-05-06 | 2008-01-29 | Hewlett-Packard Development Company, L.P. | Imaging three-dimensional objects |
WO2004111927A2 (en) | 2003-06-13 | 2004-12-23 | UNIVERSITé LAVAL | Three-dimensional modeling from arbitrary three-dimensional curves |
US7526118B2 (en) * | 2003-08-06 | 2009-04-28 | Quality Vision International, Inc | Digital video optical inspection apparatus and method for vertically sectioning an object's surface |
EP1555506A1 (en) * | 2004-01-14 | 2005-07-20 | Wow Company S.A. | Imaging optical device ensuring focus independently of the object position |
EP1574817A1 (en) * | 2004-03-10 | 2005-09-14 | Diener&AG&Precision&Machining | Method ans system for scanning three-dimensional objects and holder for objects |
US7711179B2 (en) | 2004-04-21 | 2010-05-04 | Nextengine, Inc. | Hand held portable three dimensional scanner |
US7697748B2 (en) | 2004-07-06 | 2010-04-13 | Dimsdale Engineering, Llc | Method and apparatus for high resolution 3D imaging as a function of camera position, camera trajectory and range |
US7236235B2 (en) | 2004-07-06 | 2007-06-26 | Dimsdale Engineering, Llc | System and method for determining range in 3D imaging systems |
US7212949B2 (en) * | 2004-08-31 | 2007-05-01 | Intelligent Automation, Inc. | Automated system and method for tool mark analysis |
DE102004044999A1 (en) * | 2004-09-16 | 2006-04-06 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Input control for devices |
US7489408B2 (en) * | 2005-11-15 | 2009-02-10 | General Electric Company | Optical edge break gage |
US7843448B2 (en) * | 2005-11-21 | 2010-11-30 | Leica Geosystems Ag | Identification of occluded edge regions from 3D point data |
US7995054B2 (en) * | 2005-11-21 | 2011-08-09 | Leica Geosystems Ag | Identification of edge regions from 3D point data |
DE102006031833A1 (en) * | 2006-05-24 | 2007-12-06 | Dr. Wirth Grafische Technik Gmbh & Co. Kg | Method for generating image information |
US7256899B1 (en) * | 2006-10-04 | 2007-08-14 | Ivan Faul | Wireless methods and systems for three-dimensional non-contact shape sensing |
US9767599B2 (en) * | 2006-12-29 | 2017-09-19 | X-Rite Inc. | Surface appearance simulation |
EP2026034B1 (en) * | 2007-08-16 | 2020-04-29 | Carl Zeiss Optotechnik GmbH | Device for determining the 3D coordinates of an object, in particular a tooth |
AT506110B1 (en) * | 2007-12-12 | 2011-08-15 | Nextsense Mess Und Pruefsysteme Gmbh | DEVICE AND METHOD FOR DETECTING BODY MEASURE DATA AND CONTOUR DATA |
WO2009093162A1 (en) | 2008-01-24 | 2009-07-30 | Koninklijke Philips Electronics N.V. | Sensor device with tilting or orientation-correcting photo sensor for atmosphere creation |
JP5175600B2 (en) * | 2008-04-09 | 2013-04-03 | 株式会社日立ハイテクノロジーズ | Inspection device |
US20090259399A1 (en) * | 2008-04-15 | 2009-10-15 | Caterpillar Inc. | Obstacle detection method and system |
CN101561267B (en) * | 2008-04-16 | 2013-06-05 | 鸿富锦精密工业(深圳)有限公司 | Distance-measuring device and method |
DE102008031064A1 (en) * | 2008-07-01 | 2010-01-14 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Mobile device for generating three-dimensional recording of object, has evaluation unit for determining three-dimensional recording based on acquired acceleration and generated profile data |
DE102008039838B4 (en) | 2008-08-27 | 2011-09-22 | Deutsches Zentrum für Luft- und Raumfahrt e.V. | Method for scanning the three-dimensional surface of an object by means of a light beam scanner |
WO2010070383A1 (en) * | 2008-12-16 | 2010-06-24 | Sabanci Universitesi | A 3d scanner |
CA2957077C (en) * | 2009-06-01 | 2019-07-02 | Gerd Hausler | Method and device for three-dimensional surface detection with a dynamic reference frame |
DK2442720T3 (en) | 2009-06-17 | 2016-12-19 | 3Shape As | Focus scan devices |
FR2950138B1 (en) * | 2009-09-15 | 2011-11-18 | Noomeo | QUICK-RELEASE THREE-DIMENSIONAL SCANNING METHOD |
EP2564156B1 (en) * | 2010-04-26 | 2019-04-17 | Nikon Corporation | Profile measuring apparatus |
DE102010018979A1 (en) * | 2010-05-03 | 2011-11-03 | Steinbichler Optotechnik Gmbh | Method and device for determining the 3D coordinates of an object |
EP2400261A1 (en) * | 2010-06-21 | 2011-12-28 | Leica Geosystems AG | Optical measurement method and system for determining 3D coordination in a measuring object surface |
US9191648B2 (en) | 2011-02-22 | 2015-11-17 | 3M Innovative Properties Company | Hybrid stitching |
US8687172B2 (en) | 2011-04-13 | 2014-04-01 | Ivan Faul | Optical digitizer with improved distance measurement capability |
US20120045330A1 (en) * | 2011-07-29 | 2012-02-23 | General Electric Company | System and method for monitoring and controlling physical structures |
DE102011053212B3 (en) * | 2011-09-02 | 2012-10-04 | Sick Ag | Optoelectronic sensor and method for detecting objects in a surveillance area |
US9107613B2 (en) * | 2011-09-06 | 2015-08-18 | Provel, Inc. | Handheld scanning device |
US9524268B2 (en) * | 2011-10-31 | 2016-12-20 | University of Florida Research Foundation, Inc. | Vestibular dynamic inclinometer |
US9404792B2 (en) * | 2013-07-01 | 2016-08-02 | Raytheon Company | Auto-alignment system for high precision masted head mirror |
DE102013110581B4 (en) | 2013-09-24 | 2018-10-11 | Faro Technologies, Inc. | Method for optically scanning and measuring an environment and device therefor |
WO2015118120A1 (en) | 2014-02-07 | 2015-08-13 | 3Shape A/S | Detecting tooth shade |
DE102014207896A1 (en) * | 2014-04-28 | 2015-10-29 | Robert Bosch Gmbh | 3D coarse laser scanner |
WO2016099321A1 (en) * | 2014-12-19 | 2016-06-23 | Андрей Владимирович КЛИМОВ | Method for checking the linear dimensions of three-dimensional objects |
WO2016113745A1 (en) * | 2015-01-18 | 2016-07-21 | Dentlytec G.P.L. Ltd | System, device, and method for dental intraoral scanning |
WO2017125926A2 (en) | 2016-01-18 | 2017-07-27 | Dentlytec G.P.L. Ltd | Intraoral scanner |
EP3288486B1 (en) | 2015-05-01 | 2020-01-15 | Dentlytec G.P.L. Ltd. | System for dental digital impressions |
AU2016332738B2 (en) | 2015-09-29 | 2022-06-02 | Haemonetics Corporation | System and method for imaging a rotating object |
DE102015118986A1 (en) * | 2015-11-05 | 2017-05-11 | Anix Gmbh | Test pit measuring system for the optical measurement of a test pit surface, method for optical measurement of a test pit surface with such a test pit measuring system and use of such Prüfgrubenmesssystems |
US10557942B2 (en) * | 2016-06-07 | 2020-02-11 | DSCG Solutions, Inc. | Estimation of motion using LIDAR |
WO2018047180A1 (en) | 2016-09-10 | 2018-03-15 | Ark Surgical Ltd. | Laparoscopic workspace device |
WO2019008586A1 (en) | 2017-07-04 | 2019-01-10 | Dentlytec G.P.L. Ltd | Dental device with probe |
US11690701B2 (en) | 2017-07-26 | 2023-07-04 | Dentlytec G.P.L. Ltd. | Intraoral scanner |
IL294778B2 (en) | 2017-10-06 | 2023-10-01 | Advanced Scanners Inc | Generation of one or more edges of luminosity to form three-dimensional models of objects |
DE102018121573B3 (en) * | 2018-09-04 | 2019-12-24 | Mühlbauer Gmbh & Co. Kg | INSPECTION SYSTEM FOR INSPECTING A COVER SURFACE OF A THREE-DIMENSIONAL TEST OBJECT, AND THEIR USE AND RELATED METHOD |
JP7193308B2 (en) * | 2018-11-09 | 2022-12-20 | 株式会社キーエンス | Profile measuring device |
DE102019200664B3 (en) * | 2019-01-18 | 2020-03-26 | Micro-Epsilon Optronic Gmbh | Sensor arrangement and method for measuring a measurement object |
US10955241B2 (en) | 2019-06-26 | 2021-03-23 | Aurora Flight Sciences Corporation | Aircraft imaging system using projected patterns on featureless surfaces |
JP7286573B2 (en) * | 2020-03-12 | 2023-06-05 | 株式会社日立エルジーデータストレージ | Ranging device and ranging method |
CN111383332B (en) * | 2020-03-26 | 2023-10-13 | 深圳市菲森科技有限公司 | Three-dimensional scanning and reconstruction system, computer device and readable storage medium |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2045938A (en) * | 1979-03-09 | 1980-11-05 | Newall Electronics Ltd | Dimension measuring device |
US5198877A (en) * | 1990-10-15 | 1993-03-30 | Pixsys, Inc. | Method and apparatus for three-dimensional non-contact shape sensing |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4648717A (en) * | 1984-02-06 | 1987-03-10 | Robotic Vision Systems, Inc. | Method of three-dimensional measurement with few projected patterns |
US4794262A (en) * | 1985-12-03 | 1988-12-27 | Yukio Sato | Method and apparatus for measuring profile of three-dimensional object |
GB8706388D0 (en) * | 1987-03-18 | 1987-04-23 | Meta Machines Ltd | Position sensing method |
JPH0749937B2 (en) * | 1988-03-22 | 1995-05-31 | 工業技術院長 | Shape measurement method |
DE4037273C2 (en) * | 1989-12-01 | 2003-07-31 | Leica Geosystems Ag | Device for photogrammetric measurement of an object |
US5259037A (en) * | 1991-02-07 | 1993-11-02 | Hughes Training, Inc. | Automated video imagery database generation using photogrammetry |
JP3242704B2 (en) * | 1992-03-19 | 2001-12-25 | 宏介 佐藤 | Distance measuring method and device |
DE4218219C2 (en) * | 1992-06-03 | 1998-05-07 | Geyer Medizin Und Fertigungste | Device for the contactless measurement of a difficult to access, three-dimensional medical or dental object |
WO1994015173A1 (en) * | 1992-12-18 | 1994-07-07 | 3D Scanners Ltd. | Scanning sensor |
JP3175393B2 (en) * | 1993-03-08 | 2001-06-11 | ソニー株式会社 | Distance measuring method and device |
JP3282332B2 (en) * | 1993-12-20 | 2002-05-13 | ミノルタ株式会社 | Image input system |
JP3178205B2 (en) * | 1993-12-20 | 2001-06-18 | ミノルタ株式会社 | Image input system |
US5557397A (en) * | 1994-09-21 | 1996-09-17 | Airborne Remote Mapping, Inc. | Aircraft-based topographical data collection and processing system |
KR100201739B1 (en) * | 1995-05-18 | 1999-06-15 | 타테이시 요시오 | Method for observing an object, apparatus for observing an object using said method, apparatus for measuring traffic flow and apparatus for observing a parking lot |
-
1995
- 1995-07-25 GB GB9515247A patent/GB2292605B/en not_active Expired - Fee Related
- 1995-08-22 WO PCT/GB1995/001994 patent/WO1996006325A1/en active IP Right Grant
- 1995-08-22 EP EP95929175A patent/EP0805948B1/en not_active Expired - Lifetime
- 1995-08-22 AU AU32636/95A patent/AU715218B2/en not_active Ceased
- 1995-08-22 CN CN95194680A patent/CN1066259C/en not_active Expired - Fee Related
- 1995-08-22 CA CA002198124A patent/CA2198124A1/en not_active Abandoned
- 1995-08-22 AT AT95929175T patent/ATE209334T1/en not_active IP Right Cessation
- 1995-08-22 DE DE69524122T patent/DE69524122D1/en not_active Expired - Lifetime
- 1995-08-22 NZ NZ291492A patent/NZ291492A/en unknown
- 1995-08-22 JP JP8507885A patent/JPH10510352A/en active Pending
- 1995-08-22 KR KR1019970701182A patent/KR970705737A/en not_active Application Discontinuation
-
1997
- 1997-02-24 US US08/804,920 patent/US5850289A/en not_active Expired - Fee Related
-
1998
- 1998-07-09 US US09/112,899 patent/US5912739A/en not_active Expired - Fee Related
-
1999
- 1999-05-10 US US09/307,804 patent/US6128086A/en not_active Expired - Fee Related
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2045938A (en) * | 1979-03-09 | 1980-11-05 | Newall Electronics Ltd | Dimension measuring device |
US5198877A (en) * | 1990-10-15 | 1993-03-30 | Pixsys, Inc. | Method and apparatus for three-dimensional non-contact shape sensing |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0840880B1 (en) * | 1995-07-26 | 2002-03-13 | Stephen James Crampton | Scanning apparatus and method |
USRE43895E1 (en) | 1995-07-26 | 2013-01-01 | 3D Scanners Limited | Scanning apparatus and method |
US7313264B2 (en) | 1995-07-26 | 2007-12-25 | 3D Scanners Limited | Scanning apparatus and method |
US6611617B1 (en) | 1995-07-26 | 2003-08-26 | Stephen James Crampton | Scanning apparatus and method |
US6044170A (en) * | 1996-03-21 | 2000-03-28 | Real-Time Geometry Corporation | System and method for rapid shape digitizing and adaptive mesh generation |
US5870220A (en) * | 1996-07-12 | 1999-02-09 | Real-Time Geometry Corporation | Portable 3-D scanning system and method for rapid shape digitizing and adaptive mesh generation |
US6262739B1 (en) | 1996-10-16 | 2001-07-17 | Real-Time Geometry Corporation | System and method for computer modeling of 3D objects or surfaces by mesh constructions having optimal quality characteristics and dynamic resolution capabilities |
US6392647B1 (en) | 1996-10-16 | 2002-05-21 | Viewpoint Corporation | System and method for computer modeling of 3D objects or surfaces by mesh constructions having optimal quality characteristics and dynamic resolution capabilities |
US6611267B2 (en) | 1996-10-16 | 2003-08-26 | Viewpoint Corporation | System and method for computer modeling of 3D objects or surfaces by mesh constructions having optimal quality characteristics and dynamic resolution capabilities |
US5945996A (en) * | 1996-10-16 | 1999-08-31 | Real-Time Geometry Corporation | System and method for rapidly generating an optimal mesh model of a 3D object or surface |
US5886702A (en) * | 1996-10-16 | 1999-03-23 | Real-Time Geometry Corporation | System and method for computer modeling of 3D objects or surfaces by mesh constructions having optimal quality characteristics and dynamic resolution capabilities |
US6549288B1 (en) | 1998-05-14 | 2003-04-15 | Viewpoint Corp. | Structured-light, triangulation-based three-dimensional digitizer |
US6356263B2 (en) | 1999-01-27 | 2002-03-12 | Viewpoint Corporation | Adaptive subdivision of mesh models |
US7065242B2 (en) | 2000-03-28 | 2006-06-20 | Viewpoint Corporation | System and method of three-dimensional image capture and modeling |
US7453456B2 (en) | 2000-03-28 | 2008-11-18 | Enliven Marketing Technologies Corporation | System and method of three-dimensional image capture and modeling |
US7474803B2 (en) | 2000-03-28 | 2009-01-06 | Enliven Marketing Technologies Corporation | System and method of three-dimensional image capture and modeling |
US9628779B2 (en) | 2011-05-19 | 2017-04-18 | Hexagon Technology Center Gmbh | Optical measurement method and measurement system for determining 3D coordinates on a measurement object surface |
Also Published As
Publication number | Publication date |
---|---|
DE69524122D1 (en) | 2002-01-03 |
US6128086A (en) | 2000-10-03 |
AU715218B2 (en) | 2000-01-20 |
EP0805948B1 (en) | 2001-11-21 |
CN1163661A (en) | 1997-10-29 |
US5912739A (en) | 1999-06-15 |
GB2292605A (en) | 1996-02-28 |
US5850289A (en) | 1998-12-15 |
CA2198124A1 (en) | 1996-02-29 |
ATE209334T1 (en) | 2001-12-15 |
GB9515247D0 (en) | 1995-09-20 |
AU3263695A (en) | 1996-03-14 |
KR970705737A (en) | 1997-10-09 |
NZ291492A (en) | 1998-03-25 |
EP0805948A1 (en) | 1997-11-12 |
CN1066259C (en) | 2001-05-23 |
JPH10510352A (en) | 1998-10-06 |
GB2292605B (en) | 1998-04-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US5912739A (en) | 1999-06-15 | Scanning arrangement and method |
Gühring | | Dense 3D surface acquisition by structured light using off-the-shelf components |
JP5480914B2 (en) | | Point cloud data processing device, point cloud data processing method, and point cloud data processing program |
US9629551B2 (en) | | Detection of a movable object when 3D scanning a rigid object |
CA1313040C (en) | | Method and apparatus for measuring a three-dimensional curved surface shape |
US5753931A (en) | | Object imaging device and method using line striping |
US20150178412A1 (en) | | Three-dimensional coordinate scanner and method of operation |
JP3409873B2 (en) | | Object input device |
EP1680689B1 (en) | | Device for scanning three-dimensional objects |
Bradshaw | | Non-contact surface geometry measurement techniques |
Pfeifer et al. | | Automatic tie elements detection for laser scanner strip adjustment |
EP3975116B1 (en) | | Detecting displacements and/or defects in a point cloud using cluster-based cloud-to-cloud comparison |
Altuntas | | Triangulation and time-of-flight based 3D digitisation techniques of cultural heritage structures |
Prieto et al. | | Visual system for fast and automated inspection of 3D parts |
AU2891400A (en) | | Scanning arrangement and method |
Pfeifer et al. | | Early stages of LiDAR data processing |
JPH0713997B2 (en) | | Wafer alignment angle detection device |
Schutz et al. | | Free-form 3D object reconstruction from range images |
Kühmstedt et al. | | Multi-resolution optical 3D sensor |
Clark et al. | | Depth Sensing by Variable Baseline Triangulation |
Prokos et al. | | Design and evaluation of a photogrammetric 3D surface scanner |
Rodella et al. | | 3D shape recovery and registration based on the projection of non-coherent structured light |
KR101168636B1 (en) | | Apparatus and method for attribute extract about missing area of digital surface data |
CN115968344A (en) | | Navigation of underground mining machines |
Sansoni et al. | | Projection of Structured Light |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| WWE | Wipo information: entry into national phase | Ref document number: 95194680.3; Country of ref document: CN |
| AK | Designated states | Kind code of ref document: A1; Designated state(s): AM AT AU BB BG BR BY CA CH CN CZ DE DK EE ES FI GE HU IS JP KE KG KP KR KZ LK LR LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK TJ TM TT UA UG US UZ VN |
| AL | Designated countries for regional patents | Kind code of ref document: A1; Designated state(s): KE MW SD SZ UG AT BE CH DE DK ES FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN ML MR NE SN TD TG |
| DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
| WWE | Wipo information: entry into national phase | Ref document number: 2198124; Country of ref document: CA |
| WWE | Wipo information: entry into national phase | Ref document number: 08804920; Country of ref document: US; Ref document number: 1019970701182; Country of ref document: KR |
| WWE | Wipo information: entry into national phase | Ref document number: 291492; Country of ref document: NZ |
| WWE | Wipo information: entry into national phase | Ref document number: 1995929175; Country of ref document: EP |
| REG | Reference to national code | Ref country code: DE; Ref legal event code: 8642 |
| WWP | Wipo information: published in national office | Ref document number: 1019970701182; Country of ref document: KR |
| WWP | Wipo information: published in national office | Ref document number: 1995929175; Country of ref document: EP |
| WWW | Wipo information: withdrawn in national office | Ref document number: 1019970701182; Country of ref document: KR |
| WWG | Wipo information: grant in national office | Ref document number: 1995929175; Country of ref document: EP |