USRE43895E1 - Scanning apparatus and method


Info

Publication number
USRE43895E1
Authority
US
United States
Prior art keywords
data
scanner
light
timing
generate
Legal status
Expired - Fee Related
Application number
US12/647,319
Inventor
Stephen James Crampton
Current Assignee
3D Scanners Ltd
Original Assignee
3D Scanners Ltd
Family has litigation
First worldwide family litigation filed (source: the Darts-ip "Global patent litigation dataset", licensed under a Creative Commons Attribution 4.0 International License).
Application filed by 3D Scanners Ltd filed Critical 3D Scanners Ltd
Priority to US12/647,319
Application granted
Publication of USRE43895E1
Adjusted expiration
Status: Expired - Fee Related

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2518Projection by scanning of the object

Definitions

  • This invention relates to an apparatus and method for scanning a three-dimensional object.
  • Real-world, three-dimensional objects, whether of natural form (e.g., geographical, plant, human, or animal-like) or man-imagined form (e.g., sculptures, reliefs, cars, boats, planes, or consumer products), are difficult to scan. This is because of features such as rapidly varying surface normals and surfaces for which a line of sight is difficult to obtain because they are partially obscured by other parts of the object.
  • Scanning machines, also known as digitizing machines, for scanning objects or parts of objects can be categorized into two types: computer numerically controlled (CNC) and manually operated.
  • a scanning machine includes a unit that contains a sensing means commonly referred to as a probe.
  • Objects or parts of objects can be scanned on CNC scanning machines with a number of computer numerically controlled (CNC) linear and rotating motor-driven axes.
  • CNC machines can move/reorient the probe or the object—or both—by a combination of translation and rotation about these axes.
  • Different machine designs are suited to different classes of objects.
  • Probes can be temporarily or permanently attached to most types of CNC machine tool or CNC coordinate measuring machine that can then be used for scanning. As examples, small and simple three-axis CNC milling machines may be used or large, complex five-axis machines may be used.
  • the points captured by CNC machines are usually on a regular grid and the rate varies from around 1 point per second up to around 20,000 points per second, depending on the technology being used, and the object being scanned. The points from these scanning machines are accurate to the order of 0.05 mm.
  • CNC machines with probes scan by executing one or more programs that move the axes of the machine such that there is relative motion between the probe and the object.
  • CNC machines are expensive, partly because of the incorporation of motors and the associated equipment for assuring precision motion, such as linear guides and drive screws. Few CNC machines are flexible enough so that the probe can be oriented in six degrees of freedom so as to scan the complete surface of a complex object. Even when a CNC machine has six degrees of freedom, it is often not sufficiently flexible so as to position the probe to scan the complete surface of the object without colliding with the object. When the object is a person or expensive, the risk of using a CNC machine may be unacceptable, and there would be a necessity to make a machine to meet both the safety and scanning requirements of the application. The programming of a CNC machine so that the surface of the object is completely scanned without a collision of the probe or machine with the object is often highly complex.
  • the design of the machine and the degrees of freedom inherent in the design and limitations in the probe design, such as the standoff distance during scanning between the probe and the object, mean that it is impossible to come up with a scanning strategy that will scan the complete surface of the object. It is common that the object has to be manually picked up and replaced in a different position and/or orientation one or more times during scanning. Each time that this occurs, the object has to be reregistered to a uniform coordinate system such that the data from the different scans can be accurately combined.
  • Manually operated scanning machines can be categorized into three types—horizontal arm machines, multiply-jointed arms, and devices based on remote position-sensing means.
  • Manually driven, horizontal arm measuring machines usually have three orthogonal axes and are usually based on a traveling column design. These machines are usually quite large, with the bed being at floor level so large items such as cars can easily be moved onto and off of them. Often motors can be engaged on one or more axes to aid the manual movement of the machine.
  • the probe is normally mounted at a fixed orientation on the end of the horizontal arm. This orientation may be changed and various devices may be attached between the end of the horizontal arm and the probe to aid the changing of the orientation, most of these devices having two axes.
  • Horizontal arm machines have the disadvantage of not being able to easily orient the probe in six degrees of freedom. The limited flexibility in the design of a horizontal arm machine makes most of the far side of the object unscannable.
  • Multiply jointed arms commonly comprise multiple linkages and are available for scanning complex objects.
  • a multiply jointed arm typically has six joint axes, but may have more or fewer joint axes.
  • At the end of the multiply jointed arm there is usually a tip reference point—such as a sphere whose center is the reference point, or a cone ending in a point.
  • Scanning is carried out by bringing the point or sphere into contact with the object being scanned.
  • the computer monitoring the multiply jointed arm measures the angles at all the joints of the multiply jointed arm and calculates the position of that reference point in space.
  • the direction of the last link in the multiply jointed arm is also calculated.
  • Positions can typically be output continuously at a rate of around 100 points per second, but the rate can be much more or much less.
  • the accuracy is of the order of 0.1 to 0.5 mm.
  • the points from the arm are usually sparse and unorganized. The sparseness and lack of organization of the points make it difficult to provide enough information for constructing a computer model of the object that is of acceptable quality.
  • a multiply jointed arm with multiple linkages has a limited working volume.
  • as the working volume increases, the arms become very expensive, less accurate, and more tiring and difficult to operate.
  • the limited working volume can be increased by leapfrogging, in which the whole arm/base is moved to access another volume; but this requires a time-consuming system of registering at least three points each time the arm is moved and recombining the data sets from each arm position.
  • Manufacturers of multiply jointed arms provide precalibrated arms and test methods that the user may employ to make sure that the arm is still calibrated to an acceptable accuracy.
  • Such test methods use, for example, the standard tip reference point at the end of the arm and a reference sphere or a ball-bar, which is a rod with two cylindrical cups that has a precisely known distance between a home ball and an end-of-arm ball.
  • the arm tip at the end of the ball bar is moved on the surface of a spherical domain, the arm records positions that are later compared to a perfect sphere, and error estimates for the arm are output.
  • Remote position-sensing devices include hand-held devices that transmit or receive position information in a calibrated reference volume using different physical methods, including electromagnetic pulses and sound waves.
  • a hand-held device may be connected to the rest of the system by means of a cable. These devices are prone to generating scanned points with very large errors. Some devices cannot work when the object being scanned has metallic components. They are less accurate than multiply jointed arms, with accuracies of the order of 0.5 mm upwards.
  • Point probes measure a single point at a time and technologies include mechanical contact methods and optical distance measurement methods.
  • Stripe probes measure a number of points in a line, either simultaneously or rapidly in a scanned sequence. The most common stripe technology is laser stripe triangulation.
  • Area probes measure a two-dimensional array of points on a surface, either simultaneously or in a scanned sequence. The most common technologies are interference fringe and multiple stripe projection. Some area methods require the device to be held still for a few seconds during data capture. Stripe and area methods have an in-built speed advantage over point methods, as there is less motion of the probe relative to the object. There are differences between the methods in terms of accuracy and cost, but these do not generalize with category. For example, a particular area technology may be cheaper and more accurate than another point technology.
  • Means of capturing independent reference/feature points by contact are well known and efficient. Structured light using stripe or area methods is not good at capturing independent feature points because there is no way for the operator to align a known point on the object with a point on the stripe or in the area.
  • Geometrical errors in the scanning process stem from many sources. CCD cameras can typically capture video at 25 frames per second.
  • One major disadvantage in normal use is that, from any given demand for a frame, there is a variability of up to 40 msecs until the start of capture of that frame. If the probe is being moved at, for example, 100 mm/sec, this can lead to a geometrical error of 4 mm in the probe's data. The duration of frame capture depends upon the shutter speed; e.g., a shutter speed of 1/100 sec gives a capture duration of 10 msecs.
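  • As a worked check of the figures above (an illustrative calculation, not text from the patent), the worst-case timing error is simply probe speed multiplied by the frame-start variability:

```latex
\[
  e = v \,\Delta t
    = 100\ \tfrac{\text{mm}}{\text{s}} \times 0.040\ \text{s}
    = 4\ \text{mm}
\]
```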
  • One major disadvantage in normal use is that if the probe is being moved with a slow shutter speed, an additional geometrical error is created, because the surface moves relative to the probe during the exposure.
  • An arm is typically connected to a computer by a serial cable with arms typically generating positions at 125 positions per second as they move.
  • Point formats include independent points; lines of points where a plane intersects with a surface; 2.5D areas of points, commonly known as range images, that are single-valued in Z; and 3D point arrays, which are often generated by medical scanners.
  • the point formats have many standard representations including the Range Image Standard (RIS) resulting from the European ESPRIT Research & Development Project 6911, IGES, and DXF published by AutoDesk, Inc., in the USA.
  • Polygon formats include polygons of different geometrical forms. Polygons may be 3- or more sided and formats may include mixed numbers of sides or always the same number of sides. Special cases, such as Delaunay triangulation, can specify the positioning of vertices and the relative lengths of the sides of polygons. Standard representations of polygon formats include STL, published by 3D Systems, Inc., in the USA; IGES; OBJ, published by Wavefront, Inc., in the USA; and DXF.
  • Complex surface formats include Bezier, NURBS, and COONS patches.
  • Standard representations of complex surface formats include IGES, VDA-FS, SET, STEP, and DXF.
  • the objective of scanning can be simply to gather a number of three-dimensional points on the surface of the object, or it may be to create a computer model in a format that is useful for the application in which the model is to be used. It is generally true that a cloud of points alone is not much use in many applications and that more structure is needed to make a computer model efficient to manipulate in typical applications such as visualization, animation, morphing and surface or solid modeling.
  • any file, whatever its format, can be compressed using standard reversible utilities, such as PKZIP/PKUNZIP from PKWare in the USA.
  • an octet format can be used to reduce the size of the arrays that represent a surface.
  • An octet format splits a cubic volume into eight smaller cubes, and only further subdivides cubes by eight if they contain information.
  • An octet format is reversible. Moving from unstructured point representation to polygon or complex surface formats often produces large compressions but relies on approximations, so the process is nearly always irreversible.
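  • The following is a minimal sketch of the octet subdivision idea described above; the function name, point tuples, and termination depth are illustrative assumptions, not the patent's implementation:

```python
# Hypothetical sketch of the octet format: a cube is split into eight
# sub-cubes, and only sub-cubes containing points are split further.
# Points on the outermost max faces are ignored for brevity.

def subdivide(points, origin, size, depth=0, max_depth=6):
    """Recursively split a cube; returns a nested dict of occupied
    octants, with the surviving points stored at the leaves."""
    if not points:
        return None                      # empty cubes are not subdivided
    if depth == max_depth:
        return points                    # leaf: keep the points (reversible)
    half = size / 2.0
    node = {}
    for i in range(8):                   # octant index encodes x/y/z halves
        off = (origin[0] + half * (i & 1),
               origin[1] + half * ((i >> 1) & 1),
               origin[2] + half * ((i >> 2) & 1))
        inside = [p for p in points
                  if all(off[k] <= p[k] < off[k] + half for k in range(3))]
        child = subdivide(inside, off, half, depth + 1, max_depth)
        if child is not None:
            node[i] = child
    return node
```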
  • Chordal tolerancing is a commonly used method of reducing the quantity of discrete points in a 2D or 3D polyline.
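  • Chordal tolerancing can be sketched as follows; the patent does not specify an algorithm, so this uses a Douglas-Peucker-style recursion as one plausible illustration (the point representation and tolerance handling are assumptions):

```python
import math

def chordal_tolerance(points, tol):
    """Reduce a 3D polyline: keep the endpoints, and recursively keep
    any point whose distance from the chord between kept neighbors
    exceeds tol."""
    if len(points) < 3:
        return list(points)
    a, b = points[0], points[-1]
    ab = [b[k] - a[k] for k in range(3)]
    ab_len = math.sqrt(sum(c * c for c in ab)) or 1e-12

    def dist(p):
        # point-to-line distance = |AP x AB| / |AB|
        ap = [p[k] - a[k] for k in range(3)]
        cx = ap[1] * ab[2] - ap[2] * ab[1]
        cy = ap[2] * ab[0] - ap[0] * ab[2]
        cz = ap[0] * ab[1] - ap[1] * ab[0]
        return math.sqrt(cx * cx + cy * cy + cz * cz) / ab_len

    i, d = max(((j, dist(p)) for j, p in enumerate(points[1:-1], 1)),
               key=lambda t: t[1])
    if d <= tol:
        return [a, b]  # all intermediate points are within tolerance
    left = chordal_tolerance(points[:i + 1], tol)
    return left[:-1] + chordal_tolerance(points[i:], tol)
```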
  • the intermediate data structure does not record the orientation of each stripe; it does not record breaks in the data, but assumes that all the points are connected by a surface; and it does not record jumps in the data, such as those caused by occlusions.
  • the intermediate data structures are usually range images.
  • a number of unregistered range images may be registered, polygonized, and integrated together.
  • the raw data is a number of range images of an object—typically from 5 to 20 in number—with each one either being a cylindrical or a linear range image.
  • the process is not automatic and requires a combination of operator guidance and automated execution of algorithms. The operator first tries to align (i.e., register) the range images to each other on the computer using a graphics display. This process is not accurate and is followed by an automatic least squares fitting process that attempts to adjust the position and orientation of each range image such that they fit together as well as possible. This process is lengthy, often taking hours on a powerful computer.
  • Each range image is then independently polygonized into a network of 2.5D triangular polygons. Finally, the networks of triangular polygons are integrated. The output is a single 3D polygon data set.
  • the process is expensive, both in terms of capital equipment cost and people time. It can take up to two years to become skilled enough to scan objects to produce good enough models. It can work and produce good results for detailed objects.
  • a coordinate measuring machine with a contact tip reference point is commonly used. It is usual to mark up the object with the desired surface patch boundaries by using a marking device, such as a pen or a scribe. These patch boundaries are then hand digitized with the contact point probe.
  • the software package then generates a CNC scanning program that automatically takes more points along the boundaries and inside the patches.
  • the software then automatically generates a first attempt at the surface model. This method is used because it is quicker and easier for the operator to define patch boundaries that will lead to a surface model with the desired structure before scanning than to define the patch boundaries after scanning using a software package on a computer with a display showing the scanned points. It can take several days, and often weeks, to create patch boundaries that are usually splines, then create the patches, and then trim the patches to form a surface model by using only the scanned points and a computer.
  • Scanned data points have been displayed in real-time.
  • the display of points has the disadvantage of easily becoming confusing to interpret, and also that the observer does not know when parts of the object's surface have been missed during scanning.
  • a scanning apparatus for scanning an object to provide a computer model thereof, comprising means for scanning the object to capture data from a plurality of points on the surface of the object, where the scanning means captures data from two or more points simultaneously; means for generating intermediate data structures therefrom; means for combining the intermediate data structures to provide the model; means for display; and means for manually operating the scanning apparatus.
  • the apparatus greatly reduces the time and cost of generating a computer model from a real-world object by means of scanning, with both time and cost reductions of an order of magnitude achieved over conventional techniques.
  • the model is generated automatically from the intermediate data in a form that may be immediately usable in a wide range of applications.
  • the scanning means may use structured light to more quickly scan the surface of the object.
  • the scanning means may also be operable to sense the color of the surface of the object, resulting in a model more like the real-world object.
  • the scanning means therein comprises means for generating a signal for scanning the object, signal detection means for detecting the signal reflected from the object, and means operable in response to the detected signal to provide the data for the intermediate data structure.
  • the structured light is preferably projected as a plane of light such that a stripe is formed on a viewing plane that is situated normal to the projection axis of the signal generating means and situated at the average standoff distance from the signal generating means.
  • the structured light may be projected such that a pattern is formed on an area of a viewing plane that is situated normal to the projection axis of the signal generating means and situated at the average standoff distance from the signal generating means.
  • the signal generating means may be an illumination source such as a laser diode or one or more bulbs.
  • the scanning means can be mounted on many different types of manual machines, giving enhanced flexibility for objects ranging in size from small to very large.
  • the scanning means can be mounted on a multiply jointed arm for accurate scanning.
  • the scanning means may be a self-contained unit that contains a remote position sensor and incorporates a display to give the most flexibility in scanning.
  • the color data is also captured from the object and then mapped on to the model.
  • the data is displayed simultaneously as a plurality of display polygons.
  • FIG. 1 is a schematic representation of a scanning apparatus according to the invention
  • FIG. 2 is a schematic perspective drawing of a probe
  • FIG. 3(a) illustrates a first embodiment of the configuration of the optical elements housed in a probe
  • FIG. 3(b) illustrates a lamp configuration
  • FIG. 3(c) illustrates an alternative to the lamp configuration of FIG. 3(b);
  • FIG. 3(d) is a graph illustrating intensity as a function of distance along the line A1-A2 of FIG. 3(a);
  • FIG. 4 illustrates a second embodiment of the configuration of the optical elements housed in a probe
  • FIGS. 5(a) to 5(d) illustrate a method of calibrating the color of the scanning apparatus of FIG. 1;
  • FIG. 6 is a schematic block diagram illustrating the capture of color and position data
  • FIG. 7 is a schematic representation illustrating how the detection of the color and position data is synchronized
  • FIG. 8 is a schematic representation of the end of the multiply jointed arm of the apparatus of FIG. 1;
  • FIG. 9 is a schematic illustration of the turntable and multiply-jointed arm of the apparatus of FIG. 1;
  • FIG. 10 illustrates the mounting of the probe on the multiply jointed arm
  • FIG. 11 illustrates the alignment of the mount on the multiply jointed arm
  • FIG. 12 illustrates the alignment of the probe on the multiply jointed arm
  • FIG. 13 illustrates a linear range image
  • FIGS. 14(a)-14(b) illustrate cylindrical range images
  • FIG. 15 illustrates the range image placing method
  • FIG. 16 illustrates the surface normal extension method
  • FIG. 17 represents the structure of a single point in a range image
  • FIG. 18 illustrates the representation of an object by three range images
  • FIG. 19 illustrates the range image updating method
  • FIGS. 20(a)-20(b) illustrate first and second stripes captured on a CCD array
  • FIGS. 20(c)-20(f) illustrate the respective captured data points and strings of data points from the first and second stripes of FIGS. 20(a) and 20(b);
  • FIG. 20(g) illustrates polygons generated from these strings
  • FIG. 21 illustrates a probe mounted on a head and a heads-up display
  • FIG. 22 illustrates color image mapping
  • FIG. 23 illustrates the timing for position interpolation
  • FIG. 24 illustrates the triggering of the arm position measurement
  • FIG. 25 illustrates an object with marked lines thereon
  • FIG. 26(a) illustrates a probe mounted on a multiply jointed arm, which is mounted on a horizontal arm machine
  • FIG. 26(b) illustrates two opposing horizontal arm machines
  • FIG. 27 illustrates a human foot being scanned
  • FIG. 28(a) illustrates stripe sections of a pipe network and panel
  • FIG. 28(b) illustrates partial polygon models of a pipe network and panel
  • FIG. 28(c) illustrates extrapolated polygon model of a pipe network
  • FIG. 29(a) illustrates stripe scanning
  • FIG. 29(b) illustrates area scanning
  • a scanning apparatus 100 comprises a multiply-jointed arm 1 with an arm control unit 2 and a probe 3 .
  • the control unit 2 , which includes a processing unit 10 , is coupled to a computer or processing unit 4 and color monitor 7 .
  • the probe 3 is also coupled to a probe control unit 5 that is likewise coupled to the computer 4 .
  • the intermediate data is displayed on the color monitor 7 as rendered polygons 13 .
  • the probe 3 provides a stripe 8 , which is projected onto an object 9 positioned on a turntable 14 .
  • the stripe 8 is in the form of a plane of light. Buttons 6 are also provided to control data capture.
  • a color frame grabber 11 in the computer 4 is mounted on a standard bus 12 and coupled to the probe 3 .
  • the computer 4 , probe control unit 5 , arm control unit 2 , buttons 6 , color frame grabber 11 , and monitor 7 are provided separately.
  • the computer 4 and monitor 7 may be a personal computer and VDU, although for certain applications, it may be more convenient for one or all of them to be provided on the probe 3 .
  • the multiply jointed arm 1 and the probe 3 are coupled to the computer 4 by means of the control units 2 , 5 discussed above.
  • the computer 4 receives information from the scanning stripe 8 , the position/orientation of the arm 1 in terms of X,Y,Z coordinates together with the coordinates I,J,K of the surface normal of the probe 3 , and color data if required.
  • the probe 3 is lightweight and resilient so as to withstand being knocked without losing its calibration.
  • the structured light is preferably projected as a plane of light 364 such that a stripe 8 is formed on a viewing plane 360 that is situated normal to the projection axis 361 of the signal generating means 362 and situated at the average standoff distance S from the signal generating means.
  • the structured light may be projected such that a pattern 363 is formed on an area 365 of a viewing plane 360 that is situated normal to the projection axis 361 of the signal generating means 362 , and situated at the average standoff distance S from the signal generating means.
  • the pattern 363 in this example is a number of stripes that may be of different colors.
  • the probe 3 comprises a number of components mounted on a base plate 20 .
  • a stripe generator 22 , for example one containing a laser diode, provides the stripe 8 for projection onto the object 9 to be scanned.
  • the laser will be of Class 2 or less, according to the CDRH 1040.11 classification in the USA, viz. less than 1 mW in power at 670 nm wavelength.
  • the stripe 8 is nominally focused at some point P.
  • a lens assembly 24 is used to focus the image onto a high resolution CCD camera 25 .
  • the camera 25 may be oriented at an angle satisfying the Scheimpflug condition, so that the plane of the projected stripe remains in focus throughout the measuring depth.
  • An optical interference notch filter 26 is used to selectively image light of the wavelength of the stripe 8 .
  • a simple glass cut-off filter 27 reduces ambient light within the probe.
  • Information on the color of the surface of the object may be recorded in intensity scales or color scales, such as RGB.
  • An intensity scale estimate of the color of the surface may be obtained by recording the reflected light level of the stripe as it is imaged on the high resolution CCD camera 25 at each point.
  • a high level indicates a light surface at that point that scatters much of the projected light
  • a low level indicates a dark surface at that point that absorbs much of the projected light.
  • a color estimate of the color of the surface can be obtained by means of a color camera 29 comprising a color CCD array.
  • Color scanning requires that the object be lit. Lighting may be by means of ambient light from external light sources or by lamps situated on the probe. There are several disadvantages in using ambient light only for color scanning. First, ambient light intensity varies over nearly every object in a standard environment such as a room with overhead lighting. Second, it can be a time-consuming procedure to position a number of lights so as to evenly light the object. Third, the probe itself can cast shadows onto the object.
  • lamps 28(a)-28(d) are provided around the lens 31 of the camera 29 for illumination, or a ring lamp 28′ could be used. This configuration is used to avoid any problems of shadowing.
  • the lamps may include respective back reflectors 32a-32d, where appropriate.
  • the lamps are set to give an average intensity of around 80-150 Lux, but the intensity could be much more or less and, during use, ambient light is reduced significantly below this level—for example, by dimming or switching off overhead lights. This removes any effects from variations in ambient light.
  • the lamps may be tilted with respect to the camera axis to ensure that light of a more even intensity is projected onto the object 9 at the average scanning standoff distance.
  • the lamps should be small in size to obtain the least possible weight penalty, especially if two or more lamps are used. To extend their life, they can be operated at a lower voltage at certain time periods, for example while preparing for the capture of each image. When the operator triggers a color image capture, the voltage to the lamps can be momentarily increased to maximum. The lamps are only switched on for color capture. During the process of 3D capture, the lamps are switched off, which also has the added advantage of increasing the signal to noise ratio. An access panel 35 can be provided over the lamps so that the lamps can be easily replaced without opening the probe and risking losing its calibration. To improve results when scanning a reflective surface, polarizing material 34 is placed between the camera 29 , lamps 28a-28d, and the object 9 . To reduce variations in the projected light, diffusers 33a-33d are placed in the light path between each lamp and the object 9 or, alternatively, the lamp glass or back reflectors are treated accordingly.
  • a removable two-piece cover 21 is mounted on the base plate 20 to define a housing for the components of the probe 3 , both to exclude ambient light and to protect the components. It may have a metallic interior coating to reduce electromagnetic emissions and susceptibility.
  • the cover is appropriate for both embodiments.
  • the probe 3 in the second embodiment can also capture the color of the surface of the object 9 .
  • the color is captured using an optical system that is coplanar with the light stripe 8 .
  • the main advantage of coplanarity is that it provides the color of each point directly, whereas non-coplanar systems, such as in the first embodiment, require extensive post-processing computation to map the captured color data onto the captured 3D data.
  • the whereabouts of the stripe in the color camera is significantly variable due to the non-alignment of the two cameras, leading to an embodiment that is operated for scanning in two passes—3D and color—instead of the one pass that is achievable from the first embodiment.
  • a color sample 32 of the object 9 is directed back along the direction where the stripe 8 would be if the stripe generator were illuminated; it is reflected by a half-silvered mirror 30 and focused onto the color camera 29 via a lens assembly 31 .
  • the color and position can be captured synchronously.
  • the environment would need to be dark in this case, and lighting 28 would not be required.
  • a look-up method is provided for determining where to read the color from the rectangular array of the color camera 29 , depending on the object distance S at all points along the stripe 8 .
  • a color camera look-up table 166 is prepared by viewing the stripe—when it is permanently illuminated—at a sufficient number of points in a rectangular array covering the distance measuring depth and the stripe measuring length.
  • a fixed-length, flat object 161 is one item that can be used for this purpose, and it typically has a white surface.
  • the flat object 161 is placed on a light stripe absorbent background 162 .
  • the probe 3 and the flat object 161 move relative to each other in the direction of the W axis such that stripes 163 are imaged.
  • the color camera image 165 shows the imaged stripes 163a collected in a region 164 of the array. In the case of perfect coplanarity, the imaged stripes will be superimposed (see 163b in FIG. 5(c) ).
  • the look-up table 166 is then built up so that a scanned point 167 on an object with coordinates V1,W1 will have a color image position Cx,Cy in the look-up table 166 , which stores an array of values of Cx,Cy for the V,W ranges. During scanning, the color image is usually scanned before the stripe measurement.
  • the extent of the look-up table 166 can determine how much of the color image 165 needs to be stored while the points are being calculated, i.e., the extent of region 164 . This reduces the amount of memory needed in the computer 4 and the required bandwidth for transferring color information from the camera 29 into the computer 4 , allowing the possible use of lower cost units, such as the frame grabber 11 located in the probe control unit 5 , or in the computer 4 on a bus, as discussed above. It is probably not worth the expense of building a perfectly coplanar system, as a roughly coplanar system can be calibrated as described above to produce as effective a result.
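  • A minimal sketch of how the look-up table 166 might be represented and queried; the grid parameters and the nearest-cell rounding are assumptions for illustration, not details from the patent:

```python
# Calibration samples map stripe coordinates (V, W) to color-image
# pixels (Cx, Cy); scanning then reads the color from the looked-up pixel.

class ColorLookupTable:
    def __init__(self, v0, w0, dv, dw, table):
        self.v0, self.w0, self.dv, self.dw = v0, w0, dv, dw
        self.table = table  # table[i][j] == (Cx, Cy) for grid cell (i, j)

    def color_pixel(self, v, w):
        """Return the (Cx, Cy) pixel from which to read the color for a
        scanned point at stripe coordinates (v, w), clamped to the grid."""
        i = round((v - self.v0) / self.dv)
        j = round((w - self.w0) / self.dw)
        i = max(0, min(i, len(self.table) - 1))
        j = max(0, min(j, len(self.table[0]) - 1))
        return self.table[i][j]
```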
  • the light stripe 8 and color camera 29 can be switched on and off such that the color is recorded shortly before or shortly after the position. Adjusting circuitry can be provided to change the exposure times of the color camera 29 and the stripe generator 22 and to ensure that the gap between the color capture and the stripe capture is minimized.
  • a video synchronization generator 60 generates a synchronization signal 61 of pulses at video camera rate—which, using the CCIR format, is 50 times per second.
  • the synchronization signal 61 is fed into the high-resolution camera 25 and the color camera 29 .
  • the exposure time of the color camera 29 can be set manually using a switch 63 or remotely with an electrical signal.
  • the synchronization signal 61 is also fed into a standard circuit 62 , which switches on the stripe generator 22 for a period of time after an initial delay.
  • the period of time that the stripe generator is on may be set manually using a control 64 , and the delay between when the synchronization signal 61 is received and the stripe generator 22 is switched on may also be set manually using a control 65 .
  • the synchronization signal 61 is represented by the trace SYN.
  • the illumination of the stripe generator 22 is represented by the trace L.
  • the exposure of the color camera 29 is represented by the trace C.
  • the exposure of the high-resolution camera 25 is represented by the trace M. This is only one example of how coplanar probes may be controlled.
  • the probe 3 projects the stripe 8 so that the measuring area starts before a tip reference point 51 provided at the end of the multiply-jointed arm 1 and extends a further distance.
  • the tip reference point 51 may be the tip of a cone, a sphere, or a rolling device, such as a rolling wheel or ball, or any similar device providing a tip.
  • the tip reference point 51 is the center of a sphere 52 . This tip reference point 51 enables the operator to scan hard objects by tracing the sphere 52 along the object 9 in strips in contact with the object 9 .
  • the sphere 52 acts as a scanning guide, and typical instructions might be to keep the tip reference point 51 about 20 mm from the object 9 while scanning. In this way, the probe 3 may be kept close enough to the object 9 for the stripe 8 to be in the measuring area but without touching the soft object 9 .
  • the stripe 8 typically starts 100 mm from the tip reference point 51 at the end of the arm 1 , but could be much closer or further away, and can be used to measure objects lying between the two points W 1 and W 2 , as measured from the end 55 of the probe 3 .
  • the probe 3 may often be retrofitted onto the arm 1 , and because a mechanical arm has a diameter of typically 20-60 mm, this presents an alignment problem.
  • the plane of the stripe 8 is not coaxial, but may be either in a plane 53 parallel to the arm end axis 54 or in a plane 53 a angled to this axis 54 , so as to cross the axis 54 at some point.
  • the use of a manually rotated turntable has several advantages. For a given arm size, larger objects can be scanned. The operator does not have to move around the object. This makes scanning physically easier, more enjoyable, and there is less chance of either the operator or the arm accidentally knocking the object or of any reference point being lost.
  • the position and coordinate system of the turntable 14 must be known relative to that of the arm 1 .
  • the tip reference point 51 can be placed in a locating cone or cup 202 in the table at a large radius. Points are transmitted regularly by the arm control unit 2 and recorded on the computer 4 as the turntable 14 is manually rotated. Functions that fit a plane and a circle through these points provide complete position and orientation information on the turntable 14 in the arm coordinate system.
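  • The plane and circle fitting mentioned above can be sketched as follows, assuming a least-squares approach; the patent names the fitting functions only in outline, so this concrete routine is an illustrative assumption:

```python
import numpy as np

def turntable_from_points(pts):
    """Fit a plane and a circle through tip positions recorded while
    the turntable rotates; returns (center, axis) in arm coordinates."""
    P = np.asarray(pts, dtype=float)
    c0 = P.mean(axis=0)
    # plane normal = direction of least variance (smallest singular vector)
    _, _, vt = np.linalg.svd(P - c0)
    axis = vt[-1]
    # project points into the plane and fit a circle by linear least
    # squares: x^2 + y^2 = 2*cx*x + 2*cy*y + (r^2 - cx^2 - cy^2)
    u, v = vt[0], vt[1]
    x, y = (P - c0) @ u, (P - c0) @ v
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b = x * x + y * y
    cx, cy, _ = np.linalg.lstsq(A, b, rcond=None)[0]
    center = c0 + cx * u + cy * v
    return center, axis
```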
  • the turntable may be designed to have precise mechanical resting positions 206a-206d, e.g., every 15 degrees. These resting positions 206 would be apparent from a pointer 208 indicating the angle on an attached scale 209 of 360 degrees.
  • the operator could type in the new angle into the computer each time the turntable was rotated. However, the process of typing in an angle means that the operator may have to put down the probe 3 . This slows down scanning, and there is scope for an error to be made by the operator.
  • with an angle sensor 203 fitted to the turntable, the process is faster and less error prone.
  • if the sensor 203 is accurate, such as an encoder with, for example, 10,000 lines, then the turntable 14 could be positioned at any orientation and its angle known precisely. This allows for scanning while rotating the turntable, although care must be taken that dynamics do not lead to position errors or to the object moving relative to the turntable.
  • if the sensor 203 is less accurate, such as a potentiometer, then the turntable 14 could also have precise mechanical resting positions 206 . This combination gives the advantages of high accuracy and lower manufacturing cost.
  • the software must check for movement of the turntable 14 . If it has been moved, then with a less accurate turntable sensor 203 , probe data should be discarded until the turntable 14 has stopped moving. In all cases, the turntable 14 should be capable of being operated by one hand so that the probe does not have to be laid down. It is often the case that an object on a turntable is scanned with regular increments, e.g., eight scans every 45 degrees. To aid the operator in incrementing by X degrees, different shaped and/or colored icons could be placed every X degrees on the scale and on the other regular intervals. Typical intervals might be 45, 60, or 90 degrees. With reference again to FIG. 2 , this method can also be used with a probe 3 including one or more remote position sensors 261 with a tip reference point 51 .
  • the manual turntable may be driven by a motor operable by means of hand controls.
  • a mounting device 210 , 214 for the probe is illustrated in FIG. 10 .
  • Accurate and repeatable geometric positioning of the probe on the arm is required. This is provided by the mounting device 210 , 214 .
  • the mounting device 210 , 214 provides a standard mechanical interface that may preferably be used for all probes and arms, which is both small and light, and that is easy to use to mount and dismount the probe onto and off the arm.
  • the mounting device comprises an arm side mount 210 that comprises a flat mating surface 211 with two precisely dimensioned projections 212 located in precise positions on the mating surface 211 .
  • the mounting device also comprises a probe side mount 214 comprising a flat mating surface 215 and two precisely-dimensioned recesses or holes 216 corresponding to the two projections 212 in the arm side mount 210 . It is essential that the geometric repeatability in position and orientation is very high.
  • a standard mounting device between any arm and any probe gives several advantages. When the arm has to be used without the probe, then the probe has to be removed. If the mounting device is not repeatable, then the system will require realignment before use each time the probe is remounted.
  • a range of probes will be supplied with different weights, speeds, sizes, and accuracies corresponding to different functions.
  • Each probe can be supplied with alignment data relative to the datum of the probe side mount 214 and, in this way, any probe may be attached to the arm without the requirement of realignment.
  • a user may also have one or more different arms. In order to fit the same probe to two different arms, the user needs only to acquire an extra adapter for the second arm, which would fit onto the arm and include the arm side mount 210 of the mounting device 210 , 214 .
  • the arm side-mounting device 210 may be attached to any machine, including multiply-jointed arms, horizontal arms, and orientation devices such as manual, two-axis orientation devices.
  • the transformation matrix Tam can be found in several ways. Now referring to FIG. 11 , a particularly simple, cost effective, and practical method involves the use of a reference plate 220 .
  • the reference plate 220 has three orthogonal flat surfaces 222 and a mounting point 221 to which the arm side mount 210 can be attached in a precisely known position relative to the orthogonal planes, from which Tam can be calculated.
  • the above method can be encapsulated in the main scanning software provided with the scanning system or in a separate program. This has the advantage that much time is saved over an alternative of the user calculating Tam manually from arm positions output by the arm manufacturer's software and manually inputting the resulting Tam into the main scanning system software.
  • the probe side mount 214 is integral to the probe and does not move relative to the probe coordinate system.
  • the transformation matrix Tmp is provided by the probe supplier with the calibration data for the probe.
  • the handedness of the coordinate systems of the arm 1 and the probe 3 would be known.
  • the relationship between the normals of the surfaces on the alignment calibration object 230 could be specified. One way of doing this is by labeling the three faces 231 , 232 , 233 and specifying the order in which the three faces must be scanned.
  • the main advantages of the above apparatus and its method of aligning the probe are (1) that it involves a single alignment calibration object that is cheap to manufacture to the required geometrical tolerance and is relatively light and compact; (2) that the method is robust, simple to carry out from written instructions and quick; (3) that the processing can be encapsulated in the main scanning software provided with the scanning system or in a separate program; (4) that there is no need to have any preliminary geometric information about the orientation and position of the probe relative to the tip of the arm at the start of this method—for example, the probe could be slung on the underside of the arm pointing backwards and the method would work; and (5) that if the probe is knocked or damaged such that Tmp changes but the calibration is still valid, then this method of alignment will still work.
  • Some applications that require 3D surfaces also require 3D reference points; an example is animations involving joint movements, where a joint is to be specified in the context of the 3D model. In this case, the joint can be quickly defined from one or more 3D reference points.
  • a new method of using the scanning system is to use the probe 3 to scan the surface and to use the tip reference point 51 to capture individual 3D points by contact.
  • An alternative method is to project a calibrated crosshair onto the object and use an optical method of picking up individual points. This can be used in both stripe and area systems. The calibrated crosshair is usually switched on just during the period in which individual points are captured.
  • the second mode is commonly used for tracing out important feature lines, such as style lines or patch boundaries.
  • for a stripe sensor, instead of projecting a crosshair, it may only be necessary to project a second stripe at the same time as the main stripe.
  • the crosshairs may be calibrated by the probe supplier using a three-axis computer controlled machine, a known calibration object, and standard image processing techniques.
  • the scanning apparatus 100 is operable to scan an object and thereby generate a computer model of the object's surface using an intermediate data structure for efficiently storing points on the surface of the object during scanning, creating an instance of the intermediate data structure for the particular object; and controlling the storage of the scanned points in the intermediate data structures during scanning with an operator control system.
  • Three examples of these intermediate data structures are points, encoded stripes, and range images.
  • Points have the disadvantage of being unorganized, and much information obtained from the structure of the probe and the method of its use is lost if the 3D data is reduced to points.
  • an encoded stripe intermediate data structure stores data from one stripe at a time.
  • the stripes are stored in the order of capture.
  • the time of capture of each stripe is recorded.
  • the orientation of the probe is recorded for each stripe.
  • the raw data points from the stripe may be processed before storing in the data structure to determine jump and break flags and to sample or chordally tolerance the raw data points to reduce the size of the intermediate data structure without losing any significant information.
  • An intermediate data structure can be used in which the surface of an object is described by means of a finite number of linear and cylindrical range images that are, to some extent, characterized by the shape of the object.
  • a linear range image 70 is illustrated with reference to FIG. 13 .
  • the range image 70 has a coordinate system U,V,W and a spacing of points dU in the U direction and dV in the V direction.
  • the linear range image 70 contains in its header definition its relationship to the world coordinate system X,Y,Z, i.e., the arm coordinate system. In the disclosed invention, the linear range image 70 cannot store negative W values.
  • Cylindrical range images 71 , 72 are described in FIGS. 14(a)-14(b) .
  • the range image has a coordinate system W,R,A where A is an angle.
  • the spacing of points is dW in the W direction and dA in the A orientation.
  • the cylindrical range images 71 , 72 contain in their header definitions, their relationships to the world coordinate system X,Y,Z. In the disclosed invention, the cylindrical range images 71 , 72 cannot store negative R values.
  • the range image placing algorithm takes a scanned point and tries to place it into defined range images by projecting a ray along the normal to the range image 105 . If the point has a negative value in a range image—for example, point 104 —then it is not stored in that range image. If the point is outside the extent of that range image, it is not stored in the range image unless the range image is extensible—in which case, the algorithm extends the range image far enough to place the point. If a point already exists in the range image position in which the point is to be placed, then the two points are compared.
  • the nearest point 102 is stored and the furthest point 101 is rejected. If the two points are within the error of the scanner, then their values are averaged and the average value is stored.
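  • The placing rule of the two bullets above can be sketched as follows; the RangeImage methods used here (project, extensible, extend_to, depth, store) are hypothetical names for illustration, not the patent's API:

```python
def place_point(range_image, point, scanner_error):
    """Project a scanned point into a range image grid and apply the
    keep-nearest / average-within-error rules described above."""
    cell, w = range_image.project(point)   # grid cell and depth along normal
    if w < 0:
        return                              # negative depths cannot be stored
    if cell is None:                        # point lies outside the extent
        if not range_image.extensible:
            return
        cell = range_image.extend_to(point)
    existing = range_image.depth(cell)
    if existing is None:
        range_image.store(cell, w)
    elif abs(existing - w) <= scanner_error:
        range_image.store(cell, (existing + w) / 2.0)  # average the two
    elif w < existing:
        range_image.store(cell, w)          # nearer surface wins
```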
  • the range image-placing algorithm is simple and quick, but it is indiscriminate, often placing points incorrectly in range images and relying upon them being overwritten by a nearer point. If the range image is very dense, but populated with few values, then up to half the points populated could be incorrect because the surface normal of the point is incorrect. This can restrict successful scanning to coarse range images.
  • the range image-placing algorithm is improved upon with the surface normal extension.
  • the range image-placing algorithm does not have an estimate of the surface normal of the point to be placed. Also, it does not take into account the orientation of the probe when the stripe is captured.
  • To improve the range image placing, use is made of the fact that most stripes are scanned in sequence and have near predecessor and near successor stripes. For example, as illustrated in FIG. 16 , there are eight neighboring points 116 on a stripe 114 and on its predecessor 113 and successor 115 . These can be used to approximate the surface normal of a point P before it is placed in a range image.
  • Three sequentially scanned stripes 113 , 114 , 115 are shown on an object 111 and projected onto a range image 112 as stripes 113a, 114a, and 115a.
  • the point P with coordinates Xp,Yp,Zp on stripe 114 has eight near neighbors 116 on the respective stripes 113 , 114 , 115 as described above, and an approximate surface normal Np with coordinates Ip,Jp,Kp.
  • the probe orientation for stripe 114 is Ns with coordinates Is,Js,Ks.
  • of the two possible directions for the surface normal, the correct one is the one that faces the probe 3 orientation, assuming that the changes in probe orientation for the three stripes are not significant to the surface normal direction. If the surface normal Np of a point P is found to be facing away from the range image surface normal Nr, then the point is not placed on the range image.
  • This surface normal extension eliminates the majority of incorrect point placements in range images.
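  • A sketch of the surface normal extension just described; estimating the normal by averaging cross products of neighbor offsets, and the assumption that the neighbors are ordered around P, are illustrative choices not fixed by the patent:

```python
import numpy as np

def normal_from_neighbors(p, neighbors, probe_dir):
    """Estimate the surface normal at point p from its neighbors on the
    current, predecessor, and successor stripes, oriented toward the
    probe. Assumes `neighbors` are ordered around p."""
    p = np.asarray(p, dtype=float)
    off = [np.asarray(q, dtype=float) - p for q in neighbors]
    crosses = [np.cross(off[i], off[(i + 1) % len(off)])
               for i in range(len(off))]
    n = np.sum(crosses, axis=0)
    n = n / (np.linalg.norm(n) or 1.0)
    if np.dot(n, probe_dir) < 0:  # pick the direction facing the probe
        n = -n
    return n

def faces_range_image(n, range_image_normal):
    """Reject a point whose normal faces away from the range image."""
    return np.dot(n, range_image_normal) > 0.0
```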
  • three stripes of points are buffered before the first stripe of points is placed in the range images.
  • the normal extension in a modified form can also be used for the first and last stripes by using the two successive or two previous stripes. When the three stripes 113 , 114 , 115 are nearly coincident, perhaps because the arm is moving too slowly, then the accuracy of the surface normal estimate is low and the normal cannot be used.
  • a different normal calculation can be made using any neighboring points already placed in the range image instead of the neighboring stripes.
  • a further normal extension to the range image placing algorithm combines both the stripe and the range image data to provide a better estimate of the surface normal.
  • the calculations involved in these normal extensions can provide a bottleneck to the scanning process. The bottleneck can be overcome by using only two stripes, fewer samples (5 instead of 9), or a faster computer.
  • range images that are positioned in the object coordinate system must be defined.
  • the range images have specific mathematical definitions. Two basic types of range image are used—linear and cylindrical—as discussed above.
  • a range image has direction and a zero position. The range image can only store points that are in front of its zero position. If there are two or more surfaces of the object in line with a point in the range image, then the surface that is nearest to the range image's zero position is represented in the range image.
  • a range image can be constrained in size or unconstrained in size.
  • the range image can be one image of fixed density or comprise a patchwork of a number of adjoining images of different densities. Each grid position in the range image is single-valued.
  • the range image will typically use 4 bytes to store a depth value Z, from 1 to 4 bytes to store the gray scale or color value I, and from 1 to 3 bytes to store the orientation N. This is illustrated with reference to FIG. 17 , which illustrates how a single point is represented.
  • the 3 bytes suggested for orientation N will not permit a very accurate orientation to be stored. More bytes could be used, but there is a trade-off between data storage size, processing time for converting floating number orientations to/from a compressed integer format, and accuracy.
  • Range images will normally require from 5 to 11 bytes to store each point, depending on the operator's requirements. For comparison, 20 bytes are typically required to store an ASCII X,Y,Z value.
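  • The 5-to-11-byte budget above can be made concrete with a packing sketch; the exact layout chosen here (float32 depth, 3-byte RGB color, 2-byte quantized orientation, 9 bytes in total) is an assumption for illustration, not the patent's format:

```python
import struct

def pack_point(w, rgb, normal_angles):
    """Pack one range image point: float32 depth, 3-byte RGB color,
    and a 2-byte quantized orientation (azimuth, elevation in radians).
    Returns 9 bytes, within the 5-11 byte range quoted above."""
    az, el = normal_angles
    qa = int((az % 6.2832) / 6.2832 * 255)   # azimuth -> one byte
    qe = int((el + 1.5708) / 3.1416 * 255)   # elevation -> one byte
    return struct.pack("<f3B2B", w, *rgb, qa, qe)
```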
  • the number and position of the range images used in the process are such that they are sufficient to be able to store enough of the surface of the object to enable a computer model of the desired accuracy and detail to be generated.
  • the number and position of all the range images may be defined by the operator before scanning. Alternatively, just one may be defined by the operator before scanning begins, followed by the definition of others at any point during scanning.
  • the operator has a choice of several strategies. He can define and scan range images one at a time. He can define a number of range images and scan them simultaneously. He can define some range images and scan, followed by defining more range images and then scanning. If a point is scanned that does not fit onto any defined range image, then it is rejected. Alternatively, such rejected points could be automatically saved for placing into any new range images that the operator may subsequently define.
  • a typical number of range images varies from 1 to 20. Some range images need only be very small in size—small enough to cover a part of the object that is otherwise hidden from recording on other range images.
  • the density of each range image can vary. For instance, a large, smooth part of the object does not need a high point density; but a small, finely detailed ornament may require a high point density.
  • Each range image has a direction.
  • the operator may select the most suitable set of predefined range images from a library of range image sets. He can then edit the set to suit his object. Each new set is then stored in the library.
  • a set can be thought of as a set of templates.
  • a range image set consisting of five cylindrical range images for the limbs and the trunk, together with five linear range images for the top of the head/shoulders, hands, and feet.
  • one cylindrical range image for the car's body and two linear range images at each end of the car could be enough. It is important to note that the axis of a cylindrical range image must lie within the object or part of the object being scanned.
  • a range image is manually defined by the operator by first selecting the appropriate range image type—cylindrical or linear—and second, placing the probe to give the desired position and orientation of the range image and selecting it using the operator control system.
  • the probe could be positioned to first give the position and direction of the axis and then to give the maximum radius.
  • a novel inference method is also provided for updating range images from the other registered range images.
  • the inference method progresses through each array position in the range image 121 that is to be updated.
  • the inference algorithm can update positions that either have no value or have a value with a surface normal that is steeper than a given value, or have a less steep value, or any combination of these according to the operator's requirements. If a position in the range image 121 is to be updated, then that position is projected as a normal ray 126 onto all the other range images 120 , 125 , one at a time. If the ray intersects with another range image 120 , then the local triangular surface element through which the ray first passes is located on the surface 123 and constructed.
  • the value 124 at the intersection of the ray 126 and the triangular element 122 is then inferred and placed onto the range image being updated. If the ray intersects several range images 120 , 125 , then the inferred values from the range images are averaged after outliers have been removed. Outliers are removed by using a tolerance such as the error of the scanner. The original value (if it exists) in the range image 121 being updated, could be included in this outlier removal/averaging process.
  • the inference method is particularly used when an additional range image is added at a late stage in the scanning process or if range images are defined/scanned one at a time.
  • the method enables surface areas that are nearly orthogonal to the range image, i.e., are almost vertical walls, to be well defined from data stored in the other range images. This provides a better set of points for carrying out the polygonization of one range image resulting in a more accurate polygonal network and simplifying the polygonization process.
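  • A sketch of the inference method described above; the ray and triangle-intersection helpers are hypothetical names, and the median-based outlier removal is one plausible reading of the tolerance rule:

```python
def infer_value(position, target, others, scanner_error):
    """Project a normal ray from a grid position of the range image
    being updated onto the other range images, drop outliers, and
    average the remaining inferred values."""
    ray = target.normal_ray(position)           # hypothetical helper
    samples = []
    for ri in others:
        hit = ri.intersect_first_triangle(ray)  # inferred value or None
        if hit is not None:
            samples.append(hit)
    if not samples:
        return None
    samples.sort()
    median = samples[len(samples) // 2]
    # outliers are removed using a tolerance such as the scanner error
    kept = [s for s in samples if abs(s - median) <= scanner_error]
    return sum(kept) / len(kept) if kept else None
```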
  • the probe 3 provides data that is displayed on the display monitor 7 as a rendered polygonal surface 13 in real-time or with an acceptable delay such that the user can watch the display monitor 7 and use the feedback of the rendered surface to guide his movement of the probe 3 .
  • Real-time is defined in the context of visualization as an operation reacting with a delay small enough to be acceptable to an operator in normal use.
  • the probe 3 could be a stripe probe or an area probe. Where the probe captures 3D and color information, then the color information can be mapped onto the 3D model to texture it, as discussed below.
  • the surface to be displayed is calculated for stripe probes one additional stripe at a time.
  • as a stripe 301 is captured, it is converted using one of several commonly used methods into a finite string 303 of 3D points 302a, 302b and flags 304, 305 in the world coordinate system X,Y,Z using the previously obtained calibration and alignment data for the probe 3.
  • the maximum number of points is usually equal to the number of rows 281 or the number of columns 282 in the CCD array 25 , depending on the orientation of the CCD in the optical setup. This disclosure will refer to rows, but it can equally apply to columns or any other way of organizing the data recorded by the camera.
  • the position on the CCD array can usually be calculated to subpixel accuracy by one of several commonly used methods. If there is no data for one or more neighboring rows, such as missing positions 302e, 302f (not shown), then a “break” flag 304 is output into the string to indicate a break in the surface recorded. If there is a significant jump discontinuity in range above a maximum value that is appropriately set for the scanning resolution of the probe 3, such as between 302j and 302k, then a “jump” flag 305 is output into the string of 3D points to indicate either a vertical wall relative to the probe orientation or an occluded surface. The string 303 is filtered to reduce the number of points while effectively transmitting most of the information.
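A hedged sketch of this conversion step follows; the per-row point representation and the flag encoding as strings are illustrative assumptions, not the patent's data structure.

```python
BREAK, JUMP = "break", "jump"   # flags 304 and 305 in the text

def stripe_to_string(row_points, max_jump):
    """Convert one captured stripe into a string of 3D points and flags.

    `row_points` holds one world-coordinate (x, y, z) tuple per CCD row,
    or None where no stripe data was detected; `max_jump` is the range
    discontinuity threshold set for the probe's scanning resolution.
    """
    string, prev = [], None
    for point in row_points:
        if point is None:                 # missing neighboring row(s)
            if prev is not None:
                string.append(BREAK)      # break in the recorded surface
            prev = None
            continue
        if prev is not None:
            dist = sum((a - b) ** 2 for a, b in zip(point, prev)) ** 0.5
            if dist > max_jump:           # vertical wall or occlusion
                string.append(JUMP)
        string.append(point)
        prev = point
    return string
```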
  • the object of filtering is to reduce the amount of data processed for surface rendering and hence increase the speed of the rendering process with a minimal degradation in the quality of the surface.
  • the first method of filtering is to skip some of the stripes.
  • Another method is to sample the rows, e.g., take every nth row.
  • a third method is to apply chordal tolerancing to all the points in the stripe and discard the surplus points that lie within tolerance.
  • the first and second filtering methods are preferred because of their simplicity and because the resulting regular grid of points produces regular polygons that look good on the display. By contrast, a chordal tolerancing process can produce long, thin polygons with rapidly changing surface normals if the data points are slightly noisy due to inaccuracies in the probe and the arm, and may present an unattractive “orange peel” effect on the display.
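Minimal sketches of the two preferred filters (stripe skipping and row sampling) follow; the handling of break/jump flags during sampling is an assumption, since the patent only says to take every nth row.

```python
def skip_stripes(stripes, n):
    # First method: process only every nth captured stripe.
    return stripes[::n]

def sample_rows(string, n):
    # Second method: keep every nth point of a string while always
    # retaining break/jump flags (flag handling is an assumption).
    out, kept = [], 0
    for item in string:
        if isinstance(item, str):         # a break/jump flag
            out.append(item)
        else:
            if kept % n == 0:
                out.append(item)
            kept += 1
    return out
```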
  • the same process is repeated for a second stripe 306 capturing data points 307, resulting in a second string 308 of 3D values 307a, 307b, etc., and flags.
  • a surface comprising triangular or quad polygons is then constructed between the two strings 303 and 308 , resulting in a string of polygons 309 .
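A minimal sketch of constructing the strip of polygons between two successive strings might look like the following; it assumes equal-length, flag-free strings, and a real implementation would also honor the break and jump flags.

```python
def zipper(string_a, string_b):
    # Build two triangles per quad between corresponding points of two
    # successive strings, producing a strip of polygons.
    polygons = []
    for i in range(min(len(string_a), len(string_b)) - 1):
        a0, a1 = string_a[i], string_a[i + 1]
        b0, b1 = string_b[i], string_b[i + 1]
        polygons.append((a0, b0, a1))
        polygons.append((a1, b0, b1))
    return polygons
```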
  • the string of polygons is then displayed by a renderer.
  • the renderer may or may not take into account the previous polygons displayed, the viewpoint, and lighting model.
  • the color information can be mapped onto the polygon.
  • the precise mapping algorithm depends on the format of the raw color information, which depends on the design of the probe.
  • the raw color information may comprise point, line, or area samples.
  • the raw color information may be adjusted before mapping using colorimetric calibration and intensity calibration data. During the mapping process, the color information may be adjusted for the probe-to-polygon distance at the point of color capture and for the polygon orientation relative to the probe at the point of capture. The basis for the adjustments is a set of calibration procedures carried out for each individual probe.
  • the viewpoint for the surface displayed can have a constant position, zoom, and orientation in the world coordinate system of the object such that, as the probe is moved, the surface displayed increases where the data is captured.
  • the viewpoint is set before scanning starts, either with an input device (such as buttons) on the arm, foot pedals, a mouse, and a keyboard, or by using the probe to determine the viewpoint.
  • the viewpoint can have a constant position, zoom, and orientation in the probe coordinate system such that, as the probe moves, the surface is completely re-rendered at regular intervals, each time with the new surface displayed where the data has been captured, with the regular intervals being at an acceptable real-time rate, such as 25 displays per second or less often.
  • the viewpoint can have a constant position, zoom, and orientation in the world coordinate system where the surface displayed increases where the data is captured that is completely updated to that of the probe coordinate system on operator demand such as by the depressing of a button or foot pedal or at regular time intervals, such as every 10 seconds.
  • the different methods for updating the viewpoint provide different advantages, depending on the size and type of the object being scanned and the speed of the computer in recalculating the surface display from a different viewpoint.
  • the display 7 can be mounted on the probe 3 such that the rendered surface 13 , or other image displayed, moves with the probe movement.
  • the display 7 could be mounted in front of the operator's eyes as part of a heads-up display 271 , which may or may not also allow the operator 270 to see his real environment as well as the rendered surface 13 , or it can be mounted elsewhere.
  • the operator watches the rendered surface 13 on the display 7 while scanning because this has the important advantage of ensuring that all the object is scanned.
  • watching the display 7 as a large display monitor situated on a workbench can be advantageous.
  • a display screen on the probe 3 is advantageous because the operator moves with the probe.
  • a heads-up display may be best for nearly all applications because it is feeding back most directly to the operator.
  • the display 7 mounted with the probe 3, for instance a color LCD screen, is small, lightweight, real-time, and flat, while having sufficient resolution to render the surface so that the operator can see what has and what has not been scanned.
  • a display mounted on the probe could be tiltable in one or two axes relative to the probe. The ability to tilt the display relative to the probe can give the operator improved ability to scan in spaces with poor visual access. Buttons 6 on the probe can be used to navigate menus and select from menus.
  • the probe might have memory 262 , which could be both dynamic memory and magnetic memory, such as a CDROM or digital video disk (DVD).
  • the probe might have a local power source 260 , such as batteries. This would be the case with one or more remote position sensors 261 mounted inside the probe. Although one remote position sensor is sufficient, more accuracy is obtained by averaging the positions coming from three or more remote position sensors. Another benefit of three or more sensors is that when a spurious position is output by one or more sensors, this can be detected and the data ignored.
  • Detection of incorrect positions is by means of comparing the positions output by the three sensors to their physical locations within the probe to see if the variation is larger than the combined, acceptable error of the sensors. Since remote position sensor technology is likely to remain much less accurate than multiply jointed arm technology, it is preferable that probes with remote sensors use array scanning means rather than stripe scanning means. With a single array scan, all the data in the array (i.e., a range image) is accurately registered to each other, but with stripes there are position errors between any two sequential stripes. It is possible to use an iterative closest point (ICP) algorithm on overlapping range images to substantially reduce the errors caused by the remote position sensors; but this is not possible with stripes.
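A sketch of that consistency check follows, under the assumption that it is performed pairwise on reported inter-sensor distances; the patent describes the comparison only in general terms.

```python
import numpy as np

def positions_spurious(reported, mounted, max_error):
    # `reported`: world positions output by each remote sensor;
    # `mounted`: the sensors' known physical locations within the probe.
    # If any pair of reported positions disagrees with the corresponding
    # mounting distance by more than the combined acceptable sensor
    # error, at least one sensor output is spurious.
    n = len(reported)
    for i in range(n):
        for j in range(i + 1, n):
            d_rep = np.linalg.norm(np.asarray(reported[i]) - np.asarray(reported[j]))
            d_fix = np.linalg.norm(np.asarray(mounted[i]) - np.asarray(mounted[j]))
            if abs(d_rep - d_fix) > max_error:
                return True
    return False
```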
  • a number of different technologies exist for area probes including binary stereo, photometric stereo, texture gradients, range from focus, range from motion, time of flight, Moire interferometric, and patterned structured light systems.
  • the most common systems in use in industrial applications are time of flight, Moire, and patterned structured light.
  • Different area probe technologies have different advantages and disadvantages for manual scanning.
  • Time of flight systems use a modulated laser spot to measure a scene by the phase shift between outgoing and reflected beams, which is proportional to the range of the object point.
  • a complete range image is captured by scanning the whole region of interest.
  • this technique is advantageous since it is line of sight, although the accuracy is generally of the order of 1-2 mm unless multiple measurements are taken at each point, thus reducing scanning speed significantly. It is thus too slow.
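The phase-shift-to-range relation for such a system can be written as a one-line function. The formula range = c·Δφ/(4π·f_mod) follows from the out-and-back light path and is standard for phase-shift time-of-flight, though the patent does not state it explicitly.

```python
import math

def tof_range(phase_shift_rad, modulation_freq_hz, c=299_792_458.0):
    # The beam travels out and back, so the round trip adds a phase of
    # 4*pi*f*range/c; inverting gives range = c * dphi / (4*pi*f).
    return c * phase_shift_rad / (4.0 * math.pi * modulation_freq_hz)
```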
  • Moire systems use gratings in front of projection and viewing optics to produce an interference pattern that varies according to local changes in height on the object. Absolute measurements and measurements across discontinuities are only possible by taking several measurements with different grating configurations or from different projection angles. For relative height measurement, these systems offer high accuracy. It is thus too problematic to obtain absolute measurements.
  • a depth from focus range area sensor has recently been demonstrated that allows the real-time determination of range from pairs of single images from synchronized cameras, albeit with the use of relatively complex hardware. It is thus too complex to use at this point in the development of the technology.
  • patterned structured light systems come in many families and rely on projection of a light pattern and viewing off-axis from the projection angle.
  • Synchronously scanned laser triangulation probes can be raster scanned over a 2D area.
  • a laser stripe triangulation line can be scanned in one direction to produce area measurements.
  • Scanning can be mechanical or electronic.
  • Multiple laser or light stripes 363 can also be simultaneously projected over an object to obtain the same effect as a scanned stripe, but this has the disadvantage that, in a single image, it is not possible to differentiate between the stripes.
  • a number of systems use a Gray-coded sequence of binary stripe patterns that solves the ambiguity problem.
  • the sensor should remain stationary during the capture process.
  • An alternative solution is the projection of color-coded light stripes that allow the unambiguous determination of range, even with depth discontinuities from a single image. Note that the simultaneous use of a number of stripes is herein classified as an area technique and not a stripe technique.
  • Each stripe is one color.
  • Each color may be a discrete wavelength, such as provided by a number of different laser diodes or a subset of a spectrum range of color generated from a white light source. Either all of the colors may be unique or a small number of colors may repeat. The repetition of a small number of colors can lead to ambiguity if stripes of the same colors are not sufficiently separated.
  • the probe encapsulation would have advantages in terms of cost reduction and complete flexibility in freedom of use because even cables may not be required and the only limits would be the range and accuracy of the remote position sensor.
  • the probe with a display mounted on it might receive its power along a cable that may follow the path of the arm, and the computer may be situated in the base of the arm, which would reduce the weight of the probe and reduce operator fatigue.
  • the probe 3 with one or more remote position sensors 261 could be mounted on the operator's head 270 with fixing means 272 to produce a head-mounted scanning system 274 .
  • the standoff from the probe to the object using a head-mounted scanning system 274 would be quite large, for example, 250 mm, but it could be more or less.
  • the strip polygonization of intermediate data to automatically create a polygonal model is described for a stripe scanner.
  • the following description is by means of an example and comprises the following steps:
  • the intermediate data is preferably in an encoded stripe form as described above.
  • a polygon mesh integration method such as an implicit surface method to integrate the 2.5D polygonal meshes into a computer model comprising one or more 3D polygonal meshes.
  • the range image polygonization of intermediate data to automatically create a polygonal model is similar to strip polygonization. Each range image is effectively a group of stripes with the same surface normal. Steps 1 and 2 above are, therefore, not needed. There are two ways of carrying out the equivalent of step 3 above. Range image data may be chordal toleranced as a series of stripes as described in step 3, and the polygonization process continued with steps 4 to 9, as required. In the second way, given the greater structure of a range image over a group of stripes, steps 3 and 4 may be combined and a range image tolerancing algorithm combined with a 2.5D polygonization algorithm and the polygonization process continued with steps 5 to 9, as required.
  • Range image polygonization is better suited to area scanners and strip polygonization is better suited to stripe scanners. If the intermediate data structure is range images then the range image polygonization will work whether each range image relates to a particular data capture instant or is part of a defined range image structure that is characterized by the shape of the object.
  • the combining of color data onto the 3D model is known as texture mapping.
  • before raw color data in the form of color images can be texture mapped onto the 3D model, it must first be corrected by means of various calibrations.
  • the first geometric calibration is to take out lens distortion. Standard means are used for this based on imaging geometric objects of known size and extracting pixel coordinates using standard image processing techniques.
  • the second is to create the camera model. A simple pinhole model can be used or a more complex model. Standard means are used for this based on imaging geometric objects of known size from different distances and extracting pixel coordinates using standard image processing techniques.
  • the third is generating the alignment transform.
  • a method has been developed based on 3D and color imaging geometric objects of known size using the probe. For all three methods, a three-axis computer controlled machine is used to ensure precise distances. The probe engineering must be geometrically stable enough such that this transform will only be recalculated rarely such as after the probe has been dropped or damaged.
  • a diffuse, flat, white surface is imaged normal to the camera axis at a number of different distances from the probe to the surface. The distances are chosen to cover the whole scanning range from closest point to furthest point. The variations in mean intensity recorded in the camera are used to calibrate the probe with distance. This calibration data is used to correct the color data recorded when scanning an object such that all color data is corrected to a known distance equivalent.
  • a diffuse, flat, white surface is imaged at various angles to the camera axis at a fixed distance from the probe to the surface. The angles are chosen up to the point at which there is significant deviation from the Lambertian model.
  • the variations in mean intensity recorded in the camera are used to calibrate the probe intensity with relative surface angle to the probe. This calibration data is used to correct the color data recorded when scanning an object such that all color data is corrected to a normal equivalent.
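A sketch of how the distance calibration data might be applied is shown below; linear interpolation between the calibration distances is an assumption, and the angle calibration of the preceding step would be applied analogously with a gain curve over relative surface angle.

```python
import numpy as np

def distance_correction(distances, mean_intensities, reference_distance):
    # `distances` (ascending) and `mean_intensities` come from imaging a
    # diffuse, flat, white surface at several standoffs. The returned
    # function corrects a recorded intensity at distance d to its
    # known-distance equivalent at `reference_distance`.
    ref = np.interp(reference_distance, distances, mean_intensities)
    def correct(intensity, d):
        return intensity * ref / np.interp(d, distances, mean_intensities)
    return correct
```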
  • a standard colorimetric calibration is carried out using reference colors, such as Macbeth charts, mounted normal to the color camera axis at a known distance from the probe. Corrections are made to a commonly used color standard, such as the CIE. Individual pixels in the camera may be color- and intensity-corrected.
  • the calibration information can be incorporated into the software as, for example, constants, tables, or equations for the probe design. Some calibrations are carried out once on the setup of each probe after manufacture. Other calibrations could be carried out each time the scanning system is used; for example, the scanning of a white surface at a known distance will set the lamp intensity relative to the intensity when the bulbs were new.
  • mapping color images 320 onto the 3D model 324 to form texture maps.
  • Surface elements on the 3D model may be flat polygons or elements of a high-level surface form.
  • a mapping method for color images is:
  • Each color image 320 is corrected using calibration and geometric data.
  • the color image whose normal 323 is closest in orientation to the normal 322 of the surface element 321 is selected (the master image), and the texture map coordinates for that surface element are given by the mapping of that surface element onto that master image.
  • the closest image normal is that of 320 a in this case.
  • the other color images that map onto the surface element are then processed. If the surface normal difference between the surface element and a color image is above a certain tolerance, then that image is ignored. This is because the color quality obtained in the image degrades significantly as the surface orientation of the object relative to the image becomes very steep.
  • the part of the master image onto which the surface element maps is then improved by a weighted average of all the color image mapped parts. The basis of the weighting is the cosine of the difference in surface normal between the surface element and the color image.
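The master-image selection and cosine-weighted blending could be sketched as follows; the angle tolerance value and the flat representation of the mapped image region are illustrative assumptions.

```python
import numpy as np

def blend_onto_master(surface_normal, images, max_angle_deg=60.0):
    # `images` is a list of (image_normal, mapped_values) pairs, where
    # mapped_values is the flat array of color values that the surface
    # element maps onto in that image. Images whose normal differs from
    # the element normal by more than the tolerance are ignored; the
    # rest are blended with cosine weights.
    cos_tol = np.cos(np.radians(max_angle_deg))
    usable = [(float(np.dot(surface_normal, n)), np.asarray(v, dtype=float))
              for n, v in images]
    usable = [(c, v) for c, v in usable if c >= cos_tol]
    if not usable:
        return None
    usable.sort(key=lambda cv: cv[0], reverse=True)   # master image first
    weights = np.array([c for c, _ in usable])
    stack = np.stack([v for _, v in usable])
    return (weights[:, None] * stack).sum(axis=0) / weights.sum()
```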
  • the arm position in the middle of the frame is estimated by interpolating in six degrees of freedom between the two arm positions B, A, using the time (t2 - T/2) at the middle of the frame capture as the interpolation weighting between t1 and t3.
  • This interpolation method can increase the accuracy of a non-triggered system by a large amount and is extremely significant in the quest to obtain geometrically accurate data.
  • the operating system under which the interpolation software runs may be set to run the interpolation software at high priority so that the introduction of delays due to other software being executed is minimized. Even if another software function interrupts this process, the validity of the process is not impaired unless the interrupting process is of extraordinarily long duration. Prioritization is not essential, but it will contribute to reduced timing error where prioritizing is available in the operating system.
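A minimal sketch of the interpolation follows; the pose representation as position plus roll/pitch/yaw is an assumption, and a production implementation would interpolate orientation with quaternions rather than raw angles.

```python
import numpy as np

def pose_at_mid_frame(t1, pose_t1, t3, pose_t3, t2, T):
    # t1, t3: times of the arm positions bracketing the frame;
    # t2: time associated with the frame; T: frame capture duration.
    # Poses are (xyz, rpy) pairs; interpolating Euler angles linearly is
    # a simplification over proper quaternion interpolation.
    t_mid = t2 - T / 2.0                      # middle of the frame capture
    w = (t_mid - t1) / (t3 - t1)              # weighting between t1 and t3
    xyz = (1 - w) * np.asarray(pose_t1[0]) + w * np.asarray(pose_t3[0])
    rpy = (1 - w) * np.asarray(pose_t1[1]) + w * np.asarray(pose_t3[1])
    return xyz, rpy
```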
  • as regards triggering, there are many methods of carrying it out.
  • One method is, with reference now to FIG. 24 , that the synchronization signal 240 from a CCD camera 25 is stripped off by electronic circuitry 241 , and a relay 242 is used to generate a series of trigger pulses 243 to the arm computer 2 .
  • This has the advantage of eliminating both the arm and camera variabilities and increasing the accuracy of the scanning as much as possible for a given arm and camera.
  • the operator interface means alone—not including the standard computer means such as mouse and keyboard—can be used to control the scanning and computer model generation process and the functionality of the options that can be actuated.
  • the operator interface means include means for navigating menus such as buttons, foot pedals, joysticks, trackballs, and the position-sensing means—arm or remote position sensor.
  • the operator can simply select the required operations and operating parameters, which could include, for example, being able to:
  • Complex surfaces can be created from marked surface patch boundaries.
  • the object 130 is painted a uniform color (if necessary) before marking the patch boundaries 131 by hand in another color, e.g., using a black marker pen on a white object. It is not important to mark these boundaries accurately, as they usually lie away from features such as edges or rapid changes in surface normal.
  • the object is then scanned in using one of the methods disclosed.
  • the color information is then used to automatically generate the patch boundaries by means of an algorithm that separates out the points 132 lying on the patch boundaries by means of a color filter and then fits patch boundary lines such as splines 133 to these points.
  • the edges may also be detected using a separate algorithm.
  • the patch boundaries that have been automatically created from the scan can then be used to create the complex surface model.
  • the main benefit of this method is that it is easier to mark patch boundaries on the object than on the computer model prior to the automatic creation of the complex surface model.
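A sketch of the color filter stage is given below; the luminance threshold is an illustrative assumption, and fitting patch boundary lines such as splines 133 to the returned points would follow as a separate step.

```python
import numpy as np

def boundary_points(points, colors, threshold=0.35):
    # `colors` holds per-point RGB in [0, 1]. Points darker than the
    # threshold are taken to lie on the hand-marked (e.g., black-on-
    # white) patch boundaries 131.
    luminance = np.asarray(colors, dtype=float) @ np.array([0.299, 0.587, 0.114])
    return np.asarray(points)[luminance < threshold]
```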
  • in FIG. 26(a), an important implementation 333 of the invention is disclosed in which the multiply-jointed arm 1 is mounted on the end of the horizontal arm of the horizontal arm measuring machine 330 for scanning a large object 331.
  • the horizontal arm measuring machine 330 has a machine control box 332 that outputs the position of the machine to the computer 4 .
  • the arm control 2 and the probe 3 are also connected to the computer 4 .
  • This implementation makes the scanning of large objects more precise in that either a large arm or leapfrogging would be less accurate than a horizontal arm, and simpler in that each time the horizontal arm is moved, the software takes it into account automatically rather than needing to reregister using a leapfrogging method.
  • This invention is a general 3D model-making device and has wide-ranging applicability.
  • the application industries for this invention include design stylists who need to turn clay objects into computer models quickly and accurately; games developers and animators who need to convert new characters into 3D data sets for animation; shoe manufacturers who need to make custom shoes; automotive manufacturers who need to model the actual cable and pipe runs in confined spaces; and medical applications that include radiotherapy and wound treatment. Altogether, some 200 applications have been identified for this invention.
  • a method in which the scanning apparatus 100 can be used to scan a human foot 141, with full body weight on it, on surfaces of different resilience is also disclosed.
  • the outside of the foot 141 is first scanned using the methods and devices disclosed above with the required amount of body weight being exerted.
  • the foot 141 is then removed and a second scan is carried out of the surface 142 on which the foot 141 was pressed.
  • the first scan is a positive.
  • the second scan is a negative.
  • the surface normals of the second scan are then reversed by means of a simple algorithm and the two scans combined to give the positive shape of the foot.
  • a deformable material is used that does not spring back. Such a material might be sand, clay, or plaster. Materials of different resilience may be appropriate for different applications. This method is also appropriate when the foot is pressed onto the lower half of a shoe with the sides cut away.
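Reversing the normals of the negative scan amounts to flipping each triangle's winding order before merging, as in this sketch; the (vertices, triangles) mesh representation is an assumption.

```python
def combine_foot_scans(positive, negative):
    # Meshes are (vertices, triangles) with triangles as index triples.
    # Flipping the winding order of the negative scan's triangles
    # reverses its surface normals before the two scans are merged.
    verts_p, tris_p = positive
    verts_n, tris_n = negative
    offset = len(verts_p)
    flipped = [(a + offset, c + offset, b + offset) for a, b, c in tris_n]
    return verts_p + verts_n, list(tris_p) + flipped
```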
  • a stripe sensor can be activated in a first mode to take a single stripe section by the operator activating a button or foot-pedal. In this way, the operator can take a small number of sections to describe the path of the pipe using his expertise to decide when to take sections. For instance, where a pipe joins another pipe, it may be appropriate to capture many more stripe sections 344 to 349 . Also, where there is a feature such as a fixing on a pipe, it may be appropriate to capture very dense stripes.
  • a second mode would be capturing stripe sections as fast as the sensor can capture them and displaying them as a surface on the display.
  • a third mode would be a mode in which the operator specifies the distance between the sections, e.g., 5 mm, and the system automatically takes a stripe section every, e.g., 5 mm that the stripe travels in 3D space.
  • One method of determining this distance is to select the point at the average standoff distance in the middle of the stripe, i.e., the center point of the measuring range, and when this point has moved 5 mm, to automatically capture another stripe section.
  • the operator control system should support the simple switching between the three modes.
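The third mode's distance trigger could be sketched as follows, assuming the stripe's center point is already available in world coordinates for each captured frame.

```python
import numpy as np

class DistanceTrigger:
    # Capture a stripe section each time the stripe's center point (the
    # point at the average standoff in the middle of the measuring
    # range) has traveled the operator-set spacing in 3D space.
    def __init__(self, spacing_mm=5.0):
        self.spacing = spacing_mm
        self.last = None

    def should_capture(self, center_point):
        p = np.asarray(center_point, dtype=float)
        if self.last is None or np.linalg.norm(p - self.last) >= self.spacing:
            self.last = p
            return True
        return False
```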
  • the intermediate data structure in which the stripe sections are collated could be the standard stripe section structure 303, extended to include the changes in mode and the orientation of the probe for each section.
  • panel sections along which the pipes and cables run are also captured 342a, 342d. Where there is no contact between the pipe and the panel, there is a jump or break in the stripe section. These can be flagged in the data structure with jump flags 305 and break flags 304.
  • a high level model should be created and output from this data.
  • a polygonization or surfacing method joins the sections together and can handle the joining of pipes, panels, etc.
  • the result is high level models 350 to 352 . If more information is known about the pipe or cable, such as its section if it is constant or its form even if the form's dimensions change, e.g., circular but varying diameter, the model 351 can be automatically expanded to 353 . Alternatively, two scanned sides of the same pipe can be automatically joined. This gives the automobile manufacturer the high level model that he needs.
  • the color camera does not need to be included.
  • a single camera could be utilized for both color and position sensing.
  • the filter in the probe could be a narrow band-pass filter or a red high band-pass filter, as required.
  • the system is adaptable to many types of model generation not just those discussed herein.
  • the data collected by the probe could be used for other applications and could be stored for dissemination elsewhere—for example, by electronic mail.
  • the probe can be a stripe or an area probe.
  • the display can be mounted anywhere depending upon the application requirements.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Length Measuring Devices With Unspecified Measuring Means (AREA)

Abstract

A scanning apparatus and method for generating computer models of three-dimensional objects, comprising means for scanning the object to capture data from a plurality of points on the surface of the object, such that the scanning means may capture data from two or more points simultaneously; means for sensing the position of the scanning means; means for generating intermediate data structures from the data; means for combining the intermediate data structures to provide the model; means for display; and means for manually operating the scanning apparatus. The signal generated is structured light in the form of a stripe or an area from illumination sources such as a laser diode or bulbs, which enable data for the position and color of the surface to be determined. The object may be on a turntable and may be viewed in real time as rendered polygons on a monitor as the object is scanned.

Description

CROSS-REFERENCES TO RELATED APPLICATIONS
This application is a broadening reissue of U.S. Pat. No. 7,313,264, issued Dec. 25, 2007, which is a continuation of U.S. patent application Ser. No. 09/000,215, filed on May 26, 1998, now U.S. Pat. No. 6,611,617, which is a 371 of international application PCT/GB96/01868, filed Jul. 25, 1996, which in turn claims the benefit of British Application No. 9515311.0, filed Jul. 26, 1995.
FIELD OF THE INVENTION
This invention relates to an apparatus and method for scanning a three-dimensional object.
BACKGROUND OF THE INVENTION
Real-world, three-dimensional objects, whether with natural form (e.g., geographical, plant, human, or animal-like) or man-imagined form (e.g., sculptures, reliefs, cars, boats, planes, or consumer products) are difficult to scan. This is because of features such as rapidly varying surface normals and surfaces for which a line of sight is difficult because it is partially obscured by other parts of the object.
Scanning machines—also known as digitizing machines—for scanning objects or parts of objects can be categorized into two types—computer numerically controlled (CNC) and manually operated. A scanning machine includes a unit that contains a sensing means commonly referred to as a probe.
Objects or parts of objects can be scanned on CNC scanning machines with a number of computer numerically controlled (CNC) linear and rotating motor-driven axes. Different CNC machines can move/reorient the probe or the object—or both—by a combination of translation and rotation about these axes. Different machine designs are suited to different classes of objects. Probes can be temporarily or permanently attached to most types of CNC machine tool or CNC coordinate measuring machine that can then be used for scanning. As examples, small and simple three-axis CNC milling machines may be used or large, complex five-axis machines may be used. The points captured by CNC machines are usually on a regular grid and the rate varies from around 1 point per second up to around 20,000 points per second, depending on the technology being used, and the object being scanned. The points from these scanning machines are accurate to the order of 0.05 mm. CNC machines with probes scan by executing one or more programs that move the axes of the machine such that there is relative motion between the probe and the object.
CNC machines are expensive, partly because of the incorporation of motors and the associated equipment for assuring precision motion, such as linear guides and drive screws. Few CNC machines are flexible enough so that the probe can be oriented in six degrees of freedom so as to scan the complete surface of a complex object. Even when a CNC machine has six degrees of freedom, it is often not sufficiently flexible so as to position the probe to scan the complete surface of the object without colliding with the object. When the object is a person or expensive, the risk of using a CNC machine may be unacceptable, and there would be a necessity to make a machine to meet both the safety and scanning requirements of the application. The programming of a CNC machine so that the surface of the object is completely scanned without a collision of the probe or machine with the object is often highly complex. Usually, the design of the machine and the degrees of freedom inherent in the design and limitations in the probe design, such as the standoff distance during scanning between the probe and the object, mean that it is impossible to come up with a scanning strategy that will scan the complete surface of the object. It is common that the object has to be manually picked up and replaced in a different position and/or orientation one or more times during scanning. Each time that this occurs, the object has to be reregistered to a uniform coordinate system such that the data from the different scans can be accurately combined.
Manually operated scanning machines can be categorized into three types—horizontal arm machines, multiply-jointed arms, and devices based on remote position-sensing means.
Manually driven, horizontal arm measuring machines usually have three orthogonal axes and are usually based on a traveling column design. These machines are usually quite large, with the bed being at floor level so large items such as cars can easily be moved onto and off of them. Often motors can be engaged on one or more axes to aid the manual movement of the machine. The probe is normally mounted at a fixed orientation on the end of the horizontal arm. This orientation may be changed and various devices may be attached between the end of the horizontal arm and the probe to aid the changing of the orientation, most of these devices having two axes. Horizontal arm machines have the disadvantage of not being able to easily orient the probe in six degrees of freedom. The limited flexibility in the design of a horizontal arm machine makes most of the far side of the object unscannable.
Multiply jointed arms commonly comprise multiple linkages and are available for scanning complex objects. A multiply jointed arm typically has six joint axes, but may have more or less joint axes. At the end of the multiply jointed arm, there is usually a tip reference point—such as a sphere whose center is the reference point, or a cone ending in a point.
Scanning is carried out by bringing the point or sphere into contact with the object being scanned. The computer monitoring the multiply jointed arm then measures the angles at all the joints of the multiply jointed arm and calculates the position of that reference point in space. The direction of the last link in the multiply jointed arm is also calculated. Positions can typically be output continuously at a rate of around 100 points per second, but the rate can be much more or much less. The accuracy is of the order of 0.1 to 0.5 mm. The points from the arm are usually sparse and unorganized. The sparseness and lack of organization of the points make it difficult to provide enough information for constructing a computer model of the object that is of acceptable quality. A multiply jointed arm with multiple linkages has a limited working volume. In general, if a larger working volume is required, the arms become very expensive, less accurate and tiring, and difficult to operate. The limited working volume can be increased by leapfrogging, in which the whole arm/base is moved to access another volume; but this requires a time-consuming system of registering at least three points each time the arm is moved and recombining the data sets from each arm position. Manufacturers of multiply jointed arms provide precalibrated arms and test methods that the user may employ to make sure that the arm is still calibrated to an acceptable accuracy. Such test methods use, for example, the standard tip reference point at the end of the arm and a reference sphere or a ball-bar, which is a rod with two cylindrical cups that has a precise known distance between a home ball and an end-of-arm ball. As the arm tip at the end of the ball-bar is moved on the surface of a spherical domain, the arm records positions that are later compared to a perfect sphere, and error estimates for the arm are output.
Remote position-sensing devices include hand-held devices that transmit or receive position information in a calibrated reference volume using different physical methods, including electromagnetic pulses and sound waves. A hand-held device may be connected to the rest of the system by means of a cable. These devices are prone to generating scanned points with very large errors. Some devices cannot work when the object being scanned has metallic components. They are less accurate than multiply jointed arms, with accuracies of the order of 0.5 mm upwards.
There are three broad categories of scanning probes that could be mounted on the end of a multiply jointed scanning machine—point, stripe, and area probes. Point probes measure a single point at a time and technologies include mechanical contact methods and optical distance measurement methods. Stripe probes measure a number of points in a line, either simultaneously or rapidly in a scanned sequence. The most common stripe technology is laser stripe triangulation. Area probes measure a two-dimensional array of points on a surface, either simultaneously or in a scanned sequence. The most common technologies are interference fringe and multiple stripe projection. Some area methods require the device to be held still for a few seconds during data capture. Stripe and area methods have an in-built speed advantage over point methods, as there is less motion of the probe relative to the object. There are differences between the methods in terms of accuracy and cost, but these do not generalize with category. For example, a particular area technology may be cheaper and more accurate than another point technology.
Means of capturing independent reference/feature points by contact are well known and efficient. Structured light using stripe or area methods is not good at capturing independent feature points because there is no way for the operator to align a known point on the object with a point on the stripe or in the area.
Geometrical errors in the scanning process stem from many sources. CCD cameras can typically capture video at 25 frames per second. One major disadvantage in normal use is that from any given demand for a frame, there is a variability of 40 msecs until the start of capture of that frame. If the probe is being moved at, for example, 100 mm/sec, this can lead to a geometrical error of 4 mm in the probe's data. The duration of frame capture depends upon the shutter speed; e.g., 1/100 sec is 10 msecs. One major disadvantage in normal use is that if the probe is being moved with a slow shutter speed, an additional geometrical error is created. An arm is typically connected to a computer by a serial cable with arms typically generating positions at 125 positions per second as they move. At this rate, there is a variability of 8 msecs between when a position is needed at the computer and when it arrives. This can also introduce a geometrical error when the probe is moving. The total variabilities of the CCD camera and the arm can cause large aggregate errors.
There is a wide range of formats for 3D information in current use. These include the general categories—point formats, polygon formats, and complex surface formats.
Point formats include independent points; lines of points where a plane intersects with a surface; 2.5D areas of points, commonly known as range images, that are single-valued in Z; and 3D point arrays, which are often generated by medical scanners. The point formats have many standard representations including the Range Image Standard (RIS) resulting from the European ESPRIT Research & Development Project 6911, IGES, and DXF published by AutoDesk, Inc., in the USA.
Polygon formats include polygons of different geometrical forms. Polygons may be 3- or more sided and formats may include mixed numbers of sides or always the same number of sides. Special cases, such as Delaunay triangulation, can specify the positioning of vertices and the relative lengths of the sides of polygons. Standard representations of polygon formats include STL, published by 3D Systems, Inc., in the USA; IGES; OBJ, published by Wavefront, Inc., in the USA; and DXF.
Complex surface formats include Bezier, NURBS, and COONS patches. Standard representations of complex surface formats include IGES, VDA-FS, SET, STEP, and DXF.
The objective of scanning can be simply to gather a number of three-dimensional points on the surface of the object, or it may be to create a computer model in a format that is useful for the application in which the model is to be used. It is generally true that a cloud of points alone is not much use in many applications and that more structure is needed to make a computer model efficient to manipulate in typical applications such as visualization, animation, morphing and surface or solid modeling.
There are often benefits to be gained from reducing the size of the files associated with the model formats. Any file, whatever its format, can be compressed using standard reversible utilities, such as PKZIP/PKUNZIP from PKWare in the USA. With 3D point arrays, an octet format can be used to reduce the size of the arrays that represent a surface. An octet format splits a cubic volume into eight smaller cubes, and only further subdivides cubes by eight if they contain information. An octet format is reversible. Moving from unstructured point representation to polygon or complex surface formats often produces large compressions but relies on approximations, so the process is nearly always irreversible. It is also difficult to automate so as to give good enough results. Chordal tolerancing is a commonly used method of reducing the quantity of discrete points in a 2D or 3D polyline. As an intermediate data structure, chordally toleranced polylines have disadvantages: they do not record the orientation of each stripe; they do not record breaks in the data, but assume that all the points are connected by a surface; and they do not record jumps in the data, such as those caused by occlusions.
Most scans today are carried out using a multiply jointed arm with a tip reference point. It is usual to first mark the object to be scanned with a permanent or temporary marking device, such as an ink pen or scribe, to create a polygonal network of splines. A single point is then scanned at each network intersection. On the computer, the points are linked together into a polygonal structure. The overall process (marking, scanning, and linking) of creating a 3D polygonal model is at a typical rate of 1 point (or vertex on the model) every 3 seconds. In some implementations, the network is not marked on, but appears on a computer display as each point is scanned. With this implementation, the network is built up interactively. This method is suitable for models with a relatively small number of vertices, i.e., hundreds and thousands. The method is very slow, requires skill, patience, and concentration, and is expensive in human time—particularly for large, detailed objects that can take three weeks to scan.
An alternative method of scanning with a multiply jointed arm and contact tip reference point has often been tried, in which independent points are rapidly captured without the aid of a network. The points are then input into a surfacing software package, which then constructs a polygonal network between the points. However, the “polygonization” of unorganized data points is usually very slow, and speed decreases significantly as the number of points increases. The results are usually so poor as to be unacceptable. There is usually a significant amount of hand editing of the data required.
Where a CNC scanning machine is used, the intermediate data structures are usually range images. A number of unregistered range images may be registered, polygonized, and integrated together. The raw data is a number of range images of an object—typically from 5 to 20 in number—with each one either being a cylindrical or a linear range image. The process is not automatic and requires a combination of operator guidance and automated execution of algorithms. The operator first tries to align (i.e., register) the range images to each other on the computer using a graphics display. This process is not accurate and is followed by an automatic least squares fitting process that attempts to adjust the position and orientation of each range image such that they fit together as well as possible. This process is lengthy, often taking hours on a powerful computer. Each range image is then independently polygonized into a network of 2.5D triangular polygons. Finally, the networks of triangular polygons are integrated. The output is a single 3D polygon data set. The process is expensive, both in terms of capital equipment cost and people time. It can take up to two years to become skilled enough to scan objects to produce good enough models. It can work and produce good results for detailed objects.
For smooth objects, where the objective is to create complex surface formats, a coordinate measuring machine with a contact tip reference point is commonly used. It is usual to mark up the object with the desired surface patch boundaries by using a marking device, such as a pen or a scribe. These patch boundaries are then hand digitized with the contact point probe. The software package then generates a CNC scanning program that automatically takes more points along the boundaries and inside the patches. The software then automatically generates a first attempt at the surface model. This method is used because it is quicker and easier for the operator to define patch boundaries that will lead to a surface model with the desired structure before scanning, than to define the patch boundaries after scanning using a software package on a computer with a display showing the scanned points. It can take several days, and often weeks, to create patch boundaries that are usually splines, then create the patches and then trim the patches to form a surface model by using only the scanned points and a computer.
Scanned data points have been displayed in real-time. The display of points has the disadvantage of easily becoming confusing to interpret, and also that the observer does not know when parts of the object's surface have been missed during scanning.
SUMMARY OF THE INVENTION
According to the present invention, there is provided a scanning apparatus for scanning an object to provide a computer model thereof, comprising means for scanning the object to capture data from a plurality of points on the surface of the object, where the scanning means captures data from two or more points simultaneously; means for generating intermediate data structures therefrom; means for combining the intermediate data structures to provide the model; means for display; and means for manually operating the scanning apparatus.
The apparatus is most efficient in the quest to reduce the time and cost of generating a computer model from a real-world object by means of scanning with both time and cost reductions of an order of magnitude achieved over conventional techniques. The model is generated automatically from the intermediate data in a form that may be immediately usable in a wide range of applications.
The scanning means may use structured light to more quickly scan the surface of the object. The scanning means may also be operable to sense the color of the surface of the object, resulting in a model more like the real-world object.
Preferably, the scanning means therein comprises means for generating a signal for scanning the object, signal detection means for detecting the signal reflected from the object, and means operable in response to the detected signal to provide the data for the intermediate data structure.
The structured light is preferably projected as a plane of light such that a stripe is formed on a viewing plane that is situated normal to the projection axis of the signal generating means and situated at the average standoff distance from the signal generating means.
Alternatively, the structured light may be projected such that a pattern is formed on an area of a viewing plane that is situated normal to the projection axis of the signal generating means and situated at the average standoff distance from the signal generating means.
The signal generating means may be an illumination source such as a laser diode or one or more bulbs.
During scanning, the operator may see the surface he has scanned appearing in real-time as rendered polygons on the display such that he may more easily scan the object. The operator may mount the object on a turntable, and then he may scan from a seated position rather than walking around the object. The scanning means can be mounted on many different types of manual machines, giving enhanced flexibility for objects ranging in size from small to very large. The scanning means can be mounted on a multiply jointed arm for accurate scanning. The scanning means may be a self-contained unit that contains a remote position sensor and incorporates a display to give the most flexibility in scanning. According to the invention, there is also provided a method for scanning an object to provide a computer model thereof, comprising the following steps:
    • Manually scanning the object with a signal by manual operation of a signal generating means;
    • Detecting the reflected signal;
    • Generating intermediate data structures for the points;
    • Combining the intermediate data structures to provide the model; and
    • Displaying the data, wherein the data is captured from a plurality of points on the surface of the object simultaneously.
According to a further aspect of this method of the invention, the color data is also captured from the object and then mapped on to the model.
Preferably, the data is displayed simultaneously as a plurality of display polygons.
BRIEF DESCRIPTION OF THE DRAWINGS
The foregoing aspects and many of the attendant advantages of this invention will become more readily appreciated as the same become better understood by reference to the following detailed description, when taken in conjunction with the accompanying drawings, wherein:
FIG. 1 is a schematic representation of a scanning apparatus according to the invention;
FIG. 2 is a schematic perspective drawing of a probe;
FIG. 3(a) illustrates a first embodiment of the configuration of the optical elements housed in a probe;
FIG. 3(b) illustrates a lamp configuration;
FIG. 3(c) illustrates an alternative to the lamp configuration of FIG. 3(b);
FIG. 3(d) is a graph illustrating intensity as a function of distance along the line A1-A2 of FIG. 3(a);
FIG. 4 illustrates a second embodiment of the configuration of the optical elements housed in a probe;
FIGS. 5(a) to 5(d) illustrate a method of calibrating the color of the scanning apparatus of FIG. 1;
FIG. 6 is a schematic block diagram illustrating the capture of color and position data;
FIG. 7 is a schematic representation illustrating how the detection of the color and position data is synchronized;
FIG. 8 is a schematic representation of the end of the multiply jointed arm of the apparatus of FIG. 1;
FIG. 9 is a schematic illustration of the turntable and multiply-jointed arm of the apparatus of FIG. 1;
FIG. 10 illustrates the mounting of the probe on the multiply jointed arm;
FIG. 11 illustrates the alignment of the mount on the multiply jointed arm;
FIG. 12 illustrates the alignment of the probe on the multiply jointed arm;
FIG. 13 illustrates a linear range image;
FIGS. 14(a)-14(b) illustrate cylindrical range images;
FIG. 15 illustrates the range image placing method;
FIG. 16 illustrates the surface normal extension method;
FIG. 17 represents the structure of a single point in a range image;
FIG. 18 illustrates the representation of an object by three range images;
FIG. 19 illustrates the range image updating method;
FIGS. 20(a)-20(b) illustrate first and second stripes captured on a CCD array;
FIGS. 20(c)-20(f) illustrate the respective captured data points and strings of data points from the first and second stripes of FIGS. 20(a) and 20(b);
FIG. 20(g) illustrates polygons generated from these strings;
FIG. 21 illustrates a probe mounted on a head and a heads-up display;
FIG. 22 illustrates color image mapping;
FIG. 23 illustrates the timing for position interpolation;
FIG. 24 illustrates the triggering of the arm position measurement;
FIG. 25 illustrates an object with marked lines thereon;
FIG. 26(a) illustrates a probe mounted on a multiply jointed arm, which is mounted on a horizontal arm machine;
FIG. 26(b) illustrates two opposing horizontal arm machines;
FIG. 27 illustrates a human foot being scanned;
FIG. 28(a) illustrates stripe sections of a pipe network and panel;
FIG. 28(b) illustrates partial polygon models of a pipe network and panel;
FIG. 28(c) illustrates extrapolated polygon model of a pipe network;
FIG. 29(a) illustrates stripe scanning; and
FIG. 29(b) illustrates area scanning.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
Referring now to FIG. 1, a scanning apparatus 100 comprises a multiply-jointed arm 1 with an arm control unit 2 and a probe 3. The control unit 2, which includes a processing unit 10, is coupled to a computer or processing unit 4 and color monitor 7. The probe 3 is also coupled to a probe control unit 5 that is likewise coupled to the computer 4. The intermediate data is displayed on the color monitor 7 as rendered polygons 13. The probe 3 provides a stripe 8, which is projected onto an object 9 positioned on a turntable 14. The stripe 8 is in the form of a plane of light. Buttons 6 are also provided to control data capture. A color frame grabber 11 in the computer 4 is mounted on a standard bus 12 and coupled to the probe 3.
The computer 4, probe control unit 5, arm control unit 2, buttons 6, color frame grabber 11, and monitor 7 are provided separately. For example, the computer 4 and monitor 7 may be a personal computer and VDU, although for certain applications, it may be more convenient for one or all of them to be provided on the probe 3.
The multiply jointed arm 1 and the probe 3 are coupled to the computer 4 by means of the control units 2, 5 discussed above. The computer 4 receives information from the scanning stripe 8, the position/orientation of the arm 1 in terms of X,Y,Z coordinates, with the coordinates I,J,K of the surface normal of the probe 3 and color data if required.
Referring now to FIG. 2, an embodiment of the probe 3 for use with remote position sensors 261 is shown. The probe 3 is lightweight and resilient so as to withstand being knocked without losing its calibration.
Referring now to FIG. 29(a), the structured light is preferably projected as a plane of light 364 such that a stripe 8 is formed on a viewing plane 360 that is situated normal to the projection axis 361 of the signal generating means 362 and situated at the average standoff distance S from the signal generating means.
Referring now to FIG. 29(b), the structured light may be projected such that a pattern 363 is formed on an area 365 of a viewing plane 360 that is situated normal to the projection axis 361 of the signal generating means 362, and situated at the average standoff distance S from the signal generating means. The pattern 363 in this example is a number of stripes that may be of different colors.
Two alternative embodiments of the probe 3 are described. Now referring to FIGS. 3(a)-3(d), one embodiment of the probe 3 is described. The probe 3 comprises a number of components mounted on a base plate 20. A stripe generator 22—for example, containing a laser diode—provides the stripe 8 for projection onto the object 9 to be scanned. Typically, the laser will be of Class 2 or less, according to the CDRH 1040.11 classification in the USA, viz. less than 1 mW in power at 670 nm wavelength. The stripe 8 is nominally focused at some point P. A lens assembly 24 is used to focus the image onto a high resolution CCD camera 25. The camera 25 may be oriented at an angle, satisfying the Scheimpflug condition. An optical interference notch filter 26 is used to selectively image light of the wavelength of the stripe 8. A simple glass cut-off filter 27 reduces ambient light within the probe.
Information on the color of the surface of the object may be recorded in intensity scales or color scales, such as RGB.
An intensity scale estimate of the color of the surface may be obtained by recording the reflected light level of the stripe as it is imaged on the high resolution CCD camera 25 at each point. A high level indicates a light surface at that point that scatters much of the projected light, and a low level indicates a dark surface at that point that absorbs much of the projected light. These indications may be false, such as on specular surfaces, as known to a person skilled in the art.
A color estimate of the color of the surface can be obtained by means of a color camera 29 comprising a color CCD array. Color scanning requires that the object be lit. Lighting may be by means of ambient light from external light sources or by lamps situated on the probe. There are several disadvantages in using ambient light only for color scanning. First, ambient light intensity varies over nearly every object in a standard environment such as a room with overhead lighting. Second, it can be a time-consuming procedure to position a number of lights so as to evenly light the object. Third, the probe itself can cast shadows onto the object.
Four lamps 28(a)-28(d) are provided around the lens 31 of the camera 29 for illumination, or a ring lamp 28′ could be used. This configuration is used to avoid any problems of shadowing. The lamps may include respective back reflectors 32a-32d, where appropriate. The lamps are set to give an average intensity of around 80-150 Lux, but the intensity could be much more or less and, during use, ambient light is reduced significantly below this level—for example, by dimming or switching off overhead lights. This removes any effects from variations in ambient light. The lamps may be tilted with respect to the camera axis to ensure that light of a more even intensity is projected onto the object 9 at the average scanning standoff distance. The lamps should be small in size to obtain the least possible weight penalty, especially if two or more lamps are used. To extend their life, they can be operated at a lower voltage at certain time periods—for example, while preparing for the capture of each image. When the operator triggers a color image capture, the voltage to the lamps can be momentarily increased to maximum. The lamps are only switched on for color capture. During the process of 3D capture, the lamps are switched off where this will also have the added advantage of increasing the signal to noise ratio. An access panel 35 can be provided over the lamps so that the lamps can be easily replaced without opening the probe and risking losing its calibration. To improve results when scanning a reflective surface, polarizing material 34 is placed between the camera 29, lamps 28a-28d, and the object 9. To reduce variations in the projected light, diffusers 33a-33d are placed in the light path between each lamp and object 9 or, alternatively, the lamp glass or back reflectors are treated accordingly.
Referring now to FIG. 4, the second embodiment of the probe 3 is described. As is also preferable with the first embodiment, a removable two-piece cover 21 is mounted on the base plate 20 to define a housing for the components of the probe 3, both to exclude ambient light and to protect the components. It may have a metallic interior coating to reduce electromagnetic emissions and susceptibility. The cover is appropriate for both embodiments.
The probe 3 in the second embodiment can also capture the color of the surface of the object 9. The color is captured using an optical system that is coplanar with the light stripe 8. The main advantage of coplanarity is that it provides the color of each point directly, whereas non-coplanar systems, such as in the first embodiment, require extensive post-processing computation to map the captured color data onto the captured 3D data. In non-coplanar systems, the position of the stripe in the color camera image varies significantly because the two cameras are not aligned, leading to an embodiment that is operated for scanning in two passes—3D and color—instead of the one pass that is achievable with the second embodiment.
During use, with the stripe generator 22 switched off and the lamp 28 switched on, a color sample 32 of the object 9 is directed back along the direction in which the stripe 8 would lie if the stripe generator were illuminated; it is then reflected by a half-silvered mirror 30 and focused onto the color camera 29 via a lens assembly 31.
In a probe 3, where the stripe generator 22 produces a white stripe, the color and position can be captured synchronously. The environment would need to be dark in this case, and lighting 28 would not be required.
A look-up method is provided for determining where to read the color from the rectangular array of the color camera 29, depending on the object distance S at all points along the stripe 8. Now referring to FIGS. 5(a)-(d), a color camera look-up table 166 is prepared by viewing the stripe—when it is permanently illuminated—at a sufficient number of points in a rectangular array covering the distance measuring depth and the stripe measuring length. A fixed-length, flat object 161 is one item that can be used for this purpose, and it typically has a white surface. The flat object 161 is placed on a light stripe absorbent background 162. The probe 3 and the flat object 161 move relative to each other in the direction of the W axis such that stripes 163 are imaged. The color camera image 165 shows the imaged stripes 163a collected in a region 164 of the array. In the case of perfect coplanarity, the imaged stripes will be superimposed (see 163b in FIG. 5(c)). The look-up table 166 is then built up so that a scanned point 167 on an object with coordinates V1,W1 will have a color image position Cx,Cy in the look-up table 166, which stores an array of values of Cx,Cy for the V,W ranges. During scanning, the color image is usually scanned before the stripe measurement. The extent of the look-up table 166 can determine how much of the color image 165 needs to be stored while the points are being calculated, i.e., the extent of region 164. This reduces the amount of memory needed in the computer 4 and the required bandwidth for transferring color information from the camera 29 into the computer 4, allowing the possible use of lower-cost units, such as the frame grabber 11 located in the probe control unit 5, or in the computer 4 on a bus, as discussed above. It is probably not worth the expense of building a perfectly coplanar system, as a roughly coplanar system can be calibrated as described above to produce an equally effective result.
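As an illustration of the look-up described above, the following Python sketch builds a rectangular (V, W) grid that stores, for each cell, the color image position Cx,Cy observed during calibration and reads it back during scanning. The class name, grid resolution, and use of NaN for uncalibrated cells are illustrative assumptions, not taken from the disclosure.

```python
import numpy as np

class ColorLookupTable:
    """Sketch of look-up table 166: (V, W) cell -> color image position (Cx, Cy)."""

    def __init__(self, v_range, w_range, n_v, n_w):
        self.v0, self.v1 = v_range
        self.w0, self.w1 = w_range
        self.n_v, self.n_w = n_v, n_w
        # Each cell holds (Cx, Cy); NaN marks cells not yet calibrated.
        self.table = np.full((n_v, n_w, 2), np.nan)

    def _cell(self, v, w):
        i = int((v - self.v0) / (self.v1 - self.v0) * (self.n_v - 1))
        j = int((w - self.w0) / (self.w1 - self.w0) * (self.n_w - 1))
        return np.clip(i, 0, self.n_v - 1), np.clip(j, 0, self.n_w - 1)

    def calibrate(self, v, w, cx, cy):
        """Record where the stripe point (v, w) images in the color camera."""
        self.table[self._cell(v, w)] = (cx, cy)

    def lookup(self, v, w):
        """Return the color image position (Cx, Cy) for a scanned point."""
        return self.table[self._cell(v, w)]
```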
Referring now to FIGS. 6 and 7, on coplanar systems, if the light stripe 8 is of a color other than white, then it is difficult to capture the object's surface position and color at the same time. To overcome this problem, the light stripe 8 and color camera 29 can be switched on and off such that the color is recorded shortly before or shortly after the position. Adjusting circuitry can be provided to change the exposure times of the color camera 29 and the stripe generator 22 and to ensure that the gap between the color capture and the stripe capture is minimized. To achieve this, a video synchronization generator 60 generates a synchronization signal 61 of pulses at video camera rate—which, using the CCIR format, is 50 times per second. The synchronization signal 61 is fed into the high-resolution camera 25 and the color camera 29. The exposure time of the color camera 29 can be set manually using a switch 63 or remotely with an electrical signal. The synchronization signal 61 is also fed into a standard circuit 62, which switches on the stripe generator 22 for a period of time after an initial delay. The period of time that the stripe generator is on may be set manually using a control 64, and the delay between when the synchronization signal 61 is received and the stripe generator 22 is switched on may also be set manually using a control 65. With reference to FIG. 7, the synchronization signal 61 is represented by the trace SYN. The illumination of the stripe generator 22 is represented by the trace L. The exposure of the color camera 29 is represented by the trace C. The exposure of the high-resolution camera 25 is represented by the trace M. This is just one example of a way of controlling coplanar probes.
Referring now to FIG. 8, the probe 3 projects the stripe 8 so that the measuring area starts before a tip reference point 51 provided at the end of the multiply-jointed arm 1 and extends a further distance. The tip reference point 51 may be the tip of a cone, a sphere, or a rolling device, such as a rolling wheel or ball, or anything similar that provides a tip. In the embodiment described herein, the tip reference point 51 is the center of a sphere 52. This tip reference point 51 enables the operator to scan hard objects by tracing the sphere 52 along the object 9 in strips in contact with the object 9. For soft objects, the sphere 52 acts as a scanning guide, and typical instructions might be to keep the tip reference point 51 about 20 mm from the object 9 while scanning. In this way, the probe 3 may be kept close enough to the object 9 for the stripe 8 to be in the measuring area but without touching the soft object 9. The stripe 8 typically starts 100 mm from the tip reference point 51 at the end of the arm 1, but could be much closer or further away, and can be used to measure objects lying between the two points W1 and W2, as measured from the end 55 of the probe 3. The ideal method—from a usability point of view—is for the plane of the stripe 8 to be coaxial with the axis 54 of the last section 50 of the arm 1. This is the case for a purpose-designed arm and a hand-held probe. The probe 3 may often be retrofitted onto the arm 1, and because a mechanical arm has a diameter of typically 20-60 mm, this presents an alignment problem. In this case, the plane of the stripe 8 is not coaxial, but may be either in a plane 53 parallel to the arm end axis 54 or in a plane 53a angled to this axis 54, so as to cross the axis 54 at some point. By crossing the axis 54 of the arm 1 somewhere in the measuring range, the ergonomics of the arm 1 can be enhanced because the light plane is in an easier-to-use position. This crossover is typically somewhere between the tip reference point 51 and the end W2 of the measuring range.
Referring now to FIG. 9, the use of a manually rotated turntable has several advantages. For a given arm size, larger objects can be scanned. The operator does not have to move around the object. This makes scanning physically easier and more enjoyable, and there is less chance of the operator or the arm accidentally knocking the object, or of any reference point being lost.
The position and coordinate system of the turntable 14 must be known relative to that of the arm 1. The tip reference point 51 can be placed in a locating cone or cup 202 in the table at a large radius. Points are transmitted regularly by the arm control unit 2 and recorded on the computer 4 as the turntable 14 is manually rotated. Functions that fit a plane and a circle through these points provide complete position and orientation information on the turntable 14 in the arm coordinate system.
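A minimal Python sketch of this alignment step follows, assuming numpy is available: a plane is fitted to the recorded tip points by singular value decomposition, and a circle is then fitted to the points within that plane (a Kasa algebraic fit is used here as one convenient choice). The function name and the choice of circle-fitting method are this sketch's assumptions.

```python
import numpy as np

def fit_turntable(points):
    """Fit a plane and a circle through tip points (N, 3) recorded while the
    turntable rotates; the circle center and plane normal locate the turntable
    axis in the arm coordinate system."""
    centroid = points.mean(axis=0)
    # Plane fit: the normal is the direction of least variation.
    _, _, vt = np.linalg.svd(points - centroid)
    normal, u, v = vt[2], vt[0], vt[1]          # plane normal + in-plane basis
    # Project onto the plane and fit a circle (Kasa algebraic fit).
    x = (points - centroid) @ u
    y = (points - centroid) @ v
    A = np.column_stack([2 * x, 2 * y, np.ones(len(x))])
    (cx, cy, c), *_ = np.linalg.lstsq(A, x**2 + y**2, rcond=None)
    center = centroid + cx * u + cy * v          # turntable axis position
    radius = np.sqrt(c + cx**2 + cy**2)
    return center, normal, radius
```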
During scanning, it is important to know the turntable angle. The turntable may be designed to have precise mechanical resting positions 206a-206d, e.g., every 15 degrees. These resting positions 206 would be apparent from a pointer 208 indicating the angle on an attached scale 209 of 360 degrees. The operator could then type the new angle into the computer each time the turntable was rotated. However, the process of typing in an angle means that the operator may have to put down the probe 3. This slows down scanning, and there is scope for an error to be made by the operator.
With an electrical connection 204 between a position sensor 203 on the turntable 14 and the computer such that the computer could know either precisely or roughly the turntable angle, the process is faster and less error prone. If the sensor 203 is accurate—such as an encoder with, for example, 10,000 lines—then the turntable 14 could be positioned at any orientation and its angle known precisely. This allows for scanning while rotating the turntable, although care must be taken that dynamics do not lead to position errors or to the object moving relative to the turntable. If the sensor 203 is less accurate—such as a potentiometer—then the turntable 14 could also have precise mechanical resting positions 206. This gives the advantages of high accuracy and lower manufacturing cost. Each time the probe 3 captures data from the object 9, the software must check for movement of the turntable 14. If it has been moved, then with a less accurate turntable sensor 203, probe data should be thrown away until the turntable 14 has stopped moving. In all cases, the turntable 14 should be capable of being operated by one hand such that the probe does not have to be laid down. It is often the case that an object on a turntable is scanned with regular increments, e.g., eight scans every 45 degrees. To aid the operator in incrementing by X degrees, differently shaped and/or colored icons could be placed every X degrees on the scale and on the other regular intervals. Typical intervals might be 45, 60, 90 degrees. With reference again to FIG. 2, this method can also be used with a probe 3 including one or more remote position sensors 261 with a tip reference point 51. The manual turntable may be driven by a motor operable by means of hand controls.
Each time a probe 3 is mounted on the arm 1, if the mounting is not repeatable to a high accuracy, then the transformation in six degrees of freedom between the arm coordinate system X,Y,Z and the probe coordinate system U,V,W will have to be found.
A mounting device 210, 214 for the probe is illustrated in FIG. 10. Accurate and repeatable geometric positioning of the probe on the arm is required. This is provided by the mounting device 210, 214. The mounting device 210, 214 provides a standard mechanical interface that may preferably be used for all probes and arms, which is both small and light, and that is easy to use to mount and dismount the probe onto and off the arm. The mounting device comprises an arm side mount 210 that comprises a flat mating surface 211 with two precisely dimensioned projections 212 located in precise positions on the mating surface 211. The mounting device also comprises a probe side mount 214 comprising a flat mating surface 215 and two precisely dimensioned recesses or holes 216 corresponding to the two projections 212 in the arm side mount 210. It is essential that the geometric repeatability in position and orientation is very high.
A standard mounting device between any arm and any probe gives several advantages. When the arm has to be used without the probe, then the probe has to be removed. If the mounting device is not repeatable, then the system will require realignment before use each time the probe is remounted.
Typically, a range of probes will be supplied with different weights, speeds, sizes, and accuracies corresponding to different functions. Each probe can be supplied with alignment data relative to the datum of the probe side mount 214 and, in this way, any probe may be attached to the arm without the requirement of realignment. A user may also have one or more different arms. In order to fit the same probe to two different arms, the user needs only to acquire an extra adapter for the second arm, which would fit onto the arm and include the arm side mount 210 of the mounting device 210, 214. The arm side mount 210 may be attached to any machine, including multiply-jointed arms, horizontal arms, and orientation devices such as manual, two-axis orientation devices.
To calculate the six degrees of freedom transformation between the arm coordinate system and the probe coordinate system, one can either treat it as one transformation or as a multiplication of two transforms if the accurately repeatable mount is considered as an intermediate reference point, i.e., the transformation matrix Tap between the arm and the probe coordinate systems is equal to the transformation matrix Tam between the arm and the mount coordinate systems multiplied by the transformation matrix Tmp between the mount and the probe coordinate systems:
Tap=(Tam)·(Tmp)
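A minimal illustration of this composition using 4x4 homogeneous transforms (rotation plus translation), assuming numpy; the helper names are illustrative:

```python
import numpy as np

def compose(T_am: np.ndarray, T_mp: np.ndarray) -> np.ndarray:
    """Tap = Tam @ Tmp: probe coordinates -> arm coordinates via the mount."""
    return T_am @ T_mp

def transform_point(T: np.ndarray, p: np.ndarray) -> np.ndarray:
    """Apply a 4x4 homogeneous transform to a 3D point."""
    return (T @ np.append(p, 1.0))[:3]
```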
The transformation matrix Tam can be found in several ways. Now referring to FIG. 11, a particularly simple, cost-effective, and practical method involves the use of a reference plate 220. The reference plate 220 has three orthogonal flat surfaces 222 and a mounting point 221 to which the arm side mount 210 can be attached in a precisely known position relative to the orthogonal planes. Tam can be calculated using the following steps (a sketch of the plane-fitting computation follows the list):
    • The reference plate 220 is fixed so that it cannot move relative to the arm coordinate system;
    • The arm side mount 210 is fixed rigidly onto the arm 1 (if it is not already present), without the probe attached;
    • The three orthogonal planes 222 of the plate 220 are measured by the tip reference point 51 on the arm such as to fully define the position and orientation of the reference plate;
    • The arm mount is then mated with the mounting point 221 on the reference plate 220;
    • The arm position and orientation are recorded; and
    • The transformation matrix Tam is then calculated from the known geometry of the reference plate 220 and the measurements from the previous steps.
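The plane-fitting part of this procedure can be sketched as follows, assuming numpy. The sketch fits the three orthogonal planes 222 from measured tip points, intersects them to locate the plate origin, and orthonormalizes the measured normals into a rotation; the remaining bookkeeping that combines this frame with the recorded arm pose at the mounting point 221 and the plate's known geometry to yield Tam is omitted, as are handedness and sign checks.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through tip points: returns (normal, d) with n.x = d."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[2]
    return normal, normal @ centroid

def plate_frame(pts1, pts2, pts3):
    """Build the reference plate frame (in arm coordinates) from tip points
    measured on each of the three orthogonal surfaces 222."""
    planes = [fit_plane(p) for p in (pts1, pts2, pts3)]
    N = np.array([n for n, _ in planes])     # stacked plane normals
    d = np.array([d for _, d in planes])
    origin = np.linalg.solve(N, d)           # intersection of the three planes
    # Orthonormalize the nearly orthogonal measured normals into a rotation.
    u, _, vt = np.linalg.svd(N.T)
    R = u @ vt
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, origin
    return T
```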
The above method can be encapsulated in the main scanning software provided with the scanning system or in a separate program. This has the advantage that much time is saved compared with the alternative of the user calculating Tam manually from arm positions output by the arm manufacturer's software and then manually inputting the resulting Tam into the main scanning system software.
The probe side mount 214 is integral to the probe and does not move relative to the probe coordinate system. The transformation matrix Tmp is provided by the probe supplier with the calibration data for the probe.
The direct calculation of Tap using the arm and probe coordinate systems, but without involving an intermediate mount, can be carried out in many ways. Most of the ways involve using the probe mounted on the arm to capture data from one or more geometrical objects. The problem has proven to be very difficult, since many of the standard methods produce inaccurate results in either the orientation or position components, often due to inherent instabilities triggered by relatively small errors. One way is disclosed, by example, for a stripe probe (a sketch follows the steps below):
    • Now referring to FIG. 12, the transformation matrix Tap is calculated by:
    • 1. Mounting the alignment calibration object with three orthogonal faces 230 so that the three orthogonal flat surfaces 231, 232, 233 are reachable and accessible by the probe 3 mounted on an arm 1;
    • 2. Capturing at least three stripes from three different orientations on the first flat surface 231. The orientations need to be different enough to provide stability to the mathematical algorithm (in practice, a variation of at least five degrees between every two of the stripes is sufficient);
    • 3. Repeating this three or more stripes capture on a second flat surface 232 and on a third flat surface 233; and
    • 4. Processing the data from the nine or more stripes in an iterative fashion to output Tap.
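One hedged sketch of the iterative processing in step 4, assuming scipy is available: Tap is parameterized as a rotation vector plus translation and solved by nonlinear least squares, with the residual being the out-of-plane error of all world-frame points gathered on each face. The input layout (face id, arm pose, probe-frame points per stripe) and the parameterization are this sketch's assumptions, not the patented algorithm itself.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def coplanarity_residuals(x, stripes):
    """stripes: list of (face_id, arm_pose_4x4, probe_points_Nx3)."""
    T = np.eye(4)
    T[:3, :3] = Rotation.from_rotvec(x[:3]).as_matrix()
    T[:3, 3] = x[3:]
    residuals = []
    for face in set(f for f, _, _ in stripes):
        pts = []
        for f, pose, p in stripes:
            if f != face:
                continue
            homog = np.hstack([p, np.ones((len(p), 1))])
            pts.append((pose @ T @ homog.T).T[:, :3])   # probe -> world
        pts = np.vstack(pts)
        centered = pts - pts.mean(axis=0)
        normal = np.linalg.svd(centered)[2][2]          # best-fit plane normal
        residuals.append(centered @ normal)             # out-of-plane error
    return np.concatenate(residuals)

def solve_tap(stripes, x0=np.zeros(6)):
    fit = least_squares(coplanarity_residuals, x0, args=(stripes,))
    T = np.eye(4)
    T[:3, :3] = Rotation.from_rotvec(fit.x[:3]).as_matrix()
    T[:3, 3] = fit.x[3:]
    return T
```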
The handedness of the coordinate systems of the arm 1 and the probe 3 would be known. The relationship between the normals of the surfaces on the alignment calibration object 230 could be specified. One way of doing this is by labeling the three faces 231, 232, 233 and specifying the order in which the three faces must be scanned.
The main advantages of the above apparatus and its method of aligning the probe are (1) that it involves a single alignment calibration object that is cheap to manufacture to the required geometrical tolerance and is relatively light and compact; (2) that the method is robust, simple to carry out from written instructions and quick; (3) that the processing can be encapsulated in the main scanning software provided with the scanning system or in a separate program; (4) that there is no need to have any preliminary geometric information about the orientation and position of the probe relative to the tip of the arm at the start of this method—for example, the probe could be slung on the underside of the arm pointing backwards and the method would work; and (5) that if the probe is knocked or damaged such that Tmp changes but the calibration is still valid, then this method of alignment will still work.
In using scanning systems to provide data for 3D applications software, the need for specific 3D reference points in addition to 3D surfaces became apparent. Some applications that require 3D surfaces and also require 3D reference points are animations involving joint movements, where a joint is to be specified in the context of the 3D model. In this case, the joint can be quickly defined from one or more 3D reference points. A new method of using the scanning system is to use the probe 3 to scan the surface and to use the tip reference point 51 to capture individual 3D points by contact. An alternative method is to project a calibrated crosshair onto the object and use an optical method of picking up individual points. This can be used in both stripe and area systems. The calibrated crosshair is usually switched on just during the period in which individual points are captured. There could be two modes—in the first mode, individual points are captured each time a button is clicked; and in the second mode, a stream of individual points is captured from when a button is first pressed until it is pressed again. The second mode is commonly used for tracing out important feature lines, such as style lines or patch boundaries. In the case of a stripe sensor, instead of projecting a crosshair, it may only be necessary to project a second stripe at the same time as the main stripe. The crosshairs may be calibrated by the probe supplier using a three-axis computer controlled machine, a known calibration object, and standard image processing techniques.
The scanning apparatus 100 is operable to scan an object and thereby generate a computer model of the object's surface by: using an intermediate data structure for efficiently storing points on the surface of the object during scanning; creating an instance of the intermediate data structure for the particular object; and controlling the storage of the scanned points in the intermediate data structure during scanning with an operator control system.
Three examples of intermediate data structures are points, encoded stripes, and range images.
Points have the disadvantage of being unorganized; much of the information obtained from the structure of the probe and the method of its use is lost if the 3D data is reduced to points.
In the case of stripe probes, much information may be retained to improve the speed and quality of construction of a model from intermediate data if an encoded stripe intermediate data structure is used. Such a structure stores data from one stripe at a time. The stripes are stored in the order of capture. The time of capture of each stripe is recorded. The orientation of the probe is recorded for each stripe. The raw data points from the stripe may be processed before storing in the data structure to determine jump and break flags and to sample or chordally tolerance the raw data points to reduce the size of the intermediate data structure without losing any significant information.
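A minimal Python sketch of such an encoded stripe structure, with illustrative field names; the representation of the break and jump flags as string markers in the point list is an assumption of this sketch:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

BREAK = "break"   # gap in the recorded surface
JUMP = "jump"     # depth discontinuity (vertical wall or occlusion)

@dataclass
class EncodedStripe:
    time: float                                   # capture time of this stripe
    probe_orientation: Tuple[float, float, float] # probe orientation at capture
    # Each entry is either an (x, y, z) point or a BREAK/JUMP flag string;
    # points may already be sampled or chordally toleranced.
    entries: List[object] = field(default_factory=list)

@dataclass
class EncodedStripeData:
    stripes: List[EncodedStripe] = field(default_factory=list)  # capture order
```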
In the case of area probes, the advantages of a range image as an intermediate data structure are well known. These advantages include a data structure that relates well to the area-based data capture method and the efficiency of storage in an image in which only Z values are stored.
An intermediate data structure can be used in which the surface of an object is described by means of a finite number of linear and cylindrical range images that are, to some extent, characterized by the shape of the object.
A linear range image 70 is illustrated with reference to FIG. 13. The range image 70 has a coordinate system U,V,W and a spacing of points dU in the U direction and dV in the V direction. The linear range image 70 contains in its header definition its relationship to the world coordinate system X,Y,Z, i.e., the arm coordinate system. In the disclosed invention, the linear range image 70 cannot store negative W values.
Cylindrical range images 71, 72 are illustrated in FIGS. 14(a)-14(b). The range image has a coordinate system W,R,A where A is an angle. The spacing of points is dW in the W direction and dA in the A orientation. The cylindrical range images 71, 72 contain, in their header definitions, their relationships to the world coordinate system X,Y,Z. In the disclosed invention, the cylindrical range images 71, 72 cannot store negative R values. The direction +R and position R=0 of a cylindrical range image defines whether the points stored are inside the range image, as in FIG. 14(a), or outside, as in FIG. 14(b).
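The two range image types might be represented as follows, a minimal Python sketch assuming numpy; the field names are illustrative:

```python
import numpy as np

class LinearRangeImage:
    def __init__(self, world_from_local, du, dv, n_u, n_v):
        self.world_from_local = world_from_local   # 4x4 header transform
        self.du, self.dv = du, dv                  # point spacing in U and V
        self.w = np.full((n_u, n_v), np.nan)       # depth along +W, >= 0 only

class CylindricalRangeImage:
    def __init__(self, world_from_local, dw, da, n_w, n_a):
        self.world_from_local = world_from_local   # 4x4 header transform
        self.dw, self.da = dw, da                  # spacing along W and angle A
        self.r = np.full((n_w, n_a), np.nan)       # radius from axis, >= 0 only
```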
Referring now to FIG. 15, the range image placing algorithm takes a scanned point and tries to place it into defined range images by projecting a ray along the normal to the range image 105. If the point has a negative value in a range image—for example, point 104—then it is not stored in that range image. If the point is outside the extent of that range image, it is not stored in the range image unless the range image is extensible—in which case, the algorithm extends the range image far enough to place the point. If a point already exists in the range image position in which the point is to be placed, then the two points are compared. If the distance between the two points in space is outside a tolerance d, such as the error of the scanner, then the nearest point 102 is stored and the furthest point 101 is rejected. If the two points are within the error of the scanner, then their values are averaged and the average value is stored.
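Building on the LinearRangeImage sketch above, the placing algorithm can be expressed roughly as follows; `grow_to` is an assumed helper for extensible images, and the grid rounding is simplified:

```python
import numpy as np

def place_point(ri, point_world, tolerance_d, extensible=False):
    """Try to place a scanned world-space point into one linear range image."""
    local = np.linalg.inv(ri.world_from_local) @ np.append(point_world, 1.0)
    u, v, w = local[:3]
    if w < 0:
        return False                       # negative values cannot be stored
    i, j = int(round(u / ri.du)), int(round(v / ri.dv))
    if not (0 <= i < ri.w.shape[0] and 0 <= j < ri.w.shape[1]):
        if not extensible:
            return False                   # outside a fixed-extent image
        ri.grow_to(i, j)                   # assumed helper: enlarge the grid
    existing = ri.w[i, j]
    if np.isnan(existing):
        ri.w[i, j] = w                     # empty cell: store directly
    elif abs(existing - w) > tolerance_d:
        ri.w[i, j] = min(existing, w)      # keep the nearer point
    else:
        ri.w[i, j] = 0.5 * (existing + w)  # within scanner error: average
    return True
```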
The range image-placing algorithm is simple and quick, but it is indiscriminate, often placing points incorrectly in range images and relying upon them being overwritten by a nearer point. If the range image is very dense, but populated with few values, then up to half the points populated could be incorrect because the surface normal of the point is incorrect. This can restrict successful scanning to coarse range images.
The range image-placing algorithm is improved upon with the surface normal extension. The range image-placing algorithm does not have an estimate of the surface normal of the point to be placed. Also, it does not take into account the orientation of the probe when the stripe is captured. To improve the range image placing, the fact that most stripes are scanned in sequence and have near predecessor and near successor stripes is used. For example, as illustrated in FIG. 16, there are eight neighboring points 116 on a stripe 114 and on its predecessor 113 and successor 115. These can be used to approximate the surface normal of a point P before it is placed in a range image. Three sequentially scanned stripes 113, 114, 115 are shown on an object 111 and projected onto a range image 112 as stripes 113a, 114a, and 115a. The point P, with coordinates Xp,Yp,Zp on stripe 114, has eight near neighbors 116 on the respective stripes 113, 114, 115 as described above, and an approximate surface normal Np with coordinates Ip,Jp,Kp. The probe orientation for stripe 114 is NS with coordinates Is,Js,Ks. By calculating the surface normals NS, NP, and NR, where NR is the normal of the range image 112, one is given a choice of two opposite surface normals. The correct one is the one that can be seen from the probe 3 orientation—assuming that the changes in probe orientation for the three stripes are not significant to the surface normal direction. If the surface normal NP of a point P is found to be facing away from the surface normal NR, then the point is not placed on the range image. This surface normal extension eliminates the majority of incorrect point placements in range images. In a practical implementation, three stripes of points are buffered before the first stripe of points is placed in the range images. The normal extension in a modified form can also be used for the first and last stripes by using the two successive or two previous stripes. When the three stripes 113, 114, 115 are nearly coincident, perhaps because the arm is moving too slowly, then the accuracy of the surface normal estimate is low and the normal cannot be used. A different normal calculation can be made using any neighboring points already placed in the range image instead of the neighboring stripes. A further normal extension to the range image placing algorithm combines both the stripe and the range image data to provide a better estimate of the surface normal. The calculations involved in these normal extensions can provide a bottleneck to the scanning process. The bottleneck can be overcome by using only two stripes, fewer samples (5 instead of 9), or a faster computer.
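A rough sketch of the facing test, assuming numpy, and assuming the sign convention that NS points from the surface toward the probe; gathering the up-to-eight neighbors from the predecessor, current, and successor stripes is left to the caller:

```python
import numpy as np

def accept_point(p, neighbors, probe_normal_ns, range_image_normal_nr):
    """Estimate the surface normal at P from its stripe neighbors and accept
    the point only if that normal faces the range image."""
    pts = np.vstack([p, neighbors])
    centered = pts - pts.mean(axis=0)
    np_est = np.linalg.svd(centered)[2][2]    # local plane normal at P
    if np_est @ probe_normal_ns < 0:
        np_est = -np_est                      # pick the normal seen by the probe
    # Reject points whose surface faces away from the range image normal NR.
    return np_est @ range_image_normal_nr > 0
```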
A number of range images that are positioned in the object coordinate system must be defined. The range images have specific mathematical definitions. Two basic types of range image are used—linear and cylindrical—as discussed above. A range image has direction and a zero position. The range image can only store points that are in front of its zero position. If there are two or more surfaces of the object in line with a point in the range image, then the surface that is nearest to the range image's zero position is represented in the range image. A range image can be constrained in size or unconstrained in size. The range image can be one image of fixed density or comprise a patchwork of a number of adjoining images of different densities. Each grid position in the range image is single-valued. The range image will typically use 4 bytes to store a depth value Z, from 1 to 4 bytes to store the gray scale or color value I, and from 1 to 3 bytes to store the orientation N. This is illustrated with reference to FIG. 17, which illustrates how a single point is represented. The 3 bytes suggested for orientation N will not permit a very accurate orientation to be stored. More bytes could be used, but there is a trade-off between data storage size, processing time for converting floating number orientations to/from a compressed integer format, and accuracy. Range images will normally require from 5 to 11 bytes to store each point, depending on the operator's requirements. For comparison, 20 bytes are typically required to store an ASCII X,Y,Z value.
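One possible packing consistent with the byte budget above, assuming Python's struct module; the exact layout (float32 depth, 3-byte RGB color, one signed byte per normal component) is an illustrative choice giving 10 bytes per point, not a layout specified in the disclosure:

```python
import struct
import numpy as np

def pack_point(z, rgb, normal):
    """Pack depth Z (4 bytes), color I (3 bytes), orientation N (3 bytes)."""
    n = np.clip(np.round(np.asarray(normal) * 127), -127, 127).astype(int)
    return struct.pack("<fBBBbbb", z, *rgb, *n)         # 4 + 3 + 3 = 10 bytes

def unpack_point(buf):
    z, r, g, b, nx, ny, nz = struct.unpack("<fBBBbbb", buf)
    normal = np.array([nx, ny, nz]) / 127.0             # lossy orientation
    return z, (r, g, b), normal / max(np.linalg.norm(normal), 1e-9)
```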
Now referring to FIG. 18, it is possible to define—for a finite object 9 of any shape—a finite number of range images of the above types that, for all practical purposes, enable any and every point on the external surface of the object 9 to be stored in one or more range images 81, 82, 83.
With objects with deep external features, such as the inside of an ear, it may not be possible or practical to scan all parts of the external surface, but it is possible to represent them theoretically.
The number and position of the range images used in the process are such that they are sufficient to be able to store enough of the surface of the object to enable a computer model of the desired accuracy and detail to be generated.
In a manual process, the number and position of all the range images may be defined by the operator before scanning. Alternatively, just one may be defined by the operator before scanning begins, followed by the definition of others at any point during scanning. The operator has a choice of several strategies. He can define range images and scan them one at a time. He can define a number of range images and scan simultaneously. He can define some range images and scan, followed by defining more range images and then scanning. If a point is scanned that does not fit onto any defined range image, then it is rejected. Alternatively, such rejected points could be automatically saved for placing into any new range images that the operator may subsequently define.
A typical number of range images varies from 1 to 20. Some range images need only be very small in size—small enough to cover a part of the object that is otherwise hidden from recording on other range images. The density of each range image can vary. For instance, a large, smooth part of the object does not need a high point density; but a small, finely detailed ornament may require a high point density. Each range image has a direction.
The operator may select the most suitable set of predefined range images from a library of range image sets. He can then edit the set to suit his object. Each new set is then stored in the library. A set can be thought of as a set of templates. As an example, for a human form there could be a range image set consisting of five cylindrical range images for the limbs and the trunk, together with five linear range images for the top of the head/shoulders, hands, and feet. For a car, one cylindrical range image for the car's body and two linear range images at each end of the car could be enough. It is important to note that the axis of a cylindrical range image must lie within the object or part of the object being scanned.
A range image is manually defined by the operator by first selecting the appropriate range image type—cylindrical or linear—and second, placing the probe to give the desired position and orientation of the range image and selecting it using the operator control system. For a cylindrical range image, the probe could be positioned to first give the position and direction of the axis and then to give the maximum radius.
Now referring to FIG. 19, a novel inference method is provided for updating a range image from the other registered range images. The inference method progresses through each array position in the range image 121 that is to be updated. The inference algorithm can update positions that either have no value, have a value with a surface normal that is steeper than a given value, have a less steep value, or any combination of these according to the operator's requirements. If a position in the range image 121 is to be updated, then that position is projected as a normal ray 126 onto all the other range images 120, 125, one at a time. If the ray intersects with another range image 120, then the local triangular surface element through which the ray first passes is located on the surface 123 and constructed. The value 124 at the intersection of the ray 126 and the triangular element 122 is then inferred and placed onto the range image being updated. If the ray intersects several range images 120, 125, then the inferred values from the range images are averaged after outliers have been removed. Outliers are removed by using a tolerance such as the error of the scanner. The original value (if it exists) in the range image 121 being updated could be included in this outlier removal/averaging process.
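A hedged sketch of the inference update, assuming numpy; `infer_value` (returning the depth at the first triangle the normal ray crosses on another range image, or None on a miss) and `target.ray` are assumed helpers, and using the median for outlier screening is this sketch's choice rather than something specified above:

```python
import numpy as np

def update_position(values, tolerance):
    """Average the inferred values after removing outliers against a tolerance."""
    vals = np.array([v for v in values if v is not None])
    if len(vals) == 0:
        return None
    kept = vals[np.abs(vals - np.median(vals)) <= tolerance]
    return kept.mean() if len(kept) else None

def infer_range_image(target, others, tolerance, infer_value):
    for i in range(target.w.shape[0]):
        for j in range(target.w.shape[1]):
            origin, direction = target.ray(i, j)   # assumed: normal ray 126
            vals = [infer_value(ri, origin, direction) for ri in others]
            v = update_position(vals, tolerance)
            if v is not None:
                target.w[i, j] = v
```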
The inference method is particularly used when an additional range image is added at a late stage in the scanning process or if range images are defined and scanned one at a time. The method enables surface areas that are nearly orthogonal to the range image, i.e., are almost vertical walls, to be well defined from data stored in the other range images. This provides a better set of points for carrying out the polygonization of one range image, resulting in a more accurate polygonal network and simplifying the polygonization process.
The probe 3 provides data that is displayed on the display monitor 7 as a rendered polygonal surface 13 in real-time or with an acceptable delay such that the user can watch the display monitor 7 and use the feedback of the rendered surface to guide his movement of the probe 3. Real-time is defined in the context of visualization as an operation reacting with a delay small enough to be acceptable to an operator in normal use. The probe 3 could be a stripe probe or an area probe. Where the probe captures 3D and color information, then the color information can be mapped onto the 3D model to texture it, as discussed below.
The surface to be displayed is calculated for stripe probes one additional stripe at a time. Referring now to FIGS. 20(a)-20(g), as a stripe 301 is captured, it is converted using one of several commonly used methods into a finite string 303 of 3D points 302a, 302b and flags 304, 305 in the world coordinate system X,Y,Z using the previously obtained calibration and alignment data for the probe 3. The maximum number of points is usually equal to the number of rows 281 or the number of columns 282 in the CCD array 25, depending on the orientation of the CCD in the optical setup. This disclosure will refer to rows, but it can equally apply to columns or any other way of organizing the data recorded by the camera. Where the stripe crosses a row, the position on the CCD array can usually be calculated to subpixel accuracy by one of several commonly used methods. If there is no data for one or more neighboring rows, such as missing positions 302e, 302f (not shown), then a “break” flag 304 is output into the string to indicate a break in the surface recorded. If there is a significant jump discontinuity in range above a maximum value that is appropriately set for the scanning resolution of the probe 3, such as between 302j and 302k, then a “jump” flag 305 is output into the string of 3D points to indicate either a vertical wall relative to the probe orientation or an occluded surface. The string 303 is filtered to reduce the number of points while effectively transmitting most of the information. The object of filtering is to reduce the amount of data processed for surface rendering and hence increase the speed of the rendering process with a minimal degradation in the quality of the surface. The first method of filtering is to skip some of the stripes. Another method is to sample the rows, e.g., take every nth row. A third method is to chordally tolerance all the points in the stripe and discard the points that are surplus and within tolerance. Where computer speed is limited, the first and second filtering methods are preferred because of their simplicity and because the resulting regular grid of points produces regular polygons that look good on the display, as opposed to the long, thin polygons that might result from a chordal tolerancing process, which can have rapidly changing surface normals if the data points are slightly noisy due to inaccuracies in the probe and the arm, and may present an unattractive “orange peel” effect on the display. The same process is repeated for a second stripe 306 capturing data points 307, resulting in a second string 308 of 3D values 307a, 307b, etc., and flags. A surface comprising triangular or quad polygons is then constructed between the two strings 303 and 308, resulting in a string of polygons 309. The string of polygons is then displayed by a renderer. The renderer may or may not take into account the previous polygons displayed, the viewpoint, and lighting model.
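The construction of the polygon string 309 between two point strings can be sketched as follows; representing break and jump flags as None entries in row-indexed strings is an assumption of this sketch:

```python
def polygons_between(string_a, string_b):
    """Build triangles between two successive row-indexed point strings;
    None entries (break/jump flags) interrupt the surface."""
    polys = []
    for r in range(min(len(string_a), len(string_b)) - 1):
        quad = (string_a[r], string_a[r + 1], string_b[r + 1], string_b[r])
        if any(p is None for p in quad):
            continue                       # a flag interrupts the surface here
        a0, a1, b1, b0 = quad
        polys.append((a0, a1, b0))         # split the quad into two triangles
        polys.append((a1, b1, b0))
    return polys
```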
If color has been recorded for a polygon, then the color information can be mapped onto the polygon. The precise mapping algorithm depends on the format of the raw color information, which depends on the design of the probe. The raw color information may comprise point, line, or area samples. The raw color information may be adjusted before mapping using colorimetric calibration and intensity calibration data. During the mapping process, the color information may be adjusted for the probe to polygon distance at point of color capture and polygon orientation to probe at point of capture. The basis for the adjustments is a set of calibration procedures carried out for each individual probe.
The viewpoint for the surface displayed can have a constant position, zoom, and orientation in the world coordinate system of the object such that, as the probe is moved, the surface displayed increases where the data is captured. The viewpoint is set before scanning starts, either with an input device (such as buttons) on the arm, foot pedals, a mouse, and a keyboard, or by using the probe to determine the viewpoint. Alternatively, the viewpoint can have a constant position, zoom, and orientation in the probe coordinate system such that, as the probe moves, the surface is completely re-rendered at regular intervals, each time with the new surface displayed where the data has been captured, with the regular intervals being at an acceptable real-time rate, such as 25 displays per second or less often. Alternatively, the viewpoint can have a constant position, zoom, and orientation in the world coordinate system, with the surface displayed increasing where the data is captured, that is completely updated to the viewpoint of the probe coordinate system on operator demand, such as by the depressing of a button or foot pedal, or at regular time intervals, such as every 10 seconds. The different methods for updating the viewpoint provide different advantages, depending on the size and type of the object being scanned and the speed of the computer in recalculating the surface display from a different viewpoint.
Referring again to FIG. 2, the display 7 can be mounted on the probe 3 such that the rendered surface 13, or other image displayed, moves with the probe movement. Referring now to FIG. 21, the display 7 could be mounted in front of the operator's eyes as part of a heads-up display 271, which may or may not also allow the operator 270 to see his real environment as well as the rendered surface 13, or it can be mounted elsewhere. In practical use, it is found that the operator watches the rendered surface 13 on the display 7 while scanning because this has the important advantage of ensuring that all of the object is scanned. In some applications, such as scanning large objects or using a turntable to rotate the object, watching the display 7 as a large display monitor situated on a workbench can be advantageous. In other applications, such as scanning a spherical-type object, a display screen on the probe 3 is advantageous because the operator moves with the probe. With sufficient quality and no negative effects such as motion sickness, a heads-up display may be best for nearly all applications because it feeds back most directly to the operator.
Referring again to FIG. 2, it is already technically possible to integrate the display 7 with the probe 3 as, for instance, a color LCD screen is small, lightweight, real-time, and flat, while having sufficient resolution to render the surface so that the operator can see what has and what has not been scanned. A display mounted on the probe could be tiltable in one or two axes relative to the probe. The ability to tilt the display relative to the probe can give the operator improved ability to scan in spaces with poor visual access. Buttons 6 on the probe can be used to navigate menus and select from menus.
As computing power becomes faster and more compact, it will be possible to encapsulate the computer 4 in the probe 3 as well as having the display 7 mounted on the probe. The probe might have memory 262, which could be both dynamic memory and storage media such as a CD-ROM or digital video disk (DVD). The probe might have a local power source 260, such as batteries. This would be the case with one or more remote position sensors 261 mounted inside the probe. Although one remote position sensor is sufficient, more accuracy is obtained by averaging the positions coming from three or more remote position sensors. Another benefit of three or more sensors is that when a spurious position is output by one or more sensors, this can be detected and the data ignored. Detection of incorrect positions is by means of comparing the positions output by the three sensors to their physical locations within the probe to see if the variation is larger than the combined, acceptable error of the sensors. Since remote position sensor technology is likely to remain much less accurate than multiply jointed arm technology, it is preferable that probes with remote sensors use array scanning means rather than stripe scanning means. With a single array scan, all the data in the array (i.e., a range image) are accurately registered to each other, but with stripes there are position errors between any two sequential stripes. It is possible to use an iterative closest point (ICP) algorithm on overlapping range images to substantially reduce the errors caused by the remote position sensors; but this is not possible with stripes.
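A small sketch of the spurious-position check described above, assuming numpy; the pairwise-distance test and the simple averaging (which ignores the mapping of the three sensor locations to a single probe reference point, a step that would also use the probe orientation) are illustrative:

```python
import numpy as np

def check_and_average(readings, mount_positions, max_error):
    """Compare pairwise sensor distances against the known mounting geometry;
    reject the sample if any pair deviates beyond the combined sensor error."""
    readings = np.asarray(readings, dtype=float)        # (3, 3) reported positions
    mounts = np.asarray(mount_positions, dtype=float)   # (3, 3) known locations
    for i in range(3):
        for j in range(i + 1, 3):
            d_read = np.linalg.norm(readings[i] - readings[j])
            d_true = np.linalg.norm(mounts[i] - mounts[j])
            if abs(d_read - d_true) > max_error:
                return None                 # spurious sensor output: ignore data
    return readings.mean(axis=0)            # averaged position of the three
```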
A number of different technologies exist for area probes including binary stereo, photometric stereo, texture gradients, range from focus, range from motion, time of flight, Moire interferometric, and patterned structured light systems. The most common systems in use in industrial applications are time of flight, Moire, and patterned structured light. Different area probe technologies have different advantages and disadvantages for manual scanning.
Time of flight systems use a modulated laser spot to measure a scene by the phase shift between outgoing and reflected beams, which is proportional to the range of the object point. A complete range image is captured by scanning the whole region of interest. For a small area, this technique is advantageous since it is line of sight, although the accuracy is generally of the order of 1-2 mm unless multiple measurements are taken at each point, thus reducing scanning speed significantly. It is thus too slow.
Moire systems use gratings in front of projection and viewing optics to produce an interference pattern that varies according to local changes in height on the object. Absolute measurements and measurements across discontinuities are only possible by taking several measurements with different grating configurations or from different projection angles. For relative height measurement, these systems offer high accuracy. It is thus too problematic to obtain absolute measurements.
A depth from focus range area sensor has recently been demonstrated that allows the real-time determination of range from pairs of single images from synchronized cameras, albeit with the use of relatively complex hardware. It is thus too complex to use at this point in the development of the technology.
Referring now to FIG. 29(b), patterned structured light systems come in many families and rely on projection of a light pattern and viewing off-axis from the projection angle. Synchronously scanned laser triangulation probes can be raster scanned over a 2D area. A laser stripe triangulation line can be scanned in one direction to produce area measurements. Scanning can be mechanical or electronic. Multiple laser or light stripes 363 can also be simultaneously projected over an object to obtain the same effect as a scanned stripe, but this has the disadvantage that, in a single image, it is not possible to differentiate between the stripes. To overcome this problem, a number of systems use a Gray-coded sequence of binary stripe patterns that solves the ambiguity problem. However, the sensor should remain stationary during the capture process. An alternative solution is the projection of color-coded light stripes that allow the unambiguous determination of range, even with depth discontinuities, from a single image. Note that the simultaneous use of a number of stripes is herein classified as an area technique and not a stripe technique.
The simultaneous projection of color-coded light stripes overcomes the disadvantages of the previously described systems and is the preferred area embodiment of this invention. Each stripe is one color. Each color may be a discrete wavelength, such as provided by a number of different laser diodes or a subset of a spectrum range of color generated from a white light source. Either all of the colors may be unique or a small number of colors may repeat. The repetition of a small number of colors can lead to ambiguity if stripes of the same colors are not sufficiently separated.
The probe encapsulation would have advantages in terms of cost reduction and complete flexibility in freedom of use because even cables may not be required and the only limits would be the range and accuracy of the remote position sensor.
If an arm is being used as the position sensor, the probe with a display mounted on it might receive its power along a cable that may follow the path of the arm, and the computer may be situated in the base of the arm, which would reduce the weight of the probe and reduce operator fatigue.
Referring again to FIG. 21, if a heads-up display 271 is being used, the probe 3 with one or more remote position sensors 261 could be mounted on the operator's head 270 with fixing means 272 to produce a head-mounted scanning system 274. This would lead to hands-free scanning, although some method of navigating menus, e.g., verbally with speech recognition by means of a microphone 273 or via buttons would be important in practice. It is likely that the standoff from the probe to the object using a head-mounted scanning system 274 would be quite large, for example, 250 mm, but it could be more or less.
There are several ways of automatically polygonizing intermediate data to form a 3D polygonal model. Two ways are described—strip polygonization and range image polygonization.
The strip polygonization of intermediate data to automatically create a polygonal model is described for a stripe scanner. The following description is by means of an example and comprises the following steps (a sketch of the chordal filtering used in step 3 follows the list):
1. Take the intermediate data in the order in which it is scanned, including the probe orientation for each stripe. For a stripe probe, this will typically consist of a number of neighboring stripes with occasional discontinuities, such as when the scanning process is paused or a turntable is turned or the direction of scanning is reversed. The intermediate data is preferably in an encoded stripe form as described above.
2. Group the data into sets of stripes with similar probe orientations and no discontinuities. An acceptable variation of the probe orientation within a group of data may be ten degrees. The average normal for each set of stripes is specified. A new group is started each time a discontinuity appears or when the probe orientation varies unacceptably.
3. If not already done in the intermediate data, filter the stripes in each group using a chordal tolerancing routine to reduce the quantity of points and maintain the positions of the break and jump flags.
4. Use a 2.5D polygonization method to polygonize each group. This will result in a number of 2.5D polygonal meshes. There may be holes in any of the meshes. The method eliminates occluded surfaces behind surfaces resulting from variations in the probe orientation within the group.
5. Use a polygon mesh integration method such as an implicit surface method to integrate the 2.5D polygonal meshes into a computer model comprising one or more 3D polygonal meshes.
6. If required, use the known base plane of the object specified during the scanning setup to automatically close the bottom of the model where the object could not be scanned because it was resting on a table or turntable.
7. If required, use a general closing function to automatically close all holes in the model.
8. If required, use a smoothing function set such that features created by known levels of inaccuracy in the 3D scanning process are smoothed out and features greater in size than the inaccuracy of the system are maintained.
9. Convert the internal polygon format into an output file of a commonly used polygon file format, such as DXF.
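A hedged sketch of the chordal tolerancing referred to in step 3, assuming numpy: a Douglas-Peucker style reduction that keeps only points deviating from the chord between the kept endpoints by more than the tolerance. Break and jump flags should be treated as segment boundaries before calling; the recursive formulation is this sketch's choice.

```python
import numpy as np

def chordal_filter(points, tol):
    """Reduce a stripe polyline, keeping points that deviate from the chord
    between the current endpoints by more than tol."""
    pts = np.asarray(points, dtype=float)
    if len(pts) <= 2:
        return pts
    chord = pts[-1] - pts[0]
    norm = np.linalg.norm(chord)
    if norm == 0.0:
        return pts[[0, -1]]                 # degenerate span: keep endpoints
    chord = chord / norm
    rel = pts - pts[0]
    # Perpendicular distance of every point from the chord.
    dist = np.linalg.norm(rel - np.outer(rel @ chord, chord), axis=1)
    k = int(np.argmax(dist))
    if dist[k] <= tol:
        return pts[[0, -1]]                 # whole span is within tolerance
    left = chordal_filter(pts[: k + 1], tol)
    right = chordal_filter(pts[k:], tol)
    return np.vstack([left[:-1], right])    # merge, dropping the shared point
```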
The range image polygonization of intermediate data to automatically create a polygonal model is similar to strip polygonization. Each range image is effectively a group of stripes with the same surface normal. Steps 1 and 2 above are, therefore, not needed. There are two ways of carrying out the equivalent of step 3 above. Range image data may be chordally toleranced as a series of stripes as described in step 3, and the polygonization process continued with steps 4 to 9, as required. In the second way, given the greater structure of a range image over a group of stripes, steps 3 and 4 may be combined and a range image tolerancing algorithm combined with a 2.5D polygonization algorithm and the polygonization process continued with steps 5 to 9, as required.
Area scanners usually output range images. In general, range image polygonization is better suited to area scanners and strip polygonization is better suited to stripe scanners. If the intermediate data structure is range images then the range image polygonization will work whether each range image relates to a particular data capture instant or is part of a defined range image structure that is characterized by the shape of the object.
The combining of color data onto the 3D model is known as texture mapping.
Before raw color data in the form of color images can be texture mapped onto the 3D model, it must first be corrected by means of various calibrations.
An important calibration is the geometric calibration of the color camera and finding the alignment transform of the color camera to the calibrated 3D measurement system in the probe. Without these calibrations/alignments, neighboring color samples when mapped together will produce visible errors. The objective of these calibrations is to get the geometric errors much smaller than those of the arm accuracy. The first geometric calibration is to take out lens distortion. Standard means are used for this based on imaging geometric objects of known size and extracting pixel coordinates using standard image processing techniques. The second is to create the camera model. A simple pinhole model can be used or a more complex model. Standard means are used for this based on imaging geometric objects of known size from different distances and extracting pixel coordinates using standard image processing techniques. The third is generating the alignment transform. A method has been developed based on 3D and color imaging geometric objects of known size using the probe. For all three methods, a three-axis computer controlled machine is used to ensure precise distances. The probe engineering must be geometrically stable enough such that this transform will only be recalculated rarely such as after the probe has been dropped or damaged.
Much of the effect of distance from the probe to the object on recorded light intensity can be calibrated out. A diffuse, flat, white surface is imaged normal to the camera axis at a number of different distances from the probe to the surface. The distances are chosen to cover the whole scanning range from closest point to furthest point. The variations in mean intensity recorded in the camera are used to calibrate the probe with distance. This calibration data is used to correct the color data recorded when scanning an object such that all color data is corrected to a known distance equivalent.
Much of the effect of tilt of the surface from the camera axis on the color quality can be removed, but the effectiveness of this depends on at least the surface reflectance for each color. A diffuse, flat, white surface is imaged at various angles to the camera axis at a fixed distance from the probe to the surface. The angles are chosen up to the point at which there is significant deviation from the Lambertian model. The variations in mean intensity recorded in the camera are used to calibrate the probe intensity with relative surface angle to the probe. This calibration data is used to correct the color data recorded when scanning an object such that all color data is corrected to a normal equivalent.
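The two intensity corrections can be combined roughly as follows, assuming numpy, and assuming the calibration data are stored as sampled gain curves (with the tilt gains normalized to 1 at normal incidence); all names and the linear interpolation are this sketch's assumptions:

```python
import numpy as np

def intensity_correction(color, distance, tilt_deg, ref_distance,
                         cal_dists, cal_dist_gain,    # white-surface gain vs distance
                         cal_angles, cal_angle_gain): # white-surface gain vs tilt
    """Correct recorded color to a known-distance, normal-incidence equivalent."""
    g_dist = np.interp(distance, cal_dists, cal_dist_gain)
    g_ref = np.interp(ref_distance, cal_dists, cal_dist_gain)
    g_tilt = np.interp(tilt_deg, cal_angles, cal_angle_gain)  # ~1 at 0 degrees
    return np.asarray(color, dtype=float) * (g_ref / g_dist) / max(g_tilt, 1e-6)
```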
A standard colorimetric calibration is carried out using reference colors, such as Macbeth charts that are mounted normal to the color camera axis at a known distance from the probe. Corrections are made to a commonly used color standard, such as to the CIE. Individual pixels in the camera may be color- and intensity-corrected.
Some of the above calibrations vary little among probes manufactured to the same design. This is probably due to tight manufacturing tolerances. The calibration information can be incorporated into the software as, for example, constants or tables or equations for the probe design. Other calibrations are carried out once on the setup of each probe after manufacture. Other calibrations could be carried out each time the scanning system is used—for example, the scanning of a white surface at a known distance will set the lamp intensity relative to the intensity when the bulbs were new.
Referring now to FIG. 22, there are several methods of mapping color images 320 onto the 3D model 324 to form texture maps. Surface elements on the 3D model may be flat polygons or elements of a high-level surface form. A mapping method for color images is as follows (a sketch follows the steps):
1. Each color image 320 is corrected using calibration and geometric data.
2. For each surface element 321, the color image whose normal 323 is closest in orientation to the normal 322 of the surface element 321 is selected (the master image), and the texture map coordinates for that surface element are given by the mapping of that surface element onto that master image. The closest image normal is that of 320a in this case.
3. The other color images that map onto the surface element are then processed. If the surface normal difference between the surface element and a color image is above a certain tolerance, then that image is ignored. This is because the color quality obtained in the image degrades significantly as the surface orientation of the object relative to the image becomes very steep. The part of the master image on which the surface element maps is then improved by a weighted average of all the color image mapped parts. The basis of the weighting is the cosine of the difference in surface normal between the surface element and the color image.
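A minimal sketch of steps 2 and 3, assuming numpy; the cutoff angle and the per-element input layout (a list of image-normal/mapped-color pairs for one surface element) are assumptions:

```python
import numpy as np

def blend_element_color(element_normal, images, max_angle_deg=75.0):
    """Blend the color contributions for one surface element: drop images that
    view the element too steeply, then weight the rest by the cosine of the
    normal difference."""
    n = element_normal / np.linalg.norm(element_normal)
    weights, colors = [], []
    for img_normal, color in images:
        c = n @ (img_normal / np.linalg.norm(img_normal))
        if c < np.cos(np.radians(max_angle_deg)):
            continue                        # too steep: color quality degraded
        weights.append(c)                   # cosine of the normal difference
        colors.append(color)
    if not weights:
        return None
    w = np.array(weights) / np.sum(weights)
    return np.array(colors, dtype=float).T @ w          # weighted average
```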
The apparatus and methods disclosed above each singly produce an improved color “copy” of the 3D model and provide a significant commercial advantage.
Ways of improving the scanning timing and consequently reducing geometrical errors are disclosed.
Where no electrical triggering is possible, to reduce the inaccuracy caused by the time difference between the recording of the arm position and the capturing of the frame, the following method is employed (a sketch follows the steps below):
1. With reference now to FIG. 23, the arm position B, before the frame is captured, is recorded, and the time t1 of this is recorded.
2. A frame is requested.
3. When the frame has been captured, C, the time t2 is recorded. There is a known delay T/2, with little variability, from the middle of the frame capture to this time t2, which is largely dependent on the shutter open time T.
4. The arm position A, after the frame capture, is recorded, and the time t3 of this is recorded.
5. The arm position in the middle of the frame is estimated by interpolating in six degrees of freedom between the two arm positions B,A using the time (t2-T/2) at the middle of the frame capture as the interpolation weighting between t1 and t3.
6. In the case of a long interrupt, if the difference between t1 and t3 is significantly large, then the data is deleted.
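A sketch of the interpolation in step 5, assuming scipy is available for the orientation slerp; the function signature is illustrative. Positions are interpolated linearly and orientations spherically between the poses at t1 and t3, evaluated at the mid-exposure time t2 - T/2.

```python
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def interpolate_pose(pos_b, rot_b, t1, pos_a, rot_a, t3, t2, shutter_T):
    """Estimate the arm pose at the middle of the frame exposure.
    rot_b, rot_a are scipy Rotation objects for arm positions B and A."""
    t_mid = t2 - shutter_T / 2.0
    alpha = (t_mid - t1) / (t3 - t1)        # interpolation weighting in [0, 1]
    pos = (1.0 - alpha) * np.asarray(pos_b) + alpha * np.asarray(pos_a)
    key = Rotation.from_quat([rot_b.as_quat(), rot_a.as_quat()])
    slerp = Slerp([0.0, 1.0], key)
    return pos, slerp(alpha)                # position and orientation at mid-exposure
```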
This interpolation method can substantially increase the accuracy of a non-triggered system and is extremely significant in obtaining geometrically accurate data.
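A minimal sketch of this interpolation follows, assuming the six degrees of freedom are represented as a 3D position plus a unit quaternion (a representation chosen for illustration, not specified in the embodiment). Variable names mirror the steps above: poses B and A at times t1 and t3, capture completion at t2, shutter-open time T:

```python
import numpy as np

def slerp(q0, q1, w):
    """Spherical linear interpolation between two unit quaternions."""
    q0, q1 = np.asarray(q0, float), np.asarray(q1, float)
    dot = float(np.dot(q0, q1))
    if dot < 0.0:                      # take the shorter arc
        q1, dot = -q1, -dot
    if dot > 0.9995:                   # nearly parallel: lerp and renormalize
        q = (1.0 - w) * q0 + w * q1
        return q / np.linalg.norm(q)
    theta = np.arccos(np.clip(dot, -1.0, 1.0))
    return (np.sin((1.0 - w) * theta) * q0 + np.sin(w * theta) * q1) / np.sin(theta)

def pose_at_frame_middle(pos_b, quat_b, t1, pos_a, quat_a, t3,
                         t2, shutter_time, max_gap=0.1):
    """Steps 1-6 above: estimate the arm pose at the middle of the frame.

    (pos_b, quat_b) is the pose recorded at time t1 before the frame and
    (pos_a, quat_a) the pose recorded at time t3 after it; t2 is the time
    recorded when capture completed and shutter_time is T. Returns None
    when t3 - t1 is too large (a long interrupt), so the data is deleted.
    The max_gap threshold is an illustrative value.
    """
    if t3 - t1 > max_gap:
        return None
    t_mid = t2 - shutter_time / 2.0      # middle of the frame capture
    w = (t_mid - t1) / (t3 - t1)         # interpolation weight between t1 and t3
    position = (1.0 - w) * np.asarray(pos_b) + w * np.asarray(pos_a)
    orientation = slerp(quat_b, quat_a, w)
    return position, orientation
```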
In addition, the operating system under which the interpolation software runs may be set to give the interpolation software high priority, so that delays introduced by other executing software are minimized. Even if another software function interrupts this process, the validity of the process is not impaired unless the interrupting process is of extraordinarily long duration. Prioritization is not essential, but will contribute to reduced timing error where it is available in the operating system.
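Where prioritization is available, raising the priority might be as simple as the following sketch, assuming a POSIX host (the environment, function name, and niceness value are assumptions):

```python
import os

def raise_acquisition_priority(nice_drop: int = 10) -> None:
    """Attempt to raise the priority of the acquisition/interpolation
    process on a POSIX system. Lower nice values mean higher priority;
    raising priority usually needs elevated privileges, so failure is
    tolerated, since prioritization helps but is not essential."""
    try:
        os.nice(-abs(nice_drop))
    except OSError:
        pass  # insufficient privileges: continue at default priority
```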
In the case where triggering is possible, there are many methods of carrying it out. One method, with reference now to FIG. 24, is that the synchronization signal 240 from the CCD camera 25 is stripped off by electronic circuitry 241, and a relay 242 is used to generate a series of trigger pulses 243 to the arm computer 2. This has the advantage of eliminating both the arm and camera timing variabilities, increasing the accuracy of the scanning as much as possible for a given arm and camera.
The operator interface means alone, not including standard computer means such as the mouse and keyboard, can be used to control the scanning and computer model generation process and to actuate the available options. The operator interface means include means for navigating menus, such as buttons, foot pedals, joysticks, and trackballs, together with the position-sensing means (arm or remote position sensor).
Using any of the above means, the operator can simply select the required operations and operating parameters, which could include, for example, being able to:
    • Setup Scanning Apparatus
    • Select which position-sensing device is being used, e.g., the arm.
    • Align the probe to the position-sensing device; align the turntable.
    • Set the sampling of the points, i.e., sampling step or chordal tolerance.
    • Set when data is thrown away because the arm is moving too fast.
    • Data Collection
    • Pre-scan object to find out where it is.
    • Collect data points continuously, while that option is selected, for example.
    • Collect one set of points such as a stripe section.
    • Collect sets of data points at pre-determined intervals of position.
    • Collect contact reference points.
    • Pause and re-start data collection.
    • Collect Color Images.
    • Process
    • Generate polygonal or surface model from intermediate data.
    • Generate model in selected output format, e.g., 3DS, OBJ.
    • Map color images onto model.
    • Blend overlapping color images on model.
    • Close holes in polygon mesh.
    • Slice polygon mesh.
    • Smooth polygon mesh.
    • Decimate polygon mesh.
    • Flip normals in polygon mesh.
    • Change datum and orientation of coordinate system.
    • Edit
    • select/cut/paste/delete points
    • select/cut/paste/delete polygons
    • select/cut/paste/delete color images
    • Test
    • Check the performance of the system by processing data from scanning a sphere.
    • Check the performance of the system by processing data from scanning a flat surface.
    • Display
    • Display points in rendered color according to depth.
    • Redraw the computer display from the position and orientation of the probe.
    • Select the field of view of the redraw, i.e., from zoom to wide angle.
    • Select a viewpoint from list of preset viewpoints.
    • Display rendered data in one color.
    • Display rendered data using the scanned color data.
    • Display the computer model generated from polygons or complex surfaces.
    • Model Data
    • Save the points/intermediate data/model onto a storage medium such as a hard disk.
    • Publish the intermediate data/computer model as an object, such as an object that may be automatically available to another software package on the computer or over a network.
    • Load the points/intermediate data/model from a storage medium, such as a hard disk.
    • Range Image
    • Create a new linear range image using the position and orientation of the probe when the option is selected.
    • Create a new cylindrical range image using the position and orientation of the probe when the option is selected.
    • Select one of the defined range images from all the defined range images.
    • Change the density of that range image.
    • Delete the selected range images.
    • Delete all range images.
    • Select a set of range images from a library of range image sets. Library range image sets could be mathematically organized, e.g., precisely orthogonal to each other, which may have advantages in some uses of the scanned data.
    • Add the selected library set to the currently defined range images.
    • Create a new library set from the existing combination of range images. In this way, if several similar objects are to be scanned, the optimum range image combination can be set for the first one and automatically reused on the others.
    • Set the selected library set as the default library set. In this way, for instance, a default library set of six range images that form a cube may be used for many objects, such that the process of range image definition is not needed, making the total scanning process quicker.
    • Delete all current data points in all range images.
    • Delete all the data points in the selected range image only.
    • Display points from the selected range image only.
    • Display points from all range images.
    • Display points with different colors for each range image.
    • Update all range images from all the other range images by a process of trying to fill in gaps or check entries in one range image by studying the others.
    • Update the selected range image from all the other range images by an inference process of trying to fill in gaps or check entries in one range image by studying the others. This is particularly useful when a new range image is defined after a lot of scanning has taken place.
    • Constrain the size of a range image. This is often done when a range image is defined specially to capture a small part of the surface of the object that is not covered by the other range images. This can be done to save memory on a computer with limited memory and can also speed up the whole process.
    • Choose and initiate an algorithm for automatically constructing a model of polygons or complex surfaces from one range image.
    • Choose and initiate an algorithm for automatically constructing a model of polygons or complex surfaces from all the range images.
    • Set the parameters, such as the degree of accuracy, by which an algorithm constructs a model of polygons or complex surfaces.
    • Select an integration algorithm that combines the polygon models that have been generated from the range images.
    • Select a predefined sequence of algorithms that automatically generates a complete model of polygons or complex surfaces from a set of range images.
Complex surfaces can be created from marked surface patch boundaries. Referring now to FIG. 25, the object 130 is painted a uniform color (if necessary) before the patch boundaries 131 are marked by hand in another color, e.g., using a black marker pen on a white object. It is not important to mark these boundaries accurately, as they usually lie away from features such as edges or rapid changes in surface normal. The object is then scanned using one of the methods disclosed. The color information is then used to automatically generate the patch boundaries by means of an algorithm that separates out the points 132 lying on the patch boundaries by means of a color filter and then fits patch boundary lines, such as splines 133, to these points. The edges may also be detected using a separate algorithm. The patch boundaries that have been automatically created from the scan can then be used to create the complex surface model. The main benefit of this method is that it is easier to mark patch boundaries on the object than on the computer model prior to the automatic creation of the complex surface model.
Referring now to FIG. 26(a), an important implementation 333 of the invention is disclosed in which the multiply-jointed arm 1 is mounted on the end of the horizontal arm of the horizontal arm measuring machine 330 for scanning a large object 331. The horizontal arm measuring machine 330 has a machine control box 332 that outputs the position of the machine to the computer 4. The arm control 2 and the probe 3 are also connected to the computer 4. This implementation makes the scanning of large objects more precise, in that either a large arm or leapfrogging would be less accurate than a horizontal arm, and simpler, in that each time the horizontal arm is moved, the software takes the movement into account automatically rather than requiring re-registration by a leapfrogging method. In industry, firms that have large objects, such as automotive manufacturers, usually have horizontal arm machines, which makes this implementation particularly attractive.
Referring now to FIG. 26(b), firms that have large objects, such as automotive manufacturers, often have two horizontal arm machines situated opposing each other, both of which can reference to the same object coordinate system. In this case, the whole of the object may be scanned by scanning part of the object with the probe fitted to the first horizontal arm machine and the rest of the object with the probe fitted to the second horizontal arm machine.
This invention is a general 3D model-making device and has wide-ranging applicability. The application industries for this invention include design stylists who need to turn clay objects into computer models quickly and accurately; games developers and animators who need to convert new characters into 3D data sets for animation; shoe manufacturers who need to make custom shoes; automotive manufacturers who need to model the actual cable and pipe runs in confined spaces; and medical applications that include radiotherapy and wound treatment. Altogether, some 200 applications have been identified for this invention.
Referring now to FIG. 27, as an example of an application of the scanning apparatus 100 in accordance with the invention, the scanning apparatus 100 can be used to scan a human foot 141, with full body weight on it, on surfaces of different resilience. The outside of the foot 141 is first scanned using the methods and devices disclosed above, with the required amount of body weight being exerted. The foot 141 is then removed and a second scan is carried out of the surface 142 into which the foot 141 was pressed. The first scan is a positive; the second scan is a negative. The surface normals of the second scan are then reversed by means of a simple algorithm, and the two scans are combined to give the positive shape of the foot, as in the sketch below. It is important that, if a deformable material is used, it does not spring back. Such a material might be sand, clay, or plaster. Materials of different resilience may be appropriate for different applications. This method is also appropriate when the foot is pressed onto the lower half of a shoe with the sides cut away.
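A minimal sketch of the normal-reversal and combination step follows, assuming each scan is held as simple vertex/face/normal arrays (an illustrative layout, not the disclosed data format):

```python
import numpy as np

def combine_foot_scans(positive_mesh, negative_mesh):
    """Combine the direct foot scan with the scan of the impression.

    Each mesh is a (vertices, faces, normals) triple of arrays. The
    second scan is a negative of the sole, so its surface normals are
    reversed (and its face windings flipped to match) before the two
    scans are merged into a single positive model of the foot.
    """
    v1, f1, n1 = (np.asarray(a) for a in positive_mesh)
    v2, f2, n2 = (np.asarray(a) for a in negative_mesh)
    n2 = -n2                    # reverse the surface normals of the negative
    f2 = f2[:, ::-1]            # flip triangle winding to agree with the normals
    vertices = np.vstack([v1, v2])
    faces = np.vstack([f1, f2 + len(v1)])   # re-index the second mesh's faces
    normals = np.vstack([n1, n2])
    return vertices, faces, normals
```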
Automobile manufacturers need to identify the actual route of pipes and cables in confined areas, such as an engine compartment. Automobile manufacturers are trying to model all aspects of a car in 3D CAD. They need a way of scanning pipes and cables in the car reference system so that high-level 3D models of the pipes and cables can be output and introduced into the CAD system for identifying actual routing and potential interferences. In the scanning of pipes and cables in confined spaces, if there is a problem with black or shiny items not being scannable, these can first be dusted with a white powder that is easily removed after scanning.
Referring now to FIG. 28(a), it is often better to scan a cable or pipe 341 as a number of stripe sections 342 to 349, rather than as a large number of densely spaced stripes. A stripe sensor can be activated in a first mode to take a single stripe section when the operator actuates a button or foot pedal. In this way, the operator can take a small number of sections to describe the path of the pipe, using his expertise to decide when to take sections. For instance, where a pipe joins another pipe, it may be appropriate to capture many more stripe sections 344 to 349. Also, where there is a feature such as a fixing on a pipe, it may be appropriate to capture very dense stripes. A second mode would capture stripe sections as fast as the sensor can capture them and display them as a surface on the display. A third mode would be one in which the operator specifies the distance between sections, e.g., 5 mm, and the system automatically takes a stripe section every, e.g., 5 mm that the stripe travels in 3D space. One method of determining this distance is to select the point at the average standoff distance in the middle of the stripe, i.e., the center point of the measuring range, and, when this point has moved 5 mm, to automatically capture another stripe section, as in the sketch below. When the operator is scanning pipes and cables, the operator control system should support simple switching among the three modes.
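The third-mode trigger can be sketched as follows; the function names and the 5 mm default are illustrative assumptions:

```python
import numpy as np

def should_capture(last_center, current_center, interval_mm=5.0):
    """Capture a new stripe section whenever the stripe's center point
    (the point at the average standoff distance in the middle of the
    measuring range) has moved interval_mm in 3D space."""
    if last_center is None:
        return True  # always capture the first section
    travel = np.linalg.norm(np.asarray(current_center) - np.asarray(last_center))
    return travel >= interval_mm

# Hypothetical acquisition loop:
# last = None
# for center in stream_of_center_points():
#     if should_capture(last, center):
#         capture_stripe_section()
#         last = center
```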
The intermediate data structure in which the stripe sections are collated could be the standard stripe section structure 303, extended to include the changes in mode and the orientation of the probe for each section; one possible rendering is sketched below. In scanning pipes and cables, panel sections 342a, 342d along which the pipes and cables run are also captured. Where there is no contact between the pipe and the panel, there is a jump or break in the stripe section. These can be flagged in the data structure with jump flags 305 and break flags 304.
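One possible rendering of such an intermediate structure, using Python dataclasses; the field names are assumptions chosen to mirror structure 303 and flags 304/305, not the disclosed format:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class StripeSection:
    """One stripe section in the intermediate structure."""
    points: List[Tuple[float, float, float]]        # measured 3D points along the stripe
    probe_position: Tuple[float, float, float]      # probe position for this section
    probe_orientation: Tuple[float, float, float, float]  # probe orientation (quaternion)
    capture_mode: int                               # 1 = single shot, 2 = continuous, 3 = fixed interval
    jump_flags: List[int] = field(default_factory=list)   # point indices where the stripe jumps
    break_flags: List[int] = field(default_factory=list)  # point indices where the stripe breaks
```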
To be useful to an automobile manufacturer, a high-level model should be created and output from this data. A polygonization or surfacing method joins the sections together and can handle the joining of pipes, panels, etc. The result is the high-level models 350 to 352. If more is known about the pipe or cable, such as its cross-section if constant, or its form even if the form's dimensions change (e.g., circular but of varying diameter), the model 351 can be automatically expanded to 353. Alternatively, two scanned sides of the same pipe can be automatically joined. This gives the automobile manufacturer the high-level model that is needed.
As will be understood by persons skilled in the art, various modifications are possible within the scope of the present invention. For example, the color camera need not be included. A single camera could be utilized for both color and position sensing. The filter in the probe could be a narrow band-pass filter or a red high band-pass filter, as required. The system is adaptable to many types of model generation, not just those discussed herein. The data collected by the probe could be used for other applications and could be stored for dissemination elsewhere, for example, by electronic mail. The probe can be a stripe or an area probe. The display can be mounted anywhere, depending upon the application requirements.
While the preferred embodiment of the invention has been illustrated and described, it will be appreciated that various changes can be made therein without departing from the spirit and scope of the invention.

Claims (59)

1. A scanning apparatus, comprising:
a multiply-jointed arm having a plurality of arm segments and a data communication link to transmit data; and
a scanner mounted on an arm segment of the multiply-jointed arm for movement therewith to capture data from a plurality of points on a surface of an object, the scanner having a housing enclosing:
(a) a light source operable to emit light onto the object surface;
(b) a light detector operable to detect light reflected from the object surface and to generate electrical image data signals in dependence upon the detected light; and
(c) a data processor operable to process the electrical image data signals to generate processed data of reduced quantity, the data processor being connected to the data communication link to transmit the processed data therealong.
2. A scanning apparatus according to claim 1, wherein the data processor is operable to generate the processed data of reduced quantity by processing the electrical image data signals to generate measurement data and processing the measurement data to reduce the quantity thereof.
3. A scanning apparatus according to claim 1, wherein the data processor is operable to generate the processed data of reduced quantity by filtering the data.
4. A scanning apparatus according to claim 1, wherein the data processor is operable to generate the processed data of reduced quantity by discarding data.
5. A scanning apparatus according to claim 1, wherein the communication link comprises a cable.
6. A scanning apparatus according to claim 1, further comprising a battery power supply within the apparatus to power the scanner.
7. A scanner mountable on a multiply-jointed arm for movement therewith to capture data from a plurality of points on a surface of an object, the scanner having a housing enclosing:
a light source operable to emit light onto the object surface;
a light detector operable to detect light reflected from the object surface and to generate electrical image data signals in dependence upon the detected light; and
a data processor operable to process the electrical image data signals to generate processed data of reduced quantity, the data processor being connectable to a data communication link to transmit the processed data therealong.
8. A scanner according to claim 7, wherein the data processor is operable to generate the processed data of reduced quantity by processing the electrical image data signals to generate measurement data and processing the measurement data to reduce the quantity thereof.
9. A scanner according to claim 7, wherein the data processor is operable to generate the processed data of reduced quantity by filtering the data.
10. A scanner according to claim 7, wherein the data processor is operable to generate the processed data of reduced quantity by discarding data.
11. A scanning apparatus, comprising:
a multiply-jointed arm having a plurality of arm segments;
a scanner mounted on an arm segment of the multiply-jointed arm for movement therewith to capture data from a plurality of points on a surface of an object, the scanner having a housing enclosing:
(a) a light source operable to emit light onto the object surface;
(b) a light detector operable to detect light reflected from the object surface and to generate electrical image data signals in dependence upon the detected light; and
(c) a data processor operable to process the electrical image data signals to generate digital image data; and
a bus connected to the data processor of the scanner to transmit the digital image data.
12. A scanning apparatus according to claim 11, wherein the data processor comprises a frame grabber.
13. A scanning apparatus according to claim 11, further comprising a battery power supply within the apparatus to power the scanner.
14. A scanner mountable on a multiply-jointed arm for movement therewith to capture data from a plurality of points on a surface of an object, the scanner having a housing enclosing:
a light source operable to emit light onto the object surface;
a light detector operable to detect light reflected from the object surface and to generate electrical image data signals in dependence upon the detected light; and
a data processor operable to process the electrical image data signals to generate digital image data,
the data processor being connectable to a bus to transmit the digital image data.
15. A scanner according to claim 14, wherein the data processor comprises a frame grabber.
16. A coordinate measuring machine, comprising:
a multiply-jointed arm having a plurality of arm segments and a physical data path to transmit data; and
a scanner mounted on an arm segment of the multiply-jointed arm for movement therewith to capture data from a plurality of points on a surface of an object, the scanner having a housing enclosing:
a light source operable to emit light onto the object surface;
a light detector operable to detect light reflected from the object surface and to generate electrical image data signals in dependence upon the detected light; and
a data processor operable to process the electrical image data signals to generate data defining coordinate measurements of the surface of the object, and to transmit the generated data on the physical data path.
17. A coordinate measuring machine according to claim 16, wherein the data processor is arranged to process the electrical image data signals to generate data defining coordinate measurements comprising three-dimensional positions.
18. A coordinate measuring machine according to claim 16, wherein the data processor is arranged to process the electrical image data signals to generate data defining coordinate measurements comprising points in three-dimensional space.
19. A coordinate measuring machine according to claim 16, wherein the data processor is arranged to process the electrical image data signals to generate data defining coordinate measurements comprising connected polygons in three-dimensional space.
20. A coordinate measuring machine according to claim 16, wherein the physical data path comprises a cable.
21. A coordinate measuring machine according to claim 16, further comprising a battery power supply within the apparatus to power the scanner.
22. A scanner mountable on a multiply-jointed arm for movement therewith to capture data from a plurality of points on a surface of an object, the scanner having a housing enclosing:
a light source operable to emit light onto the object surface;
a light detector operable to detect light reflected from the object surface and to generate electrical image data signals in dependence upon the detected light; and
a data processor operable to process the electrical image data signals to generate data defining coordinate measurements of the surface of the object, and to transmit the generated data on a physical data path.
23. A scanner according to claim 22, wherein the data processor is arranged to process the electrical image data signals to generate data defining coordinate measurements comprising three-dimensional positions.
24. A scanner according to claim 22, wherein the data processor is arranged to process the electrical image data signals to generate data defining coordinate measurements comprising points in three-dimensional space.
25. A scanner according to claim 22, wherein the data processor is arranged to process the electrical image data signals to generate data defining coordinate measurements comprising connected polygons in three-dimensional space.
26. A laser scanning apparatus, comprising:
a multiply-jointed arm having a plurality of arm segments and a data communication link to transmit data; and
a laser scanner mounted on an arm segment of the multiply-jointed arm for movement therewith to capture data from a plurality of points on a surface of an object, the laser scanner having a housing enclosing:
(a) a laser to emit a laser stripe onto the object surface;
(b) a camera operable to generate images of laser light reflected from the object surface; and
(c) a data processor operable to process the images generated by the camera to generate processed data defining a position of the laser stripe in the images, the data processor being connected to the data communication link to transmit the processed data therealong.
27. A laser scanning apparatus according to claim 26, wherein:
the camera is arranged to generate images comprising a plurality of pixels; and
the data processor is arranged to process the images generated by the camera to generate processed data defining a position of the laser stripe in the images to sub-pixel accuracy.
28. A laser scanning apparatus according to claim 26, wherein the data communication link comprises a cable.
29. A laser scanning apparatus according to claim 26, further comprising a battery power supply within the apparatus to power the laser scanner.
30. A laser scanner mountable on a multiply-jointed arm for movement therewith to capture data from a plurality of points on a surface of an object, the laser scanner having a housing enclosing:
a laser to emit a laser stripe onto the object surface;
a camera operable to generate images of laser light reflected from the object surface; and
a data processor operable to process the images generated by the camera to generate processed data defining a position of the laser stripe in the images, the data processor being connectable to a data communication link to transmit the processed data therealong.
31. A laser scanner according to claim 30, wherein:
the camera is arranged to generate images comprising a plurality of pixels; and
the data processor is arranged to process the images generated by the camera to generate processed data defining a position of the laser stripe in the images to sub-pixel accuracy.
32. A laser scanning apparatus, comprising:
a multiply-jointed arm having a plurality of arm segments and a data communication link to transmit data; and
a laser scanner mounted on an arm segment of the multiply-jointed arm for movement therewith to capture data from a plurality of points on a surface of an object, the laser scanner having a housing enclosing:
(a) a laser to emit at least one laser stripe onto the object surface;
(b) a camera operable to generate images of laser light reflected from the object surface, each image comprising a plurality of pixels; and
(c) a data processor operable to process the images generated by the camera to perform measurements to sub-pixel accuracy, the data processor being connected to the data communication link to transmit results of the measurements therealong.
33. A laser scanning apparatus according to claim 32, wherein the data communication link comprises a cable.
34. A laser scanning apparatus according to claim 32, further comprising a battery power supply within the apparatus to power the laser scanner.
35. A laser scanner mountable on a multiply-jointed arm for movement therewith to capture data from a plurality of points on a surface of an object, the laser scanner having a housing enclosing:
a laser to emit at least one laser stripe onto the object surface;
a camera operable to generate images of laser light reflected from the object surface, each image comprising a plurality of pixels; and
a data processor operable to process the images generated by the camera to perform measurements to sub-pixel accuracy, the data processor being connectable to a data communication link to transmit results of the measurements therealong.
36. An apparatus comprising:
a scanner including a light source which emits light to an object so that the light reflects off the object, and a light detector which detects the reflected light, wherein the scanner outputs information indicated by the detected light;
a position detector detecting position of at least one of the object and the scanner at a timing corresponding to a timing at which the light detector detects the reflected light;
a processor determining a relative position of the object to the scanner using the information output by the scanner; and
a three dimensional data generator generating three-dimensional data of the object using the determined relative position and the information output by the position detector.
37. An apparatus according to claim 36, wherein
the timing at which the light detector detects the reflected light is at least one of different timings defined by a synchronization signal; and
the apparatus further comprising a trigger pulse generator receiving the synchronization signal and, in response thereto, outputting trigger pulses to the position detector, which indicate the timing at which the position detector is to detect the position.
38. An apparatus according to claim 37, wherein timing at which the trigger pulse generator outputs trigger pulses is a predetermined time behind timing at which the trigger pulse generator receives the synchronization signal.
39. An apparatus according to claim 36, wherein the position detector is a remote position sensor.
40. An apparatus according to claim 36, wherein the position detector calculates the position to thereby detect the position.
41. An apparatus according to claim 36, further comprising:
a probe which captures data from individual points on the object touched by the probe.
42. An apparatus comprising:
a scanner including a light source which emits light to an object so that the light reflects off the object, and a light detector which detects the reflected light, so that the scanner thereby scans the object, wherein the scanner outputs information indicated by the detected light; and
a processor which, as the object is being scanned by the scanner, determines a changing relative positional relationship between the object and the scanner at a timing corresponding to a timing at which the light detector detects the reflected light, and
a three dimensional data generator generating three-dimensional data of the object using the information output by the scanner and the determined change in positional relationship.
43. An apparatus comprising:
a scanner including a light source which emits light to an object so that the light reflects off the object, and a light detector which detects the reflected light, wherein the scanner outputs information indicated by the detected light;
a position detector detecting position of at least one of the object and the scanner at a timing corresponding to a timing at which the light detector detects the reflected light; and
a processor determining a relative position of the object to the scanner using the information output by the scanner, and generating three-dimensional data of the object using the determined relative position and the information output by the position detector.
44. An apparatus comprising:
a scanner including a light source which emits light to an object so that the light reflects off the object, and a light detector which detects the reflected light, wherein the scanner outputs first information indicated by the detected light;
a position detector which processes second information related to position of at least one of the object and the scanner at a timing related to a timing at which the light detector detects the reflected light;
a processor which communicates with the position detector and the scanner, and which generates three-dimensional data of the object using the first information outputted by the scanner and the second information outputted by the position detector.
45. An apparatus according to claim 44, wherein
the timing at which the light detector detects the reflected light is at least one of different timings defined by a synchronization signal; and
the apparatus further comprising a trigger pulse generator receiving the synchronization signal and, in response thereto, outputting trigger pulses, of which each indicates a timing at which the position detector is to detect the position, to the position detector.
46. An apparatus according to claim 45, wherein the timing at which the trigger pulse generator outputs trigger pulses is a predetermined time behind a timing at which the trigger pulse generator receives the synchronization signal.
47. An apparatus according to claim 46, wherein the position detector calculates the position to thereby detect the position.
48. An apparatus comprising:
a scanner including a light source which emits light to an object so that the light reflects off the object, and a light detector which detects the reflected light, wherein the scanner outputs first information indicated by the detected light;
a processor unit which communicates with the scanner and which generates three-dimensional data of the object using the first information outputted by the scanner and second information relative to a position of at least one of the object and the scanner at a timing corresponding to a timing at which the light detector detects the reflected light.
49. An apparatus according to claim 48 further comprising:
a position detector which detects relative position between the scanner and the object at the timing corresponding to the timing at which the light detector detects the reflected light.
50. An apparatus according to claim 48, wherein
the timing at which the light detector detects the reflected light is at least one of different timings defined by a synchronization signal; and
the apparatus further comprising a trigger pulse generator receiving the synchronization signal and, in response thereto, outputting trigger pulses, of which each indicates a timing at which the position detector is to detect the position, to the position detector.
51. An apparatus according to claim 49, wherein
the timing at which the light detector detects the reflected light is at least one of different timings defined by a synchronization signal; and
the apparatus further comprising a trigger pulse generator receiving the synchronization signal and, in response thereto, outputting trigger pulses, of which each indicates a timing at which the position detector is to detect the position, to the position detector.
52. An apparatus according to claim 50, wherein a timing at which the trigger pulse generator outputs trigger pulses is a predetermined time behind a timing at which the trigger pulse generator receives the synchronization signal.
53. An apparatus according to claim 51, wherein a timing at which the trigger pulse generator outputs trigger pulses is a predetermined time behind a timing at which the trigger pulse generator receives the synchronization signal.
54. An apparatus according to claim 52, wherein the position detector calculates the position to thereby detect the position.
55. An apparatus according to claim 53, wherein the position detector calculates the position to thereby detect the position.
56. A method for generating three-dimensional data of an object, comprising:
irradiating light onto the object from a scanner;
detecting light reflected from a surface of the object by the scanner;
outputting information indicated by the detected light from the scanner;
detecting relative position of the object to the scanner at a timing related to a timing at which the reflected light is detected; and
generating three-dimensional data of the object using the detected relative position and the outputted information.
57. A method according to claim 56, further comprising:
generating a synchronization signal from the scanner; and
generating a trigger pulse, in response to receiving the synchronization signal at a trigger pulse generator,
wherein the relative position of the object to the scanner is detected at a timing at which the trigger pulse is detected by the position sensor.
58. A method according to claim 57, wherein the trigger pulse is output at a timing which is a predetermined time behind a timing at which the trigger pulse generator receives the synchronization signal.
59. A method for generating three-dimensional data of an object, comprising:
irradiating light onto the object from a scanner;
detecting light reflected from a surface of the object by the scanner;
outputting information indicated by the detected light from the scanner;
generating a trigger pulse synchronized with a timing at which the sensor detects light;
processing position information relative to the position of the scanner;
outputting the position information in response to the trigger pulse; and
generating three-dimensional data of the object using the position data and the information output from the sensor.
US12/647,319 1995-07-26 2009-12-24 Scanning apparatus and method Expired - Fee Related USRE43895E1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/647,319 USRE43895E1 (en) 1995-07-26 2009-12-24 Scanning apparatus and method

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
GBGB9515311.0A GB9515311D0 (en) 1995-07-26 1995-07-26 Stripe scanners and methods of scanning
GB9515311.0 1995-07-26
PCT/GB1996/001868 WO1997005449A1 (en) 1995-07-26 1996-07-25 Scanning apparatus and method
US09/000,215 US6611617B1 (en) 1995-07-26 1996-07-25 Scanning apparatus and method
US10/601,043 US7313264B2 (en) 1995-07-26 2003-06-20 Scanning apparatus and method
US12/647,319 USRE43895E1 (en) 1995-07-26 2009-12-24 Scanning apparatus and method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US10/601,043 Reissue US7313264B2 (en) 1995-07-26 2003-06-20 Scanning apparatus and method

Publications (1)

Publication Number Publication Date
USRE43895E1 true USRE43895E1 (en) 2013-01-01

Family

ID=10778272

Family Applications (3)

Application Number Title Priority Date Filing Date
US09/000,215 Expired - Lifetime US6611617B1 (en) 1995-07-26 1996-07-25 Scanning apparatus and method
US10/601,043 Ceased US7313264B2 (en) 1995-07-26 2003-06-20 Scanning apparatus and method
US12/647,319 Expired - Fee Related USRE43895E1 (en) 1995-07-26 2009-12-24 Scanning apparatus and method

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US09/000,215 Expired - Lifetime US6611617B1 (en) 1995-07-26 1996-07-25 Scanning apparatus and method
US10/601,043 Ceased US7313264B2 (en) 1995-07-26 2003-06-20 Scanning apparatus and method

Country Status (7)

Country Link
US (3) US6611617B1 (en)
EP (3) EP0840880B1 (en)
JP (1) JPH11509928A (en)
AU (1) AU6626896A (en)
DE (2) DE69619826D1 (en)
GB (1) GB9515311D0 (en)
WO (1) WO1997005449A1 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130200139A1 (en) * 2012-02-06 2013-08-08 Oracle International Corporation Topographic spot scanning for a storage library
US20130221086A1 (en) * 2012-02-29 2013-08-29 Oracle International Corporation Contrast spot scanning for a storage library
US20130266228A1 (en) * 2012-04-10 2013-10-10 Siemens Industry, Inc. Automatic part identification and workflow generation
US8576411B2 (en) * 2010-11-10 2013-11-05 Yazaki Corporation Component position measurement method
US10089415B2 (en) 2013-12-19 2018-10-02 Faro Technologies, Inc. Three-dimensional coordinate scanner and method of operation
US10122997B1 (en) 2017-05-03 2018-11-06 Lowe's Companies, Inc. Automated matrix photo framing using range camera input

Families Citing this family (270)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB9515311D0 (en) * 1995-07-26 1995-09-20 3D Scanners Ltd Stripe scanners and methods of scanning
FR2764397B1 (en) * 1997-06-09 1999-10-01 Aerospatiale METHOD AND DEVICE FOR TAKING SPHERICAL VIEWS FOR THE PRODUCTION OF A MATRIX OF 3D IMAGES OF AN OBJECT
US6621889B1 (en) * 1998-10-23 2003-09-16 Varian Medical Systems, Inc. Method and system for predictive physiological gating of radiation therapy
US7158610B2 (en) * 2003-09-05 2007-01-02 Varian Medical Systems Technologies, Inc. Systems and methods for processing x-ray images
US6937696B1 (en) 1998-10-23 2005-08-30 Varian Medical Systems Technologies, Inc. Method and system for predictive physiological gating
US6980679B2 (en) * 1998-10-23 2005-12-27 Varian Medical System Technologies, Inc. Method and system for monitoring breathing activity of a subject
US6279579B1 (en) * 1998-10-23 2001-08-28 Varian Medical Systems, Inc. Method and system for positioning patients for medical treatment procedures
US6973202B2 (en) * 1998-10-23 2005-12-06 Varian Medical Systems Technologies, Inc. Single-camera tracking of an object
WO2000066973A1 (en) * 1999-04-30 2000-11-09 Christoph Wagner Method for optically detecting the shape of objects
US6166811A (en) 1999-08-12 2000-12-26 Perceptron, Inc. Robot-based gauging system for determining three-dimensional measurement data
DE19956646C1 (en) * 1999-11-25 2001-10-18 Dieter Dirksen Colored surface spatial coordinates measuring device uses computer-controlled topometric surface measurement via phase measuring profilometry
US7106898B2 (en) * 1999-12-06 2006-09-12 California Institute Of Technology 3D scanning using shadows
DE10009870C2 (en) * 2000-03-01 2002-02-07 Bernward Maehner Method and device for examining test objects
JP4341135B2 (en) * 2000-03-10 2009-10-07 コニカミノルタホールディングス株式会社 Object recognition device
US6804380B1 (en) * 2000-05-18 2004-10-12 Leica Geosystems Hds, Inc. System and method for acquiring tie-point location information on a structure
US6639684B1 (en) * 2000-09-13 2003-10-28 Nextengine, Inc. Digitizer using intensity gradient to image features of three-dimensional objects
HU0003886D0 (en) * 2000-10-04 2000-12-28 Method and apparatus for preparing statue-like copy
JP2002164066A (en) * 2000-11-22 2002-06-07 Mitsubishi Heavy Ind Ltd Stacked heat exchanger
EP1211481A3 (en) * 2000-11-29 2004-05-19 microSystems GmbH Control device to check the contours and/or the position of constructional elements
US7200642B2 (en) * 2001-04-29 2007-04-03 Geodigm Corporation Method and apparatus for electronic delivery of electronic model images
US7215803B2 (en) * 2001-04-29 2007-05-08 Geodigm Corporation Method and apparatus for interactive remote viewing and collaboration of dental images
US7769430B2 (en) 2001-06-26 2010-08-03 Varian Medical Systems, Inc. Patient visual instruction techniques for synchronizing breathing with a medical procedure
ITTO20010640A1 (en) * 2001-07-03 2003-01-03 Newlast Automation Srl DEVICE AND METHOD FOR DETECTION OF SHAPES, IN PARTICULAR FOOTWEAR MODELS.
JP2003032495A (en) * 2001-07-13 2003-01-31 Fujitsu Ltd Smoothing method and smoothing circuit
US7034960B2 (en) * 2001-08-16 2006-04-25 Sun Chemical Corporation System and method for managing electronic transmission of color data
FR2832577B1 (en) * 2001-11-16 2005-03-18 Cit Alcatel ADAPTIVE DATA ACQUISITION FOR NETWORK OR SERVICE MANAGEMENT SYSTEMS
JP3855756B2 (en) * 2001-12-07 2006-12-13 ブラザー工業株式会社 3D color shape detection device and 3D scanner
DE60318396T2 (en) * 2002-02-14 2008-05-21 Faro Technologies, Inc., Lake Mary PORTABLE COORDINATE MEASURING MACHINE WITH JOINT ARM
US7881896B2 (en) 2002-02-14 2011-02-01 Faro Technologies, Inc. Portable coordinate measurement machine with integrated line laser scanner
SE523681C2 (en) * 2002-04-05 2004-05-11 Integrated Vision Prod System and sensor for mapping properties of an object
DE10233372B4 (en) * 2002-07-18 2004-07-22 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Measuring system and method for recording geometric quantities
US7620444B2 (en) 2002-10-05 2009-11-17 General Electric Company Systems and methods for improving usability of images for medical applications
FR2846413B1 (en) * 2002-10-24 2005-03-11 Luc Vergnaud DEVICE FOR NON-CONTACT SCANNING OF A 3-DIMENSION OBJECT
US8064684B2 (en) * 2003-04-16 2011-11-22 Massachusetts Institute Of Technology Methods and apparatus for visualizing volumetric data using deformable physical object
WO2004096502A1 (en) 2003-04-28 2004-11-11 Stephen James Crampton Cmm arm with exoskeleton
US20050010450A1 (en) * 2003-05-05 2005-01-13 Geodigm Corporation Method and apparatus for utilizing electronic models of patient teeth in interdisciplinary dental treatment plans
DE10331460A1 (en) * 2003-07-10 2005-02-17 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Three dimensional measuring arrangement, e.g. for making templates or models of real objects, has a robot with a measurement head and a positioning arrangement on which the object is supported and which rotates about its own axis
GB2405466B (en) 2003-08-27 2006-01-25 Teraview Ltd Method and apparatus for investigating a non-planner sample
US8571639B2 (en) 2003-09-05 2013-10-29 Varian Medical Systems, Inc. Systems and methods for gating medical procedures
US7064810B2 (en) * 2003-09-15 2006-06-20 Deere & Company Optical range finder with directed attention
DE10345743A1 (en) * 2003-10-01 2005-05-04 Kuka Roboter Gmbh Method and device for determining the position and orientation of an image receiving device
WO2005060629A2 (en) * 2003-12-11 2005-07-07 Strider Labs, Inc. Probable reconstruction of surfaces in occluded regions by computed symmetry
US7693325B2 (en) 2004-01-14 2010-04-06 Hexagon Metrology, Inc. Transprojection of geometry data
US7152456B2 (en) 2004-01-14 2006-12-26 Romer Incorporated Automated robotic measuring system
US7327857B2 (en) * 2004-03-09 2008-02-05 General Electric Company Non-contact measurement method and apparatus
US7824346B2 (en) * 2004-03-11 2010-11-02 Geodigm Corporation Determining condyle displacement utilizing electronic models of dental impressions having a common coordinate system
US7711179B2 (en) * 2004-04-21 2010-05-04 Nextengine, Inc. Hand held portable three dimensional scanner
US7466416B2 (en) * 2004-04-30 2008-12-16 X-Rite, Inc. Color measurement system
DE602005004332T2 (en) 2004-06-17 2009-01-08 Cadent Ltd. Method for providing data related to the oral cavity
US7557966B2 (en) * 2004-08-11 2009-07-07 Acushnet Company Apparatus and method for scanning an object
US7266425B2 (en) * 2004-09-30 2007-09-04 Rockwell Automation Technologies, Inc. Systems and methods that facilitate motion control through coordinate system transformations
US20060074305A1 (en) * 2004-09-30 2006-04-06 Varian Medical Systems Technologies, Inc. Patient multimedia display
US7307737B1 (en) * 2004-10-08 2007-12-11 Snap-On Incorporated Three-dimensional (3D) measuring with multiple reference frames
US20090083016A1 (en) * 2004-11-17 2009-03-26 Tecosim Technische Simulation Gmbh Method for Generating a Calculation Model for a Mechanical Structure
MX2007006105A (en) * 2004-11-22 2007-07-24 Bridgestone Corp External-appearance inspection apparatus.
EP1694048B1 (en) * 2005-02-16 2013-01-09 X-Rite Europe GmbH Colour measuring device and measuring method therefor
US7808644B2 (en) * 2005-03-24 2010-10-05 Obe Ohnmacht & Baumgartner Gmbh & Co. Kg Device for optically measuring the shapes of objects and surfaces
DE102005016525A1 (en) * 2005-04-08 2006-10-19 Degudent Gmbh Method for three-dimensional shape detection of a body
US7571638B1 (en) * 2005-05-10 2009-08-11 Kley Victor B Tool tips with scanning probe microscopy and/or atomic force microscopy applications
US9423693B1 (en) 2005-05-10 2016-08-23 Victor B. Kley In-plane scanning probe microscopy tips and tools for wafers and substrates with diverse designs on one wafer or substrate
JP2006349547A (en) * 2005-06-17 2006-12-28 Kanto Auto Works Ltd Noncontact type three-dimensional shape measuring method and measuring machine
US20080232679A1 (en) * 2005-08-17 2008-09-25 Hahn Daniel V Apparatus and Method for 3-Dimensional Scanning of an Object
US9119541B2 (en) * 2005-08-30 2015-09-01 Varian Medical Systems, Inc. Eyewear for patient prompting
WO2007033273A2 (en) * 2005-09-13 2007-03-22 Romer Incorporated Vehicle comprising an articulator of a coordinate measuring machine
WO2007043899A1 (en) 2005-10-14 2007-04-19 Applied Research Associates Nz Limited A method of monitoring a surface feature and apparatus therefor
US7780085B2 (en) * 2005-12-09 2010-08-24 Larue John D Round surface scanner
WO2008108749A1 (en) * 2006-01-20 2008-09-12 Nextpat Limited Desktop three-dimensional scanner
US7995834B1 (en) 2006-01-20 2011-08-09 Nextengine, Inc. Multiple laser scanner
DE502006009454D1 (en) * 2006-03-23 2011-06-16 Bernward Maehner OTHER OBJECTS
WO2007125081A1 (en) 2006-04-27 2007-11-08 Metris N.V. Optical scanning probe
US7568293B2 (en) * 2006-05-01 2009-08-04 Paul Ferrari Sealed battery for coordinate measurement machine
KR100753536B1 (en) * 2006-05-04 2007-08-30 주식회사 아이너스기술 Method for detecting 2 dimension sketch data of source model data for 3 dimension reverse modeling
US7805854B2 (en) 2006-05-15 2010-10-05 Hexagon Metrology, Inc. Systems and methods for positioning and measuring objects using a CMM
DE102006031833A1 (en) * 2006-05-24 2007-12-06 Dr. Wirth Grafische Technik Gmbh & Co. Kg Method for generating image information
RU2445007C2 (en) * 2006-05-24 2012-03-20 Конинклейке Филипс Электроникс, Н.В. Superposition of coordinate systems
US7784107B2 (en) * 2006-06-02 2010-08-24 Victor B. Kley High speed measurement, analysis and imaging systems and methods for length scales from meter to sub-nanometer
KR100753537B1 (en) * 2006-06-09 2007-08-30 주식회사 아이너스기술 Method for reverse modeling using mesh data with feature
DE102006031580A1 (en) 2006-07-03 2008-01-17 Faro Technologies, Inc., Lake Mary Method and device for the three-dimensional detection of a spatial area
DE102006031142B4 (en) * 2006-07-05 2012-02-16 Prüf- und Forschungsinstitut Pirmasens e.V. Method and device for three-dimensional measurement and detection of the complete object surface of a spherical measurement object such as a sports ball
US20080021317A1 (en) * 2006-07-24 2008-01-24 Siemens Medical Solutions Usa, Inc. Ultrasound medical imaging with robotic assistance for volume imaging
US7652275B2 (en) * 2006-07-28 2010-01-26 Mitutoyo Corporation Non-contact probe control interface
US7256899B1 (en) 2006-10-04 2007-08-14 Ivan Faul Wireless methods and systems for three-dimensional non-contact shape sensing
US8054500B2 (en) * 2006-10-10 2011-11-08 Hewlett-Packard Development Company, L.P. Acquiring three-dimensional structure using two-dimensional scanner
WO2008064276A2 (en) 2006-11-20 2008-05-29 Hexagon Metrology Ab Coordinate measurement machine with improved joint
WO2008080142A1 (en) * 2006-12-22 2008-07-03 Romer, Inc. Improved joint axis for coordinate measurement machine
JP2008256462A (en) * 2007-04-03 2008-10-23 Fanuc Ltd Image display method of shape data
CN104807425A (en) * 2007-04-03 2015-07-29 六边形度量衡股份公司 Method and device for exact measurement of objects
DE102007022361A1 (en) 2007-05-04 2008-11-06 Friedrich-Schiller-Universität Jena Device and method for the contactless detection of spatial coordinates of a surface
US7953247B2 (en) 2007-05-21 2011-05-31 Snap-On Incorporated Method and apparatus for wheel alignment
US7546689B2 (en) * 2007-07-09 2009-06-16 Hexagon Metrology Ab Joint for coordinate measurement device
US8605983B2 (en) 2007-08-17 2013-12-10 Renishaw Plc Non-contact probe
US8116519B2 (en) * 2007-09-26 2012-02-14 Honda Motor Co., Ltd. 3D beverage container localizer
US7774949B2 (en) * 2007-09-28 2010-08-17 Hexagon Metrology Ab Coordinate measurement machine
US7627448B2 (en) * 2007-10-23 2009-12-01 Los Alamost National Security, LLC Apparatus and method for mapping an area of interest
CN101424520B (en) * 2007-10-31 2011-03-23 鸿富锦精密工业(深圳)有限公司 Method for detecting partial contour outline of object curved surface
WO2009116508A1 (en) * 2008-03-19 2009-09-24 株式会社安川電機 Shape measuring device and robot device with the same
US20090322859A1 (en) * 2008-03-20 2009-12-31 Shelton Damion M Method and System for 3D Imaging Using a Spacetime Coded Laser Projection System
US7779548B2 (en) * 2008-03-28 2010-08-24 Hexagon Metrology, Inc. Coordinate measuring machine with rotatable grip
US8122610B2 (en) * 2008-03-28 2012-02-28 Hexagon Metrology, Inc. Systems and methods for improved coordination acquisition member comprising calibration information
US8520930B2 (en) 2008-04-18 2013-08-27 3D Scanners Ltd. Method and computer program for improving the dimensional acquisition of an object
EP2277000B1 (en) * 2008-04-18 2011-11-02 3D Scanners Ltd Method and computer program for improving the dimensional acquisition of an object
US7640674B2 (en) * 2008-05-05 2010-01-05 Hexagon Metrology, Inc. Systems and methods for calibrating a portable coordinate measurement machine
US8265425B2 (en) * 2008-05-20 2012-09-11 Honda Motor Co., Ltd. Rectangular table detection using hybrid RGB and depth camera sensors
US10667727B2 (en) 2008-09-05 2020-06-02 Varian Medical Systems, Inc. Systems and methods for determining a state of a patient
US7908757B2 (en) 2008-10-16 2011-03-22 Hexagon Metrology, Inc. Articulating measuring arm with laser scanner
US9482755B2 (en) 2008-11-17 2016-11-01 Faro Technologies, Inc. Measurement system having air temperature compensation between a target and a laser tracker
US9739595B2 (en) * 2008-12-11 2017-08-22 Automated Precision Inc. Multi-dimensional measuring system with measuring instrument having 360° angular working range
FR2940423B1 (en) * 2008-12-22 2011-05-27 Noomeo DENSE RECONSTRUCTION THREE-DIMENSIONAL SCANNING DEVICE
DK2401575T3 (en) * 2009-02-25 2020-03-30 Dental Imaging Technologies Corp Method and apparatus for generating a display of a three-dimensional surface
DE102009015920B4 (en) 2009-03-25 2014-11-20 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US9551575B2 (en) 2009-03-25 2017-01-24 Faro Technologies, Inc. Laser scanner having a multi-color light source and real-time color receiver
DE102009018464B4 (en) * 2009-04-22 2014-10-30 Leuze Electronic Gmbh + Co. Kg Optical sensor
PL2438397T3 (en) * 2009-06-01 2019-05-31 Dentsply Sirona Inc Method and device for three-dimensional surface detection with a dynamic reference frame
EP3620762A1 (en) * 2009-06-30 2020-03-11 Hexagon Technology Center GmbH Coordinate measurement machine with vibration detection
BE1018834A3 (en) * 2009-07-21 2011-09-06 Praet Peter Van INSPECTION CAMERA MOUNTED ON A MEDICAL DIAGNOSIS AUTOMAT.
GB0915904D0 (en) 2009-09-11 2009-10-14 Renishaw Plc Non-contact object inspection
US20110112786A1 (en) * 2009-11-06 2011-05-12 Hexagon Metrology Ab Cmm with improved sensors
US9529083B2 (en) 2009-11-20 2016-12-27 Faro Technologies, Inc. Three-dimensional scanner with enhanced spectroscopic energy detector
DE102009057101A1 (en) 2009-11-20 2011-05-26 Faro Technologies, Inc., Lake Mary Device for optically scanning and measuring an environment
US9113023B2 (en) 2009-11-20 2015-08-18 Faro Technologies, Inc. Three-dimensional scanner with spectroscopic energy detector
US9210288B2 (en) 2009-11-20 2015-12-08 Faro Technologies, Inc. Three-dimensional scanner with dichroic beam splitters to capture a variety of signals
TW201120405A (en) * 2009-12-11 2011-06-16 Jeteazy System Co Ltd Height measurement device having probe and method for using the device to measure height.
US20110213247A1 (en) * 2010-01-08 2011-09-01 Hexagon Metrology, Inc. Articulated arm with imaging device
US8630314B2 (en) 2010-01-11 2014-01-14 Faro Technologies, Inc. Method and apparatus for synchronizing measurements taken by multiple metrology devices
DE112011100290T5 (en) * 2010-01-20 2013-02-28 Faro Technologies Inc. Coordinate measuring machine with an illuminated probe end and operating method
US9879976B2 (en) 2010-01-20 2018-01-30 Faro Technologies, Inc. Articulated arm coordinate measurement machine that uses a 2D camera to determine 3D coordinates of smoothly continuous edge features
US9607239B2 (en) 2010-01-20 2017-03-28 Faro Technologies, Inc. Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations
US8832954B2 (en) 2010-01-20 2014-09-16 Faro Technologies, Inc. Coordinate measurement machines with removable accessories
US8615893B2 (en) 2010-01-20 2013-12-31 Faro Technologies, Inc. Portable articulated arm coordinate measuring machine having integrated software controls
US8875409B2 (en) 2010-01-20 2014-11-04 Faro Technologies, Inc. Coordinate measurement machines with removable accessories
US8638446B2 (en) 2010-01-20 2014-01-28 Faro Technologies, Inc. Laser scanner or laser tracker having a projector
US8898919B2 (en) 2010-01-20 2014-12-02 Faro Technologies, Inc. Coordinate measurement machine with distance meter used to establish frame of reference
US9163922B2 (en) 2010-01-20 2015-10-20 Faro Technologies, Inc. Coordinate measurement machine with distance meter and camera to determine dimensions within camera images
GB2489651B (en) 2010-01-20 2015-01-28 Faro Tech Inc Coordinate measurement machines with removable accessories
US8276286B2 (en) * 2010-01-20 2012-10-02 Faro Technologies, Inc. Display for coordinate measuring machine
US8677643B2 (en) 2010-01-20 2014-03-25 Faro Technologies, Inc. Coordinate measurement machines with removable accessories
US9628775B2 (en) 2010-01-20 2017-04-18 Faro Technologies, Inc. Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations
GB2477347A (en) * 2010-02-01 2011-08-03 Cambridge Entpr Ltd A Hand Operated Controller
DE102010008416A1 (en) * 2010-02-18 2011-08-18 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V., 80686 Device for optical measurement of pin or rod-shaped body e.g. screw thread, has line projectors arranged such that light line coincides with rotating axis, and evaluation- and memory units detecting, evaluating and storing optical signals
USD643319S1 (en) 2010-03-29 2011-08-16 Hexagon Metrology Ab Portable coordinate measurement machine
US9400170B2 (en) 2010-04-21 2016-07-26 Faro Technologies, Inc. Automatic measurement of dimensional data within an acceptance region by a laser tracker
US9772394B2 (en) 2010-04-21 2017-09-26 Faro Technologies, Inc. Method and apparatus for following an operator and locking onto a retroreflector with a laser tracker
US9377885B2 (en) 2010-04-21 2016-06-28 Faro Technologies, Inc. Method and apparatus for locking onto a retroreflector with a laser tracker
US9157865B2 (en) * 2010-05-03 2015-10-13 United Technologies Corporation Machine tool-based, optical coordinate measuring machine calibration device
DE102010020925B4 (en) 2010-05-10 2014-02-27 Faro Technologies, Inc. Method for optically scanning and measuring an environment
US8127458B1 (en) 2010-08-31 2012-03-06 Hexagon Metrology, Inc. Mounting apparatus for articulated arm laser scanner
US8542348B2 (en) * 2010-11-03 2013-09-24 Rockwell Automation Technologies, Inc. Color sensor insensitive to distance variations
US8867804B2 (en) * 2010-11-08 2014-10-21 Cranial Technologies, Inc. Method and apparatus for automatically generating trim lines for cranial remodeling devices
US9168654B2 (en) 2010-11-16 2015-10-27 Faro Technologies, Inc. Coordinate measuring machines with dual layer arm
CN102564302A (en) 2010-12-10 2012-07-11 General Electric Company Measuring system and method
TWI421794B (en) * 2010-12-16 2014-01-01 Univ Nat Taipei Technology Apparatus for reconstructing three-dimensional object model
US8939369B2 (en) 2011-01-24 2015-01-27 Datalogic ADC, Inc. Exception detection and handling in automated optical code reading systems
GB2518769A (en) 2011-03-03 2015-04-01 Faro Tech Inc Target apparatus and method
US8687172B2 (en) 2011-04-13 2014-04-01 Ivan Faul Optical digitizer with improved distance measurement capability
US9686532B2 (en) 2011-04-15 2017-06-20 Faro Technologies, Inc. System and method of acquiring three-dimensional coordinates using multiple coordinate measurement devices
US9164173B2 (en) 2011-04-15 2015-10-20 Faro Technologies, Inc. Laser tracker that uses a fiber-optic coupler and an achromatic launch to align and collimate two wavelengths of light
JP2014516409A (en) 2011-04-15 2014-07-10 Faro Technologies Inc. Improved position detector for laser trackers.
US9482529B2 (en) 2011-04-15 2016-11-01 Faro Technologies, Inc. Three-dimensional coordinate scanner and method of operation
DE102011018597B9 (en) 2011-04-21 2013-01-24 Vrmagic Gmbh Method for synchronized operation of a camera and a projector
DE102011104550B4 (en) 2011-06-17 2014-04-30 Precitec Kg Optical measuring device for monitoring a joint seam, joining head and laser welding head with the same
DE102011051781A1 (en) * 2011-07-12 2013-01-17 Göpel electronic GmbH Deflectometric arrangement for surface inspection
BR112014002063A2 (en) * 2011-07-29 2017-02-21 Hexagon Metrology Inc coordinate measurement system data reduction
GB2508564A (en) * 2011-09-13 2014-06-04 Osi Optoelectronics Inc Improved laser rangefinder sensor
JP2013071194A (en) * 2011-09-27 2013-04-22 Hitachi Koki Co Ltd Cutting machine, and emergency stop method of motor
US9391498B2 (en) * 2011-09-29 2016-07-12 General Electric Company Methods and systems for use in configuring a coil forming machine
DE102011119073A1 (en) 2011-11-15 2013-05-16 Fiagon Gmbh Registration method, position detection system and scanning instrument
US9451810B2 (en) 2011-11-18 2016-09-27 Nike, Inc. Automated identification of shoe parts
US10552551B2 (en) 2011-11-18 2020-02-04 Nike, Inc. Generation of tool paths for shoe assembly
US8958901B2 (en) 2011-11-18 2015-02-17 Nike, Inc. Automated manufacturing of shoe parts
US8849620B2 (en) * 2011-11-18 2014-09-30 Nike, Inc. Automated 3-D modeling of shoe parts
US8755925B2 (en) 2011-11-18 2014-06-17 Nike, Inc. Automated identification and assembly of shoe parts
US9179844B2 (en) 2011-11-28 2015-11-10 Aranz Healthcare Limited Handheld skin measuring or monitoring device
CN103186704A (en) * 2011-12-29 2013-07-03 Hongfujin Precision Industry (Shenzhen) Co., Ltd. Hunting filtering system and method
US8763267B2 (en) 2012-01-20 2014-07-01 Hexagon Technology Center Gmbh Locking counterbalance for a CMM
DE102012100609A1 (en) 2012-01-25 2013-07-25 Faro Technologies, Inc. Device for optically scanning and measuring an environment
JP6099675B2 (en) 2012-01-27 2017-03-22 Faro Technologies Inc. Inspection method by barcode identification
JP6290854B2 (en) 2012-03-30 2018-03-07 Nikon Metrology NV Improved optical scanning probe
US9779546B2 (en) 2012-05-04 2017-10-03 Intermec Ip Corp. Volume dimensioning systems and methods
US10007858B2 (en) 2012-05-15 2018-06-26 Honeywell International Inc. Terminals and methods for dimensioning objects
US9069355B2 (en) 2012-06-08 2015-06-30 Hexagon Technology Center Gmbh System and method for a wireless feature pack
US8997362B2 (en) 2012-07-17 2015-04-07 Faro Technologies, Inc. Portable articulated arm coordinate measuring machine with optical communications bus
US10321127B2 (en) 2012-08-20 2019-06-11 Intermec Ip Corp. Volume dimensioning system calibration systems and methods
CN103676447A (en) * 2012-09-07 2014-03-26 Primax Electronics Ltd. Desktop type three-dimensional image scanner
US20140081459A1 (en) * 2012-09-20 2014-03-20 Marc Dubois Depth mapping vision system with 2d optical pattern for robotic applications
US9513107B2 (en) 2012-10-05 2016-12-06 Faro Technologies, Inc. Registration calculation between three-dimensional (3D) scans based on two-dimensional (2D) scan data from a 3D scanner
DE102012109481A1 (en) 2012-10-05 2014-04-10 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US10067231B2 (en) 2012-10-05 2018-09-04 Faro Technologies, Inc. Registration calculation of three-dimensional scanner data performed between scans based on measurements by two-dimensional scanner
US8966775B2 (en) 2012-10-09 2015-03-03 Nike, Inc. Digital bite line creation for shoe assembly
US9841311B2 (en) 2012-10-16 2017-12-12 Hand Held Products, Inc. Dimensioning system
CN104919272B (en) * 2012-10-29 2018-08-03 7D Surgical Inc. Integrated lighting and optical surface topology detection system and its application method
US9866900B2 (en) 2013-03-12 2018-01-09 The Nielsen Company (Us), Llc Methods, apparatus and articles of manufacture to detect shapes
US9250214B2 (en) 2013-03-12 2016-02-02 Hexagon Metrology, Inc. CMM with flaw detection system
US9188430B2 (en) 2013-03-14 2015-11-17 Faro Technologies, Inc. Compensation of a structured light scanner that is tracked in six degrees-of-freedom
US9041914B2 (en) 2013-03-15 2015-05-26 Faro Technologies, Inc. Three-dimensional coordinate scanner and method of operation
US9778572B1 (en) 2013-03-15 2017-10-03 Victor B. Kley In-plane scanning probe microscopy tips and tools for wafers and substrates with diverse designs on one wafer or substrate
DE202013002483U1 (en) 2013-03-15 2014-06-16 Csb-System Ag Device for measuring an animal carcass half
DE102013104004A1 (en) 2013-04-19 2014-10-23 Schoen + Sandt Machinery Gmbh Test device and method
US9651525B2 (en) * 2013-06-27 2017-05-16 TecScan Systems Inc. Method and apparatus for scanning an object
US9900585B2 (en) * 2013-09-18 2018-02-20 Matter and Form Inc. Device, system and method for three-dimensional modeling
DE102013017500B3 (en) 2013-10-17 2015-04-02 Faro Technologies, Inc. Method and apparatus for optically scanning and measuring a scene
DE102013222230A1 (en) 2013-10-31 2015-04-30 Fiagon Gmbh Surgical instrument
JP2015099116A (en) * 2013-11-20 2015-05-28 Seiko Epson Corporation Component analyzer
US9163921B2 (en) 2013-12-18 2015-10-20 Hexagon Metrology, Inc. Ultra-portable articulated arm coordinate measurement machine
US9594250B2 (en) 2013-12-18 2017-03-14 Hexagon Metrology, Inc. Ultra-portable coordinate measurement machine
CN104827480A (en) * 2014-02-11 2015-08-12 Tyco Electronics (Shanghai) Co., Ltd. Automatic calibration method of robot system
US20150260509A1 (en) * 2014-03-11 2015-09-17 Jonathan Kofman Three dimensional (3d) imaging by a mobile communication device
WO2015155209A1 (en) 2014-04-08 2015-10-15 Nikon Metrology Nv Measurement probe unit for metrology applications
DE102014105456B4 (en) * 2014-04-16 2020-01-30 Minikomp Bogner GmbH Method for measuring the outer contour of three-dimensional measuring objects and associated measuring system
US9759540B2 (en) 2014-06-11 2017-09-12 Hexagon Metrology, Inc. Articulating CMM probe
US9395174B2 (en) 2014-06-27 2016-07-19 Faro Technologies, Inc. Determining retroreflector orientation by optimizing spatial fit
JP2016035623A (en) * 2014-08-01 2016-03-17 Canon Inc. Information processing apparatus and information processing method
US9823059B2 (en) 2014-08-06 2017-11-21 Hand Held Products, Inc. Dimensioning system with guided alignment
US9903811B2 (en) * 2014-08-12 2018-02-27 The Boeing Company Multi-spectral reflectometer
US9675430B2 (en) 2014-08-15 2017-06-13 Align Technology, Inc. Confocal imaging apparatus with curved focal surface
EP3194884B1 (en) 2014-09-19 2023-11-01 Hexagon Metrology, Inc Multi-mode portable coordinate measuring machine
DE102014114506B4 (en) * 2014-10-07 2020-06-04 Sick Ag Camera for mounting on a conveyor and method for inspection or identification
US10810715B2 (en) 2014-10-10 2020-10-20 Hand Held Products, Inc. System and method for picking validation
US10775165B2 (en) 2014-10-10 2020-09-15 Hand Held Products, Inc. Methods for improving the accuracy of dimensioning-system measurements
US9897434B2 (en) 2014-10-21 2018-02-20 Hand Held Products, Inc. Handheld dimensioning system with measurement-conformance feedback
US9575004B2 (en) * 2014-11-17 2017-02-21 The Boeing Company Automated low cost method for illuminating, evaluating, and qualifying surfaces and surface coatings
JP6126067B2 (en) 2014-11-28 2017-05-10 Fanuc Corporation Collaborative system with machine tool and robot
US9958256B2 (en) * 2015-02-19 2018-05-01 Jason JOACHIM System and method for digitally scanning an object in three dimensions
EP3265885A4 (en) 2015-03-03 2018-08-29 Prenav Inc. Scanning environments and tracking unmanned aerial vehicles
US9786101B2 (en) 2015-05-19 2017-10-10 Hand Held Products, Inc. Evaluating image values
IL255734B2 (en) * 2015-05-20 2023-10-01 Magic Leap Inc Tilt shift iris imaging
DE102015108389A1 (en) * 2015-05-27 2016-12-01 Carl Zeiss Industrielle Messtechnik Gmbh Lighting control when using optical measuring devices
US20160377414A1 (en) 2015-06-23 2016-12-29 Hand Held Products, Inc. Optical pattern projector
US9835486B2 (en) 2015-07-07 2017-12-05 Hand Held Products, Inc. Mobile dimensioner apparatus for use in commerce
US20170017301A1 (en) 2015-07-16 2017-01-19 Hand Held Products, Inc. Adjusting dimensioning results using augmented reality
WO2017023290A1 (en) 2015-07-31 2017-02-09 Hewlett-Packard Development Company, L.P. Turntable peripheral for 3d scanning
US10249030B2 (en) 2015-10-30 2019-04-02 Hand Held Products, Inc. Image transformation for indicia reading
DE102015119806A1 (en) * 2015-11-16 2017-05-18 Grob-Werke Gmbh & Co. Kg Method for displaying machining in a machine tool
US10451407B2 (en) * 2015-11-23 2019-10-22 The Boeing Company System and method of analyzing a curved surface
DE102015120380B4 (en) 2015-11-25 2023-08-10 Dr. Ing. H.C. F. Porsche Aktiengesellschaft Method for constructing and/or manufacturing a fluid line system in a motor vehicle
DE102015122844A1 (en) 2015-12-27 2017-06-29 Faro Technologies, Inc. 3D measuring device with battery pack
US10025314B2 (en) 2016-01-27 2018-07-17 Hand Held Products, Inc. Vehicle positioning and object avoidance
CN109196303B (en) 2016-04-01 2020-10-23 Lego A/S Toy scanner
US10013527B2 (en) 2016-05-02 2018-07-03 Aranz Healthcare Limited Automatically assessing an anatomical surface feature and securely managing information related to the same
US10339352B2 (en) 2016-06-03 2019-07-02 Hand Held Products, Inc. Wearable metrological apparatus
EP3258211B1 (en) 2016-06-17 2023-12-13 Hexagon Technology Center GmbH Determining object reflection properties with respect to particular optical measurement
DE102016214228A1 (en) * 2016-08-02 2018-02-08 Dr. Johannes Heidenhain Gmbh Method and device for controlling a milling machine
US11116407B2 (en) 2016-11-17 2021-09-14 Aranz Healthcare Limited Anatomical surface assessment methods, devices and systems
US10909708B2 (en) 2016-12-09 2021-02-02 Hand Held Products, Inc. Calibrating a dimensioner using ratios of measurable parameters of optically-perceptible geometric elements
US10893190B2 (en) 2017-02-02 2021-01-12 PreNav, Inc. Tracking image collection for digital capture of environments, and associated systems and methods
ES2973886T3 (en) * 2017-02-15 2024-06-24 3Shape As Monitor the scan volume of a 3D scanner
US10325413B2 (en) * 2017-02-28 2019-06-18 United Technologies Corporation Generating smooth optimized parts
US11047672B2 (en) 2017-03-28 2021-06-29 Hand Held Products, Inc. System for optically dimensioning
EP3606410B1 (en) 2017-04-04 2022-11-02 Aranz Healthcare Limited Anatomical surface assessment methods, devices and systems
JP6626036B2 (en) * 2017-04-18 2019-12-25 Fanuc Corporation Laser processing system with measurement function
US10733748B2 (en) 2017-07-24 2020-08-04 Hand Held Products, Inc. Dual-pattern optical 3D dimensioning
DE102017130909A1 (en) * 2017-12-21 2019-06-27 Weber Maschinenbau Gmbh Breidenbach Optical measuring device
US10739128B2 (en) * 2018-02-26 2020-08-11 The Boeing Company Laser scanner scanning using a computer numerical controlled (CNC) system for movement
DE102018002622A1 (en) * 2018-03-29 2019-10-02 Twinner Gmbh 3-D object detection system
US10584962B2 (en) 2018-05-01 2020-03-10 Hand Held Products, Inc. System and method for validating physical-item security
JP7297891B2 (en) 2018-07-19 2023-06-26 Activ Surgical, Inc. Systems and Methods for Multimodal Sensing of Depth in Vision Systems for Automated Surgical Robots
EP3637047A1 (en) * 2018-09-11 2020-04-15 MSG Maschinenbau GmbH Device and method for measuring the straightness of a rod-shaped workpiece
CN109764807B (en) * 2019-01-14 2021-03-05 Guangdong Yuandian Intelligent Technology Co., Ltd. 2D visual detection method and detection system for engine cylinder position calibration
CN118476870A (en) * 2019-01-21 2024-08-13 Huake Precision (Beijing) Medical Technology Co., Ltd. Surgical robot system and application method thereof
DE102019103519B4 (en) 2019-02-12 2021-09-16 Carl Zeiss Industrielle Messtechnik Gmbh Device for determining dimensional and / or geometric properties of a measurement object
EP3719749A1 (en) 2019-04-03 2020-10-07 Fiagon AG Medical Technologies Registration method and setup
JP2022526626A (en) 2019-04-08 2022-05-25 Activ Surgical, Inc. Systems and methods for medical imaging
WO2020234653A1 (en) 2019-05-20 2020-11-26 Aranz Healthcare Limited Automated or partially automated anatomical surface assessment methods, devices and systems
WO2021035094A1 (en) 2019-08-21 2021-02-25 Activ Surgical, Inc. Systems and methods for medical imaging
DE102019122655A1 (en) * 2019-08-22 2021-02-25 M & H Inprocess Messtechnik Gmbh Measuring system
US10885704B1 (en) 2019-09-04 2021-01-05 Microsoft Technology Licensing, Llc 3D mapping by distinguishing between different environmental regions
US11639846B2 (en) 2019-09-27 2023-05-02 Honeywell International Inc. Dual-pattern optical 3D dimensioning
KR102423720B1 (en) * 2020-09-21 2022-07-22 Motion Dynamics Co., Ltd. Radome polishing system
US11717967B2 (en) 2021-03-04 2023-08-08 TecScan Systems Inc. System and method for scanning an object using an array of ultrasonic transducers
US20220329737A1 (en) * 2021-04-13 2022-10-13 Okibo Ltd 3d polygon scanner
WO2024166602A1 (en) * 2023-02-07 2024-08-15 Keyence Corporation Three-dimensional measuring device

Citations (46)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0159187A2 (en) 1984-04-17 1985-10-23 Simon-Carves Limited A surface topography measuring system
US4628469A (en) 1982-09-29 1986-12-09 Technical Arts Corporation Method and apparatus for locating center of reference pulse in a measurement system
US4649504A (en) 1984-05-22 1987-03-10 Cae Electronics, Ltd. Optical position and orientation measurement techniques
US4691446A (en) 1985-09-05 1987-09-08 Ferranti Plc Three-dimensional position measuring apparatus
US4825391A (en) 1987-07-20 1989-04-25 General Electric Company Depth buffer priority processing for real time computer image generating systems
EP0328443A1 (en) 1988-02-09 1989-08-16 Sa Kreon Industrie Method for digitizing the surface of a three-dimensional object, and mapping apparatus to carry out this method
FR2629198A1 (en) 1988-03-25 1989-09-29 Kreon Ingenierie Marketing METHOD FOR DETERMINING AND RECONSTRUCTING THE SPATIAL COORDINATES OF EACH POINT FROM A SET OF POINTS SAMPLING A THREE-DIMENSIONAL SURFACE, AND METHOD FOR PRODUCING A THREE-DIMENSIONAL IMAGE OF THIS SURFACE FROM SUCH COORDINATES
WO1990008939A1 (en) 1989-02-06 1990-08-09 Vision 3D Method for calibrating a system for the tri-dimensional acquisition of shape and system for implementing such method
US4982102A (en) 1989-06-27 1991-01-01 Mitsubishi Denki Kabushiki Kaisha Apparatus for detecting three-dimensional configuration of object employing optical cutting method
US4993835A (en) 1989-06-16 1991-02-19 Mitsubishi Denki Kabushiki Kaisha Apparatus for detecting three-dimensional configuration of object employing optical cutting method
US5008555A (en) 1988-04-08 1991-04-16 Eaton Leonard Technologies, Inc. Optical probe with overlapping detection fields
DE3938714A1 (en) 1989-11-23 1991-05-29 Bernd Dr Breuckmann Optical determination of object shapes, shape variations - using structured coloured light projected onto objects for high resolution, dynamic measurement
US5090811A (en) 1989-05-31 1992-02-25 General Electric Company Optical radius gauge
WO1992007233A1 (en) 1990-10-15 1992-04-30 Schulz Waldean A Method and apparatus for three-dimensional non-contact shape sensing
WO1992008103A1 (en) 1990-10-24 1992-05-14 Böhler Gesellschaft M.B.H. Process and device for the opto-electronic measurement of objects
US5168528A (en) 1990-08-20 1992-12-01 Itt Corporation Differential electronic imaging system
US5189291A (en) 1989-05-01 1993-02-23 Symbol Technologies, Inc. Bar code reader operable as remote scanner or with fixed terminal
US5191642A (en) 1987-04-09 1993-03-02 General Electric Company Method for efficiently allocating computer resource for real time image generation
US5193120A (en) 1991-02-27 1993-03-09 Mechanical Technology Incorporated Machine vision three dimensional profiling system
FR2685764A1 (en) 1991-12-30 1993-07-02 Kreon Ind COMPACT AND HIGH RESOLUTION OPTICAL SENSOR FOR THREE DIMENSIONAL SHAPE ANALYSIS.
GB2264602A (en) 1991-12-20 1993-09-01 3D Scanners Ltd Object examination
GB2264601A (en) 1991-12-31 1993-09-01 3D Scanners Ltd Object inspection
US5251296A (en) 1990-03-16 1993-10-05 Hewlett-Packard Company Methods and apparatus for generating arbitrarily addressed, arbitrarily shaped tiles in computer graphics systems
US5255096A (en) 1992-04-10 1993-10-19 Boyle William M Video time code synchronized robot control apparatus
US5264678A (en) 1991-09-26 1993-11-23 Applied Research, Inc. Weld-bead profilometer
US5268996A (en) 1990-12-20 1993-12-07 General Electric Company Computer image generation method for determination of total pixel illumination due to plural light sources
EP0589750A1 (en) 1992-09-24 1994-03-30 KREON INDUSTRIE, Société Anonyme 3D graphics data process for patch generation, segmentation and selection of the set of points representing surfaces or curves
US5319445A (en) 1992-09-08 1994-06-07 Fitts John M Hidden change distribution grating and use in 3D moire measurement sensors and CMM applications
JPH06186025A (en) 1992-12-16 1994-07-08 Yunisun:Kk Three dimensional measuring device
JPH06229741A (en) 1993-01-29 1994-08-19 Central Glass Co Ltd Method and apparatus for inspecting transparent planar item
US5349378A (en) 1992-12-21 1994-09-20 Robotic Vision Systems, Inc. Context independent fusion of range and intensity imagery
US5357599A (en) 1992-07-30 1994-10-18 International Business Machines Corporation Method and apparatus for rendering polygons
US5362970A (en) 1979-04-30 1994-11-08 Sensor Adaptive Machines, Inc. Method and apparatus for electro-optically determining the dimension, location and attitude of objects
US5402582A (en) 1993-02-23 1995-04-04 Faro Technologies Inc. Three dimensional coordinate measuring apparatus
US5413454A (en) 1993-07-09 1995-05-09 Movsesian; Peter Mobile robotic arm
GB2288249A (en) 1994-04-08 1995-10-11 Moeller J D Optik Operating microscope unit with data interface
WO1996006325A1 (en) 1994-08-24 1996-02-29 Tricorder Technology Plc Scanning arrangement and method
WO1996010205A1 (en) 1994-09-28 1996-04-04 William Richard Fright Arbitrary-geometry laser surface scanner
EP0750176A1 (en) 1995-06-21 1996-12-27 Asulab S.A. Apparatus for measuring an angular speed
EP0750175A1 (en) 1995-06-23 1996-12-27 KREON INDUSTRIE, Société Anonyme Acquisition and numerisation method for objects through a transparent wall and system for carrying out this method
US5611147A (en) 1993-02-23 1997-03-18 Faro Technologies, Inc. Three dimensional coordinate measuring apparatus
US5784282A (en) 1993-06-11 1998-07-21 Bertin & Cie Method and apparatus for identifying the position in three dimensions of a movable object such as a sensor or a tool carried by a robot
US5812710A (en) 1996-02-07 1998-09-22 Fujitsu Limited Apparatus and method for optical equalization and amplification
US5886703A (en) 1995-02-01 1999-03-23 Virtus Corporation Perspective correct texture mapping system and methods with intelligent subdivision
US6611617B1 (en) 1995-07-26 2003-08-26 Stephen James Crampton Scanning apparatus and method
US20030191603A1 (en) 2002-02-14 2003-10-09 Simon Raab Portable coordinate measurement machine with integrated line laser scanner

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5264578A (en) 1987-03-20 1993-11-23 Allergan, Inc. Disubstituted acetylenes bearing heterobicyclic groups and heteroaromatic or phenyl groups having retinoid like activity
US5594044A (en) * 1995-03-03 1997-01-14 Videojet Systems International, Inc. Ink jet ink which is rub resistant to alcohol

Patent Citations (52)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5362970A (en) 1979-04-30 1994-11-08 Sensor Adaptive Machines, Inc. Method and apparatus for electro-optically determining the dimension, location and attitude of objects
US4628469A (en) 1982-09-29 1986-12-09 Technical Arts Corporation Method and apparatus for locating center of reference pulse in a measurement system
EP0159187A2 (en) 1984-04-17 1985-10-23 Simon-Carves Limited A surface topography measuring system
US4649504A (en) 1984-05-22 1987-03-10 Cae Electronics, Ltd. Optical position and orientation measurement techniques
US4691446A (en) 1985-09-05 1987-09-08 Ferranti Plc Three-dimensional position measuring apparatus
US5191642A (en) 1987-04-09 1993-03-02 General Electric Company Method for efficiently allocating computer resource for real time image generation
US4825391A (en) 1987-07-20 1989-04-25 General Electric Company Depth buffer priority processing for real time computer image generating systems
EP0328443A1 (en) 1988-02-09 1989-08-16 Sa Kreon Industrie Method for digitizing the surface of a three-dimensional object, and mapping apparatus to carry out this method
FR2629198A1 (en) 1988-03-25 1989-09-29 Kreon Ingenierie Marketing METHOD FOR DETERMINING AND RECONSTRUCTING THE SPATIAL COORDINATES OF EACH POINT FROM A SET OF POINTS SAMPLING A THREE-DIMENSIONAL SURFACE, AND METHOD FOR PRODUCING A THREE-DIMENSIONAL IMAGE OF THIS SURFACE FROM SUCH COORDINATES
EP0348247A1 (en) 1988-03-25 1989-12-27 Kreon Industrie Procedure to determine and to reconstruct spatial coordinates of each of the points of a collection of points sampling a three-dimensional surface and reconstruction of that surface from these coordinates
US5008555A (en) 1988-04-08 1991-04-16 Eaton Leonard Technologies, Inc. Optical probe with overlapping detection fields
WO1990008939A1 (en) 1989-02-06 1990-08-09 Vision 3D Method for calibrating a system for the tri-dimensional acquisition of shape and system for implementing such method
US5189291A (en) 1989-05-01 1993-02-23 Symbol Technologies, Inc. Bar code reader operable as remote scanner or with fixed terminal
US5090811A (en) 1989-05-31 1992-02-25 General Electric Company Optical radius gauge
US4993835A (en) 1989-06-16 1991-02-19 Mitsubishi Denki Kabushiki Kaisha Apparatus for detecting three-dimensional configuration of object employing optical cutting method
US4982102A (en) 1989-06-27 1991-01-01 Mitsubishi Denki Kabushiki Kaisha Apparatus for detecting three-dimensional configuration of object employing optical cutting method
DE3938714A1 (en) 1989-11-23 1991-05-29 Bernd Dr Breuckmann Optical determination of object shapes, shape variations - using structured coloured light projected onto objects for high resolution, dynamic measurement
US5251296A (en) 1990-03-16 1993-10-05 Hewlett-Packard Company Methods and apparatus for generating arbitrarily addressed, arbitrarily shaped tiles in computer graphics systems
US5168528A (en) 1990-08-20 1992-12-01 Itt Corporation Differential electronic imaging system
US5198877A (en) 1990-10-15 1993-03-30 Pixsys, Inc. Method and apparatus for three-dimensional non-contact shape sensing
WO1992007233A1 (en) 1990-10-15 1992-04-30 Schulz Waldean A Method and apparatus for three-dimensional non-contact shape sensing
WO1992008103A1 (en) 1990-10-24 1992-05-14 Böhler Gesellschaft M.B.H. Process and device for the opto-electronic measurement of objects
US5268996A (en) 1990-12-20 1993-12-07 General Electric Company Computer image generation method for determination of total pixel illumination due to plural light sources
US5193120A (en) 1991-02-27 1993-03-09 Mechanical Technology Incorporated Machine vision three dimensional profiling system
US5264678A (en) 1991-09-26 1993-11-23 Applied Research, Inc. Weld-bead profilometer
GB2264602A (en) 1991-12-20 1993-09-01 3D Scanners Ltd Object examination
EP0550300A1 (en) 1991-12-30 1993-07-07 Sa Kreon Industrie Compact optical sensor of high resolution for analysing three-dimensional shapes
US5424835A (en) 1991-12-30 1995-06-13 Kreon Industrie High-resolution compact optical sensor for scanning three-dimensional shapes
FR2685764A1 (en) 1991-12-30 1993-07-02 Kreon Ind COMPACT AND HIGH RESOLUTION OPTICAL SENSOR FOR THREE DIMENSIONAL SHAPE ANALYSIS.
GB2264601A (en) 1991-12-31 1993-09-01 3D Scanners Ltd Object inspection
US5255096A (en) 1992-04-10 1993-10-19 Boyle William M Video time code synchronized robot control apparatus
US5255096B1 (en) 1992-04-10 1997-12-23 William M Boyle Video time code synchronized robot control apparatus
US5357599A (en) 1992-07-30 1994-10-18 International Business Machines Corporation Method and apparatus for rendering polygons
US5319445A (en) 1992-09-08 1994-06-07 Fitts John M Hidden change distribution grating and use in 3D moire measurement sensors and CMM applications
EP0589750A1 (en) 1992-09-24 1994-03-30 KREON INDUSTRIE, Société Anonyme 3D graphics data process for patch generation, segmentation and selection of the set of points representing surfaces or curves
JPH06186025A (en) 1992-12-16 1994-07-08 Yunisun:Kk Three dimensional measuring device
US5349378A (en) 1992-12-21 1994-09-20 Robotic Vision Systems, Inc. Context independent fusion of range and intensity imagery
JPH06229741A (en) 1993-01-29 1994-08-19 Central Glass Co Ltd Method and apparatus for inspecting transparent planar item
US5402582A (en) 1993-02-23 1995-04-04 Faro Technologies Inc. Three dimensional coordinate measuring apparatus
US5611147A (en) 1993-02-23 1997-03-18 Faro Technologies, Inc. Three dimensional coordinate measuring apparatus
US5784282A (en) 1993-06-11 1998-07-21 Bertin & Cie Method and apparatus for identifying the position in three dimensions of a movable object such as a sensor or a tool carried by a robot
US5413454A (en) 1993-07-09 1995-05-09 Movsesian; Peter Mobile robotic arm
GB2288249A (en) 1994-04-08 1995-10-11 Moeller J D Optik Operating microscope unit with data interface
WO1996006325A1 (en) 1994-08-24 1996-02-29 Tricorder Technology Plc Scanning arrangement and method
WO1996010205A1 (en) 1994-09-28 1996-04-04 William Richard Fright Arbitrary-geometry laser surface scanner
US5886703A (en) 1995-02-01 1999-03-23 Virtus Corporation Perspective correct texture mapping system and methods with intelligent subdivision
EP0750176A1 (en) 1995-06-21 1996-12-27 Asulab S.A. Apparatus for measuring an angular speed
EP0750175A1 (en) 1995-06-23 1996-12-27 KREON INDUSTRIE, Société Anonyme Acquisition and numerisation method for objects through a transparent wall and system for carrying out this method
US6611617B1 (en) 1995-07-26 2003-08-26 Stephen James Crampton Scanning apparatus and method
US7313264B2 (en) 1995-07-26 2007-12-25 3D Scanners Limited Scanning apparatus and method
US5812710A (en) 1996-02-07 1998-09-22 Fujitsu Limited Apparatus and method for optical equalization and amplification
US20030191603A1 (en) 2002-02-14 2003-10-09 Simon Raab Portable coordinate measurement machine with integrated line laser scanner

Non-Patent Citations (422)

* Cited by examiner, † Cited by third party
Title
"Digibot 3D Object Digitizing Systems," Cadence, Nov. 1991.
"Software Architecture" (no date), Marked as Page No. M0082698.
"Software Interface" (no date), Marked as Page No. M0082697.
"System Block Diagram", (no date), Marked as Page No. M0082696.
3D Digitizer, 3D Videolaser, Descriptive Notice, Labege, Oct. 21, 1989.
3D Scanners information on Replica Surface Digitising and NC Program Preparation System, (no date), in English with French translation, Marked as Page Nos. M0083790-M0083796.
3D Scanners information on Reversa Reverse Engineering System (no date), in English with French translation, Marked as Page Nos. M0083786-M0083789.
3D Scanners Ltd. information on "Replica 3D Surface Digitising and NC Program Preparation System", (no date), Marked as Page Nos. M0083181-M0083182.
3D Scanners Ltd. information on "Stripe 3D Surface Digitising Probe" (no date), Marked as Page Nos. M0083183-M0083184.
3D Scanners Ltd. information on "Surfa Flatness Sensing System" (no date), Marked as Page Nos. M0083185-M0083186.
3D Scanners Memo to Greg Fraser of Faro Technologies From Stuart Hamilton, Dated Aug. 18, 1995.
3D Scanners Memo to Gregory Fraser of Faro Technologies From Stephen Crampton, Dated Apr. 22, 2009.
3D Scanners News Release, "Hand Held 3D Laser Scanner Announced by 3D Scanners," Dated Aug. 7, 1995.
3D Scanners, "Hand Held 3D Laser Scanner Announced by 3D Scanners" Dated Aug. 7, 1995.
3D Technologies Handover, Comments in no particular order (no time) (no date), Marked as Page No. M0082692.
3D Technologies Handover, Comments in no particular order (no time) (no date), Marked as Page No. M0082688.
3D Technology Inc. "Sensor Configuration Overview", (no date), Marked as Page No. M0082695.
3D Videolaser, Typical Use as a No-Contact Sensor for 3-D Production Line Control, Marked as Page Nos. M0083618-M0083620 (no date).
A. D. Linney, et al., "Use of 3-D visualisation system in the planning and evaluation of facial surgery," Proc. SPIE Conf. on Biostereometrics and Applications, Boston, MA, Nov. 1990, Marked as Page Nos. M0082715-M0082724.
Address and telephone Information for Mr. Pierre Veron, Marked as Page No. M0082910.
All You Need to Know About Kreon Reverse Engineering System, Kreon Industries, Apr. 1996.
Autofact '94, Nov. 13-17, 1994, Cobo Conference Center, Detroit, Michigan, Conference Proceeding, total 14 pages.
Automated 4-Axis 3D Laser Digitizing, Digi-Botics 3D Laser Digitizing Systems, total 2 pages.
Bernard C. Jiang, et al., "A Review of Recent Developments in Robot Metrology", Journal of Manufacturing Systems, vol. 7, No. 4, (1988), pp. 339-357.
Besl, P.J. and N.D. McKay, "A Method for Registration of 3-D Shapes", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 14, No. 2, pp. 239-256, Feb. 1992.
Bob Simon and Terry T. Wohlers, "Capturing Surface Data" total 1 page, Cadence Nov. 1991.
Bordereau, De Pieces Complementaires, Communiquees (Schedule of Additional Documents Communicated), dated Mar. 19, 1997, Marked as Page Nos. M0083777-M0083778.
Bordereau, De Pieces Complementaires, Communiquees, dated Mar. 19, 1997, Marked as Page Nos. M0083777-M0083778. English Translation provided, marked as pp. M0083777-M0083778.
Brochure of Kreon Industries, Inc., A New Liberty in Reverse Engineering, 7 pages.
Certificate of Service; Case 1:08-cv-11187-PBS, Document 321, Filed Jun. 10, 2011, p. 1.
Chia-Wei Liao, et al., "Surface Approximation of a Cloud of 3D Points", Graphical Models and Image Processing, vol. 57, No. 1, Jan. 1995, pp. 67-74.
Complaint and Demand for Jury Trial, Filed Jul. 11, 2008.
Counterclaim Defendants' Motion for Leave to File Motion for Summary Judgment as to Faro's Antitrust and Unfair Competition Counterclaims (Counterclaim Counts VII-IX); Case 1:08-cv-11187-PBS, Document 234, Filed Aug. 27, 2010, pp. 1-3.
Counterclaim Defendants' Motion for Summary Judgment as to Faro's Antitrust and Unfair Competition Counterclaims (Counterclaim Counts VII-IX); Case 1:08-cv-11187-PBS, Document 246, Filed Sep. 10, 2010, pp. 1-28.
Counterclaim Defendants' Motion to File under Seal its Motion for Summary Judgment as to Faro's Antitrust and Unfair Competition Counterclaims (Counterclaim Counts VII-IX); Case 1:08-cv-11187-PBS, Document 235, Filed Aug. 27, 2010, pp. 1-4.
D.K. Naidu et al., A Comparative Analysis of Algorithms for Determining the Peak Position of a Stripe to Sub-Pixel Accuracy, Department of Artificial Intelligence, University of Edinburgh, pp. 217-225.
Decision of Final Rejection drafted Dec. 15, 2006, for Japanese Patent Application No. H09-507376.
Declaration of Angela T. Rella in Support of Nikon Metrology's Surreply Memorandum in Further Support of its Opposition to Defendant's Motion for Partial Summary Judgment That the Patents-in-Suit are Unenforceable Due to Inequitable Conduct, Civil Action No. 08-CV-11187 (PBS), Filed May 20, 2010.
Declaration of Gregory D. Hager, Ph.D., Civil Action No. 08-CV-11187 (PBS), Filed Mar. 8, 2010.
Declaration of Gregory D. Hager, Ph.D., Filed Sep. 14, 2009, Civil Action No. 08-CV-11187(PBS).
Declaration of Koenraad Van der Elst in Support of Counterclaim Defendants' Motion for Summary Judgment as to Faro's Antitrust and Unfair Competition Counterclaims (Counterclaim Counts VII-IX); Case 1:08-cv-11187-PBS, Document 238, Filed Aug. 27, 2010, pp. 1-3.
Declaration of Martin J. O'Donnell, Civil Action No. 08-CV-11187 (PBS), Filed Mar. 8, 2010.
Declaration of Merton E. Thompson in Support of Counterclaim Defendants' Motion to File for Summary Judgment as to Faro's Antitrust and Unfair Competition Counterclaims (Counterclaim Counts VII-IX); Case 1:08-cv-11187-PBS, Documents 237 and 237-1 through 237-9, Filed Aug. 27, 2010.
Declaration of Merton E. Thompson in Support of the Opposition to Defendant Faro Technologies Inc.'s Motion to Compel Production of Documents and an Additional Deposition Relating to Certain Antitrust and Damages Issues from Plaintiffs and from Nikon Corporation and Request for Attorneys Fees and Costs; Case 1:08-cv-11187-PBS, Documents 269 and 269-1 through 269-15, Filed Oct. 18, 2010.
Declaration of Merton Thompson, Civil Action No. 08-CV-11187(PBS), Filed Mar. 8, 2010.
Declaration of Stephen Crampton in Support of Plaintiffs' Opposition to Defendant's Motion for Partial Summary Judgment That the Patents-in-Suit are Unenforceable Due to Inequitable Conduct During Patent Prosecution, Civil Action No. 08-CV-11187 (PBS), Filed Mar. 8, 2010.
Declaration of William J. Cass, Civil Action No. 08-CV-11187 (PBS), Filed Feb. 9, 2010.
Declaration of William J. Cass, Filed Aug. 13, 2009, Civil Action No. 08-CV-11187(PBS).
Declaration of William J. Cass; Case 1:08-cv-11187-PBS, Document 257, Filed Oct. 12, 2010, pp. 1-9.
Declaration of William J. Cass; Case 1:08-cv-11187-PBS, Documents 251 and 251-1 through 251-6, Filed Oct. 4, 2010.
Defendant Faro Technologies Inc.'s Answer, Affirmative Defenses, Counterclaim and Demand for Jury Trial, Filed Dec. 5, 2008.
Defendant Faro Technologies Inc.'s Designation of Additional Exhibits; Case 1:08-cv-11187-PBS, Document 270, Filed Oct. 19, 2010, pp. 1-2.
Defendant Faro Technologies Inc.'s Designation of Additional Exhibits; Case 1:08-cv-11187-PBS, Document 277, Filed Oct. 21, 2010, p. 1.
Defendant Faro Technologies Inc.'s First Amended Answer, Affirmative Defenses, Counterclaim and Demand for Jury Trial, Filed Sep. 1, 2009.
Defendant Faro Technologies Inc.'s Motion for Leave to File Supplemental Briefing and Request for Additional Findings of Fact and Conclusions of Law that the Asserted U.S. Patent No. 7,313,264 is Unenforceable; Case 1:08-cv-11187-PBS, Document 310, Filed May 20, 2011, pp. 1-6.
Defendant Faro Technologies Inc.'s Motion to Compel Production of Documents and an Additional Deposition Relating to Certain Antitrust and Damages Issues and Request for Attorneys Fees and Costs; Case 1:08-cv-11187-PBS, Document 261, Filed Oct. 14, 2010, pp. 1-15.
Defendant Faro Technologies Inc.'s Motion to Seal the Hearing on: Faro's Motion for Summary Judgment of Non-Infringement; and Request to Make this Case Exceptional under 35 U.S.C. § 285; Case 1:08-cv-11187-PBS, Document 314 (6 pages) and 314-1 (three pages), Filed Jun. 3, 2011.
Defendant Faro Technologies Inc.'s Notice of Correction of the Record; Case 1:08-cv-11187-PBS, Documents 322 (2 pages) and 322-1 (69 pages), Filed Jun. 10, 2011.
Defendant Faro Technologies Inc.'s Opposition to: Plaintiffs' Motion for Leave to File Supplemental Brief regarding Faro's Inequitable Conduct Affirmative Defense and Counterclaim in Light of New Controlling Precedent; Case 1:08-cv-11187-PBS, Document 306, Filed Jan. 26, 2011, pp. 1-3.
Defendant Faro Technologies Inc.'s Renewed Motion for Summary Judgment of Non-Infringement; Case 1:08-cv-11187-PBS, Document 313, Filed May 26, 2011, pp. 1-3.
Defendant Faro Technologies Inc.'s Response to Plaintiffs' Proposed Conclusions of Law and Application of the Law to the Facts regarding Inequitable Conduct Affirmative Defense and Counterclaim; Case 1:08-cv-11187-PBS, Documents 298 and 298-1, Filed Dec. 10, 2010.
Defendant Faro Technologies Inc.'s Supplemental Brief that the Asserted Patents are Unenforceable under Therasense, Inc. v. Becton, Dickinson & Co.; Case 1:08-cv-11187-PBS, Document 318, Filed Jun. 9, 2011, pp. 1-22.
Defendant Faro Technologies Inc.'s Trial Brief for Bench Trial on Inequitable Conduct, Civil Action No. 08-CV-11187(PBS), Filed Aug. 18, 2010.
Defendant Faro Technologies Inc.'s Response to Plaintiffs' Proposed Additional Findings of Fact and Conclusions of Law regarding Faro's Inequitable Conduct Affirmative Defense and Counterclaim; Case 1:08-cv-11187-PBS, Documents 324 (98 pages), 324-1 (6 pages), 324-2 (3 pages), Filed Jun. 17, 2011, pp. 1-107.
Defendant Faro Technologies Inc.'s: Memorandum of Law in Support of its Request for Additional Findings of Fact and Conclusions of Law that U.S. Patent No. 7,313,264 is Unenforceable; Case 1:08-cv-11187-PBS, Document 312, Filed May 23, 2011, pp. 1-23.
Defendant Faro Technologies Inc.'s: Post Trial Brief; Case 1:08-cv-11187-PBS, Documents 299 and 299-1, Filed Dec. 10, 2010.
Defendant Faro Technologies Inc.'s: Request for Additional Findings of Fact and Conclusions of Law on its Claim that U.S. Patent No. 7,313,264 is Unenforceable; Case 1:08-cv-11187-PBS, Document 311, Filed May 23, 2011, pp. 1-26.
Defendant Faro Technologies Inc.'s: Request for Findings of Fact and Conclusions of Law on its Claim that the Patent-In-Suit are Unenforceable due to Inequitable Conduct during Patent Prosecution; Case 1:08-cv-11187-PBS, Document 292, Filed Nov. 19, 2010, 54 pages.
Defendant Faro Technologies Inc.'s: Responses to Plaintiffs' Proposed Findings of Fact regarding Faro's Inequitable Conduct Affirmative Defense and Counterclaim; Case 1:08-cv-11187-PBS, Documents 297 and 297-1, Filed Dec. 10, 2010.
Defendant Faro Technologies, Inc.'s Local Rule 56.1 Statement of Material Facts in Support of: Faro's Motion for Partial Summary Judgment That the Patents-In-Suit are Unenforceable Due to Inequitable Conduct During Patent Prosecution, Civil Action No. 08-CV-11187 (PBS), Filed Feb. 9, 2010.
Defendant Faro Technologies, Inc.'s Markman Brief, Filed Aug. 13, 2009.
Defendant Faro Technologies, Inc.'s Memorandum of Law in Support of Faro's Motion for Partial Summary Judgment That the Patents-in-Suit are Unenforceable Due to Inequitable Conduct During Patent Prosecution; That This Action is an Exceptional Case; and That Faro be Awarded its Attorneys Fees Under 35 U.S.C. Section 285, Civil Action No. 08-CV-11187 (PBS), Filed Feb. 9, 2010.
Defendant Faro Technologies, Inc.'s Preliminary Contentions, Civil Action No. 08-CV-11187 (PBS), Filed Mar. 23, 2010.
Defendant Faro Technologies, Inc.'s Reply to Plaintiffs' Markman Brief, Filed Sep. 14, 2009.
Defendant Faro Technologies, Inc.'s Supplement to its Preliminary Contentions, Civil Action No. 08-CV-11187 (PBS), Filed Jan. 13, 2010.
Defendant's Assented-To Motion for Extension of Time to File Opposition to Plaintiffs' Motion for Summary Judgment and Extension for Plaintiffs to File Reply Brief; Case 1:08-cv-11187-PBS, Document 248, Filed Sep. 13, 2010, pp. 1-3.
Digibotics, Inc. News Release "Digibot II and Surfacer Software Team Up to Streamline and Enhance Reverse Engineering Process", Jul. 8, 1994.
Digibotics, Inc. News Release "Digibot II Visual Scan Interface and Automated Scan Procedures Simplifies the Task of Scanning Objects in 3 Dimensions", Dated Jul. 9, 1994.
Digibotics, Inc. News Release "Digibotics Announces Its 32 Bit Data Editor, Digiedit NT32, for the Microsoft's Windows NT® Operating System", 1 page.
Digibotics, Inc. News Release "Digibotics Announces Its New Interactive Triangulator".
Document labeled "Conclusions", R.G. 95/14282, For 3D Scanners versus Kreon Industrie, dated Jul. 8, 1997, Marked as Page Nos. M0083684-M0083687.
Document labeled "Conclusions", R.G. 95/14282, For 3D Scanners versus Kreon Industrie, dated Jul. 8, 1997, Marked as Page Nos. M0083684-M0083687. English Translation provided, marked as pp. M0083684-M0083687.
Document labeled "Conclusions", R.G.: 95/14284, dated Feb. 20, 1998, Marked as Page Nos. M0083676-M0083683.
Document labeled "Conclusions", R.G.: 95/14284, dated Feb. 20, 1998, Marked as Page Nos. M0083676-M0083683. English Translation provided, marked as pp. M0083676-M0083683.
Document labeled "Conclusions", RG No. 95/14284, dated Mar. 21, 1997, Marked as Page Nos. M0083705-M0083714.
Document labeled "Conclusions", RG No. 95/14284, dated Mar. 21, 1997, Marked as Page Nos. M0083705-M0083714. English Translation provided, marked as pp. M0083705-M0083714.
Document labeled Conclusions:, RG No. 95/14284, dated Mar. 22, 1996, Marked as Page Nos. M0083716-M0083722.
Document labeled Conclusions:, RG No. 95/14284, dated Mar. 22, 1996, Marked as Page Nos. M0083716-M0083722. English Translation provided, marked as pp. M0083716-M0083721.
Document titled "Conclusions" for 3D Scanners versus Kreon Industrie, Hearing for preparation for trial of May 15, 1998, Marked as Page Nos. M0083665-M0083669.
Draft Letter of Intent between Vision 3D and 3D Scanners Limited (no date), Marked as Page Nos. M0083203-M0083205.
Drawing labeled "General Sahpe of 3D Videolaser Sensor", Marked as Page Nos. M0083812-M0083813.
Electronic Clerk's Note for proceedings held before Magistrate Judge Marianne B. Bowler; dated Nov. 8, 2010; Case 1:08-cv-11187-PBS, pp. 1-2.
Electronic Notice Re: Courtesy Copies dated Dec. 13, 2010; Case 1:08-cv-11187-PBS, pp. 1-2.
Electronic Order entered granting Motion to Compel dated Nov. 5, 2010; Case 1:08-cv-11187-PBS, pp. 1-2.
Electronic Order entered Granting Faro's Motion for Leave dated May 23, 2011; Case 1:08-cv-11187-PBS, pp. 1-2.
Electronic Order entered granting Motion for Leave to File Excess Pages dated Jun. 10, 2011; Case 1:08-cv-11187-PBS, pp. 1-2.
Electronic Order finding Motion on Order and Sealed Motion as Moot, dated Nov. 22, 2010; Case 1:08-cv-11187-PBS, pp. 1-2.
Electronic Order Setting Hearing on Summary Judgment Motion dated May 5, 2011; Case 1:08-cv-11187-PBS, pp. 1-2.
Electronic Procedural Order entered re Notice/Request for Additional Findings of Fact and Conclusions of Law, dated May 23, 2011; Case 1:08-cv-11187-PBS, pp. 1-2.
Electronic Procedural Order Entered, dated May 26, 2011; Case 1:08-cv-11187-PBS, pp. 1-2.
English translation of French language documents with first document labeled "Requete a Fin de Saisie-Contrefacon", Marked as Page Nos. M0083893-M0083908, with pp. M0083900-M0083901 being an English document labeled "Reversa, Reverse Engineering System". Partial English Translation provided, marked as pp. M0083893-M0083899 and M0083902-M0083908.
English translation of Patent No. UK 0550300, dated May 24, 1995, Marked as Page Nos. M0083753-M0083776.
English translation of Request for Authorization to Perform a Seizure for Infringement dated Mar. 14, 1995, Marked as Page Nos. M0083873-M0083885.
Envelope addressed to 3D Scanners Limited dated Dec. 2, 1993, Marked as Page Nos. M0083172-M0083173.
EOIS General Information on the EOIS Mini-Moire Sensor, Exhibit Raphael 161.
EOIS Letter From the President John K. Fitts, Ph.D. Addressed to Direct Dimensions Dated Jun. 22, 1995.
EOIS Letter From the President John K. Fitts, Ph.D. Addressed to Wohlers Associates, Dated Nov. 25, 1992.
European Patent Office Search Report on European Search for Application No. EP 92 40 3280, Marked as Page No. M0083414. English Translation provided, marked as p. M0083414.
European Patent Office Search Report on European Search for Application No. EP 93 40 2208, Marked as Page No. M008338. English Translation provided, marked as p. M008338.
Exhibit 1, Re-Notice of Deposition of Stephen Crampton Dated Apr. 21, 2009, Civil Action No. 08-CV-11187(PBS).
Exhibit 1, Thomas R. Kurfess, P.E., Earned Degrees, Employment, Teaching and Education.
Exhibit 11, "Hand Held 3D Laser Scanner Announced by 3D Scanners", Aug. 7, 1995.
Exhibit 11, EOIS Letter From President John K. Fitts to Wohlers Associates, Dated Nov. 25, 1992.
Exhibit 12, Memo From Stuart Hamilton of 3D Scanners to Greg Fraser of Faro Technologies Dated Aug. 13, 1995.
Exhibit 13, Modern Applications News, "Moire Sensor Technology and the CMM", pp. 38-39, Sep. 1994.
Exhibit 14, Data Creator Development Objectives for Wednesday Jun. 28.
Exhibit 14, EOIS Product Information on Mini-Moire™ Portable Arm System Non-Contact 3D Data Collection.
Exhibit 15, Memo From Stuart Hamilton of 3D Scanners to Greg Fraser of Faro Technologies Dated Aug. 4, 1995.
Exhibit 16, Memo From Stuart Hamilton of 3D Scanners to Greg Fraser of Faro Technologies Dated Mar. 11, 2009.
Exhibit 17, Memo From Stuart Hamilton of 3D Scanners to Greg Fraser of Faro Technologies Dated Aug. 18, 1995.
Exhibit 18, "Optoelectronic 3D-Trigger Probe, OTS5-LD".
Exhibit 18, Memo From Stuart Hamilton of 3D Scanners to Greg Fraser of Faro Technologies Dated Aug. 4, 1995.
Exhibit 19, 3D Scanners Memo to Gregory Fraser From Stephen Crampton Dated Apr. 22, 2009.
Exhibit 19, Industrial Faro Arm, Bronze Series, Liberated . . . CMM, 2-2-2 and 2-1-3 Configuration 1995, Faro Technologies, Inc.
Exhibit 2, Complaint and Demand for Jury Trial Dated Jul. 11, 2008.
Exhibit 2, Kurfess Expert Report, List of Equipment.
Exhibit 20, Memo From Stephen Crampton of 3D Scanners to Gregory Fraser of Faro Technologies Dated Apr. 22, 2009.
Exhibit 21, Letter From Stephen Crampton to Dr. Stuart Hamilton Dated Apr. 8, 1995.
Exhibit 22, Data Creation, Business Plan 1995-1997 Dated Apr. 7, 1995.
Exhibit 22, Facsimile Transmission to Mr. Allen Sajedi of Faro Technologies From Mr. Jean Louis Dalla Verde of Societe Kreon Industrie, Dated Jun. 5, 1995.
Exhibit 23, DTI Smart Competition Proposal, Data Creator Flexible 3D Data Capture System, Dated Apr. 7, 1995.
Exhibit 24, Deposition of Stephen Crampton Dated May 7, 2009, Civil Action No. 08-CV-11187 (PBS).
Exhibit 25, Data Creator Progress Meeting 1, Dated May 15, 1995.
Exhibit 26, Memo From Stephen Crampton of 3D Scanners to Alan Sajadi of Faro Technologies Dated Jun. 7, 1996.
Exhibit 27, 3D Scanners Information Memorandum (No Date).
Exhibit 27, 3D Technology, Inc. Jan. 24, 1994 Fax to Naval Kapoor From Ed Vinarus Re: Sensor Connector.
Exhibit 28, Diagram and Chart Showing Designation des Signaux de Sortie du Connecteur Capteur (Designation of the Sensor Connector Output Signals).
Exhibit 28, Memo From Stuart Hamilton of 3D Scanners to Greg Fraser of Faro Technologies Dated Nov. 5, 1996.
Exhibit 29, Industrial Faro Arm Silver Series Liberated . . . CMM, 2-2-2 and 2-1-3 Configuration 1995, Faro Technologies, Inc.
Exhibit 29, Letter From Naval Kapoor With Attachment Shown as Rev. A, Jan. 9, 1994.
Exhibit 3, E-Mail Dated Mar. 12, 2003 From Chris Dryden to Stephen Crampton Re: MM Patent and Faro.
Exhibit 3, Kurfess Expert Report, Materials Considered.
Exhibit 30, 3D Multimedia Support Centre 22127: 3DMSC, Supplement to Proposal Oct. 2, 1995.
Exhibit 30, Letter From Naval Kapoor to Mr. Peter Champ Dated Feb. 17, 1994.
Exhibit 31, Memo From Stuart Hamilton of 3D Scanners to Charlie Pritchard of Digital Media Centre, DIT Dated Mar. 14, 1995.
Exhibit 32, Data Creator Development Objectives for Wednesday Jun. 28.
Exhibit 32, Memo From Stuart Hamilton of 3D Scanners to Charlie Pritchard of Digital Media Centre, DIT Dated Mar. 14, 1995.
Exhibit 33, DTI Smart Competition Proposal, Data Creator, Flexible 3D Data Capture System, Dated Apr. 7, 1995.
Exhibit 33, Interfacing and Alignment of Data Creator to the Faro Arm Dated Aug. 8, 1995.
Exhibit 34, Data Creator Progress Meeting 1, Stephen Crampton, May 15, 1995.
Exhibit 34, Note From Dan Mikogami to Stephen Crampton (No Date).
Exhibit 35, Hand Held 3D Laser Scanner Announced by 3D Scanners Dated Aug. 7, 1995.
Exhibit 35, Slip Sheet, Creation Date: May 12, 1995 7:51:00 PM; Date Last Saved: May 12, 1995 12:57:00 PM; Name: STU13.DOC.
Exhibit 36, "Interfacing and Alignment of Data Creator to the Faro Arm", Dated Aug. 8, 1995.
Exhibit 36, Complaint and Demand for Jury Trial, Filed Jul. 11, 2008, Case 1:08-cv-11187-PBS.
Exhibit 37, 3D Scanners Memo to Greg Fraser From Stuart Hamilton Dated Aug. 18, 1995.
Exhibit 38, 3D Scanners Memo to Alan Sajadi From Stephen Crampton Dated Jun. 7, 1996.
Exhibit 39, Faro Technologies, Inc., Third Party Software, Interface Drivers/Add-Ons & Re-Seller Information (No Date).
Exhibit 39, Final Office Action for U.S. Appl. No. 09/000,215 Mailed Jul. 31, 2002.
Exhibit 4, Metris U.S.A., Inc. v. Faro Technologies, Inc. Memorandum and Order, Dated Oct. 22, 2009, Civil Action No. 08-11187-PBS.
Exhibit 40, Amendment After Final Filed Oct. 15, 2002 in Response to Final Office Action Mailed Jul. 31, 2002.
Exhibit 40, Memo From Stephen Crampton of 3D Scanners to Alan Sajadi of Faro Technologies Dated Mar. 11, 2009.
Exhibit 41, Industrial Faro Arm Bronze Series Liberated . . . CMM, 2-2-2 and 2-1-3 Configuration 1995, Faro Technologies, Inc.
Exhibit 41, Notice of Allowability for U.S. Appl. No. 09/000,215.
Exhibit 42, Industrial Faro Arm Silver Series Liberated . . . CMM 2-2-2 and 2-1-3 Configuration 1995, Faro Technologies, Inc.
Exhibit 43, Faro Bronze Triggering Circuit Settings PC 1.4.97.
Exhibit 44, Memo From Stephen Crampton of 3D Scanners to Alan Sajadi of Faro Technologies Dated Mar. 11, 2009.
Exhibit 45, Faro Laser Scanarm, the Measure of Success (No Date).
Exhibit 49, D.E. Whitney, et al., "Development and Control of an Automated Robotic Weld Bead Grinding System", Transactions of the ASME, vol. 112, Jun. 1990, pp. 166-176.
Exhibit 5, Notice of Allowability Dated Mar. 4, 2003 for U.S. Appl. No. 09/000,215.
Exhibit 56, Faro Technologies, Inc. Fax From Manny Bravo to Rupal Patel of 3D Scanners Dated Aug. 15, 2000.
Exhibit 57, E-Mail From Peter Champ to Stephen Crampton; Peter Champ; Phil Hand; and Dicken Smith Dated Mar. 4, 1999.
Exhibit 6, U.K. Patent Application No. GB 2 264 602 Published Sep. 1, 1993.
Exhibit 7, U.K. Patent Application No. GB 2 264 601 Published Sep. 1, 1993.
Exhibit 71, Faro Laser Scanarm V3, the Measure of Success (No Date).
Exhibit 75, Faro Technologies, Inc. Introduction to the Faroarm, (No Date).
Exhibit 79, Memo From Peter Champ to Simon Raab of Faro Technologies, Inc. Dated Mar. 11, 2009.
Exhibit 8, Combined Declaration and Power of Attorney in Patent Application for U.S. Appl. No. 09/000,215.
Exhibit 8, Industrial Faro Arm, Bronze Series, Liberated . . . CMM, 2-2-2 and 2-1-3 Configuration 1995, Faro Technologies, Inc.
Exhibit 80, Subpoena in a Civil Case, Case Number: 08CV11187 (PBS).
Exhibit 82, Christensen, O'Connor, Johnson Kindness Information on Kevan L. Morgan.
Exhibit 87, Amendment Filed Oct. 13, 2006, in U.S. Appl. No. 10/601,043.
Exhibit 88, Transmittal Letter Having a Date of Deposit of Jun. 20, 2003, for Patent Application Entitled Scanning Apparatus and Method, U.S. Appl. No. 10/601,043.
Exhibit 89, USPTO PTO-1556 Fee Record Sheet.
Exhibit 9, Industrial Faro Arm, Silver Series, Liberated . . . CMM, 2-2-2 and 2-1-3 Configuration 1995, Faro Technologies, Inc.
Exhibit 9, Transmittal Letter to the U.S. Designated/Elected Office (DO/EO/US), Concerning a Filing Under 35 U.S.C. 371 for U.S. Appl. No. 09/000,215.
Exhibit 90, USPTO Bib Data Sheet for U.S. Appl. No. 10/601,043.
Exhibit 91, Continuation Application of U.S. Appl. No. 09/000,215.
Exhibit 92, Combined Declaration and Power of Attorney of Stephen James Crampton Dated Mar. 19, 1998.
Exhibit 93, Preliminary Amendment, U.S. Appl. No. 10/601,043, Filed by Mail Nov. 12, 2003.
Exhibit 94, Amendment Transmittal Letter, U.S. Appl. No. 10/601,043, Filed Nov. 17, 2003.
Exhibit 95, Second Preliminary Amendment, U.S. Appl. No. 10/601,043, Filed Dec. 8, 2005.
Exhibit 96, Amendment and Request for Reconsideration, U.S. Appl. No. 09/000,215, Dated Dec. 14, 2000.
Exhibit 99, Re-Notice of Deposition of Samuel Shafner, Civil Action No. 08-CV-11187(PBS).
Exhibit A, Defendant Faro Technologies Inc.'s: Request for Additional Findings of Fact and Conclusions of Law that U.S. Patent No. 7,313,264 is Unenforceable; Case 1:08-cv-11187-PBS, Document 310-1, Filed May 20, 2011, pp. 1-27.
Exhibit A, Plaintiffs' Supplemental Memorandum of Law regarding the Impact of the Federal Circuit's Recent Decision in Therasense, Inc. v. Becton, Dickinson & Co. on Faro's Allegations of Inequitable Conduct; Case 1:08-cv-11187-PBS, Document 319-1, Filed Jun. 6, 2011, pp. 1-35.
Exhibit B, Defendant Faro Technologies Inc.'s: Memorandum of Law in Support of its Request for Additional Findings of Fact and Conclusions of Law that U.S. Patent No. 7,313,264 is Unenforceable; Case 1:08-cv-11187-PBS, Document 310-2, Filed May 20, 2011, pp. 1-24.
Expert Report of Thomas Kurfess Pursuant to Fed.R.Civ.P.26(A)(2)(B), Civil Action No. 08-CV-11187 (PBS).
Expert Report, having a signature date of Nov. 19, 1999, Marked as Page Nos. M0083650-M0083654 and M0083657-M0083670. English Translation provided, marked as pp. M0083650-M0083654 and M0083657-M0083670.
Expert Report, pp. 1-5, having a signature date of Nov. 19, 1999, Marked as Page Nos. M0083650-M0083654.
Facsimile from Dirk Esselens, Managing Director of 3D Imaging International Inc. to Mr. Stuart Hamilton of 3D Scanners Ltd. dated Mar. 31, 1995, Marked as Page No. M0082704.
Facsimile from Dirk Esselens, Managing Director of 3D Imaging International Inc. to Mr. Stuart Hamilton of 3D Scanners Ltd. dated Mar. 31, 1995, Marked as Page No. M0082705.
Facsimile from Mr. Lapouyade to Mr. Loisance dated Jun. 28, 1995, Marked as Page No. M0082774.
Facsimile from Vision 3D to Mr. Stephen Crampton of 3D Scanners Ltd. dated Jun. 20, 1991, Marked as Page No. M0083199.
Faro Technologies Inc., Theory of Operation, Faro Laser Scanner Version 2, Dated Mar. 9, 2005.
Faro Technologies Inc.'s Motion to File Documents under Seal; Case 1:08-cv-11187-PBS, Document 249, Filed Oct. 4, 2010, pp. 1-2.
Fax dated Dec. 7, 1990 from Mr. Michel Brunet of Vision 3D to Mr. Stephen Crampton of 3D Scanners Ltd., Marked as Page No. M0083285.
Fax dated Jun. 11, 1991 from Mr. Stephen Crampton of 3D Scanners Ltd. to Mr. Michel Brunet of Vision 3D, Marked as Page No. M0083202.
Fax dated Jun. 12, 1990 from Mr. Stephen Crampton to Mr. Michel Brunet of Vision 3D, Marked as Page No. M0083336.
Fax dated Jun. 20, 1991 from Stephen Crampton of 3D Scanners Ltd. to Mr. Brunet of Vision 3D, Marked as Page No. M0083200.
Fax dated May 20, 1991 from Mr. Stephen Crampton of 3D Scanners Ltd. to Mr. Michel Brunet of Vision 3D, Marked as Page No. M0083209.
Fax dated May 23, 1991 from Mr. Brunet of Vision 3D to Mr. Stephen Crampton of 3D Scanners Ltd., Marked as Page No. M0083208.
Fax dated May 23, 1991 from Mr. M. Brunet of Vision 3D to Mr. Stephen Crampton of 3D Scanners Ltd., Marked as Page No. M0083206.
Fax dated May 23, 1991 from Mr. Stephen Crampton of 3D Scanners Ltd. to Mr. Michel Brunet of Vision 3D, Marked as Page No. M0083207.
Fax from Naval Kapoor of 3D Technology, Inc., with attachment labeled "Laser Line Scan Sensor Specification Comparison" and a date of "Jan. 9, 1994 Rev A", Marked as Page Nos. M0082676-M0082679.
Fax Sheet dated Jan. 20, 1994 from Ed Vinarub of 3D Technology, Inc. to Naval Kapoor of 3D Scanners, Ltd. with attachment indicating "Ref. Nr.: wiring", Marked as Page Nos. M0082668-M0082675.
Fax Sheet dated Jan. 26, 1994 from Naval Kapoor to Mr. Peter Champ of 3D Scanners, Ltd., Marked as Page No. M0082667.
Findings of Fact, Conclusions of Law, and Order; Case 1:08-cv-11187-PBS, Document 309, Filed May 4, 2011, pp. 1-66.
Findings of Fact, Conclusions of Law, and Order; Case 1:08-cv-11187-PBS, Document 331, Filed Sep. 19, 2011, pp. 1-45.
Fisher, et al., "A Hand-Held Optical Surface Scanner for Environmental Modeling and Virtual Reality," Department of Artificial Intelligence, University of Edinburgh.
Fisher, R.B., et al., "A Hand-Held Optical Surface Scanner for Environmental Modeling and Virtual Reality," Proceedings of Virtual Reality World, Stuttgart, Germany, Feb. 1996.
Forward Thinking, total 6 pages, brochure from 3D Scanners, London, England.
French document labeled "Proces-Verbal De Saisie Contrefacon", (no date), Marked as Page Nos. M0082809-M0082812.
French document labeled "Requete A Fin De Saisie-Contrefacon", (no date), Marked as Page Nos. M0082813-M0082819.
French document labeled Requete A Fin De Saisie-Contrefacon (Brevit d'Invention), Marked as Page Nos. M0082793-M0082807, with pp. M0082800-M0082801 in English.
French language documents with first document labeled "Requete a Fin de Saisie-Contrefacon", Marked as Page Nos. M0083893-M0083908, with pp. M0083900-M0083901 being an English document labeled "Reversa, Reverse Engineering System".
French language legal documents with first page being a copy of a tab with handwritten notation indicating "Summons", dated Apr. 21, 1995, Marked as Page Nos. M0083886-MM0083892.
Imatronic information for LDM145 Compact Laser Diode Module (no date), Marked as Page Nos. M0082659-M0082661.
Imatronic information for LDM145 Compact Laser Diode Module (no date), Marked as Page Nos. M0082665-M0082666.
Individual Partner Information, Marked as Page No. M0082645.
Information labeled "3D Scanners Profil De La Societe" (no date), Marked as Page Nos. M0083803-M0083806.
Information labeled "3D Scanners Stripe Surface Digital Probe" (no date), Marked as Page Nos. M0083797-M0083799.
Information labeled "Surfa Flatness Sensing System" (no date), Marked as Page Nos. M0083800-M0083802.
Information on "3D Videolaser" (no date), written in both French and English, Marked as Page Nos. M0083215-M0083217.
Information on "Replica 3D Surface Digitising and NC Program Preparation System" (no date), Marked as Page Nos. M0083189-M0083190.
Information on "Replica, Reverse Engineering System", 3D Scanners (no date), in English with French translation, Marked as Page Nos. M0083779-M0083785.
Information on "Stripe 3D Surface Digitising Probe" (no date), Marked as Page Nos. M0083187-M0083188.
Information on "Surfa Flat Product Line Flatness Sensor" (no date), Marked as Page Nos. M0083191-M0083195.
Information on Great Britain Patent Application No. 2264602, Marked as Page Nos. M0083688-M0083698.
Information on Great Britain Patent Application No. 2264602, Marked as Page Nos. M0083688-M0083698. English Translation provided, marked as pp. M0083688-M0083698.
Information relating to various patent applications with Kreon Industrie listed as an applicant, Marked as Page Nos. M0083344-M0083355.
Invoice dated May 11, 1995 from 3D Scanners Ltd. to Kreon Industrie, Marked as Page No. M0083165.
John M. Fitts, "Moiré Sensor Technology and the CMM", Modern Applications News, Sep. 1994, pp. 38-39.
Joint Motion for Scheduling Order; Case 1:08-cv-11187-PBS, Documents 335 (2 pages), 335-1 (2 pages), 335-2 (1 page), Filed Dec. 15, 2011.
Joint Statement of Agreed Upon Claim Terms, Filed Nov. 9, 2009, Civil Action No. 08-CV-11187 (PBS).
Judgment of Oct. 8, 1991, Commercial Court of Toulouse, French document Marked as Page Nos. M0083838-M0083849.
Judgment of Oct. 8, 1991, Commercial Court of Toulouse, French document Marked as Page Nos. M0083839-M0083851. English Translation provided, marked as pp. M0083839-M0083851.
Kreon Color Brochure, "Mozart's Bust Reconstructed Thanks to the Kreon Industries Technology", Exhibit Raphael 141.
Kreon Handscan User's Manual, Revision 1.15, Software Version 1.5, May 1997.
Kreon Industries "Contactless 3-D Digitization, Mozart's Bust Reconstructed Thanks to the Kreon Industries Technology" (no date), Marked as Page Nos. M0083508-M0083515.
Kreon Industries "Technical Data", "Software: Kreon Handscan", "Kreon Reporter" (no date), Marked as Page Nos. M0083502-M0083507.
Kreon Industries "Technical Data", "Software: Kreon Handscan", "Kreon Reporter" (no date), Marked as Page Nos. M0083503-M0083505. English Translation provided, marked as pp. M0083503-M0083505.
Kreon Industries information for Mach'Pro dated Sep. 7, 1994, Marked as Page Nos. M0083158-M0083160.
Kreon Industries information for Mach'Pro dated Sep. 7, 1994, Marked as Page Nos. M0083158-M0083160. English Translation provided, marked as pp. M0083158-M0083160.
Kreon Industries, All You Need to Know about Kreon Reverse Engineering System, Apr. 1996, Cover page, Contents page, pp. 1-28, Marked as Page Nos. M0083516-M0083545.
Kreon Products Brochure, "Complete Package: KLS 171 Sensor (or KLS 151)", Exhibit Raphael 183.
Laser Focus World, Back to Basics: Optical Computing, total 4 pages, Dec. 1995.
Letter dated Apr. 19, 1991 from Mr. Stephen Crampton of 3D Scanners Ltd. to Mr. Michel Brunet of Vision 3D, Marked as Page No. M0083273.
Letter dated Apr. 20, 1993 from Mr. Lapouyade of Kreon Industries to 3D Scanners Ltd., Marked as Page Nos. M0083175-M0083176.
Letter dated Apr. 23, 1991 from Mr. Michel Brunet of Vision 3D to 3D Scanners Ltd., Marked as Page No. M0083214.
Letter dated Apr. 23, 1991 from Mr. Michel Brunet of Vision 3D to Mr. Stephen Crampton of 3D Scanners Ltd., Marked as Page Nos. M0083212-M0083213.
Letter dated Apr. 29, 1991 from Mr. Stephen Crampton of 3D Scanners Ltd. to Mr. Michel Brunet of Vision 3D, Marked as Page Nos. M0083210-M0083211.
Letter dated Apr. 29, 1991 from Mr. Stephen Crampton of 3D Scanners Ltd. to Mr. Michel Brunet of Vision 3D, Marked as Page Nos. M0083674-M0083675.
Letter dated Apr. 29, 1991 from Mr. Stephen Crampton of 3D Scanners Ltd. to Mr. Michel Brunet of Vision 3D, Marked as Page Nos. M0083674-M0083675. English Translation provided, marked as pp. M0083674-M0083675.
Letter dated Dec. 1, 1993 from Mr. Valade of Kreon Industrie to 3D Scanners Ltd., Marked as Page Nos. M0083170-M0083171.
Letter dated Dec. 7, 1990 from Mr. Michel Brunet of Vision 3D to the attention of Mr. Stephen Crampton of 3D Scanners Ltd., with attachment "3D Digitizer, 3D Videolaser, Description Notice", Marked as Page Nos. M0083286-M0083335.
Letter dated Dec. 7, 1993 to Mr. Valade of Kreon Industrie from Mr. Stephen Crampton, Marked as Page No. M0083169.
Letter dated Feb. 17, 1994 from Naval Kapoor of 3D Technology, Inc. to Mr. Peter Champ of 3D Scanners Ltd., Marked as Page No. M0082685.
Letter dated Feb. 18th (no year), from Mr. Valade of Kreon Industrie to Mr. Stephen Crampton of 3D Scanners Ltd., Marked as Page No. M0083166.
Letter dated Feb. 29, 1991 from Mr. Stephen Crampton of 3D Scanners Ltd. to Mr. Michel Brunet of Vision 3D, Marked as Page Nos. M0083274-M0083275.
Letter dated Jan. 17, 1994 from Naval Kapoor of 3D Technology, Inc. to Stephen Crampton of 3D Scanners Ltd., Marked as Page No. M0082658.
Letter dated Jan. 17, 1996 from Mr. Denis Monegier du Sorbier of Clery, De La Myre Mory & Monegier du Sorbier to Mr. Stephen Crampton of 3D Scanners, Marked as Page Nos. M0082877-M0082879.
Letter dated Jan. 2, 1991 from Stephen Crampton of 3D Scanners Ltd. to Mr. Michel Brunet of Vision 3D, Marked as Page Nos. M0083283-M0083284.
Letter dated Jul. 22, 1991 from Mr. Brunet to Mr. Crampton, Marked as Page Nos. M0083817-M0083819.
Letter dated Jul. 22, 1991 from Mr. Michel Brunet of Vision 3D to Mr. Stephen Crampton of 3D Scanners Ltd., Marked as Page No. M0083179.
Letter dated Jul. 30, 1991 from Mr. Stephen Crampton of 3D Scanners Ltd. to Mr. Michel Brunet of Vision 3D, Marked as Page Nos. M0083834-M0083837.
Letter dated Jul. 30, 1991 from Stephen Crampton to Mr. M. Michel Brunet of Vision 3D, Marked as Page No. M0083178.
Letter dated Jul. 4, 1991 from Mr. Brunet to Mr. Crampton, Marked as Page Nos. M0083820-M0083823.
Letter dated Jul. 9, 1991 from Mr. Stephen Crampton of 3D Scanners Ltd. to Mr. Michel Brunet of Vision 3D, Marked as Page No. M0083180.
Letter dated Jun. 11, 1991 from Mr. Stephen Crampton of 3D Scanners Ltd. to Mr. Michel Brunet of Vision 3D, Marked as Page No. M0083201.
Letter dated Jun. 11, 1991 from Mr. Stephen Crampton of 3D Scanners Ltd. to Mr. Michel Brunet of Vision 3D, Marked as Page Nos. M0083671-M0083673.
Letter dated Jun. 11, 1991 from Mr. Stephen Crampton of 3D Scanners Ltd. to Mr. Michel Brunet of Vision 3D, Marked as Page Nos. M0083671-M0083673. English Translation provided, marked as pp. M0083672-M0083673.
Letter dated Mar. 21, 1995, Letterhead shows Michel D. Cabour, Joelle Girod-Chataignier; typed address of Societe Multistation; "Affaire: Kreon Industries C/Multistation" [Re: Kreon Industries v. Multistation], "Dossier: 50301632" [File: 50301632], Marked as Page No. M0082646.
Letter dated Mar. 9, 1994 from Naval Kapoor of 3D Technology, Inc. to Mr. Peter Champ of 3D Scanners Ltd. with attachments "CNC Driver Specification" and "Driver Status Report.", Marked as Page Nos. M0082680-M0082684.
Letter dated May 10, 1995 from Mr. Stephen Crampton of 3D Scanners Ltd. to Mr. P. Lapouyade of Kreon Industries, Marked as Page Nos. M0083162-M0083164.
Letter dated May 11, 1993 from Mr. Stephen Crampton to Mr. Lapouyade of Kreon, Marked as Page No. M0083174.
Letter dated May 24, 1995 from Mr. Lapouyade of Kreon Industrie to 3D Scanners Ltd., Marked as Page No. M0083161.
Letter dated Oct. 25, 1991 from Stephen Crampton of 3D Scanners Ltd. to Mr. Cousseau of Kreon Industrie, Marked as Page No. M0083177.
Letter dated Oct. 28, 1988 from Mr. Michel Brunet of Vision 3D to the University College London, Dept. of Medical Physics, Marked as Page No. M0083621.
Letter from Linklaters & Alliance to Mr. Peter Champ of 3D Scanners, dated Dec. 12, 2000, "Affaire: 3D Scanners c/ Kreon" [Re: 3D Scanners v. Kreon], Marked as Page No. M0082611.
Letter from Mr. Michel Brunet of Vision 3D to Mr. Stephen Crampton of 3D Scanners Ltd. dated Jul. 14, 1991, Marked as Page Nos. M0083972-M0083975.
Letter from William J. Cass to The Honorable Patti B. Saris, dated Jul. 6, 2011, Case 1:08-cv-11187-PBS, p. 1.
Letter from William J. Cass, Case 1:08-cv-11187-PBS, Document 282, Filed Oct. 29, 2010, p. 1.
Letter from Zachary R. Gates to the Clerk, dated Nov. 2, 2010, p. 1.
Letter from Zachary R. Gates to the Honorable Patti B. Saris, dated Feb. 22, 2011, Case 1:08-cv-11187-PBS, Document 308, Filed Feb. 22, 2011, p. 1.
Letter from Zachary R. Gates to the Honorable Patti B. Saris, dated Nov. 2, 2011, p. 1.
Letter of Intent between Vision 3D and 3D Scanners Limited (no date), Marked as Page Nos. M0083196-M0083198.
Letter of Intent between Vision 3D and 3D Scanners Ltd. signed and dated Jul. 22, 1991 and Jul. 30, 1991, Marked as Page Nos. M0083340-M0083343.
Letter of Intent signed and dated Jul. 22, 1991 and Jul. 30, 1991, Marked as Page Nos. M0083824-M0083833.
Lettre De Souscription [subscription letter], having signature dates of Jul. 22, 1991, and Jul. 30, 1991, Marked as Page Nos. M0083699-M0083704.
Lettre De Souscription [subscription letter], having signature dates of Jul. 22, 1991, and Jul. 30, 1991, Marked as Page Nos. M0083699-M0083704. English Translation provided, marked as pp. M0083699-M0083704.
Levoy, "Polygon-Assisted JPEG and MPEG Compression of Synthetic Images", Computer Science Department, Stanford University.
List of Non-Confidential Documents Filed with Court post Aug. 18, 2010, pp. 1-3.
Memorandum and Order, Filed Oct. 22, 2009, Civil Action No. 08-CV-11187 (PBS).
Metris U.S.A., Inc., et al., v. Faro Technologies, Inc., CA No. 08-11187-PBS, Judgment in a Civil Case, dated Aug. 13, 2012.
Metris U.S.A., Inc., et al., v. Faro Technologies, Inc., CA No. 08-11187-PBS, Transcript from Jury Trial—Day Eight, Aug. 8, 2012.
Metris U.S.A., Inc., et al., v. Faro Technologies, Inc., CA No. 08-11187-PBS, Transcript from Jury Trial—Day Five, Aug. 3, 2012.
Metris U.S.A., Inc., et al., v. Faro Technologies, Inc., CA No. 08-11187-PBS, Transcript from Jury Trial—Day Four, Aug. 2, 2012.
Metris U.S.A., Inc., et al., v. Faro Technologies, Inc., CA No. 08-11187-PBS, Transcript from Jury Trial—Day Nine, Aug. 9, 2012.
Metris U.S.A., Inc., et al., v. Faro Technologies, Inc., CA No. 08-11187-PBS, Transcript from Jury Trial—Day One, Jul. 30, 2012.
Metris U.S.A., Inc., et al., v. Faro Technologies, Inc., CA No. 08-11187-PBS, Transcript from Jury Trial—Day Seven, Aug. 7, 2012.
Metris U.S.A., Inc., et al., v. Faro Technologies, Inc., CA No. 08-11187-PBS, Transcript from Jury Trial—Day Six, Aug. 6, 2012.
Metris U.S.A., Inc., et al., v. Faro Technologies, Inc., CA No. 08-11187-PBS, Transcript from Jury Trial—Day Ten, Aug. 10, 2012.
Metris U.S.A., Inc., et al., v. Faro Technologies, Inc., CA No. 08-11187-PBS, Transcript from Jury Trial—Day Three, Aug. 1, 2012.
Metris U.S.A., Inc., et al., v. Faro Technologies, Inc., CA No. 08-11187-PBS, Transcript from Jury Trial—Day Two, Jul. 31, 2012.
Metris U.S.A., Inc., et al., v. Faro Technologies, Inc., CA No. 08-11187-PBS, Verdict dated Aug. 10, 2012.
Motion to File under Seal Plaintiffs' Proposed Findings of Fact, Conclusions of Law, and Supporting Exhibits regarding Faro's Inequitable Conduct Affirmative Defense and Counterclaim; Case 1:08-cv-11187-PBS, Document 293, Filed Nov. 19, 2010, pp. 1-4.
Motion to File under Seal Plaintiffs' Post-Trial Brief and Response to Defendant Faro Technologies, Inc.'s Request for Findings of Fact and Conclusions of Law Concerning Faro's Inequitable Conduct Affirmative Defense and Counterclaim; Case 1:08-cv-11187-PBS, Document 300, Filed Dec. 10, 2010, pp. 1-4.
Motion to File under Seal the Counterclaim Defendants' Reply Brief to their Motion for Summary Judgment as to Faro's Antitrust and Unfair Competition Counterclaims (Counterclaim Counts VII-IX); Case 1:08-cv-11187-PBS, Document 283, Filed Nov. 2, 2010, pp. 1-4.
Notice of Manual Filing; Case 1:08-cv-11187-PBS, Document 236, Filed Aug. 27, 2010, p. 1.
Notice of Manual Filing; Case 1:08-cv-11187-PBS, Document 250, Filed Oct. 4, 2010, p. 1.
Notice of Manual Filing; Case 1:08-cv-11187-PBS, Document 284, Filed Nov. 2, 2010, p. 1.
Notice of Manual Filing; Case 1:08-cv-11187-PBS, Document 295, Filed Nov. 19, 2010, p. 1.
Notice of Manual Filing; Case 1:08-cv-11187-PBS, Document 301, Filed Dec. 10, 2010, p. 1.
Notice of Manual Filing; Case 1:08-cv-11187-PBS, Document 327, Filed Jul. 6, 2011, p. 1.
Notice of Manual Filing; Case 1:08-cv-11187-PBS, Document 330, Filed Jul. 6, 2011, p. 1.
Notice of Reasons for Rejection drafted Dec. 15, 2006, for Japanese Patent Application No. H09-507376.
Order dated Mar. 16, 1996, Marked as Page Nos. M0083733-M0083735.
Order to Replace the Expert, Judgment of Feb. 5, 1999, Marked as Page Nos. M0083657-M0083664.
Order; Case 1:08-cv-11187-PBS, Document 333, Filed Oct. 17, 2011, p. 1.
Order; Case 1:08-cv-11187-PBS, Document 334, Filed Dec. 2, 2011, p. 1.
Parts List, 2 pages, (no date), Marked as Page Nos. M0082690-M0082691.
Paul J. Besl, "Active, Optical Range Imaging Sensors", Machine Vision and Applications (1988), Chapter 1, pp. 127-152, Marked as Page Nos. M0082725-M0082750.
Persona, 3D Human Form Scanner, total 2 pages, brochure from 3D Scanners Ltd., London, England.
Philips information for Camera Module Range VC31 (no date), Marked as Page Nos. M0082662-M0082664.
Plain Sheet of Paper with handwritten notation of "3D Technology", Marked as Page No. M0082657.
Plain Sheets of Paper with handwritten notations on right side of paper, Marked as Page Nos. M0083113-M0083116.
Plaintiff-In-Counterclaim Faro Technologies, Inc.'s Opposition to: Counterclaim Defendants' Motion for Summary Judgment as to Faro's Antitrust and Unfair Competition Counterclaims (Counterclaim Counts VII-IX); Case 1:08-cv-11187-PBS, 27 pages.
Plaintiffs' Answer and Affirmative Defenses to Defendant Faro Technologies Incorporated's First Amended Counterclaim, Filed Sep. 21, 2009.
Plaintiffs' Memorandum of Law in Response to Defendant Faro's Request for Additional Findings of Fact and Conclusions of Law that U.S. Patent No. 7,313,264 is Unenforceable; Case 1:08-cv-11187-PBS, Document 316, Filed Jun. 6, 2011, pp. 1-26.
Plaintiffs' Motion and Memorandum of Law for Leave to File Memorandum in Excess of Twenty Pages; Case 1:08-cv-11187-PBS, Document 319, Filed Jun. 9, 2011, pp. 1-5.
Plaintiffs' Motion for Leave to File Plaintiffs' Supplemental Brief regarding Faro's Inequitable Conduct Affirmative Defense and Counterclaim in Light of New Controlling Precedent; Case 1:08-cv-11187-PBS, Document 305, Filed Jan. 19, 2011, pp. 1-3.
Plaintiffs' Objections to Defendant's Evidence in Support of its Motion for Partial Summary Judgment That the Patents-in-Suit are Unenforceable Due to Inequitable Conduct During Patent Prosecution, Civil Action No. 08-CV-11187 (PBS), Filed Mar. 8, 2010.
Plaintiffs' Opposition to Defendant's Motion for Partial Summary Judgment That the Patents-in-Suit are Unenforceable Due to Inequitable Conduct During Patent Prosecution, Civil Action No. 08-CV-11187 (PBS), Filed Mar. 8, 2010.
Plaintiffs' Preliminary Claim Construction Brief, Filed Aug. 13, 2009, Civil Action No. 08-CV-11187 (PBS).
Plaintiffs' Proposed Additional Findings of Fact, Conclusions of Law, and Application of the Law to the Facts regarding Faro's Inequitable Conduct Affirmative Defense and Counterclaim; Case 1:08-cv-11187-PBS, Document 317, Filed Jun. 6, 2011, pp. 1-35.
Plaintiffs' Renewed Opposition to Faro's Renewed Motion for Summary Judgment of Non-Infringement; That this is an Exceptional Case; and that Faro be Awarded its Attorneys' Fees; Case 1:08-cv-11187-PBS, Document 325, Filed Jun. 22, 2011, pp. 1-4.
Plaintiffs' Reply to Faro's Response to Plaintiffs' Additional Findings of Fact and Conclusions of Law regarding Faro's Inequitable Conduct Affirmative Defense and Counterclaim; Case 1:08-cv-11187-PBS, Documents 326 (17 pages), 326-1 (1 page), Filed Jun. 24, 2011, pp. 1-17.
Plaintiffs' Response to Defendant Faro Technologies Inc.'s Request for Additional Findings of Fact and Conclusions of Law on its Claim that U.S. Patent No. 7,313,264 is Unenforceable; Case 1:08-cv-11187-PBS, Documents 315 (65 pages) and 315-1 (2 pages), Filed Jun. 6, 2011.
Plaintiffs' Response to Defendant's Claim Construction Brief, Filed Sep. 14, 2009, Civil Action No. 08-CV-11187 (PBS).
Plaintiffs' Response to Faro's Motion to Seal the Hearing on Faro's Motion for Summary Judgment of Non-Infringement and Request to Make this Case Exceptional; Case 1:08-cv-11187-PBS, Document 323, Filed Jun. 16, 2011, pp. 1-5.
Plaintiff's Statement of Disputed Facts in Opposition to Faro's Motion for Partial Summary Judgment That the Patents-in-Suit are Unenforceable Due to Inequitable Conduct During Patent Prosecution, Civil Action No. 08-CV-11187 (PBS), Filed Mar. 8, 2010.
Plaintiffs' Supplemental Brief regarding Faro's Inequitable Conduct Affirmative Defense and Counterclaim in Light of New Controlling Precedent; Case 1:08-cv-11187-PBS, Documents 305-1 (5 pages) and 305-2 (13 pages), Filed Jan. 19, 2011, pp. 1-5.
Plaintiffs' Supplemental Memorandum of Law regarding the Impact of the Federal Circuit's Recent Decision in Therasense, Inc. v. Becton, Dickinson & Co. on Faro's Allegations of Inequitable Conduct; Case 1:08-cv-11187-PBS, Document 320, Filed Jun. 10, 2011, pp. 1-34.
Plaintiff's Surreply in Further Support of Its Opposition to Defendant's Motion for Partial Summary Judgment That the Patents-in-Suit Are Unenforceable Due to Inequitable Conduct During Patent Prosecution, Civil Action No. 08-CV-11187 (PBS), Filed May 20, 2010.
Plaintiff's Trial Brief for Bench Trial on Inequitable Conduct, Civil Action No. 08-CV-11187 (PBS), Filed Aug. 13, 2010.
Plaintiffs Metris U.S.A., Inc., Metris N.V., Metris IPR N.V. and 3D Scanners Ltd. Answer to Defendant Faro Technologies Incorporated's Counterclaim, Filed Dec. 26, 2008.
Postal date stamp, dated May 15, 1998, Marked as Page No. M0083670.
Postal receipt dated Dec. 7, 1993 of mail sent to Mr. Valade of Kreon Industries, Marked as Page Nos. M0083167-M0083168.
Project Meeting Agenda dated Aug. 27, 2003, Marked as Page Nos. M0083977-M0083978.
Project Meeting Minutes dated Aug. 27, 2003, Marked as Page Nos. M0083979-M0083981.
Prospectus Kreon Industries 3D Videolaser (no date), Marked as Page Nos. M0083807-M0083811.
Prospectus Kreon KL 50-A Laser Sensor (no date), Marked as Page Nos. M0083814-M0083816.
Replica, Reverse Engineering System, total 4 pages, brochure from 3D Scanners Ltd., London, England.
Reply Memorandum in Support of Defendant Faro's Motion for Partial Summary Judgment That the Patents-in-Suit are Unenforceable Due to Inequitable Conduct During Patent Prosecution, Civil Action No. 08-CV-11187 (PBS).
Request for Authorization to Perform a Seizure for Infringement dated Mar. 14, 1995, Marked as Page Nos. M0083730-M0083732.
Reversa, Reverse Engineering System, total 2 pages, brochure from 3D Scanners Ltd., London, England.
Romer Report, Jan./Feb. 1995, 4 pages.
Romer Report, Jul./Aug. 1995, 4 pages.
Romer Report, Mar./Apr. 1995, 4 pages.
Romer Report, May/Jun. 1995, 4 pages.
Sakaguchi, et al., "Acquisition of Entire Surface Data Based on Fusion of Range Data," IEICE Transactions, vol. E 74, No. 10 (1991).
Sakaguchi, Y., et al., "Acquisition of Entire Surface Data Based on Fusion of Range Data," IEICE Transactions E74(10):3417-3421, Oct. 1991.
Scheduling Order; Case 1:08-cv-11187-PBS, Document 336, Filed Dec. 18, 2011, pp. 1-2.
Second Original Report on Seizure for Infringement dated Mar. 17, 1995, Marked as Page Nos. M0083739-M0083742.
Service of an order dated Mar. 17, 1995, Marked as Page Nos. M0083736-M0083737.
Service of the Act (no date), Marked as Page No. M0083738.
Statement of Material Facts for which there is no Genuine Issue to be Tried in Support of Counterclaim Defendants' Motion for Summary Judgment as to Faro's Antitrust and Unfair Competition Counterclaims (Counterclaim Counts VII-IX); Case 1:08-cv-11187-PBS, Document 247, Filed Sep. 10, 2010, pp. 1-10.
Stephen's Reverse Engineering Comments (no date), Marked as Page No. M0082689.
Summons to Appear at the Tribunal De Grande Instance of Paris dated Apr. 21, 1995, Marked as Page Nos. M0083723-M0083729.
Summons to Appear at the Tribunal De Grande Instance of Paris, dated Apr. 21, 1995, French language Marked as Page Nos. M0083850-M0083865, partial English translation Marked as Page Nos. M0083866-M0083872.
SupraNews, Nov./Dec. 1992, 4 pages.
SupraNews, Feb./Mar. 1993, 4 pages.
SupraNews, Jan./Feb. 1994, 4 pages.
SupraNews, Mar./Apr. 1994, 4 pages.
SupraNews, May/Jun. 1993, 4 pages.
SupraNews, May/Jun. 1994, 4 pages.
SupraNews, Nov./Dec. 1993, 4 pages.
SupraNews, Nov./Dec. 1994, 4 pages.
SupraNews, Sep./Oct. 1993, 4 pages.
SupraNews, Sep./Oct. 1994, 4 pages.
System Label Reference Sheet, Title: ModelMaker Y Labels, dated Jun. 18, 2004, Marked as Page No. M0083976.
System Label Reference Sheet, Title: ModelMaker Z Labels, dated Mar. 22, 2004, Marked as Page Nos. M0083982-M0083988.
Terry T. Wohlers, "3D Digitizers", Computer Graphics World, Jul. 1992, pp. 73-77.
Terry T. Wohlers, "Reverse Engineering Systems From Product to CAD and Back Again," Cadence, Jan. 1993, pp. 45, 46, 48, 50, 52, 54, 56, 57.
The CNC Institute Inc. "CNC Driver Specification by Mark Knobloch," dated Jan. 31, 1994, Marked as Page Nos. M0082686-M0082687.
Title page, 3D Machining Training Guide, Chapter 7, Case Studies (no date), Marked as Page No. M0082909.
Turk, et al., "Zippered Polygon Meshes From Range Images", Computer Science Department, Stanford University.
United States District Court, District of Massachusetts, Exhibit List; Case 1:08-cv-11187-PBS, Document 291, Filed Nov. 19, 2010, pp. 1-6.
United States District Court, District of Massachusetts, Exhibit List; Case 1:08-cv-11187-PBS, Document 294, Filed Nov. 19, 2010, pp. 1-2.
United States District Court, District of Massachusetts; Notice Transcript Redaction Policy, Case 1:08-cv-11187-PBS, Document 242, Filed Sep. 2, 2010, pp. 1-8.
United States District Court, District of Massachusetts; Notice Transcript Redaction Policy, Case 1:08-cv-11187-PBS, Document 281, Filed Oct. 28, 2010, pp. 1-8.
United States District Court, District of Massachusetts; Notice Transcript Redaction Policy, Case 1:08-cv-11187-PBS, Document 290, Filed Nov. 10, 2010, pp. 1-8.
United States District Court, District of Massachusetts; Notice Transcript Redaction Policy, Case 1:08-cv-11187-PBS, Document 329, Filed Jul. 6, 2011, pp. 1-8.
United States District Court, District of Massachusetts; Standing Procedural Order Re: Sealing Court Documents, Case 1:08-cv-11187-PBS, Document 239, Filed Aug. 30, 2010, pp. 1-2.
United States District Court, District of Massachusetts; Standing Procedural Order Re: Sealing Court Documents, Case 1:08-cv-11187-PBS, Document 285, Filed Nov. 3, 2010, pp. 1-2.
United States District Court, District of Massachusetts; Standing Procedural Order Re: Sealing Court Documents, Case 1:08-cv-11187-PBS, Document 296, Filed Nov. 23, 2010, pp. 1-2.
United States District Court, District of Massachusetts; Standing Procedural Order Re: Sealing Court Documents, Case 1:08-cv-11187-PBS, Document 302, Filed Dec. 13, 2010, pp. 1-2.
Using Digibot to Digitize and Model Subjects for Computer Imaging of Gross Anatomy at Colorado State University, Jul. 1989, 1 Page (Title Page).
Various copies of date stamped mail, Marked as Page Nos. M0083337-M0083339.
Various French language documents, Marked as Page Nos. M0083930-M0083971.
Various Graphs, with first page having a date of Sep. 19, 1997, Marked as Page Nos. M0083117-M0083157.
Various photographs (no date), Marked as Page Nos. M0083743-M0083752.
Various photographs and images (no date), Marked as Page Nos. M0083909-M0083929.
Various Pictures, and blank pages with handwritten numbers, (no date), Marked as Page Nos. M0083546-M0083601.
Vision 3D Business Plan 1991-1993 (no date), Marked as Page Nos. M0083218-M0083241. English Translation provided, marked as pp. M0083218-M0083241.
Vision 3D Business Plan 1991-1993 (no date), Marked as Page Nos. M0083218-M0083272, with Dataquest "Research Newsletter" on pp. M0083242-M0083245 and "Authorized Distributor, Integrator, Standard Contract" on M0083248-M0083271 in English.
Vision 3D document labeled "Potential Partners", addressed to 3D Scanners Ltd., dated Jan. 10, 1991, Marked as Page Nos. M0083276-M0083282.
Vision 3D, Information on "3D Videolaser 1/300/Head" (no date), written in French and English, Marked as Page Nos. M0083628-M0083631.
Vision 3D, Information on "3D Videolaser" (no date), written in French and English, Marked as Page Nos. M0083622-M0083627.
Vision 3D, Information on "3D Videolaser™ Sensor", (no date), Marked as Page Nos. M0083602-M0083617.
Wolf & Beck, "Optoelectronic 3D-Trigger Probe OTS5-LD".

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8576411B2 (en) * 2010-11-10 2013-11-05 Yazaki Corporation Component position measurement method
US20130200139A1 (en) * 2012-02-06 2013-08-08 Oracle International Corporation Topographic spot scanning for a storage library
US8596525B2 (en) * 2012-02-06 2013-12-03 Oracle International Corporation Topographic spot scanning for a storage library
US20130221086A1 (en) * 2012-02-29 2013-08-29 Oracle International Corporation Contrast spot scanning for a storage library
US8613386B2 (en) * 2012-02-29 2013-12-24 Oracle International Corporation Contrast spot scanning for a storage library
US20130266228A1 (en) * 2012-04-10 2013-10-10 Siemens Industry, Inc. Automatic part identification and workflow generation
US10089415B2 (en) 2013-12-19 2018-10-02 Faro Technologies, Inc. Three-dimensional coordinate scanner and method of operation
US10122997B1 (en) 2017-05-03 2018-11-06 Lowe's Companies, Inc. Automated matrix photo framing using range camera input

Also Published As

Publication number Publication date
AU6626896A (en) 1997-02-26
US6611617B1 (en) 2003-08-26
EP1160539A1 (en) 2001-12-05
EP1160539B1 (en) 2003-07-02
EP0840880B1 (en) 2002-03-13
EP0840880A1 (en) 1998-05-13
US20030231793A1 (en) 2003-12-18
US7313264B2 (en) 2007-12-25
DE69628956T2 (en) 2004-05-27
JPH11509928A (en) 1999-08-31
DE69619826D1 (en) 2002-04-18
DE69628956D1 (en) 2003-08-07
WO1997005449A1 (en) 1997-02-13
GB9515311D0 (en) 1995-09-20
EP1355127A3 (en) 2003-11-19
EP1355127A2 (en) 2003-10-22

Similar Documents

Publication Publication Date Title
USRE43895E1 (en) Scanning apparatus and method
US10401143B2 (en) Method for optically measuring three-dimensional coordinates and controlling a three-dimensional measuring device
US7336375B1 (en) Wireless methods and systems for three-dimensional non-contact shape sensing
Gühring Dense 3D surface acquisition by structured light using off-the-shelf components
EP0805948B1 (en) Scanning arrangement and method
US7747067B2 (en) System and method for three dimensional modeling
KR100407436B1 (en) Method and apparatus for capturing stereoscopic images using image sensors
EP2183544B1 (en) Non-contact measurement apparatus and method
US7271377B2 (en) Calibration ring for developing and aligning view dependent image maps with 3-D surface data
JP2003532062A (en) Combined stereoscopic, color 3D digitization and motion capture system
JP2009078133A (en) Device for determining 3d coordinates of object, in particular of tooth
WO2016040271A1 (en) Method for optically measuring three-dimensional coordinates and controlling a three-dimensional measuring device
US11727635B2 (en) Hybrid photogrammetry
EP1840507B1 (en) Method and integrated system for the digital survey of three-dimensional environments.
Fisher et al. A hand-held optical surface scanner for environmental modeling and virtual reality
Lee et al. Modeling real objects using video see-through augmented reality
WO2000021034A1 (en) Method and apparatus for the scanning of spatial objects and to create a three dimensional computer model
JPH0843044A (en) Measuring apparatus for three dimensional coordinate
Scopigno et al. Tutorial T1: 3D data acquisition
JPH0626828A (en) Apparatus and method for measuring shape
Knyaz Accurate photorealistic texture mapping for metric 3D models
Katwal The State of the Art of Range Imaging as of 2004
Clark et al. Depth Sensing by Variable Baseline Triangulation.
Malz Three-dimensional sensors for high-performance surface measurement in reverse engineering
Sansoni et al. Projection of Structured Light

Legal Events

Date Code Title Description
FPAY Fee payment

Year of fee payment: 8

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY