CN105190232A - Three-dimensional coordinate scanner and method of operation - Google Patents


Info

Publication number
CN105190232A
CN105190232A (application CN201480015935.5A)
Authority
CN
China
Prior art keywords
light
camera head
projector
pattern
scanner
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201480015935.5A
Other languages
Chinese (zh)
Inventor
Bernd-Dietmar Becker
Robert E. Bridges
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Faro Technologies Inc
Original Assignee
Faro Technologies Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 13/932,267 (granted as US 9,482,529 B2)
Application filed by Faro Technologies Inc filed Critical Faro Technologies Inc
Publication of CN105190232A

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B 11/25: Measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes, on the object
    • G01B 11/2513: with several lines being projected in more than one direction, e.g. grids, patterns
    • G01B 11/245: Measuring contours or curvatures using a plurality of fixed, simultaneously operating transducers
    • G01B 21/00: Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant
    • G01B 21/02: for measuring length, width, or thickness
    • G01B 21/04: by measuring coordinates of points
    • G01B 21/045: Correction of measurements
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/66: Tracking systems using electromagnetic waves other than radio waves

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A noncontact optical three-dimensional measuring device includes a first projector, a first camera, a second projector, and a second camera, together with a processor electrically coupled to all four. Computer-readable media, when executed by the processor, cause a first digital signal to be collected at a first time and a second digital signal to be collected at a second time different from the first time. The processor determines three-dimensional coordinates of a first point on the surface based at least in part on the first digital signal and a first distance, and determines three-dimensional coordinates of a second point on the surface based at least in part on the second digital signal and a second distance.

Description

Three-dimensional coordinate scanner and method of operation
Background
The subject matter disclosed herein relates to a three-dimensional coordinate scanner, and in particular to a triangulation-type scanner having multiple modes of data acquisition.
The acquisition of three-dimensional coordinates of an object or an environment is known. Various techniques may be used, such as time-of-flight or triangulation methods. A time-of-flight system, such as a laser tracker, total station, or time-of-flight scanner, directs a beam of light, such as a laser beam, toward a retroreflector target or a spot on the surface of an object. An absolute distance meter determines the distance to the target or spot based on the length of time the light takes to travel to the target or spot and return. By moving the laser beam or the target over the surface of the object, the coordinates of the object may be determined. Time-of-flight systems have the advantage of relatively high accuracy, but in some cases they may be slower than some other systems because they must usually measure each point on the surface individually.
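As a back-of-the-envelope illustration of the time-of-flight principle described above (generic physics, not a detail of any particular instrument), the measured round-trip travel time maps to a one-way distance through the speed of light, which also shows why picosecond-level timing is needed for millimeter accuracy:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_seconds):
    """One-way distance from the measured round-trip travel time."""
    return C * round_trip_seconds / 2.0

# A 66.7 ns round trip corresponds to roughly 10 m of standoff;
# resolving 1 mm requires timing to about 6.7 picoseconds.
d = tof_distance(66.7e-9)
dt_for_1mm = 2 * 0.001 / C
```

The tight timing requirement is one reason absolute distance meters use dedicated high-bandwidth electronics rather than ordinary camera sensors.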
In contrast, a scanner that uses triangulation to measure three-dimensional coordinates projects onto a surface either a pattern of light formed in a line (for example, a laser line from a laser line probe) or a pattern of light covering an area (for example, structured light). A camera is coupled to a projector in a fixed relationship, for instance by attaching both to a common frame. Light emitted from the projector is reflected off the surface and detected by the camera. Since the camera and projector are arranged in a fixed relationship, the distance to the object may be determined using trigonometric principles. Compared to coordinate measurement devices that use tactile probes, triangulation systems offer the advantage of acquiring coordinate data over a large area quickly. As used herein, the resulting collection of three-dimensional coordinate values provided by a triangulation system is referred to as point cloud data, or simply a point cloud.
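The trigonometric principle above can be sketched numerically. Assuming the baseline between the projector and the camera is known, along with the angle each line of sight makes with that baseline, the perpendicular distance to the surface point follows from the law of sines. This is a generic illustration of triangulation, not the specific computation performed by the scanner described here:

```python
import math

def triangulate(baseline, proj_angle, cam_angle):
    """Perpendicular distance from the baseline to a surface point,
    given the baseline length and the angles (radians) that the
    projector ray and camera ray make with the baseline."""
    # The third angle of the triangle is pi - a - b, and
    # sin(pi - a - b) == sin(a + b).
    return (baseline * math.sin(proj_angle) * math.sin(cam_angle)
            / math.sin(proj_angle + cam_angle))

# Example: 100 mm baseline, both rays at 60 degrees to the baseline
# (an equilateral triangle, so the height is 100 * sqrt(3) / 2).
d = triangulate(100.0, math.radians(60), math.radians(60))
```

Note how the result depends on the ratio of range to baseline: as the angles approach each other at long range, small angular errors produce large distance errors, which is why baseline length matters in triangulation scanner design.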
A number of issues may interfere with the acquisition of high-accuracy point cloud data when laser scanners are used. These include, but are not limited to: variations in the level of light received at the camera image plane as a result of variations in the reflectance of the object surface or in the angle of incidence of the projected light on the surface; low resolution near edges, such as the edges of holes; and multipath interference. In some cases, the operator may be unaware of a problem or may be unable to eliminate it. In these cases, point cloud data is missing or erroneous.
Accordingly, while existing scanners are suitable for their intended purposes, there remains a need for improvement, particularly in providing a scanner that can adapt to undesirable conditions and provide improved data point acquisition.
Summary of the invention
According to one aspect of the invention, a non-contact optical three-dimensional measuring device is provided. The device includes an assembly comprising a projector, a first camera, and a second camera, the projector, first camera, and second camera being fixed in relation to one another, with a first distance between the projector and the first camera and a second distance between the projector and the second camera. The projector has a light source and is configured to emit onto the surface of an object a first light having any one of a plurality of spatially varying patterns. The first camera has a first lens and a first photosensitive array, is configured to receive a first portion of the first light reflected from the surface and to produce a first digital signal in response, and has a first field of view, the first field of view being a first angular viewing region of the first camera. The second camera has a second lens and a second photosensitive array, is configured to receive a second portion of the first light reflected from the surface and to produce a second digital signal in response, and has a second field of view, the second field of view being a second angular viewing region of the second camera and differing from the first field of view. The device further includes a processor electrically coupled to the projector, the first camera, and the second camera, and computer-readable media that, when executed by the processor, cause the first digital signal to be collected at a first time and the second digital signal to be collected at a second time different from the first time, determine three-dimensional coordinates of a first point on the surface based at least in part on the first digital signal and the first distance, and determine three-dimensional coordinates of a second point on the surface based at least in part on the second digital signal and the second distance.
According to another aspect of the invention, a method is provided for determining three-dimensional coordinates of points on the surface of an object. The method includes: providing an assembly comprising a projector, a first camera, and a second camera, the projector, first camera, and second camera being fixed in relation to one another, with a first distance between the projector and the first camera and a second distance between the projector and the second camera, the projector having a light source and being configured to emit onto the surface a first light having any one of a plurality of spatially varying patterns, the first camera having a first lens and a first photosensitive array and being configured to receive a first portion of the first light reflected from the surface, the first camera having a first field of view, the first field of view being a first angular viewing region of the first camera, the second camera having a second lens and a second photosensitive array and being configured to receive a second portion of the first light reflected from the surface, the second camera having a second field of view, the second field of view being a second angular viewing region of the second camera and differing from the first field of view; providing a processor electrically coupled to the projector, the first camera, and the second camera; in a first instance, projecting from the projector onto the surface the first light having a first pattern selected from among the plurality of spatially varying patterns, acquiring with the first camera a first image of the surface and, in response, sending a first digital signal to the processor, and determining a first set of three-dimensional coordinates of first points on the surface, the first set being based at least in part on the first pattern, the first digital signal, and the first distance; carrying out a diagnostic procedure to evaluate the quality of the first set; determining a second pattern of the first light selected from among the plurality of spatially varying patterns, the second pattern being based at least in part on a result of the diagnostic procedure; in a second instance, projecting from the projector onto the surface a second light having the second pattern, acquiring with the second camera a second image of the surface and, in response, sending a second digital signal to the processor; and determining a second set of three-dimensional coordinates of second points on the surface, the second set being based at least in part on the second pattern, the second digital signal, and the second distance.
These and other advantages and features will become more apparent from the following description taken in conjunction with the drawings.
Brief Description of the Drawings
The subject matter regarded as the invention is particularly pointed out and distinctly claimed in the claims at the conclusion of the specification. The foregoing and other features and advantages of the invention are apparent from the following detailed description taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a schematic top view of a scanner according to an embodiment of the invention;
FIG. 2 is a flow diagram showing a method of operating the scanner of FIG. 1;
FIG. 3 is a schematic top view of a scanner according to another embodiment of the invention;
FIG. 4 is a flow diagram showing a method of operating the scanner of FIG. 3;
FIG. 5A is a schematic view of elements of a laser scanner according to an embodiment;
FIG. 5B is a flow diagram showing a method of operating a scanner according to an embodiment;
FIG. 6 is a schematic top view of a scanner according to another embodiment of the invention;
FIG. 7 is a flow diagram showing a method of operating a scanner according to an embodiment;
FIGS. 8A and 8B are perspective views of a scanner used in conjunction with a remote probe device according to an embodiment of the invention;
FIG. 9 is a flow diagram showing a method of operating the scanner of FIG. 5;
FIG. 10 is a schematic top view of a scanner according to an embodiment;
FIG. 11 is a flow diagram showing a method of operating the scanner of FIG. 10; and
FIG. 12 is a flow diagram showing a diagnostic method according to an embodiment.
The detailed description explains embodiments of the invention, together with advantages and features, by way of example with reference to the drawings.
Detailed Description
Embodiments of the invention provide advantages in improving the reliability and accuracy of the three-dimensional coordinates of point cloud data acquired by a scanner. Embodiments of the invention provide advantages in automatically detecting anomalies in the acquired data and adjusting the operation of the scanner to obtain the desired result. Embodiments of the invention provide advantages in detecting anomalies in the acquired data and indicating to the operator the regions where additional data acquisition is needed. Still further embodiments of the invention provide advantages in detecting anomalies in the acquired data and indicating to the operator where additional data may be acquired using a remote probe.
A scanner device acquires three-dimensional coordinate data of an object. In one embodiment, a scanner 20 shown in FIG. 1 has a housing 22 that includes a first camera 24, a second camera 26, and a projector 28. The projector 28 emits light 30 onto a surface 32 of an object 34. In the exemplary embodiment, the projector 28 uses a visible light source that illuminates a pattern generator. The visible light source may be, for example, a laser, a superluminescent diode, an incandescent lamp, a xenon lamp, a light-emitting diode (LED), or another light-emitting device. In one embodiment, the pattern generator is a chrome-on-glass slide having a structured light pattern etched on it. The slide may have a single pattern, or multiple patterns that are moved in and out of position as needed. The slide may be installed in the operating position manually or automatically. In other embodiments, the source pattern may be light reflected from or transmitted by a digital micromirror device (DMD), such as a digital light projector (DLP) manufactured by Texas Instruments, a liquid crystal device (LCD), a liquid crystal on silicon (LCOS) device, or a similar device used in transmission mode rather than reflection mode. The projector 28 may also include a lens system 36 that alters the outgoing light to cover the desired area.
In the present embodiment, the projector 28 is configurable to emit structured light over an area 37. As used herein, the term "structured light" refers to a two-dimensional pattern of light, projected onto an area of an object, that conveys information which may be used to determine coordinates of points on the object. In one embodiment, a structured light pattern contains at least three non-collinear pattern elements disposed within the area. Each of the three non-collinear pattern elements conveys information that may be used to determine point coordinates. In another embodiment, a projector is provided that is configurable to project both an area pattern and a line pattern. In one embodiment, the projector is a digital micromirror device (DMD) configured to switch back and forth between the two. In one embodiment, the DMD projector may also sweep a line, or sweep a point in a raster pattern.
In general, there are two types of structured light patterns: coded light patterns and uncoded light patterns. As used herein, a coded light pattern is one in which the three-dimensional coordinates of the illuminated surface of the object may be obtained by acquiring a single image. With a coded light pattern, point cloud data can be obtained and registered even while the projection device is moving relative to the object. One type of coded light pattern contains a set of elements (for example, geometric shapes) arranged in lines, at least three of the elements being non-collinear. Such pattern elements are recognizable because of their arrangement.
In contrast, an uncoded structured light pattern, as used herein, does not allow measurement from a single pattern. Instead, a series of uncoded light patterns may be projected and imaged sequentially. In this case, it is usually necessary to keep the projector fixed relative to the object.
It should be appreciated that the scanner 20 may use either coded or uncoded structured light patterns. The structured light pattern may include one of the patterns disclosed in the journal article "DLP-Based Structured Light 3D Imaging Technologies and Applications" by Jason Geng, published in Proceedings of SPIE, Vol. 7932. In addition, in some of the embodiments described below, the projector 28 emits a pattern formed as a swept line of light or a swept point of light. Swept lines and swept points of light provide advantages over areas of light in identifying some types of anomalies, such as multipath interference. Automatically sweeping a line while the scanner is held stationary also provides advantages in obtaining a more uniform sampling of surface points.
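One widely used family of sequential (uncoded) patterns, surveyed in the Geng article cited above, is binary Gray-code striping: each projected frame contributes one bit of each projector column's index, and the Gray property (adjacent columns differ in exactly one bit) limits decoding errors at stripe boundaries. The sketch below illustrates the encoding and decoding arithmetic only; it is not the specific pattern set used by scanner 20:

```python
def gray_encode(col):
    """Column index -> Gray code; adjacent columns differ by one bit."""
    return col ^ (col >> 1)

def gray_decode(g):
    """Invert the Gray code back to the column index."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

def stripe_bits(col, n_frames):
    """The bright/dark value this projector column shows in each of
    the n_frames sequentially projected patterns, MSB first."""
    g = gray_encode(col)
    return [(g >> (n_frames - 1 - i)) & 1 for i in range(n_frames)]

# A camera pixel that observed bright/dark values [1, 1, 0, 1] across
# four frames recovers the projector column it corresponds to:
col = gray_decode(int("".join(map(str, [1, 1, 0, 1])), 2))
```

Once each camera pixel is matched to a projector column this way, triangulation proceeds as with any other correspondence; the cost is that the object and projector must stay fixed while the frame sequence is captured, as noted above.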
The first camera 24 includes a photosensitive sensor 44 that generates a digital image/representation of the area 48 within the sensor's field of view. The sensor may be, for example, a charge-coupled device (CCD) type sensor or a complementary metal-oxide-semiconductor (CMOS) type sensor having an array of pixels. The first camera 24 may also include other components, such as but not limited to a lens 46 and other optical elements. The lens 46 has an associated first focal length. The sensor 44 and lens 46 cooperate to define a first field of view "X". In the exemplary embodiment, the first field of view "X" is 16 degrees (0.28 inch per inch).
Similarly, the second camera 26 includes a photosensitive sensor 38 that generates a digital image/representation of the area 40 within the sensor's field of view. The sensor may be, for example, a charge-coupled device (CCD) type sensor or a complementary metal-oxide-semiconductor (CMOS) type sensor having an array of pixels. The second camera 26 may also include other components, such as but not limited to a lens 42 and other optical elements. The lens 42 has an associated second focal length that is different from the first focal length. The sensor 38 and lens 42 cooperate to define a second field of view "Y". In the exemplary embodiment, the second field of view "Y" is 50 degrees (0.85 inch per inch). The second field of view Y is larger than the first field of view X, so the area 40 is larger than the area 48. It should be appreciated that a larger field of view allows a given region of the object surface 32 to be measured faster; however, if the photosensitive arrays 44 and 38 have the same number of pixels, the smaller field of view will provide higher resolution.
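The resolution/coverage trade-off between the two cameras can be made concrete with a little arithmetic. The array width below is an assumed illustrative value, not a figure from this document; the geometry is generic. For the 16-degree field, 2 * tan(8 deg) is about 0.28, matching the per-inch figure quoted above (the 0.85 quoted for 50 degrees is closer to the chord 2 * sin(25 deg), so the exact convention in the source is unclear):

```python
import math

PIXELS = 1000  # assumed array width in pixels (illustrative only)

def coverage_per_unit_standoff(fov_deg):
    """Lateral width viewed per unit of standoff: 2 * tan(FOV / 2)."""
    return 2.0 * math.tan(math.radians(fov_deg) / 2.0)

def lateral_resolution(fov_deg, standoff, pixels=PIXELS):
    """Approximate width of one pixel's footprint on the surface."""
    return coverage_per_unit_standoff(fov_deg) * standoff / pixels

narrow = lateral_resolution(16.0, standoff=500.0)  # field of view X
wide = lateral_resolution(50.0, standoff=500.0)    # field of view Y
# The wide-FOV camera covers about 3.3x more width at the same
# standoff, but each pixel's footprint is about 3.3x coarser.
```

This is the quantitative basis for the strategy described later: survey with the wide-FOV camera, then re-measure problem regions with the narrow-FOV camera for higher resolution.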
In the exemplary embodiment, the projector 28 and the first camera 24 are arranged in a fixed relationship at an angle such that the sensor 44 can receive light reflected from the surface of the object 34. Similarly, the projector 28 and the second camera 26 are arranged in a fixed relationship at an angle such that the sensor 38 can receive light reflected from the surface 32 of the object 34. Since the projector 28, the first camera 24, and the second camera 26 have fixed geometric relationships, the distances and coordinates of points on the surface may be determined from their trigonometric relationships. Although the fields of view (FOVs) of the cameras 24 and 26 are shown in FIG. 1 as not overlapping, the FOVs may partially or completely overlap.
The projector 28 and the cameras 24, 26 are electrically coupled to a controller 50 disposed within the housing 22. The controller 50 may include one or more microprocessors, digital signal processors, memory, and signal conditioning circuits. The scanner 20 may further include an actuator (not shown) that can be manually activated by the operator to initiate operation and data capture by the scanner 20. In one embodiment, the controller 50 performs the image processing that determines the X, Y, Z coordinate data of the point cloud representing the surface 32 of the object 34. The coordinate data may be stored locally, such as in volatile or nonvolatile memory 54. The memory may be removable, such as a flash drive or a memory card. In other embodiments, the scanner 20 has a communication circuit 52 that allows the scanner 20 to transmit the coordinate data to a remote processing system 56. The communication medium 58 between the scanner 20 and the remote processing system 56 may be wired (e.g., Ethernet) or wireless (e.g., Bluetooth, IEEE 802.11). In one embodiment, the remote processing system 56 determines the coordinate data based on acquired images transmitted by the scanner 20 over the communication medium 58.
As shown by double-headed arrow 47, there may be relative motion between the object surface 32 and the scanner 20. This relative motion can be provided in several ways. In one embodiment, the scanner is a handheld scanner and the object 34 is fixed; relative motion is provided by moving the scanner over the object surface. In another embodiment, the scanner is attached to a robot end effector; the robot provides the relative motion as it moves relative to the object surface. In another embodiment, either the scanner 20 or the object 34 is attached to a moving mechanical mechanism, such as a gantry coordinate measuring machine or an articulated arm CMM; the moving mechanical mechanism provides the relative motion as it moves the scanner 20 relative to the object surface. In some embodiments, motion is provided by the action of an operator, and in other embodiments by a mechanism under computer control.
Referring now to FIG. 2, the operation of the scanner 20 according to a method 1260 will be described. As shown in block 1262, the projector 28 first emits a structured light pattern onto the area 37 of the surface 32 of the object 34. The light 30 from the projector 28 is reflected from the surface 32, and the reflected light 62 is received by the second camera 26. The three-dimensional contour of the surface 32 affects the image of the pattern captured by the photosensitive array 38 in the second camera 26. Using information collected from one or more images of one or more patterns, the controller 50 or the remote processing system 56 determines a one-to-one correspondence between the pixels of the photosensitive array 38 and the pattern of light emitted by the projector 28. Using this correspondence, principles of triangulation are used to determine the three-dimensional coordinates of points on the surface 32. This acquisition of three-dimensional coordinate data (point cloud data) is shown in block 1264. By moving the scanner 20 relative to the surface 32, a point cloud of the entire object 34 may be created.
During the scanning process, the controller 50 or remote processing system 56 may detect an undesirable condition or problem in the point cloud data, as shown in block 1266. Methods for detecting such problems are discussed below with reference to FIG. 12. The detected problem may be, for example, an error in the point cloud data in a particular region. The error may be caused by too little or too much light reflected from that region. Too little or too much reflected light may result from differences in the reflectance of the object surface, for example as a result of a high or variable angle of incidence of the light 30 on the object surface 32, or as a result of low-reflectance (black or transparent) material or a glossy surface. Certain points on the object may be angled in such a way as to produce a very bright specular reflection known as a glint.
Another possible cause of error in the point cloud data is a lack of resolution in regions having fine features, sharp edges, or rapid changes in depth. Such a lack of resolution may be the result of a hole, for example.
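A simple way to flag the kind of regional data loss described above is to bin the point cloud onto a 2D grid and look for occupied-bounding-box cells with abnormally few points. This is only an illustrative heuristic, not the diagnostic procedure the document describes with reference to FIG. 12:

```python
from collections import Counter

def sparse_cells(points_xy, cell=1.0, min_count=3):
    """Bin (x, y) points into square cells of side `cell` and return
    the cells inside the occupied bounding box holding fewer than
    `min_count` points: candidate holes or low-resolution regions."""
    counts = Counter((int(x // cell), int(y // cell)) for x, y in points_xy)
    xs = [c[0] for c in counts]
    ys = [c[1] for c in counts]
    flagged = []
    for i in range(min(xs), max(xs) + 1):
        for j in range(min(ys), max(ys) + 1):
            if counts[(i, j)] < min_count:
                flagged.append((i, j))
    return flagged

# Dense 3x3 patch of samples with the center cell left empty:
pts = [(i + dx, j + dy)
       for i in range(3) for j in range(3)
       for dx in (0.2, 0.5, 0.8) for dy in (0.2, 0.5, 0.8)
       if (i, j) != (1, 1)]
holes = sparse_cells(pts)
```

Cells flagged this way would then be candidates for a follow-up measurement, for example with the higher-resolution narrow-FOV camera.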
Another possible cause of error in the point cloud data is multipath interference. Ordinarily, a ray of light from the projector 28 strikes a point on the surface 32 and is scattered over a range of angles. The scattered light is imaged by the lens 42 of the camera 26 onto a small spot on the photosensitive array 38. Similarly, the scattered light may be imaged by the lens 46 of the camera 24 onto a small spot on the photosensitive array 44. Multipath interference occurs when the light reaching a point on the surface 32 comes not only from the ray of light from the projector 28 but also from secondary light reflected off another portion of the surface 32. Such secondary light may corrupt the pattern of light received by the photosensitive arrays 38, 44, thereby preventing accurate determination of the three-dimensional coordinates of the point. A method for identifying the presence of multipath interference is described in the present application with reference to FIG. 12.
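Because scanner 20 has two cameras at different baselines, one conceivable cross-check for corrupted points (an illustrative sketch only, not the FIG. 12 procedure) is to compute each point's coordinates independently from each camera and flag points where the two solutions disagree by more than the expected measurement noise; a clean surface point should triangulate to nearly the same location from both viewpoints:

```python
def flag_multipath(coords_cam1, coords_cam2, tol=0.5):
    """Compare per-point 3D solutions from two cameras; return the
    indices where they disagree by more than `tol` (same units as
    the coordinates). Large disagreement suggests the pattern seen
    by at least one camera was corrupted, e.g. by multipath."""
    suspect = []
    for idx, (p, q) in enumerate(zip(coords_cam1, coords_cam2)):
        dist = sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
        if dist > tol:
            suspect.append(idx)
    return suspect

# Point 1 disagrees by 3 mm between the two solutions and is flagged:
cam1 = [(0.0, 0.0, 100.0), (5.0, 0.0, 101.0), (10.0, 0.0, 99.0)]
cam2 = [(0.0, 0.1, 100.1), (5.0, 0.0, 104.0), (10.1, 0.0, 99.0)]
bad = flag_multipath(cam1, cam2)
```

The choice of `tol` would in practice depend on the scanner's noise floor at the working standoff; the value here is arbitrary.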
If the controller determines in block 1266 that the point cloud is good, the process ends. Otherwise, block 1268 determines whether the scanner is being used in manual mode or automatic mode. If the mode is manual, the operator is directed in block 1270 to move the scanner to the desired position.
There are several ways the desired movement may be indicated to the operator. In one embodiment, indicator lights on the scanner body indicate the desired direction of movement. In another embodiment, light is projected onto the surface to indicate the direction the operator is to move. In addition, the color of the projected light may indicate whether the scanner is too close to or too far from the object. In another embodiment, a display indicates to the operator the region onto which the light is to be projected. Such a display may be a graphical representation of the point cloud data, a CAD model, or a combination of the two. The display may be presented on a computer monitor or on a display built into the scanning device.
In any of these embodiments, a method of determining the approximate position of the scanner is desirable. In one case, the scanner may be attached to an articulated arm CMM that uses angular encoders at its joints to determine the position and orientation of the scanner attached to its end. In another case, the scanner includes inertial sensors located within the device. Inertial sensors may include, for example, gyroscopes, accelerometers, and magnetometers. Another method of determining the approximate position of the scanner is to illuminate photogrammetric dots placed on or around the object as marker points. In this way, the wide-FOV camera in the scanner can determine the approximate position of the scanner relative to the object.
In another embodiment, a CAD model on a computer screen indicates the additional regions to be measured, and the operator moves the scanner accordingly by matching features on the object to features shown on the display. By updating the CAD model on the screen as scanning proceeds, the operator can be given rapid feedback as to whether the desired regions of the part have been measured.
After the operator has moved the scanner into position, a measurement is made in block 1272 with the small-FOV camera 24. By viewing a relatively smaller region in block 1272, the resolution of the resulting three-dimensional coordinates is improved, and a better ability is provided to characterize features such as holes and edges.
Because the narrow-FOV camera views a relatively smaller region than the wide-FOV camera, the projector 28 may illuminate a relatively smaller region. This provides an advantage in eliminating multipath interference, since there are relatively fewer illuminated points on the object from which light can be reflected back onto the object. Having a smaller illuminated region also makes it easier to control the exposure to obtain the optimum amount of light for the given reflectance and angle of incidence of the object under test. In block 1274, if all points have been collected, the process ends at block 1276; otherwise the process continues.
If the mode from block 1268 is automated, then in block 1278 the automated mechanism moves the scanner into the desired position. In some embodiments, the automated mechanism will have sensors to provide information about the relative position of the scanner and the object under test. For an embodiment in which the automated mechanism is a robot, angular transducers within the robot joints provide information about the position and orientation of the robot end effector used to hold the scanner. For an embodiment in which the object is moved by another type of automated mechanism, linear encoders or a variety of other sensors may provide information on the relative position of the object and the scanner.
After the automated mechanism has moved the scanner or object into position, three-dimensional measurements are made in block 1280 with the small-FOV camera. Such measurements are repeated by means of block 1282 until all measurements are completed and the process finishes at block 1284.
In one embodiment, when the scanner switches from acquiring data with the second camera 26 to acquiring data with the first camera 24, the projector 28 changes the structured-light pattern. In another embodiment, the same structured-light pattern is used with both cameras 24, 26. In yet another embodiment, when the first camera 24 is acquiring data, the projector 28 emits a pattern formed by a swept line or a swept point of light. After acquiring data with the first camera 24, the process continues scanning using the second camera 26. This process continues until the operator has scanned the desired area of the part.
It should be appreciated that while the process of FIG. 2 is shown as a linear or sequential process, in other embodiments one or more of the steps shown may be executed in parallel. In the method shown in FIG. 2, the entire object is measured first, and then more detailed measurements are carried out based on an evaluation of the acquired point cloud data. An alternative way of using the scanner 20 is to begin by measuring detailed or critical regions using the camera 24 with its small FOV.
It should also be appreciated that it is common practice in existing scanning systems to provide a way of changing the camera lens or projector lens as a means of changing the FOV of the camera or projector in the scanning system. However, such changes are time-consuming and typically require an additional compensation step in which an artifact such as a dot plate is placed in front of the camera or projector to determine the aberration-correction parameters of the camera or projector system. Hence, a scanning system that provides two cameras having different FOVs (e.g., the cameras 24, 26 of FIG. 1) provides a significant advantage in measurement speed and in enabling the scanner to be fully automated.
Another embodiment of the scanner 20 is shown in FIG. 3, the scanner 20 having a housing 22 that includes a first coordinate-acquisition system 76 and a second coordinate-acquisition system 78. The first coordinate-acquisition system 76 includes a first projector 80 and a first camera 82. Similar to the embodiment of FIG. 1, the projector 80 emits light 84 onto the surface 32 of the object 34. In the exemplary embodiment, the projector 80 uses a visible light source that illuminates a pattern generator. The visible light source may be a laser, a superluminescent diode, an incandescent lamp, a light-emitting diode (LED), or some other light-emitting device. In one embodiment, the pattern generator is a chrome-on-glass slide having a structured-light pattern etched on it. The slide may have a single pattern, or multiple patterns that move in and out of position as needed. The slide may be installed manually or automatically in the operating position. In other embodiments, the source pattern may be light reflected off or transmitted through a digital micromirror device (DMD) such as a digital light projector (DLP) manufactured by Texas Instruments, a liquid-crystal device (LCD), a liquid-crystal-on-silicon (LCOS) device, or a similar device used in transmission mode rather than reflection mode. The projector 80 may further include a lens system 86 that alters the outgoing light to have the desired focal characteristics.
The first camera 82 includes a photosensitive array sensor 88 that generates a digital image/representation of the area 90 within the sensor's field of view. The sensor may be a charge-coupled device (CCD) type sensor or a complementary metal-oxide-semiconductor (CMOS) type sensor, for example, having an array of pixels. The first camera 82 may further include other components such as, but not limited to, a lens 92 and other optical devices. The first projector 80 and the first camera 82 are arranged in a fixed relationship at an angle such that the first camera 82 may detect light 85 from the first projector 80 reflected off the surface 32 of the object 34. It should be appreciated that since the first camera 82 and the first projector 80 are arranged in a fixed relationship, the trigonometric principles described above may be used to determine coordinates of points on the surface 32 within the area 90. Although FIG. 3 is depicted with the first camera 82 near the first projector 80 for clarity, it should be appreciated that the camera could be placed nearer the opposite side of the housing 22. By spacing the first camera 82 and the first projector 80 farther apart, the accuracy of the 3D measurement is expected to improve.
The second coordinate-acquisition system 78 includes a second projector 94 and a second camera 96. The projector 94 has a light source, which may comprise a laser, a light-emitting diode (LED), a superluminescent diode (SLED), a xenon bulb, or some other suitable type of light source. In an embodiment, a lens 98 is used to focus the light received from the laser light source into a line of light 100 and may comprise one or more cylindrical lenses, or lenses of a variety of other shapes. Because a lens may comprise a collection of one or more individual lenses, the lens is also referred to herein as a "lens system." The line of light is substantially straight; that is, the maximum deviation from a straight line will be less than about 1% of its length. One type of lens that may be utilized by an embodiment is a rod lens. Rod lenses are typically in the shape of a full cylinder made of glass or plastic, polished on the circumference and ground on both ends. Such lenses convert collimated light passing through the diameter of the rod into a line of light. Another type of lens that may be used is a cylindrical lens. A cylindrical lens is a lens that has the shape of a partial cylinder. For example, one surface of a cylindrical lens may be flat, while the opposing surface is cylindrical in form.
In another embodiment, the projector 94 generates a two-dimensional pattern of light that covers an area of the surface 32. The resulting coordinate-acquisition system 78 is then referred to as a structured-light scanner.
The second camera 96 includes a sensor 102 such as a charge-coupled device (CCD) type sensor or a complementary metal-oxide-semiconductor (CMOS) type sensor, for example. The second camera 96 may further include other components such as, but not limited to, a lens 104 and other optical devices. The second projector 94 and the second camera 96 are arranged at an angle such that the second camera 96 may detect light 106 from the second projector 94 reflected off the object 34. It should be appreciated that since the second projector 94 and the second camera 96 are arranged in a fixed relationship, the trigonometric principles described above may be used to determine coordinates of points on the surface 32 on the line formed by the light 100. It should also be appreciated that the camera 96 and the projector 94 may be located on opposite sides of the housing 22 to increase 3D measurement accuracy.
In another embodiment, the second coordinate-acquisition system is configured to project a variety of patterns, which may include not only a fixed line of light but also a swept line of light, a swept point of light, a coded pattern of light (covering an area), or a sequential pattern of light (covering an area). Each type of projected pattern has different advantages, such as speed, accuracy, and immunity to multipath interference. By evaluating the performance requirements of each particular measurement and/or the characteristics of the returned data or of the anticipated object shape (based on CAD models or on 3D reconstruction from scan data collected to date), it is possible to select the type of projected pattern that optimizes performance.
In another embodiment, the distance from the first coordinate-acquisition system 76 to the object surface 32 is different than the distance from the second coordinate-acquisition system 78 to the object surface 32. For example, the camera 96 may be positioned closer to the object 32 than the camera 88. In this way, the resolution and accuracy of the second coordinate-acquisition system 78 can be improved relative to that of the first coordinate-acquisition system 76. In many cases, it is helpful to quickly scan a relatively large and smooth object with the lower-resolution system 76 and then scan details, including edges and holes, with the higher-resolution system 78.
A scanner 20 may be used in either a manual or an automated mode. In the manual mode, an operator is prompted to move the scanner nearer to or farther from the object surface according to the acquisition system being used. Furthermore, the scanner 20 may project a beam or pattern of light indicating to the operator the direction in which the scanner is to be moved. Alternatively, indicator lights on the device may indicate the direction in which the scanner should be moved. In the automated mode, the scanner 20 or the object 34 may be automatically moved relative to each other according to the measurement requirements.
Similar to the embodiment of FIG. 1, the first coordinate-acquisition system 76 and the second coordinate-acquisition system 78 are electrically coupled to a controller 50 disposed within the housing 22. The controller 50 may include one or more microprocessors, digital signal processors, memory, and signal-conditioning circuits. The scanner 20 may further include actuators (not shown) which may be manually activated by the operator to initiate operation and data capture by the scanner 20. In one embodiment, the image processing to determine the X, Y, Z coordinate data of the point cloud representing the surface 32 of the object 34 is performed by the controller 50. The coordinate data may be stored locally, such as in a volatile or nonvolatile memory 54, for example. The memory may be removable, such as a flash drive or a memory card, for example. In other embodiments, the scanner 20 has a communications circuit 52 that allows the scanner 20 to transmit the coordinate data to a remote processing system 56. The communications medium 58 between the scanner 20 and the remote processing system 56 may be wired (e.g., Ethernet) or wireless (e.g., Bluetooth, IEEE 802.11). In one embodiment, the coordinate data is determined by the remote processing system 56, and the scanner 20 transmits acquired images over the communications medium 58.
Referring now to FIG. 4, a method 1400 of operating the scanner 20 of FIG. 3 will be described. In block 1402, the first projector 80 of the first coordinate-acquisition system 76 of the scanner 20 emits a structured-light pattern onto the area 90 of the surface 32 of the object 34. The light 84 from the projector 80 is reflected from the surface 32, and the reflected light 85 is received by the first camera 82. As described above, variations in the surface profile of the surface 32 create distortions in the imaged pattern of light received by the first photosensitive array 88. Since the pattern is formed by structured light, a line of light, or a point of light, it is possible in some instances for the controller 50 or the remote processing system 56 to determine a one-to-one correspondence between points on the surface 32 and pixels in the photosensitive array 88. This enables the triangulation principles discussed above to be used in block 1404 to obtain point cloud data, that is, to determine X, Y, Z coordinates of points on the surface 32. By moving the scanner 20 relative to the surface 32, a point cloud of the entire object 34 may be created.
In block 1406, the controller 50 or the remote processing system 56 determines whether the point cloud data possesses the desired data quality attributes or has potential problems. The types of problems that may occur were discussed above with reference to FIG. 2, and that discussion is not repeated here. If the controller determines in block 1406 that the point cloud has the desired data quality attributes, the procedure is finished. Otherwise, a determination is made in block 1408 of whether the scanner is being used in a manual or an automated mode. If the mode is manual, the operator is directed in block 1410 to move the scanner into the desired position.
There are several ways of indicating the desired movement to the operator, as described above with reference to FIG. 2. That discussion is not repeated here.
To direct the operator to the desired movement, a method of determining the approximate position of the scanner is needed. As described with reference to FIG. 2, such methods may include attaching the scanner 20 to an articulated arm CMM, using inertial sensors within the scanner 20, illuminating photogrammetric dots, or matching features on the object to a displayed image.
After the operator has moved the scanner into position, a measurement is made in block 1412 with the second coordinate-acquisition system 78. By using the second coordinate-acquisition system, resolution and accuracy may be improved, or problems may be eliminated. In block 1414, if all points have been collected, the process finishes at block 1416; otherwise, it continues.
If the mode of operation from block 1408 is automated, then in block 1418 the automated mechanism moves the scanner into the desired position. In most cases, the automated mechanism will have sensors to provide information about the relative position of the scanner and the object under test. For the case in which the automated mechanism is a robot, angular transducers within the robot joints provide information about the position and orientation of the robot end effector used to hold the scanner. For other types of automated mechanisms, linear encoders or a variety of other sensors may provide information on the relative position of the object and the scanner.
After the automated mechanism has moved the scanner or object into position, three-dimensional measurements are made in block 1420 with the second coordinate-acquisition system 78. Such measurements are repeated by means of block 1422 until all measurements are completed. The process finishes at block 1424.
It should be appreciated that while the process of FIG. 4 is shown as a linear or sequential process, in other embodiments one or more of the steps shown may be executed in parallel. In the method shown in FIG. 4, the entire object is measured first, and then more detailed measurements are carried out based on an evaluation of the acquired point cloud data. An alternative way of using the scanner 20 is to begin by making detailed or critical measurements with the second coordinate-acquisition system 78.
It should also be appreciated that it is common practice in existing scanning systems to provide a way of changing the camera lens or projector lens as a means of changing the FOV of the camera or projector in the scanning system. However, such changes are time-consuming and typically require an additional compensation step in which an artifact such as a dot plate is placed in front of the camera or projector to determine the aberration-correction parameters of the camera or projector system. Hence, a system that provides two different coordinate-acquisition systems, such as the scanning system 20 of FIG. 3, provides a significant advantage in measurement speed and in enabling the scanner to be fully automated.
Errors may occur in scanner measurements as a result of multipath interference. The origin of multipath interference is now discussed, and a first method for eliminating or reducing it is described.
The case of multipath interference occurs when a portion of the light striking the object surface is first scattered off another surface of the object before returning to the camera. For the point on the surface that receives this scattered light, the light sent to the photosensitive array then corresponds not only to the light projected directly from the projector but also to light that was sent to a different point on the object and scattered onto the point in question. The result of multipath interference, especially for the case of scanners that project two-dimensional (structured) light, may be to make the computed distance from the projector to the object surface at that point inaccurate.
An instance of multipath interference is illustrated with reference to FIG. 5A, in which a scanner 4570 projects a line of light 4525 onto an object surface 4510A. The line of light 4525 is perpendicular to the plane of the paper. In an embodiment, the rows of the photosensitive array are parallel to the plane of the paper, and the columns are perpendicular to it. Each row represents one point on the projected line 4525 in the direction perpendicular to the plane of the paper. The distance from the projector to the object for points on the line of light is found by first calculating the centroid (center of gravity) for each row. For the surface point 4526, the centroid on the photosensitive array 4541 is represented by the point 4546. The position 4546 of the centroid on the photosensitive array can be used to calculate the distance from the camera perspective center 4544 to the object point 4526. This calculation is based on the trigonometric relationships of the triangulation principles. To perform these calculations, the baseline distance D from the camera perspective center 4544 to the projector perspective center 4523 is required. In addition, the relative orientation of the projector system 4520 to the camera system 4540 must be known.
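The per-row centroid calculation described above can be sketched as an intensity-weighted mean of the column indices in one row of the photosensitive array. This is an illustrative stand-in with hypothetical names; the disclosure does not specify the exact centroiding formula.

```python
def row_centroid(row):
    """Sub-pixel centroid (center of gravity) of one photosensitive-array row.

    row: iterable of pixel intensities along the row.
    Returns the intensity-weighted mean column index, or None for a row
    that received no light.
    """
    row = list(row)
    total = sum(row)
    if total == 0:
        return None  # dark row: no point of the projected line imaged here
    return sum(i * v for i, v in enumerate(row)) / total
```

The fractional column index returned here is what plays the role of the centroid position 4546 in the triangulation step: together with the baseline D and the relative orientation of projector and camera, it fixes the distance to the corresponding object point.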
To understand the error caused by multipath interference, consider the point 4527. Light reflected or scattered from this point is imaged through the lens 4542 onto the point 4548 on the photosensitive array 4541. However, in addition to the light received directly from the projector and scattered off the point 4527, additional light is reflected off the point 4526 onto the point 4527 before being imaged onto the photosensitive array. This light will most likely be scattered to an unexpected position, causing two centroids to be formed in a given row. Consequently, the observation of two centroids on a given row is a good indicator of the presence of multipath interference.
For the case of structured light projected onto an area of the object surface, a secondary reflection from a point such as 4527 is usually not as obvious as for light projected onto a line and is therefore more likely to create an error in the measured 3D surface coordinates.
By using a projector having an adjustable pattern of illumination on a display element 4521, the pattern of illumination can be changed. The display element 4521 may be a digital micromechanical mirror (DMM) such as a digital light projector (DLP). Such devices contain multiple small mirrors that can be rapidly adjusted by means of electrical signals to rapidly change the pattern of illumination. Other devices that can produce an electrically adjustable display pattern include LCD (liquid-crystal display) and LCOS (liquid-crystal-on-silicon) displays.
One way of checking for multipath interference in a system that projects structured light over an area is to change the display so as to project a line of light instead. The presence of multiple centroids in a single row indicates that multipath interference exists. By sweeping the line of light, an area can be covered without the operator having to move the probe.
The line of light can be set to any desired angle by electrically adjusting the display. By changing the direction of the projected line of light, multipath interference can, in many cases, be eliminated.
For surfaces having many folds and steep angles that make reflections difficult to avoid, the electrically adjustable display can be used to sweep a point of light. In some cases, a secondary reflection may be produced even from a single point of light, but it is usually relatively easy to determine which of the reflected spots of light is valid.
An electrically adjustable display can also be used to switch rapidly between coded and uncoded patterns. In many cases, a coded pattern enables a 3D measurement to be made based on a single frame of camera information. On the other hand, multiple patterns (sequential or uncoded patterns) may be used to obtain greater accuracy in the measured 3D coordinate values.
In the past, electrically adjustable displays have been used to project each of a series of patterns within a sequential pattern — for example, a series of gray-scale patterns followed by a sequence of sinusoidal patterns, each having a different phase.
The present method provides an advantage over earlier processes in that it selectively identifies or eliminates problems such as multipath interference and indicates whether a single-shot pattern (for example, a coded pattern) or a multiple-shot pattern is preferable for obtaining the required accuracy as quickly as possible.
For the case of a line scanner, there is often a way to determine whether multipath interference is present. When multipath interference is absent, the light reflected by a point on the object surface is imaged in a single row onto a region of contiguous pixels. If two or more regions of a row receive a significant amount of light, multipath interference is indicated. An example of such a multipath interference condition and the resulting extra region of illumination on the photosensitive array is shown in FIG. 5A. The surface 4510A now has a greater curvature near the point of intersection 4526. The surface normal at the point of intersection is the line 4528, and the angle of incidence is 4531. The direction of the reflected line of light 4529 is found from the angle of reflection, which equals the angle of incidence. As stated hereinabove, the line of light 4529 actually represents an overall direction for light scattered over a range of angles. The center of the scattered light strikes the surface 4510A at the point 4527, which is imaged by the lens onto the point 4548 on the photosensitive array. The unexpectedly high amount of light received in the vicinity of the point 4548 indicates that multipath interference is probably present. For a line scanner, the main concern with multipath interference is not the case shown in FIG. 5A, in which the two spots 4546 and 4548 are separated by a considerable distance and can be analyzed separately, but rather the case in which the two spots overlap or are smeared together. In this case, it may not be possible to determine the centroid corresponding to the desired point (the point corresponding to 4546 in FIG. 5A). The problem is made worse for the case of a scanner that projects light over a two-dimensional area, as can be understood by again referring to FIG. 5A. If all of the light imaged onto the photosensitive array 4541 were needed to determine two-dimensional coordinates, then it is clear that the light at the point 4527 would correspond both to the desired pattern of light projected directly from the projector and to the unwanted light reflected onto the point 4527 from the object surface. As a result, in this case, an incorrect three-dimensional coordinate would likely be calculated for the point 4527 for light projected over an area.
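The line-scanner diagnostic described above — a single contiguous bright run per row is normal, two or more separated runs indicate multipath interference — can be sketched as a simple run-counting check. The names and the fixed intensity threshold are illustrative assumptions, not the patented algorithm verbatim.

```python
def count_bright_regions(row, threshold):
    """Count contiguous runs of pixels above threshold in one array row."""
    regions = 0
    in_region = False
    for v in row:
        if v > threshold and not in_region:
            regions += 1       # entering a new bright run
            in_region = True
        elif v <= threshold:
            in_region = False  # leaving the current run (if any)
    return regions

def multipath_suspected(row, threshold=50):
    """Two or more separated bright regions suggest multipath interference."""
    return count_bright_regions(row, threshold) >= 2
```

Note that this check only catches the benign case of well-separated spots; when the direct and secondary reflections smear together into one run, the count stays at one, which is precisely the harder case identified above.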
For a projected line of light, multipath interference can, in many cases, be eliminated by changing the direction of the line. One possibility is to make a line scanner using a projector having inherent two-dimensional capability, thereby enabling the line to be swept, or to be automatically rotated to different directions. An example of such a projector, discussed hereinabove, is one that makes use of a digital micromirror device (DMD). For example, if multipath interference were suspected in a particular scan obtained with structured light, the measurement system could be automatically configured to switch to a measurement method using a swept line of light.
Another method of reducing, minimizing, or eliminating multipath interference is to sweep a point of light, rather than a line of light or an area of light, over those regions for which multipath interference has been indicated. By illuminating with a single point of light, any light scattered from a secondary reflection can usually be readily identified.
The determination of the desired pattern to be projected by the electrically adjustable display benefits from a diagnostic analysis, as described below with reference to FIG. 12.
Besides its use in diagnosing and correcting multipath interference, changing the pattern of the projected light also provides advantages in obtaining the required accuracy and resolution in a minimum amount of time. In an embodiment, a measurement is first made by projecting a coded pattern of light onto the object in a single shot. The three-dimensional coordinates of the surface are determined using the collected data, and the results are analyzed to determine whether some regions have holes, edges, or features that require more detailed analysis. Such detailed analysis may be performed, for example, by using the narrow-FOV camera 24 of FIG. 1 or the high-resolution scanner system 78 of FIG. 3.
As explained above, the coordinates may also be analyzed to determine the approximate distance to the target, thereby providing a starting distance for a more accurate measurement method, such as the method of sequentially projecting sinusoidal phase-shifted patterns of light onto the surface. Obtaining a starting distance for each point on the surface using the coded pattern of light eliminates the need to obtain this information by varying the pitch over multiple sinusoidal phase-shifted scans, thereby saving considerable time.
Referring now to FIG. 5B, an embodiment is shown for overcoming anomalies in, and improving the accuracy of, coordinate data acquired by the scanner 20. The process 211 starts in block 212 by scanning an object, such as the object 34, with the scanner 20. The scanner 20 may be, for example, a scanner having at least one projector and a camera, such as the scanners described in the embodiments of FIG. 1, FIG. 3, FIG. 5, and FIG. 7. In this embodiment, in block 212, the scanner 20 projects a first light pattern onto the object. In one embodiment, this first light pattern is a coded structured-light pattern. The process 211 acquires and determines three-dimensional coordinate data in block 214. The coordinate data is analyzed in query block 216 to determine whether there are any anomalies, such as the multipath interference discussed above, low resolution around an element, or an absence of data due to changes in surface angle or surface reflectance. If an anomaly is detected, the process 211 proceeds to block 218, where the light pattern emitted by the projector is changed to a second light pattern. In an embodiment, the second light pattern is a swept line of light.
After the second light pattern is projected, the process 211 proceeds to block 220, where three-dimensional coordinate data is acquired and determined for the regions in which an anomaly was detected. The process 211 loops back to query block 216, where it is determined whether the anomaly has been resolved. If query block 216 still detects an anomaly, or a lack of accuracy or resolution, the process loops back to block 218 and switches to a third light pattern. In an embodiment, the third light pattern is a sequential sinusoidal phase-shift pattern. In another embodiment, the third light pattern is a swept point of light. This iterative procedure continues until the anomaly has been resolved. Once the coordinate data for the region with the anomaly has been determined, the process 211 proceeds to block 222, where the emitted pattern is switched back to the first structured-light pattern and the scan process continues. The process 211 continues until the operator has scanned the desired area of the object. In cases where the scan information so obtained is not satisfactory, the problem areas may be measured with a tactile probe as described herein.
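The iterative pattern-escalation loop of process 211 can be sketched as follows. This is a hedged sketch of the control flow only: the pattern names mirror the embodiments above, `measure` stands in for the acquire-and-analyze blocks 214/220 and 216, and the assumption that each pattern is tried at most once before escalating is mine, not the disclosure's.

```python
CODED = "coded structured-light pattern"          # first pattern (block 212)
SWEPT_LINE = "swept line of light"                # second pattern (block 218)
PHASE_SHIFT = "sequential sinusoidal phase-shift"  # one third-pattern option
SWEPT_POINT = "swept point of light"              # alternative third pattern

def scan_with_escalation(measure,
                         patterns=(CODED, SWEPT_LINE, PHASE_SHIFT, SWEPT_POINT)):
    """Escalate through projection patterns until no anomaly remains.

    measure(pattern) is caller-supplied and returns (points, anomaly_found),
    standing in for coordinate acquisition plus the query of block 216.
    Returns the pattern that succeeded and its point data.
    """
    points = None
    for pattern in patterns:
        points, anomaly = measure(pattern)
        if not anomaly:
            return pattern, points  # anomaly resolved; resume normal scanning
    return patterns[-1], points     # best effort with the finest pattern
```

In the actual process, the loop then switches back to the first (fastest) pattern for the remainder of the scan, so the slower patterns are spent only on the anomalous regions.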
Referring now to Fig. 6, another embodiment of the scanner 20 being mounted to mobile apparatus 120 is shown.Scanner 20 has at least one projector 122 and at least one camera head 124, and both is arranged with fixing geometric relationship, to make it possible to use triangle principle to determine the three-dimensional coordinate of the point on surface 32.Scanner 20 can be the scanner identical with the scanner such as described in reference Fig. 1 or Fig. 3.In one embodiment, scanner is identical with the scanner of the Figure 10 with sense of touch detector.But, scanner used in the embodiment in fig 6 can be the arbitrary scan device of such as structured light or Line scanner, such as, the United States Patent (USP) 7 owned together being entitled as " PortableCoordinateMeasurementMachinewithIntegratedLineLa serScanner " that on January 18th, 2006 submits to, 246, scanner disclosed in 030.In another embodiment, used in the embodiment in fig 6 scanner is the structure light scan device projected light onto on the region on object.
In the present example embodiment, movable equipment 120 is that the arm section 126,128 connected by means of pivot fitting (pivotandswiveljoint) 130 provides automatically mobile to make arm section 126,128 can the robot device of movement, makes scanner 20 move to the second place (as this is shown in phantom in fig. 6) from primary importance like this.Movable equipment 120 such as can comprise the actuator of such as motor (not shown), these actuator to arm section 126,128 so that arm section 126,128 is moved to the second place from primary importance.Should be appreciated that the movable equipment 120 with joint arm is for exemplary purpose, and claimed the present invention should not be limited to this.In other embodiments, scanner 20 such as can be mounted to the movable equipment making scanner 20 movement via rail, wheel, track, band or cable or above-mentioned combination.In other embodiments, robot has the arm section of varying number.
In one embodiment, movable equipment is articulated arm coordinate measuring machine (AACMM), the U.S. Patent application owned together the 13/491st that such as on January 20th, 2010 submits to, the articulated arm coordinate measuring machine (AACMM) described in No. 176.In the present embodiment, scanner 20 can relate to the manual transfer arm section 126,128 of operator from primary importance to the movement of the second place.
For the embodiment with automatic equipment, movable equipment 120 also comprise be configured to energising to actuator with the controller 132 of transfer arm section 126,128.In one embodiment, controller 132 communicates with controller 134.To discuss in more detail as following, this configuration makes controller 132 can carry out mobile scanners have been 20 in response to the exception of obtained data.Should be appreciated that controller 132,134 can be incorporated in single processing unit or function can be distributed among several processing unit.
By referring to Figure 12 execution analysis, can locate and directional scanning device 20 with obtains expectation measurement result.In certain embodiments, the feature in measurement can benefit from the desired orientation of scanner.Such as, being approximately perpendicular to hole by making scanner camera head 124 be oriented, the measurement of the diameter in hole can be improved.In other embodiments, can location scanning device with reduce or minimize multipath interference possibility.Such analysis can based on can be used as diagnostic procedure a part cad model or can carry out before secondary moves by the data of scanner collected by initial position based at equipment 120 pairs of scanners 20.
Referring now to Fig. 7, the operation of scanner 20 and movable equipment 120 will be described.This process starts with block 136 by utilizing scanner 20 scanning object 34 in primary importance.In block 138, scanner 20 obtains for the point on the surface 32 of object 34 and determines coordinate data.Should be appreciated that movable equipment can mobile scanners have been 20 to obtain about the data of the surface point in desired region.In inquiry block 140, determine whether the coordinate data of a little 142 exists such as the abnormal of such as multipath interference or the need of the direction changed in order to the resolution or measuring accuracy obtaining raising.Should be appreciated that the point 142 of Fig. 6 can represent the region on a single point, dotted line or surface 32.If exception detected or need to improve precision; then process proceeds to block 144, in this block 144, and the position of movable equipment 120 mobile scanners have been 20; such as from primary importance to the second place, and in block 146, rescan region-of-interest to obtain three-dimensional coordinate data.Cycle for the treatment of is back to inquiry block 140, determines whether coordinate data still exists abnormal or whether expect to improve measuring accuracy in this inquiry block 140.If these situations, then mobile scanners have been 20 again, and process continues, until measurement result realizes aspiration level.Once acquisition coordinate data, process and enter block 148 from inquiry block 140, in this block 148, scan process continues, until scan desired region.
In embodiments where the scanner 20 includes a tactile probe (FIG. 10), the movement from the first position to the second position may be configured so that the tactile probe contacts the area of interest. Since the position of the tactile probe, and hence the position of the scanner, can be determined from the positions and orientations of the arm segments 126, 128, the three-dimensional coordinates of points on the surface 32 can be determined.
In some embodiments, measurement results obtained with the scanner 20 of FIGS. 8A, 8B may be impeded or corrupted by multipath interference. In other cases, the measurement results may not provide the desired resolution or accuracy to properly measure some characteristics of the surface 32, particularly edges, holes, or intricate features. In these cases, it may be desirable for an operator to use the remote probe 152 to interrogate points or areas on the surface 32. In the embodiment shown in FIGS. 8A, 8B, the scanner 20 includes a projector 156 and cameras 154, 155, the cameras 154, 155 being arranged at an angle relative to the projector 156 such that light emitted by the projector 156 and reflected off the surface 32 is received by one or both of the cameras 154, 155. The projector 156 and the cameras 154, 155 are arranged in a fixed geometric relationship so that triangulation principles may be used to determine the three-dimensional coordinates of points on the surface 32.
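The triangulation principle invoked here can be illustrated in two dimensions: the projector and a camera sit a known baseline apart, each measures an angle to the same illuminated surface point, and the law of sines gives the range. This is a textbook sketch, not the device's actual calibration model:

```python
import math

def triangulate_range(baseline, projector_angle_deg, camera_angle_deg):
    """2D triangulation: projector and camera are separated by a fixed
    baseline; each measures the angle (from the baseline) to the same
    surface point.  The triangle's interior angles sum to 180 degrees,
    and the law of sines yields the camera-to-point distance."""
    a = math.radians(projector_angle_deg)
    c = math.radians(camera_angle_deg)
    b = math.pi - a - c  # angle at the surface point
    # law of sines: range / sin(a) = baseline / sin(b)
    return baseline * math.sin(a) / math.sin(b)
```

For an equilateral arrangement (both angles 60 degrees) the range equals the baseline, which is an easy sanity check on the formula.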
In one embodiment, as shown in FIG. 8A, the projector 156 is configured to emit visible light 157 onto an area of interest 159 on the surface 32 of the object 34. The image of the illuminated area of interest 159 in one or both of the cameras 154, 155 may be used to confirm the three-dimensional coordinates of the illuminated area of interest 159.
The scanner 20 is configured to cooperate with the remote probe 152 so that an operator may bring the probe tip 166 into contact with the object surface 132 at the illuminated area of interest 159. In an embodiment, the remote probe 152 includes at least three non-collinear light spots 168. The light spots 168 may be spots of light produced, for example, by light emitting diodes (LEDs), or they may be reflective spots illuminated by an infrared or visible light source from the projector 156 or from another light source not shown in FIG. 8B. The infrared or visible light source in this case may be attached to the scanner 20 or may be placed external to the scanner 20. By determining the three-dimensional coordinates of the light spots 168 with the scanner and by using information about the geometry of the probe 152, the position of the probe tip 166 may be determined, thereby enabling the coordinates of the object surface 32 to be determined. A tactile probe used in this way eliminates potential problems from multipath interference and enables relatively accurate measurement of holes, edges, and detailed features. In an embodiment, the probe 166 is a tactile probe that may be activated by pressing an actuator button (not shown) on the probe, or the probe 166 may be a touch-trigger probe activated by contact with the surface 32. In response to the signal produced by the actuator button or touch-trigger probe, a communication circuit (not shown) transmits a signal to the scanner 20. In an embodiment, the light spots 168 are replaced with geometric patterns of light, which may include lines or curves.
Referring now to FIG. 9, a process is shown for using the stationary scanner 20 of FIGS. 8A, 8B together with the remote probe 152 to acquire coordinate data for points on the surface 32 of the object 34. The process begins at block 170 by scanning the surface 32 of the object 34. In block 172, the process acquires and determines three-dimensional coordinate data for the surface 32. The process then determines, in query block 174, whether the coordinate data exhibits an anomaly in an area 159, or whether there is a problem with the accuracy or resolution of the area 159. An anomaly may be, for example, invalid data discarded due to multipath interference. An anomaly may also be missing data caused by surface reflectance or by a lack of resolution around features such as openings or holes. Details of the diagnostic procedure for detecting (identifying) multipath interference and related problems are provided with reference to FIG. 12.
Once the area 159 has been identified, the scanner 20 may, in block 176, indicate to the operator that coordinate data for the area 159 is to be acquired via the remote probe 152. The area 159 may be indicated by emitting visible light 157 to illuminate the area 159. In one embodiment, the light 157 is emitted by the projector 156. The color of the light 157 may be changed to inform the operator of the type of anomaly or problem. For example, in the case of multipath interference, the light 157 may be colored red, while an area of low resolution may be colored green. The area may additionally be indicated on a display with a graphical representation (e.g., a CAD model) of the object.
The process then proceeds to block 178 to acquire an image of the remote probe 152 when the sensor 166 contacts the surface 32. The light spots 168, which may be, for example, LEDs or reflective targets, are received by one of the cameras 154, 155. Using best-fit techniques well known to mathematicians, the scanner 20 determines, in block 180, the three-dimensional coordinates of the probe center, from which the three-dimensional coordinates of the object surface 32 are determined in block 180. Once the probed points within the anomalous area 159 have been acquired, the process may proceed to block 182 to continue scanning the object 34 until the desired area has been scanned.
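The "well-known best-fit technique" referred to here is typically a least-squares rigid-body fit: the three (or more) imaged light spots are matched to their known positions in the probe's own coordinate frame, giving the rotation and translation that carry the probe frame into the scanner frame; the same transform then locates the probe tip. A minimal sketch using the SVD-based Kabsch method is shown below; numpy is assumed available and all coordinates are invented for illustration:

```python
import numpy as np

def fit_probe_tip(model_pts, measured_pts, tip_model):
    """Least-squares rigid fit (Kabsch): find the rotation R and
    translation t that map the probe's model-frame spot positions onto
    their measured 3D coordinates, then transform the known model-frame
    tip position into the scanner frame."""
    model = np.asarray(model_pts, float)
    meas = np.asarray(measured_pts, float)
    cm, cs = model.mean(axis=0), meas.mean(axis=0)
    H = (model - cm).T @ (meas - cs)        # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cs - R @ cm
    return R @ np.asarray(tip_model, float) + t
```

With at least three non-collinear spots the fit is fully constrained; additional spots simply over-determine the least-squares solution and average out image noise.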
Referring now to FIG. 10, another embodiment of the scanner 20 is shown, which may be hand-held by an operator during operation. In this embodiment, the housing 22 may include a handle 186 that enables the operator to hold the scanner 20 during operation. The housing 22 includes a projector 188 and a camera 190 arranged at an angle relative to each other such that light 192 emitted by the projector reflects off the surface 32 and is received by the camera 190. The scanner 20 of FIG. 10 operates in substantially the same manner as the embodiments of FIG. 1 and FIG. 3, using triangulation principles to acquire three-dimensional coordinate data for points on the surface 32.
The scanner 20 also includes an integrated probe member 184. The probe member 184 includes a sensor 194 at one end. The sensor 194 may be, for example, a tactile probe that responds to the operator pressing an actuator button (not shown), or the sensor 194 may be a touch-trigger probe that responds to contact with the surface 32. As discussed in more detail below, the probe member 184 enables the operator to acquire the coordinates of points on the surface 32 by bringing the sensor 194 into contact with the surface 32.
The projector 188, the camera 190, and the actuator circuit of the sensor 194 are electrically coupled to a controller 50 disposed within the housing 22. The controller 50 may include one or more microprocessors, digital signal processors, memory, and signal conditioning circuits. The scanner 20 may further include an actuator (not shown), such as on the handle 186, which the operator may manually activate to initiate operation and data capture by the scanner 20. In one embodiment, the controller 50 performs the image processing that determines the X, Y, Z coordinate data of the point cloud representing the surface 32 of the object 34. The coordinate data may be stored locally, for example in volatile or nonvolatile memory 54. The memory may be removable, such as a flash drive or a memory card, for example. In other embodiments, the scanner 20 has a communication circuit 52 that enables the scanner 20 to transmit the coordinate data to a remote processing system 56. The communication medium 58 between the scanner 20 and the remote processing system 56 may be wired (e.g., Ethernet) or wireless (e.g., Bluetooth, IEEE 802.11). In one embodiment, the coordinate data is determined by the remote processing system 56, with the scanner 20 transmitting the acquired images over the communication medium 58.
Referring now to FIG. 11, the operation of the scanner 20 of FIG. 10 will be described. The process begins at block 196 with the operator manually moving the scanner 20 to scan the surface 32 of the object 34. Three-dimensional coordinates are determined and acquired in block 198. In query block 200, it is determined whether anomalies exist in the coordinate data or whether improved accuracy is needed. As described above, anomalies may exist for a number of reasons, such as multipath interference, variation in surface reflectance, or low resolution of features. If an anomaly exists, the process proceeds to block 202, in which an area 204 is indicated to the operator. The area 204 may be indicated by projecting visible light 192 onto the surface 32 with the projector 188. In one embodiment, the light 192 is colored to inform the operator of the type of anomaly detected.
Then, in block 206, the operator proceeds to move the scanner from the first position to a second position (shown in dashed lines). In the second position, the sensor 194 contacts the surface 32. The position and orientation (six degrees of freedom) of the scanner 20 in the second position may be determined using well-known best-fit methods based on the image acquired by the camera 190. Since the dimensions and arrangement of the sensor 194 relative to the mechanical structure of the scanner 20 are known, the three-dimensional coordinate data of the points in the area 204 can be determined in block 208. The process then proceeds to block 210, in which scanning of the object continues. The scanning process continues until the desired area has been scanned.
General approaches may be used to evaluate not only multipath interference but also overall quality, including resolution and the effects of material type, surface quality, and geometry. Referring also to FIG. 12, in an embodiment, a method 4600 may be carried out automatically under computer control. A step 4602 determines whether information on the three-dimensional coordinates of the object under test is available. A first type of three-dimensional information is CAD data. CAD data usually indicates the nominal dimensions of the object under test. A second type of three-dimensional information is measured three-dimensional data, for example data previously measured with a scanner or other device. In some cases, step 4602 may include a further step of aligning the frame of reference of the coordinate measurement device (for example, a laser tracker or six-DOF scanner accessory) with the frame of reference of the object. In an embodiment, this is done by measuring at least three points on the surface of the object with the laser tracker.
If the answer to the question posed in step 4602 is that three-dimensional information is available, then, in step 4604, a computer or processor is used to calculate the susceptibility of the object measurement to multipath interference. In an embodiment, this is done by projecting each ray of light emitted by the scanner projector and calculating the angle or reflectance for each case. The computer or software identifies each region of the object surface that is susceptible to error as a result of multipath interference. Step 4604 may also carry out an analysis of the susceptibility to multipath error for a variety of positions of the six-DOF probe relative to the object under test. In some cases, as described above, multipath interference may be avoided or minimized by selecting a suitable position and orientation of the six-DOF probe relative to the object under test. If the answer to the question posed in step 4602 is that three-dimensional information is not available, then a step 4606 measures the three-dimensional coordinates of the object surface using any desired or preferred measurement method. Following the calculation of multipath interference, a step 4608 may be carried out to evaluate other aspects of expected scan quality. One such quality factor is whether the resolution of the scan is sufficient for the features of the object under test. For example, if the resolution of a device is 3 mm and there are sub-millimeter features for which valid scan data is desired, then these problem regions of the object should be noted for later corrective action. Another quality factor, partly related to resolution, is the ability to measure edges of the object and edges of holes. Knowledge of scanner performance makes it possible to determine whether the scanner resolution is good enough for a given edge. Another quality factor is the amount of light expected to return from a given feature. Little light can be expected to return to the scanner, for example, from the inside of a small hole or from a glancing angle. Little light can also be expected from materials of certain types and colors. Certain types of material may have a large penetration depth for the light from the scanner, and in this case good measurement results cannot be expected. In some cases, an automatic program may ask the user for supplementary information. For example, if a computer program is carrying out steps 4604 and 4608 based on CAD data, it may not know the type of material in use or the surface characteristics of the object under test. In these cases, step 4608 may include a further step of obtaining material characteristics for the object under test.
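The per-ray analysis of step 4604 can be sketched as follows. For each projected ray, the specular reflection direction is computed from the local surface normal; a near-grazing reflection sends most of the light on toward other parts of the surface rather than back to the camera, which serves here as one simple, invented flag for multipath-prone and low-return regions. The 75-degree threshold is an illustrative assumption, not a figure from the patent:

```python
import numpy as np

def reflect(direction, normal):
    """Specular reflection of a unit ray about a unit surface normal:
    r = d - 2 (d . n) n."""
    d = np.asarray(direction, float)
    n = np.asarray(normal, float)
    return d - 2.0 * np.dot(d, n) * n

def multipath_prone(incident_dir, normal, grazing_limit_deg=75.0):
    """Flag a surface point when the reflected ray leaves at a steep
    angle from the normal, so the light tends to graze onward toward
    other surfaces (multipath) and little returns to the camera.
    The grazing limit is an invented illustrative threshold."""
    r = reflect(incident_dir, normal)
    n = np.asarray(normal, float)
    cos_off_normal = abs(np.dot(r, n))
    angle = np.degrees(np.arccos(np.clip(cos_off_normal, -1.0, 1.0)))
    return bool(angle > grazing_limit_deg)
```

A full implementation of step 4604 would also trace each reflected ray against the CAD or measured surface model to see whether it lands on another imaged region; this sketch only classifies the local reflection geometry.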
After the analyses of steps 4604 and 4608, a step 4610 determines whether further diagnostic procedures should be carried out. A first example of a possible diagnostic procedure is a step 4612 of projecting a stripe at a preferred angle and noting whether multipath interference is observed. General indications of multipath interference for a projected line stripe were discussed above with reference to FIG. 5. Another example of a diagnostic step is a step 4614 of projecting a collection of lines aligned in the direction of epipolar lines onto the source pattern of light, for example the source pattern of light 30 from the projector 36 of FIG. 1. For the case in which the lines of light in the source pattern are aligned to the epipolar lines, these lines will appear as straight lines in the image plane on the photosensitive array. The use of epipolar lines is discussed in more detail in commonly owned U.S. Patent Application No. 13/443,946 filed April 11, 2012. If these patterns on the photosensitive array are not straight lines, or if the lines are blurred or noisy, then a problem is likely indicated, possibly as a result of multipath interference.
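The epipolar diagnostic of step 4614 reduces to a straightness test on the imaged lines: fit a line to the imaged points and compare the RMS residual to a pixel tolerance. A sketch follows; the 0.5-pixel tolerance is an invented placeholder, and numpy is assumed available:

```python
import numpy as np

def line_is_straight(points, tol=0.5):
    """Fit y = m*x + b to imaged line points by least squares and
    report whether the RMS residual stays below a pixel tolerance.
    Curved, blurred, or noisy imaged lines (a possible sign of
    multipath interference) produce large residuals."""
    pts = np.asarray(points, float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([x, np.ones_like(x)])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    rms = np.sqrt(np.mean((A @ coef - y) ** 2))
    return bool(rms <= tol)
```

In practice, the test would run per projected line over the detected line centers on the photosensitive array, with the tolerance chosen from the camera's known point-spread width.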
A step 4616 selects a combination of preferred actions based on the analyses and diagnostic procedures carried out. If speed of measurement is particularly important, a step 4618 of measuring with a coded 2D (structured) pattern of light may be preferred. If greater accuracy is more important, a step 4620 may be preferred, which measures with a 2D (structured) pattern of light using sequential patterns, for example a sequence of sinusoidal patterns of varying phase and pitch. If method 4618 or 4620 is selected, it may also be desirable to select a step 4628, which repositions the scanner, in other words adjusts the position and orientation of the scanner to the position that minimizes the multipath interference and specular reflections (glints) identified by the analysis of step 4604. Such an indication may be provided to the user by illuminating problem regions with light from the scanner projector or by displaying such regions on a monitor display. Alternatively, the next steps in the measurement procedure may be selected automatically by a computer or processor. If the preferred scanner position cannot eliminate multipath interference and glints, several options are available. In some cases, the measurement can be repeated by repositioning the scanner and combining the valid measurement results. In other cases, alternative measurement steps may be added to the procedure or performed instead of the structured-light measurement. As discussed previously, a step 4622 of scanning a stripe of light provides a convenient way of obtaining information over an area with a reduced chance of problems from multipath interference. A step 4624 of sweeping a small spot of light over an area of interest further reduces the chance of problems from multipath interference. A step of measuring a region of the object surface with a tactile probe eliminates the possibility of multipath interference. A tactile probe provides a known resolution based on the size of the probe tip, and it avoids problems with low reflectance of light or large optical penetration depth that may be present in some objects under test.
In most cases, in a step 4630, the quality of the data collected in a combination of steps 4618-4628 is evaluated based on the data obtained from the measurements, combined with the results of the analyses carried out previously. If the quality is found to be acceptable in a step 4632, the measurement is completed in a step 4634. Otherwise, the analysis resumes at step 4604. In some cases, the 3D information may not have been as accurate as desired. In this case, repeating some of the earlier steps may be helpful.
While the invention has been described in detail in connection with only a limited number of embodiments, it should be readily understood that the invention is not limited to such disclosed embodiments. Rather, the invention can be modified to incorporate any number of variations, alterations, substitutions, or equivalent arrangements not heretofore described but which are commensurate with the spirit and scope of the invention. Additionally, while various embodiments of the invention have been described, it is to be understood that aspects of the invention may include only some of the described embodiments. Accordingly, the invention is not to be seen as limited by the foregoing description, but is only limited by the scope of the appended claims.
Claims (amended under Article 19 of the Treaty)
1. A non-contact optical three-dimensional measuring apparatus, comprising:
an assembly that includes a first projector, a first camera, a second projector, and a second camera, wherein the first projector, the first camera, the second projector, and the second camera are fixed in relation to one another, the first projector having a first light source, the first projector configured to emit onto a surface of an object a first light having at least one pattern, the first camera having a first lens and a first photosensitive array, the first camera configured to receive a first portion of the first light reflected off the surface and to produce a first signal in response, the first camera having a first field of view, the first field of view being a first angular viewing region of the first camera, the second projector having a second light source, the second projector configured to emit a second light onto the surface of the object, the second camera having a second lens and a second photosensitive array, the second camera configured to receive a second portion of the second light reflected off the surface and to produce a second signal in response, the second camera having a second field of view, the second field of view being a second angular viewing region of the second camera, the second field of view differing from the first field of view; and
a processor electrically coupled to the first projector, the second projector, the first camera, and the second camera, the processor executing computer-executable program code that, when executed by the processor, performs operations comprising: causing the first signal to be collected at a first time and causing the second signal to be collected at a second time different from the first time; determining three-dimensional coordinates of a first point on the surface based at least in part on the first signal; and determining three-dimensional coordinates of a second point on the surface based at least in part on the second signal.
2. The non-contact optical three-dimensional measuring apparatus of claim 1, wherein the second light is a line of light extended along a direction perpendicular to the direction of propagation of the second light.
3. The non-contact optical three-dimensional measuring apparatus of claim 1, wherein the at least one pattern includes at least three non-collinear pattern elements.
4. The non-contact optical three-dimensional measuring apparatus of claim 3, wherein the second light includes a second pattern, the second pattern having at least three non-collinear pattern elements.
5. The non-contact optical three-dimensional measuring apparatus of claim 2, wherein the line of light is a line pattern swept over time.
6. The non-contact optical three-dimensional measuring apparatus of claim 2, wherein the light is a spot of light swept over time.
7. The non-contact optical three-dimensional measuring apparatus of claim 1, wherein the first field of view is at least twice as large as the second field of view.
8. The non-contact optical three-dimensional measuring apparatus of claim 1, wherein the first photosensitive array includes first pixels, the first pixels configured to capture light reflected from a first area of the surface, the second photosensitive array includes second pixels, the second pixels configured to capture light reflected from a second area of the surface, and wherein the second area is smaller than the first area.
9. a method for the three-dimensional coordinate on the surface determining object, described method comprises:
providing an assembly that includes a first projector, a first camera, a second projector, and a second camera, wherein the first projector, the first camera, the second projector, and the second camera are fixed in relation to one another, there being a first distance between the first projector and the first camera and a second distance between the second projector and the second camera, the first projector having a first light source, the first projector configured to emit onto the surface a first light having at least one pattern, the first camera having a first lens and a first photosensitive array, the first camera configured to receive a first portion of the first light reflected off the surface, the first camera having a first field of view, the first field of view being a first angular viewing region of the first camera, the second projector having a second light source, the second projector configured to emit a second light onto the surface, the second camera having a second lens and a second photosensitive array, the second camera configured to receive a second portion of the second light reflected off the surface, the second camera having a second field of view, the second field of view being a second angular viewing region of the second camera, the second field of view differing from the first field of view;
providing a processor electrically coupled to the first projector, the first camera, the second projector, and the second camera;
in a first instance, emitting from the first projector onto the surface the first light having the at least one pattern;
in the first instance, acquiring with the first camera a first image of a first area of the surface and, in response, sending a first signal to the processor;
determining, by the processor, a first set of three-dimensional coordinates of first points in the first area, wherein the first set is based at least in part on the at least one pattern, the first signal, and the first distance;
in a second instance, emitting the second light from the second projector onto the surface;
in the second instance, acquiring with the second camera a second image of a second area of the surface and, in response, sending a second signal to the processor; and
determining, by the processor, a second set of three-dimensional coordinates of second points in the second area, wherein the second set is based at least in part on the second light, the second signal, and the second distance.
10. The method of claim 9, further comprising:
moving the assembly from a first position to a second position;
wherein the assembly is in the first position in the first instance and the assembly is in the second position in the second instance; and
wherein a portion of the first area and a portion of the second area share a common overlap region.
11. The method of claim 10, wherein the step of moving the assembly includes guiding an operator to move the assembly to the second position by activating an indicator light on the assembly.
12. The method of claim 10, wherein the step of moving the assembly includes projecting a third light onto the object to indicate the direction of movement toward the second position.
13. The method of claim 11, wherein the step of moving the assembly includes indicating, on a graphical representation of the object on a display, a portion of the object to be scanned.
14. (Deleted)
15. The method of claim 10, further comprising:
providing a computer-aided design (CAD) model of the object being measured;
based on the CAD model, checking for the presence of multipath interference by determining whether a ray of light from the first projector reflects from a third point of the object onto a fourth point of the object, wherein the fourth point is a point imaged by the first camera; and
determining the second position based at least in part on the presence of multipath interference.
16. The method of claim 15, wherein, in the step of emitting the second light from the second projector onto the surface in the second instance, the second light is in the form of a line or a spot.
17. The method of claim 16, wherein, in the step of emitting the second light in the second instance, the second light is in the form of a line swept over time or a spot swept over time, the line being along a direction perpendicular to the direction of propagation of the second light.
18. The method of claim 10, further comprising:
obtaining a plurality of three-dimensional coordinates of the surface of the object by scanning at least a portion of the surface of the object; and
based on the obtained plurality of three-dimensional coordinates, checking for the presence of multipath interference by determining whether a ray of light from the first projector reflects from a third point of the object onto a fourth point of the object, wherein the fourth point is a point imaged by the first camera; and
determining the second position based at least in part on the presence of multipath interference.
19. The method of claim 18, wherein, in the step of emitting the second light from the second projector onto the surface in the second instance, the second light is in the form of a line or a spot, the line being along a direction perpendicular to the direction of propagation of the second light.
20. The method of claim 19, wherein, in the step of emitting the second light from the second projector onto the surface in the second instance, the line or the spot is swept over time.
21. The method of claim 9, further comprising: determining a resolution of the three-dimensional coordinates of the first set.
22. (Original) The method of claim 9, wherein the at least one pattern is a time-varying pattern.
23. The method of claim 9, wherein, in the step of providing the assembly, the first field of view is at least twice as large as the second field of view.

Claims (22)

1. a non-contact optical three-dimensional measuring apparatus, comprising:
Assembly, it comprises the first projector, first camera head, second projector and the second camera head, wherein, described first projector, described first camera head, described second projector and described second camera head are relative to each other fixed, described first projector has the first light source, described first projector is configured to first light with at least one pattern to be transmitted on the surface of object, described first camera head has the first lens and the first light-sensitive array, described first camera head is configured to receive from the Part I of described first light of described surface reflection and responsively produces the first signal, described first camera head has the first visual field of the first angle viewing areas as described first camera head, described second projector has secondary light source, described second projector is configured to the second light to be transmitted on the described surface of described object, described second camera head has the second lens and the second light-sensitive array, described second camera head is configured to receive from the Part II of described second light of described surface reflection and responsively produces secondary signal, described second camera head has the second visual field of the second angle viewing areas as described second camera head, described second visual field is different from described first visual field, and
Processor, it is electrically coupled to described first projector, described second projector, described first camera head and described second camera head, and perform computer executable program code, described computer executable program code performs the operation comprising following content when being performed by described processor: make collect described first signal in the very first time and make to collect described secondary signal in second time different from the described very first time; The three-dimensional coordinate of first on described surface is determined at least in part based on described first signal; And the three-dimensional coordinate of the second point on described surface is determined at least in part based on described secondary signal.
2. non-contact optical three-dimensional measuring apparatus according to claim 1, wherein, described second only along the light in the direction vertical with the direction of propagation of described second light.
3. non-contact optical three-dimensional measuring apparatus according to claim 1, wherein, at least one pattern described comprises at least three non-colinear pattern elements.
4. non-contact optical three-dimensional measuring apparatus according to claim 3, wherein, described second light comprises the second pattern, and described second pattern has at least three non-colinear pattern elements.
5. non-contact optical three-dimensional measuring apparatus according to claim 2, wherein, described light is temporally inswept line pattern.
6. The non-contact optical three-dimensional measuring apparatus according to claim 2, wherein the light is a spot of light swept over time.
7. The non-contact optical three-dimensional measuring apparatus according to claim 1, wherein the first field of view is at least twice as large as the second field of view.
8. The non-contact optical three-dimensional measuring apparatus according to claim 1, wherein the first photosensitive array comprises a first pixel configured to capture light reflected from a first area of the surface, and the second photosensitive array comprises a second pixel configured to capture light reflected from a second area of the surface, the second area being smaller than the first area.
9. A method of determining three-dimensional coordinates of a surface of an object, the method comprising:
providing an assembly comprising a first projector, a first camera, a second projector, and a second camera, wherein the first projector, the first camera, the second projector, and the second camera are fixed relative to one another, a first distance separates the first projector and the first camera, and a second distance separates the second projector and the second camera; the first projector has a first light source, the first projector being configured to emit onto the surface a first light having at least one pattern; the first camera has a first lens and a first photosensitive array, the first camera being configured to receive a first portion of the first light reflected from the surface; the first camera has a first field of view, the first field of view being a first angular viewing region of the first camera; the second projector has a second light source, the second projector being configured to emit a second light onto the surface; the second camera has a second lens and a second photosensitive array, the second camera being configured to receive a second portion of the second light reflected from the surface; the second camera has a second field of view, the second field of view being a second angular viewing region of the second camera, the second field of view being different from the first field of view;
providing a processor electrically coupled to the first projector, the first camera, the second projector, and the second camera;
in a first instance, emitting from the first projector onto the surface the first light having the at least one pattern;
in the first instance, acquiring a first image of the surface with the first camera and, in response, sending a first signal to the processor;
determining a first set of three-dimensional coordinates of first points on the surface, the first set being based at least in part on the at least one pattern, the first signal, and the first distance;
carrying out a diagnostic procedure to determine a quality factor of the first set;
in a second instance, emitting the second light from the second projector onto the surface;
in the second instance, acquiring a second image of the surface with the second camera and, in response, sending a second signal to the processor; and
determining a second set of three-dimensional coordinates of second points on the surface, the second set being based at least in part on the second light, the second signal, and the second distance.
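The two-instance method above — a first wide-field structured-light measurement, a diagnostic check, then a second measurement with a different projector/camera pair — can be sketched as follows. Everything here (the class name, the toy validity-based quality test, the merge rule, and the 0.8 threshold) is an illustrative assumption, not the claimed procedure:

```python
from dataclasses import dataclass

@dataclass
class MeasurementPair:
    """Stand-in for a fixed projector/camera pair; measure() returns
    (point, is_valid) tuples in place of real triangulated coordinates."""
    points: list

    def measure(self):
        return list(self.points)

def diagnose(point_set):
    # Toy quality factor: fraction of points flagged valid.
    return sum(1 for _, ok in point_set if ok) / len(point_set)

def two_instance_scan(first_pair, second_pair, threshold=0.8):
    first_set = first_pair.measure()           # first instance
    if diagnose(first_set) >= threshold:       # diagnostic procedure
        return first_set
    second_set = second_pair.measure()         # second instance
    # Keep the valid first-instance points and add the re-measured ones.
    return [p for p in first_set if p[1]] + second_set
```

The design point the claims describe is that the second instance only runs when the diagnostic on the first set indicates it is needed, e.g. because of multipath interference or insufficient resolution.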
10. The method according to claim 9, wherein the step of emitting the second light from the second projector in the second instance is preceded by a step of moving the assembly from a first position to a second position.
11. The method according to claim 10, wherein the step of moving the assembly comprises guiding an operator to move the assembly to the second position by activating an indicator light on the assembly.
12. The method according to claim 10, wherein the step of moving the assembly comprises projecting a light onto the object to indicate a direction of movement toward the second position.
13. The method according to claim 11, wherein the step of moving the assembly comprises indicating, on a graphical representation of the object shown on a display, a portion of the object to be scanned.
14. The method according to claim 9, wherein, in the step of carrying out the diagnostic procedure to determine the quality factor of the first set, the quality factor is based at least in part on a presence of multipath interference.
15. The method according to claim 14, wherein the step of carrying out the diagnostic procedure to determine the quality factor of the first set comprises:
providing a computer-aided design (CAD) model of the object being measured;
verifying, based on the CAD model, the presence of multipath interference by determining whether a ray of light from the first projector is reflected from a first surface point of the object onto a second surface point of the object, the second surface point being a point imaged by the first camera; and
determining the quality factor based at least in part on the presence of multipath interference.
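The CAD-based check in claim 15 amounts to tracing a projector ray to a first surface point, reflecting it about the local surface normal, and testing whether the reflected ray lands on a second surface point imaged by the camera. A geometric sketch under simplified assumptions (perfect specular reflection; the vertical-wall secondary surface and both helper names are illustrative, not from the patent):

```python
def reflect(d, n):
    """Reflect direction d about unit normal n: r = d - 2(d.n)n."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))

def secondary_hit(origin, direction, wall_x):
    """Where a reflected ray strikes a vertical wall at x = wall_x,
    or None if the ray travels away from the wall (toy secondary surface)."""
    if direction[0] <= 0:
        return None
    t = (wall_x - origin[0]) / direction[0]
    return tuple(o + t * d for o, d in zip(origin, direction))
```

For example, a projector ray striking a glossy floor point at the origin with incoming direction (1, 0, -1) reflects off the floor normal (0, 0, 1) into (1, 0, 1); if that reflected ray reaches a wall point the camera is imaging, that point would be flagged for multipath and the quality factor reduced.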
16. The method according to claim 15, wherein, in the step of emitting the second light from the second projector onto the surface in the second instance, the second light is in the form of a line or a spot.
17. The method according to claim 16, wherein, in the step of emitting the second light in the second instance, the second light is in the form of a line swept over time or a spot swept over time, the line extending in a direction perpendicular to a direction of propagation of the second light.
18. The method according to claim 14, further comprising a step of obtaining a plurality of three-dimensional coordinates of the surface of the object by scanning at least a portion of the surface of the object; and
wherein the step of carrying out the diagnostic procedure further comprises: verifying, based on the plurality of three-dimensional coordinates obtained, the presence of multipath interference by determining whether a ray of light from the first projector is reflected from a first surface point of the object onto a second surface point of the object, the second surface point being a point imaged by the first camera; and determining the quality factor based at least in part on the presence of multipath interference.
19. The method according to claim 18, wherein, in the step of emitting the second light from the second projector onto the surface in the second instance, the second light is in the form of a line or a spot, the line extending in a direction perpendicular to a direction of propagation of the second light.
20. The method according to claim 19, wherein, in the step of emitting the second light from the second projector onto the surface in the second instance, the line or the spot is swept over time.
21. The method according to claim 9, wherein the step of determining the quality factor of the first set further comprises:
determining a resolution of the three-dimensional coordinates of the first set; and
determining the quality factor based at least in part on the resolution.
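A quality factor combining the resolution term of claim 21 with the multipath test of claim 14 might be scored as below; the linear degradation and the 0.5 multipath penalty are arbitrary illustrative choices, not values from the patent:

```python
def quality_factor(point_spacing_mm, required_spacing_mm, multipath_detected):
    """Score in [0, 1]: full credit when the measured point spacing meets the
    required spacing, degrading linearly, halved if multipath was detected."""
    resolution_term = min(1.0, required_spacing_mm / point_spacing_mm)
    return resolution_term * (0.5 if multipath_detected else 1.0)
```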
22. The method according to claim 9, wherein the at least one pattern is a time-varying pattern.
CN201480015935.5A 2013-03-15 2014-03-05 Three-dimensional coordinate scanner and method of operation Pending CN105190232A (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US201361791797P 2013-03-15 2013-03-15
US61/791,797 2013-03-15
US13/932,267 2013-07-01
US13/932,267 US9482529B2 (en) 2011-04-15 2013-07-01 Three-dimensional coordinate scanner and method of operation
PCT/US2014/020481 WO2014149702A1 (en) 2013-03-15 2014-03-05 Three-dimensional coordinate scanner and method of operation

Publications (1)

Publication Number Publication Date
CN105190232A true CN105190232A (en) 2015-12-23

Family

ID=50382644

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201480015935.5A Pending CN105190232A (en) 2013-03-15 2014-03-05 Three-dimensional coordinate scanner and method of operation

Country Status (5)

Country Link
JP (1) JP6355710B2 (en)
CN (1) CN105190232A (en)
DE (1) DE112014001483T5 (en)
GB (1) GB2527993B (en)
WO (1) WO2014149702A1 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106500627A * 2016-10-19 2017-03-15 杭州思看科技有限公司 Three-dimensional scanning method and scanner with multiple lasers of different wavelengths
CN106500628A * 2016-10-19 2017-03-15 杭州思看科技有限公司 Three-dimensional scanning method and scanner with multiple lasers of different wavelengths
CN107131848A * 2016-02-26 2017-09-05 福禄理昂·伟洛米泽 Optical triangulation device enabling fast, dense shape detection
CN108603934A * 2016-01-28 2018-09-28 讯宝科技有限责任公司 Method and system for high precision localization using depth values
CN109029389A * 2017-06-12 2018-12-18 赫克斯冈技术中心 Device, system and method for displaying measurement gaps
CN109642789A * 2016-06-24 2019-04-16 3形状股份有限公司 3D scanner using a structured beam of probe light
CN110446906A (en) * 2017-02-03 2019-11-12 莫迪特3D公司 Three-dimensional scanning device and method
CN110874815A (en) * 2018-08-30 2020-03-10 康耐视公司 Method and apparatus for generating a three-dimensional reconstruction of an object with reduced distortion
CN114080535A (en) * 2019-06-28 2022-02-22 佳能株式会社 Measurement apparatus, imaging apparatus, measurement system, control method, and program

Families Citing this family (59)

Publication number Priority date Publication date Assignee Title
US9482755B2 (en) 2008-11-17 2016-11-01 Faro Technologies, Inc. Measurement system having air temperature compensation between a target and a laser tracker
US8908995B2 (en) 2009-01-12 2014-12-09 Intermec Ip Corp. Semi-automatic dimensioning with imager on a portable device
US9400170B2 (en) 2010-04-21 2016-07-26 Faro Technologies, Inc. Automatic measurement of dimensional data within an acceptance region by a laser tracker
US9377885B2 (en) 2010-04-21 2016-06-28 Faro Technologies, Inc. Method and apparatus for locking onto a retroreflector with a laser tracker
US9772394B2 (en) 2010-04-21 2017-09-26 Faro Technologies, Inc. Method and apparatus for following an operator and locking onto a retroreflector with a laser tracker
GB2518543A (en) 2011-03-03 2015-03-25 Faro Tech Inc Target apparatus and method
US9686532B2 (en) 2011-04-15 2017-06-20 Faro Technologies, Inc. System and method of acquiring three-dimensional coordinates using multiple coordinate measurement devices
US9482529B2 (en) 2011-04-15 2016-11-01 Faro Technologies, Inc. Three-dimensional coordinate scanner and method of operation
GB2504890A (en) 2011-04-15 2014-02-12 Faro Tech Inc Enhanced position detector in laser tracker
JP6099675B2 (en) 2012-01-27 2017-03-22 ファロ テクノロジーズ インコーポレーテッド Inspection method by barcode identification
US9779546B2 (en) 2012-05-04 2017-10-03 Intermec Ip Corp. Volume dimensioning systems and methods
US10007858B2 (en) 2012-05-15 2018-06-26 Honeywell International Inc. Terminals and methods for dimensioning objects
US10321127B2 (en) 2012-08-20 2019-06-11 Intermec Ip Corp. Volume dimensioning system calibration systems and methods
US9939259B2 (en) 2012-10-04 2018-04-10 Hand Held Products, Inc. Measuring object dimensions using mobile computer
US20140104413A1 (en) 2012-10-16 2014-04-17 Hand Held Products, Inc. Integrated dimensioning and weighing system
US9080856B2 (en) 2013-03-13 2015-07-14 Intermec Ip Corp. Systems and methods for enhancing dimensioning, for example volume dimensioning
US9041914B2 (en) 2013-03-15 2015-05-26 Faro Technologies, Inc. Three-dimensional coordinate scanner and method of operation
US10228452B2 (en) 2013-06-07 2019-03-12 Hand Held Products, Inc. Method of error correction for 3D imaging device
US9464885B2 (en) 2013-08-30 2016-10-11 Hand Held Products, Inc. System and method for package dimensioning
US9395174B2 (en) 2014-06-27 2016-07-19 Faro Technologies, Inc. Determining retroreflector orientation by optimizing spatial fit
US9823059B2 (en) 2014-08-06 2017-11-21 Hand Held Products, Inc. Dimensioning system with guided alignment
US10810715B2 (en) 2014-10-10 2020-10-20 Hand Held Products, Inc System and method for picking validation
US10775165B2 (en) 2014-10-10 2020-09-15 Hand Held Products, Inc. Methods for improving the accuracy of dimensioning-system measurements
GB2531928B (en) * 2014-10-10 2018-12-12 Hand Held Prod Inc Image-stitching for dimensioning
US9779276B2 (en) 2014-10-10 2017-10-03 Hand Held Products, Inc. Depth sensor based auto-focus system for an indicia scanner
US9762793B2 (en) 2014-10-21 2017-09-12 Hand Held Products, Inc. System and method for dimensioning
US9897434B2 (en) 2014-10-21 2018-02-20 Hand Held Products, Inc. Handheld dimensioning system with measurement-conformance feedback
US9557166B2 (en) 2014-10-21 2017-01-31 Hand Held Products, Inc. Dimensioning system with multipath interference mitigation
US10060729B2 (en) 2014-10-21 2018-08-28 Hand Held Products, Inc. Handheld dimensioner with data-quality indication
US9752864B2 (en) 2014-10-21 2017-09-05 Hand Held Products, Inc. Handheld dimensioning system with feedback
US9964402B2 (en) * 2015-04-24 2018-05-08 Faro Technologies, Inc. Two-camera triangulation scanner with detachable coupling mechanism
US9786101B2 (en) 2015-05-19 2017-10-10 Hand Held Products, Inc. Evaluating image values
US10066982B2 (en) 2015-06-16 2018-09-04 Hand Held Products, Inc. Calibrating a volume dimensioner
US9857167B2 (en) 2015-06-23 2018-01-02 Hand Held Products, Inc. Dual-projector three-dimensional scanner
US20160377414A1 (en) 2015-06-23 2016-12-29 Hand Held Products, Inc. Optical pattern projector
US9835486B2 (en) 2015-07-07 2017-12-05 Hand Held Products, Inc. Mobile dimensioner apparatus for use in commerce
EP3118576B1 (en) 2015-07-15 2018-09-12 Hand Held Products, Inc. Mobile dimensioning device with dynamic accuracy compatible with nist standard
US20170017301A1 (en) 2015-07-16 2017-01-19 Hand Held Products, Inc. Adjusting dimensioning results using augmented reality
US10094650B2 (en) 2015-07-16 2018-10-09 Hand Held Products, Inc. Dimensioning and imaging items
US10249030B2 (en) 2015-10-30 2019-04-02 Hand Held Products, Inc. Image transformation for indicia reading
US10225544B2 (en) 2015-11-19 2019-03-05 Hand Held Products, Inc. High resolution dot pattern
US10393515B2 (en) 2016-01-20 2019-08-27 Mitsubishi Electric Corporation Three-dimensional scanner and measurement assistance processing method for same
US10025314B2 (en) 2016-01-27 2018-07-17 Hand Held Products, Inc. Vehicle positioning and object avoidance
US20170299379A1 (en) * 2016-04-15 2017-10-19 Lockheed Martin Corporation Precision Hand-Held Scanner
US10339352B2 (en) 2016-06-03 2019-07-02 Hand Held Products, Inc. Wearable metrological apparatus
US9940721B2 (en) 2016-06-10 2018-04-10 Hand Held Products, Inc. Scene change detection in a dimensioner
US10163216B2 (en) 2016-06-15 2018-12-25 Hand Held Products, Inc. Automatic mode switching in a volume dimensioner
US10917626B2 (en) 2016-11-23 2021-02-09 Microsoft Technology Licensing, Llc Active illumination 3D imaging system
US10909708B2 2016-12-09 2021-02-02 Hand Held Products, Inc. Calibrating a dimensioner using ratios of measurable parameters of optically-perceptible geometric elements
US11047672B2 (en) 2017-03-28 2021-06-29 Hand Held Products, Inc. System for optically dimensioning
US10430958B2 (en) 2017-07-11 2019-10-01 Microsoft Technology Licensing, Llc Active illumination 3D zonal imaging system
US10901073B2 (en) 2017-07-11 2021-01-26 Microsoft Technology Licensing, Llc Illumination for zoned time-of-flight imaging
US10733748B2 (en) 2017-07-24 2020-08-04 Hand Held Products, Inc. Dual-pattern optical 3D dimensioning
US10935376B2 (en) * 2018-03-30 2021-03-02 Koninklijke Philips N.V. System and method for 3D scanning
US10584962B2 (en) 2018-05-01 2020-03-10 Hand Held Products, Inc System and method for validating physical-item security
US11639846B2 (en) 2019-09-27 2023-05-02 Honeywell International Inc. Dual-pattern optical 3D dimensioning
CN113514008B (en) * 2020-04-10 2022-08-23 杭州思看科技有限公司 Three-dimensional scanning method, three-dimensional scanning system, and computer-readable storage medium
CN112945142B (en) * 2021-02-02 2022-12-06 江西应用科技学院 Object three-dimensional measurement system and method based on structured light
GB202101612D0 (en) * 2021-02-05 2021-03-24 Ams Sensors Singapore Pte Ltd Distance measurement using field of view

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
US7246030B2 (en) 2002-02-14 2007-07-17 Faro Technologies, Inc. Portable coordinate measurement machine with integrated line laser scanner
DE10344922B4 (en) * 2003-09-25 2008-06-26 Siemens Audiologische Technik Gmbh All-scanner
US8786682B2 (en) * 2009-03-05 2014-07-22 Primesense Ltd. Reference image techniques for three-dimensional sensing
US9204129B2 (en) * 2010-09-15 2015-12-01 Perceptron, Inc. Non-contact sensing system having MEMS-based light source
JP5782786B2 (en) * 2011-04-01 2015-09-24 株式会社ニコン Shape measuring device
GB2504890A (en) * 2011-04-15 2014-02-12 Faro Tech Inc Enhanced position detector in laser tracker

Cited By (15)

Publication number Priority date Publication date Assignee Title
CN108603934A * 2016-01-28 2018-09-28 讯宝科技有限责任公司 Method and system for high precision localization using depth values
CN108603934B (en) * 2016-01-28 2022-04-15 讯宝科技有限责任公司 Method and system for high precision localization using depth values
CN107131848A * 2016-02-26 2017-09-05 福禄理昂·伟洛米泽 Optical triangulation device enabling fast, dense shape detection
CN109642789A * 2016-06-24 2019-04-16 3形状股份有限公司 3D scanner using a structured beam of probe light
US11650045B2 (en) 2016-06-24 2023-05-16 3Shape A/S 3D scanner using a structured beam of probe light
CN106500628B * 2016-10-19 2019-02-19 杭州思看科技有限公司 Three-dimensional scanning method and scanner with multiple lasers of different wavelengths
CN106500627A * 2016-10-19 2017-03-15 杭州思看科技有限公司 Three-dimensional scanning method and scanner with multiple lasers of different wavelengths
CN106500628A * 2016-10-19 2017-03-15 杭州思看科技有限公司 Three-dimensional scanning method and scanner with multiple lasers of different wavelengths
CN110446906A (en) * 2017-02-03 2019-11-12 莫迪特3D公司 Three-dimensional scanning device and method
CN109029389A * 2017-06-12 2018-12-18 赫克斯冈技术中心 Device, system and method for displaying measurement gaps
US10890447B2 (en) 2017-06-12 2021-01-12 Hexagon Technology Center Gmbh Device, system and method for displaying measurement gaps
CN110874815A (en) * 2018-08-30 2020-03-10 康耐视公司 Method and apparatus for generating a three-dimensional reconstruction of an object with reduced distortion
CN110874815B (en) * 2018-08-30 2024-03-01 康耐视公司 Method and apparatus for generating a three-dimensional reconstruction of an object with reduced distortion
US11954767B2 (en) 2018-08-30 2024-04-09 Cognex Corporation Methods and apparatus for generating a three-dimensional reconstruction of an object with reduced distortion
CN114080535A (en) * 2019-06-28 2022-02-22 佳能株式会社 Measurement apparatus, imaging apparatus, measurement system, control method, and program

Also Published As

Publication number Publication date
GB2527993A (en) 2016-01-06
GB201518275D0 (en) 2015-12-02
JP2016514271A (en) 2016-05-19
DE112014001483T5 (en) 2015-12-10
GB2527993B (en) 2018-06-27
WO2014149702A1 (en) 2014-09-25
JP6355710B2 (en) 2018-07-11

Similar Documents

Publication Publication Date Title
US10267619B2 (en) Three-dimensional coordinate scanner and method of operation
US10119805B2 (en) Three-dimensional coordinate scanner and method of operation
CN105190232A (en) Three-dimensional coordinate scanner and method of operation
US10089415B2 (en) Three-dimensional coordinate scanner and method of operation
US20150015701A1 (en) Triangulation scanner having motorized elements
US6590669B1 (en) Method for optically detecting the shape of objects
US11022692B2 (en) Triangulation scanner having flat geometry and projecting uncoded spots
JP6080969B2 (en) Method and apparatus for determining the posture of an object
US20130057650A1 (en) Optical gage and three-dimensional surface profile measurement method
JP2016516196A (en) Structured optical scanner correction tracked in 6 degrees of freedom

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20151223