CN104380033A - Coordinate measurement machines with removable accessories - Google Patents

Coordinate measurement machines with removable accessories

Info

Publication number
CN104380033A
CN104380033A (application CN201380029985.4A / CN201380029985A)
Authority
CN
China
Prior art keywords
pattern
structured light
aacmm
equipment
line
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201380029985.4A
Other languages
Chinese (zh)
Inventor
Paul C. Atwell
Clark H. Briggs
Burnham Stokes
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Faro Technologies Inc
Original Assignee
Faro Technologies Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 13/491,176 (external priority; US8832954B2)
Application filed by Faro Technologies Inc
Publication of CN104380033A

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B11/002 - Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G01B11/005 - Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates coordinate measuring machines
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B11/002 - Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G01B11/005 - Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates coordinate measuring machines
    • G01B11/007 - Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates coordinate measuring machines feeler heads therefor
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2509 - Color coding
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2513 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with several lines being projected in more than one direction, e.g. grids, patterns
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B5/00 - Measuring arrangements characterised by the use of mechanical techniques
    • G01B5/004 - Measuring arrangements characterised by the use of mechanical techniques for measuring coordinates of points
    • G01B5/008 - Measuring arrangements characterised by the use of mechanical techniques for measuring coordinates of points using coordinate measuring machines

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Length Measuring Devices With Unspecified Measuring Means (AREA)
  • A Measuring Device By Using Mechanical Method (AREA)

Abstract

A portable articulated arm coordinate measuring machine is provided. The coordinate measuring machine includes a base with an arm portion. A probe end is coupled to an end of the arm portion distal from the base. A device coupled to the probe end is configured to emit coded structured light onto an object to determine the three-dimensional coordinates of a point on the object.

Description

Coordinate measuring machines with removable accessories
Background
The present disclosure relates to coordinate measuring machines and, more particularly, to a portable articulated arm coordinate measuring machine having a connector on a probe end of the coordinate measuring machine that allows an accessory device, which uses structured light to perform non-contact three-dimensional measurement, to be removably connected to the coordinate measuring machine.
Portable articulated arm coordinate measuring machines (AACMMs) have found widespread use in the manufacturing or production of parts, where there is a need to rapidly and accurately verify the dimensions of a part during various stages of its manufacture or production (e.g., machining). Portable AACMMs represent a vast improvement over known stationary or fixed, cost-intensive and relatively difficult-to-use measurement installations, particularly in the amount of time it takes to perform dimensional measurements of relatively complex parts. Typically, a user of a portable AACMM simply guides a probe along the surface of the part or object to be measured. The measurement data are then recorded and provided to the user. In some cases, the data are provided to the user in visual form, for example in three-dimensional (3D) form on a computer screen. In other cases, the data are provided to the user in numeric form; for example, when measuring the diameter of a hole, the text "Diameter = 1.0034" is displayed on a computer screen.
An example of a prior art portable articulated arm CMM is disclosed in commonly assigned U.S. Patent No. 5,402,582 ('582). The '582 patent discloses a 3D measuring system comprising a manually operated articulated arm CMM having a support base on one end and a measurement probe at the other end. Commonly assigned U.S. Patent No. 5,611,147 ('147) discloses a similar articulated arm CMM. In the '147 patent, the articulated arm CMM includes a number of features, including an additional rotational axis at the probe end, thereby providing an arm with a 2-2-2 or a 2-2-3 axis configuration (the latter case being a seven-axis arm).
Three-dimensional surfaces may also be measured using non-contact techniques. One type of non-contact device, sometimes referred to as a laser line probe or laser line scanner, emits laser light either on a spot or along a line. An imaging device, such as a charge-coupled device (CCD), is positioned adjacent to the laser. The laser is arranged to emit a beam of light that is reflected off the surface. The surface of the object being measured causes a diffuse reflection, which is captured by the imaging device. The image of the reflected line on the sensor changes as the distance between the sensor and the surface changes. By knowing the relationship between the imaging sensor and the laser and the position of the laser image on the sensor, triangulation methods may be used to measure the three-dimensional coordinates of points on the surface. One issue that arises with laser line probes is that the density of measured points varies with the speed at which the laser line probe is moved across the surface of the object: the faster the probe is moved, the greater the distance between points and the lower the point density. With a structured light scanner, the point spacing is typically uniform in each of the two dimensions, which generally provides a uniform measurement of points over the workpiece surface.
While existing CMMs are suitable for their intended purposes, what is needed is a portable AACMM that has certain features of embodiments of the present invention.
Summary of the invention
In accordance with an embodiment of the invention, a portable articulated arm coordinate measuring machine (AACMM) for measuring the three-dimensional coordinates of an object in space is provided. The AACMM includes a base. A manually positionable arm portion having opposed first and second ends is rotatably coupled to the base; the arm portion includes a plurality of connected arm segments, each arm segment including at least one position transducer for producing a position signal. An electronic circuit is configured to receive the position signals from the at least one position transducer in each arm segment. A probe end is coupled to the first end. A non-contact three-dimensional measuring device is coupled to the probe end, the device having a projector and an image sensor, the projector having a source plane and being configured to emit a structured light onto the object, the structured light being located on the source plane and including at least three non-collinear pattern elements, and the image sensor being arranged to receive the structured light reflected from the object. A processor is electrically coupled to the electronic circuit and is configured to determine the three-dimensional coordinates of a point on the object in response to receiving the position signals from the position transducers and in response to the structured light received by the image sensor.
In accordance with another embodiment of the invention, a method of operating a portable articulated arm coordinate measuring machine for measuring the coordinates of an object in space is provided. The method includes providing a manually positionable arm portion having opposed first and second ends, the arm portion including a plurality of connected arm segments, each arm segment including at least one position transducer for producing a position signal. A probe end for measuring the object is provided, the probe end being coupled to the first end. Position signals from the transducers are received by an electronic circuit. A non-contact three-dimensional measuring device having a controller is provided, the device having a sensor and a projector, the projector being configured to emit a structured light onto the object, the projector having a source plane, the structured light being located on the source plane and including at least three non-collinear pattern elements. The structured light is projected from the three-dimensional measuring device onto the object.
Brief Description of the Drawings
Referring now to the drawings, which depict exemplary embodiments that should not be construed as limiting the full scope of the disclosure, and in which like elements are numbered alike in the several figures:
FIG. 1, including FIGS. 1A and 1B, is a perspective view of a portable articulated arm coordinate measuring machine (AACMM) having embodiments of various aspects of the present invention;
FIG. 2, including FIGS. 2A-2D taken together, is a block diagram of electronics utilized as part of the AACMM of FIG. 1 in accordance with an embodiment;
FIG. 3, including FIGS. 3A and 3B taken together, is a block diagram describing detailed features of the electronic data processing system of FIG. 2 in accordance with an embodiment;
FIG. 4 is an isometric view of the probe end of the AACMM of FIG. 1;
FIG. 5 is a side view of the probe end of FIG. 4 with a handle being coupled thereto;
FIG. 6 is a side view of the probe end of FIG. 4 with the handle attached;
FIG. 7 is an enlarged partial side view of the interface portion of the probe end of FIG. 6;
FIG. 8 is another enlarged partial side view of the interface portion of the probe end of FIG. 5;
FIG. 9 is an isometric view, partly in section, of the handle of FIG. 4;
FIG. 10 is an isometric view of the probe end of the AACMM of FIG. 1 with a structured light device having a single camera attached;
FIG. 11 is an isometric view, partly in section, of the device of FIG. 10;
FIG. 12 is an isometric view of the probe end of the AACMM of FIG. 1 with another structured light device having dual cameras attached;
FIGS. 13A and 13B are schematic views illustrating the operation of the device of FIG. 10 when attached to the probe end of the AACMM of FIG. 1;
FIGS. 14-17 are sequential projections of an uncoded binary pattern that may be emitted by the structured light device of FIG. 10 or FIG. 12 in accordance with an embodiment of the invention;
FIGS. 18-19 are spatially varying color-coded patterns that may be emitted by the structured light device of FIG. 10 or FIG. 12 in accordance with an embodiment of the invention;
FIGS. 20-23 are strip index coded patterns that may be emitted by the structured light device of FIG. 10 or FIG. 12 in accordance with an embodiment of the invention;
FIGS. 24-31 are two-dimensional grid patterns that may be emitted by the structured light device of FIG. 10 or FIG. 12 in accordance with an embodiment of the invention;
FIG. 32 is a schematic illustration of a photometric technique for acquiring patterns of structured light under multiple illumination conditions; and
FIG. 33 is an illustration of a structured light scanner device that may operate independently of the AACMM in accordance with another embodiment of the invention.
Detailed Description
Portable articulated arm coordinate measuring machines ("AACMMs") are used in a variety of applications to obtain measurements of objects. Embodiments of the present invention provide advantages in allowing an operator to easily and quickly couple an accessory device to the probe end of the AACMM that uses structured light to provide non-contact measurement of a three-dimensional object. Embodiments of the present invention further provide advantages in providing for the communication of data representing a point cloud measured by the structured light device within the AACMM. Embodiments of the present invention provide advantages in providing greater uniformity in the distribution of measured points, which may provide higher accuracy. Embodiments of the present invention further provide advantages in providing power and data communications to a removable accessory without external connections or wiring.
As used herein, the term "structured light" refers to a two-dimensional pattern of light projected onto a continuous and enclosed area of an object that conveys information which may be used to determine coordinates of points on the object. A structured light pattern contains at least three non-collinear pattern elements disposed within the continuous and enclosed area. Each of the three non-collinear pattern elements conveys information which may be used to determine the point coordinates.
In general, there are two types of structured light: coded light patterns and uncoded light patterns. As used herein, a coded light pattern is one in which the three-dimensional coordinates of an illuminated surface of the object may be found by acquiring a single image. With a coded light pattern, the projecting device may be moving relative to the object. In other words, for a coded light pattern there is no significant temporal relationship between the projected pattern and the acquired image. Typically, a coded light pattern contains a set of elements (e.g., geometric shapes) arranged so that at least three of the elements are non-collinear. In some cases, the set of elements may be arranged as a collection of lines. Having at least three of the elements be non-collinear ensures that the pattern is not a simple line pattern, as would be projected by a laser line scanner, for example. As a result, the pattern elements are recognizable because of their arrangement.
In contrast, an uncoded structured light pattern, as used herein, is a pattern that does not allow measurement through a single pattern when the projector is moving relative to the object. An example of an uncoded light pattern is one that requires a series of sequential patterns and, accordingly, the acquisition of a series of sequential images. Due to the temporal nature of the projected patterns and the acquisition of the images, there should be no relative movement between the projector and the object.
It should be appreciated that structured light is distinct from light projected by a laser line probe or laser line scanner type device that generates a line of light. Laser line probes used with articulated arms today have irregularities or other aspects that may be regarded as features within the generated lines; however, these features are disposed in a collinear arrangement. Consequently, such features within a single generated line are not considered to make the projected light into structured light.
FIGS. 1A and 1B illustrate, in perspective view, an AACMM 100 according to various embodiments of the present invention, an articulated arm being one type of coordinate measuring machine. As shown in FIGS. 1A and 1B, the exemplary AACMM 100 may comprise a six- or seven-axis articulated measurement device having a probe end 401 that includes a measurement probe housing 102 coupled to an arm portion 104 of the AACMM 100 at one end. The arm portion 104 comprises a first arm segment 106 coupled to a second arm segment 108 by a first grouping of bearing cartridges 110 (e.g., two bearing cartridges). A second grouping of bearing cartridges 112 (e.g., two bearing cartridges) couples the second arm segment 108 to the measurement probe housing 102. A third grouping of bearing cartridges 114 (e.g., three bearing cartridges) couples the first arm segment 106 to a base 116 located at the other end of the arm portion 104 of the AACMM 100. Each grouping of bearing cartridges 110, 112, 114 provides for multiple axes of articulated movement. Also, the probe end 401 may include a measurement probe housing 102 that comprises the shaft of a seventh-axis portion of the AACMM 100 (e.g., a cartridge containing an encoder system that determines movement of a measurement device, for example a probe 118, about the seventh axis of the AACMM 100). In this embodiment, the probe end 401 may rotate about an axis extending through the center of the measurement probe housing 102. In use of the AACMM 100, the base 116 is typically affixed to a work surface.
Each bearing cartridge within each bearing cartridge grouping 110, 112, 114 typically contains an encoder system (e.g., an optical angular encoder system). The encoder systems (i.e., transducers) provide an indication of the position of the respective arm segments 106, 108 and corresponding bearing cartridge groupings 110, 112, 114 that, taken together, provide an indication of the position of the probe 118 with respect to the base 116 (and, thereby, the position of the object being measured by the AACMM 100 in a particular frame of reference, for example a local or global frame of reference). The arm segments 106, 108 may be made of a suitably rigid material such as, but not limited to, a carbon composite material. A portable AACMM 100 with six or seven axes of articulated movement (i.e., degrees of freedom) provides advantages in allowing the operator to position the probe 118 in a desired location within a 360° area about the base 116 while providing an arm portion 104 that may be easily handled by the operator. However, it should be appreciated that the illustration of an arm portion 104 having two arm segments 106, 108 is for exemplary purposes, and the claimed invention should not be so limited. An AACMM 100 may have any number of arm segments coupled together by bearing cartridges (and thus more or fewer than six or seven axes of articulated movement or degrees of freedom).
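As a concrete illustration of how the encoder readings may be combined into a probe position, the following Python sketch chains one rotation and one link offset per axis from the base to the probe tip. It is a minimal sketch only: the joint axes, link offsets, and function names are illustrative assumptions and are not taken from the patent, which does not prescribe a particular kinematic model.

```python
import numpy as np

def rotation_about_axis(axis, angle_rad):
    """Rodrigues' formula: 3x3 rotation about a unit axis."""
    axis = np.asarray(axis, dtype=float)
    axis /= np.linalg.norm(axis)
    k = np.array([[0, -axis[2], axis[1]],
                  [axis[2], 0, -axis[0]],
                  [-axis[1], axis[0], 0]])
    return np.eye(3) + np.sin(angle_rad) * k + (1 - np.cos(angle_rad)) * (k @ k)

def probe_position(joint_angles_rad, joint_axes, link_offsets_m):
    """Chain the per-joint transforms from the base to the probe tip.

    joint_angles_rad : encoder angles, one per axis (six or seven values)
    joint_axes       : assumed local rotation axis of each joint (unit vectors)
    link_offsets_m   : assumed translation from each joint to the next,
                       expressed in the joint's local frame
    """
    T = np.eye(4)
    for angle, axis, offset in zip(joint_angles_rad, joint_axes, link_offsets_m):
        step = np.eye(4)
        step[:3, :3] = rotation_about_axis(axis, angle)
        step[:3, 3] = offset
        T = T @ step
    return T[:3, 3]  # probe tip coordinates in the base frame

# Example: a hypothetical seven-axis arm with alternating joint axes
angles = np.radians([10, -35, 20, 45, -15, 30, 5])
axes = [(0, 0, 1), (0, 1, 0)] * 3 + [(0, 0, 1)]
offsets = [(0, 0, 0.2)] * 7
print(probe_position(angles, axes, offsets))
```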
The probe 118 is detachably mounted to the measurement probe housing 102, which is connected to the bearing cartridge grouping 112. A handle 126 is removable with respect to the measurement probe housing 102 by way of, for example, a quick-connect interface. As discussed in more detail below, the handle 126 may be replaced with another device configured to emit structured light to provide non-contact measurement of three-dimensional objects, thereby providing advantages in allowing the operator to make both contact and non-contact measurements with the same AACMM 100. In the exemplary embodiment, the probe housing 102 houses the removable probe 118, which is a contact measurement device and may have different tips 118 that physically contact the object to be measured, including but not limited to ball, touch-sensitive, curved and extension type probes. In other embodiments, the measurement is performed by a non-contact device, such as a coded structured light scanner device. In an embodiment, the handle 126 is replaced with the coded structured light scanner device using the quick-connect interface. Other types of measurement devices may replace the removable handle 126 to provide additional functionality. Examples of such measurement devices include, but are not limited to, one or more illumination lights, a temperature sensor, a thermal scanner, a bar code scanner, a projector, a paint sprayer, a camera, or the like.
As shown in FIGS. 1A and 1B, the AACMM 100 includes the removable handle 126, which provides advantages in allowing accessories or functionality to be changed without removing the measurement probe housing 102 from the bearing cartridge grouping 112. As discussed in more detail below with respect to FIG. 2D, the removable handle 126 may also include an electrical connector that allows electrical power and data to be exchanged with the handle 126 and the corresponding electronics located in the probe end 401.
In various embodiments, each grouping of bearing cartridges 110, 112, 114 allows the arm portion 104 of the AACMM 100 to move about multiple axes of rotation. As mentioned, each bearing cartridge grouping 110, 112, 114 includes a corresponding encoder system, such as an optical angular encoder, arranged coaxially with the corresponding axis of rotation of, for example, the arm segments 106, 108. The optical encoder system detects rotational (swivel) or transverse (hinge) movement of, for example, each of the arm segments 106, 108 about the corresponding axis and transmits a signal to the electronic data processing system within the AACMM 100, as described in more detail below. Each individual raw encoder count is sent separately to the electronic data processing system as a signal, where it is further processed into measurement data. No position calculator separate from the AACMM 100 itself (e.g., a serial box) is required, as disclosed in commonly assigned U.S. Patent No. 5,402,582 ('582).
The base 116 may include an attachment device or mounting device 120. The mounting device 120 allows the AACMM 100 to be removably mounted at a desired location, such as an inspection table, a machining center, a wall or the floor. In one embodiment, the base 116 includes a handle portion 122 that provides a convenient location for the operator to hold the base 116 as the AACMM 100 is being moved. In one embodiment, the base 116 further includes a movable cover portion 124 that folds down to reveal a user interface, such as a display screen.
In accordance with an embodiment, the base 116 of the portable AACMM 100 contains or houses an electronic circuit having an electronic data processing system that includes two primary components: a base processing system that processes the data from the various encoder systems within the AACMM 100, as well as data representing other arm parameters, to support three-dimensional (3D) positional calculations; and a user interface processing system that includes an on-board operating system, a touch screen display, and resident application software that allows relatively complete metrology functions to be implemented within the AACMM 100 without the need for a connection to an external computer.
The electronic data processing system in the base 116 may communicate with the encoder systems, sensors, and other peripheral hardware located away from the base 116 (e.g., a structured light device that can be mounted to the removable handle 126 on the AACMM 100). The electronics that support these peripheral hardware devices or features may be located in each of the bearing cartridge groupings 110, 112, 114 of the portable AACMM 100.
FIG. 2 is a block diagram of electronics utilized in the AACMM 100 in accordance with an embodiment. The embodiment shown in FIG. 2A includes an electronic data processing system 210 comprising a base processor board 204 for implementing the base processing system, a user interface board 202, a base power board 206 for providing power, a Bluetooth module 232, and a base tilt board 208. The user interface board 202 includes a computer processor for executing application software to perform user interface, display, and other functions described herein.
As shown in FIGS. 2A and 2B, the electronic data processing system 210 is in communication with the aforementioned plurality of encoder systems via one or more arm buses 218. In the embodiment depicted in FIGS. 2B and 2C, each encoder system generates encoder data and includes an encoder arm bus interface 214, an encoder digital signal processor (DSP) 216, an encoder read head interface 234, and a temperature sensor 212. Other devices, such as strain sensors, may be attached to the arm bus 218.
Also shown in FIG. 2D are probe end electronics 230 that are in communication with the arm bus 218. The probe end electronics 230 include a probe end DSP 228, a temperature sensor 212, a handle/device interface bus 240 that, in an embodiment, connects with the handle 126 or the coded structured light scanner device 242 via the quick-connect interface, and a probe interface 226. The quick-connect interface allows access by the handle 126 to the data bus, control lines, and power bus used by the coded structured light scanner device 242 and other accessories. In an embodiment, the probe end electronics 230 are located in the measurement probe housing 102 on the AACMM 100. In an embodiment, the handle 126 may be removed from the quick-connect interface and measurements may be performed by the structured light scanner device 242 communicating with the probe end electronics 230 of the AACMM 100 via the interface bus 240. In an embodiment, the electronic data processing system 210 is located in the base 116 of the AACMM 100, the probe end electronics 230 are located in the measurement probe housing 102 of the AACMM 100, and the encoder systems are located in the bearing cartridge groupings 110, 112, 114. The probe interface 226 may connect with the probe end DSP 228 by any suitable communications protocol, including commercially available products from Maxim Integrated Products, Inc. that implement the communications protocol 236.
FIG. 3 is a block diagram describing detailed features of the electronic data processing system 210 of the AACMM 100 in accordance with an embodiment. In an embodiment, the electronic data processing system 210 is located in the base 116 of the AACMM 100 and includes the base processor board 204, the user interface board 202, the base power board 206, the Bluetooth module 232, and the base tilt module 208.
In an embodiment shown in FIG. 3A, the base processor board 204 includes the various functional blocks illustrated therein. For example, a base processor function 302 is used to support the collection of measurement data from the AACMM 100 and receives raw arm data (e.g., encoder system data) via the arm bus 218 and a bus control module function 308. A memory function 304 stores programs and static arm configuration data. The base processor board 204 also includes an external hardware option port function 310 for communicating with any external hardware devices or accessories, such as the coded structured light scanner device 242. A real-time clock (RTC) and log 306, a battery pack interface (IF) 316, and a diagnostic port 318 are also included in the functionality of the embodiment of the base processor board 204 depicted in FIG. 3A.
The base processor board 204 also manages all wired and wireless data communication with external devices (a host computer) and internal devices (the display processor 202). The base processor board 204 has the capability of communicating with an Ethernet network via an Ethernet function 320 (e.g., using a clock synchronization standard such as Institute of Electrical and Electronics Engineers (IEEE) 1588), with a wireless local area network (WLAN) via a LAN function 322, and with the Bluetooth module 232 via a parallel-to-serial communications (PSC) function 314. The base processor board 204 also includes a connection to a universal serial bus (USB) device 312.
The base processor board 204 transmits and collects raw measurement data (e.g., encoder system counts, temperature readings) for processing into measurement data without the need for any preprocessing, such as is disclosed in the serial box of the aforementioned '582 patent. The base processor 204 sends the processed data to the display processor 328 on the user interface board 202 via an RS485 interface (IF) 326. In an embodiment, the base processor 204 also sends raw measurement data to an external computer.
Turning now to the user interface board 202 in FIG. 3B, the angle and positional data received by the base processor are utilized by applications executing on the display processor 328 to provide an autonomous metrology system within the AACMM 100. Applications may be executed on the display processor 328 to support functions such as, but not limited to: measurement of features, guidance and training graphics, remote diagnostics, temperature corrections, control of various operational features, connection to various networks, and display of measured objects. Along with the display processor 328 and a liquid crystal display (LCD) 338 (e.g., a touch screen LCD) user interface, the user interface board 202 includes several interface options, including a secure digital (SD) card interface 330, a memory 332, a USB host interface 334, a diagnostic port 336, a camera port 340, an audio/video interface 342, a dial-up/cell modem 344, and a global positioning system (GPS) port 346.
The electronic data processing system 210 shown in FIG. 3A also includes a base power board 206 with an environmental recorder 362 for recording environmental data. The base power board 206 also provides power to the electronic data processing system 210 using an AC/DC converter 358 and a battery charger controller 360. The base power board 206 communicates with the base processor board 204 using an inter-integrated circuit (I2C) serial single-ended bus 354 as well as via a DMA serial peripheral interface (DSPI) 357. The base power board 206 is connected to a tilt sensor and radio frequency identification (RFID) module 208 via an input/output (I/O) expansion function 364 implemented in the base power board 206.
Though shown as separate components, in other embodiments all or a subset of the components may be physically located in different locations and/or functions may be combined in different manners than shown in FIGS. 3A and 3B. For example, in one embodiment, the base processor board 204 and the user interface board 202 are combined into one physical board.
Referring now to FIGS. 4-9, an exemplary embodiment of a probe end 401 is illustrated having a measurement probe housing 102 with a quick-connect mechanical and electrical interface that allows a removable and interchangeable device 400 to couple with the AACMM 100. In the exemplary embodiment, the device 400 includes an enclosure 402 that includes a handle portion 404 sized and shaped to be held in an operator's hand, such as in a pistol grip, for example. The enclosure 402 is a thin-walled structure having a cavity 406 (FIG. 9). The cavity 406 is sized and configured to receive a controller 408. The controller 408 may be a digital circuit, having a microprocessor for example, or an analog circuit. In one embodiment, the controller 408 is in asynchronous bidirectional communication with the electronic data processing system 210 (FIGS. 2 and 3). The communication connection between the controller 408 and the electronic data processing system 210 may be wired (e.g., via controller 420), may be a direct or indirect wireless connection (e.g., Bluetooth or IEEE 802.11), or may be a combination of wired and wireless connections. In the exemplary embodiment, the enclosure 402 is formed in two halves 410, 412, for example from an injection-molded plastic material. The halves 410, 412 may be secured together by fasteners, such as screws 414, for example. In other embodiments, the enclosure halves 410, 412 may be secured together by adhesives or ultrasonic welding, for example.
The handle portion 404 also includes buttons or actuators 416, 418 that may be manually activated by the operator. The actuators 416, 418 are coupled to the controller 408, which transmits a signal to a controller 420 within the probe housing 102. In the exemplary embodiment, the actuators 416, 418 perform the functions of actuators 422, 424 located on the probe housing 102 opposite the device 400. It should be appreciated that the device 400 may have additional switches, buttons or other actuators that may also be used to control the device 400, the AACMM 100, or vice versa. Also, the device 400 may include indicators, such as light emitting diodes (LEDs), sound generators, meters, displays or gauges, for example. In one embodiment, the device 400 may include a digital voice recorder that allows for synchronization of verbal comments with a measured point. In yet another embodiment, the device 400 includes a microphone that allows the operator to transmit voice-activated commands to the electronic data processing system 210.
In one embodiment, the handle portion 404 may be configured to be used with either operator hand or for a particular hand (e.g., left-handed or right-handed). The handle portion 404 may also be configured to facilitate use by operators with disabilities (e.g., operators with missing fingers or operators with prosthetic arms). Further, the handle portion 404 may be removed and the probe housing 102 used by itself when clearance space is limited. As discussed above, the probe end 401 may also comprise the shaft of the seventh axis of the AACMM 100. In this embodiment, the device 400 may be arranged to rotate about the AACMM seventh axis.
The probe end 401 includes a mechanical and electrical interface 426 having a first connector 429 (FIG. 8) on the device 400 that cooperates with a second connector 428 on the probe housing 102. The connectors 428, 429 may include electrical and mechanical features that allow the device 400 to be coupled to the probe housing 102. In one embodiment, the interface 426 includes a first surface 430 having a mechanical coupler 432 and an electrical connector 434 thereon. The enclosure 402 also includes a second surface 436 positioned adjacent to and offset from the first surface 430. In the exemplary embodiment, the second surface 436 is a planar surface offset a distance of approximately 0.5 inches from the first surface 430. This offset provides a clearance for the operator's fingers when tightening or loosening a fastener, such as the collar 438. The interface 426 provides a relatively quick and secure electrical connection between the device 400 and the probe housing 102 without the need to align connector pins and without the need for separate cables or connectors.
The electrical connector 434 extends from the first surface 430 and includes one or more connector pins 440 that are electrically coupled, in asynchronous bidirectional communication, with the electronic data processing system 210 (FIGS. 2 and 3), for example via one or more arm buses 218. The bidirectional communication connection may be wired (e.g., via arm bus 218), wireless (e.g., Bluetooth or IEEE 802.11), or a combination of wired and wireless connections. In one embodiment, the electrical connector 434 is electrically coupled to the controller 420. The controller 420 may be in asynchronous bidirectional communication with the electronic data processing system 210, for example via one or more arm buses 218. The electrical connector 434 is positioned to provide a relatively quick and secure electrical connection with the electrical connector 442 on the probe housing 102. The electrical connectors 434, 442 connect with each other when the device 400 is attached to the probe housing 102. The electrical connectors 434, 442 may each comprise a metal-encased connector housing that provides shielding from electromagnetic interference while protecting the connector pins and assisting with pin alignment during the process of attaching the device 400 to the probe housing 102.
The mechanical coupler 432 provides relatively rigid mechanical coupling between the device 400 and the probe housing 102 to support relatively precise applications in which the location of the device 400 on the end of the arm portion 104 of the AACMM 100 preferably does not shift or move. Any such movement may typically cause an undesirable degradation in the accuracy of the measurement result. These desired results are achieved using various structural features of the mechanical attachment configuration portion of the quick-connect mechanical and electrical interface of embodiments of the present invention.
In one embodiment, the mechanical coupler 432 includes a first projection 444 positioned on one end 448 (the leading edge or "front" of the device 400). The first projection 444 may include a keyed, notched or ramped interface that forms a lip 446 extending from the first projection 444. The lip 446 is sized to be received in a slot 450 defined by a projection 452 extending from the probe housing 102 (FIG. 8). It should be appreciated that the first projection 444 and the slot 450, together with the collar 438, form a coupler arrangement such that, when the lip 446 is positioned within the slot 450, the slot 450 may be used to restrict both the longitudinal and lateral movement of the device 400 when it is attached to the probe housing 102. As will be described in more detail below, rotation of the collar 438 may be used to secure the lip 446 within the slot 450.
Opposite the first projection 444, the mechanical coupler 432 may include a second projection 454. The second projection 454 may have a keyed, notched-lip or ramped interface surface 456 (FIG. 5). The second projection 454 is positioned to engage a fastener associated with the probe housing 102, such as the collar 438, for example. As will be discussed in more detail below, the mechanical coupler 432 includes a raised surface projecting from the surface 430 that is adjacent to or disposed about the electrical connector 434 and that provides a pivot point for the interface 426 (FIGS. 7 and 8). This serves as the third of three mechanical contact points between the device 400 and the probe housing 102 when the device 400 is attached thereto.
The probe housing 102 includes a collar 438 arranged coaxially on one end. The collar 438 includes a threaded portion that is movable between a first position (FIG. 5) and a second position (FIG. 7). By rotating the collar 438, the collar 438 may be used to secure or remove the device 400 without the need for external tools. Rotation of the collar 438 moves the collar 438 along a relatively coarse, square-threaded cylinder 474. The use of such a relatively large size, square-thread and contoured surface allows for a significant clamping force to be achieved with minimal torque. The coarse pitch of the threads of the cylinder 474 further allows the collar 438 to be tightened or loosened with minimal rotation.
To couple the device 400 to the probe housing 102, the lip 446 is inserted into the slot 450 and the device is pivoted, as indicated by arrow 464, to rotate the second projection 454 toward the surface 458 (FIG. 5). The collar 438 is rotated, causing the collar 438 to move or translate in the direction indicated by arrow 462 into engagement with the surface 456. The movement of the collar 438 against the angled surface 456 drives the mechanical coupler 432 against the raised surface 460. This helps overcome potential issues of distortion of the interface or foreign objects on the surface of the interface that could interfere with the rigid seating of the device 400 against the probe housing 102. The force applied by the collar 438 on the second projection 454 causes the mechanical coupler 432 to move forward, pressing the lip 446 into a seat on the probe housing 102. As the collar 438 continues to be tightened, the second projection 454 is pressed upward toward the probe housing 102, applying pressure on the pivot point. This provides a see-saw type arrangement in which pressure is applied to the second projection 454, the lip 446 and the centrally located pivot point, reducing or eliminating shifting or rocking of the device 400. The pivot point presses directly against the bottom of the probe housing 102 while the lip 446 applies a downward force on the end of the probe housing 102. FIG. 5 includes arrows 462, 464 to show the direction of movement of the device 400 and the collar 438. FIG. 7 includes arrows 466, 468, 470 to show the directions of applied pressure within the interface 426 when the collar 438 is tightened. It should be appreciated that the offset distance of the surface 436 of the device 400 provides a gap 472 between the collar 438 and the surface 436 (FIG. 6). The gap 472 allows the operator to obtain a firmer grip on the collar 438 while reducing the risk of pinching fingers as the collar 438 is rotated. In one embodiment, the probe housing 102 is of sufficient stiffness to reduce or prevent distortion of the probe housing 102 when the collar 438 is tightened.
Embodiments of the interface 426 allow for proper alignment of the mechanical coupler 432 and the electrical connector 434 and also protect the electronics interface from applied stresses that may otherwise arise due to the clamping action of the collar 438, the lip 446, and the surface 456. This provides advantages in reducing or eliminating stress damage to the electrical connectors 434, 442, which may have soldered terminals and are mounted on a circuit board 476. Also, embodiments provide advantages over known approaches in that no tools are required for the user to connect or disconnect the device 400 from the probe housing 102. This allows the operator to manually connect and disconnect the device 400 from the probe housing 102 with relative ease.
Due to the relatively large number of shielded electrical connections possible with the interface 426, a relatively large number of functions may be shared between the AACMM 100 and the device 400. For example, switches, buttons or other actuators located on the AACMM 100 may be used to control the device 400, or vice versa. Further, commands and data may be transmitted from the electronic data processing system 210 to the device 400. In one embodiment, the device 400 is a video camera that transmits data of a recorded image to be stored in memory on the base processor 204 or displayed on the display 328. In another embodiment, the device 400 is an image projector that receives data from the electronic data processing system 210. In addition, temperature sensors located in either the AACMM 100 or the device 400 may be shared by the other. It should be appreciated that embodiments of the present invention provide advantages in offering a flexible interface that allows a wide variety of accessory devices 400 to be quickly, easily and reliably coupled to the AACMM 100. Further, the capability of sharing functions between the AACMM 100 and the device 400 allows a reduction in the size, power consumption and complexity of the AACMM 100 by eliminating duplication.
In one embodiment, the controller 408 may alter the operation or functionality of the probe end 401 of the AACMM 100. For example, when the device 400 is attached, as compared with use of the probe housing 102 by itself, the controller 408 may alter indicator lights on the probe housing 102 to emit a different color of light, a different intensity of light, or to turn on or off at different times. In one embodiment, the device 400 includes a range-finding sensor (not shown) that measures the distance to an object. In this embodiment, the controller 408 may change the indicator lights on the probe housing 102 to provide an indication to the operator of how far the object is from the probe tip 118. In another embodiment, the controller 408 may change the color of the indicator lights based on the quality of the image acquired by the coded structured light scanner device. This provides advantages in simplifying the requirements of the controller 420 and allows improved or enhanced functionality through the addition of accessory devices.
Referring to FIGS. 10-13, embodiments of the present invention provide advantages for the projector, camera, signal processing, control and indicator interfaces of a non-contact three-dimensional measurement device 500. The device 500 includes a pair of optical devices, such as a light projector 508 and a camera 510, that project a structured light pattern and receive a two-dimensional pattern reflected from the object 501. The device 500 uses triangulation-based methods, based on the known emitted pattern and the acquired image, to determine, for each pixel of the received image, a point cloud representing X, Y, Z coordinate data for the object 501. In an embodiment, the structured light pattern is coded so that a single image is sufficient to determine the three-dimensional coordinates of object points. Such a coded structured light pattern may also be said to measure three-dimensional coordinates in a single shot.
In the exemplary embodiment, the projector 508 uses a visible light source that illuminates a pattern generator. The visible light source may be a laser, a superluminescent diode, an incandescent light, a light emitting diode (LED), or another light emitting device. In the exemplary embodiment, the pattern generator is a chrome-on-glass slide having a structured light pattern etched thereon. The slide may have a single pattern or multiple patterns that move in and out of position as needed. The slide may be manually or automatically installed in the operating position. In other embodiments, the source pattern may be light reflected off or transmitted by a digital micro-mirror device (DMD), such as a digital light projector (DLP) manufactured by Texas Instruments Corporation, a liquid crystal device (LCD), a liquid crystal on silicon (LCOS) device, or a similar device used in transmission mode rather than reflection mode. The projector 508 may further include a lens system 515 that alters the outgoing light to have the desired focal characteristics.
The device 500 further includes an enclosure 502 with a handle portion 504. In one embodiment, the device 500 may further include an interface 426 on one end that mechanically and electrically couples the device 500 to the probe housing 102 as described above. In other embodiments, the device 500 may be integrated into the probe housing 102. The interface 426 provides advantages in allowing the device 500 to be coupled to and removed from the AACMM 100 quickly and easily without requiring additional tools.
The camera 510 includes a photosensitive sensor that generates a digital image or representation of the area within the sensor's field of view. The sensor may be a charge-coupled device (CCD) type sensor or a complementary metal-oxide-semiconductor (CMOS) type sensor having, for example, an array of pixels. The camera 510 may further include other components, such as but not limited to a lens 503 and other optical devices. In the exemplary embodiment, the projector 508 and the camera 510 are arranged at an angle such that the sensor may receive light reflected from the surface of the object 501. In one embodiment, the projector 508 and camera 510 are positioned such that the device 500 may be operated with the probe tip 118 in place. Further, it should be appreciated that the device 500 is substantially fixed relative to the probe tip 118, so that forces on the handle portion 504 do not affect the alignment of the device 500 relative to the probe tip 118. In one embodiment, the device 500 may have an additional actuator (not shown) that allows the operator to switch between acquiring data from the device 500 and acquiring data with the probe tip 118.
The projector 508 and camera 510 are electrically coupled to a controller 512 disposed within the enclosure 502. The controller 512 may include one or more microprocessors, digital signal processors, memory, and signal conditioning circuits. Due to the digital signal processing and the large data volume generated by the device 500, the controller 512 may be arranged within the handle portion 504. The controller 512 is electrically coupled to the arm buses 218 via the electrical connector 434. The device 500 may further include actuators 514, 516 that may be manually activated by the operator to initiate operation and data capture by the device 500. In one embodiment, the image processing to determine the X, Y, Z coordinate data of the point cloud representing the object 501 is performed by the controller 512, and the coordinate data are transmitted to the electronic data processing system 210 via the bus 240. In another embodiment, the images are transmitted to the electronic data processing system 210 and the calculation of the coordinates is performed by the electronic data processing system 210.
In one embodiment, the controller 512 is configured to communicate with the electronic data processing system 210 to receive structured light pattern images from the electronic data processing system 210. In still another embodiment, the pattern emitted onto the object may be changed by the electronic data processing system 210, either automatically or in response to an input from the operator. This may provide advantages in obtaining higher-accuracy measurements with less processing time by allowing the use of patterns that are easier to decode when conditions permit, while more complex patterns may be used when a desired level of precision or resolution is to be achieved.
In other embodiments of the present invention, the device 520 (FIG. 12) includes a pair of cameras 510. The cameras 510 are arranged at an angle relative to the projector 508 to receive light reflected from the object 501. The use of multiple cameras 510 may provide advantages in some applications by providing redundant images to increase the accuracy of the measurement. In still other embodiments, the redundant images may allow the acquisition of images to be accelerated by interleaving the operation of the cameras 510, thereby allowing the device 500 to rapidly acquire sequential patterns.
The operation of the structured light device 500 is now described with reference to FIGS. 13A and 13B. The device 500 first emits a structured light pattern 522 with the projector 508 onto the surface 524 of the object 501. The structured light pattern 522 may include the patterns disclosed in the journal article "DLP-Based Structured Light 3D Imaging Technologies and Applications" by Jason Geng, published in the Proceedings of SPIE, Vol. 7932. The structured light pattern 522 may further include, but is not limited to, one of the patterns shown in FIGS. 14-32. The light 509 from the projector 508 is reflected from the surface 524 and the reflected light 511 is received by the camera 510. It should be appreciated that variations in the surface 524, such as a protrusion 526, create distortions in the structured light pattern when the image of the pattern is captured by the camera 510. Since the pattern is formed by structured light, in some instances the controller 512 or the electronic data processing system 210 can determine a one-to-one correspondence between pixels in the emitted pattern, such as pixel 513, and pixels in the imaged pattern, such as pixel 515. This enables triangulation principles to be used to determine the coordinates of each pixel in the imaged pattern. The collection of three-dimensional coordinates of the surface 524 is sometimes referred to as a point cloud. By moving the device 500 over the surface 524, a point cloud may be created for the entire object 501. It should be appreciated that in some embodiments the coupling of the device 500 to the probe end provides advantages in that the position and orientation of the device 500 are known by the electronic data processing system 210, so that the location of the object 501 relative to the AACMM 100 may be ascertained.
In order to determine the coordinate of pixel, the angle of every bar projection line of the known light 509 crossing at point 527 place with object 522 corresponds to projected angle phi (Φ), and therefore Φ information is encoded in the pattern of injection.In embodiments, system is configured to make it possible to determine the Φ value corresponding with each pixel in imaging pattern.In addition, the angle omega (Ω) of magazine each pixel is known, and projector 508 is same with the parallax range " D " between camera is known.Therefore, following equation is used to obtain the distance " Z " of the position be imaged from camera 510 to pixel:
Z D = sin ( Φ ) sin ( Ω + Φ ) - - - ( 1 )
Therefore, the three-dimensional coordinate of each pixel in obtained image can be calculated.
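The following short Python sketch works equation (1) for a single decoded correspondence and converts the result into a three-dimensional point. The split of the two-dimensional camera pixel angle into a horizontal component Ω and a vertical elevation component is an illustrative assumption; the text above only states that the two pixel angles, combined with Z, yield the three-dimensional coordinates.

```python
import math

def distance_from_triangulation(phi_rad, omega_rad, baseline_m):
    """Equation (1): Z / D = sin(phi) / sin(omega + phi)."""
    return baseline_m * math.sin(phi_rad) / math.sin(omega_rad + phi_rad)

def pixel_to_point(phi_rad, omega_rad, elevation_rad, baseline_m):
    """Turn one decoded correspondence into an (x, y, z) point.

    phi_rad       : projection angle recovered from the coded pattern
    omega_rad     : horizontal camera angle of the pixel (along the baseline)
    elevation_rad : vertical camera angle of the pixel (assumed split of the
                    two-dimensional pixel angle; illustrative only)
    baseline_m    : projector-to-camera baseline D
    """
    z = distance_from_triangulation(phi_rad, omega_rad, baseline_m)
    x = z * math.tan(omega_rad)        # offset along the baseline direction
    y = z * math.tan(elevation_rad)    # offset perpendicular to the baseline
    return (x, y, z)

# Example: a 150 mm baseline and decoded angles given in degrees
point = pixel_to_point(math.radians(40.0), math.radians(35.0),
                       math.radians(5.0), 0.150)
print(point)
```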
Generally, there are two class formation light, i.e. coded structured light and uncoded structured light.Common form as the uncoded structured light as shown in Figure 14 to Figure 17 and Figure 28 to Figure 30 depends on the candy strip changed in a periodic manner along a dimension.The pattern of these types is employed to provide the approximate distance apart from object usually in a sequential manner.Some uncoded pattern embodiments such as sinusoid pattern can provide the measurement result of relative good accuracy.But, in order to make these type patterns effective, usually need to make scanning device and object relative to each other keep static.When scanning device or object (relative to the opposing party) are in motion, the coding pattern as shown in Figure 18 to Figure 27 can be preferred.Coding pattern makes it possible to use single obtained image to analyze image.Some coding patterns can be arranged in (such as, the core line perpendicular in projector plane) on projector pattern with certain orientation, thus simplify the analysis of the three-dimensional surface coordinate based on single image.
An epipolar line is a mathematical line formed by the intersection of an epipolar plane with the source plane 517 or the image plane 521 (the plane of the camera sensor) in Figure 13B. An epipolar plane may be any plane that passes through the projector perspective center 519 and the camera perspective center. The epipolar lines on the source plane 517 and the image plane 521 may be parallel in some cases, but in general they are not parallel. An aspect of epipolar lines is that a given epipolar line on the projector plane 517 has a corresponding epipolar line on the image plane 521. Therefore, any particular pattern known on an epipolar line in the projector plane 517 may be directly observed and evaluated in the image plane 521. For example, if a coded pattern is placed along an epipolar line in the projector plane 517, the spacing between coded elements in the image plane 521 may be determined using the values read out from the pixels of the camera sensor 510. This information may be used to determine the three-dimensional coordinates of a point 527 on the object 501. It is also possible to tilt coded patterns at a known angle with respect to an epipolar line and efficiently extract the object surface coordinates. Examples of coded patterns are shown in Figures 20 to 29.
In embodiments having a periodic pattern, such as a sinusoidally repeating pattern, each sinusoidal period represents a plurality of pattern elements. Since there is a multiplicity of periodic patterns in two dimensions, the pattern elements are not collinear. In some cases, a striped pattern having stripes of varying width may represent a coded pattern.
Referring now to Figures 14 to 17, embodiments of uncoded structured light patterns are shown. Some of these patterns use a simple on-off (or 1, 0) type of pattern and are referred to as binary patterns. In some cases, the binary patterns are known patterns with a particular sequence referred to as a Gray code sequence. The term Gray code as used in the field of three-dimensional metrology based on structured light differs from its use in the field of electrical engineering, where it ordinarily refers to a sequence in which only a single bit changes at a time. This application follows the usage of the term Gray code that is conventional in the three-dimensional metrology field, where a Gray code typically represents a sequence of binary black and white values. Figure 14A shows an example of a binary pattern comprising a plurality of sequential images 530, 532, 534, each having a different striped pattern. Usually the stripes alternate between bright (illuminated) stripe regions and dark (non-illuminated) stripe regions; sometimes the terms white and black are used to mean illuminated and non-illuminated, respectively. Thus, when the images 530, 532, 534 are projected sequentially onto the surface 524, the composite image 536 shown in Figure 14B results. It should be noted that, for clarity, the two additional patterns 535, 537 at the bottom are not shown in Figure 14A. For each point on the object 501 (represented by a camera pixel in the image), the composite pattern 536 has a unique binary value obtained from the sequential projection of the patterns 530, 532, 534, 535, 537, and this value corresponds to a relatively small range of possible projection angles Φ. Using these projection angles, together with the known pixel angle Ω for a given pixel and the known baseline distance D, equation (1) can be used to find the distance Z from the camera to the object point. For each camera pixel, a two-dimensional angle is known. One of the two angular dimensions corresponds to the one-dimensional angle Ω used to calculate the distance Z according to equation (1). However, the line drawn from each camera pixel through the camera perspective center to the point where it intersects the object defines a two-dimensional angle in space. Combined with the calculated value Z, the two pixel angles provide the three-dimensional coordinates corresponding to a point on the object surface.
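The following is a hedged sketch of how such a sequence of binary (Gray-code) images might be reduced to a per-pixel stripe index, which in turn bounds the projection angle Φ. The thresholding scheme, the most-significant-pattern-first ordering, and all names are assumptions for illustration, not details taken from the patent.

```python
import numpy as np

def decode_gray_sequence(images, threshold=128):
    """Reduce a stack of sequentially captured binary-pattern images to a
    per-pixel stripe index.  `images` is a list of 2-D arrays, assumed to be
    ordered most significant pattern first; pixels brighter than `threshold`
    are treated as illuminated (1)."""
    bits = [(img > threshold).astype(np.uint32) for img in images]
    binary = bits[0].copy()          # Gray -> binary: b0 = g0
    index = binary.copy()
    for g in bits[1:]:               # b_i = b_(i-1) XOR g_i
        binary = binary ^ g
        index = (index << 1) | binary
    return index                     # stripe number per pixel, 0 .. 2**N - 1

# Synthetic example: five captures of a 4 x 4 pixel region.
rng = np.random.default_rng(0)
frames = [rng.integers(0, 256, (4, 4)) for _ in range(5)]
print(decode_gray_sequence(frames))
```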
Similarly, in addition to binary patterns, a series of sequential gray-scale patterns having stripes with different gray-scale values may be used. When used in this context, the term gray scale typically refers to the amount of irradiation at a point on the object, from white (maximum light) through different levels of gray (less light) to black (minimum light). This same nomenclature is used even if the projected light has a color such as red, in which case the gray-scale value corresponds to the level of red irradiation. In an embodiment, the pattern (Figure 15) includes, for example, a plurality of images 538, 540, 542 having stripes with different optical power levels, such as black stripes, gray stripes, and white stripes, which produce the emitted pattern on the object 501. The gray-scale values may be used to narrow the possible projection angle Φ to a relatively small range of possible values. As discussed above, equation (1) may then be used to determine the distance Z.
In another embodiment, the distance Z to an object point may be obtained by measuring a phase shift observed in multiple images. For example, in the embodiment shown in Figure 16, the gray-scale intensities 546, 548, 550 of the projected patterns 552 vary sinusoidally, but with a phase shift between the projected patterns. For example, in the first projected pattern the sinusoidal gray-scale intensity 546 (representing the optical power per unit area) has a phase of zero degrees at a particular point; in the second projected pattern the sinusoidal intensity 548 has a phase of 120 degrees at the same point; and in the third projected pattern the sinusoidal intensity 550 has a phase of 240 degrees at the same point. This is the same as saying that the sinusoidal pattern is shifted to the left (or right) by one-third of a period at each step. A phase-shift method is used to determine the phase of the projected light at each camera pixel, which eliminates the need to consider information from adjacent pixels, as is required with a single-shot coded pattern. The phase at a camera pixel can be determined in several ways. One method involves performing a multiply-and-accumulate procedure and then taking the arctangent of a quotient; this method is well known to those skilled in the art and is not discussed further. In addition, with the phase-shift method the background light cancels out of the phase calculation. For these reasons, the value of Z calculated for a given pixel is usually more accurate than the value of Z calculated with a single-shot coded pattern method. However, with a single set of sinusoidal patterns, such as those of Figure 16, the calculated phases all vary from 0 to 360 degrees. For a particular structured light triangulation system, these calculated phases are adequate as long as the "thickness" of the object under test does not vary too much, since the angle of each projected stripe is known in advance. If the object is too thick, however, an ambiguity may arise between the phases calculated for a particular pixel, because the pixel may have received light from a first projected ray striking the object at a first position or from a second projected ray striking the object at a second position. In other words, if there is a possibility that the phase of any pixel in the camera array changes by more than 2π radians, the phases may not be decoded correctly and the desired one-to-one correspondence cannot be achieved.
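A minimal sketch of the three-step phase calculation described above, assuming projected phase shifts of 0, 120, and 240 degrees; the specific arctangent form used here is one standard multiply-and-accumulate formulation and is an assumption, since the patent does not spell out the formula.

```python
import numpy as np

def three_step_phase(i0, i120, i240):
    """Wrapped phase (radians) at each pixel from three captures of a
    sinusoidal pattern shifted by 0, 120 and 240 degrees.  The arctangent
    form cancels both the background light and the modulation amplitude."""
    phase = np.arctan2(np.sqrt(3.0) * (i240 - i120), 2.0 * i0 - i120 - i240)
    return np.mod(phase, 2.0 * np.pi)

# Synthetic single-pixel check: true phase 1.0 rad, offset 50, amplitude 40.
true_phase = 1.0
i0, i120, i240 = (np.array([50.0 + 40.0 * np.cos(true_phase + s)])
                  for s in (0.0, 2.0 * np.pi / 3.0, 4.0 * np.pi / 3.0))
print(three_step_phase(i0, i120, i240))  # ~[1.0]
```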
Figure 17A shows sequences 1 to 4 of projected Gray-code intensities 554 according to a method by which the ambiguity in the distance Z obtained from the calculated phase may be eliminated. A set of Gray-code patterns is projected sequentially onto the object. In the example shown, there are four sequential patterns, indicated by 1, 2, 3, 4 to the left of 554 in Figure 17A. Sequential pattern 1 is dark (black) over the left half of the pattern (elements 0 to 15) and bright (white) over the right half (elements 16 to 31). Sequential pattern 2 has a dark band toward the center (elements 8 to 23) and bright bands toward the edges (elements 0 to 7 and 24 to 31). Sequential pattern 3 has two dark bands near the center (elements 4 to 11 and 20 to 27) and three separated bright bands (elements 0 to 3, 12 to 19, and 28 to 31). Sequential pattern 4 has four separated dark bands (elements 2 to 5, 10 to 13, 18 to 21, and 26 to 29) and five separated bright bands (elements 0 to 1, 6 to 9, 14 to 17, 22 to 25, and 30 to 31). For any given camera pixel, this series of patterns allows the "object thickness region" to be increased by a factor of 16 relative to the initial object thickness region corresponding to all of the elements 0 to 31.
In another method 556, shown in Figure 17C, a phase-shift method similar to that of Figure 16 is performed. In the embodiment of Figure 17C, a pattern 556A having four sinusoidal periods is projected onto the object. For the reasons discussed above, there may be an ambiguity in the distance Z to the object when the pattern of Figure 17C is used. One way to reduce or eliminate the ambiguity is to project one or more additional sinusoidal patterns 556B, 556C, each having a different fringe period (pitch). For example, in Figure 17B a second sinusoidal pattern 555 having three fringe periods rather than four is projected onto the object. In an embodiment, the difference in the phases of the two patterns 555, 556 may help eliminate the ambiguity in the distance Z to the target.
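A hedged sketch of how two wrapped phase maps measured with different fringe counts (for example four and three periods across the field) might be combined to remove the 2π ambiguity; the beat-phase (heterodyne) formulation and the assumption that the fringe counts differ by one are illustrative choices, not details from the patent.

```python
import numpy as np

def two_frequency_unwrap(phase_a, phase_b, periods_a=4, periods_b=3):
    """Resolve the 2*pi ambiguity of wrapped phase `phase_a`, measured with
    `periods_a` fringes across the field, using a second wrapped phase
    `phase_b` measured with one fewer fringe.  The beat of the two phases
    wraps only once over the whole field, so it selects the fringe order."""
    assert periods_a - periods_b == 1, "sketch assumes fringe counts differ by one"
    beat = np.mod(phase_a - phase_b, 2.0 * np.pi)          # ~ 2*pi * x, x in [0, 1)
    order = np.round((periods_a * beat - phase_a) / (2.0 * np.pi))
    return phase_a + 2.0 * np.pi * order                   # absolute phase of pattern A

# Synthetic check: a point 60 % of the way across the projected field.
x = 0.6
pa = np.mod(2.0 * np.pi * 4 * x, 2.0 * np.pi)   # wrapped phase, four-period pattern
pb = np.mod(2.0 * np.pi * 3 * x, 2.0 * np.pi)   # wrapped phase, three-period pattern
print(two_frequency_unwrap(np.array([pa]), np.array([pb])) / (2.0 * np.pi * 4))  # ~[0.6]
```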
Another way to remove the ambiguity is to use a different type of method, such as the Gray-code method of Figure 17A, to eliminate the ambiguity in the distance Z calculated with the sinusoidal phase-shift method.
In applications in which the object and the device 500 are in relative motion, it is desirable to use a single pattern that the camera 510 can capture and that provides enough information to measure the three-dimensional features of the object 501 without projecting a sequence of images. Referring now to Figures 18 and 19, the patterns 558, 566 have color distributions that, in some cases, allow the object to be measured from a single (coded) image. In the embodiment of Figure 18, the pattern 558 is generated with lines whose wavelength of light varies continuously in space, for example with the color of the pattern varying continuously from blue to green to yellow to red to violet. Thus, for each particular spectral wavelength, a one-to-one correspondence can be established between the emitted image and the acquired image. With the correspondence established, the three-dimensional coordinates of the object 501 can be determined from a single imaged pattern. In one embodiment, the stripes of the pattern 558 are oriented perpendicular to the epipolar lines on the projector plane. Because epipolar lines on the projector plane map to epipolar lines on the camera image plane, the association between projector points and camera points can be obtained by moving along the direction of an epipolar line in the camera image plane and noting the color of the line in each case. It should be appreciated that each pixel in the camera image plane corresponds to a two-dimensional angle. The color makes it possible to determine a one-to-one correspondence between a particular projection angle and a particular camera angle. This correspondence information, combined with the distance between the camera and the projector (the baseline distance D) and the angles of the camera and the projector relative to the baseline, is sufficient to determine the distance Z from the camera to the object.
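A rough sketch of this single-shot color decoding idea, assuming (purely for illustration) that hue varies linearly with projector column from blue to red over a known range of projection angles and that the surface does not strongly tint the reflected light; the violet end of the sweep and any color calibration are ignored here.

```python
import colorsys

def projection_angle_from_color(r, g, b, phi_min_deg=20.0, phi_max_deg=50.0):
    """Recover the projection angle encoded by a continuously varying colour
    stripe, assuming hue runs linearly from blue (at phi_min_deg) to red
    (at phi_max_deg) across the projector field."""
    h, _s, _v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    t = 1.0 - h / (2.0 / 3.0)        # blue hue (2/3) -> 0, red hue (0) -> 1
    return phi_min_deg + t * (phi_max_deg - phi_min_deg)

# A pure green pixel (hue 1/3) falls halfway through the sweep.
print(projection_angle_from_color(0, 255, 0))  # 35.0 degrees
```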
Figure 19 illustrates another embodiment that uses a color pattern. In this embodiment, a plurality of colored patterns having different intensities 560, 562, 564 are combined to generate the color pattern 566. In one embodiment, the intensities of the colored patterns 560, 562, 564 are in the primary colors, so that pattern 560 varies the red intensity, pattern 562 varies the green intensity, and pattern 564 varies the blue intensity. Since the ratios of the colors are known, the acquired image of the emitted pattern has a known relationship that can be decoded in the imaged pattern. As in the embodiment of Figure 18, once the correspondence is established, the three-dimensional coordinates of the object 501 can be determined. Unlike the pattern of Figure 18, in which a single cycle of unique colors is projected, the pattern of Figure 19 projects three nearly identical complete color cycles. With the pattern of Figure 18, the distance Z is unlikely to be ambiguous (at least when the projected lines are perpendicular to the epipolar lines), because each camera pixel identifies a particular color that corresponds uniquely to a particular projection direction. Since the camera angle and the projection angle are known, triangulation can be used to determine the three-dimensional object coordinates at each pixel position using only a single camera image; the method of Figure 18 may therefore be considered a coded, single-shot method. In contrast, with the pattern of Figure 19 there may be an ambiguity in the distance Z to an object point. For example, if the camera sees purple, the projector may have projected that color at any one of three different angles, and based on the triangulation geometry three different distances Z are possible. If the thickness of the object is known in advance to lie within a relatively small range of values, two of the three values may be eliminated, and the three-dimensional coordinates may be obtained with a single shot. In the general case, however, additional projected patterns are needed to remove the ambiguity. For example, the spatial period of the color pattern may be changed and the object illuminated again. In that case, the method of projecting structured light is considered a sequential method rather than a coded, single-shot method.
Referring now to Figures 20 to 23, coded structured light patterns based on stripe-indexing techniques, intended for acquisition of a single image, are shown. In the embodiments of Figures 20 and 21, patterns of color stripes 568, 570 are emitted by the projector 508. This technique takes advantage of the characteristics of an image sensor having three independent color channels, such as red, green, and blue, or cyan, yellow, and magenta. The combinations of values generated by these sensor channels can produce a large number of color patterns. As in the embodiment of Figure 19, the ratios of the color distribution are known, so the relationship between the emitted pattern and the imaged pattern can be determined and the three-dimensional coordinates calculated. Other types of color patterns may also be used, such as patterns based on De Bruijn sequences. Stripe-indexing techniques and De Bruijn sequences are known to those of ordinary skill in the art and are not discussed further.
In the embodiments of Figures 22 and 23, non-color (achromatic) stripe-indexing techniques are used. In the embodiment of Figure 22, the pattern 572 provides stripe groups with multiple intensity (gray-scale) levels and varying widths. A particular group of stripes therefore has a unique gray-scale pattern within the overall image. Because of this group uniqueness, a one-to-one correspondence between the emitted pattern and the imaged pattern can be determined and the coordinates of the object 501 calculated. In the embodiment of Figure 23, the pattern 574 provides a series of stripes with a segmented pattern. Because each line has a unique segment design, the correspondence between the emitted pattern and the imaged pattern can be determined and the coordinates of the object 501 calculated. In Figures 20 to 23, an additional advantage may be obtained by orienting the projected lines 572, 574 perpendicular to the epipolar lines so that they lie along epipolar lines in the camera plane, since this simplifies the determination of the second dimension when establishing the one-to-one correspondence between the camera pattern and the projector pattern.
Referring now to Figures 24 to 27, coded structured light patterns that use two-dimensional spatial grid techniques are shown. These types of patterns are arranged so that a sub-window (such as the window 576 on the pattern 578) is unique with respect to every other sub-window in the pattern. In the embodiment of Figure 24, a pseudo-random binary array pattern 578 is used. The pattern 578 uses a grid of elements (such as the circles 579) that form the coded pattern. It should be appreciated that elements having other geometric shapes may also be used, such as, but not limited to, squares, rectangles, and triangles. In the embodiment of Figure 25, the pattern 580 is shown with a multi-valued pseudo-random array in which each value in the array has an assigned shape 582. These shapes 582 form unique sub-windows 584, so that the correspondence between the emitted pattern and the imaged pattern allows the coordinates of the object 501 to be calculated. In the embodiment of Figure 26, the grid 586 is color coded with stripes perpendicular to the projector plane. The pattern of Figure 26 does not necessarily provide a pattern that can be decoded in a single shot, but the color information may help simplify the analysis. In the embodiment of Figure 27, an array 588 of colored shapes (such as squares or circles) is used to form the pattern.
Referring now to Figures 28A and 28B, an exemplary sinusoidal pattern 720 is shown. In an embodiment, the line 734 is perpendicular to the epipolar lines on the projector plane. The sinusoidal pattern 720 comprises thirty lines 722, which repeat once to give a total of sixty lines 722. Each line 722 has a sinusoidal feature 723 that is approximately 180 degrees out of phase with the lines above and below it. This allows the lines 722 to be placed as close together as possible while providing a large depth of field, since a line may become blurred on the projection surface or in the acquired image and still be identified. Each individual line 722 can be uniquely decoded using the phase of that line, where the length of the line must be at least one wavelength of the sinusoid.
Because the pattern 720 repeats, an ambiguity in identifying the lines would generally result. Within the system, however, this problem is resolved by the geometry of the camera's field of view and the depth of field. Over the depth of field in which the lines can be optically resolved, a single view of the camera, i.e., one row of pixels, cannot image two lines with the same phase. For example, the first row of pixels on the camera may receive reflected light only from lines 1 to 30 of the pattern, while another row of pixels farther down the camera sensor will receive reflected light only from lines 2 to 31 of the pattern, and so on. Figure 28B shows an enlarged portion of the pattern 720 with three lines, in which the phase between successive lines 722 is approximately 180 degrees. Figure 28B also shows how the phase of each individual line is sufficient to uniquely decode the line.
Referring now to Figures 29A and 29B, another pattern 730 with square pattern elements is shown. In an embodiment, the lines 732 are perpendicular to the epipolar lines on the projector plane. The square pattern 730 comprises twenty-seven lines 732 before the pattern 730 repeats, for a total of fifty-nine lines. The code elements 734 of the pattern 730 are separated by the phase of the square wave from left to right in Figure 29B. The pattern 730 is encoded so that a group of sequential lines 732 can be distinguished by the relative phases of its members. Sequential lines are acquired in the image by scanning vertically for lines; in an embodiment, scanning vertically means scanning along an epipolar line in the camera image plane. Sequential lines within a vertical column of camera pixels are paired and their relative phases determined. Four pairs of sequential lines are needed to decode the group of lines and locate it within the pattern 730. Because of the repetition, an ambiguity also exists in this pattern 730, but it can be resolved in the same manner described above for the sinusoidal pattern 720. Figure 29B shows an enlarged view of four lines 732 of the square pattern. This embodiment shows that the phase of a single line 732 by itself cannot uniquely decode the line, because the first and third lines have the same absolute phase.
Encoding with relative phase rather than absolute phase in this manner provides the advantage that a greater tolerance exists on the position of the phase. Small errors in the construction of the projector can shift the phase of a line over the depth of field of the camera, and errors caused by the projector and camera lenses make the absolute phase more difficult to determine. In an absolute-phase method, this could be overcome by increasing the period so that it is large enough to tolerate the errors in determining the phase.
It should be appreciated that, for the case of a coded two-dimensional pattern of light, at least three non-collinear pattern elements can be recognized because of their codes, and since the pattern elements are projected in two dimensions, at least three of them are not collinear. For the case of a periodic pattern, such as a sinusoidally repeating pattern, each sinusoidal period represents a plurality of pattern elements; since there is a multiplicity of periodic patterns in two dimensions, the pattern elements are not collinear. In contrast, for the case of a laser line scanner that emits a line of light, all of the pattern elements lie on a straight line. Although the line has width, and the tails of the line's cross-section carry less optical power than the peak of the signal, these aspects of the line are not evaluated separately in determining the surface coordinates of the object and therefore do not represent separate pattern elements. Although the line may contain multiple pattern elements, those pattern elements are collinear.
In addition, as shown in Figures 30 and 31, various pattern techniques may be combined to form a binary (Figure 30) uncoded checkerboard pattern 590 or a colored (Figure 31) uncoded checkerboard pattern 592. In another embodiment, shown in Figure 32, photometric stereo may be used, in which the light source 596 is moved to multiple positions and multiple images 594 of the object 501 are captured.
Referring now to Figure 33, another embodiment of a system 700 for acquiring the three-dimensional coordinates of an object 702 is shown. In this embodiment, the device 704 can operate independently when detached from the AACMM 100. The device 704 includes a controller 706 and an optional display 708. The display 708 may be integrated into the housing of the device 704, or it may be a separate component that is coupled to the device 704 when the device 704 is used independently of the AACMM 100. In embodiments in which the display 708 is separable from the device 704, the display 708 may include a controller (not shown) that provides additional functionality to facilitate independent operation of the device 704. In one embodiment, the controller 706 is disposed within the separable display.
The controller 706 includes communication circuitry configured to wirelessly transmit data, such as images or coordinate data, via a communication link 712 to the AACMM 100, to a separate computing device 710, or to a combination of the two. The computing device 710 may be, for example but not limited to, a computer, a laptop, a tablet computer, a personal digital assistant (PDA), or a mobile phone. The display 708 allows the operator to view the acquired image or the point cloud of coordinates of the object 702. In one embodiment, the controller 706 decodes the patterns in the acquired image to determine the three-dimensional coordinates of the object. In another embodiment, the image is acquired by the device 704 and transmitted to the AACMM 100, to the computing device 710, or to a combination of the two.
The device 704 may also include a positioning assembly 714. The positioning assembly may include one or more inertial navigation sensors, such as a global positioning system (GPS) sensor, a gyroscope sensor, or an acceleration sensor. Such sensors may be electrically coupled to the controller 706. The gyroscope and acceleration sensors may be single-axis or multi-axis devices. The positioning assembly 714 is configured to allow the controller 706 to measure or maintain the orientation of the device 704 when the device 704 is detached from the AACMM. The gyroscope in the positioning assembly 714 may be a MEMS gyroscope, a solid-state ring laser gyroscope, a fiber-optic gyroscope, or another type.
When the device 704 is detached from the articulated arm CMM 100, a method is used to combine the images obtained from multiple scans. In an embodiment, each image is acquired using a coded pattern, so that only a single image is needed to obtain the three-dimensional coordinates associated with a particular position and orientation of the device 704. One way to combine the multiple images captured by the device 704 is to provide at least partial overlap between adjacent images, so that point cloud features can be matched. This matching of features may be assisted by the inertial navigation sensors described above.
Another method that may be used to assist in accurately registering the images collected by the device 704 is the use of reference markers. In an embodiment, the reference markers are small markers with an adhesive or sticky backing, such as round markers, that are placed on one or more objects to be measured. Even a relatively small number of such markers can be very useful in registering multiple images, particularly if the object being measured has relatively few features suitable for registration. In an embodiment, the reference markers may be spots of light projected onto the one or more objects under inspection. For example, a small portable projector capable of emitting a multiplicity of spots may be placed in front of the one or more objects to be measured. An advantage of projected spots over adhesive markers is that the spots do not need to be attached and subsequently removed.
In one embodiment, the device can project structured light onto a contiguous and enclosed region 716 and can acquire an image of the region 716 at a range of 100 mm to 300 mm with an accuracy of 35 microns. In an embodiment, the projected area of the region 716 is approximately 150 mm² to 200 mm². The one or more cameras 510 may be digital cameras having a CMOS or CCD sensor of 1.2 to 5.0 megapixels.
With reference to Figures 28 and 29, the process of decoding the coded patterns will now be described. The first step in decoding an image of the pattern is to extract the centers of gravity (cog) 724 (Figure 28C) of the features of the projected pattern 720 along the Y direction. This is accomplished by computing a moving average of the gray-scale pixel values while moving down the image in the Y direction, processing a single column at a time. When a pixel value in the image rises above the moving average, the start of a feature is obtained. After the start point is found, the width of the feature continues to grow until the pixel value falls below the moving average. The pixel values between the start and end points and their Y positions are then used to compute a weighted mean, giving the center of gravity 724 of the pattern feature 723 in the image. The distance between the start and end points is also recorded for later use.
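A minimal sketch of this first decoding stage for one image column; the window length, the strict greater-than test, and the edge padding are illustrative assumptions rather than values taken from the patent.

```python
import numpy as np

def column_centroids(column, window=15):
    """Centres of gravity of pattern features along one image column: a
    running mean of the grey values serves as the local background, a run of
    pixels above that mean is treated as one feature, and the intensity-
    weighted mean of its row positions is the feature's centre of gravity.
    Returns (cog, width) pairs; the width is kept for the later block stage."""
    col = np.asarray(column, dtype=float)
    pad = window // 2
    background = np.convolve(np.pad(col, pad, mode="edge"),
                             np.ones(window) / window, mode="valid")
    above = col > background
    features, start = [], None
    for y, flag in enumerate(above):
        if flag and start is None:
            start = y                                  # feature begins
        elif not flag and start is not None:
            vals, ys = col[start:y], np.arange(start, y)
            features.append((float(np.sum(vals * ys) / np.sum(vals)), y - start))
            start = None                               # feature ends
    return features

# Synthetic column with two bright stripes centred on rows 20 and 40.
col = np.zeros(60)
col[18:23] = 100.0
col[38:43] = 100.0
print(column_centroids(col))  # [(20.0, 5), (40.0, 5)]
```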
The resulting centers of gravity 724 are then used to find the pattern lines 722. This is accomplished by moving from the first column of the image in a left-to-right direction (when viewed in the orientation shown in the figure). For each center of gravity 724 in a column, the immediately adjacent column to the right is searched for a center of gravity 724 lying within a specified distance. If two matching centers of gravity 724 are found, a potential line is established. As processing moves across the image, new lines are established, and previously established lines grow in length as additional centers of gravity 724 are detected within tolerance. After the entire image has been processed, a filter is applied to the extracted lines to ensure that only lines of the required length, which is the wavelength of the pattern, are used in the remaining steps. Figure 28C also shows the detected lines, all of which are longer than a single wavelength of the pattern. In one embodiment, there is no delta, or only a small delta, between the centers of gravity of adjacent columns.
The next step of the decoding process is to extract the features of the projected pattern along the line in the X direction, in the form of block centers. Each pattern contains both wide and narrow blocks. In the sinusoidal pattern 720 these are the crests and troughs of the wave, and in the square pattern 730 these are the wide and narrow squares. This is carried out in a manner similar to the extraction of features along the Y direction, except that the moving average is computed using the widths obtained in the first stage and the direction of travel is along the line. As before, features are extracted in regions where the width is greater than the moving average, but in this step features are also extracted in regions where the width is less than the moving average. The widths and X positions are used to compute a weighted mean to obtain the center of each block 726 along the X direction. The Y positions of the centers of gravity 724 between the moving-average crossings are also used to compute the center of the block 726 along the Y direction; this is done by averaging the Y coordinates of the centers of gravity. The start and end points of each line may also be revised, based on the features extracted in this step, to ensure that both points are moving-average crossing points. In one embodiment, only complete blocks are used in the subsequent processing steps.
The lines and blocks are then processed further to ensure that the distance between block centers 726 on each line is within a predetermined tolerance. This is done by taking the delta between the X centers of two adjacent blocks on a line and checking that the delta is less than the tolerance. If the delta is greater than the tolerance, the line is broken into shorter lines. If the separation involves the last two blocks on the line, the last block is removed and no additional line is generated. If the separation is between the first and second blocks, or between the second and third blocks, on the line, the blocks to the left of the break are discarded and no additional line is generated. For a break occurring at any other position along the line, the line is broken into two lines, generating a new line to which the appropriate blocks are transferred. After this processing stage, the two patterns require different steps to complete the decoding.
At this point, the block centers on each line can be used in a further processing step to decode the sinusoidal pattern 720. The modulus of each block X center with respect to the wavelength of the pattern on the line 722 is calculated, and these values are averaged to give the phase of the line 722. The phase of the line 722 can then be used to decode the line within the pattern 720, which in turn allows the X, Y, Z coordinate positions of all the centers of gravity 724 on the line 722 to be determined.
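A hedged sketch of this phase-based line identification, assuming for simplicity that the supplied block centers are spaced one wavelength apart (for example, only the crest blocks) and that a table of per-line design phases is available; the table used below is illustrative and is not the actual phase assignment of pattern 720.

```python
import numpy as np

def line_phase(block_x_centers, wavelength_px):
    """Average phase of one detected line: each block centre's X position
    modulo the pattern wavelength, expressed as an angle and averaged on the
    unit circle so that values near 0 and 2*pi do not cancel."""
    frac = np.mod(np.asarray(block_x_centers, float), wavelength_px) / wavelength_px
    ang = 2.0 * np.pi * frac
    return float(np.mod(np.arctan2(np.mean(np.sin(ang)), np.mean(np.cos(ang))),
                        2.0 * np.pi))

def identify_line(phase, design_phases, tol=0.3):
    """Match a measured line phase against a table of per-line design phases;
    returns the best-matching line index, or None if nothing is within `tol`."""
    diffs = np.abs(np.angle(np.exp(1j * (np.asarray(design_phases) - phase))))
    best = int(np.argmin(diffs))
    return best if diffs[best] < tol else None

# Illustrative table: 30 lines whose design phases step by 1/30 of a cycle.
table = [2.0 * np.pi * k / 30 for k in range(30)]
measured = line_phase([12.1, 44.0, 76.2, 108.1], wavelength_px=32.0)
print(identify_line(measured, table))  # 11
```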
Before the square pattern 730 is decoded, the lines 732 are first connected vertically, prior to any decoding. This makes it possible to identify a group of lines rather than only a single line as with the sinusoidal pattern. The connections 736 are found using the blocks 734 and the centers of gravity contained in the blocks on the lines 732, which were calculated in the first processing stage. The first center of gravity in each block on a line 732 is examined to check whether there is another center of gravity in the same column directly below it. If there is no center of gravity below, there is no connection to another line at that point and processing continues. If there is a center of gravity below, the Y distance between the two centers of gravity is determined and compared with the maximum spacing allowed between lines. If the distance is less than this value, the two lines are considered connected at that point, the connection 736 is stored, and processing proceeds to the next block. In one embodiment, line connections 736 are unique, so that there is no more than one connection 736 between any two lines.
The next step in processing the square pattern 730 is the calculation of the phase between connected lines. Each pair of lines 732 is first processed to determine the length of the overlap between the lines 732. In one embodiment, the pair of lines must overlap by at least one wavelength so that the relative phase can be calculated. If the lines have the required overlap, the center of gravity located at the center of the overlapping region is found. For that line connection, the blocks 738 containing that center of gravity and the center of gravity directly below it are determined, and the relative phase between the block X centers is calculated. This process is repeated for all of the connections between lines. In one embodiment, the process is repeated only in the downward direction along the Y axis, because the encoding is based on the connections below a line rather than above it or in both directions. Figure 29C shows the blocks 738 that can be used to calculate the relative phases for this group of lines. The relative phases in the embodiment of Figure 29C are 3, 1, and 2, and these phases can be used in the final stage to decode the top line.
The next step in decoding the square pattern 730 is to perform a lookup using the relative phases calculated in the previous step. Each line 732 is processed by following its connections 736 downward until a depth of four connections is reached; this depth is used because it is the number of phases used to encode a line. At each level of connection, the relative phase between the lines 732 is used to build a hash. When the required connection depth is reached, the hash is used to look up the line code. If the hash returns a valid code, the code is recorded and stored in a voting system. Each line 732 is processed in this way, and every chain of connections of the required depth whose phases form a valid combination produces a vote. The final step is then to find, for each line 732, which code received the most votes and to assign that code to the line 732. If no code received a unique highest number of votes, no code is assigned to the line. Once a code has been assigned, the line 732 is identified, and the X, Y, Z coordinate positions of all the centers of gravity on the line 732 can then be determined.
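A rough sketch of the hash-and-vote lookup for a single line; the data structures (a map of downward connections, a map of measured relative phases, and a code table keyed by four relative phases) are assumed formats for illustration, since the patent describes only the general procedure.

```python
from collections import Counter

def decode_line(start, below, rel_phase, code_table, depth=4):
    """Vote-based decoding of one line of the square pattern.  `below[line]`
    lists the lines connected directly beneath it, `rel_phase[(a, b)]` is the
    measured relative phase of that connection, and `code_table` maps a tuple
    of `depth` relative phases to a line code."""
    votes = Counter()

    def walk(line, phases):
        if len(phases) == depth:                      # required connection depth reached
            code = code_table.get(tuple(phases))      # hash lookup
            if code is not None:
                votes[code] += 1                      # valid combination -> one vote
            return
        for nxt in below.get(line, []):
            walk(nxt, phases + [rel_phase[(line, nxt)]])

    walk(start, [])
    if not votes:
        return None
    (code, top), *rest = votes.most_common(2)
    return code if not rest or rest[0][1] < top else None   # require a unique winner

# Tiny example: a single chain of five lines with relative phases 3, 1, 2, 1.
below = {"A": ["B"], "B": ["C"], "C": ["D"], "D": ["E"]}
rel_phase = {("A", "B"): 3, ("B", "C"): 1, ("C", "D"): 2, ("D", "E"): 1}
code_table = {(3, 1, 2, 1): 7}
print(decode_line("A", below, rel_phase, code_table))  # 7
```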
It should be noted that although the discussion above distinguishes between line scanners and area (structured light) scanners on the basis of whether three or more pattern elements are collinear, the purpose of this criterion is to distinguish patterns projected as areas from patterns projected as lines. Consequently, a pattern projected in a linear fashion, having information only along a single path, is a line pattern even if that one-dimensional pattern is curved.
While the invention has been described with reference to example embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof without departing from the scope of the invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from its essential scope. Therefore, it is intended that the invention not be limited to the particular embodiment disclosed as the best mode contemplated for carrying out the invention, but that the invention will include all embodiments falling within the scope of the appended claims. Moreover, the use of the terms first, second, and the like does not denote any order or importance; rather, these terms are used to distinguish one element from another. Furthermore, the use of the terms a, an, and the like does not denote a limitation of quantity, but rather denotes the presence of at least one of the referenced items.

Claims (46)

1. A portable articulated arm coordinate measuring machine (AACMM) for measuring the three-dimensional coordinates of an object in space, comprising:
a base;
a manually positionable arm portion having opposed first and second ends, the arm portion being rotatably coupled to the base and including a plurality of connected arm segments, each arm segment including at least one position detector for producing a position signal;
an electronic circuit that receives the position signals from the at least one position detector of each arm segment;
a probe end coupled to the first end;
a non-contact three-dimensional measuring device coupled to the probe end, the non-contact three-dimensional measuring device having a projector and an image sensor, the projector having a source plane and being configured to emit structured light onto the object, the structured light being emitted from the source plane and including at least three non-collinear pattern elements, and the image sensor being arranged to receive the structured light reflected from the object; and
a processor electrically coupled to the electronic circuit, the processor being configured to determine the three-dimensional coordinates of a point on the object in response to receiving the position signals from the position detectors and in response to the structured light received by the image sensor.
2. The AACMM of claim 1, wherein the non-contact three-dimensional measuring device is removably coupled to the probe end.
3. The AACMM of claim 1, wherein the structured light is a coded structured light pattern.
4. The AACMM of claim 3, wherein the non-contact three-dimensional measuring device is operable independently when detached from the probe end.
5. The AACMM of claim 3, wherein the coded structured light pattern comprises pattern elements, the pattern elements comprising at least one of a square, a rectangle, and a triangle.
6. The AACMM of claim 5, wherein the coded light pattern is a sinusoidal pattern comprising thirty lines, the phase of each line being offset by 180 degrees from the phase of the adjacent lines.
7. The AACMM of claim 5, wherein the coded structured light pattern comprises twenty-seven lines and the pattern elements are squares.
8. The AACMM of claim 3, wherein the coded structured light comprises a pattern having a plurality of wavelengths, at least one of the wavelengths having a spatial arrangement different from the other wavelengths.
9. The AACMM of claim 3, wherein the coded structured light comprises a pattern having a plurality of different colors.
10. The AACMM of claim 3, wherein the coded structured light comprises a segmented line pattern.
11. The AACMM of claim 3, wherein the coded structured light comprises a two-dimensional spatial grid pattern.
12. The AACMM of claim 11, wherein the two-dimensional spatial grid pattern comprises a pseudo-random binary array.
13. The AACMM of claim 11, wherein the two-dimensional spatial grid pattern comprises a color-coded grid.
14. The AACMM of claim 11, wherein the two-dimensional spatial grid pattern comprises a multi-valued pseudo-random array.
15. The AACMM of claim 11, wherein the two-dimensional spatial grid pattern comprises a two-dimensional array of color-coded geometric shapes.
16. The AACMM of claim 1, wherein the structured light pattern is an uncoded structured light pattern.
17. The AACMM of claim 16, wherein the uncoded structured light pattern comprises sequentially projected images.
18. The AACMM of claim 17, wherein the sequentially projected images are a set of binary patterns.
19. The AACMM of claim 17, wherein the sequentially projected images are a set of patterns comprising stripes having at least two brightness levels.
20. The AACMM of claim 17, wherein the sequentially projected images are a set of at least three sinusoidal patterns.
21. The AACMM of claim 17, wherein the sequentially projected images are a set of patterns comprising stripes having at least two brightness levels, the set of patterns including at least three sinusoidal patterns.
22. The AACMM of claim 16, wherein the uncoded structured light pattern comprises a sequence of different exposure patterns, each pattern in the sequence being projected from a different position relative to the object.
23. The AACMM of claim 16, wherein the uncoded structured light pattern comprises a repeating gray-scale pattern.
24. The AACMM of claim 1, further comprising a contact measurement device coupled to the probe end.
25. The AACMM of claim 1, wherein the processor is located within the non-contact three-dimensional measuring device.
26. A method of operating a portable articulated arm coordinate measuring machine for measuring the coordinates of an object in space, comprising:
providing a manually positionable arm portion having opposed first and second ends, the arm portion including a plurality of connected arm segments, each arm segment including at least one position detector for producing a position signal;
providing a probe end, coupled to the first end, for measuring the object;
receiving the position signals from the position detectors at an electronic circuit;
providing a three-dimensional non-contact measuring device having a controller, the measuring device having a sensor and a projector, the projector having a source plane and being configured to emit structured light onto the object, the structured light being emitted from the source plane and including at least three non-collinear pattern elements; and
projecting the structured light from the three-dimensional measuring device onto the object.
27. The method of claim 26, further comprising receiving, with the three-dimensional measuring device, the structured light reflected from the object.
28. The method of claim 27, further comprising determining the three-dimensional coordinates of a point on the object from the reflected structured light.
29. The method of claim 28, wherein the structured light is coded structured light.
30. The method of claim 29, wherein the coded structured light pattern projected onto the object comprises pattern elements, the pattern elements comprising at least one of a square, a rectangle, and a triangle.
31. The method of claim 29, wherein the coded light pattern is a sinusoidal pattern comprising thirty lines, the phase of each line being offset by 180 degrees from the phase of the adjacent lines.
32. The method of claim 29, wherein the coded structured light pattern comprises twenty-seven lines and the pattern elements are squares.
33. The method of claim 29, wherein the coded structured light comprises a pattern having a plurality of wavelengths, at least one of the wavelengths having a spatial arrangement different from the other wavelengths.
34. The method of claim 33, wherein the pattern having a plurality of wavelengths is arranged in lines oriented substantially perpendicular to the source plane.
35. The method of claim 29, wherein the coded structured light comprises a single pattern having a plurality of different colors.
36. The method of claim 29, wherein the coded structured light comprises a segmented line pattern.
37. The method of claim 29, wherein the coded structured light comprises a two-dimensional spatial grid pattern.
38. The method of claim 37, wherein the two-dimensional spatial grid pattern comprises a pseudo-random binary array.
39. The method of claim 37, wherein the two-dimensional spatial grid pattern comprises a color-coded grid.
40. The method of claim 37, wherein the two-dimensional spatial grid pattern comprises a multi-valued pseudo-random array.
41. The method of claim 37, wherein the two-dimensional spatial grid pattern comprises a two-dimensional array of color-coded geometric shapes.
42. The method of claim 29, further comprising changing, with the electronic circuit, the coded structured light from a first pattern to a second pattern.
43. The method of claim 42, wherein the coded structured light is changed in response to an input from an operator.
44. The method of claim 42, wherein the coded structured light is changed automatically by the electronic circuit in response to a change in a condition of the object.
45. The method of claim 28, further comprising:
detaching the three-dimensional measuring device from the probe end; and
operating the three-dimensional measuring device independently of the probe end.
46. The method of claim 45, further comprising transmitting data from the three-dimensional measuring device while it operates independently of the probe end.
CN201380029985.4A 2012-06-07 2013-05-20 Coordinate measurement machines with removable accessories Pending CN104380033A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US13/491,176 2012-06-07
US13/491,176 US8832954B2 (en) 2010-01-20 2012-06-07 Coordinate measurement machines with removable accessories
PCT/US2013/041826 WO2013184340A1 (en) 2012-06-07 2013-05-20 Coordinate measurement machines with removable accessories

Publications (1)

Publication Number Publication Date
CN104380033A true CN104380033A (en) 2015-02-25

Family

ID=48537024

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201380029985.4A Pending CN104380033A (en) 2012-06-07 2013-05-20 Coordinate measurement machines with removable accessories

Country Status (5)

Country Link
JP (1) JP5816773B2 (en)
CN (1) CN104380033A (en)
DE (1) DE112013002824T5 (en)
GB (1) GB2517621A (en)
WO (1) WO2013184340A1 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106052591A (en) * 2015-04-10 2016-10-26 佳能株式会社 Measurement device and method for measuring the shape of an object to be measured, system, and article production method
CN106712160A (en) * 2015-07-30 2017-05-24 安徽啄木鸟无人机科技有限公司 Charging method of quick unmanned aerial vehicle (UAV) charging system
CN106767410A (en) * 2015-11-19 2017-05-31 手持产品公司 high-resolution dot pattern
CN107835931A (en) * 2015-12-04 2018-03-23 安德烈.弗拉基米罗维奇.克里莫夫 The method for monitoring the linear dimension of 3D solid
CN107957236A (en) * 2016-10-14 2018-04-24 卡尔蔡司工业测量技术有限公司 Method for operating coordinate measuring machine
CN108475147A (en) * 2016-01-14 2018-08-31 精工爱普生株式会社 Pattern recognition device, image-recognizing method and image identification unit
CN109952488A (en) * 2016-09-09 2019-06-28 优质视觉技术国际公司 For measuring the articulated joint with multiple sensors of machine
CN113188450A (en) * 2021-04-23 2021-07-30 封泽希 Scene depth detection method and system based on structured light

Families Citing this family (63)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8908995B2 (en) 2009-01-12 2014-12-09 Intermec Ip Corp. Semi-automatic dimensioning with imager on a portable device
US9879976B2 (en) 2010-01-20 2018-01-30 Faro Technologies, Inc. Articulated arm coordinate measurement machine that uses a 2D camera to determine 3D coordinates of smoothly continuous edge features
US9607239B2 (en) 2010-01-20 2017-03-28 Faro Technologies, Inc. Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations
DE102010020925B4 (en) 2010-05-10 2014-02-27 Faro Technologies, Inc. Method for optically scanning and measuring an environment
US9779546B2 (en) 2012-05-04 2017-10-03 Intermec Ip Corp. Volume dimensioning systems and methods
US10007858B2 (en) 2012-05-15 2018-06-26 Honeywell International Inc. Terminals and methods for dimensioning objects
US10321127B2 (en) 2012-08-20 2019-06-11 Intermec Ip Corp. Volume dimensioning system calibration systems and methods
US9939259B2 (en) 2012-10-04 2018-04-10 Hand Held Products, Inc. Measuring object dimensions using mobile computer
US10067231B2 (en) 2012-10-05 2018-09-04 Faro Technologies, Inc. Registration calculation of three-dimensional scanner data performed between scans based on measurements by two-dimensional scanner
DE102012109481A1 (en) 2012-10-05 2014-04-10 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US20140104413A1 (en) 2012-10-16 2014-04-17 Hand Held Products, Inc. Integrated dimensioning and weighing system
US9080856B2 (en) 2013-03-13 2015-07-14 Intermec Ip Corp. Systems and methods for enhancing dimensioning, for example volume dimensioning
US10228452B2 (en) 2013-06-07 2019-03-12 Hand Held Products, Inc. Method of error correction for 3D imaging device
JP6465682B2 (en) * 2014-03-20 2019-02-06 キヤノン株式会社 Information processing apparatus, information processing method, and program
US9769454B2 (en) 2014-06-20 2017-09-19 Stmicroelectronics S.R.L. Method for generating a depth map, related system and computer program product
US10656617B2 (en) 2014-07-16 2020-05-19 Faro Technologies, Inc. Measurement device for machining center
US20160016274A1 (en) * 2014-07-16 2016-01-21 Faro Technologies, Inc. Measurement device for machining center
US9823059B2 (en) 2014-08-06 2017-11-21 Hand Held Products, Inc. Dimensioning system with guided alignment
EP3194883B1 (en) * 2014-08-07 2018-10-17 Ingenera SA Method and relevant device for measuring distance with auto-calibration and temperature compensation
US9693040B2 (en) 2014-09-10 2017-06-27 Faro Technologies, Inc. Method for optically measuring three-dimensional coordinates and calibration of a three-dimensional measuring device
US9602811B2 (en) 2014-09-10 2017-03-21 Faro Technologies, Inc. Method for optically measuring three-dimensional coordinates and controlling a three-dimensional measuring device
GB2545603B (en) * 2014-09-10 2020-04-15 Faro Tech Inc A portable device for optically measuring three-dimensional coordinates
US9671221B2 (en) 2014-09-10 2017-06-06 Faro Technologies, Inc. Portable device for optically measuring three-dimensional coordinates
DE102014013677B4 (en) 2014-09-10 2017-06-22 Faro Technologies, Inc. Method for optically scanning and measuring an environment with a handheld scanner and subdivided display
DE102014013678B3 (en) 2014-09-10 2015-12-03 Faro Technologies, Inc. Method for optically sensing and measuring an environment with a handheld scanner and gesture control
CN107076551B (en) * 2014-09-19 2021-02-02 海克斯康测量技术有限公司 Multi-mode portable coordinate measuring machine
US10810715B2 (en) 2014-10-10 2020-10-20 Hand Held Products, Inc System and method for picking validation
US9779276B2 (en) 2014-10-10 2017-10-03 Hand Held Products, Inc. Depth sensor based auto-focus system for an indicia scanner
US10775165B2 (en) 2014-10-10 2020-09-15 Hand Held Products, Inc. Methods for improving the accuracy of dimensioning-system measurements
US9897434B2 (en) 2014-10-21 2018-02-20 Hand Held Products, Inc. Handheld dimensioning system with measurement-conformance feedback
US9752864B2 (en) 2014-10-21 2017-09-05 Hand Held Products, Inc. Handheld dimensioning system with feedback
US9762793B2 (en) 2014-10-21 2017-09-12 Hand Held Products, Inc. System and method for dimensioning
US10060729B2 (en) 2014-10-21 2018-08-28 Hand Held Products, Inc. Handheld dimensioner with data-quality indication
DE102015205187A1 (en) * 2015-03-23 2016-09-29 Siemens Aktiengesellschaft Method and device for the projection of line pattern sequences
US9786101B2 (en) 2015-05-19 2017-10-10 Hand Held Products, Inc. Evaluating image values
US10066982B2 (en) 2015-06-16 2018-09-04 Hand Held Products, Inc. Calibrating a volume dimensioner
US9857167B2 (en) 2015-06-23 2018-01-02 Hand Held Products, Inc. Dual-projector three-dimensional scanner
US20160377414A1 (en) 2015-06-23 2016-12-29 Hand Held Products, Inc. Optical pattern projector
US9835486B2 (en) 2015-07-07 2017-12-05 Hand Held Products, Inc. Mobile dimensioner apparatus for use in commerce
EP3396313B1 (en) 2015-07-15 2020-10-21 Hand Held Products, Inc. Mobile dimensioning method and device with dynamic accuracy compatible with nist standard
US20170017301A1 (en) 2015-07-16 2017-01-19 Hand Held Products, Inc. Adjusting dimensioning results using augmented reality
US10249030B2 (en) 2015-10-30 2019-04-02 Hand Held Products, Inc. Image transformation for indicia reading
DE102015122844A1 (en) 2015-12-27 2017-06-29 Faro Technologies, Inc. 3D measuring device with battery pack
US10025314B2 (en) 2016-01-27 2018-07-17 Hand Held Products, Inc. Vehicle positioning and object avoidance
KR102482062B1 (en) * 2016-02-05 2022-12-28 주식회사바텍 Dental three-dimensional scanner using color pattern
US10339352B2 (en) 2016-06-03 2019-07-02 Hand Held Products, Inc. Wearable metrological apparatus
US9940721B2 (en) 2016-06-10 2018-04-10 Hand Held Products, Inc. Scene change detection in a dimensioner
US10163216B2 (en) 2016-06-15 2018-12-25 Hand Held Products, Inc. Automatic mode switching in a volume dimensioner
CN106091930B (en) * 2016-08-16 2019-01-11 郑州辰维科技股份有限公司 A kind of real-time online measuring method based on double camera measuring system and structured light sensor
EP3315902B1 (en) * 2016-10-27 2023-09-06 Pepperl+Fuchs SE Measuring device and method for triangulation measurement
US10909708B2 (en) 2016-12-09 2021-02-02 Hand Held Products, Inc. Calibrating a dimensioner using ratios of measurable parameters of optic ally-perceptible geometric elements
US11047672B2 (en) 2017-03-28 2021-06-29 Hand Held Products, Inc. System for optically dimensioning
US20200408512A1 (en) * 2018-03-16 2020-12-31 Nec Corporation Three-dimensional shape measuring apparatus, three-dimensional shape measuring method, program, and storage medium
US10584962B2 (en) 2018-05-01 2020-03-10 Hand Held Products, Inc System and method for validating physical-item security
FI128523B (en) 2018-06-07 2020-07-15 Ladimo Oy Modeling the topography of a three-dimensional surface
US20200014909A1 (en) 2018-07-03 2020-01-09 Faro Technologies, Inc. Handheld three dimensional scanner with autofocus or autoaperture
FR3083602B1 (en) * 2018-07-06 2020-09-18 Hexagon Metrology Sas MEASURING ARM WITH MULTIFUNCTIONAL END
FR3083605B1 (en) * 2018-07-06 2020-09-18 Hexagon Metrology Sas MEASURING ARM WITH MULTIFUNCTIONAL END
WO2020136885A1 (en) * 2018-12-28 2020-07-02 ヤマハ発動機株式会社 Three-dimensional measurement device and workpiece processing device
WO2021003444A2 (en) * 2019-07-02 2021-01-07 Nikon Corporation Metrology for additive manufacturing
US11763473B2 (en) 2020-07-23 2023-09-19 Zhejiang Hanchine Ai Tech. Co., Ltd. Multi-line laser three-dimensional imaging method and system based on random lattice
CN111854642B (en) * 2020-07-23 2021-08-10 浙江汉振智能技术有限公司 Multi-line laser three-dimensional imaging method and system based on random dot matrix
WO2022207201A1 (en) * 2021-03-29 2022-10-06 Sony Semiconductor Solutions Corporation Depth sensor device and method for operating a depth sensor device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060017720A1 (en) * 2004-07-15 2006-01-26 Li You F System and method for 3D measurement and surface reconstruction
CN1812868A (en) * 2003-04-28 2006-08-02 斯蒂芬·詹姆斯·克兰普顿 CMM arm with exoskeleton
US20110164114A1 (en) * 2010-01-06 2011-07-07 Canon Kabushiki Kaisha Three-dimensional measurement apparatus and control method therefor
WO2011090892A2 (en) * 2010-01-20 2011-07-28 Faro Technologies, Inc. Coordinate measurement machines with removable accessories
EP2400261A1 (en) * 2010-06-21 2011-12-28 Leica Geosystems AG Optical measurement method and measurement system for determining 3D coordinates on a measurement object surface

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5611147A (en) 1993-02-23 1997-03-18 Faro Technologies, Inc. Three dimensional coordinate measuring apparatus
US5402582A (en) 1993-02-23 1995-04-04 Faro Technologies Inc. Three dimensional coordinate measuring apparatus

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1812868A (en) * 2003-04-28 2006-08-02 Stephen James Crampton CMM arm with exoskeleton
US20060017720A1 (en) * 2004-07-15 2006-01-26 Li You F System and method for 3D measurement and surface reconstruction
US20110164114A1 (en) * 2010-01-06 2011-07-07 Canon Kabushiki Kaisha Three-dimensional measurement apparatus and control method therefor
WO2011090892A2 (en) * 2010-01-20 2011-07-28 Faro Technologies, Inc. Coordinate measurement machines with removable accessories
EP2400261A1 (en) * 2010-06-21 2011-12-28 Leica Geosystems AG Optical measurement method and measurement system for determining 3D coordinates on a measurement object surface

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106052591A (en) * 2015-04-10 2016-10-26 佳能株式会社 Measurement device and method for measuring the shape of an object to be measured, system, and article production method
CN106052591B (en) * 2015-04-10 2019-09-03 佳能株式会社 Measuring device, measurement method, system and article production method
CN106712160B (en) * 2015-07-30 2019-05-21 安徽啄木鸟无人机科技有限公司 Charging method of a rapid charging system for unmanned aerial vehicles (UAVs)
CN106712160A (en) * 2015-07-30 2017-05-24 安徽啄木鸟无人机科技有限公司 Charging method of a rapid charging system for unmanned aerial vehicles (UAVs)
CN106767410A (en) * 2015-11-19 2017-05-31 Hand Held Products, Inc. High-resolution dot pattern
CN106767410B (en) * 2015-11-19 2023-09-19 Hand Held Products, Inc. High resolution dot pattern
CN107835931A (en) * 2015-12-04 2018-03-23 Andrei Vladimirovich Klimov Method for monitoring linear dimensions of three-dimensional objects
CN108475147A (en) * 2016-01-14 2018-08-31 Seiko Epson Corporation Image recognition device, image recognition method, and image recognition unit
CN109952488A (en) * 2016-09-09 2019-06-28 Quality Vision International, Inc. Articulated joint with multiple sensors for a measuring machine
CN109952488B (en) * 2016-09-09 2021-02-02 Quality Vision International, Inc. Articulated joint with multiple sensors for a measuring machine
CN107957236B (en) * 2016-10-14 2020-10-27 Carl Zeiss Industrielle Messtechnik GmbH Method for operating a coordinate measuring machine
CN107957236A (en) * 2016-10-14 2018-04-24 Carl Zeiss Industrielle Messtechnik GmbH Method for operating a coordinate measuring machine
CN113188450A (en) * 2021-04-23 2021-07-30 封泽希 Scene depth detection method and system based on structured light
CN113188450B (en) * 2021-04-23 2023-03-14 封泽希 Scene depth detection method and system based on structured light

Also Published As

Publication number Publication date
JP5816773B2 (en) 2015-11-18
GB2517621A (en) 2015-02-25
DE112013002824T5 (en) 2015-04-02
JP2015524916A (en) 2015-08-27
WO2013184340A1 (en) 2013-12-12

Similar Documents

Publication Publication Date Title
CN104380033A (en) Coordinate measurement machines with removable accessories
US11262194B2 (en) Triangulation scanner with blue-light projector
US10281259B2 (en) Articulated arm coordinate measurement machine that uses a 2D camera to determine 3D coordinates of smoothly continuous edge features
US10060722B2 (en) Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations
US9628775B2 (en) Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations
US8832954B2 (en) Coordinate measurement machines with removable accessories
US9500469B2 (en) Laser line probe having improved high dynamic range
US10812694B2 (en) Real-time inspection guidance of triangulation scanner
US20140192187A1 (en) Non-contact measurement device
US20140081459A1 (en) Depth mapping vision system with 2D optical pattern for robotic applications
WO2015166915A1 (en) Measurement device
CN104350356A (en) Coordinate measurement machines with removable accessories
CN103712572A (en) Device combining a structured light source and a camera for measuring three-dimensional coordinates of an object contour
CN104040285B (en) Coordinate measurement machines with removable accessories
EP3385661B1 (en) Articulated arm coordinate measurement machine that uses a 2d camera to determine 3d coordinates of smoothly continuous edge features
WO2016044014A1 (en) Articulated arm coordinate measurement machine having a 2d camera and method of obtaining 3d representations
CN105547191A (en) Color 3D measuring system
CN105571522A (en) Color 3D measurement system
CN105547194A (en) Color 3D measuring system
CN105547195A (en) Color 3D measuring system
CN105547193A (en) Color 3D measuring system
CN105547192A (en) Color 3D measuring system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20150225