DE112013002824T5 - Coordinate measuring machines with removable accessories

Coordinate measuring machines with removable accessories

Info

Publication number
DE112013002824T5
Authority
DE
Germany
Prior art keywords
pattern
articulated arm
structured light
dimensional
arm cmm
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
DE201311002824
Other languages
German (de)
Inventor
Paul C. Atwell
Burnham Stokes
Clark H. Briggs
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Faro Technologies Inc
Original Assignee
Faro Technologies Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority: US 13/491,176 (granted as US8832954B2)
Application filed by Faro Technologies Inc
Priority: PCT/US2013/041826 (published as WO2013184340A1)
Publication of DE112013002824T5
Application status: Ceased

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 5/00 Measuring arrangements characterised by the use of mechanical means
    • G01B 5/004 Measuring arrangements characterised by the use of mechanical means for measuring coordinates of points
    • G01B 5/008 Measuring arrangements characterised by the use of mechanical means for measuring coordinates of points using coordinate measuring machines
    • G01B 11/00 Measuring arrangements characterised by the use of optical means
    • G01B 11/002 Measuring arrangements characterised by the use of optical means for measuring two or more coordinates
    • G01B 11/005 Measuring arrangements characterised by the use of optical means for measuring two or more coordinates: coordinate measuring machines
    • G01B 11/007 Measuring arrangements characterised by the use of optical means for measuring two or more coordinates: feeler heads therefor
    • G01B 11/24 Measuring arrangements characterised by the use of optical means for measuring contours or curvatures
    • G01B 11/25 Measuring arrangements characterised by the use of optical means for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes, on the object
    • G01B 11/2509 Measuring arrangements characterised by the use of optical means for measuring contours or curvatures by projecting a pattern on the object: colour coding
    • G01B 11/2513 Measuring arrangements characterised by the use of optical means for measuring contours or curvatures by projecting a pattern on the object, with several lines being projected in more than one direction, e.g. grids, patterns

Abstract

A portable articulated arm coordinate measuring machine is provided. The coordinate measuring machine comprises a base and an arm portion. A probe end is coupled to the end of the arm portion distal from the base. A device is configured to emit coded structured light onto an object to determine the three-dimensional coordinates of a point on the object.

Description

  • Background
  • The present disclosure relates to a coordinate measuring machine, and more particularly, to a portable articulated arm coordinate measuring machine having a connector at a probe end of the coordinate measuring machine that allows auxiliary devices which use structured light for non-contact three-dimensional measurement to be removably connected to the coordinate measuring machine.
  • Portable articulated arm CMMs have found widespread use in the manufacturing of parts where there is a need to rapidly and accurately verify the dimensions of the part during various steps of the fabrication (e.g., machining) of the part. Portable articulated arm CMMs represent a vast improvement over known stationary, costly, and relatively difficult to use measurement installations, particularly in the amount of time it takes to perform dimensional measurements of relatively complex parts. Typically, an operator of a portable articulated arm CMM simply guides a probe along the surface of the part or object to be measured. The measurement data are then recorded and provided to the operator. In some cases, the data are provided to the operator in visual form, for example, in three-dimensional (3-D) form on a computer screen. In other cases, the data are provided to the operator in numeric form; for example, when measuring the diameter of a hole, the text "diameter = 1.0034" is displayed on a computer screen.
  • An example of a prior art portable articulated arm CMM is described in commonly assigned U.S. Patent No. 5,402,582 ('582). The '582 patent discloses a 3-D measuring system comprising a manually operated articulated arm CMM with a support base on one end and a measurement probe at the other end. Commonly assigned U.S. Patent No. 5,611,147 ('147) discloses a similar articulated arm CMM. In the '147 patent, the articulated arm CMM includes a number of features including an additional rotational axis at the probe end, thereby providing an arm with either a two-two-two or a two-two-three axis configuration (the latter case being a seven-axis arm).
  • Three-dimensional surfaces may also be measured by non-contact techniques. One type of non-contact device, sometimes referred to as a "laser line probe" or "laser line scanner," emits laser light either as a point or along a line. An imaging device, such as a charge-coupled device (CCD), is positioned adjacent the laser. The laser is arranged to emit a line of light that is reflected off the surface. The surface of the object being measured causes a diffuse reflection that is captured by the imaging device. The image of the reflected line on the sensor changes as the distance between the sensor and the surface changes. By knowing the relationship between the imaging sensor and the laser, and the position of the laser image on the sensor, triangulation methods may be used to measure three-dimensional coordinates of points on the surface. An issue that arises with laser line probes is that the density of measured points may vary with the speed at which the laser line probe is moved across the surface of the object: the faster the laser line probe is moved, the greater the spacing between points and the lower the point density, as the sketch below illustrates. With a structured light scanner, in contrast, the spacing between points is normally uniform in each of the two dimensions, which generally provides a more uniform sampling of points over the workpiece surface.
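The point-density effect described above amounts to simple arithmetic. The following is a minimal sketch, not taken from the patent; the line rate and scan speeds are hypothetical values chosen for illustration:

```python
# Sketch: along-travel spacing of measured points for a laser line scanner.
# The faster the probe moves, the wider the gap between successive lines.
def along_travel_spacing_mm(scan_speed_mm_s: float, line_rate_hz: float = 60.0) -> float:
    """Distance on the surface between successive captured lines."""
    return scan_speed_mm_s / line_rate_hz

for speed in (10.0, 50.0, 200.0):  # hypothetical scan speeds, mm/s
    print(f"{speed:6.1f} mm/s -> {along_travel_spacing_mm(speed):.2f} mm between lines")
```

A structured light scanner avoids this dependence because the entire two-dimensional pattern is captured in one frame, so the point spacing is set by the pattern rather than by the hand speed.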
  • Although existing CMMs are suitable for their intended purposes, there is a need for a portable articulated arm CMM incorporating certain features of embodiments of the present invention.
  • Summary of the invention
  • According to one aspect of the invention, a portable articulated arm coordinate measuring machine (articulated arm CMM) for measuring three-dimensional coordinates of an object in space is provided. The articulated arm CMM includes a base. A manually positionable arm portion having opposed first and second ends is provided, the arm portion being rotationally coupled to the base, the arm portion including a plurality of connected arm segments, each arm segment including at least one position measuring device for producing a position signal. An electronic circuit is provided that receives the position signal from the at least one position measuring device in each arm segment. A probe end is coupled to the first end. A non-contact three-dimensional measuring device is coupled to the probe end, the non-contact three-dimensional measuring device having a projector and an image sensor, the projector having a source plane, the projector being configured to emit a structured light onto the object, the structured light being arranged on the source plane and including at least three non-collinear pattern elements, the image sensor being arranged to receive the structured light reflected from the object. A processor is electrically coupled to the electronic circuit, the processor being configured to determine the three-dimensional coordinates of a point on the object in response to receiving the position signals from the position measuring devices and in response to the image sensor receiving the reflected structured light.
  • According to another aspect of the invention, a method of operating a portable articulated arm coordinate measuring machine for measuring coordinates of an object in space is provided. The method includes providing a manually positionable arm portion having opposed first and second ends, the arm portion including a plurality of connected arm segments, each arm segment including at least one position measuring device for producing a position signal. A probe end for measuring the object is provided, the probe end being coupled to the first end. An electronic circuit receives the position signals from the position measuring devices. A three-dimensional non-contact measuring device having a controller is provided, the three-dimensional non-contact measuring device including a sensor and a projector, the projector having a source plane and being configured to emit a structured light onto the object, the structured light being arranged on the source plane and including at least three non-collinear pattern elements. The structured light is projected onto the object by the three-dimensional measuring device.
  • Brief description of the drawings
  • Referring now to the drawings, exemplary embodiments are shown which should not be construed as limiting with regard to the entire scope of the disclosure, and in which like elements are numbered alike in the several figures:
  • FIG. 1, including FIGS. 1A and 1B: perspective views of a portable articulated arm coordinate measuring machine (articulated arm CMM) incorporating embodiments of various aspects of the present invention;
  • FIG. 2, including FIGS. 2A-2D taken together: a block diagram of electronics utilized as part of the articulated arm CMM of FIG. 1 in accordance with an embodiment;
  • FIG. 3, including FIGS. 3A and 3B taken together: a block diagram describing detailed features of the electronic data processing system of FIG. 2 in accordance with an embodiment;
  • FIG. 4: an isometric view of the probe end of the articulated arm CMM of FIG. 1;
  • FIG. 5: a side view of the probe end of FIG. 4 with the handle being coupled thereto;
  • FIG. 6: a side view of the probe end of FIG. 4 with the handle attached;
  • FIG. 7: an enlarged partial side view of the interface portion of the probe end of FIG. 6;
  • FIG. 8: another enlarged partial side view of the interface portion of the probe end of FIG. 5;
  • FIG. 9: an isometric view, partially in section, of the handle of FIG. 4;
  • FIG. 10: an isometric view of the probe end of the articulated arm CMM of FIG. 1 with a structured light device having a single camera attached;
  • FIG. 11: an isometric view, partially in section, of the device of FIG. 10;
  • FIG. 12: an isometric view of the probe end of the articulated arm CMM of FIG. 1 with another structured light device having dual cameras attached;
  • FIGS. 13A and 13B: schematic views illustrating the operation of the device of FIG. 10 when attached to the probe end of the articulated arm CMM of FIG. 1;
  • FIGS. 14-17: sequential projections of an uncoded binary pattern that may be emitted by the structured light device of FIG. 10 or 12, in accordance with an embodiment of the invention;
  • FIGS. 18-19: spatially varying color-coded patterns that may be emitted by the structured light device of FIG. 10 or 12, in accordance with an embodiment of the invention;
  • FIGS. 20-23: stripe-index-coded patterns that may be emitted by the structured light device of FIG. 10 or 12, in accordance with an embodiment of the invention;
  • FIGS. 24-31: two-dimensional grid patterns that may be emitted by the structured light device of FIG. 10 or 12, in accordance with an embodiment of the invention;
  • FIG. 32: a schematic view of a photometric technique for capturing images of patterned light under a variety of lighting conditions; and
  • FIG. 33: an illustration of a structured light scanner device operable independently of an articulated arm CMM in accordance with another embodiment of the invention.
  • Detailed description
  • Portable articulated arm CMMs are used in a variety of applications to obtain measurements of objects. Embodiments of the present invention provide advantages in allowing an operator to easily and quickly couple accessory devices, which use structured light to provide for the non-contact measurement of a three-dimensional object, to a probe end of the articulated arm CMM. Embodiments of the present invention provide further advantages in allowing the communication of data representing a point cloud measured by the structured light device within the articulated arm CMM. Embodiments of the present invention provide advantages of greater uniformity in the distribution of measured points, which may yield improved accuracy. Embodiments of the present invention provide still further advantages by providing power and data communication to a removable accessory without the need for external connections or wiring.
  • As used herein, "structured light" refers to a two-dimensional light pattern that is projected onto a continuous and enclosed area of an object and conveys information that can be used to determine coordinates of points on the object. A structured light pattern includes at least three non-collinear pattern elements arranged in the continuous and enclosed area. Each of the three non-collinear pattern elements conveys information that can be used to determine the point coordinates.
  • In general, there are two types of structured light, namely coded and uncoded light patterns. A coded light pattern, as used herein, is one in which the three-dimensional coordinates of an illuminated surface of the object can be determined from a single captured image. With a coded light pattern, the projection device may in some cases be moving relative to the object. In other words, for a coded light pattern there is no significant temporal relationship between the projected pattern and the acquired image. Typically, a coded light pattern contains a set of elements (e.g., geometric shapes) arranged so that at least three of the elements are non-collinear. In some cases, the set of elements may be arranged into collections of lines. Having at least three of the elements non-collinear ensures that the pattern is not a simple line pattern, such as would be projected by a laser line scanner. As a result, the pattern elements are recognizable because of the arrangement of the elements.
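The non-collinearity requirement above can be expressed compactly in code. A minimal sketch (the pattern-element coordinates below are hypothetical):

```python
# Sketch: check that three 2-D pattern elements are non-collinear, i.e. that
# they span an area rather than a single line (hypothetical coordinates).
def non_collinear(p1, p2, p3, tol: float = 1e-9) -> bool:
    """True if the three points do not lie on one straight line."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    area2 = (x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)  # twice the signed area
    return abs(area2) > tol

print(non_collinear((0, 0), (1, 0), (2, 0)))  # False: a simple line pattern
print(non_collinear((0, 0), (1, 0), (0, 1)))  # True: a genuine 2-D structured pattern
```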
  • In contrast, an uncoded structured light pattern, as used herein, is a pattern that does not allow measurement through a single pattern when the projector is moving relative to the object. An example of an uncoded light pattern is one that requires a sequence of successive patterns and, consequently, the acquisition of a sequence of successive images. Due to the sequential nature of the projection and imaging, there should be no relative movement between the projector and the object.
  • It should be understood that structured light is distinct from the light projected by a laser line probe or laser line scanner type device, which generates a line of light. To the extent that laser line probes used with articulated arms today have non-uniformities or other aspects that might be regarded as features within the generated lines, those features are arranged collinearly. Consequently, such features in a single generated line are not considered to make the projected light into structured light.
  • FIGS. 1A and 1B illustrate, in perspective, an articulated arm CMM 100 according to various embodiments of the present invention, an articulated arm being one type of coordinate measuring machine. As shown in FIGS. 1A and 1B, the exemplary articulated arm CMM 100 may comprise a six- or seven-axis articulated measurement device having a probe end 401 that includes a probe housing 102 coupled at one end to an arm portion 104 of the articulated arm CMM 100. The arm portion 104 comprises a first arm segment 106 coupled to a second arm segment 108 by a first grouping of bearing inserts 110 (e.g., two bearing inserts). A second grouping of bearing inserts 112 (e.g., two bearing inserts) couples the second arm segment 108 to the probe housing 102. A third grouping of bearing inserts 114 (e.g., three bearing inserts) couples the first arm segment 106 to a base 116 arranged at the other end of the arm portion 104 of the articulated arm CMM 100. Each grouping of bearing inserts 110, 112, 114 provides multiple axes of articulated movement. The probe end 401 may also include a probe housing 102 comprising the shaft of a seventh-axis portion of the articulated arm CMM 100 (e.g., an insert containing an encoder system that determines movement of the measurement device, for example a probe 118, about the seventh axis of the articulated arm CMM 100). In this embodiment, the probe end 401 may rotate about an axis extending through the center of the probe housing 102. In use of the articulated arm CMM 100, the base 116 is typically affixed to a work surface.
  • Each bearing insert within each bearing insert grouping 110, 112, 114 typically contains an encoder system (e.g., an optical angular encoder system). The encoder system (i.e., a position measuring device) provides an indication of the position of the respective arm segments 106, 108 and the corresponding bearing insert groupings 110, 112, 114 that, taken together, provide an indication of the position of the probe 118 with respect to the base 116 (and thus the position of the object being measured by the articulated arm CMM 100 in a particular frame of reference, for example a local or global frame of reference); a simplified sketch of how these readings combine follows this paragraph. The arm segments 106, 108 may be made of a suitably rigid material such as, but not limited to, a carbon-fiber composite material. A portable articulated arm CMM 100 with six or seven axes of articulated movement (i.e., degrees of freedom) provides the advantage of allowing the operator to position the probe 118 in a desired location within a 360° area about the base 116 while providing an arm portion 104 that may be easily handled by the operator. It should be appreciated, however, that the illustration of an arm portion 104 having two arm segments 106, 108 is for exemplary purposes, and the claimed invention should not be so limited. An articulated arm CMM 100 may have any number of arm segments coupled together by bearing inserts (and thus more or fewer than six or seven axes of articulated movement or degrees of freedom).
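How the individual encoder readings combine into a probe position can be sketched as a chain of rigid-body transforms, one per axis of articulation. This is a simplified illustration only; it assumes pure revolute joints about a single axis and hypothetical segment lengths, and the patent does not specify a kinematic model:

```python
import numpy as np

# Sketch: probe-tip position from joint angles by composing one 4x4 homogeneous
# transform per axis (hypothetical geometry, not the arm's actual kinematics).
def rot_z(theta: float) -> np.ndarray:
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]])

def link(length: float) -> np.ndarray:
    t = np.eye(4)
    t[0, 3] = length  # translate along the segment
    return t

def probe_position(joint_angles, segment_lengths) -> np.ndarray:
    """Compose joint rotations and segment translations from base to probe tip."""
    pose = np.eye(4)
    for theta, length in zip(joint_angles, segment_lengths):
        pose = pose @ rot_z(theta) @ link(length)
    return pose[:3, 3]  # x, y, z of the probe tip in the base frame

print(probe_position([0.1, -0.4, 0.7], [0.6, 0.6, 0.2]))  # metres, hypothetical
```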
  • The probe 118 is detachably mounted to the probe housing 102, which is connected to the bearing insert grouping 112. A handle 126 is removable with respect to the probe housing 102, for example by means of a quick-connect interface. As will be discussed in more detail below, the handle 126 may be replaced with another device configured to emit structured light to provide non-contact measurement of three-dimensional objects, thereby providing advantages in allowing the operator to make both contact and non-contact measurements with the same articulated arm CMM 100. In exemplary embodiments, the probe housing 102 houses a removable probe 118, which is a contacting measurement device and may have different tips 118 that physically contact the object to be measured, including, but not limited to, ball, touch-sensitive, curved, and extension type probes. In other embodiments, the measurement is performed, for example, by a non-contacting device such as a coded structured light scanner device. In one embodiment, the handle 126 is replaced with the coded structured light scanner device using the quick-connect interface. Other types of measurement devices may replace the removable handle 126 to provide additional functionality. Examples of such measurement devices include, but are not limited to, one or more illumination lights, a temperature sensor, a thermal scanner, a bar code scanner, a projector, a paint sprayer, a camera, or the like.
  • As seen in FIGS. 1A and 1B, the articulated arm CMM 100 provides the removable handle 126, which offers the advantage that accessories or functionality may be changed without removing the probe housing 102 from the bearing insert grouping 112. As discussed in more detail with reference to FIG. 2D, the removable handle 126 may also include an electrical connector that allows electrical power and data to be exchanged with the handle 126 and the corresponding electronics arranged in the probe end 401.
  • In various embodiments, each grouping of bearing inserts 110, 112, 114 allows the arm portion 104 of the articulated arm CMM 100 to move about multiple axes of rotation. As mentioned, each bearing insert grouping 110, 112, 114 includes corresponding encoder systems, such as optical angular encoders, each arranged coaxially with the corresponding axis of rotation of, e.g., the arm segments 106, 108. The optical encoder system detects rotational (swivel) or transverse (hinge) movement of, e.g., each one of the arm segments 106, 108 about the corresponding axis and transmits a signal to an electronic data processing system within the articulated arm CMM 100, as described in more detail below. Each individual raw encoder count is sent separately to the electronic data processing system as a signal, where it is further processed into measurement data. No position calculator separate from the articulated arm CMM 100 itself (e.g., a serial box), such as disclosed in commonly assigned U.S. Patent No. 5,402,582 ('582), is required.
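The first processing step for each raw encoder count is a conversion to an angle, prior to a forward-kinematics step such as the one sketched earlier. A minimal sketch (the counts-per-revolution value is a hypothetical resolution, not taken from the patent):

```python
import math

# Sketch: raw optical angular encoder count -> joint angle in radians.
COUNTS_PER_REV = 2 ** 17  # hypothetical encoder resolution

def counts_to_radians(raw_count: int) -> float:
    return 2.0 * math.pi * (raw_count % COUNTS_PER_REV) / COUNTS_PER_REV

print(counts_to_radians(32768))  # a quarter turn for a 2^17-count encoder: ~1.5708
```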
  • The base 116 may include an attachment or mounting device 120. The mounting device 120 allows the articulated arm CMM 100 to be removably mounted to a desired location, such as an inspection table, a machining center, a wall, or the floor. In one embodiment, the base 116 includes a handle portion 122 that provides a convenient location for the operator to hold the base 116 as the articulated arm CMM 100 is being moved. In one embodiment, the base 116 further includes a movable cover portion 124 that folds over to reveal a user interface, such as a display screen.
  • In accordance with an embodiment, the base 116 of the portable articulated arm CMM 100 contains or houses an electronic circuit having an electronic data processing system that includes two primary components: a base processing system that processes the data from the various encoder systems within the articulated arm CMM 100 as well as data representing other arm parameters to support three-dimensional (3-D) positional calculations; and a user interface processing system that includes an on-board operating system, a touch-screen display, and resident application software that allows for relatively complete metrology functions to be implemented within the articulated arm CMM 100 without the need for connection to an external computer.
  • The electronic data processing system in the base 116 may communicate with the encoder systems, sensors, and other peripheral hardware located away from the base 116 (e.g., a structured light device that can be mounted to the removable handle 126 on the articulated arm CMM 100). The electronics that support these peripheral hardware devices or features may be located in each of the bearing insert groupings 110, 112, 114 located within the portable articulated arm CMM 100.
  • FIG. 2 is a block diagram of electronics utilized in an articulated arm CMM 100 in accordance with an embodiment. The embodiment shown in FIG. 2A includes an electronic data processing system 210 including a base processor card 204 for implementing the base processing system, a user interface card 202, a base power card 206 for providing power, a Bluetooth module 232, and a base tilt card 208. The user interface card 202 includes a computer processor for executing application software to perform user interface, display, and other functions described herein.
  • As shown in FIGS. 2A and 2B, the electronic data processing system 210 communicates with the aforementioned plurality of encoder systems via one or more arm buses 218. In the embodiment depicted in FIGS. 2B and 2C, each encoder system generates encoder data and includes an encoder arm bus interface 214, an encoder digital signal processor (DSP) 216, an encoder read head interface 234, and a temperature sensor 212. Other devices, such as strain sensors, may be attached to the arm bus 218.
  • Also shown in FIG. 2D are probe end electronics 230 that are in communication with the arm bus 218. The probe end electronics 230 include a probe end DSP 228, a temperature sensor 212, a handle/device interface bus 240 that, in one embodiment, connects with the handle 126 or the coded structured light scanner device 242 via the quick-connect interface, and a probe interface 226. The quick-connect interface allows the handle 126 access to the data bus, control lines, and power bus used by the coded structured light scanner device 242 and other accessories. In one embodiment, the probe end electronics 230 are located in the probe housing 102 on the articulated arm CMM 100. In one embodiment, the handle 126 may be removed from the quick-connect interface and measurement may be performed by the structured light device 242 communicating with the probe end electronics 230 of the articulated arm CMM 100 via the interface bus 240. In one embodiment, the electronic data processing system 210 is located in the base 116 of the articulated arm CMM 100, the probe end electronics 230 are located in the probe housing 102 of the articulated arm CMM 100, and the encoder systems are located in the bearing insert groupings 110, 112, 114. The probe interface 226 may connect with the probe end DSP 228 by any suitable communications protocol, including commercially available products from Maxim Integrated Products, Inc. that embody the 1-Wire® communications protocol 236.
  • FIG. 3 is a block diagram describing detailed features of the electronic data processing system 210 of the articulated arm CMM 100 in accordance with an embodiment. In one embodiment, the electronic data processing system 210 is located in the base 116 of the articulated arm CMM 100 and includes the base processor card 204, the user interface card 202, a base power card 206, a Bluetooth module 232, and a base tilt module 208.
  • In an embodiment shown in FIG. 3A, the base processor card 204 includes the various functional blocks illustrated therein. For example, a base processor function 302 is utilized to support the collection of measurement data from the articulated arm CMM 100 and receives raw arm data (e.g., encoder system data) via the arm bus 218 and a bus control module function 308. The memory function 304 stores programs and static arm configuration data. The base processor card 204 also includes an external hardware option port function 310 for communicating with any external hardware devices or accessories, such as a coded structured light scanner device 242. A real-time clock (RTC) and log 306, a battery pack interface (IF) 316, and a diagnostic port 318 are also included in the functionality in an embodiment of the base processor card 204 depicted in FIG. 3A.
  • The base processor card 204 also manages all wired and wireless data communication with external (host computer) and internal (display processor 202) devices. The base processor card 204 has the capability of communicating with an Ethernet network via an Ethernet function 320 (e.g., using a clock synchronization standard such as IEEE (Institute of Electrical and Electronics Engineers) 1588), with a wireless local area network (WLAN) via a LAN function 322, and with the Bluetooth module 232 via a parallel-to-serial communications (PSC) function 314. The base processor card 204 also includes a connection to a universal serial bus (USB) device 312.
  • The base processor card 204 transmits and collects raw measurement data (e.g., encoder system counts, temperature readings) for processing into measurement data without the need for any preprocessing, such as is disclosed in the serial box of the aforementioned '582 patent. The base processor 204 sends the processed data to the display processor 328 on the user interface card 202 via an RS485 interface (IF) 326. In one embodiment, the base processor 204 also sends the raw measurement data to an external computer.
  • Turning now to the user interface card 202 in FIG. 3B, the angle and position data received by the base processor is utilized by applications executing on the display processor 328 to provide an autonomous metrology system within the articulated arm CMM 100. Applications may run on the display processor 328 to support functions such as, but not limited to: measurement of features, guidance and training graphics, remote diagnostics, temperature corrections, control of various operational features, connection to various networks, and display of measured objects. Along with the display processor 328 and a liquid crystal display (LCD) interface 338 (e.g., a touch-screen LCD), the user interface card 202 includes several interface options, including a secure digital (SD) card interface 330, a memory 332, a USB host interface 334, a diagnostic port 336, a camera port 340, an audio/video interface 342, a dial-up/wireless modem 344, and a global positioning system (GPS) port 346.
  • The electronic data processing system 210 shown in FIG. 3A also includes a base power card 206 with an environmental recorder 362 for recording environmental data. The base power card 206 also provides power to the electronic data processing system 210 using an AC/DC converter 358 and a battery charger control 360. The base power card 206 communicates with the base processor card 204 using a serial single-ended bus 354 having an inter-integrated circuit (I2C) interface as well as a serial peripheral interface with DMA (DSPI) 357. The base power card 206 is connected to a tilt sensor and radio identification (radio ID) module 208 via an input/output (I/O) expansion function 364 implemented in the base power card 206.
  • Though shown as separate components, in other embodiments all or a subset of the components may be physically located in different locations and/or the functions may be combined in different manners than shown in FIGS. 3A and 3B. For example, in one embodiment the base processor card 204 and the user interface card 202 are combined into one physical card.
  • Referring now to FIGS. 4-9, an exemplary embodiment of a probe end 401 is illustrated having a probe housing 102 with a quick-connect mechanical and electrical interface that allows a removable and interchangeable device 400 to couple with the articulated arm CMM 100. In the exemplary embodiment, the device 400 includes an enclosure 402 that contains a handle portion 404 sized and shaped to be held in an operator's hand, such as in a pistol grip, for example. The enclosure 402 is a thin-walled structure having a cavity 406 (FIG. 9). The cavity 406 is sized and configured to receive a controller 408. The controller 408 may be a digital circuit having a microprocessor, for example, or an analog circuit. In one embodiment, the controller 408 is in asynchronous bidirectional communication with the electronic data processing system 210 (FIGS. 2 and 3). The communication connection between the controller 408 and the electronic data processing system 210 may be wired (e.g., via controller 420), may be a direct or indirect wireless connection (e.g., Bluetooth or IEEE 802.11), or may be a combination of wired and wireless connections. In the exemplary embodiment, the enclosure 402 is formed in two halves 410, 412, such as from an injection-molded plastic material. The halves 410, 412 may be secured together by fasteners, such as screws 414, for example. In other embodiments, the enclosure halves 410, 412 may be secured together by adhesives or ultrasonic welding, for example.
  • The handle portion 404 also includes buttons or actuators 416, 418 that may be manually activated by the operator. The actuators 416, 418 are coupled to the controller 408, which transmits a signal to a controller 420 within the probe housing 102. In the exemplary embodiments, the actuators 416, 418 perform the functions of actuators 422, 424 located on the probe housing 102 opposite the device 400. It should be appreciated that the device 400 may have additional switches, buttons, or other actuators that may also be used to control the device 400, the articulated arm CMM 100, or vice versa. Also, the device 400 may include indicators such as light emitting diodes (LEDs), sound generators, meters, displays, or gauges. In one embodiment, the device 400 may include a digital voice recorder that allows for synchronization of verbal comments with a measured point. In yet another embodiment, the device 400 includes a microphone that allows the operator to transmit voice-activated commands to the electronic data processing system 210.
  • In one embodiment, the handle portion 404 may be configured for use with either operator hand or for a particular hand (e.g., the left hand or the right hand). The handle portion 404 may also be configured to facilitate use by operators with disabilities (e.g., operators with missing fingers or operators with prosthetic arms). Further, the handle portion 404 may be removed and the probe housing 102 used by itself when clearance space is limited. As discussed above, the probe end 401 may also comprise the shaft of the seventh axis of the articulated arm CMM 100. In this embodiment, the device 400 may be arranged to rotate about the seventh axis of the articulated arm CMM.
  • The probe end 401 includes a mechanical and electrical interface 426 having a first connector 429 (FIG. 8) on the device 400 that cooperates with a second connector 428 on the probe housing 102. The connectors 428, 429 may include electrical and mechanical features that allow coupling of the device 400 to the probe housing 102. In one embodiment, the interface 426 includes a first surface 430 having a mechanical coupler 432 and an electrical connector 434 thereon. The enclosure 402 also includes a second surface 436 positioned adjacent to and offset from the first surface 430. In the exemplary embodiment, the second surface 436 is a planar surface offset approximately 12 mm from the first surface 430. This offset provides clearance for the operator's fingers when a fastener such as a collar 438 is tightened or loosened. The interface 426 provides a relatively quick and secure electronic connection between the device 400 and the probe housing 102 without the need to align connector pins and without the need for separate cables or connectors.
  • The electrical connector 434 extends from the first surface 430 and includes one or more connector pins 440 that are electrically coupled in asynchronous bidirectional communication with the electronic data processing system 210 (FIGS. 2 and 3), such as via one or more arm buses 218. The bidirectional communication connection may be wired (e.g., via arm bus 218), wireless (e.g., Bluetooth or IEEE 802.11), or a combination of wired and wireless connections. In one embodiment, the electrical connector 434 is electrically coupled to the controller 420. The controller 420 may be in asynchronous bidirectional communication with the electronic data processing system 210, such as via one or more arm buses 218. The electrical connector 434 is positioned to provide a relatively quick and secure electronic connection with a mating electrical connector 442 on the probe housing 102. The electrical connectors 434, 442 connect with each other when the device 400 is attached to the probe housing 102. The electrical connectors 434, 442 may each comprise a metal-encased connector housing that provides shielding from electromagnetic interference as well as protecting the connector pins and assisting with pin alignment during the process of attaching the device 400 to the probe housing 102.
  • The mechanical coupler 432 provides relatively rigid mechanical coupling between the device 400 and the probe housing 102 to support relatively precise applications in which the location of the device 400 on the end of the arm portion 104 of the articulated arm CMM 100 preferably does not shift or move. Any such movement may typically cause an undesirable degradation in the accuracy of the measurement result. These desired results are achieved using various structural features of the mechanical attachment configuration portion of the quick-connect mechanical and electronic interface of an embodiment of the present invention.
  • In one embodiment, the mechanical coupler 432 includes a first projection 444 positioned on one end 448 (the leading edge or "front" of the device 400). The first projection 444 may include a keyed, notched, or ramped interface that forms a lip 446 extending from the first projection 444. The lip 446 is sized to be received in a slot 450 defined by a projection 452 extending from the probe housing 102 (FIG. 8). It should be appreciated that the first projection 444 and the slot 450, along with the collar 438, form a coupler arrangement such that when the lip 446 is positioned within the slot 450, the slot 450 may be used to restrict both longitudinal and lateral movement of the device 400 when attached to the probe housing 102. As will be discussed in more detail below, rotation of the collar 438 may be used to secure the lip 446 within the slot 450.
  • Opposite the first projection 444, the mechanical coupler 432 may include a second projection 454. The second projection 454 may have a keyed, notched-lip, or ramped interface surface 456 (FIG. 5). The second projection 454 is positioned to engage a fastener associated with the probe housing 102, such as the collar 438. As will be discussed in more detail below, the mechanical coupler 432 includes a raised surface projecting from the surface 430 that is adjacent to or disposed about the electrical connector 434, and which provides a pivot point for the interface 426 (FIGS. 7 and 8). This serves as the third of three points of mechanical contact between the device 400 and the probe housing 102 when the device 400 is attached thereto.
  • The probe housing 102 includes a collar 438 arranged coaxially on one end. The collar 438 includes a threaded portion that is movable between a first position (FIG. 5) and a second position (FIG. 7). By rotating the collar 438, the device 400 may be secured or removed without the need for external tools. Rotation of the collar 438 moves it along a cylinder 474 having a relatively coarse, square thread. The use of such a relatively large, square thread and contoured surfaces allows for significant clamping force with minimal torque. The coarse pitch of the threads of the cylinder 474 further allows the collar 438 to be tightened or loosened with minimal rotation.
  • To couple the device 400 to the probe housing 102, the lip 446 is inserted into the slot 450 and the device is pivoted to rotate the second projection 454 toward the surface 458, as indicated by arrow 464 (FIG. 5). The collar 438 is rotated, causing it to move or translate in the direction indicated by arrow 462 into engagement with the surface 456. The movement of the collar 438 against the angled surface 456 drives the mechanical coupler 432 against the raised surface 460. This helps overcome potential issues with distortion of the interface or foreign objects on the surface of the interface that could interfere with the rigid seating of the device 400 on the probe housing 102. The application of force by the collar 438 on the second projection 454 causes the mechanical coupler 432 to move forward, pressing the lip 446 into a seat on the probe housing 102. As the collar 438 continues to be tightened, the second projection 454 is pressed upward toward the probe housing 102, applying pressure on a pivot point. This provides a see-saw type arrangement that applies pressure to the second projection 454, the lip 446, and the center pivot point to reduce or eliminate shifting or rocking of the device 400. The pivot point presses directly against the bottom of the probe housing 102, while the lip 446 applies a downward force on the end of the probe housing 102. FIG. 5 includes arrows 462, 464 to show the direction of movement of the device 400 and the collar 438. FIG. 7 includes arrows 466, 468, 470 to show the direction of applied pressure within the interface 426 when the collar 438 is tightened. It should be appreciated that the offset distance of the surface 436 of the device 400 provides a gap 472 between the collar 438 and the surface 436 (FIG. 6). The gap 472 allows the operator to obtain a firmer grip on the collar 438 while reducing the risk of the operator's fingers being pinched as the collar 438 is rotated. In one embodiment, the probe housing 102 is of sufficient stiffness to reduce or prevent distortion when the collar 438 is tightened.
  • Embodiments of the interface 426 allow for the proper alignment of the mechanical coupler 432 and the electrical connector 434, and also protect the electronics interface from applied stresses that may otherwise arise due to the clamping action of the collar 438, the lip 446, and the surface 456. This provides advantages in reducing or eliminating stress damage to a circuit board 476 to which the electrical connectors 434, 442 are mounted, which may have soldered terminals. Further, the embodiments provide advantages over known approaches in that no tools are required for a user to connect or disconnect the device 400 from the probe housing 102. This allows the operator to manually connect and disconnect the device 400 from the probe housing 102 with relative ease.
  • Due to the relatively large number of shielded electrical connections possible with the interface 426, a relatively large number of functions may be shared between the articulated arm CMM 100 and the device 400. For example, switches, buttons, or other actuators located on the articulated arm CMM 100 may be used to control the device 400, or vice versa. Further, commands and data may be transmitted from the electronic data processing system 210 to the device 400. In one embodiment, the device 400 is a video camera that transmits data of a recorded image to be stored in memory on the base processor 204 or displayed on the display 328. In another embodiment, the device 400 is an image projector that receives data from the electronic data processing system 210. In addition, temperature sensors located in either the articulated arm CMM 100 or the device 400 may be shared by the other. It should be appreciated that embodiments of the present invention provide advantages in offering a flexible interface that allows a wide variety of accessory devices 400 to be quickly, easily, and reliably coupled to the articulated arm CMM 100. Further, the capability of sharing functions between the articulated arm CMM 100 and the device 400 may allow a reduction in size, power consumption, and complexity of the articulated arm CMM 100 by avoiding duplication.
  • In one embodiment, the controller 408 may alter the operation or functionality of the probe end 401 of the articulated arm CMM 100. For example, the controller 408 may alter indicator lights on the probe housing 102 to emit either a different color of light or a different intensity of light, or to turn on/off at different times, when the device 400 is attached versus when the probe housing 102 is used by itself. In one embodiment, the device 400 includes a range-finding sensor (not shown) that measures the distance to an object. In this embodiment, the controller 408 may change the indicator lights on the probe housing 102 in order to provide an indication to the operator of how far the object is from the probe tip 118. In another embodiment, the controller 408 may change the color of the indicator lights based on the quality of the image acquired by the coded structured light scanner device. This provides advantages in simplifying the requirements of the controller 420 and allows for upgraded or increased functionality through the addition of accessory devices.
  • Referring to FIGS. 10-13, embodiments of the present invention provide advantages to projector, camera, signal processing, control, and indicator interfaces for a non-contact three-dimensional measurement device 500. The device 500 includes a pair of optical devices, such as a light projector 508 and a camera 510, that project a structured light pattern and receive a two-dimensional pattern reflected from an object 501. The device 500 uses triangulation-based methods based on the known emitted pattern and the acquired image to determine a point cloud representing the x, y, z coordinate data of the object 501 for each pixel of the received image. In one embodiment, the structured light pattern is coded, so that a single image is sufficient to determine the three-dimensional coordinates of object points. Such a coded structured light pattern may also be described as providing measurement of three-dimensional coordinates in a single image.
  • In the exemplary embodiment, the projector 508 uses a visible light source that illuminates a pattern generator. The visible light source may be a laser, a superluminescent diode, an incandescent lamp, a light emitting diode (LED), or another luminous device. In the exemplary embodiment, the pattern generator is a chrome mask having a structured light pattern etched on it. The mask may have a single pattern or multiple patterns that move in and out of position as needed. The mask may be manually or automatically installed in the operating position. In other embodiments, the source pattern may be light reflected from or transmitted by a digital micromirror device (DMD), such as a digital light projector (DLP) manufactured by Texas Instruments Incorporated, a liquid crystal display (LCD), a liquid crystal on silicon (LCOS) device, or a similar device operated in transmission mode rather than reflection mode. The projector 508 may also include a lens system 515 that alters the outgoing light to have the desired focal characteristics.
  • The device 500 further includes an enclosure 502 with a handle portion 504. In one embodiment, the device 500 further includes an interface 426 on one end that mechanically and electrically couples the device 500 to the probe housing 102, as described above. In other embodiments, the device 500 may be integrated into the probe housing 102. The interface 426 provides advantages in allowing the device 500 to be coupled to and removed from the articulated arm CMM 100 quickly and easily without requiring additional tools.
  • The camera 510 includes a photosensitive sensor that generates a digital image or digital representation of the area within the sensor's field of view. The sensor may be a charge-coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) sensor having, for example, an array of pixels. The camera 510 may further include other components, such as, but not limited to, a lens 503 and other optical devices. In the exemplary embodiment, the projector 508 and the camera 510 are arranged at an angle such that the sensor may receive light reflected from the surface of the object 501. In one embodiment, the projector 508 and the camera 510 are positioned such that the device 500 may be operated with the probe tip 118 in place. Further, it should be appreciated that the device 500 is substantially fixed relative to the probe tip 118, so that forces acting on the handle portion 504 may not influence the alignment of the device 500 relative to the probe tip 118. In one embodiment, the device 500 may have an additional actuator (not shown) that allows the operator to switch between acquiring data with the device 500 and acquiring data with the probe tip 118.
  • The projector 508 and the camera 510 are electrically coupled to a controller 512 arranged within the enclosure 502. The controller 512 may include one or more microprocessors, digital signal processors, memory, and signal conditioning circuits. Due to the digital signal processing and the large data volume generated by the device 500, the controller 512 may be arranged within the handle portion 504. The controller 512 is electrically coupled to the arm buses 218 via the electrical connector 434. The device 500 may further include actuators 514, 516 that may be manually activated by the operator to initiate operation and data capture by the device 500. In one embodiment, the image processing to determine the x, y, z coordinate data of the point cloud representing the object 501 is performed by the controller 512, and the coordinate data is transmitted to the electronic data processing system 210 via the bus 240. In another embodiment, the images are transmitted to the electronic data processing system 210 and the calculation of the coordinates is performed by the electronic data processing system 210.
  • In one embodiment, the controller 512 is configured to communicate with the electronic data processing system 210 to receive structured light pattern images from the electronic data processing system 210. In still another embodiment, the pattern emitted onto the object may be changed by the electronic data processing system 210, either automatically or in response to an input from the operator. This may provide advantages in obtaining higher accuracy measurements with less processing time by allowing the use of patterns that are simpler to decode when conditions warrant, and the use of more complex patterns where a higher degree of accuracy or resolution is desired.
  • In other embodiments of the present invention, the device 520 (FIG. 12) includes a pair of cameras 510. The cameras 510 are arranged at an angle relative to the projector 508 to receive light reflected from the object 501. The use of multiple cameras 510 may provide advantages in some applications by providing redundant images to increase the accuracy of the measurement. In still other embodiments, the redundant images may allow sequential patterns to be quickly acquired by the device 500, the acquisition speed being increased through alternating operation of the cameras 510.
  • Referring now to FIGS. 13A and 13B, the operation of the structured light device 500 will be described. The device 500 first emits, with the projector 508, a structured light pattern 522 onto the surface 524 of the object 501. The structured light pattern 522 may include the patterns disclosed in the journal article "DLP-Based Structured Light 3D Imaging Technologies and Applications" by Jason Geng, published in the SPIE Proceedings, Vol. 7932. The structured light pattern 522 may further include, but is not limited to, one of the patterns shown in FIGS. 14-32. The light 509 from the projector 508 is reflected from the surface 524, and the reflected light 511 is received by the camera 510. It should be appreciated that variations in the surface 524, such as a protrusion 526, create distortions in the structured pattern when the image of the pattern is captured by the camera 510. Since the pattern is formed by structured light, the controller 512 or the electronic data processing system 210 is in some instances able to determine a one-to-one correspondence between the pixels in the emitted pattern, e.g., a pixel 513, and the pixels in the imaged pattern, e.g., a pixel 515. This enables triangulation principles to be used to determine the coordinates of each pixel in the imaged pattern. The collection of three-dimensional coordinates of the surface 524 is sometimes referred to as a "point cloud." By moving the device 500 over the surface 524, a point cloud of the entire object 501 may be created. It should be appreciated that in some embodiments the coupling of the device 500 to the probe end provides advantages in that the position and orientation of the device 500 are known to the electronic data processing system, so that the location of the object 501 relative to the articulated arm CMM 100 may also be determined, as the sketch below illustrates.
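Because the arm reports the pose of the device 500, each camera-frame point can be mapped into the CMM reference frame before being merged into the point cloud. A minimal sketch (the pose values below are hypothetical):

```python
import numpy as np

# Sketch: express scanner points in the articulated arm's base frame.
# T_base_device is the 4x4 pose of the scanner derived from the arm encoders
# (hypothetical values below).
def to_base_frame(points_device: np.ndarray, T_base_device: np.ndarray) -> np.ndarray:
    """points_device: (N, 3) points in the scanner frame -> (N, 3) base-frame points."""
    n = points_device.shape[0]
    homogeneous = np.hstack([points_device, np.ones((n, 1))])
    return (T_base_device @ homogeneous.T).T[:, :3]

T = np.eye(4)
T[:3, 3] = [0.5, 0.2, 0.3]  # hypothetical scanner position, metres
print(to_base_frame(np.array([[0.0, 0.0, 1.0]]), T))  # [[0.5 0.2 1.3]]
```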
  • To determine the coordinates of a pixel, the angle of each projected ray of light 509 intersecting the object at a point 527 is known to correspond to a projection angle phi (Φ), so that Φ information is encoded into the emitted pattern. In one embodiment, the system is configured to enable the Φ value corresponding to each pixel in the imaged pattern to be determined. Further, an angle omega (Ω) is known for each pixel in the camera, as is the baseline distance "D" between the projector 508 and the camera. Therefore, the distance "Z" from the camera 510 to the location imaged by the pixel can be determined using the equation: Z/D = sin(Φ)/sin(Ω + Φ) (1). Thus, three-dimensional coordinates may be calculated for each pixel in the acquired image.
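Equation (1) can be applied directly per pixel. A minimal sketch (the angles and baseline are hypothetical example values, not from the patent):

```python
import math

# Sketch: camera-to-point distance via equation (1): Z/D = sin(phi) / sin(omega + phi).
def camera_to_point_distance(phi_rad: float, omega_rad: float, baseline_d: float) -> float:
    """phi: projection angle encoded in the pattern; omega: camera pixel angle;
    baseline_d: projector-to-camera baseline distance."""
    return baseline_d * math.sin(phi_rad) / math.sin(omega_rad + phi_rad)

# Hypothetical example: 60 deg projection angle, 50 deg pixel angle, 150 mm baseline.
z = camera_to_point_distance(math.radians(60), math.radians(50), 150.0)
print(f"Z = {z:.1f} mm")  # ~138.2 mm
```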
  • In general, there are two categories of structured light, namely coded and uncoded structured light. A common form of uncoded structured light, such as that shown in FIGS. 14-17 and 28-30, relies on a striped pattern that varies in a periodic manner along one dimension. These types of patterns are usually applied in a sequence to provide an approximate distance to the object. Some uncoded pattern embodiments, such as the sinusoidal patterns, may provide relatively highly accurate measurements. However, for these types of patterns to be effective, the scanner device and the object usually have to be held stationary relative to each other. Where the scanner device or the object is in motion (relative to the other), a coded pattern, such as those shown in FIGS. 18-27, may be preferred. A coded pattern allows the image to be analyzed using a single acquired image. Some coded patterns may be placed in a particular orientation on the projector pattern (for example, perpendicular to epipolar lines on the projector plane), thereby simplifying the analysis of the three-dimensional surface coordinates based on a single image.
  • Epipolar lines are mathematical lines formed by the intersection of epipolar planes with the source plane 517 or the image plane 521 (the plane of the camera sensor) in FIG. 13B. An epipolar plane may be any plane passing through the perspective center 519 of the projector and the perspective center of the camera. The epipolar lines in the source plane 517 and in the image plane 521 may be parallel in some cases, but in general they are not. One aspect of epipolar lines is that a particular epipolar line in the projector plane 517 has a corresponding epipolar line in the image plane 521. Thus, any particular pattern known to lie on an epipolar line in the projector plane 517 can be observed and evaluated directly in the image plane 521. For example, if a coded pattern is arranged along an epipolar line in the projector plane 517, the spacing between the coded elements in the image plane 521 can be read out from the pixels of the camera sensor 510. This information can be used to determine the three-dimensional coordinates of a point 527 on the object 501. It is further possible to tilt coded patterns at a known angle with respect to an epipolar line and to extract the coordinates of the object surface efficiently. Examples of coded patterns are shown in FIGS. 20-29.
  • In embodiments having a periodic pattern, such as a sinusoidally repeating pattern, each sine period represents a plurality of pattern elements. Since there is a multiplicity of periodic patterns in two dimensions, the pattern elements are not collinear. In some cases, a striped pattern having stripes of varying width may represent a coded pattern.
  • Now referring to FIGS. 14-17, embodiments of uncoded structured light patterns are shown there. Some of the patterns use a simple on/off (or 1, 0) modulation and are referred to as "binary patterns." The binary patterns are in some cases projected in a particular sequence referred to as a "Gray code sequence." The term "Gray code" as used in the field of structured-light three-dimensional metrology differs somewhat from the term as used in the field of electrical engineering, where "Gray code" usually means the sequential changing of a single bit at a time. The present application follows the usage common in the field of three-dimensional metrology, where the Gray code normally represents a sequence of binary black and white values. FIG. 14A shows an example of a binary pattern comprising a plurality of sequential images 530, 532, 534, each having a different striped pattern. As a rule, the stripes alternate between bright (illuminated) and dark (unilluminated) striped regions. Sometimes the terms "white" and "black" are used to mean illuminated and unilluminated, respectively. When the images 530, 532, 534 are projected one after another onto the surface 524 as in FIG. 14B, a composite image 536 results. It should be noted that the bottom two patterns 535, 537 of FIG. 14B are not shown in FIG. 14A for clarity. For each point on the object 501 (represented by a camera pixel in the image), the composite pattern 536 yields a unique binary value, created by the successive projection of the patterns 530, 532, 534, 535, 537, which corresponds to a relatively small range of possible projection angles Φ. By using these projection angles together with the known pixel angle Ω for a given pixel and the known baseline distance D in equation (1), the distance Z from the camera to the object point can be determined. Each camera pixel corresponds to a two-dimensional angle, of which only the one-dimensional angle omega is used in calculating the distance Z according to equation (1). However, a line drawn from each camera pixel through the camera's perspective center and intersecting the object at a point defines a two-dimensional angle in space. When combined with the calculated value Z, the two pixel angles provide three-dimensional coordinates corresponding to a point on the object surface.
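  • For illustration only: the per-pixel binary code described above can be assembled as in the following sketch. The function name and fixed threshold are assumptions; a practical system would normally threshold each pixel against captured all-white and all-black reference images rather than a single constant.

      import numpy as np

      def decode_binary_sequence(images, threshold):
          # images: stack of N captured pattern images, shape (N, H, W),
          # with the most significant pattern projected first.
          bits = (np.asarray(images) > threshold).astype(np.uint32)
          codes = np.zeros(bits.shape[1:], dtype=np.uint32)
          for plane in bits:
              codes = (codes << 1) | plane  # append one bit per pattern
          # Each code (0 .. 2**N - 1) identifies a narrow band of projection
          # angles phi, which equation (1) then converts to a distance Z.
          return codes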
  • Similarly, instead of a binary pattern, a successive series of gray-scale patterns having stripes of varying gray levels may be used. When used in this context, the term "gray scale" usually refers to the proportion of illumination at a point on the object, from white (maximum light) through different shades of gray (less light) to black (minimum light). The same nomenclature applies even if the projected light has a color, such as red, and the gray levels correspond to levels of red illumination. In one embodiment (FIG. 15), the pattern has a plurality of images 538, 540, 542 with stripes having different light energy levels, such as black, gray, and white, which serve to produce an emitted pattern on the object 501. The gray values can be used to narrow the possible projection angles Φ down to a relatively small range of possible values. As discussed above, equation (1) may then be used to determine the distance Z.
  • In another embodiment, the distance Z to an object point may be determined by measuring a phase shift observed in a plurality of images. For example, in the embodiment illustrated in FIG. 16, the gray-scale intensities 546, 548, 550 of a projector pattern 552 vary in a sinusoidal manner, but the phase is shifted between the projected patterns. In the first projector pattern, the sinusoidal gray-scale intensity 546 (which represents the optical energy per unit area) may have a phase of 0 degrees at a certain point. In the second projector pattern, the sinusoidal intensity 548 has a phase of 120 degrees at the same point. In the third projector pattern, the sinusoidal intensity 550 may have a phase of 240 degrees at the same point. This is the same as saying that the sinusoidal pattern is shifted one-third of a period to the left (or right) at each step. A phase shift method is employed to determine the phase of the projected light at each camera pixel, thereby eliminating the need to consider information from adjacent pixels as in the coded-pattern single-shot case. Numerous methods can be used to determine the phase at a camera pixel. One method involves performing a multiply-and-accumulate procedure and then taking the arctangent of a quotient. This method is well known to those of ordinary skill in the art and will not be discussed further here. Furthermore, with the phase shift method the background light drops out of the phase calculation. For these reasons, the Z value calculated for a particular pixel is usually more accurate than the Z value calculated by the coded-pattern single-shot method. However, a single collection of sinusoidal patterns such as those shown in FIG. 16 yields calculated phases only from 0 to 360 degrees. For a given structured light triangulation system, these calculated phases may be adequate if the "thickness" of the test object does not vary too much, because the angle is known ahead of time for each projected stripe. However, if the object is too thick, an ambiguity may arise in the phase calculated for a particular pixel, because the light received at that pixel may have come from a first projected ray striking a first position on the object or from a second projected ray striking a second position on the object. In other words, if there is a possibility that the phase at any pixel in the camera array may vary by more than 2π radians, then the phases cannot be properly decoded and the desired one-to-one correspondence is not achieved.
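  • One common form of the multiply-and-accumulate calculation mentioned above is the three-step arctangent formula. The sketch below assumes exactly the 0, 120, and 240 degree shifts described for FIG. 16 and is offered only as an illustration, not as the method actually used in the embodiments.

      import numpy as np

      def three_step_phase(i0, i1, i2):
          # i0, i1, i2: images captured under the three patterns, so that
          # I_k = A + B*cos(phi - k*2*pi/3) at each pixel. The background
          # term A cancels out of the quotient; the result is the wrapped
          # phase in (-pi, pi].
          return np.arctan2(np.sqrt(3.0) * (i1 - i2), 2.0 * i0 - i1 - i2)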
  • FIG. 17A shows a sequence 1-4 of projected Gray code intensities 554 according to a method by which the ambiguity in the distance Z based on a calculated phase can be eliminated. A collection of Gray code patterns is projected onto the object one at a time. In the illustrated example there are four sequence patterns, labeled 1, 2, 3, 4 on the left side of 554 in FIG. 17A. Sequence pattern 1 is dark (black) in its left half (elements 0-15) and bright (white) in its right half (elements 16-31). Sequence pattern 2 has a dark band toward the middle (elements 8-23) and bright bands toward the edges (elements 0-7, 24-31). Sequence pattern 3 has two separate dark bands (elements 4-11, 20-27) and three bright bands (elements 0-3, 12-19, 28-31). Sequence pattern 4 has four separate dark bands (elements 2-5, 10-13, 18-21, 26-29) and five separate bright bands (elements 0-1, 6-9, 14-17, 22-25, 30-31). For any particular pixel in the camera, this sequence of patterns allows the unambiguous object thickness range to be narrowed by a factor of 16 compared with an initial thickness range corresponding to all of the elements 0 through 31.
  • Another method 556, illustrated in FIG. 17C, uses a phase shift approach similar to the method of FIG. 16. In the embodiment illustrated in FIG. 17C, four sine periods of a pattern 556A are projected onto an object. For the reasons discussed above, an ambiguity in the distance Z to the object may exist when the pattern of FIG. 17C is used. One way to reduce or eliminate this ambiguity is to project one or more additional sinusoidal patterns 556B, 556C, each pattern having a different fringe period (pitch). For example, in FIG. 17B a second sinusoidal pattern 555 having three rather than four fringe periods is projected onto the object. In one embodiment, the difference between the phases of the two patterns 555, 556 may be used to help eliminate the ambiguity in the distance Z to the target.
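  • The way the phase difference between the four-period and three-period patterns removes the ambiguity can be sketched as follows; this assumes ideal, noise-free wrapped phases in [0, 2π) and is an illustration rather than the procedure of the embodiments.

      import numpy as np

      def two_wavelength_unwrap(phase4, phase3):
          # phase4, phase3: wrapped phases of the 4-period pattern 556A and
          # the 3-period pattern 555. Their difference (the "beat") completes
          # exactly one period over the field, so it has no 2*pi ambiguity.
          beat = np.mod(phase4 - phase3, 2.0 * np.pi)
          # Integer fringe order of the 4-period pattern, recovered from the beat.
          k = np.round((4.0 * beat - phase4) / (2.0 * np.pi))
          return phase4 + 2.0 * np.pi * k  # absolute (unwrapped) phase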
  • Another way to eliminate the ambiguity is to use a different type of method, such as the Gray code method of FIG. 17A, to eliminate the ambiguity in the distances Z calculated by the sinusoidal phase shift method.
  • In applications in which the object and the device 500 are in relative motion, it may be desirable to use a single pattern that allows the camera 510 to capture one image providing sufficient information for measuring the three-dimensional characteristics of the object 501 without having to project sequential images. Now referring to FIGS. 18 and 19, the patterns 558, 566 have a distribution of colors that in some cases allows the object to be measured on the basis of a single (coded) image. In the embodiment of FIG. 18, the pattern 558 uses lines with continuously varying wavelengths of light, creating a pattern in which, for example, the color changes continuously from blue to green to yellow to red to magenta. Thus, for any particular spectral wavelength there can be a one-to-one correspondence between the emitted pattern and the imaged pattern. With the correspondence determined, the three-dimensional coordinates of the object 501 can be obtained from a single imaged pattern. The stripes of the pattern 558 are oriented perpendicular to the epipolar lines in the projector plane. Since the epipolar lines in the projector plane map to epipolar lines in the image plane of the camera, a correspondence between projector points and camera points can be established by moving along the direction of an epipolar line in the image plane of the camera and registering the color of the line at each position. It will be appreciated that each pixel in the image plane of the camera corresponds to a two-dimensional angle. The color allows the determination of the one-to-one correspondence between particular projection angles and particular camera angles. This correspondence information, in combination with the distance between the camera and the projector (the baseline distance D) and the angles of the camera and the projector relative to the baseline, is sufficient to enable the determination of the distance Z from the camera to the object.
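  • A minimal sketch of this single-shot color correspondence, assuming an idealized linear mapping from hue to projection angle over one color cycle (the mapping, names, and normalization are illustrative assumptions, not taken from the disclosure):

      import numpy as np

      def phi_from_hue(hue, phi_min, phi_max):
          # hue: normalized position along the color cycle, 0.0 (blue end)
          # to 1.0 (magenta end). With a single cycle the mapping to the
          # projection angle phi is one-to-one.
          return phi_min + np.asarray(hue) * (phi_max - phi_min)

      # With phi known for each pixel and the camera angle omega known per
      # pixel, equation (1) yields the distance Z from a single image.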
  • FIG. 19 shows another embodiment with color patterns. In this embodiment, a plurality of colored patterns with different intensities 560, 562, 564 are combined to create a color pattern 566. In one embodiment, the plurality of intensity patterns 560, 562, 564 are of primary colors, such that the pattern 560 varies the intensity of the color red, the pattern 562 the intensity of the color green, and the pattern 564 the intensity of the color blue. Since the ratios of the colors are known, the resulting emitted image has a known ratio that can be decoded in the imaged pattern. As in the embodiment of FIG. 18, the three-dimensional coordinates of the object 501 can be determined once the correspondence is established. Unlike the pattern of FIG. 18, in which a single cycle of unique colors is projected, the pattern of FIG. 19 projects three complete cycles of nearly identical colors. In the pattern of FIG. 18 there is hardly any possibility of ambiguity in the measured distance Z (at least in the case where the projected lines are perpendicular to the epipolar lines), because each camera pixel registers a particular color that corresponds to only one projection direction. Since the camera angles and projection angles are known, triangulation can be used to determine the three-dimensional coordinates of the object at each pixel position from a single camera image. For this reason the method of FIG. 18 can be considered a coded single-shot method. In contrast, in FIG. 19 there is a possibility of ambiguity in the distance Z to an object point. For example, if the camera sees a purple color, the projector may have projected it from one of three different angles. Based on the triangulation geometry, three different distances Z are then possible. If it is known in advance that the thickness of the object lies within a relatively small range of values, then two of the values can be eliminated, thereby obtaining three-dimensional coordinates in a single shot. In the general case, however, the use of additional projected patterns would be required to eliminate the ambiguity. For example, the spatial period of the colored pattern may be changed and the changed pattern subsequently used to illuminate the object a second time. In that case, this method of projecting structured light is considered a sequential method rather than a coded single-shot method.
  • Now referring to FIGS. 20-23, coded structured light patterns for a single image acquisition based on a stripe indexing method are shown there. In the embodiments of FIGS. 20 and 21, patterns with color stripes 568, 570 are emitted from the projector 508. This method exploits a feature of image sensors, namely that the sensor has three independent color channels, such as red, green, and blue or cyan, yellow, and magenta. The combinations of values generated by these sensor channels can produce a large number of colored patterns. As in the embodiment of FIG. 19, the ratio of the color distribution is known, so that the relationship between the emitted pattern and the imaged pattern can be determined and the three-dimensional coordinates calculated. Other types of colored patterns may be used, such as a pattern based on a De Bruijn sequence. The stripe indexing method and the De Bruijn sequence are well known to those of ordinary skill in the art and therefore will not be discussed further here.
  • In the embodiments of FIGS. 22 and 23, a stripe indexing method without colors is used. In the embodiment of FIG. 22, the pattern 572 provides groups of stripes with multiple intensity (gray-scale) levels and different widths. As a result, a particular stripe group has a unique gray-value pattern within the image. Owing to the uniqueness of the groups, a one-to-one correspondence between the emitted pattern and the imaged pattern can be determined for the calculation of the coordinates of the object 501. In the embodiment of FIG. 23, the pattern 574 provides a series of stripes with a segmented pattern. Since each line has a unique segment design, the correspondence between the emitted pattern and the imaged pattern can be used to determine the coordinates of the object 501. In FIGS. 20-23, additional advantages can be achieved by orienting the projected lines 572, 574 perpendicular to the epipolar lines in the camera plane, because this simplifies the determination of a second dimension in finding the one-to-one correspondence between the camera and projector patterns.
  • Now referring to FIGS. 24-27, coded structured light patterns using a two-dimensional spatial grid pattern method are shown there. These types of patterns are arranged such that a subwindow, such as a window 576 in the pattern 578, is unique relative to all other subwindows within the pattern. In the embodiment of FIG. 24, a pattern 578 with a pseudo-random binary arrangement is used. In the pattern 578, a grid of elements such as circles 579 forms the coded pattern. It will be appreciated that elements with other geometric shapes may be used, such as, but not limited to, squares, rectangles, and triangles. In the embodiment of FIG. 25, a pattern 580 with a multi-valued pseudo-random arrangement is shown, each of the numerical values having an associated shape 582. These shapes 582 form unique subwindows 584 that allow the correspondence between the emitted pattern and the imaged pattern to be determined for the calculation of the coordinates of the object 501. In the embodiment of FIG. 26, the grid 586 is color-coded with stripes arranged perpendicular to the epipolar lines in the projector plane. The pattern of FIG. 26 does not necessarily provide a pattern that can be decoded in a single shot, but the color information can help simplify the analysis. In the embodiment of FIG. 27, an arrangement 588 of colored shapes, such as squares or circles, is used to form the pattern.
  • Now referring to FIGS. 28A-28B, an exemplary sinusoidal pattern 720 is shown there. The lines 722 are arranged perpendicular to the epipolar lines in the projector plane. The sinusoidal pattern 720 consists of 30 lines 722 which are repeated once, giving a total of 60 lines 722. Each line 722 has a sinusoidal feature 723 which is about 180 degrees out of phase with the lines above and below it. This allows the lines 722 to be placed as close together as possible and also allows for a greater depth of field, because the lines may be out of focus on the projected surface or in the captured image and still be detected. Each individual line 722 can be uniquely decoded using only the phase of that line, provided the line length is at least one wavelength of the sinusoid.
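  • The phase of one such line can be estimated, for example, by correlating its intensity profile with one period of a sinusoid. The sketch below is an illustration that assumes evenly sampled intensities along the detected line and a known pattern wavelength in pixels.

      import numpy as np

      def line_phase(samples, wavelength):
          # samples: intensity profile along one detected line, at least one
          # wavelength long; wavelength: period of the sinusoid in pixels.
          s = np.asarray(samples, dtype=float)
          x = np.arange(len(s)) * 2.0 * np.pi / wavelength
          # Single-bin Fourier projection: phase of the fundamental component.
          return np.arctan2(np.dot(s, np.sin(x)), np.dot(s, np.cos(x)))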
  • Because the pattern 720 repeats, it would normally cause an ambiguity in the identification of the lines. In this system, however, this problem is solved by the geometry of the camera's field of view and the depth of field. Within a single row of camera pixels, inside the depth of field over which the lines can be optically resolved, no two lines can be imaged with the same phase. For example, the first row of pixels in the camera can only receive reflected light from lines 1-30 of the pattern, whereas a row further down on the camera sensor receives reflected light only from lines 2-31 of the pattern, and so on. FIG. 28B shows an enlarged section of the pattern 720 with three lines, the phase between successive lines 722 being about 180 degrees. It also illustrates how the phase of each individual line is sufficient to uniquely decode the lines.
  • Now referring to FIGS. 29A-29B, another pattern 730 having square pattern elements is illustrated there. The lines 732 are arranged perpendicular to the epipolar lines in the projector plane. The square pattern 730 contains 27 lines 732 before the pattern repeats, for a total of 59 lines. The code elements 734 of the pattern 730 differ by the phase of the square wave from left to right in FIG. 29B. The pattern 730 is coded such that a group of successive lines 732 is distinguished by the relative phases of its elements. The successive lines are found in the image by scanning the lines vertically. In one embodiment, "vertical scanning" means scanning along the epipolar lines in the image plane of the camera. Successive lines in a vertical pixel column of the camera are assembled into a pair and their relative phases are determined. Four consecutive paired lines are required to decode a line group and locate it in the pattern 730. Because of the repetition, an ambiguity also exists in this pattern 730, but it is resolved in the same way as discussed above with respect to the sinusoidal pattern 720. FIG. 29B shows an enlarged view of four lines 732 of the square pattern. This illustrates that the phase of a single line 732 alone is not sufficient to uniquely decode a line, because the first and third lines have the same absolute phase.
  • This approach of encoding with relative phases instead of absolute phases provides the advantage of a higher tolerance for the positions of the phases. Minor errors in the design of the projector, which can cause the phases of the lines to shift over the full depth of field of the camera, as well as errors caused by the projector and camera lenses, make the determination of an absolute phase considerably more difficult. In an absolute-phase method this can be remedied by increasing the period so that it is large enough to accommodate the error in determining the phase.
  • It will be appreciated that in the case of a two-dimensional pattern projecting a coded light pattern, at least three non-collinear pattern elements are recognizable because of their codes, and since they are projected in two dimensions, these at least three pattern elements are not collinear. In the case of a periodic pattern, such as the sinusoidally repeating pattern, each sine period represents a plurality of pattern elements. Since there is a multiplicity of periodic patterns in two dimensions, the pattern elements are not collinear. In contrast, in the case of a laser line scanner emitting a line of light, all of the pattern elements lie on a straight line. Although the line has a width, and the tail of the line cross section may have less optical energy than the peak of the signal, these aspects of the line are not evaluated separately in determining the surface coordinates of an object, so they do not represent separate pattern elements. Although the line may contain several pattern elements, these pattern elements are collinear.
  • Furthermore, the various pattern methods may be combined as shown in FIGS. 30-31 to form either a binary (FIG. 30) uncoded checkerboard pattern 590 or a colored (FIG. 31) uncoded checkerboard pattern 592. In yet another embodiment, shown in FIG. 32, a photometric stereo method can be used, wherein a plurality of images 594 of the object 501 are recorded while the light source 596 is moved to a plurality of positions.
  • Now referring to FIG. 33, another embodiment of a system 700 for acquiring three-dimensional coordinates of an object 702 is shown there. In this embodiment, the device 704 is operated independently after it has been removed from the articulated arm CMM 100. The device 704 includes a controller 706 and an optional display 708. The display 708 may be integrated in the device 704 or may be a separate component that is coupled to the device 704 when the device is used independently of the articulated arm CMM 100. In embodiments where the display 708 is separable from the device 704, the display 708 may include a controller (not shown) that provides additional functionality to facilitate the independent operation of the device 704. In one embodiment, the controller 706 is arranged in the separable display.
  • The controller 706 includes a communication circuit configured to wirelessly transmit data, such as images or coordinate data, over a communication link 712 to the articulated arm CMM 100, to a separate computing device 710, or to a combination of both. The computing device 710 may be, but is not limited to, a computer, a laptop, a tablet computer, a personal digital assistant (PDA), or a mobile phone. The display 708 may allow the operator to view the captured images or the point cloud of the acquired coordinates of the object 702. In one embodiment, the controller 706 decodes the patterns in the captured image to determine the three-dimensional coordinates of the object. In another embodiment, the images are acquired by the device 704 and transmitted to either the articulated arm CMM 100, the computing device 710, or a combination of both.
  • The device 704 may also include a position locator assembly 714. The position locator assembly may include one or more inertial navigation sensors, such as a Global Positioning System (GPS) sensor, a gyroscope sensor, or an acceleration sensor. Such sensors may be electrically coupled to the controller 706. Gyroscope and acceleration sensors may be single-axis or multi-axis devices. The position locator assembly 714 is configured to allow the controller 706 to measure or maintain the orientation of the device 704 when the latter is removed from the articulated arm CMM 100. A gyroscope in the position locator assembly 714 may be a microelectromechanical system (MEMS) gyroscope, a semiconductor ring laser device, a fiber optic gyroscope, or another type.
  • When the device 704 is removed from the articulated arm CMM 100, a method of combining images obtained in multiple scans is employed. In one embodiment, the images are each obtained through the use of coded patterns, so that only a single image is necessary to obtain the three-dimensional coordinates associated with a particular position and orientation of the device 704. One method of combining several images captured by the device 704 is to provide at least some overlap between adjacent images, so that point cloud features can be matched. This matching function can be supported by the inertial navigation devices described above.
  • Another method that helps ensure the accurate registration of the images captured by the device 704 is the use of reference markers. In one embodiment, the reference markers are small marks with an adhesive backing, for example circular marks, which are placed on the object or objects to be measured. Even a small number of such markers may be useful in registering multiple images, especially if the measured object has a relatively small number of features usable for registration. In one embodiment, the reference markers may be projected as spots of light onto the object or objects under test. For example, a small portable projector that emits a multiplicity of small dots may be placed in front of the object or objects to be measured. An advantage of projected spots over glued-on spots is that the spots do not have to be attached and later removed.
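  • For illustration only: once at least three reference marks have been matched between two scans, registering the scans reduces to the classic best-fit rigid transform (the Horn/Kabsch SVD method). This is a standard technique and not necessarily the one used in the embodiments.

      import numpy as np

      def rigid_transform(p, q):
          # p, q: 3xN arrays of matched reference-mark coordinates in the
          # first and second scan. Returns R, t such that q ≈ R @ p + t.
          pc = p.mean(axis=1, keepdims=True)
          qc = q.mean(axis=1, keepdims=True)
          h = (p - pc) @ (q - qc).T
          u, _, vt = np.linalg.svd(h)
          d = np.sign(np.linalg.det(vt.T @ u.T))  # guard against reflections
          r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
          return r, qc - r @ pc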
  • In one embodiment, the device projects the structured light over a contiguous and enclosed area 716 and can capture an image of the area 716 at a distance of 100 mm to 300 mm with an accuracy of 35 μm. In one embodiment, the rectangular projection area 716 measures approximately 150 to 200 mm². The camera or cameras 510 may be digital cameras having a CMOS or CCD sensor with 1.2 to 5.0 megapixels.
  • Referring to FIGS. 28 and 29, the method of decoding a coded pattern will now be described. The first step in decoding an image of the pattern is to extract the centers of gravity (cog) 724 (FIG. 28C) of the features of the projected pattern 720 in the y direction. This is done by calculating a moving average of the pixel gray values while moving down in the y direction, processing one column at a time. If a pixel value in the image is above the moving average, a starting point for a feature has been found. After the starting point is found, the width of the feature is grown until a pixel value falls below the moving average. Then, using the pixel values and their y positions between the start and end points, a weighted average is calculated, yielding the center of gravity 724 of the pattern feature 723 in the image. The distances between the start and end points are also recorded for later use.
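  • A sketch of the per-column center-of-gravity extraction described above, assuming a simple running mean as the moving average; the window size and names are illustrative assumptions.

      import numpy as np

      def column_centroids(column, window=15):
          # column: one-dimensional array of gray values, top to bottom.
          column = np.asarray(column, dtype=float)
          avg = np.convolve(column, np.ones(window) / window, mode="same")
          above = column > avg
          cogs, start = [], None
          for y, flag in enumerate(above):
              if flag and start is None:
                  start = y                      # feature rises above the mean
              elif not flag and start is not None:
                  w = column[start:y]            # pixel values of the feature
                  # Intensity-weighted mean y position = center of gravity 724.
                  cogs.append(float(np.dot(np.arange(start, y), w) / w.sum()))
                  start = None
          return cogs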
  • The resulting centers of gravity 724 are next used to determine the pattern lines 722. This is done by moving in the left-to-right direction (in the orientation shown in the figures), starting with the first column of the image. For each center of gravity 724 in this column, the immediately adjacent column to the right is searched for a center of gravity 724 lying within a certain distance. If two matching centers of gravity 724 are found, a potential line has been determined. As the process moves across the image, further new lines are detected and previously determined lines are extended, as long as additional centers of gravity 724 are found within the tolerance. Once the entire image has been processed, a filter is applied to the extracted lines to ensure that only lines of a desired length, which is the wavelength of the pattern, are used in the remaining steps. FIG. 28C also shows the detected lines, all of which are longer than a single wavelength of the pattern. In one embodiment, there is no delta, or only a small delta, between centers of gravity of adjacent columns.
  • The next step in the decoding process is to extract the features of the projected pattern along the lines in the x direction in the form of block centers. Each pattern contains both wide and narrow blocks. In the sinusoidal pattern 720 these correspond to the peaks and valleys of the wave, and in the square pattern 730 to the wide and narrow squares or rectangles. This method proceeds in a similar manner to the extraction of the features in the y direction; however, the moving average is calculated using the widths determined in the first step, and the direction of movement is along the line. As described above, features are extracted in the regions where the widths are above the moving average, but in this step features are also extracted in those regions where the widths are below the moving average. The widths and the x positions serve to calculate a weighted average that gives the center of the block 726 in the x direction. The y positions of the centers of gravity 724 between the moving-average crossings are also used to compute a center of the block 726 in the y direction. This is done by taking the average of the y coordinates of the centers of gravity. The start and end points of each line are also adjusted based on the features extracted in this step, to ensure that both points lie where the moving-average crossover occurs. In one embodiment, only complete blocks are used in the later processing steps.
  • The lines and blocks are then processed further to ensure that the distance between the block centers 726 on each line lies within a given tolerance. This is done by taking the delta between the x center positions of two adjacent blocks on a line and checking that the delta is below the tolerance. If the delta is above the tolerance, the line is split into two smaller lines. If the split would occur between the last two blocks on a line, then the last block is removed and no further line is created. If the split would occur between the first and second or the second and third blocks on a line, then the blocks to the left of the split are likewise discarded and no further line is generated. In situations where a split occurs at any other location along the line, the line is divided into two lines: a new line is created and the appropriate blocks are transferred to it. After this processing step, the two patterns require different steps to complete the decoding.
  • The sinusoidal pattern 720 can now be decoded in one further processing step by means of the block centers on the lines. The modulo of each x block center with respect to the wavelength of the pattern 720 on a line 722 is calculated, and the average of these values gives the phase of the line 722. The phase of the line 722 can then be used to decode the line in the pattern 720, which in turn allows an x, y, z coordinate position to be determined for all centers of gravity 724 on that line 722.
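  • The modulo-based line identification can be sketched as follows. The phase table mapping line numbers to nominal phases is an assumed data structure; the disclosure does not specify how the lookup is stored.

      import numpy as np

      def sinusoid_line_phase(block_centers_x, wavelength):
          # Mean of the x block centers taken modulo the pattern wavelength;
          # the result is the phase that identifies the line in pattern 720.
          return float(np.mean(np.mod(block_centers_x, wavelength)))

      def identify_line(phase, phase_table, tol):
          # phase_table: assumed mapping {line_number: nominal_phase}.
          for line, nominal in phase_table.items():
              if abs(phase - nominal) < tol:
                  return line
          return None  # no line matches within tolerance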
  • Before the decoding of the square pattern 730, the lines 732 must first be connected vertically. This makes it possible to identify a group of lines, not just a single line as in the sinusoidal pattern. The connections 736 between the lines 732 are determined by examining the blocks 734 and using the centers of gravity contained in each block as calculated in the first processing step. The first center of gravity in each block on a line 732 is checked to see whether another center of gravity exists directly below it in the same column. If there is no center of gravity below it, then there is no connection to another line at that point and processing continues. If there is a center of gravity below it, the y distance between the two centers of gravity is determined and compared with the nominal maximum distance between lines. If the distance is less than this value, the two lines are considered connected at this point, the connection 736 is stored, and processing continues with the next block. In one embodiment, a line connection 736 is unique, so that no two lines have more than one connection 736 between them.
  • The next processing step for the square pattern 730 is the phase calculation between connected lines. Each pair of lines 732 is first processed to determine the length of the overlap between them. In one embodiment, at least one wavelength of overlap between the line pair must be present to allow the calculation of the relative phase. If the lines have the desired overlap, the center of gravity in the middle of the overlap region is determined. The blocks 738 containing that center of gravity and the center of gravity directly below it are determined, and the relative phase between the x block centers is calculated for this line connection. This procedure is repeated for all connections between lines. In one embodiment, the method is applied only in the downward direction along the y axis. This is because the code is based on connections below lines, not above them, and not both. FIG. 29C shows the blocks 738 that could be used for calculating the relative phases in this set of lines. The relative phases in the embodiment of FIG. 29C are 3, 1, and 2, and these phases would be used in the last step to decode the top line.
  • The next step in the decoding of the square pattern 730 is the execution of a lookup with the relative phases calculated in the previous step. Each line 732 is processed by following the line connections 736 until a connection depth of four is reached. This depth is used because it corresponds to the number of phases required to decode a line. At each level of the connection, a hash value is built up from the relative phase between the lines 732. Once the required connection depth has been reached, the hash is used to look up the line code. If the hash corresponds to a valid code, the result is recorded and stored in a voting system. Each line 732 is processed in this way, and every chain of connections that reaches the desired depth generates a vote if it corresponds to a valid phase combination. The final step is to determine which code received the most votes on each line 732 and to assign that value as the code of the line 732. If there is no unique code that received the most votes, then no code is assigned to the line. Once a code has been assigned, the lines 732 are identified, and the x, y, z coordinate position of all centers of gravity on that line 732 can be determined.
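  • A hedged sketch of the depth-four lookup-and-vote step. The connection graph, relative-phase table, and codebook below are assumed data structures standing in for whatever the embodiment actually stores; only the control flow mirrors the description above.

      from collections import Counter

      def decode_line(start, connections, rel_phase, codebook, depth=4):
          # connections: line -> list of lines connected directly below it.
          # rel_phase: (upper, lower) -> relative phase between the pair.
          # codebook: tuple of `depth` relative phases -> line code.
          votes = Counter()

          def walk(line, phases):
              if len(phases) == depth:
                  code = codebook.get(tuple(phases))
                  if code is not None:
                      votes[code] += 1   # one vote per valid combination
                  return
              for nxt in connections.get(line, []):
                  walk(nxt, phases + [rel_phase[(line, nxt)]])

          walk(start, [])
          best = votes.most_common(2)
          if best and (len(best) == 1 or best[0][1] > best[1][1]):
              return best[0][0]
          return None  # no unique winner: leave the line unassigned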
  • It should be noted that although the descriptions given above distinguish between line scanners and area scanners (structured light scanners) based on whether three or more pattern elements are collinear, the intent of this criterion is to distinguish patterns projected as areas from patterns projected as lines. Consequently, patterns projected in a linear fashion, with information only along a single path, are still line patterns even if the one-dimensional pattern is curved.
  • Although the invention has been described with reference to example embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof without departing from the scope of the invention. Furthermore, numerous modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from the essential scope thereof. Accordingly, it is intended that the invention not be limited to the particular embodiment disclosed as the best mode contemplated for carrying out this invention, but that the invention will include all embodiments falling within the scope of the appended claims. Moreover, the use of the terms "first," "second," etc. does not denote any order or importance; rather, the terms "first," "second," etc. are used to distinguish one element from another. Furthermore, the use of the terms "a," "an," etc. does not denote a limitation of quantity, but rather denotes the presence of at least one of the referenced items.

Claims (46)

  1. A portable articulated arm coordinate measuring machine (articulated arm CMM) for measuring three-dimensional coordinates of an object in space, comprising: a base; a manually positionable arm portion having opposite first and second ends, the arm portion being rotatably coupled to the base, the arm portion including a plurality of connected arm segments, each arm segment including at least one position transducer for generating a position signal; an electronic circuit that receives the position signal from the at least one position transducer in each arm segment; a probe end coupled to the first end; a noncontact three-dimensional measuring device coupled to the probe end, the noncontact three-dimensional measuring device having a projector and an image sensor, the projector having a source plane, the projector being configured to emit a structured light onto the object, the structured light being emitted from the source plane and comprising at least three non-collinear pattern elements, the image sensor being arranged to receive the structured light reflected from the object; and a processor electrically coupled to the electronic circuit, the processor being configured to determine the three-dimensional coordinates of a point on the object in response to receiving the position signals from the position transducers and in response to receiving the structured light by the image sensor.
  2. Articulated arm CMM according to claim 1, wherein the non-contact three-dimensional measuring device is removably coupled to the probe end.
  3. Articulated arm CMM according to claim 1, wherein the structured light is a coded structured light pattern.
  4. Articulated arm CMM according to claim 3, wherein the non-contact three-dimensional measuring device is independently operable when it is detached from the probe end.
  5. Articulated arm CMM according to claim 3, wherein the coded structured light pattern comprises graphic elements comprising at least one of a square, rectangle and triangle.
  6. Articulated arm CMM according to claim 5, wherein the coded light pattern is a sinusoidal pattern comprising 30 lines and each line is 180 degrees out of phase with respect to adjacent lines.
  7. Articulated arm CMM according to claim 5, wherein the coded structured light pattern comprises 27 lines and the graphic elements are squares.
  8. Articulated arm CMM according to claim 3, wherein the coded structured light comprises a pattern having a plurality of wavelengths, wherein at least one wavelength has a different spatial arrangement than the other wavelengths.
  9. Articulated arm CMM according to claim 3, wherein the coded structured light comprises a pattern with a plurality of different colors.
  10. Articulated arm CMM according to claim 3, wherein the coded structured light comprises a segmented line pattern.
  11. Articulated arm CMM according to claim 3, wherein the coded structured light comprises a two-dimensional spatial grid pattern.
  12. Articulated arm CMM according to claim 11, wherein the two-dimensional spatial grid pattern comprises a pseudo-random binary arrangement.
  13. The articulated arm CMM of claim 11, wherein the two-dimensional spatial grid pattern comprises a color-coded grid.
  14. The articulated arm CMM of claim 11, wherein the two-dimensional spatial grid pattern comprises a multi-valued pseudorandom array.
  15. The articulated arm CMM of claim 11, wherein the two-dimensional spatial grid pattern comprises a two-dimensional array of color-coded geometric shapes.
  16. The articulated arm CMM of claim 1, wherein the structured light pattern is an uncoded structured light pattern.
  17. The articulated arm CMM of claim 16, wherein the uncoded patterned light pattern comprises successive projection images.
  18. Articulated arm CMM according to claim 17, wherein the successive projection images are a set of binary patterns.
  19. The articulated arm CMM of claim 17, wherein the successive projection images are a set of patterns comprising stripes having at least two intensity levels.
  20. Articulated arm CMM according to claim 17, wherein the successive projection images are a set of at least three sinusoidal patterns.
  21. The articulated arm CMM of claim 17, wherein the successive projection images are a set of patterns comprising stripes having at least two intensity levels comprising at least three sinusoidal patterns.
  22. The articulated arm CMM of claim 16, wherein the uncoded structured light pattern comprises a sequence of differently illuminated patterns, each pattern of the sequence being projected from a different location relative to the object.
  23. The articulated arm CMM of claim 16, wherein the uncoded structured light pattern comprises a repeated gray level pattern.
  24. Articulated arm CMM according to claim 1, further comprising a contact measuring device coupled to the probe end.
  25. Articulated arm CMM according to claim 1, wherein the processor is arranged in the contactless three-dimensional measuring device.
  26. A method of operating a portable articulated arm coordinate measuring machine for measuring the coordinates of an object in space, comprising: providing a manually positionable arm portion having opposite first and second ends, the arm portion including a plurality of connected arm segments, each arm segment including at least one position transducer for generating a position signal; providing a probe end, coupled to the first end, for measuring the object; receiving the position signals from the position transducers at an electronic circuit; providing a noncontact three-dimensional measuring device having a controller, the noncontact three-dimensional measuring device including a sensor and a projector, the projector having a source plane and being configured to emit a structured light onto the object, the structured light being emitted from the source plane and comprising at least three non-collinear pattern elements; and projecting the structured light from the three-dimensional measuring device onto the object.
  27. The method of claim 26, further comprising receiving a reflection of the structured light from the object with the three-dimensional measuring device.
  28. The method of claim 27, further comprising determining three-dimensional coordinates of a point on the object from the reflected structured light.
  29. The method of claim 28, wherein the structured light is a coded structured light.
  30. The method of claim 29, wherein the coded structured light pattern projected on the object comprises graphic elements comprising at least one of a square, rectangle, and triangle.
  31. The method of claim 29, wherein the coded light pattern is a sinusoidal pattern comprising 30 lines and each line is 180 degrees out of phase with respect to adjacent lines.
  32. The method of claim 29, wherein the coded structured light pattern comprises 27 lines and the graphical elements are squares.
  33. The method of claim 29, wherein the coded structured light comprises a pattern having a plurality of wavelengths, wherein at least one wavelength has a different spatial arrangement than the other wavelengths.
  34. The method of claim 33, wherein the pattern having a plurality of wavelengths is arranged in lines oriented substantially perpendicular to the source plane.
  35. The method of claim 29, wherein the coded structured light comprises a single pattern having a plurality of different colors.
  36. The method of claim 29, wherein the coded structured light comprises a segmented line pattern.
  37. The method of claim 29, wherein the coded structured light comprises a two-dimensional spatial grid pattern.
  38. The method of claim 37, wherein the two-dimensional spatial grid pattern comprises a pseudorandom binary array.
  39. The method of claim 37, wherein the two-dimensional spatial grid pattern comprises a color-coded grid.
  40. The method of claim 37, wherein the two-dimensional spatial grid pattern comprises a multi-valued pseudorandom array.
  41. The method of claim 37, wherein the two-dimensional spatial grid pattern comprises a two-dimensional array of color-coded geometric shapes.
  42. The method of claim 29, further comprising changing the coded structured light from a first pattern to a second pattern with the electronic circuit.
  43. The method of claim 42, wherein the coded structured light is changed in response to an operator input.
  44. The method of claim 42, wherein the coded structured light is automatically changed by the electronic circuit in response to a change in conditions on the object.
  45. The method of claim 28, further comprising: separating the three-dimensional measuring device from the probe end; and operating the three-dimensional measuring device separately from the probe end.
  46. The method of claim 45, further comprising transmitting data from the three-dimensional measuring device while it is operated separately from the probe end.