JP6355710B2 - Non-contact optical three-dimensional measuring device

Info

Publication number
JP6355710B2
Authority
JP
Japan
Prior art keywords
light
camera
projector
scanner
surface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2016500623A
Other languages
Japanese (ja)
Other versions
JP2016514271A (en)
Inventor
Bernd-Dietmar Becker
Robert E. Bridges
Original Assignee
Faro Technologies, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US201361791797P (US61/791,797)
Priority to US13/932,267 (US9482529B2)
Application filed by Faro Technologies, Inc.
Priority to PCT/US2014/020481 (WO2014149702A1)
Publication of JP2016514271A
Application granted
Publication of JP6355710B2
Status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical means
    • G01B 11/24 Measuring arrangements characterised by the use of optical means for measuring contours or curvatures
    • G01B 11/25 Measuring arrangements characterised by the use of optical means for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes, on the object
    • G01B 11/2513 Measuring arrangements characterised by the use of optical means for measuring contours or curvatures by projecting a pattern with several lines being projected in more than one direction, e.g. grids, patterns
    • G01B 11/245 Measuring arrangements characterised by the use of optical means for measuring contours or curvatures using a plurality of fixed, simultaneously operating transducers
    • G01B 21/00 Measuring arrangements or details thereof in so far as they are not adapted to particular types of measuring means of the preceding groups
    • G01B 21/02 Measuring arrangements for measuring length, width, or thickness
    • G01B 21/04 Measuring arrangements for measuring length, width, or thickness by measuring coordinates of points
    • G01B 21/045 Correction of measurements

Description

  The subject matter disclosed herein relates to a three-dimensional coordinate scanner, and more particularly to a triangulation scanner having multiple data acquisition schemes.

  The acquisition of three-dimensional coordinates of an object or environment is known. Various techniques can be used, such as time-of-flight or triangulation methods. A time-of-flight system, such as a laser tracker, total station, or time-of-flight scanner, directs a beam of light, such as a laser beam, toward a retroreflector target or a spot on the object surface. An absolute rangefinder is used to determine the distance to the target or spot based on the length of time the light takes to travel to the target or spot and return. By moving the laser beam or target over the surface of the object, the coordinates of the object are ascertained. Time-of-flight systems have the advantage of relatively high accuracy, but because they ordinarily must measure each point on the surface individually, they are typically slower than some other systems.

  In contrast, scanners that use triangulation to measure three-dimensional coordinates project onto the surface either a line of light (such as the line of laser light from a laser line probe) or a pattern of light covering an area (such as structured light). By mounting the camera and the projector on a common frame, for example, the camera is coupled to the projector in a fixed relationship. The light emitted from the projector is reflected off the surface and detected by the camera. Since the camera and projector are arranged in a fixed relationship, the distance to the object can be determined using the principles of trigonometry. Compared to coordinate measuring devices that use tactile probes, triangulation systems offer the advantage of acquiring coordinate data quickly over a large area. As used herein, the collection of three-dimensional coordinate values provided by a triangulation system is referred to as point cloud data or simply a point cloud.
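As a concrete illustration of the triangulation principle just described (a hedged sketch, not taken from the patent): given the fixed baseline between the projector and the camera, and the angles that the projected and received rays make with that baseline, the distance to the surface point follows from the law of sines. The function name and the numeric values below are hypothetical.

```python
import math

def triangulate_distance(baseline_m, projector_angle_deg, camera_angle_deg):
    """Perpendicular distance from the baseline to a surface point.

    The projector and camera sit at opposite ends of the baseline; each
    angle is measured between the baseline and the ray to the point.
    """
    a = math.radians(projector_angle_deg)
    b = math.radians(camera_angle_deg)
    c = math.pi - a - b                                   # angle at the surface point
    side_camera = baseline_m * math.sin(a) / math.sin(c)  # law of sines
    return side_camera * math.sin(b)

# Hypothetical example: 150 mm baseline, rays at 75 and 80 degrees
print(triangulate_distance(0.150, 75.0, 80.0))  # ~0.34 m
```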

  Several problems can prevent the acquisition of highly accurate point cloud data when using a laser scanner. These include, but are not limited to: variations in the level of light received at the camera image plane, resulting from changes in the reflectance of the object surface or changes in the angle of incidence of the projected light on the surface; low resolution near edges, such as the edges of holes; and multipath interference. In some cases, the operator may not notice a problem or may be unable to resolve it. In these cases, missing or erroneous point cloud data results.

  Thus, while existing scanners are suitable for their intended purpose, there remains a need for improvement, particularly in providing a scanner that can adapt to unfavorable conditions and improve the acquisition of data points.

The present invention is a non-contact optical three-dimensional measuring device comprising: an assembly that includes a first projector, a first camera, a second projector, and a second camera, the first projector, the first camera, the second projector, and the second camera being in a fixed relationship to one another; the first projector having a first light source and being configured to emit onto the surface of an object a first light having at least one pattern; the first camera having a first lens and a first photosensitive array and being configured to receive a first portion of the first light reflected from the surface and to produce a corresponding first signal, the first camera having a first field of view, the first field of view being a first viewing angle region of the first camera; the second projector having a second light source and being configured to emit a second light onto the surface; the second camera having a second lens and a second photosensitive array and being configured to receive a second portion of the second light reflected from the surface and to produce a corresponding second signal, the second camera having a second field of view, the second field of view being a second viewing angle region of the second camera and being different from the first field of view; and a processor electrically coupled to the assembly, the first projector, the second projector, the first camera, and the second camera, the processor executing computer-executable program code that performs operations including collecting the first signal at a first time, collecting the second signal at a second time different from the first time, determining three-dimensional coordinates of a first point on the surface based at least in part on the first signal, and determining three-dimensional coordinates of a second point on the surface based at least in part on the second signal. The processor is further configured to execute computer-executable program code that, when executed by the processor, performs operations including determining the presence of multipath interference and, if multipath interference is present, moving the assembly from a first position to a second position at a third time between the first time and the second time.

According to one aspect of the present invention, a non-contact optical three-dimensional measuring device is provided comprising: an assembly including a projector, a first camera, and a second camera, the projector, the first camera, and the second camera being fixed with respect to one another, with a first distance between the projector and the first camera and a second distance between the projector and the second camera; the projector having a light source and emitting onto the surface of the object a first light having any one of a plurality of spatially varying patterns; the first camera having a first lens and a first photosensitive array, the first camera receiving a first portion of the first light reflected from the surface and producing a corresponding first digital signal, the first camera having a first field of view, the first field of view being a first viewing angle region of the first camera; the second camera having a second lens and a second photosensitive array, the second camera receiving a second portion of the first light reflected from the surface and producing a corresponding second digital signal, the second camera having a second field of view, the second field of view being a second viewing angle region of the second camera and being different from the first field of view; a processor electrically coupled to the projector, the first camera, and the second camera; and a computer readable medium with program code that, when executed by the processor, collects the first digital signal at a first time and the second digital signal at a second time different from the first time, determines three-dimensional coordinates of a first point on the surface based at least in part on the first digital signal and the first distance, and determines three-dimensional coordinates of a second point on the surface based at least in part on the second digital signal and the second distance.

In another aspect of the present invention, a method for determining three-dimensional coordinates on the surface of an object is provided. The method includes: providing an assembly that includes a projector, a first camera, and a second camera, the projector, the first camera, and the second camera being fixed with respect to one another, with a first distance between the projector and the first camera and a second distance between the projector and the second camera, the projector having a light source and being configured to emit onto the surface a first light having any one of a plurality of spatially varying patterns, the first camera having a first lens and a first photosensitive array and being configured to receive a first portion of the first light reflected from the surface, the first camera having a first field of view, the first field of view being a first viewing angle region of the first camera, the second camera having a second lens and a second photosensitive array and being configured to receive a second portion of the first light reflected from the surface, the second camera having a second field of view, the second field of view being a second viewing angle region of the second camera and being different from the first field of view; providing a processor electrically coupled to the projector, the first camera, and the second camera; emitting from the projector onto the surface, in a first instance, the first light having a first pattern selected from among the plurality of spatially varying patterns; acquiring, in the first instance, a first image of the surface with the first camera and sending a corresponding first digital signal to the processor; determining a first set of three-dimensional coordinates of first points on the surface based at least in part on the first pattern, the first digital signal, and the first distance; carrying out a diagnostic procedure to evaluate the quality of the first set; selecting a second pattern of the first light from among the plurality of spatially varying patterns based at least in part on a result of the diagnostic procedure; emitting from the projector onto the surface, in a second instance, the first light having the second pattern; acquiring, in the second instance, a second image of the surface with the second camera and sending a corresponding second digital signal to the processor; and determining a second set of three-dimensional coordinates of second points on the surface based at least in part on the second pattern, the second digital signal, and the second distance.

  These and other advantages and features will become more apparent from the following description taken in conjunction with the drawings.

  The subject matter considered as the invention is pointed out in detail and explicitly claimed in the claims at the end of the description. These and other features and advantages of the present invention will become apparent from the following detailed description when taken in conjunction with the accompanying drawings.

FIG. 1 is a schematic top view of a scanner according to an embodiment of the present invention.
FIG. 2 is a flowchart illustrating a method of operating the scanner of FIG. 1.
FIG. 3 is a schematic top view of a scanner according to another embodiment of the invention.
FIG. 4 is a flowchart illustrating a method of operating the scanner of FIG. 3.
FIG. 5A is a schematic diagram of elements of a laser scanner according to one embodiment.
FIG. 5B is a flow diagram illustrating a method of operating a scanner according to one embodiment.
FIG. 6 is a schematic top view of a scanner according to another embodiment of the invention.
FIG. 7 is a flow diagram illustrating a method of operating a scanner according to one embodiment.
FIG. 8A is a perspective view of a scanner used with a remote probe device according to an embodiment of the present invention.
FIG. 8B is a perspective view of a scanner used with a remote probe device according to an embodiment of the present invention.
FIG. 9 is a flowchart illustrating a method of operating the scanner of FIGS. 8A and 8B.
FIG. 10 is a schematic top view of a scanner according to one embodiment.
FIG. 11 is a flowchart illustrating a method of operating the scanner of FIG. 10.
FIG. 12 is a flowchart illustrating a diagnostic method according to an embodiment.

  The detailed description explains embodiments of the invention, together with advantages and features, by way of example with reference to the drawings.

  Embodiments of the present invention provide the advantage of increased reliability and accuracy of the three-dimensional coordinates of the point cloud data acquired by the scanner. Embodiments of the present invention provide the advantage of detecting anomalies in the acquired data and automatically adjusting the operation of the scanner to obtain the desired results. Embodiments of the present invention provide the advantage of detecting anomalies in the acquired data and instructing the operator about areas where additional data acquisition is needed. Still further embodiments of the present invention provide the advantage of detecting anomalies in the acquired data and indicating to the operator when additional data should be acquired with a remote probe.

  The scanner device acquires three-dimensional coordinate data of an object. In one embodiment, the scanner 20 shown in FIG. 1 has a housing 22 that includes a first camera 24, a second camera 26, and a projector 28. The projector 28 emits light 30 onto the surface 32 of an object 34. In the exemplary embodiment, the projector 28 uses a visible light source that illuminates a pattern generator. The visible light source may be, for example, a laser, a superluminescent diode, an incandescent light, a xenon lamp, a light emitting diode (LED), or another light emitting device. In one embodiment, the pattern generator is a chrome-on-glass slide with a structured light pattern etched into it. The slide may have a single pattern or multiple patterns that can be moved into and out of position as needed. The slide can be placed in the operating position manually or automatically. In other embodiments, the source pattern may be light reflected from or transmitted through a digital micromirror device (DMD) such as the digital light projector (DLP) manufactured by Texas Instruments Corporation, a liquid crystal device (LCD), a liquid crystal on silicon (LCOS) device, or a similar device used in either reflection or transmission mode. The projector 28 may further include a lens system 36 that alters the emitted light to cover the desired area.

  In one embodiment, the projector 28 is configurable to emit structured light over an area 37. As used herein, the term “structured light” refers to a two-dimensional pattern of light projected onto an area of an object that conveys information that can be used to determine the coordinates of points on the object. In one embodiment, a structured light pattern contains at least three non-collinear pattern elements placed within the area. Each of the three non-collinear pattern elements conveys information that can be used to determine the point coordinates. In another embodiment, a projector is provided that is configurable to project both an area pattern and a line pattern. In one embodiment, the projector is a digital micromirror device (DMD) configured to switch back and forth between the two. In one embodiment, the DMD projector can also sweep a line of light or sweep a point of light in a raster pattern.

  In general, there are two types of structured light patterns: coded light patterns and uncoded light patterns. As used herein, a coded light pattern is one in which the three-dimensional coordinates of the illuminated surface of the object can be obtained by acquiring a single image. With a coded light pattern, point cloud data can be acquired and recorded while the projection device is moving relative to the object. One type of coded light pattern contains a set of elements (such as geometric shapes) arranged in lines, where at least three of the elements are non-collinear. Such pattern elements are recognizable because of their arrangement.
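The non-collinearity requirement mentioned above can be checked with a simple cross-product test; if the three pattern elements were collinear, their arrangement could not disambiguate point coordinates. A minimal sketch with hypothetical coordinates:

```python
def non_collinear(p1, p2, p3, tol=1e-9):
    """True if three 2D points do not lie on a single straight line."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # twice the signed area of the triangle formed by the three points
    cross = (x2 - x1) * (y3 - y1) - (y2 - y1) * (x3 - x1)
    return abs(cross) > tol

print(non_collinear((0, 0), (1, 0), (2, 0)))  # False: all on one line
print(non_collinear((0, 0), (1, 0), (1, 1)))  # True: usable arrangement
```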

  In contrast, an uncoded structured light pattern, as used herein, is a pattern that does not allow measurement from a single pattern. Instead, a series of uncoded light patterns is projected and imaged sequentially. In this case, it is usually necessary to hold the projector fixed relative to the object.

  It should be appreciated that the scanner 20 can use either coded or uncoded structured light patterns. The structured light pattern may include the patterns disclosed in the journal article “DLP-Based Structured Light 3D Imaging Technologies and Applications” by Jason Geng, published in Proceedings of SPIE, Vol. 7932. In addition, in some embodiments described below, the projector 28 transmits a pattern formed by a swept line of light or a swept point of light. Swept lines and points of light are advantageous over area light in identifying certain types of anomalies, such as multipath interference. Automatically sweeping a line while the scanner is held stationary is also advantageous in obtaining a more uniform sampling of surface points.

  The first camera 24 includes a photosensitive sensor 44 that generates a digital image/representation of the area 48 within the sensor's field of view. The sensor may be, for example, a charge-coupled device (CCD) type sensor or a complementary metal-oxide-semiconductor (CMOS) type sensor having an array of pixels. The first camera 24 may further include other components, such as but not limited to a lens 46 and other optical elements. The lens 46 has an associated first focal length. The sensor 44 and lens 46 cooperate to define a first field of view “X”. In the exemplary embodiment, the first field of view “X” is 16 degrees (0.28 inch per inch).

  Similarly, the second camera 26 includes a photosensitive sensor 38 that generates a digital image/representation of the area 40 within the sensor's field of view. The sensor may be, for example, a charge-coupled device (CCD) type sensor or a complementary metal-oxide-semiconductor (CMOS) type sensor having an array of pixels. The second camera 26 may further include other components, such as but not limited to a lens 42 and other optical elements. The lens 42 has an associated second focal length that is different from the first focal length. The sensor 38 and lens 42 cooperate to define a second field of view “Y”. In the exemplary embodiment, the second field of view “Y” is 50 degrees (0.85 inch per inch). The second field of view Y is larger than the first field of view X, and accordingly the area 40 is larger than the area 48. It should be appreciated that a larger field of view allows a given region of the object surface 32 to be measured faster; however, if the photosensitive arrays 44, 38 have the same number of pixels, the smaller field of view provides higher resolution.
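The trade-off can be made concrete with a rough calculation of each pixel's footprint on the object. This is a sketch under assumed values: the patent gives the two FOV angles, but not the array size or standoff distance, so the 1000-pixel row and the 500 mm distance below are hypothetical.

```python
import math

def pixel_footprint_mm(fov_deg, standoff_mm, pixels_per_row):
    """Approximate lateral width on the object covered by one pixel."""
    scene_width = 2.0 * standoff_mm * math.tan(math.radians(fov_deg) / 2.0)
    return scene_width / pixels_per_row

print(pixel_footprint_mm(16.0, 500.0, 1000))  # ~0.14 mm, small FOV "X"
print(pixel_footprint_mm(50.0, 500.0, 1000))  # ~0.47 mm, wide FOV "Y"
```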

  In the exemplary embodiment, projector 28 and first camera 24 are arranged in a fixed relationship at an angle such that sensor 44 can receive light reflected from the surface of object 34. Similarly, the projector 28 and the second camera 26 are arranged in a fixed relationship at an angle such that the sensor 38 can receive light reflected from the surface 32 of the object 34. Since the projector 28, the first camera 24, and the second camera 26 have a fixed geometric relationship, the distance and coordinates of points on the surface can be determined by their trigonometric relationship. Although the fields of view (FOV) of the cameras 24 and 26 are shown in FIG. 1 as not overlapping, the FOVs may overlap partially or completely.

  The projector 28 and the cameras 24 and 26 are electrically coupled to a control device 50 mounted in the housing 22. The controller 50 can include one or more microprocessors, digital signal processors, memories, and signal conditioning circuits. The scanner 20 may further include an actuator (not shown) that is manually activated by an operator to initiate operation and data acquisition by the scanner 20. In one embodiment, image processing for determining X, Y, Z coordinate data of a point cloud representing the surface 32 of the object 34 is performed by the control device 50. The coordinate data can be stored locally in the volatile or non-volatile memory 54, for example. The memory can be removable, for example a flash drive or a memory card. In other embodiments, the scanner 20 includes a communication circuit 52 that causes the scanner 20 to transmit coordinate data to the remote processing system 56. The communication medium 58 between the scanner 20 and the remote processing system 56 may be wired (such as Ethernet (registered trademark)) or wireless (such as Bluetooth or IEEE 802.11). In one embodiment, coordinate data is determined by remote processing system 56 based on acquired images transmitted by scanner 20 through communication medium 58.

  Relative motion is possible between the object surface 32 and the scanner 20 as indicated by the double arrow 47. There are several ways in which such relative motion is performed. In one embodiment, the scanner is a handheld scanner and the object 34 is fixed. The relative movement is performed by moving the scanner on the object surface. In another embodiment, the scanner is attached to a robotic end effector. When the robot moves the scanner on the object surface, relative movement is performed by the robot. In another embodiment, the scanner 20 or object 34 is mounted on a movable mechanical mechanism, such as a gantry coordinate measuring machine or an articulated arm CMM. When the scanner 20 is moved on the object surface, the relative movement is performed by moving the mechanical mechanism. In some embodiments, the movement is performed by operator action, and in other embodiments, the movement is performed by a mechanism under computer control.

  Referring now to FIG. 2, the operation of the scanner 20 according to a method 1260 is illustrated. As shown in block 1262, the projector 28 first emits a structured light pattern onto the area 37 of the surface 32 of the object 34. The light 30 from the projector 28 is reflected from the surface 32, and the reflected light 62 is received by the second camera 26. The three-dimensional profile of the surface 32 affects the image of the pattern captured by the photosensitive array 38 in the second camera 26. Using information collected from one or more images of one or more patterns, the controller 50 or the remote processing system 56 determines a one-to-one correspondence between the pixels of the photosensitive array 38 and the pattern of light emitted by the projector 28. Using this one-to-one correspondence, the principles of trigonometry are applied to determine the three-dimensional coordinates of points on the surface 32. This acquisition of three-dimensional coordinate data (point cloud data) is indicated in block 1264. By moving the scanner 20 over the surface 32, a point cloud is created for the entire object 34.

  During the scanning process, the controller 50 or the remote processing system 56 may detect an undesirable condition or problem in the point cloud data, as indicated in block 1266. A method for detecting such a problem is described below with respect to FIG. The detected problem may be, for example, erroneous or missing point cloud data in a particular area. Erroneous or missing data can be caused by too little or too much light being reflected from that area. Too little or too much reflected light can result, for example, from differences in reflectance across the object surface 32, from a high or variable angle of incidence of the light 30 on the surface, or from low-reflectivity (black or transparent) materials or glossy surfaces. Points on the object may also lie at angles that produce a very bright specular reflection known as glint.

  Another possible reason for erroneous or missing point cloud data is a lack of resolution in areas having fine features, sharp edges, or abrupt changes in depth. Such a lack of resolution can be a problem, for example, near the edge of a hole.

  Another possible reason for erroneous or missing point cloud data is multipath interference. Ordinarily, a ray of light from the projector 28 strikes a point on the surface 32 and is scattered over a range of angles. The scattered light is imaged as a small spot on the photosensitive array 38 by the lens 42 of the camera 26. Similarly, the scattered light may be imaged as a small spot on the photosensitive array 44 by the lens 46 of the camera 24. Multipath interference occurs when the light reaching a point on the surface 32 is not only the ray of light coming directly from the projector 28, but also secondary light reflected off another portion of the surface 32. Such secondary light can disturb the pattern of light received by the photosensitive arrays 38, 44, thereby preventing accurate determination of the three-dimensional coordinates of the point. A method for identifying the presence of multipath interference is described in the present application with respect to FIG. 5A.

  If the controller determines at block 1266 that there is no problem with the point cloud, the procedure ends. Otherwise, a determination is made at block 1268 as to whether the scanner is used in manual or automatic mode. If the mode is manual, the operator is guided at block 1270 to move the scanner to the desired position.

  There are many ways in which the desired movement can be indicated to the operator. In one embodiment, indicator lights on the scanner body show the desired direction of movement. In another embodiment, light is projected onto the surface to indicate the direction in which the operator should move. In addition, the color of the projected light can indicate whether the scanner is too close to or too far from the object. In another embodiment, an indication of the area over which the operator is to project the light is given on a display. Such a display may show point cloud data, a CAD model, or a graphical combination of the two. The display can be presented on a computer monitor or on a display incorporated into the scanning device.

  In any of these embodiments, a way of determining the approximate position of the scanner is needed. In one case, the scanner may be mounted on an articulated arm CMM that uses joint angle encoders to determine the position and orientation of the scanner mounted at its end. In another case, the scanner includes inertial sensors provided in the device. Inertial sensors may include, for example, gyroscopes, accelerometers, and magnetometers. Another way to determine the approximate position of the scanner is to illuminate photogrammetric dots provided as marker points on or around the object. In this way, the wide-FOV camera of the scanner can determine the approximate position of the scanner with respect to the object.

  In another embodiment, a CAD model on a computer screen indicates the regions where additional measurements are desired, and the operator moves the scanner accordingly by matching features on the object with features shown on the display. By updating the CAD model on the screen as the scan proceeds, the operator receives rapid feedback on whether the desired regions of the part have been measured.

  After the operator has moved the scanner into position, a measurement is made at block 1272 with the small-FOV camera 24. Viewing a relatively smaller area at block 1272 improves the resolution of the resulting three-dimensional coordinates and provides a better ability to resolve features such as holes and edges.

  Because the small-FOV camera views a relatively smaller area than the wide-FOV camera, the projector 28 may illuminate a relatively smaller area. This is advantageous in eliminating multipath interference, since relatively fewer illuminated points are available on the object to reflect light back onto other parts of the object. Narrowing the illuminated area can also simplify exposure control, making it easier to obtain the optimal amount of light for a given reflectance and angle of incidence of the object under inspection. At block 1274, if all points have been collected, the procedure ends at block 1276; otherwise it continues.

  In embodiments where the mode following block 1268 is automated, at block 1278, the automation mechanism moves the scanner to the desired position. In some embodiments, the automated mechanism includes a sensor that provides information about the relative position of the scanner and the object to be inspected. For embodiments where the automation mechanism is a robot, an angular transducer in the robot joint provides information about the position and orientation of the robot end effector used to hold the scanner. For embodiments in which the object is moved by another type of automated mechanism, a linear encoder or other various sensors may provide information regarding the relative position of the object and the scanner.
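To illustrate how joint-angle readings translate into the pose of the end effector holding the scanner, the sketch below chains one rotation-plus-translation transform per link of a simple planar two-joint arm. A real robot has more joints and full 3D transforms; the link lengths and angles here are hypothetical.

```python
import numpy as np

def planar_end_effector_pose(joint_angles_rad, link_lengths_m):
    """(x, y, heading) of the end effector from joint angles, by chaining
    a 2D homogeneous transform (rotate, then advance along the link)."""
    T = np.eye(3)
    for theta, length in zip(joint_angles_rad, link_lengths_m):
        c, s = np.cos(theta), np.sin(theta)
        T = T @ np.array([[c, -s, length * c],
                          [s,  c, length * s],
                          [0,  0, 1.0]])
    return T[0, 2], T[1, 2], np.arctan2(T[1, 0], T[0, 0])

# Hypothetical two-link arm with 0.5 m and 0.4 m links
print(planar_end_effector_pose([np.pi / 4, -np.pi / 6], [0.5, 0.4]))
```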

  After the automated mechanism moves the scanner or object to a fixed position, block 1280 performs a three-dimensional measurement with a small FOV camera. Such measurements are repeated by block 1282 until all measurements are complete and the procedure ends at block 1284.

  In one embodiment, the projector 28 changes the structured light pattern when the scanner switches from data acquisition with the second camera 26 to the first camera 24. In another embodiment, the same structured light pattern is used with both cameras 24, 26. In yet another embodiment, when data is acquired with the first camera 24, the projector 28 emits a pattern formed by a swept line or swept point of light. After data is acquired with the first camera 24, the process continues scanning using the second camera 26. This process continues until the operator has scanned the desired area of the part.

  While the process of FIG. 2 is illustrated as a linear or sequential process, it should be appreciated that in other embodiments one or more of the illustrated steps may be performed in parallel. The method shown in FIG. 2 first measures the entire object and then performs more detailed measurements according to an evaluation of the acquired point cloud data. An alternative way of using the scanner 20 is to begin by measuring detailed or critical regions using the small-FOV camera 24.

  It should also be appreciated that it is common practice in existing scanning systems to change the camera lens or the projector lens as a way of changing the camera or projector FOV. However, such changes are time consuming and generally require an additional compensation step in which an artifact, such as a dot plate, is placed in front of the camera or projector to determine the aberration correction parameters of the camera or projector system. Thus, a scanning system that provides two cameras having different FOVs, such as the cameras 24 and 26 of FIG. 1, offers significant advantages in scanner measurement speed and in suitability for a fully automated mode.

  Another embodiment of the scanner 20, having a housing 22 that includes a first coordinate acquisition system 76 and a second coordinate acquisition system 78, is shown in FIG. 3. The first coordinate acquisition system 76 includes a first projector 80 and a first camera 82. As in the embodiment of FIG. 1, the projector 80 emits light 84 onto the surface 32 of the object 34. In the exemplary embodiment, the projector 80 uses a visible light source that illuminates a pattern generator. The visible light source may be a laser, a superluminescent diode, an incandescent light, a light emitting diode (LED), or another light emitting element. In one embodiment, the pattern generator is a chrome-on-glass slide with a structured light pattern etched into it. The slide may have a single pattern or multiple patterns that can be moved into and out of position as needed. The slide can be placed in the operating position manually or automatically. In other embodiments, the source pattern may be light reflected from or transmitted through a digital micromirror device (DMD) such as the digital light projector (DLP) from Texas Instruments Corporation, a liquid crystal device (LCD), a liquid crystal on silicon (LCOS) device, or a similar device used in either reflection or transmission mode. The projector 80 may further include a lens system 86 that alters the emitted light to have the desired focal characteristics.

  The first camera 82 includes a photosensitive array sensor 88 that generates a digital image/representation of the area 90 within the sensor's field of view. The sensor may be, for example, a charge-coupled device (CCD) type sensor or a complementary metal-oxide-semiconductor (CMOS) type sensor having an array of pixels. The first camera 82 may further include other components, such as but not limited to a lens 92 and other optical elements. The first projector 80 and the first camera 82 are arranged at a fixed angle such that the first camera 82 can detect the light 85 from the first projector 80 reflected off the surface 32 of the object 34. Since the first camera 82 and the first projector 80 are arranged in a fixed relationship, the principles of trigonometry described above can be used to determine the coordinates of points on the surface 32 within the area 90. Although FIG. 3 depicts the first camera 82 near the first projector 80 for clarity, it should be appreciated that the camera could instead be provided near the opposite side of the housing 22. Separating the first camera 82 and the first projector 80 further apart is expected to improve the accuracy of the three-dimensional measurement.

  The second coordinate acquisition system 78 includes a second projector 94 and a second camera 96. The projector 94 has a light source that may comprise a laser, a light emitting diode (LED), a superluminescent diode (SLED), a xenon bulb, or another suitable type of light source. In one embodiment, a lens 98 is used to focus the light received from the laser light source into a line of light 100, and may comprise one or more cylindrical lenses or lenses of various other shapes. The lens is also referred to herein as a “lens system” because it may include one or more individual lenses or a collection of lenses. The line of light is substantially straight; that is, the maximum deviation from a straight line is less than about 1% of its length. One type of lens that may be utilized according to one embodiment is a rod lens. Rod lenses are typically full cylinders of glass or plastic whose circumferential surfaces are polished and whose ends are ground. Such lenses convert collimated light passing through the diameter of the rod into a line of light. Another type of lens that may be used is a cylindrical lens, which is a lens having the shape of a partial cylinder. For example, one surface of a cylindrical lens may be flat while the opposing surface is cylindrical.

  In another embodiment, the projector 94 generates a two-dimensional pattern of light that covers the area of the surface 32. At this time, the resulting coordinate acquisition system 78 is called a structured light scanner.

  The second camera 96 includes a sensor 102, such as a charge-coupled device (CCD) type sensor or a complementary metal-oxide-semiconductor (CMOS) type sensor. The second camera 96 may further include other components, such as but not limited to a lens 104 and other optical elements. The second projector 94 and the second camera 96 are arranged at an angle such that the second camera 96 can detect the light 106 from the second projector 94 reflected off the object 34. It should be appreciated that, because the second projector 94 and the second camera 96 are arranged in a fixed relationship, the principles of trigonometry described above can be used to determine the coordinates of points on the surface 32 along the line formed by the light 100. It should also be appreciated that placing the camera 96 and the projector 94 on opposite sides of the housing 22 increases three-dimensional measurement accuracy.

  In another embodiment, the second coordinate acquisition system is configured to project a variety of patterns, which may include not only a fixed line of light but also a swept line of light, a swept point of light, a coded pattern of light (covering an area), or a sequential pattern of light (covering an area). Each type of projection pattern has different advantages, such as speed, accuracy, and immunity to multipath interference. By evaluating the performance requirements of each particular measurement and/or by examining the characteristics of the collected data or the anticipated object shape (from a CAD model or from a 3D reconstruction based on collected scan data), it is possible to select the type of projection pattern that optimizes performance.

  In another embodiment, the distance from the second coordinate acquisition system 78 to the object surface 32 is different from the distance from the first coordinate acquisition system 76 to the object surface 32. For example, the camera 96 may be positioned closer to the object 34 than the camera 82. In this way, the resolution and accuracy of the second coordinate acquisition system 78 can be improved relative to those of the first coordinate acquisition system 76. In many cases it is beneficial to quickly scan a relatively large and smooth object with the lower resolution system 76 and then scan the details, including edges and holes, with the higher resolution system 78.

  The scanner 20 may be used in either a manual mode or an automated mode. In manual mode, the operator is prompted to move the scanner closer to or further from the object surface according to the acquisition system being used. Furthermore, the scanner 20 may project a beam or pattern of light indicating the direction in which the scanner is to be moved. Alternatively, indicator lights on the device may indicate the direction in which the scanner should be moved. In automated mode, the scanner 20 or the object 34 may be automatically moved relative to one another according to the measurement requirements.

  As in the embodiment of FIG. 1, the first coordinate acquisition system 76 and the second coordinate acquisition system 78 are electrically coupled to a control device 50 mounted in the housing 22. The controller 50 can include one or more microprocessors, digital signal processors, memory, and signal conditioning circuits. The scanner 20 may further include an actuator (not shown) that can be manually activated by the operator to initiate operation and data acquisition by the scanner 20. In one embodiment, the image processing that determines the X, Y, Z coordinate data of the point cloud representing the surface 32 of the object 34 is performed by the controller 50. The coordinate data can be stored locally, for example in volatile or non-volatile memory 54. The memory can be removable, for example a flash drive or a memory card. In other embodiments, the scanner 20 includes a communication circuit 52 that allows the scanner 20 to transmit the coordinate data to a remote processing system 56. The communication medium 58 between the scanner 20 and the remote processing system 56 may be wired (such as Ethernet (registered trademark)) or wireless (such as Bluetooth or IEEE 802.11). In one embodiment, the coordinate data is determined by the remote processing system 56 based on acquired images transmitted by the scanner 20 over the communication medium 58.

  Referring now to FIG. 4, a method 1400 for operating the scanner 20 of FIG. 3 will be described. At block 1402, the first projector 80 of the first coordinate acquisition system 76 of the scanner 20 emits a structured light pattern onto the area 90 of the surface 32 of the object 34. The light 84 from the projector 80 is reflected from the surface 32, and the reflected light 85 is received by the first camera 82. As described above, variations in the surface profile of the surface 32 cause distortions in the imaged pattern of light received by the first photosensitive array 88. Because the pattern is formed by structured light, or by a line or point of light, in some instances the controller 50 or the remote processing system 56 can determine a one-to-one correspondence between points on the surface 32 and pixels of the photosensitive array 88. Thus, the principles of trigonometry described above can be used at block 1404 to obtain point cloud data, that is, to determine the X, Y, Z coordinates of points on the surface 32. By moving the scanner 20 relative to the surface 32, a point cloud can be created for the entire object 34.

  At block 1406, the controller 50 or remote processing system 56 determines whether the point cloud data has the desired data quality attributes or has a potential problem. The types of problems that can occur are described above with reference to FIG. 2, and this description will not be repeated here. If the controller determines at block 1406 that the point cloud has the desired data quality attribute, the procedure ends. Otherwise, a determination is made at block 1408 as to whether the scanner is used in manual or automatic mode. If the mode is manual, the operator is guided at block 1410 to move the scanner to the desired position.

  As described above with reference to FIG. 2, there are several methods for instructing a desired movement by an operator. The description will not be repeated here.

  A way of determining the approximate position of the scanner is needed in order to guide the operator in making the desired movement. As described with reference to FIG. 2, such methods may include mounting the scanner 20 on an articulated arm CMM, using inertial sensors within the scanner 20, illuminating photogrammetric dots, or matching features to a displayed image.

  After the operator has moved the scanner into position, a measurement is made at block 1412 with the second coordinate acquisition system 78. By using the second coordinate acquisition system, resolution or accuracy is improved or the problem is resolved. At block 1414, if all points have been collected, the procedure ends at block 1416; otherwise it continues.

  If the operating mode from block 1408 is automated, the automated mechanism moves the scanner to the desired position at block 1418. In most cases, the automated mechanism includes sensors that provide information about the relative position of the scanner and the object under inspection. In the case where the automated mechanism is a robot, angular transducers in the robot joints provide information about the position and orientation of the robotic end effector used to hold the scanner. For other types of automated mechanisms, linear encoders or various other sensors may provide information about the relative position of the object and the scanner.

  After the automatic mechanism moves the scanner or object to a fixed position, a three-dimensional measurement is made by the second coordinate acquisition system 78 at block 1420. Such measurements are repeated by block 1422 until all measurements are complete. The procedure ends at block 1424.

  Although the process of FIG. 4 is illustrated as a linear or sequential process, it should be appreciated that in other embodiments one or more of the illustrated steps may be performed in parallel. The method shown in FIG. 4 first measures the entire object and then performs more detailed measurements according to an evaluation of the acquired point cloud data. An alternative way of using the scanner 20 is to begin by measuring detailed or critical regions using the second coordinate acquisition system 78.

  It should also be appreciated that it is common practice in existing scanning systems to change the camera lens or the projector lens as a way of changing the camera or projector FOV. However, such changes are time consuming and typically require an additional compensation step in which an artifact, such as a dot plate, is placed in front of the camera or projector to determine the aberration correction parameters of the camera or projector system. Thus, a system that provides two different coordinate acquisition systems, such as the scanning system 20 of FIG. 3, offers significant advantages in scanner measurement speed and in suitability for a fully automated mode.

  Errors may occur when performing scanner measurements as a result of multipath interference. The source of multipath interference will now be discussed and a first method for avoiding or reducing multipath interference will be described.

  Multipath interference occurs when a portion of the light striking the object surface is first scattered off another part of the object surface before returning to the camera. For a point on the object that receives this scattered light, the light sent to the photosensitive array corresponds not only to the light projected directly from the projector onto that point, but also to light projected onto a different point of the object and scattered onto the point being measured. Especially for scanners that project two-dimensional (structured) light, the result of multipath interference may be an inaccurate calculated distance from the projector to the object surface at that point.

  An example of multipath interference is illustrated with reference to FIG. 5A. In this embodiment, the scanner 4570 projects a line of light 4525 onto the object surface 4510A. The line of light 4525 is perpendicular to the plane of the paper. In one embodiment, the rows of the photosensitive array are parallel to the plane of the paper and the columns are perpendicular to it. Each row represents one point on the projected line 4525 in the direction perpendicular to the plane of the paper. The distance from the projector to the object for that point on the line is found by first calculating the centroid of each row. For the surface point 4526, the centroid on the photosensitive array 4541 is represented by the point 4546. The position of the centroid 4546 on the photosensitive array can be used to calculate the distance from the camera projection center 4544 to the object point 4526. This calculation is based on the trigonometric principles of triangulation. To perform these calculations, the baseline distance D from the camera projection center 4544 to the projector projection center 4523 is required, as well as knowledge of the relative orientation of the projector system 4520 to the camera system 4540.

  To understand the error caused by multipath interference, consider the point 4527. Light reflected or scattered from this point is imaged by the lens 4542 at the point 4548 on the photosensitive array 4541. However, in addition to the light received directly from the projector and scattered at the point 4527, additional light is reflected from the point 4526 onto the point 4527 before being imaged onto the photosensitive array. This light will most likely be scattered to an unexpected position, forming a second centroid in the given row. Consequently, the observation of two centroids in a given row is a good indicator of the presence of multipath interference.
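A minimal sketch of the row analysis just described, with hypothetical intensity values and thresholds: each row's intensity centroid locates the imaged line, and a row whose above-threshold pixels fall into two well-separated groups is flagged as a candidate for multipath interference.

```python
import numpy as np

def analyze_row(row, threshold, min_gap_px=5):
    """Return (centroid, multipath_flag) for one row of the array."""
    cols = np.arange(row.size)
    bright = cols[row > threshold]
    if bright.size == 0:
        return None, False
    # a jump of more than min_gap_px dark pixels starts a second region
    regions = 1 + int(np.sum(np.diff(bright) > min_gap_px))
    centroid = float(np.sum(cols * row) / np.sum(row))
    return centroid, regions > 1

# Hypothetical row: direct return near column 40, secondary near column 120
row = np.zeros(200)
row[38:43] = [0.2, 0.7, 1.0, 0.7, 0.2]    # light direct from the projector
row[118:123] = [0.1, 0.3, 0.5, 0.3, 0.1]  # light from a secondary reflection
print(analyze_row(row, threshold=0.05))   # centroid pulled off 40, flag True
```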

  For structured light projected over an area of the object surface, a secondary reflection from a point such as 4527 is usually not as obvious as it is with light projected along a line, and is therefore more likely to cause errors in the measured 3D surface coordinates.

  By using a projector having an adjustable illumination pattern for the display element 4521, it is possible to vary the pattern of the projected light. The display element 4521 may be a digital micromirror device (DMD), such as a digital light projector (DLP). Such devices contain a large number of small mirrors that can be rapidly adjusted by means of electrical signals to quickly change the illumination pattern. Other devices that can produce an electrically adjustable display pattern include LCD (liquid crystal display) and LCOS (liquid crystal on silicon) displays.

  One way to check for multipath interference in a system that projects structured light over an area is to change the display so that it projects a line of light. The presence of multiple centroids in a row then indicates that multipath interference exists. By sweeping the line of light, an area can be covered without requiring the operator to move the probe.

  With an electrically adjustable display, the linear light can be set at a desired angle. By changing the direction of the projected linear light, multipath interference can be eliminated in many cases.

  For many folds and steeply angled surfaces where reflections are difficult to avoid, the electrically adjustable display can be used to sweep a point of light. In some cases a single point of light can still produce secondary reflections, but it is usually relatively easy to determine which of the reflected spots of light is valid.

  An electrically controlled display can also be used to switch quickly between coded and uncoded patterns. In most cases, a coded pattern is used to make a 3D measurement based on the information in a single camera frame. On the other hand, multiple patterns (sequential or uncoded patterns) can be used to obtain greater accuracy in the measured 3D coordinate values.

  Electrically controlled displays have previously been used to project the successive patterns of a sequential pattern series, for example a series of gray-scale line patterns followed by a series of sinusoidal patterns, each having a different phase.

  The present method offers advantages over previous methods in identifying or eliminating problems such as multipath interference, and in indicating whether a single-shot pattern (for example, a coded pattern) or a multiple-shot pattern will obtain the required accuracy as quickly as possible.

  In the case of a line scanner, there is often a way to determine the presence of multipath interference. In the absence of multipath interference, the light reflected by a point on the object surface is imaged in a single row onto a region of contiguous pixels. If two or more regions in that row receive a significant amount of light, multipath interference is indicated. An example of such a multipath interference condition, and the resulting extra region of illumination on the photosensitive array, is shown in FIG. 5A. Here the surface 4510A has a large curvature near the intersection point 4526. The surface normal at the point of intersection is the line 4528, and the angle of incidence is the angle 4531. The direction of the reflected line of light 4529 is found from the angle of reflection 4532, which equals the angle of incidence. As stated above, the line of light 4529 actually represents the central direction of light scattered over a range of angles. The center of the scattered light strikes the surface 4510A at the point 4527, which is imaged by the lens 4542 at the point 4548 on the photosensitive array. The unexpectedly high level of light received in the vicinity of the point 4548 indicates that multipath interference is probably present. For a line scanner, the main concern with multipath interference is not the case shown in FIG. 5A, where the two spots 4546 and 4548 are separated by a considerable distance and can be analyzed separately, but rather the case in which the two spots overlap or smear together. In that case, it is not possible to determine the centroid corresponding to the desired point, which in the example of FIG. 5A corresponds to the point 4546. The problem is worse for a scanner that projects light over a two-dimensional area, as can also be seen with reference to FIG. 5A. If all of the light imaged onto the photosensitive array 4541 is needed to determine three-dimensional coordinates, then it is clear that the light at the point 4527 corresponds not only to the desired pattern of light projected directly from the projector but also to unwanted light reflected from the object surface onto the point 4527. As a result, in the case of light projected over an area, incorrect three-dimensional coordinates may be calculated at the point 4527.

  For the projected line of light, it is often possible to eliminate multipath interference by changing the direction of the line. One possibility is to construct a line scanner using a projector having inherent two-dimensional capability, so that the line can be swept or automatically rotated to different directions. An example of such a projector uses the digital micromirror device (DMD) discussed above. For example, if multipath interference is suspected in a particular scan acquired with structured light, the measurement system can be configured to switch automatically to a measurement method that uses a swept line of light.

  Another way to reduce, minimize, or eliminate multipath interference is to sweep a point of light, rather than a line of light or an area of light, over the region for which multipath interference has been indicated. By illuminating a single point of light, light scattered from secondary reflections can usually be identified easily.

  The determination of the desired pattern projected by the electrically adjustable display is made using diagnostic analysis, as described below with reference to FIG.

  In addition to its use in diagnosing and correcting multipath interference, changing the pattern of the projected light also provides the advantage of obtaining the required accuracy and resolution in a minimum amount of time. In one embodiment, a measurement is first performed by projecting a coded pattern of light onto the object in a single shot. The three-dimensional coordinates of the surface are determined using the collected data, and the results are analyzed to determine whether any regions have holes, edges, or features that require more detailed analysis. Such detailed analysis can be performed, for example, using the small-FOV camera 24 of FIG. 1 or the high-resolution scanner system 78 of FIG. 3.

  The coordinates are analyzed to determine the approximate distance to the target, thereby providing a starting distance for more accurate measurement methods, such as the method of sequentially projecting sinusoidal phase-shifted patterns of light onto the surface, as described below. Using the coded pattern of light to obtain a starting distance for each point on the surface eliminates the need to obtain this information by varying the pitch over multiple sinusoidal phase-shift scans, thereby saving considerable time.
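
  For reference, the sinusoidal phase-shift method mentioned here is commonly implemented with the standard four-step formula, and the coded-pattern distance then resolves the two-pi fringe ambiguity. The sketch below shows that combination under a simplifying assumption, stated in the comments, that distance maps linearly onto fringe phase; the function names are illustrative.

    import numpy as np

    def wrapped_phase(i1, i2, i3, i4):
        """Standard four-step phase-shift formula for images captured with
        the sinusoidal pattern shifted by 0, 90, 180 and 270 degrees."""
        return np.arctan2(i4 - i2, i1 - i3)

    def absolute_phase(phi, start_distance, pitch_mm):
        """Resolve the 2-pi ambiguity of the wrapped phase phi using the
        coarse starting distance from the single-shot coded pattern.
        Assumes (illustratively) that distance maps linearly onto phase."""
        coarse = 2 * np.pi * start_distance / pitch_mm
        k = np.round((coarse - phi) / (2 * np.pi))   # integer fringe order
        return phi + 2 * np.pi * k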

  Referring now to FIG. 5B, an embodiment of a process 211 for overcoming anomalies or improving the accuracy of coordinate data acquired by the scanner 20 is shown. Process 211 begins at block 212 by scanning an object, such as the object 34, with the scanner 20. The scanner 20 can be, for example, a scanner as described in the embodiments of FIGS. 1, 3, 5 and 7, having at least one projector and one camera. In this embodiment, at block 212 the scanner 20 projects a first pattern of light onto the object. In one embodiment, the first pattern is a coded, structured light pattern. Process 211 acquires and determines three-dimensional coordinate data at block 214. The coordinate data is analyzed at query block 216 to determine whether any anomalies are present, such as the multipath interference described above, low resolution around a feature, or missing data due to changes in surface angle or surface reflectance. If an anomaly is detected, process 211 proceeds to block 218, where the light pattern emitted by the projector is changed to a second pattern. In one embodiment, the second pattern is a swept line of light.

  After the second light pattern is projected, process 211 proceeds to block 220, where three-dimensional coordinate data is acquired and determined for the area in which the anomaly was detected. Process 211 loops back to query block 216, where it is determined whether the anomaly has been resolved. If query block 216 still detects an anomaly, or if problems with accuracy or resolution remain, the process loops back to block 218 and switches to a third light pattern. In one embodiment, the third light pattern is a sequential sinusoidal phase-shift pattern. In another embodiment, the third light pattern is a swept point of light. This iterative procedure continues until the anomaly has been resolved. Once coordinate data for the anomalous area has been determined, process 211 proceeds to block 222, where the emitted pattern switches back to the first structured light pattern and the scanning process continues. Process 211 continues until the operator has scanned the desired area of the object. In the event that the information obtained by scanning using the method of FIG. 11 is insufficient, measurement with the tactile probe described herein may be used.
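
  The control flow of process 211 can be summarized compactly. The sketch below is illustrative only: the scanner object and its acquire, anomaly_in and anomaly_region methods are hypothetical stand-ins for the acquisition and diagnostic routines of blocks 212-222.

    # Escalating sequence of patterns used by process 211 (blocks 212-222).
    PATTERNS = ["coded_structured", "swept_line", "sequential_phase_shift"]

    def process_211(scanner):
        coords = scanner.acquire(PATTERNS[0])                  # blocks 212/214
        level = 0
        while scanner.anomaly_in(coords) and level + 1 < len(PATTERNS):
            level += 1                                         # block 218
            coords = scanner.acquire(PATTERNS[level],          # block 220
                                     region=scanner.anomaly_region(coords))
        return scanner.acquire(PATTERNS[0])                    # block 222: resume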

  Referring now to FIG. 6, another embodiment of the scanner 20 is shown attached to a movable device 120. The scanner 20 has at least one projector 122 and at least one camera 124 arranged in a fixed geometric relationship, so that trigonometric principles can be used to determine the three-dimensional coordinates of points on the surface 32. The scanner 20 may be, for example, the same scanner as described with reference to FIG. 1 or FIG. 3. In one embodiment, the scanner is the scanner of FIG. 10, which has a tactile probe. However, the scanner used in the embodiment of FIG. 6 may be any structured light or line scanner, for example the portable coordinate measuring machine with integrated line laser scanner disclosed in commonly owned US Pat. No. 7,246,030, dated 18 January 2006. In another embodiment, the scanner used in the embodiment of FIG. 6 is a structured light scanner that projects light over an area of the object.
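
  The trigonometric principle referred to here reduces, for a calibrated and rectified projector-camera pair, to the familiar baseline-disparity relation; the sketch below uses generic symbols that are not taken from the patent.

    def depth_from_triangulation(baseline_mm, focal_px, disparity_px):
        """Classic triangulation, Z = f * B / d, for a rectified
        projector-camera pair with baseline B, focal length f (in pixels)
        and disparity d between projected and observed positions."""
        return focal_px * baseline_mm / disparity_px

    # Example: 150 mm baseline, 2000 px focal length, 60 px disparity.
    print(depth_from_triangulation(150.0, 2000.0, 60.0))   # -> 5000.0 mm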

  In the exemplary embodiment, the movable device 120 is a robotic device that provides automated movement by means of arm segments 126, 128 connected by pivot and swivel joints 130; moving the arm segments 126, 128 moves the scanner 20 from a first position to a second position (as indicated by the dotted line in FIG. 6). The movable device 120 may include actuators, such as motors (not shown), coupled to the arm segments 126, 128 to move them from the first position to the second position. It should be appreciated that a movable device 120 with articulated arms is shown for illustrative purposes only, and the claimed invention should not be so limited. In other embodiments, the scanner 20 may be attached to a movable device that moves it via, for example, rails, wheels, tracks, belts, cables, or a combination thereof. In other embodiments, the robot has a different number of arm segments.

  In one embodiment, the movable device is an articulated arm coordinate measuring machine (AACMM), as described in commonly owned US patent application Ser. No. 13/491,176, filed Jan. 20, 2010. In this embodiment, movement of the scanner 20 from the first position to the second position involves the operator moving the arm segments 126, 128 by hand.

  For embodiments having an automated device, the movable device 120 further includes a controller 132 configured to energize the actuators to move the arm segments 126, 128. In one embodiment, the controller 132 communicates with a controller 134. As will be described in detail below, this arrangement allows the scanner 20 to be moved by the controller 132 in response to an anomaly in the acquired data. It should be appreciated that the controllers 132, 134 may be integrated into a single processing unit, or their functions may be distributed among several processing units.

  By performing the analysis described with reference to FIG. 12, it is possible to determine a position and orientation of the scanner 20 that will yield the desired measurement results. In some embodiments, the feature being measured may call for a particular scanner orientation. For example, measurement of a hole diameter can be improved by orienting the scanner camera 124 approximately perpendicular to the hole. In other embodiments, the scanner can be positioned so as to reduce or minimize the possibility of multipath interference. Such analysis may be based on a CAD model available as part of the diagnostic procedure, or on data collected by the scanner in its initial position prior to a secondary movement of the scanner 20 by the device 120.

  The operation of the scanner 20 and the movable device 120 will now be described with reference to FIG. The process begins at block 134, where the object 34 is scanned with the scanner 20 in the first position. At block 138, the scanner 20 acquires and determines coordinate data for points on the surface 32 of the object 34. The movable device 120 may move the scanner 20 to acquire data on surface points over the desired area. At query block 140, it is determined whether there is an anomaly in the coordinate data at point 142, such as multipath interference, or whether a change of direction is needed to obtain improved resolution or measurement accuracy. Point 142 in FIG. 6 may represent a single point, a line of points, or an area of the surface 32. If an anomaly, or a need for improved accuracy, is detected, the process continues to block 144, where the movable device 120 moves the scanner 20, such as from the first position to the second position, and at block 146 the area of interest is rescanned to acquire three-dimensional coordinate data. The process loops back to query block 140, where it is determined whether an anomaly still exists in the coordinate data or whether an improvement in measurement accuracy is still desired. If so, the scanner 20 is moved again, and the process continues until the measurement results reach the desired level. Once the coordinate data has been obtained, the process proceeds from query block 140 to block 148, where the scanning process continues until the desired area has been scanned.

  In embodiments where the scanner 20 includes a tactile probe (FIG. 10), the movement of the scanner from the first position to the second position may be arranged so that the tactile probe is brought into contact with the area of interest. Since the position of the scanner, and hence of the tactile probe, can be determined from the position and orientation of the arm segments 126, 128, the three-dimensional coordinates of the point on the surface 32 can be determined.

  In some embodiments, the measurement results obtained by the scanner 20 of FIGS. 8A and 8B may be corrupted by multipath interference. In other cases, the measurement results may not provide the resolution or accuracy desired to correctly measure some characteristics of the surface 32, particularly edges, holes, and complex features. In these cases, it may be desirable for the operator to interrogate points or areas of the surface 32 with a remote probe 152. In one embodiment shown in FIGS. 8A and 8B, the scanner 20 includes a projector 156 and cameras 154, 155 arranged at an angle relative to the projector 156, such that light emitted by the projector 156 is reflected off the surface 32 and received by one or both of the cameras 154, 155. The projector 156 and the cameras 154, 155 are arranged in a fixed geometric relationship, so that trigonometric principles can be used to determine the three-dimensional coordinates of points on the surface 32.

  In one embodiment, the projector 156 is configured to emit visible light 157 onto an area of interest 159 on the surface 32 of the object 34, as shown in FIG. 8A. The three-dimensional coordinates of the illuminated area of interest 159 can be confirmed using an image of the illuminated area 159 acquired by one or both of the cameras 154, 155.

  The scanner 20 is configured to cooperate with the remote probe 152 so that the operator can bring a probe tip 166 into contact with the object surface 32 within the illuminated area of interest 159. In one embodiment, the remote probe 152 includes at least three non-collinear points of light 168. The points of light 168 may be spots of light produced by light emitting diodes (LEDs), or reflective spots of light illuminated by an infrared or visible light source from the projector 156 or from another light source not depicted in FIG. 8B. The infrared or visible light source in this case may be mounted on the scanner 20 or provided external to it. By determining the three-dimensional coordinates of the points of light 168 with the scanner, and by using information about the geometry of the probe 152, the position of the probe tip 166, and hence the coordinates of the object surface 32, can be determined. A tactile probe used in this way eliminates potential problems due to multipath interference and also allows relatively accurate measurement of holes, edges, and detailed features. In one embodiment, the remote probe 152 may be a tactile probe activated by pressing a probe actuator button (not shown), or it may be a touch trigger probe activated by contact with the surface 32. A communication circuit (not shown) transmits a signal to the scanner 20 in response to the signal produced by the actuator button or the touch trigger probe. In one embodiment, the points of light 168 are replaced by a geometric pattern of light, which may include straight lines or curves.

  Referring now to FIG. 9, a process for obtaining coordinate data for points on the surface 32 of the object 34 using the stationary scanner 20 of FIGS. 8A and 8B together with the remote probe 152 is shown. The process begins at block 170 by scanning the surface 32 of the object 34. The process acquires and determines the three-dimensional coordinate data of the surface 32 at block 172. The process then determines, at query block 174, whether there is an anomaly in the coordinate data of an area 159, or whether there is a problem with the accuracy or resolution of the area 159. An anomaly may be invalid data discarded due to, for example, multipath interference. An anomaly may also be missing data due to low surface reflectance or insufficient resolution around features such as openings and holes. Diagnostic procedures for detecting (identifying) multipath interference and related problems are described with reference to FIG. 12.

  Once the area 159 has been identified, the scanner 20 indicates to the operator, at block 176, that coordinate data for the area 159 may be obtained via the remote probe 152. The area 159 may be indicated by emitting visible light 157 to illuminate it. In one embodiment, the light 157 is emitted by the projector 156. The color of the light 157 may be changed to inform the operator of the type of anomaly or problem. For example, where multipath interference occurs the light 157 may be red, while for low resolution it may be green. The area may also be indicated on a display showing a graphical representation (such as a CAD model) of the object.

  The process then proceeds to block 178, where an image of the remote probe 152 is acquired as the probe tip 166 contacts the surface 32. The points of light 168, which may be LEDs or reflective targets, are received by one or both of the cameras 154, 155, for example. Using well-known best-fit techniques, the scanner 20 determines, at block 180, the three-dimensional coordinates of the probe center, from which the three-dimensional coordinates of the object surface 32 are determined. Once the points in the area 159 in which the anomaly was detected have been acquired, the process proceeds to block 182, where scanning of the object 34 continues until the desired area has been scanned.
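
  The best-fit step is commonly carried out as a rigid-body (SVD, or Kabsch) fit between the probe's known light-point geometry and the measured three-dimensional positions of those points; the probe tip then follows from its known offset in the probe frame. A minimal sketch, with illustrative coordinates:

    import numpy as np

    def rigid_fit(model_pts, measured_pts):
        """Best-fit rotation R and translation t (Kabsch/SVD) mapping the
        probe's model points onto the measured points of light."""
        cm, cd = model_pts.mean(0), measured_pts.mean(0)
        H = (model_pts - cm).T @ (measured_pts - cd)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T
        return R, cd - R @ cm

    # Known probe geometry: three non-collinear lights plus the tip offset.
    model = np.array([[0.0, 0, 0], [60, 0, 0], [0, 40, 0]])
    tip_in_probe_frame = np.array([30.0, 20, -120])
    # Measured light positions (here a pure translation, for illustration).
    measured = model + np.array([500.0, 200, 1500])
    R, t = rigid_fit(model, measured)
    print(R @ tip_in_probe_frame + t)   # coordinates of the contacted point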

  Referring now to FIG. 10, another embodiment of a scanner 20 that is gripped by an operator during operation is shown. In this embodiment, the housing 22 can include a handle 186 that allows the operator to hold the scanner 20 during operation. The housing 22 includes a projector 188 and a camera 190 that are arranged at an angle relative to each other such that light 192 emitted by the projector is reflected by the surface 32 and received by the camera 190. The scanner 20 of FIG. 10 operates in a manner substantially similar to the embodiment of FIGS. 1 and 3 and obtains three-dimensional coordinate data of points on the surface 32 using trigonometric principles.

  The scanner 20 further includes an integrated probe member 184. The probe member 184 includes a sensor 194 at one end. The sensor 194 can be, for example, a tactile probe that responds to the operator pressing an actuator button (not shown), or a touch trigger probe that reacts to contact with the surface 32. As will be described in more detail later, the probe member 184 allows the operator to obtain the coordinates of a point on the surface 32 by bringing the sensor 194 into contact with the surface 32.

  The projector 188, the camera 190, and the actuator circuit for the sensor 194 are electrically coupled to a controller 50 mounted in the housing 22. The controller 50 may include one or more microprocessors, digital signal processors, memory, and signal conditioning circuits. The scanner 20 may further include an actuator (not shown), for example on the handle 186, that can be manually activated by the operator to initiate operation and data capture by the scanner 20. In one embodiment, the image processing that determines the X, Y, Z coordinate data of the point cloud representing the surface 32 of the object 34 is performed by the controller 50. The coordinate data may be stored locally, for example in volatile or non-volatile memory 54. The memory may be removable, such as a flash drive or a memory card. In other embodiments, the scanner 20 has a communication circuit 52 that allows it to transmit the coordinate data to a remote processing system 56. The communication medium 58 between the scanner 20 and the remote processing system 56 may be wired (such as Ethernet) or wireless (such as Bluetooth or IEEE 802.11). In one embodiment, the coordinate data is determined by the remote processing system 56, and the scanner 20 transmits the acquired images over the communication medium 58.

  The operation of the scanner 20 of FIG. 10 will now be described with reference to FIG. The process begins at block 196, where the operator scans the surface 32 of the object 34 by moving the scanner 20 by hand. At block 198, three-dimensional coordinates are acquired and determined. At query block 200, it is determined whether there is an anomaly in the coordinate data or whether an improvement in accuracy is required. As noted above, anomalies can arise for several reasons, such as multipath interference, changes in surface reflectance, or low resolution of a feature. If an anomaly exists, the process proceeds to block 202, where an area 204 is indicated to the operator. The area 204 may be indicated by projecting visible light 192 onto the surface 32 with the projector 188. In one embodiment, the light 192 is colored to notify the operator of the type of anomaly detected.

  The operator then moves the scanner from a first position to a second position (indicated by the dotted line) at block 206. In the second position, the sensor 194 contacts the surface 32. The position and orientation (six degrees of freedom) of the scanner 20 in the second position can be determined by known best-fit methods based on images acquired by the camera 190. Since the dimensions and position of the sensor 194 relative to the mechanical structure of the scanner 20 are well known, the three-dimensional coordinate data of points in the area 204 can be determined at block 208. The process then proceeds to block 210, where scanning of the object continues. The scanning process continues until the desired area has been scanned.
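
  In code terms, once the best fit has produced the scanner pose, block 208 reduces to a single rigid transform of the known sensor-tip offset; the numbers below are illustrative.

    import numpy as np

    R = np.eye(3)                            # scanner orientation from best fit
    t = np.array([250.0, -40.0, 900.0])      # scanner position from best fit
    tip_in_scanner_frame = np.array([0.0, 85.0, -30.0])   # known sensor geometry
    point_on_surface = R @ tip_in_scanner_frame + t       # block 208 result
    print(point_on_surface)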

  A general approach can be used to evaluate not only multipath interference but also overall measurement quality, including the effects of resolution, material type, surface properties, and geometry. Referring also to FIG. 12, in one embodiment a method 4600 may be carried out automatically under computer control. Step 4602 is to determine whether information about the three-dimensional coordinates of the object under test is available. The first type of three-dimensional information is CAD data, which usually gives the nominal dimensions of the object under test. The second type is measured three-dimensional data, such as data previously measured with a scanner or other device. In some instances, step 4602 may include a further step of aligning the frame of reference of a coordinate measurement device, such as a laser tracker or 6DOF scanner accessory, with the frame of reference of the object. In one embodiment, this is done by measuring at least three points on the surface of the object with the laser tracker.

  If the answer to the question posed in step 4602 is that three-dimensional information is available, then in step 4604 a computer or processor is used to calculate the susceptibility of the object measurement to multipath interference. In one embodiment, this is done by projecting each ray of light emitted by the scanner projector and calculating the angle of reflection, or the reflectance, for each case. The computer or software identifies each region of the object surface that is susceptible to error as a result of multipath interference. Step 4604 may also carry out an analysis of the susceptibility to multipath errors for a variety of positions of the 6DOF probe relative to the object under test. In some cases, multipath interference can be avoided or minimized by selecting a suitable position and orientation of the 6DOF probe relative to the object under test, as described above. If the answer to the question posed in step 4602 is that three-dimensional information is not available, then step 4606 is to measure the three-dimensional coordinates of the object surface using a desired or preferred measurement method. Following the calculation of multipath interference, step 4608 may be carried out to evaluate other aspects of expected scan quality. One such quality factor is whether the scan resolution is sufficient for the features of the object under test. For example, if the resolution of the device is 3 mm and there are sub-millimeter features for which valid scan data is desired, these problem regions of the object should be noted for later corrective action. Another quality factor, partly related to resolution, is the ability to measure edges of the object and edges of holes. Knowledge of scanner performance allows a determination of whether the scanner resolution is good enough for a given edge. Another quality factor is the amount of light expected to return from a given feature. Little light is expected to return to the scanner from the inside of a small hole, for example, or from a glancing angle. Little light is also expected from certain kinds and colors of material. Some materials have a large penetration depth for the light from the scanner, and good measurement results are not expected in that case. In some cases, an automated program may ask for additional user information. For example, if the computer program is carrying out steps 4604 and 4608 based on CAD data, the types of material and the surface characteristics of the object under test may not be known. In such cases, step 4608 may include a further step of obtaining material characteristics for the object under test.
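
  A simplified version of the step 4604 calculation is sketched below: each projector ray is reflected about the local surface normal, and a surface sample is flagged if its reflected ray passes close to another sample, meaning that sample would receive secondary illumination. The point-sampled surface, the threshold, and all names are illustrative; a real implementation would trace rays against a CAD or mesh model.

    import numpy as np

    def reflect(d, n):
        """Reflect incident direction d about unit surface normal n."""
        return d - 2 * np.dot(d, n) * n

    def multipath_suspects(points, normals, proj_origin, near_mm=2.0):
        """Flag surface samples whose reflected projector ray passes close
        to another sample (a multipath risk)."""
        suspects = []
        for i, (p, n) in enumerate(zip(points, normals)):
            d = p - proj_origin
            d /= np.linalg.norm(d)
            r = reflect(d, n / np.linalg.norm(n))
            v = points - p
            s = v @ r                                   # distance along the ray
            miss = np.linalg.norm(v - np.outer(s, r), axis=1)
            hit = (s > near_mm) & (miss < near_mm)
            if np.any(np.delete(hit, i)):
                suspects.append(i)
        return suspects

    # Demo: light striking a side wall (sample 0) reflects onto sample 1,
    # so sample 0 is flagged as a multipath risk.
    pts = np.array([[0.0, 0, 100], [25.0, 0, 150]])
    nrm = np.array([[1.0, 0, 0], [0.0, 0, -1]])
    print(multipath_suspects(pts, nrm, np.array([50.0, 0, 0])))   # -> [0]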

  Following the analyses of steps 4604 and 4608, step 4610 is to decide whether further diagnostic procedures should be carried out. A first example of a possible diagnostic procedure is step 4612 of projecting a stripe of light at a preferred angle to determine whether multipath interference is observed. The general indications of multipath interference for a projected straight stripe of light were discussed above with reference to FIG. 5A. Another example of a diagnostic step is step 4614 of projecting a collection of lines aligned with the directions of epipolar lines in the source pattern of light, for example the source pattern of light 30 from the projector 36 of FIG. For the case in which the lines of light in the source pattern are aligned with epipolar lines, these lines appear as straight lines in the image plane on the photosensitive array. The use of epipolar lines is described in further detail in commonly owned US patent application Ser. No. 13/443,946, filed Apr. 11, 2012. If the patterns on the photosensitive array are not straight lines, or if the lines are smeared or noisy, a problem is indicated, probably as a result of multipath interference.
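
  The straightness test of step 4614 can be expressed as fitting a line to each imaged stripe and measuring the residual deviation; anything well above the expected centroiding noise suggests multipath interference. A sketch with illustrative data:

    import numpy as np

    def stripe_straightness_rms(xs, ys):
        """RMS deviation of imaged stripe centroids from their best-fit
        line. A stripe aligned with an epipolar line should image as a
        straight line, so a large RMS indicates a problem."""
        A = np.column_stack([xs, np.ones_like(xs)])
        coeffs, *_ = np.linalg.lstsq(A, ys, rcond=None)
        return np.sqrt(np.mean((ys - A @ coeffs) ** 2))

    xs = np.arange(0, 100, dtype=float)
    ys_clean = 0.3 * xs + 5.0                                   # straight stripe
    ys_bent = ys_clean + np.where(xs > 60, 0.5 * (xs - 60), 0)  # smeared tail
    print(stripe_straightness_rms(xs, ys_clean))   # ~0: consistent with epipolar line
    print(stripe_straightness_rms(xs, ys_bent))    # large: possible multipath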

  Step 4616 is to select a combination of actions based on the analyses and diagnostic procedures that have been carried out. If speed of measurement is particularly important, step 4618 of measuring with a coded two-dimensional (structured) pattern of light is preferred. If greater accuracy is more important, step 4620 of measuring with a sequential pattern, for example a sequence of sinusoidal phase-shifted patterns of varying phase and pitch, is preferred. If method 4618 or 4620 is selected, it may also be desirable to select step 4628 of repositioning the scanner, in other words adjusting the position and orientation of the scanner to the position that minimizes the multipath interference and specular reflections (glints) found in the analysis of step 4604. Such an indication can be provided to the user by illuminating the problem regions with light from the scanner projector or by displaying those regions on a monitor. Alternatively, the next steps of the measurement procedure may be selected automatically by a computer or processor. If the preferred scanner position does not eliminate multipath interference and glints, several options are available. In some cases, the measurement can be repeated with the scanner repositioned and the valid measurement results combined. In other cases, alternative measurement steps can be added to the procedure, or performed instead of the structured light measurement. As described above, step 4622 of scanning a stripe of light provides a convenient way of obtaining information over an area with reduced risk of problems from multipath interference. Step 4624 of sweeping a small spot of light over a region of interest reduces that risk still further. A step of measuring a region of the object surface with a tactile probe eliminates the possibility of multipath interference. A tactile probe provides a known resolution, based on the size of the probe tip, and eliminates the problems of low reflectance or large optical penetration depth that may be found in the material of the object under test.
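
  The selection logic of steps 4616-4628 amounts to a small priority table. The sketch below encodes one plausible reading of it; the condition flags are hypothetical, not terms from the patent.

    def select_actions(speed_critical, multipath_risk, reposition_helps):
        """Map diagnosed conditions onto the measurement actions of
        steps 4618-4628 (one illustrative policy)."""
        actions = []
        if multipath_risk and reposition_helps:
            actions.append("4628: reposition scanner to minimize multipath/glints")
        if speed_critical:
            actions.append("4618: coded 2D structured-light measurement")
        else:
            actions.append("4620: sequential sinusoidal phase-shift measurement")
        if multipath_risk and not reposition_helps:
            actions += ["4622: scan a stripe of light over the problem region",
                        "4624: sweep a small spot of light over the region",
                        "measure remaining problem areas with a tactile probe"]
        return actions

    print(select_actions(speed_critical=False, multipath_risk=True,
                         reposition_helps=False))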

  In most cases, the quality of the data collected by a combination of the steps 4618-4628 is evaluated in step 4630, based on the data obtained from the measurements combined with the results of the analyses carried out previously. If the quality is found to be acceptable in step 4632, the measurement is completed at step 4634. Otherwise, the analysis resumes at step 4604. In some cases, the three-dimensional information may not have been as accurate as desired; in that case, repeating some of the earlier steps is beneficial.

  Although the invention has been described in detail in connection with only a limited number of embodiments, it should be readily understood that the invention is not limited to those disclosed embodiments. Rather, the invention can be modified to incorporate any number of variations, alterations, substitutions or equivalent arrangements not heretofore described but commensurate with the spirit and scope of the invention. Additionally, while various embodiments of the invention have been described, it is to be understood that aspects of the invention may include only some of the described embodiments. Accordingly, the invention is not to be seen as limited by the foregoing description, but is only limited by the scope of the appended claims.

Claims (8)

  1. A non-contact optical three-dimensional measuring device, comprising:
    an assembly including a first projector, a first camera, a second projector, and a second camera, wherein the first projector, the first camera, the second projector, and the second camera are fixed with respect to one another; the first projector includes a first light source and is configured to emit first light having at least one pattern onto a surface of an object; the first camera includes a first lens and a first photosensitive array and is configured to receive a first portion of the first light reflected from the surface and to generate a corresponding first signal, the first camera having a first field of view, the first field of view being a first angular viewing region of the first camera; the second projector includes a second light source and is configured to emit second light onto the surface of the object; and the second camera includes a second lens and a second photosensitive array and is configured to receive a second portion of the second light reflected from the surface and to generate a corresponding second signal, the second camera having a second field of view, the second field of view being a second angular viewing region of the second camera, the second field of view differing from the first field of view; and
    a processor electrically coupled to the first projector, the second projector, the first camera, and the second camera, the processor executing computer-executable program code for performing operations including collecting the first signal at a first time and the second signal at a second time different from the first time, determining three-dimensional coordinates of a first point on the surface based at least in part on the first signal, and determining three-dimensional coordinates of a second point on the surface based at least in part on the second signal;
    wherein the processor is further configured to execute computer-executable program code for performing operations including determining the presence of multipath interference and, if multipath interference is present, moving from a first position to a second position at a third time between the first time and the second time.
  2.   The non-contact optical three-dimensional measuring apparatus according to claim 1, wherein the second light is linear light in a direction perpendicular to a propagation direction of the second light.
  3.   The non-contact optical three-dimensional measurement apparatus according to claim 1, wherein the at least one pattern includes at least three non-collinear pattern elements.
  4.   The non-contact optical three-dimensional measurement apparatus according to claim 3, wherein the second light includes a second pattern, and the second pattern has at least three non-collinear pattern elements.
  5.   The non-contact optical three-dimensional measurement apparatus according to claim 2, wherein the linear light is a linear pattern that is swept in time.
  6.   The non-contact optical three-dimensional measuring apparatus according to claim 2, wherein the linear light is a spot of light that is swept in time.
  7.   The non-contact optical three-dimensional measuring apparatus according to claim 1, wherein the first field of view is at least twice as large as the second field of view.
  8.   The non-contact optical three-dimensional measurement apparatus according to claim 1, wherein the first photosensitive array includes a first pixel configured to capture light reflected from a first area of the surface, the second photosensitive array includes a second pixel configured to capture light reflected from a second area of the surface, and the second area is smaller than the first area.
JP2016500623A 2011-04-15 2014-03-05 Non-contact optical three-dimensional measuring device Active JP6355710B2 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
US201361791797P true 2013-03-15 2013-03-15
US61/791,797 2013-03-15
US13/932,267 US9482529B2 (en) 2011-04-15 2013-07-01 Three-dimensional coordinate scanner and method of operation
US13/932,267 2013-07-01
PCT/US2014/020481 WO2014149702A1 (en) 2013-03-15 2014-03-05 Three-dimensional coordinate scanner and method of operation

Publications (2)

Publication Number Publication Date
JP2016514271A JP2016514271A (en) 2016-05-19
JP6355710B2 true JP6355710B2 (en) 2018-07-11

Family

ID=50382644

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2016500623A Active JP6355710B2 (en) 2011-04-15 2014-03-05 Non-contact optical three-dimensional measuring device

Country Status (5)

Country Link
JP (1) JP6355710B2 (en)
CN (1) CN105190232A (en)
DE (1) DE112014001483T5 (en)
GB (1) GB2527993B (en)
WO (1) WO2014149702A1 (en)

Families Citing this family (49)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9482755B2 (en) 2008-11-17 2016-11-01 Faro Technologies, Inc. Measurement system having air temperature compensation between a target and a laser tracker
US8908995B2 (en) 2009-01-12 2014-12-09 Intermec Ip Corp. Semi-automatic dimensioning with imager on a portable device
US9377885B2 (en) 2010-04-21 2016-06-28 Faro Technologies, Inc. Method and apparatus for locking onto a retroreflector with a laser tracker
US9400170B2 (en) 2010-04-21 2016-07-26 Faro Technologies, Inc. Automatic measurement of dimensional data within an acceptance region by a laser tracker
US9772394B2 (en) 2010-04-21 2017-09-26 Faro Technologies, Inc. Method and apparatus for following an operator and locking onto a retroreflector with a laser tracker
US8467072B2 (en) 2011-02-14 2013-06-18 Faro Technologies, Inc. Target apparatus and method of making a measurement with the target apparatus
US9482529B2 (en) 2011-04-15 2016-11-01 Faro Technologies, Inc. Three-dimensional coordinate scanner and method of operation
US9686532B2 (en) 2011-04-15 2017-06-20 Faro Technologies, Inc. System and method of acquiring three-dimensional coordinates using multiple coordinate measurement devices
WO2012141868A1 (en) 2011-04-15 2012-10-18 Faro Technologies, Inc. Enhanced position detector in laser tracker
DE112013000727T5 (en) 2012-01-27 2014-11-06 Faro Technologies, Inc. Test method with bar code marking
US9779546B2 (en) 2012-05-04 2017-10-03 Intermec Ip Corp. Volume dimensioning systems and methods
US10007858B2 (en) 2012-05-15 2018-06-26 Honeywell International Inc. Terminals and methods for dimensioning objects
US10321127B2 (en) 2012-08-20 2019-06-11 Intermec Ip Corp. Volume dimensioning system calibration systems and methods
US9939259B2 (en) 2012-10-04 2018-04-10 Hand Held Products, Inc. Measuring object dimensions using mobile computer
US20140104413A1 (en) 2012-10-16 2014-04-17 Hand Held Products, Inc. Integrated dimensioning and weighing system
US9080856B2 (en) 2013-03-13 2015-07-14 Intermec Ip Corp. Systems and methods for enhancing dimensioning, for example volume dimensioning
US9041914B2 (en) 2013-03-15 2015-05-26 Faro Technologies, Inc. Three-dimensional coordinate scanner and method of operation
US10228452B2 (en) 2013-06-07 2019-03-12 Hand Held Products, Inc. Method of error correction for 3D imaging device
US9464885B2 (en) 2013-08-30 2016-10-11 Hand Held Products, Inc. System and method for package dimensioning
US9395174B2 (en) 2014-06-27 2016-07-19 Faro Technologies, Inc. Determining retroreflector orientation by optimizing spatial fit
US9823059B2 (en) 2014-08-06 2017-11-21 Hand Held Products, Inc. Dimensioning system with guided alignment
US9779276B2 (en) 2014-10-10 2017-10-03 Hand Held Products, Inc. Depth sensor based auto-focus system for an indicia scanner
US20160101936A1 (en) 2014-10-10 2016-04-14 Hand Held Products, Inc. System and method for picking validation
GB2531928B (en) * 2014-10-10 2018-12-12 Hand Held Prod Inc Image-stitching for dimensioning
US9557166B2 (en) 2014-10-21 2017-01-31 Hand Held Products, Inc. Dimensioning system with multipath interference mitigation
US9752864B2 (en) 2014-10-21 2017-09-05 Hand Held Products, Inc. Handheld dimensioning system with feedback
US9897434B2 (en) 2014-10-21 2018-02-20 Hand Held Products, Inc. Handheld dimensioning system with measurement-conformance feedback
US10060729B2 (en) 2014-10-21 2018-08-28 Hand Held Products, Inc. Handheld dimensioner with data-quality indication
US9762793B2 (en) 2014-10-21 2017-09-12 Hand Held Products, Inc. System and method for dimensioning
US9964402B2 (en) * 2015-04-24 2018-05-08 Faro Technologies, Inc. Two-camera triangulation scanner with detachable coupling mechanism
US9786101B2 (en) 2015-05-19 2017-10-10 Hand Held Products, Inc. Evaluating image values
US10066982B2 (en) 2015-06-16 2018-09-04 Hand Held Products, Inc. Calibrating a volume dimensioner
US20160377414A1 (en) 2015-06-23 2016-12-29 Hand Held Products, Inc. Optical pattern projector
US9857167B2 (en) 2015-06-23 2018-01-02 Hand Held Products, Inc. Dual-projector three-dimensional scanner
US9835486B2 (en) 2015-07-07 2017-12-05 Hand Held Products, Inc. Mobile dimensioner apparatus for use in commerce
EP3396313A1 (en) 2015-07-15 2018-10-31 Hand Held Products, Inc. Mobile dimensioning method and device with dynamic accuracy compatible with nist standard
US10094650B2 (en) 2015-07-16 2018-10-09 Hand Held Products, Inc. Dimensioning and imaging items
US10249030B2 (en) 2015-10-30 2019-04-02 Hand Held Products, Inc. Image transformation for indicia reading
US10225544B2 (en) 2015-11-19 2019-03-05 Hand Held Products, Inc. High resolution dot pattern
US10025314B2 (en) 2016-01-27 2018-07-17 Hand Held Products, Inc. Vehicle positioning and object avoidance
DE102016002398B4 (en) * 2016-02-26 2019-04-25 Gerd Häusler Optical 3D sensor for fast and dense shape detection
US20170299379A1 (en) * 2016-04-15 2017-10-19 Lockheed Martin Corporation Precision Hand-Held Scanner
US10339352B2 (en) 2016-06-03 2019-07-02 Hand Held Products, Inc. Wearable metrological apparatus
US9940721B2 (en) 2016-06-10 2018-04-10 Hand Held Products, Inc. Scene change detection in a dimensioner
US10163216B2 (en) 2016-06-15 2018-12-25 Hand Held Products, Inc. Automatic mode switching in a volume dimensioner
CN106500628B (en) * 2016-10-19 2019-02-19 杭州思看科技有限公司 A kind of 3-D scanning method and scanner containing multiple and different long wavelength lasers
CN106500627B (en) * 2016-10-19 2019-02-01 杭州思看科技有限公司 3-D scanning method and scanner containing multiple and different long wavelength lasers
US10430958B2 (en) 2017-07-11 2019-10-01 Microsoft Technology Licensing, Llc Active illumination 3D zonal imaging system
US20190301858A1 (en) * 2018-03-30 2019-10-03 Koninklijke Philips N.V. System and method for 3d scanning

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7246030B2 (en) 2002-02-14 2007-07-17 Faro Technologies, Inc. Portable coordinate measurement machine with integrated line laser scanner
DE10344922B4 (en) * 2003-09-25 2008-06-26 Siemens Audiologische Technik Gmbh All-scanner
US8786682B2 (en) * 2009-03-05 2014-07-22 Primesense Ltd. Reference image techniques for three-dimensional sensing
US9204129B2 (en) * 2010-09-15 2015-12-01 Perceptron, Inc. Non-contact sensing system having MEMS-based light source
JP5782786B2 (en) * 2011-04-01 2015-09-24 株式会社ニコン shape measuring device
WO2012141868A1 (en) * 2011-04-15 2012-10-18 Faro Technologies, Inc. Enhanced position detector in laser tracker

Also Published As

Publication number Publication date
WO2014149702A1 (en) 2014-09-25
GB2527993A (en) 2016-01-06
GB201518275D0 (en) 2015-12-02
DE112014001483T5 (en) 2015-12-10
GB2527993B (en) 2018-06-27
CN105190232A (en) 2015-12-23
JP2016514271A (en) 2016-05-19

Similar Documents

Publication Publication Date Title
EP2183545B1 (en) Phase analysis measurement apparatus and method
JP4228132B2 (en) Position measuring device
US6175415B1 (en) Optical profile sensor
US10302413B2 (en) Six degree-of-freedom laser tracker that cooperates with a remote sensor
US8970823B2 (en) Device for optically scanning and measuring an environment
US7490576B2 (en) Time of flight teat location system
DE112011102995T5 (en) Laser scanner or laser tracking device with a projector
EP2163847B1 (en) Instrument for examining/measuring an object to be measured
US20060132803A1 (en) Method and apparatus for combining a targetless optical measurement function and optical projection of information
US8280152B2 (en) Method for optical measurement of the three dimensional geometry of objects
US5661667A (en) 3D imaging using a laser projector
EP0553266B1 (en) Method and apparatus for three-dimensional non-contact shape sensing
JP2015017992A (en) Method and device for optically scanning and measuring environment
US9599455B2 (en) Device for optically scanning and measuring an environment
US7724379B2 (en) 3-Dimensional shape measuring method and device thereof
EP1672313A1 (en) Automatic inspection of workpieces
US8233041B2 (en) Image processing device and image processing method for performing three dimensional measurements
JP2005025415A (en) Position detector
KR100753885B1 (en) Image obtaining apparatus
JP5816773B2 (en) Coordinate measuring machine with removable accessories
ES2340462T3 (en) Robot with sensor system to determine three-dimensional measurement data.
JP2006276012A (en) Measuring system for obtaining six degrees of freedom of object
US9217637B2 (en) Device for optically scanning and measuring an environment
JP2012529031A (en) 3D surface detection method and apparatus using dynamic reference frame
DE112013004489T5 (en) Laser scanner with dynamic setting of the angular scanning speed

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20161206

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20170919

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20171003

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20171222

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20180522

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20180612

R150 Certificate of patent or registration of utility model

Ref document number: 6355710

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150