US20190186905A1 - Aerial device having a three-dimensional measurement device
- Publication number
- US20190186905A1
- Authority
- US
- United States
- Prior art keywords
- projector
- camera
- light
- aerial
- pattern
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
- G01B11/2513—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with several lines being projected in more than one direction, e.g. grids, patterns
- G01B11/254—Projection of a pattern, viewing through a pattern, e.g. moiré
- G01B11/2545—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with one projection direction and several detection directions, e.g. stereo
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C15/00—Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00
- G01C15/002—Active optical surveying means
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64C—AEROPLANES; HELICOPTERS
- B64C39/00—Aircraft not otherwise provided for
- B64C39/02—Aircraft not otherwise provided for characterised by special use
- B64C39/024—Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
- B64C2201/027
- B64C2201/123
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U10/00—Type of UAV
- B64U10/10—Rotorcrafts
- B64U10/13—Flying platforms
- B64U20/00—Constructional aspects of UAVs
- B64U20/80—Arrangement of on-board electronics, e.g. avionics systems or wiring
- B64U20/87—Mounting of imaging devices, e.g. mounting of gimbals
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/30—UAVs specially adapted for particular uses or applications for imaging, photography or videography
Definitions
- the subject matter disclosed herein relates in general to a system for measuring three-dimensional (3D) coordinates using a scanner device operated from a mover apparatus such as a drone.
- a 3D imager uses a triangulation method to measure the 3D coordinates of points on an object.
- the 3D imager usually includes a projector that projects onto a surface of the object either a pattern of light in a line or a pattern of light covering an area.
- a camera is coupled to the projector in a fixed relationship, for example, by attaching a camera and the projector to a common frame. The light emitted from the projector is reflected off of the object surface and detected by the camera. Since the camera and projector are arranged in a fixed relationship, the distance to the object may be determined using trigonometric principles.
- triangulation systems provide advantages in quickly acquiring coordinate data over a large area.
- the resulting collection of 3D coordinate values or data points of the object being measured by the triangulation system is referred to as point cloud data or simply a point cloud.
- a 3D imager may be attached to a variety of mover devices such as robotic devices and aerial drones.
- a method known as videogrammetry provides a way to register multiple 3D data sets when there is relative motion between the 3D imager and an object being measured.
- Videogrammetry can further be used to provide data needed to directly determine 3D coordinates when multiple two-dimensional (2D) images are captured with the camera at different positions relative to the object being measured.
- videogrammetry makes use of triangulation principles, and triangulation as used herein is understood to further encompass videogrammetry.
- A particular issue that may be encountered when using triangulation (which may further include videogrammetry) is a lack of accuracy and detail when distances from a 3D imager to an object are large or variable. Accordingly, while existing triangulation-based 3D imager devices are suitable for their intended purpose, the need for improvement remains.
- a three-dimensional (3D) coordinate measuring system includes an aerial measuring device that has an aerial drone and a 3D measurement device.
- the 3D measurement device is rotatably attached to the aerial drone, and the aerial drone is movable from a first position to a stationary second position.
- the 3D measurement device being configured to optically measure points on the surface of an object.
- the system further includes one or more processors configured to execute nontransitory computer readable instructions.
- the computer readable instructions comprise: moving the aerial measuring device from the first position; landing the aerial measuring device at the second position; rotating the 3D measurement device to optically measure a first object point; and determining first 3D coordinates of the first object point with the 3D measurement device.
- FIGS. 1A and 1B are block diagrams of a 3D imager and a stereo camera pair, respectively, according to an embodiment
- FIG. 2 is a perspective view of a 3D imager having two cameras and a projector according to an embodiment
- FIG. 3 is a perspective view of a 3D imager having two cameras and a projector according to an embodiment
- FIGS. 4A and 4B show epipolar geometry for two reference planes and three reference planes, respectively, according to an embodiment
- FIG. 5 is a perspective view of an aerial drone carrying a 2D camera
- FIGS. 6 and 6B illustrate a measurement scenario in which an aerial drone carrying a 2D camera is used in combination with an external projector to obtain 3D coordinates of an object according to an embodiment
- FIG. 7 is a perspective view of an aerial drone that includes a 3D imager and a registration camera according to an embodiment
- FIGS. 8 and 9 illustrate measurement scenarios in which an aerial drone having a 3D imager and a registration camera is used with an external projector moved to two different positions to determine 3D coordinates of an object according to an embodiment
- FIG. 10 is a perspective view of a rotatable external projector according to an embodiment
- FIG. 11 illustrates a measurement scenario in which a rotation mechanism of the external projector directs a projected pattern of light onto an object at a desired location according to an embodiment
- FIG. 12 illustrates a measurement scenario in which the 3D imager on an aerial drone is further redirected by a rotation mechanism according to an embodiment
- FIG. 13 is a schematic representation illustrating the principle of operation of a line scanner
- FIG. 14 is a perspective view of an aerial drone carrying a line scanner and a registration camera according to an embodiment
- FIG. 15 illustrates a measurement scenario in which an aerial drone projects a line of light on an object, while an external projector projects a pattern of light onto the object according to an embodiment
- FIG. 16 illustrates a measurement scenario in which an aerial drone is used together with an external projector to measure an object indoors
- FIG. 17 illustrates a measurement scenario in which a 3D imager having two cameras and an internal projector in a triangulation pattern are used in combination with a registration camera to obtain 3D coordinates of an object without use of an external projector according to an embodiment
- FIG. 18 is a schematic representation of a computing system according to an embodiment.
- Embodiments of the present invention provide advantages in measuring large objects with 3D imagers, obtaining relatively high accuracy while providing color (texture) information.
- FIG. 1A shows a triangulation scanner (3D imager) 100 A that projects a pattern of light over an area on a surface 130 A.
- a structured light triangulation scanner is a 3D imager.
- the scanner 100 A which has a frame of reference 160 A, includes a projector 110 A and a camera 120 A.
- the projector 110 A includes an illuminated projector pattern generator 112 A, a projector lens 114 A, and a perspective center 118 A through which a ray of light 111 A emerges.
- the ray of light 111 A emerges from a corrected point 116 A having a corrected position on the pattern generator 112 A.
- the point 116 A has been corrected to account for aberrations of the projector, including aberrations of the lens 114 A, in order to cause the ray to pass through the perspective center 118 A, thereby simplifying triangulation calculations.
- the ray of light 111 A intersects the surface 130 A in a point 132 A, which is reflected (scattered) off the surface and sent through the camera lens 124 A to create a clear image of the pattern on the surface 130 A on the surface of a photosensitive array 122 A.
- the light from the point 132 A passes in a ray 121 A through the camera perspective center 128 A to form an image spot at the corrected point 126 A.
- the image spot is corrected in position to correct for aberrations in the camera lens.
- a correspondence is obtained between the point 126 A on the photosensitive array 122 A and the point 116 A on the illuminated projector pattern generator 112 A.
- the correspondence may be obtained by using a coded or an uncoded pattern, which may in some cases be projected sequentially.
- the angles a and b in FIG. 1A may be determined.
- the baseline 140 A which is a line segment drawn between the perspective centers 118 A and 128 A, has a length C. Knowing the angles a, b and the length C, all the angles and side lengths of the triangle 128 A- 132 A- 118 A may be determined.
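The triangle relationship above lends itself to a short numerical sketch. The following minimal example (an illustration, not from the patent; the function name and frame conventions are assumptions) solves the triangle 128 A- 132 A- 118 A by the law of sines, given the baseline length C and the angles a and b at the two ends of the baseline:

```python
import math

def triangulate(baseline_c, angle_a, angle_b):
    """Solve the triangle formed by the projector perspective center,
    the camera perspective center, and the object point, given the
    baseline length C and the angles a and b (radians) at the two
    ends of the baseline."""
    # The apex angle at the object point follows from the angle sum of a triangle.
    apex = math.pi - angle_a - angle_b
    # Law of sines: the side opposite angle b runs from the projector
    # perspective center to the object point.
    side_projector_to_point = baseline_c * math.sin(angle_b) / math.sin(apex)
    # Resolve that side into coordinates: origin at the projector
    # perspective center, x along the baseline, z toward the object.
    x = side_projector_to_point * math.cos(angle_a)
    z = side_projector_to_point * math.sin(angle_a)
    return x, z

# Symmetric case: a = b = 60 degrees and C = 1 form an equilateral triangle.
x, z = triangulate(1.0, math.pi / 3, math.pi / 3)
```

In the symmetric case the object point lands at x = 0.5, z = sqrt(3)/2, midway along the baseline, as expected for an equilateral triangle.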
- Digital image information is transmitted to a processor 150 A, which determines 3D coordinates of the surface 130 A.
- the processor 150 A may also instruct the illuminated pattern generator 112 A to generate an appropriate pattern.
- the processor 150 A may be located within the scanner assembly, or it may be in an external computer, or a remote server, as discussed further herein below in reference to FIG. 18 .
- FIG. 1B shows a stereo camera 100 B that receives a pattern of light from an area on a surface 130 B.
- the stereo camera 100 B which has a frame of reference 160 B, includes a first camera 120 B and a second camera 170 B.
- the first camera 120 B includes a first camera lens 124 B and a first photosensitive array 122 B.
- the first camera 120 B has a first camera perspective center 128 B through which a ray of light 121 B passes from a point 132 B on the surface 130 B onto the first photosensitive array 122 B as a corrected image spot 126 B.
- the image spot is corrected in position to correct for aberrations in the camera lens.
- the second camera 170 B includes a second camera lens 174 B and a second photosensitive array 172 B.
- the second camera 170 B has a second camera perspective center 178 B through which a ray of light 171 B passes from the point 132 B onto the second photosensitive array 172 B as a corrected image spot 176 B.
- the image spot is corrected in position to correct for aberrations in the camera lens.
- a correspondence is obtained between the point 126 B on the first photosensitive array 122 B and the point 176 B on the second photosensitive array 172 B. As explained herein below, the correspondence may be obtained using videogrammetry or other methods.
- the baseline 140 B which is a line segment drawn between the perspective centers 128 B and 178 B, has a length C. Knowing the angles a, b and the length C, all the angles and side lengths of the triangle 128 B- 132 B- 178 B may be determined.
- Digital image information is transmitted to a processor 150 B, which determines 3D coordinates of the surface 130 B.
- the processor 150 B may be located within the stereo camera assembly, or it may be in an external computer, or a remote server, as discussed further herein below in reference to FIG. 18 .
- FIG. 2 shows a structured light triangulation scanner 200 having a projector 250 , a first camera 210 , and a second camera 230 .
- the projector 250 creates a pattern of light on a pattern generator plane 252 , which it projects from a corrected point 253 on the pattern through a perspective center 258 (point D) of the lens 254 onto an object surface 270 at a point 272 (point F).
- the point 272 is imaged by the first camera 210 by receiving a ray of light from the point 272 through a perspective center 218 (point E) of a lens 214 onto the surface of a photosensitive array 212 of the camera as a corrected point 220 .
- the point 220 is corrected in the read-out data by applying a correction factor to remove the effects of lens aberrations.
- the point 272 is likewise imaged by the second camera 230 by receiving a ray of light from the point 272 through a perspective center 238 (point C) of the lens 234 onto the surface of a photosensitive array 232 of the second camera as a corrected point 235 .
- each of the two cameras has a different view of the point 272 (point F). Because of this difference in viewpoints, it is possible in some cases to see features that would otherwise be obscured—for example, seeing into a hole or behind a blockage.
- a first triangulation calculation can be made between corresponding points in the two cameras using the triangle CEF with the baseline B 3 .
- a second triangulation calculation can be made based on corresponding points of the first camera and the projector using the triangle DEF with the baseline B 2 .
- a third triangulation calculation can be made based on corresponding points of the second camera and the projector using the triangle CDF with the baseline B 1 .
- the optical axis of the first camera 210 is 216 , and the optical axis of the second camera 230 is 236 .
- FIG. 3 shows 3D imager 300 having two cameras 310 , 330 and a projector 350 arranged in a triangle A 1 -A 2 -A 3 .
- the 3D imager 300 of FIG. 3 further includes a camera 390 that may be used to provide color (texture) information for incorporation into the 3D image.
- the camera 390 may be used to register multiple 3D images through the use of videogrammetry.
- a 3D triangulation instrument 440 includes a device 1 and a device 2 on the left and right sides, respectively, of FIG. 4A .
- Device 1 and device 2 may be two cameras or device 1 and device 2 may be one camera and one projector.
- Each of the two devices has a perspective center, O 1 and O 2 , and a reference plane, 430 or 410 .
- the perspective centers are separated by a baseline distance B, which is the length of the line 402 between O 1 and O 2 .
- the concept of perspective center is discussed in more detail in reference to FIGS. 13C, 13D, and 13E .
- the perspective centers O 1 , O 2 are points through which rays of light may be considered to travel, either to or from a point on an object. These rays of light either emerge from an illuminated projector pattern, such as the pattern on the illuminated projector pattern generator 112 A of FIG. 1A , or impinge on a photosensitive array, such as the photosensitive array 122 A of FIG. 1A .
- the lens 114 A lies between the illuminated object point 132 A and the plane of the illuminated projector pattern generator 112 A .
- likewise, the lens 124 A lies between the illuminated object point 132 A and the plane of the photosensitive array 122 A .
- the pattern of the front surface planes of devices 112 A and 122 A would be the same if they were moved to appropriate positions opposite the lenses 114 A and 124 A, respectively.
- This placement of the reference planes 430 , 410 is applied in FIG. 4A , which shows the reference planes 430 , 410 between the object point and the perspective centers O 1 , O 2 .
- consider a point U D on the plane 430 . If device 1 is a camera, it is known that an object point that produces the point U D on the image must lie on the line 438 .
- the object point might be, for example, one of the points V A , V B , V C , or V D .
- These four object points correspond to the points W A , W B , W C , W D , respectively, on the reference plane 410 of device 2 .
- any epipolar line on the reference plane 410 passes through the epipole E 2 .
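The epipolar relationships described above can be sketched numerically. The following minimal example (an illustration, not part of the patent; all numerical values are invented) builds a hypothetical rank-2 fundamental matrix F from an assumed epipole e2, maps a point U_D in device 1 to its epipolar line in device 2, and verifies that the line passes through the epipole:

```python
def cross_matrix(v):
    """3x3 skew-symmetric matrix [v]_x such that [v]_x w = v x w."""
    x, y, z = v
    return [[0.0, -z, y], [z, 0.0, -x], [-y, x, 0.0]]

def matvec(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# Hypothetical epipole e2 in device 2 and an arbitrary full-rank matrix M;
# F = [e2]_x M is then a valid rank-2 fundamental matrix with e2^T F = 0.
e2 = [1.0, 2.0, 1.0]
M = [[1.0, 0.1, 0.0], [0.0, 1.0, 0.2], [0.0, 0.0, 1.0]]
F = matmul(cross_matrix(e2), M)

# A point U_D in device 1 (homogeneous coordinates) maps to the
# epipolar line l2 = F * U_D in device 2 ...
u1 = [0.3, 0.7, 1.0]
l2 = matvec(F, u1)

# ... and every such line passes through the epipole: l2 . e2 == 0.
incidence = sum(l2[i] * e2[i] for i in range(3))
```

The incidence value is zero (up to floating-point noise) for any choice of u1, which is exactly the statement that all epipolar lines on the reference plane pass through the epipole E 2 .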
- FIG. 4B illustrates the epipolar relationships for a 3D imager 490 corresponding to 3D imager 300 of FIG. 3 in which two cameras and one projector are arranged in a triangular pattern.
- the device 1 , device 2 , and device 3 may be any combination of cameras and projectors as long as at least one of the devices is a camera.
- Each of the three devices 491 , 492 , 493 has a perspective center O 1 , O 2 , O 3 , respectively, and a reference plane 460 , 470 , and 480 , respectively.
- Each pair of devices has a pair of epipoles.
- Device 1 and device 2 have epipoles E 12 , E 21 on the planes 460 , 470 , respectively.
- Device 1 and device 3 have epipoles E 13 , E 31 , respectively on the planes 460 , 480 , respectively.
- Device 2 and device 3 have epipoles E 23 , E 32 on the planes 470 , 480 , respectively.
- each reference plane includes two epipoles.
- the reference plane for device 1 includes epipoles E 12 and E 13 .
- the reference plane for device 2 includes epipoles E 21 and E 23 .
- the reference plane for device 3 includes epipoles E 31 and E 32 .
- the redundancy of information provided by using a 3D imager 300 having a triangular arrangement of projector and cameras may be used to reduce measurement time, to identify errors, and to automatically update compensation/calibration parameters.
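One way that redundancy can be exploited is sketched below under assumed fundamental matrices (the helper names are hypothetical, not from the patent): a candidate correspondence among the two cameras and the projector should give near-zero epipolar residuals for all three device pairs, and residuals exceeding a tolerance flag an error or stale compensation parameters.

```python
def epipolar_residual(f, x1, x2):
    """|x2^T F x1| for homogeneous image points; near zero when the
    correspondence is consistent with the epipolar geometry."""
    line = [sum(f[i][j] * x1[j] for j in range(3)) for i in range(3)]
    return abs(sum(x2[i] * line[i] for i in range(3)))

def check_consistency(f12, f13, f23, p1, p2, p3, tol=1e-3):
    """Residuals of a candidate correspondence (p1, p2, p3) across the
    three device pairs of a triangular camera/projector arrangement,
    plus a flag that is False when any residual exceeds tol (which may
    indicate a measurement error or stale compensation parameters)."""
    residuals = [
        epipolar_residual(f12, p1, p2),
        epipolar_residual(f13, p1, p3),
        epipolar_residual(f23, p2, p3),
    ]
    return residuals, all(r < tol for r in residuals)
```

A real implementation would derive the three fundamental matrices from the compensated poses of the devices; here they are simply inputs.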
- Compensation parameters are numerical values stored in memory, for example, in an internal electrical system of an 3D measurement device or in another external computing unit. Such parameters may include the relative positions and orientations of the cameras and projector in the 3D imager.
- the compensation parameters may relate to lens characteristics such as lens focal length and lens aberrations. They may also relate to changes in environmental conditions such as temperature. Sometimes the term calibration is used in place of the term compensation. Often compensation procedures are performed by the manufacturer to obtain compensation parameters for a 3D imager. In addition, compensation procedures are often performed by a user. User compensation procedures may be performed when there are changes in environmental conditions such as temperature. User compensation procedures may also be performed when projector or camera lenses are changed or after the instrument is subjected to a mechanical shock.
- Inconsistencies in results based on epipolar calculations for a 3D imager may indicate a problem in the compensation parameters. Compensation parameters are used to correct imperfections or nonlinearities in the mechanical, optical, or electrical system to improve measurement accuracy. In some cases, a pattern of inconsistencies may suggest an automatic correction that can be applied to the compensation parameters. In other cases, the inconsistencies may indicate a need to perform user compensation procedures.
- color information is sometimes referred to as “texture” information since it may suggest the materials being imaged or reveal additional aspects of the scene such as shadows.
- color (texture) information is provided by a color camera separated from the camera in the triangulation scanner (i.e., the triangulation camera).
- An example of a separate color camera is the camera 390 in the 3D imager 300 of FIG. 3 .
- the wide-FOV camera may assist in registering together multiple images obtained with the triangulation camera by identifying natural features or artificial targets outside the FOV of the triangulation camera.
- the camera 390 in the 3D imager 300 may serve as both a wide-FOV camera and a color camera.
- Position of each of the cameras may be characterized by three translational degrees-of-freedom (DOF), which might be for example x-y-z coordinates of the camera perspective center.
- Orientation of each of the cameras may be characterized by three orientational DOF, which might be for example roll-pitch-yaw angles.
- Position and orientation together yield the pose of an object.
- the three translational DOF and the three orientational DOF together yield the six DOF of the pose for each camera.
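As an illustration of the six-DOF pose description above (the function name and the Z-Y-X Euler convention are assumptions, not taken from the patent), the three translational DOF and the three orientational DOF can be packed into a single 4x4 homogeneous matrix:

```python
import math

def pose_matrix(x, y, z, roll, pitch, yaw):
    """Build a 4x4 homogeneous pose from three translational DOF
    (x, y, z) and three orientational DOF (roll, pitch, yaw, in
    radians), using the common Z-Y-X (yaw-pitch-roll) convention."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    # R = Rz(yaw) * Ry(pitch) * Rx(roll)
    r = [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]
    # Append the translation column and the homogeneous bottom row.
    return [r[0] + [x], r[1] + [y], r[2] + [z], [0.0, 0.0, 0.0, 1.0]]
```

With all six DOF known, such a matrix fully describes the pose of a camera (its perspective center and orientation) in a chosen frame of reference.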
- a compensation procedure may be carried out by a manufacturer or by a user to determine the pose of a triangulation scanner and a color camera mounted on a common base, the pose of each referenced to a common frame of reference.
- 3D coordinates are determined based at least in part on triangulation.
- a triangulation calculation requires knowledge of the relative position and orientation of at least one projector, such as the projector 110 A , and one camera, such as the camera 120 A .
- 3D coordinates are obtained by identifying features or targets on an object and noting changes in the features or targets as the object moves.
- the process of identifying natural features of an object in a plurality of images is sometimes referred to as videogrammetry.
- there is a well-developed collection of techniques that may be used to determine points associated with features of objects as seen from multiple perspectives. Such techniques are generally referred to as image processing or feature detection.
- Such techniques when applied to determination of 3D coordinates based on relative movement between the measuring device and the measured object, are sometimes referred to as videogrammetry techniques.
- the common points identified by the well-developed collection of techniques described above may be referred to as cardinal points.
- a commonly used but general category for finding the cardinal points is referred to as interest point detection, with the detected points referred to as interest points.
- an interest point has a mathematically well-founded definition, a well-defined position in space, an image structure around the interest point that is rich in local information content, and a variation in illumination level that is relatively stable over time.
- a particular example of an interest point is a corner point, which might be a point corresponding to an intersection of three planes, for example.
- a commonly used technique for finding cardinal points is the scale-invariant feature transform (SIFT).
- Other common feature detection methods for finding cardinal points include edge detection, blob detection, and ridge detection.
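As a rough sketch of corner-style interest point detection, the following is a simplified Harris-type response, chosen here purely as an illustration; the patent does not prescribe a specific detector, and the window size and constant k are conventional assumptions:

```python
def harris_response(img, k=0.04):
    """Simplified Harris corner response over a grayscale image given
    as a list of rows; large positive values mark corner-like interest
    points, near-zero values flat regions."""
    h, w = len(img), len(img[0])
    ix = [[0.0] * w for _ in range(h)]
    iy = [[0.0] * w for _ in range(h)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            # central-difference image gradients
            ix[i][j] = (img[i][j + 1] - img[i][j - 1]) / 2.0
            iy[i][j] = (img[i + 1][j] - img[i - 1][j]) / 2.0
    resp = [[0.0] * w for _ in range(h)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            # accumulate the structure tensor over a 3x3 window
            a = b = c = 0.0
            for di in (-1, 0, 1):
                for dj in (-1, 0, 1):
                    gx, gy = ix[i + di][j + dj], iy[i + di][j + dj]
                    a += gx * gx
                    b += gx * gy
                    c += gy * gy
            det, trace = a * c - b * b, a + c
            resp[i][j] = det - k * trace * trace
    return resp

# A bright square in one quadrant: its corner should score strongly
# positive, while flat regions stay near zero.
img = [[1.0 if (i >= 3 and j >= 3) else 0.0 for j in range(7)]
       for i in range(7)]
resp = harris_response(img)
```

The corner of the bright square is an example of the corner point mentioned above: an image structure rich in local information with a well-defined position.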
- FIG. 5 is a perspective view of an aerial quadcopter drone 500 having a body 510 , four legs, and four rotors 520 .
- the quadcopter drone is capable of flying in any direction or hovering through the use of four rotors 520 .
- a two-dimensional (2D) camera 530 is mounted to the drone.
- Other drones may use more or fewer rotors.
- the camera 530 may be mounted to a different type of aerial drone, for example one having fixed wings that do not provide hovering capability.
- FIG. 6 is a perspective view of a measurement scenario 600 involving a drone 610 and a projector system.
- the drone 610 includes a 2D camera 530 and a computing device 626 .
- the computing device 626 has the capability of a personal computer.
- the computing device 626 provides some preprocessing but sends the preprocessed signal to a remote computer or computer network for further processing.
- the preprocessed signal may be sent to the remote computer wirelessly or through a wire attached to a tether.
- the projector system 630 includes a projector 632 and a stand 636 .
- the stand 636 includes a tripod and motorized wheels.
- the motorized wheels respond to computer control, for example, to wireless signals from a remote computer or network.
- the projector system 630 in FIG. 6 is replaced with a projector system in FIG. 6B having a drone 636 B in place of the stand 636 .
- the drone 636 B is configured to fly the projector 632 to a desired location before landing and projecting light rays 634 .
- the projector 632 is configured to project laser light through a diffraction grating to produce rays 634 that end at an object 640 in a collection of spots 635 .
- the spots are approximately circular and are projected in a rectangular grid pattern.
- the diffraction grating is configured to give some of the projected spots more power than the others, thereby enabling some of the spots to be distinguished from the others.
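A minimal sketch of how the brighter marker spots might be separated from the ordinary grid spots (the median-based threshold rule and the function name are assumptions for illustration, not the patent's method):

```python
def classify_spots(intensities, ratio=1.5):
    """Split measured spot intensities into ordinary grid spots and
    the brighter 'marker' spots that the grating emphasizes. A spot
    counts as a marker when it exceeds `ratio` times the median
    intensity; returns the indices of the marker spots."""
    ordered = sorted(intensities)
    median = ordered[len(ordered) // 2]
    return [i for i, v in enumerate(intensities) if v > ratio * median]
```

Because the marker spots can be identified unambiguously in every image, they serve as anchors for matching the rest of the grid between camera views.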
- spots 635 on the surface of the object 640 are viewed by the 2D camera 530 .
- the camera 530 collects multiple 2D images, the images including some of the projected spots 635 .
- the projected spots provide a way to register multiple 2D data sets. If the camera 530 collects the multiple 2D images from different positions and if a combination of natural features and projected spots provide registration features in each of the 2D data sets, it may be possible to determine 3D coordinates using triangulation based on the multiple 2D camera images.
- the device frame of reference 650 may be represented by three orthogonal axes x D , y D , and z D , as shown in FIGS. 6, 6B, 8, 9, 11, 12, 15, 16, and 17 .
- the multiple 2D images obtained in the device frame of reference 650 are transformed into a common frame of reference 660 that is fixed (stationary) with respect to the object 640 .
- the common frame of reference 660 may be represented by three orthogonal axes x C , y C , and z C , as shown in FIGS. 6, 6B, 8, 9, 11, 12, 15, 16, and 17 .
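The transformation from the device frame of reference 650 to the common frame of reference 660 can be sketched as applying a 4x4 pose (rotation plus translation). This minimal example is an illustration with assumed names; the pose would in practice come from registration of the images:

```python
def to_common_frame(points_device, pose):
    """Transform 3D points from the device frame (x_D, y_D, z_D) into
    the common frame (x_C, y_C, z_C) using a 4x4 pose matrix that
    expresses the device frame in the common frame."""
    out = []
    for x, y, z in points_device:
        p = (x, y, z, 1.0)  # homogeneous coordinates
        out.append(tuple(sum(pose[i][j] * p[j] for j in range(4))
                         for i in range(3)))
    return out

# Example: a pure translation of (1, 2, 3) with identity rotation.
pose = [[1.0, 0.0, 0.0, 1.0],
        [0.0, 1.0, 0.0, 2.0],
        [0.0, 0.0, 1.0, 3.0],
        [0.0, 0.0, 0.0, 1.0]]
pts = to_common_frame([(0.0, 0.0, 0.0)], pose)
```

Once all data sets are expressed in the common frame, point clouds captured from different drone positions can be merged directly.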
- an aerial quadcopter drone 700 includes a body 510 , four legs, and four rotors 520 .
- a 3D imager 300 is attached to the drone 700 .
- the 3D imager 300 includes two cameras 310 , 330 and a projector 350 arranged in a triangle. The projector projects a pattern of light onto a region 725 of the object 640 , and the two cameras 310 , 330 receive reflected light from the region 725 .
- the drone 700 also includes a camera 390 that may be used to provide color (texture) information for incorporation into the 3D image and to assist in registration.
- the 3D imager enables measurement of 3D coordinates through the combined use of triangulation and videogrammetry.
- a personal computer or similar computing device 626 cooperates with the 3D imager 300 to process collected data to obtain 3D coordinates.
- Spots of light 635 are projected onto the object 640 to provide registration markers for the 3D imager 300 .
- the spots 635 are at a different wavelength than the pattern of light projected by the projector 350.
- the pattern of light from the projector 350 is in the infrared range, while the spots of light 635 from the projector 632 are in the visible range and therefore appear in images of the camera 390.
- the pattern of light projected by the projector 350 and the spots of light 635 are at the same wavelength.
- the determination of 3D coordinates of points on the object surface is greatly assisted by the geometry of the 3D imager 300, which includes the two cameras 310, 330 and the projector 350 arranged in a triangle.
- the multiple epipolar relationships provided by the triangular arrangement of the devices 310, 330, and 350 were described herein above in reference to FIG. 4B.
- the epipolar constraints imposed by the geometry of FIG. 4B are used.
- the term “epipolar constraints” as used herein means all of the geometrical constraints of the cameras 310 , 330 and the projector 350 .
- These geometrical constraints include the six degree-of-freedom pose of each of the three devices 310 , 330 , and 350 relative to the other devices.
- the epipolar constraints provide the baseline distances between each of the devices 310 , 330 , and 350 .
- Epipolar constraints also provide constraints on the position of the reference plane of each of the devices relative to the perspective center of each of the devices. Implicitly included in such constraints are the focal length of each of the lens systems, the positions of the projection plane of the projector 350 and the image planes of the cameras 310 , 330 , and the size and number of pixels in each of the devices 310 , 330 and 350 .
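The epipolar constraint between a pair of these devices can be written x2ᵀ E x1 = 0, where the essential matrix E is built from the relative pose. A minimal numerical sketch in normalized image coordinates (function names and the example pose are illustrative assumptions):

```python
import numpy as np

def essential_matrix(R, t):
    """E = [t]_x R for the pose (R, t) of device 2 relative to device 1,
    where a point X in frame 1 maps to R @ X + t in frame 2."""
    tx = np.array([[0.0, -t[2], t[1]],
                   [t[2], 0.0, -t[0]],
                   [-t[1], t[0], 0.0]])
    return tx @ R

def epipolar_residual(E, x1, x2):
    """x2^T E x1 -- near zero when the two normalized homogeneous image
    points x1 and x2 view the same object point."""
    return float(x2 @ E @ x1)

# Two devices with identical orientation and a 1 m baseline along x,
# both viewing the object point (0, 0, 2) in the first device's frame
E = essential_matrix(np.eye(3), np.array([-1.0, 0.0, 0.0]))
r = epipolar_residual(E, np.array([0.0, 0.0, 1.0]),
                      np.array([-0.5, 0.0, 1.0]))
```

With three devices in a triangle, three such constraints must hold simultaneously, which is what over-determines and strengthens the solution.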
- the device frame of reference 650 may be represented by three orthogonal axes x D , y D , and z D , as shown in FIGS. 6, 6B, 8, 9, 11, 12, 15, 16, and 17 .
- the 3D coordinates resulting from the multiple 2D images, obtained in the device frame of reference 650, are moved into a common frame of reference 660 that is fixed (stationary) with respect to the object 640.
- the common frame of reference 660 may be represented by three orthogonal axes x C , y C , and z C , as shown in FIGS. 6, 6B, 8, 9, 11, 12, 15, 16 , and 17 .
- the stand 636 includes motorized wheels that cause it to move near to or farther from the object 640 according to the needs of the measurement.
- the projector system 630 will be moved closer to the object 640 when the drone 700 moves closer to the object 640 .
- the objective is to maintain a relatively large number of projected spots in the region 725. As the drone 700 moves closer to the object 640, the area 725 decreases, and it is desirable that the number of projected spots 635 decrease correspondingly.
- the drone 700 is configured to adjust its position from far to near according to measurement needs.
- the drone may first measure the object 640 over a relatively large FOV and then move closer to the object 640 to measure features of interest on the object 640 in greater detail and with higher accuracy.
- the projector system 630 may likewise move closer to the object in response.
- the stand 636 is replaced by the drone 636 B, which can move the 3D imager 300 to any needed location.
- the motorized mobile platform moves the external projector to put the projected pattern of light from the external projector into the FOV of the registration camera, which is the camera 530 , 390 , 1416 in FIGS. 5, 7, 14 , respectively.
- a control system monitors the number of projected spots received by the registration camera and adjusts the position of the motorized mobile platform to ensure that the external projector is properly positioned to project spots within the FOV of the registration camera.
- the motorized mobile platform moves the external projector in such a way that the registration camera sees at least a portion of the pattern of light projected by the external projector before and after the movement of the motorized platform.
- the motorized mobile platform moves the external projector so as to keep the number of dots seen by the registration camera above a specified minimum number and below a specified maximum number.
- the specified minimum number and the specified maximum number are user-adjustable values.
- the density of dots projected over a given region increases as the external projector moves closer to the object. Consequently, as the aerial drone moves closer to the object, it may be advantageous to move the external projector closer to the object.
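A control loop of the kind described above can be sketched as a simple threshold rule. This is a hypothetical controller (the function name and step size are assumptions), exploiting the fact that moving the projector closer raises the dot density seen in the registration camera's FOV:

```python
def platform_step(n_dots, n_min, n_max, step=0.1):
    """Signed move (meters) for the motorized mobile platform so the
    number of projected dots seen by the registration camera stays
    within [n_min, n_max].  Hypothetical sketch; the fixed step size
    is an assumption."""
    if n_dots < n_min:
        return -step   # too few dots visible: move toward the object
    if n_dots > n_max:
        return +step   # too many dots visible: back away from the object
    return 0.0
```

The same rule could drive the rotation mechanism instead of the platform when only the pointing direction, not the standoff distance, needs to change.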
- a rotation mechanism provided on the external projector may be used synergistically with the movement mechanism on the external projector to obtain a desired number of projected spots in the FOV of the registration camera, as explained further herein below.
- the optical power received by the photosensitive array of the camera in 3D imager of the aerial drone likewise changes. Consequently, in an embodiment, the optical power from the external projector is changed in response to the amount of light (e.g., the optical power) received by pixels in the photosensitive array in the registration camera. Likewise, it may be desirable to change the exposure time of the registration camera in response to the amount of light received by the registration camera.
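The power adjustment described above can be sketched as a proportional rule that drives the received pixel level toward a target. This is an illustrative sketch (the target gray level, clamp, and 8-bit scale are assumptions):

```python
def adjust_projector_power(power, mean_pixel_level, target=128.0, p_max=1.0):
    """Rescale the external projector's optical power so the mean pixel
    level of the imaged spots in the registration camera approaches a
    target gray level.  Proportional sketch under assumed units."""
    if mean_pixel_level <= 0.0:
        return p_max                       # no detected signal: full power
    scaled = power * target / mean_pixel_level
    return min(p_max, max(0.0, scaled))    # clamp to the valid power range
```

The same scaling rule could instead (or additionally) drive the registration camera's exposure time.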
- the projector 632 in FIG. 6 is replaced by the rotatable projector 1010 in FIG. 10 on a rotatable projector system 1000 .
- the rotatable projector 1010 is configured to emit a beam of patterned light 1012 along a direction determined by a rotation about a horizontal axis 1022 and a rotation about a vertical axis 1020.
- the stand 636 includes a tripod 1032 , wheels 1034 , and a motor drive system 1036 .
- the rotation of the projector about the axes 1020 and 1022 is under computer control, which may be a computer on the rotatable projector system 1000 or a computer located elsewhere in the overall measurement environment.
- the projected spots serve only as landmarks.
- the exact direction of the projected pattern 1012 may not be important. In other cases, the direction may be important.
- the device may include accurate angle transducers such as angular encoders that may, for example, be accurate to 5 microradians.
- the beam steering mechanisms comprise a horizontal shaft and a vertical shaft, each shaft mounted on a pair of bearings and each driven by a frameless motor.
- the projector may be directly mounted to the horizontal shaft, but many other arrangements are possible.
- a mirror may be mounted to the horizontal shaft to reflect projected light onto the object or reflect scattered light from the object onto a camera.
- a mirror angled at 45 degrees rotates around a horizontal axis and receives or returns light along the horizontal axis.
- galvanometer mirrors may be used to send or receive light along a desired direction.
- a MEMS steering mirror is used to direct the light into a desired direction.
- Many other beam steering mechanisms are possible and may be used.
- an angular encoder is used to measure the angle of rotation of the projector or camera along each of the two axes. Many other angle transducers are available and may be used.
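Given the two measured rotation angles, the beam's pointing direction follows from spherical geometry. A sketch under an assumed convention (z up, beam along +x at zero angles, azimuth about the vertical axis and elevation about the horizontal axis; names are illustrative):

```python
import math

def beam_direction(azimuth, elevation):
    """Unit pointing vector of the projected beam from the two measured
    rotation angles (radians), under the assumed axis convention."""
    return (math.cos(elevation) * math.cos(azimuth),
            math.cos(elevation) * math.sin(azimuth),
            math.sin(elevation))
```

At the 5 microradian encoder accuracy mentioned above, the transverse pointing error at a 10 m standoff is on the order of 5e-6 x 10 m = 50 micrometers.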
- FIG. 11 illustrates a measurement situation 1100 in which a rotatable projector system 1000 rotates to place the projected spots from a projected beam 1010 into desired locations on the object 640 according to the position of the drone 700 .
- the rotation mechanism of the external projector is configured to rotate the projected pattern of light to place it in the FOV of the registration camera of the 3D imager in the aerial drone.
- the registration camera is the camera 530, 390, or 1416 in FIGS. 5, 7, and 14, respectively.
- the rotation mechanism of the external projector is configured to rotate the projected pattern of light in such a way that the registration camera sees at least a portion of the pattern of light projected by the external projector before and after the rotation by the rotation mechanism.
- the rotation mechanism is configured to rotate the pattern of light from the external projector so as to keep the number of dots seen by the registration camera above a specified minimum number and below a specified maximum number. For example, if the aerial drone moves closer to the object and at the same time rotates to point the 3D imager upward, it may be desirable to respond by moving the external projector closer to the object while at the same time rotating the projected light from the external projector upward. This keeps a relatively large number of projected spots from the external projector visible to the registration camera, which is usually a desirable result.
- FIG. 12 illustrates a measurement situation 1200 in which a device such as the 3D imager 300 or the camera 530 is a part of an aerial drone 3D imager 1220 similar to the drone 500 or the drone 700 except that the 3D imager 1220 includes a rotatable steering mechanism.
- the rotatable steering mechanism is substantially like the rotatable steering mechanism of the projector system 1000 except in being sized to hold the 3D imager elements.
- the drone flies to a desired location relative to an object 640 to be measured and then lands before making the measurement.
- a rotatable steering mechanism of a 3D imager 1220 steers the projected pattern of light to the desired location on the object 640 before making a measurement.
- the projected spots 1010 are used to assist in registration.
- accurately known rotation angles of the rotatable mechanism of the 3D imager 1220 also contribute to the determination of the 3D coordinates of the surface of the object 640 .
- a way to minimize, but not eliminate, the effect of vibration from rotating propellers is to add a servo-controlled gimbal camera mount, which is a device configured to keep the camera pointed in the same direction as the direction of the drone changes slightly.
- a servo system makes use of signals from an inertial measurement unit to send signals to three brushless motors that keep the camera leveled.
- the 3D imager device mounted to the mover further includes a laser line probe, also known as a line scanner.
- the operation of the laser line scanner (also known as a laser line probe or simply line scanner) is now described with reference to FIG. 13 .
- the line scanner system 1300 includes a projector 1320 and a camera 1340 .
- the projector 1320 includes a source pattern of light 1321 and a projector lens 1322 .
- the source pattern of light includes an illuminated pattern in the form of a line.
- the projector lens includes a projector perspective center and a projector optical axis that passes through the projector perspective center. In the example of FIG. 13, a central ray of the beam of light 1324 is aligned with the projector optical axis.
- the camera 1340 includes a camera lens 1342 and a photosensitive array 1341 .
- the lens has a camera optical axis 1343 that passes through a camera lens perspective center 1344 .
- the projector optical axis, which is aligned to the beam of light 1324 , and the camera lens optical axis 1343 are perpendicular to the line of light 1325 projected by the source pattern of light 1321 .
- the line 1325 is in the direction perpendicular to the paper in FIG. 13 .
- the line of light 1325 strikes an object surface, which at a first distance from the projector is object surface 1310 A and at a second distance from the projector is object surface 1310 B.
- the object surface may be at a different distance from the projector than the distance to either object surface 1310 A or 1310 B.
- the line of light intersects surface 1310 A in a point 1326 and it intersects the surface 1310 B in a point 1327 .
- a ray of light travels from the point 1326 through the camera lens perspective center 1344 to intersect the photosensitive array 1341 in an image point 1346 .
- For the case of the intersection point 1327, a ray of light travels from the point 1327 through the camera lens perspective center to intersect the photosensitive array 1341 in an image point 1347.
- the distance from the projector (and camera) to the object surface can be determined.
- the distance from the projector to other points on the intersection of the line of light 1325 with the object surface, that is points on the line of light that do not lie in the plane of the paper of FIG. 13 may similarly be found.
- the pattern on the photosensitive array will be a line of light (in general, not a straight line), where each point in the line corresponds to a different position perpendicular to the plane of the paper, and the position of each imaged point within the plane of the paper contains the information about the distance from the projector to the object surface. Therefore, by evaluating the pattern of the line in the image of the photosensitive array, the three-dimensional coordinates of the object surface along the projected line can be found. Note that the information contained in the image on the photosensitive array for the case of a line scanner is contained in a (not generally straight) line.
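The depth recovery can be illustrated for a simplified geometry in which the camera optical axis is taken parallel to the projected beam, so the triangulation reduces to the familiar Z = f·B/x relation (an assumption for illustration; an actual scanner uses its full calibration, including the angle between the axes):

```python
def line_scanner_depth(baseline, focal_length, image_offset):
    """Depth to one point on the projected line, for a simplified
    parallel-axis geometry.

    baseline     -- projector-to-camera perspective-center distance (m)
    focal_length -- camera lens focal length (m)
    image_offset -- offset (m) of the imaged line on the photosensitive
                    array, measured in the plane of the paper of FIG. 13
    """
    return focal_length * baseline / image_offset

# 100 mm baseline, 10 mm lens, line imaged 2 mm off axis -> 0.5 m depth
depth = line_scanner_depth(0.1, 0.01, 0.002)
```

The image points 1346 and 1347 differ in exactly this in-plane offset, which is why the two surface distances can be told apart.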
- FIG. 14 illustrates a drone 1400 configured to carry a 3D imager 1410 that includes a line projector 1412 , a first camera 1414 , and a second camera 1416 .
- the first camera is configured to cooperate with the line projector to determine 3D coordinates of object points intersected by the projected line.
- the projector 1412 corresponds to the projector 1320 of FIG. 13
- the camera 1414 corresponds to the camera 1340 of FIG. 13 .
- the second camera 1416 is a color camera configured to respond to visible light.
- the camera 1416 is configured to respond to light from a stationary external projector, thereby enabling registration of the multiple 3D data sets collected by the laser scanner over different positions and orientations of the drone 1400 .
- the 3D imager 1410 is configured to work with a computer 626 , which might be a personal computer.
- FIG. 15 illustrates a measurement scenario 1500 in which a drone 1400 that includes a 3D imager 1410 as shown in FIG. 14 projects a beam of light 1510 that produces a line of light 1512 on a surface of an object 640 .
- This image of the line of light 1512 is captured by the camera 1414 and is used to determine 3D coordinates of the line of light on the surface of the object 640 in the frame of reference of the 3D imager 1410 .
- the color camera 1416 has a relatively larger angular view 1520 that covers an area 1522 on the object surface. The color camera captures several spots from the stationary external projector 1000 , which may be a rotatable projector.
- the color camera captures the line of light projected by the scanner as well as the spots projected by the external projector 1000 .
- the color camera includes an optical filter to block out the wavelength of the projected line of light.
- the drone will fly relatively close to the object 640 when projecting a line of laser light, as this permits maximum accuracy to be obtained.
- the laser line probe flies along a smooth path, covering the surface with the line of light, which produces a collection of 3D coordinates over a surface area.
- the drone lands before projecting the laser light 1512 .
- a high accuracy rotatable mount is provided to steer the beam of light 1510 from the 3D imager 1410.
- the angular accuracy of the beam steering mechanism is high, for example, 5 microradians, thereby enabling high registration accuracy of measured 3D coordinates, even if registration spots from the beams 1010 are not provided.
- the line projector 1412 may be replaced by an area projector.
- the area projector projects single shot coded patterns of light that are captured by a camera to determine 3D coordinate values of the object 640 .
- the drone lands before measurements begin and the drone projects a sequence of patterns that are evaluated to determine 3D coordinates of the object 640 to relatively high accuracy.
- One such sequential measurement method known in the art is the phase-shift method in which optical power of projected light is modulated sinusoidally along one direction and the phase of the sinusoidal pattern shifted side-to-side at least three times.
- the resulting optical powers collected at each point for each of the three or more phase shifts is sufficient to enable determination of 3D coordinates.
- An advantage of this method is high rejection of background light.
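The per-pixel phase recovery for a three-step version of this method can be sketched as follows (standard three-step formula with assumed shifts of -120, 0, and +120 degrees; the sample values are illustrative):

```python
import math

def three_step_phase(i1, i2, i3):
    """Phase at one pixel from three optical-power samples taken with
    sinusoidal phase shifts of -120, 0, and +120 degrees.  Background
    light adds equally to all three samples and cancels in the
    differences, which is why the method rejects background light well."""
    return math.atan2(math.sqrt(3.0) * (i1 - i3), 2.0 * i2 - i1 - i3)

# Samples of a sinusoid with offset 100, amplitude 50, and phase 0.5 rad
shift = 2.0 * math.pi / 3.0
i1 = 100 + 50 * math.cos(0.5 - shift)
i2 = 100 + 50 * math.cos(0.5)
i3 = 100 + 50 * math.cos(0.5 + shift)
phase = three_step_phase(i1, i2, i3)   # recovers 0.5 rad
```

The recovered phase at each pixel, combined with the triangulation geometry, yields the 3D coordinate for that pixel.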
- a digital micromirror device (DMD) is used to produce desired patterns such as coded patterns or phase-shifted sinusoidal patterns.
- all cameras and projectors include optical filters to pass desired wavelengths of light and block unwanted wavelengths.
- optical filters may be thin-film dielectric coatings applied to windows or lenses.
- Another method to minimize the effect of background light is to pulse the projected light and to reduce the camera exposure time in correspondence.
- Another possibility is to use a fast lock-in amplifier, in which the optical power of light is modulated and then filtered to extract the modulation frequency.
- Image chips with such a lock-in amplifying functionality are made, for example, by the company Heliotis.
- the projected optical power is varied according to the distance from the object.
- laser light may be sent through a diffraction grating to obtain spots that remain relatively small and sharply defined over relatively large distances.
- a diffraction grating is configured to produce some spots that are more powerful than others.
- a drone may be restricted to a safety region in which humans are not allowed.
- a drone may have a parachute configured to open in the event of a rapid fall, which might be detected, for example, by an inertial measurement unit (IMU).
- a drone may be attached to a tether.
- FIG. 16 illustrates a measurement scenario 1600 in which a drone 1610 includes a 3D imager 1620 configured to measure 3D coordinates of an object 1630 in cooperation with a projector system 1010 .
- the 3D imager 1620 is configured to cooperate with a computing device 626 and may include, for example, any of the 3D measuring devices 530 , 300 , 1220 , or 1410 of FIGS. 5, 7, 12, and 14 , respectively.
- the projector system 1010 is configured to sit in a fixed position, for example, on a pillar 1030 .
- the projector system 1010 is configured to rotate about two axes, for example about the axes 1020 and 1022 .
- the projector system is configured to project beams of light 1012 to form a collection of spots 635 in a grid pattern.
- the projector system 1010 may project any other pattern of light.
- the drones are temporarily mounted on magnetic mounts or other structures.
- FIG. 17 illustrates a measurement scenario 1700 in which a drone 700 that includes a 3D imager 300 and computing device 626 operates without imaging a pattern of light projected onto the object by an external projector 630 .
- a 3D imager 300 described herein above in reference to FIGS. 3 and 7 has the capability of imaging an object at near and far distances, especially if provision is made to vary the optical power of the projected light. It may be possible, for example, for a measurement device to measure at distances of 6 meters at relatively high power and relatively low resolution while measuring at a distance of 0.5 meters at relatively lower power with improved resolution and accuracy.
- the camera 390 is capable of measuring over a FOV of around 60 degrees, thereby enabling many target features to be captured as cardinal points for use in registering multiple images using videogrammetry techniques.
- because of the advantages provided by the multiple epipolar constraints that must be satisfied simultaneously, as explained herein above with reference to FIG. 4B, it is possible to obtain relatively good registration and relatively good 3D accuracy even without the projection of a pattern of light by an external projector such as the projector 630.
- the drone 700 may be able to provide its own guidance control, perhaps in conjunction with maps or CAD models provided to it through a computer or network interface.
- an additional 3D measuring device such as a laser tracker, total station, or a time-of-flight (TOF) laser scanner is used to determine 3D coordinates of reference markers having a fixed position relative to an object being measured.
- a TOF laser scanner is a device that measures a distance and two angles to a diffusely scattering target.
- the term TOF in this instance means that the scanner measures the time required for an emitted light signal to reach the target and return to the scanner. The speed of light in air is used in combination with the determined time to calculate the distance to the target.
- This TOF method is distinguished from the method used by a triangulation scanner (3D imager) which is based on a triangulation calculation and does not depend directly on the speed of light in air.
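The TOF distance calculation described above is simply half the round-trip time multiplied by the speed of light in air. A sketch (the default speed value, appropriate for roughly standard atmospheric conditions, is an assumption; an instrument corrects for actual air temperature and pressure):

```python
def tof_distance(round_trip_time, c_air=299_702_547.0):
    """Distance from a TOF scanner to the target: half the measured
    round-trip time (s) multiplied by the speed of light in air (m/s)."""
    return 0.5 * round_trip_time * c_air

d = tof_distance(200e-9)   # a 200 ns round trip corresponds to ~30 m
```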
- a laser tracker typically has an accuracy of 100 micrometers or better, while total stations and TOF laser scanners typically have accuracies of a few millimeters.
- a laser tracker may be used to determine that the distance between two parallel walls is 10 meters. This distance value is then probably known to an accuracy of about 100 micrometers.
- a 3D imager 300 attached to a drone 700 may know this distance to within about a centimeter or perhaps a few centimeters over this 10 meter distance. The distance between the walls as measured by the tracker may then be transferred to the measurements made by the 3D imager 300 between the walls, and the resulting coordinate values rescaled according to the more accurate distance measured by the laser tracker.
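The rescaling described above amounts to multiplying all imager coordinates by the ratio of the tracker-measured distance to the imager-measured distance. A hypothetical helper (names are illustrative):

```python
def rescale_to_reference(points, imager_distance, reference_distance):
    """Rescale 3D coordinates from the drone-mounted imager so that the
    wall-to-wall distance it measured (imager_distance) matches the more
    accurate value from the laser tracker (reference_distance)."""
    s = reference_distance / imager_distance
    return [(s * x, s * y, s * z) for (x, y, z) in points]

# Imager measured the 10 m wall spacing as 10.02 m: shrink by 10/10.02
corrected = rescale_to_reference([(10.02, 0.0, 0.0)], 10.02, 10.0)
```

The correction assumes the imager's scale error is uniform over the measured volume, which is why a single reference distance suffices.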
- a total station or a TOF laser scanner may be used to determine a distance between two parallel walls to an accuracy of a few millimeters, which is still usually expected to be several times better than the accuracy over the same distance as measured by a drone 700 .
- coordinate measurements made by a relatively accurate reference coordinate measuring device, such as a laser tracker, total station, or TOF scanner, may enable improvement in measurements made by a drone that includes a 3D imager. In most cases this improvement is possible even if an external projector is used in combination with the drone-carried 3D imager.
- a computer 626 establishes a Remote Desktop link to a personal computer in possession of an operator. The operator then sees whatever data is collected and processed by the computer 626 .
- data is sent to a remote computer or computer network over a high speed wireless data link for further processing.
- data is sent to a remote computer over a wired link such as an Ethernet link.
- a wired link is practical, for example, if the drone is attached to a tether.
- detailed data are stored in memory on the drone for post-processing, while 2D and 3D data are transmitted to a remote computing device wirelessly but at a lower frame rate, for example, at 3 or 4 frames per second.
- FIG. 18 further describes the computing system associated with the drone systems described herein above.
- the drone system which may be any of the drone systems described herein above, includes computing elements 626 .
- the computing elements 626 will include at least a processor, a memory, and a communication interface.
- the computing elements 626 are equivalent to a personal computer so that a Remote Desktop link may be established with a remote computer.
- the computing elements 626 are part of an overall computing system 1800 that may optionally include one or more additional computing elements 1810 and 1820 and a network computing component (the Cloud) 1830.
- the network interface provided by the computing elements 626 may support wired or wireless communication between the computing elements 626 and the optional computing elements 1810 , 1820 , and 1830 .
- a three-dimensional (3D) coordinate measuring system includes an external projector configured to operate while stationary and to project a first projected pattern of light onto an object.
- the system further includes an aerial measuring device that includes an aerial drone and a 3D imaging device, the 3D imaging device being attached to the aerial drone, the aerial measuring device being separate from the external projector, the aerial measuring device being configured to cooperate with the external projector to determine 3D coordinates of the object.
- the 3D imaging device further comprises a first two-dimensional (2D) camera configured to capture a first 2D image from a first position of the aerial drone and a second 2D image from a second position of the aerial drone, the second position being different than the first position.
- the 3D imaging device is configured to identify three cardinal points common to the first image and the second image, each of the three cardinal points derived from the first projected pattern of light on the object or from natural features of the object, at least one of the three cardinal points being derived from the first projected pattern of light.
- the 3D imaging device is further configured to determine first 3D coordinates of a first object point, second 3D coordinates of a second object point, and third 3D coordinates of a third object point, the determined first 3D coordinates, determined second 3D coordinates, and determined third 3D coordinates based at least in part on the three cardinal points in the first image and the three cardinal points in the second image.
- the external projector is configured to project a first collection of spots of light onto the object.
- the external projector comprises a source of laser light and a diffraction grating, the external projector being configured to project laser light through the diffraction grating and onto the object.
- the external projector is configured to adjust power of the laser light according to the amount of light received by pixels of the first 2D camera.
- the system further includes an internal projector configured to project a second pattern of projected light through a projector perspective center onto the object.
- a second 2D camera is configured to form a second camera image of the second pattern of projected light on the object, the second 2D camera having a camera perspective center, there being a baseline between the projector perspective center and the camera perspective center, the length of the baseline being a baseline distance.
- the 3D imaging device is further configured to determine the 3D coordinates of the object based at least in part on the second pattern of projected light, the second camera image, and the baseline distance.
- the first 2D camera is a color camera.
- the internal projector is configured to project a pulsed pattern of light.
- the external projector is further configured to be moved by a motorized device.
- the motorized device is a mobile platform having motorized wheels.
- the motorized device is an aerial drone.
- the aerial drone is a helicopter. In an embodiment the aerial drone is a quadcopter. In still another embodiment, the aerial drone is a fixed wing aircraft. In an embodiment, the second pattern of projected light is a line of light.
- the external projector includes a first mechanism configured to steer light from the external projector into a plurality of directions.
- the first mechanism is configured to steer the light from the external projector about two orthogonal axes.
- a projector angular transducer is provided to measure an angle of rotation of the first mechanism.
- the aerial drone device further includes a second mechanism configured to steer the 3D imaging device into a plurality of directions.
- the second mechanism is further configured to steer the 3D imaging device about two orthogonal axes.
- an angular transducer is provided to measure an angle of rotation of the second mechanism.
- the system is further configured to detect motion of the external projector relative to the object, the determination based at least in part on movement of the first projected pattern of light relative to cardinal points that are based on natural features of the object. In an embodiment, the system is further configured to determine a pose of the external projector based at least in part on the first projected pattern of light and on those cardinal points that are obtained from natural features of the object. In an embodiment, the system is further configured to detect motion of the external projector relative to the object, the relative motion determined based at least in part on the first projected pattern of light, an observed movement of the first projected pattern of light, the second pattern of projected light, the second camera image, and the baseline distance. In an embodiment, the system is configured to determine a pose of the external projector based at least in part on the first projected pattern of light, the observed movement of the first projected pattern of light, the second pattern of projected light, the second camera image, and the baseline distance.
- a three-dimensional (3D) coordinate measuring system includes an external projector configured to project a first projected pattern of light onto an object.
- the system further includes an aerial measuring device in a device frame of reference, the aerial measuring device including a triangulation scanner and a registration camera, the triangulation scanner including an internal projector and a first camera, the triangulation scanner configured to determine 3D coordinates of the object in the device frame of reference in a first instance and a second instance, the registration camera configured to image the first projected pattern of light in the first instance and the second instance, the aerial measuring device configured to register in a common frame of reference the determined 3D coordinates in the first instance and in the second instance based at least in part on the registration camera image of the first projected pattern of light in the first instance and the second instance.
- the first projected pattern of light is projected from the external projector while the external projector is stationary.
- the external projector is attached to a motorized mobile platform configured to move the external projector to multiple positions.
- the motorized mobile platform includes motorized wheels. In an embodiment, the motorized mobile platform is a drone. In an embodiment, the motorized mobile platform is a helicopter. In an embodiment, the motorized mobile platform is a quadcopter. In an embodiment, the motorized mobile platform is configured to move the external projector in such a way that a portion of the first projected pattern is common to images obtained by the registration camera before and after movement of the external projector.
- a rotation mechanism is configured to rotate the first projected pattern of light about a first axis.
- the rotation mechanism is configured to rotate the first projected pattern of light about a second axis.
- an angle transducer is configured to measure an angle of rotation of the first projected pattern of light about the first axis.
- the rotation mechanism is configured to rotate the first projected pattern of light about the first axis to put the first projected pattern of light in a field-of-view of the registration camera.
- the motorized mobile platform is configured to move the external projector to put the first projected pattern of light in a field-of-view of the registration camera.
- the motorized mobile platform is configured to move the external projector in such a way that a portion of the first projected pattern is common to images obtained by the registration camera before and after movement of the external projector.
- the rotation mechanism is configured to rotate the first projected pattern of light in such a way that a portion of the first projected pattern is common to images obtained by the registration camera before and after movement of the external projector.
- the first projected pattern of light includes a collection of illuminated dots.
- the motorized mobile platform is configured to move the external projector so as to keep the number of dots seen by the registration camera above a specified minimum number and below a specified maximum number.
- the rotation mechanism is configured to rotate the first projected pattern of light so as to keep the number of dots seen by the registration camera above a specified minimum number and below a specified maximum number.
- the optical power of the projected dots is adjusted to keep the light level received by pixels in the registration camera within a specified range.
- the registration camera is a color camera.
- a three-dimensional (3D) coordinate measuring system includes an aerial drone attached to a triangulation scanner and a registration camera.
- the triangulation scanner including a first projector, a first camera, and a second camera, the first projector, the first camera, and the second camera arranged in a triangle.
- the first projector, the first camera, and the second camera having first epipolar constraints.
- the first projector configured to project a projected first pattern of light on the object.
- the first camera configured to form a first image of the projected first pattern of light on the object.
- the second camera configured to form a second image of the projected first pattern of light on the object.
- the system further configured to determine 3D coordinates of an object based at least in part on the projected first pattern of light, the first image, the second image, and the first epipolar constraints, the registration camera configured to obtain an image of the object and to extract from the image cardinal points based on natural features of the object.
- the system further being configured to register multiple sets of the 3D coordinates obtained from the triangulation scanner, the registration based at least in part on matching of common cardinal points present in successive images of the registration camera.
- the aerial drone is configured to fly according to navigation signals, the navigation signals based at least in part on data from the registration camera.
- the aerial drone is further configured to fly nearer to an object or farther from an object according to a level of feature detail present in the images of the registration camera.
- the registration camera is a color camera and the 3D coordinate measuring system is further configured to place the observed colors on the registered 3D coordinates.
- the system further includes a drone computer configured to communicate with a remote computer by Remote Desktop mode, the drone computer configured to enable a remote user to view images obtained from the aerial drone.
- the system further includes a drone computer configured to communicate with a remote computer through wired or wireless signals.
- the system further includes constraints to keep the drone located in safety zones, the constraints being selected from the group consisting of a tether and a barrier.
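The dot-count and optical-power embodiments above amount to a simple feedback policy: move the platform when the registration camera sees too few or too many dots, and rescale projector power when the received pixel levels drift out of range. The sketch below illustrates one way such logic might look; the thresholds and the function interface are hypothetical, not taken from the specification.

```python
# Illustrative control sketch for the embodiments above: keep the number of
# projected dots seen by the registration camera between a minimum and a
# maximum, and keep the received light level in the camera's linear range.
# All thresholds and the interface are assumptions, not from the patent.

def plan_adjustment(num_dots, mean_pixel_level,
                    min_dots=20, max_dots=200,
                    level_lo=0.2, level_hi=0.8):
    """Suggest a platform move and a projector power scale factor."""
    if num_dots < min_dots:
        move = "closer"    # too few dots in view for reliable registration
    elif num_dots > max_dots:
        move = "farther"   # too many dots makes dot matching ambiguous
    else:
        move = "hold"
    # Scale optical power so the mean pixel level lands inside the range.
    if mean_pixel_level < level_lo:
        power_scale = level_lo / max(mean_pixel_level, 1e-6)
    elif mean_pixel_level > level_hi:
        power_scale = level_hi / mean_pixel_level
    else:
        power_scale = 1.0
    return move, power_scale
```

In a real system the returned suggestions would feed the motorized platform (or rotation mechanism) and the projector driver, respectively.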
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Remote Sensing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Radar, Positioning & Navigation (AREA)
- Aviation & Aerospace Engineering (AREA)
- Microelectronics & Electronic Packaging (AREA)
- Mechanical Engineering (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
A three-dimensional (3D) coordinate measuring system is provided. The system includes an aerial measuring device that has an aerial drone and a 3D measurement device. The 3D measurement device is rotatably attached to the aerial drone, and the aerial drone is movable from a first position to a stationary second position. The 3D measurement device is configured to optically measure points on the surface of an object. The system further includes one or more processors configured to execute nontransitory computer readable instructions. The computer readable instructions comprise: moving the aerial measuring device from the first position; landing the aerial measuring device at the second position; rotating the 3D measurement device to optically measure a first object point; and determining first 3D coordinates of the first object point with the 3D measurement device.
Description
- The present Application is a continuation application of U.S. application Ser. No. 15/991,433 filed on May 29, 2018, which is a continuation of U.S. application Ser. No. 15/250,324 filed on Aug. 29, 2016, now U.S. Pat. No. 9,989,357, which claims the benefit of U.S. Provisional Application Ser. No. 62/215,978 filed on Sep. 9, 2015, U.S. Provisional Application Ser. No. 62/216,021 filed on Sep. 9, 2015, and U.S. Provisional Application Ser. No. 62/216,027 filed on Sep. 9, 2015, the contents of all of which are incorporated herein by reference in their entirety.
- The subject matter disclosed herein relates in general to a system for measuring three-dimensional (3D) coordinates using a scanner device operated from a mover apparatus such as a drone.
- A 3D imager uses a triangulation method to measure the 3D coordinates of points on an object. The 3D imager usually includes a projector that projects onto a surface of the object either a pattern of light in a line or a pattern of light covering an area. A camera is coupled to the projector in a fixed relationship, for example, by attaching a camera and the projector to a common frame. The light emitted from the projector is reflected off of the object surface and detected by the camera. Since the camera and projector are arranged in a fixed relationship, the distance to the object may be determined using trigonometric principles. Compared to coordinate measurement devices that use tactile probes, triangulation systems provide advantages in quickly acquiring coordinate data over a large area. As used herein, the resulting collection of 3D coordinate values or data points of the object being measured by the triangulation system is referred to as point cloud data or simply a point cloud.
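The trigonometric step can be illustrated with the law of sines: knowing the baseline length between the projector and camera perspective centers and the angles the projected and reflected rays make with the baseline, the distance to the object point follows directly. The function below is a minimal sketch under that convention; the names are illustrative, not taken from the specification.

```python
import math

def triangulate_range(baseline, angle_projector, angle_camera):
    """Distance from the camera's perspective center to the object point.

    baseline        -- length of the segment joining the two perspective centers
    angle_projector -- angle between the baseline and the projected ray
    angle_camera    -- angle between the baseline and the reflected ray

    By the law of sines, the triangle side opposite the projector angle
    (camera center to object point) has length
    baseline * sin(angle_projector) / sin(angle at the object point).
    """
    angle_object = math.pi - angle_projector - angle_camera
    return baseline * math.sin(angle_projector) / math.sin(angle_object)
```

For example, with a 2 m baseline and both rays at 45 degrees to the baseline, the object point sits sqrt(2) m from the camera.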
- A 3D imager may be attached to a variety of mover devices such as robotic devices and aerial drones. A method known as videogrammetry provides a way to register multiple 3D data sets when there is relative motion between the 3D imager and an object being measured. Videogrammetry can further be used to provide the data needed to directly determine 3D coordinates when multiple two-dimensional (2D) images are captured with the camera at different positions relative to the object being measured. In this case, videogrammetry is further making use of triangulation principles. Hence the term videogrammetry as used herein is understood to further encompass triangulation.
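Registering two 3D data sets from matched common points amounts to finding the rigid transform between the two scanner poses. The specification does not name a particular algorithm, so the sketch below uses the classical SVD-based (Kabsch) best-fit method as one standard choice, assuming point correspondences are already known.

```python
import numpy as np

def kabsch(P, Q):
    """Best-fit rotation R and translation t such that Q ~= P @ R.T + t.

    P, Q -- (n, 3) arrays of corresponding 3D points in two frames.
    Uses the SVD of the cross-covariance matrix, with a determinant
    check to exclude reflections.
    """
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)            # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t
```

Applied to overlapping scans, the recovered R and t move each new point cloud into the common frame of reference of the first.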
- A particular issue that may be encountered when using triangulation (which may further include videogrammetry) is lack of accuracy and detail when distances from a 3D imager to an object are large or variable. Accordingly, while existing triangulation-based 3D imager devices are suitable for their intended purpose, the need for improvement remains.
- According to an embodiment of the present invention, a three-dimensional (3D) coordinate measuring system is provided. The system includes an aerial measuring device that has an aerial drone and a 3D measurement device. The 3D measurement device is rotatably attached to the aerial drone, and the aerial drone is movable from a first position to a stationary second position. The 3D measurement device is configured to optically measure points on the surface of an object. The system further includes one or more processors configured to execute nontransitory computer readable instructions. The computer readable instructions comprise: moving the aerial measuring device from the first position; landing the aerial measuring device at the second position; rotating the 3D measurement device to optically measure a first object point; and determining first 3D coordinates of the first object point with the 3D measurement device.
- The subject matter, which is regarded as the invention, is particularly pointed out and distinctly claimed in the claims at the conclusion of the specification. The foregoing and other features and advantages of the invention are apparent from the following detailed description taken in conjunction with the accompanying drawings in which:
-
FIGS. 1A and 1B are block diagrams of a 3D imager and a stereo camera pair, respectively, according to an embodiment; -
FIG. 2 is a perspective view of a 3D imager having two cameras and a projector according to an embodiment; -
FIG. 3 is a perspective view of a 3D imager having two cameras and a projector according to an embodiment; -
FIGS. 4A and 4B show epipolar geometry for two reference planes and three reference planes, respectively, according to an embodiment; -
FIG. 5 is a perspective view of an aerial drone carrying a 2D camera; -
FIGS. 6 and 6B illustrate measurement scenarios in which an aerial drone carrying a 2D camera is used in combination with an external projector to obtain 3D coordinates of an object according to an embodiment; -
FIG. 7 is a perspective view of an aerial drone that includes a 3D imager and a registration camera according to an embodiment; -
FIGS. 8 and 9 illustrate measurement scenarios in which an aerial drone having a 3D imager and a registration camera is used with an external projector moved to two different positions to determine 3D coordinates of an object according to an embodiment; -
FIG. 10 is a perspective view of a rotatable external projector according to an embodiment; -
FIG. 11 illustrates a measurement scenario in which a rotation mechanism of the external projector directs a projected pattern of light onto an object at a desired location according to an embodiment; -
FIG. 12 illustrates a measurement scenario in which the 3D imager on an aerial drone is further redirected by a rotation mechanism according to an embodiment; -
FIG. 13 is a schematic representation illustrating the principle of operation of a line scanner; -
FIG. 14 is a perspective view of an aerial drone carrying a line scanner and a registration camera according to an embodiment; -
FIG. 15 illustrates a measurement scenario in which an aerial drone projects a line of light on an object, while an external projector projects a pattern of light onto the object according to an embodiment; -
FIG. 16 illustrates a measurement scenario in which an aerial drone is used together with an external projector to measure an object indoors; -
FIG. 17 illustrates a measurement scenario in which a 3D imager having two cameras and an internal projector in a triangulation pattern are used in combination with a registration camera to obtain 3D coordinates of an object without use of an external projector according to an embodiment; and -
FIG. 18 is a schematic representation of a computing system according to an embodiment. - The detailed description explains embodiments of the invention, together with advantages and features, by way of example with reference to the drawings.
- Embodiments of the present invention provide advantages in measuring large objects with 3D imagers, obtaining relatively high accuracy while providing color (texture) information.
-
FIG. 1A shows a triangulation scanner (3D imager) 100A that projects a pattern of light over an area on a surface 130A. Another name for a structured light triangulation scanner is a 3D imager. The scanner 100A, which has a frame of reference 160A, includes a projector 110A and a camera 120A. The projector 110A includes an illuminated projector pattern generator 112A, a projector lens 114A, and a perspective center 118A through which a ray of light 111A emerges. The ray of light 111A emerges from a corrected point 116A having a corrected position on the pattern generator 112A. In an embodiment, the point 116A has been corrected to account for aberrations of the projector, including aberrations of the lens 114A, in order to cause the ray to pass through the perspective center 118A, thereby simplifying triangulation calculations.
- The ray of light 111A intersects the surface 130A in a point 132A, which is reflected (scattered) off the surface and sent through the camera lens 124A to create a clear image of the pattern on the surface 130A on the surface of a photosensitive array 122A. The light from the point 132A passes in a ray 121A through the camera perspective center 128A to form an image spot at the corrected point 126A. The image spot is corrected in position to account for aberrations in the camera lens. A correspondence is obtained between the point 126A on the photosensitive array 122A and the point 116A on the illuminated projector pattern generator 112A. As explained herein below, the correspondence may be obtained by using a coded or an uncoded pattern, which may in some cases be projected sequentially. Once the correspondence is known, the angles a and b in FIG. 1A may be determined. The baseline 140A, which is a line segment drawn between the perspective centers 118A and 128A, has a length C. Knowing the angles a, b and the length C, all the angles and side lengths of the triangle 128A-132A-118A may be determined. Digital image information is transmitted to a processor 150A, which determines 3D coordinates of the surface 130A. The processor 150A may also instruct the illuminated pattern generator 112A to generate an appropriate pattern. The processor 150A may be located within the scanner assembly, in an external computer, or in a remote server, as discussed further herein below in reference to FIG. 18.
FIG. 1B shows a stereo camera 100B that receives a pattern of light from an area on a surface 130B. The stereo camera 100B, which has a frame of reference 160B, includes a first camera 120B and a second camera 170B. The first camera 120B includes a first camera lens 124B and a first photosensitive array 122B. The first camera 120B has a first camera perspective center 128B through which a ray of light 121B passes from a point 132B on the surface 130B onto the first photosensitive array 122B as a corrected image spot 126B. The image spot is corrected in position to account for aberrations in the camera lens.
- The second camera 170B includes a second camera lens 174B and a second photosensitive array 172B. The second camera 170B has a second camera perspective center 178B through which a ray of light 171B passes from the point 132B onto the second photosensitive array 172B as a corrected image spot 176B. The image spot is corrected in position to account for aberrations in the camera lens.
- A correspondence is obtained between the point 126B on the first photosensitive array 122B and the point 176B on the second photosensitive array 172B. As explained herein below, the correspondence may be obtained using videogrammetry or other methods. Once the correspondence is known, the angles a and b in FIG. 1B may be determined. The baseline 140B, which is a line segment drawn between the perspective centers 128B and 178B, has a length C. Knowing the angles a, b and the length C, all the angles and side lengths of the triangle 128B-132B-178B may be determined. Digital image information is transmitted to a processor 150B, which determines 3D coordinates of the surface 130B. The processor 150B may be located within the stereo camera assembly, in an external computer, or in a remote server, as discussed further herein below in reference to FIG. 18.
FIG. 2 shows a structured light triangulation scanner 200 having a projector 250, a first camera 210, and a second camera 230. The projector 250 creates a pattern of light on a pattern generator plane 252, which it projects from a corrected point 253 on the pattern through a perspective center 258 (point D) of the lens 254 onto an object surface 270 at a point 272 (point F). The point 272 is imaged by the first camera 210 by receiving a ray of light from the point 272 through a perspective center 218 (point E) of a lens 214 onto the surface of a photosensitive array 212 of the camera as a corrected point 220. The point 220 is corrected in the read-out data by applying a correction factor to remove the effects of lens aberrations. The point 272 is likewise imaged by the second camera 230 by receiving a ray of light from the point 272 through a perspective center 238 (point C) of the lens 234 onto the surface of a photosensitive array 232 of the second camera as a corrected point 235. It should be understood that any reference to a lens in this document refers not only to an individual lens but to a lens system, including an aperture within the lens system.
- The inclusion of two cameras 210 and 230 in the system 200 provides advantages over the device of FIG. 1A that includes a single camera. One advantage is that each of the two cameras has a different view of the point 272 (point F). Because of this difference in viewpoints, it is possible in some cases to see features that would otherwise be obscured, for example, by seeing into a hole or behind a blockage. In addition, it is possible in the system 200 of FIG. 2 to perform three triangulation calculations rather than a single triangulation calculation, thereby improving measurement accuracy. A first triangulation calculation can be made between corresponding points in the two cameras using the triangle CEF with the baseline B3. A second triangulation calculation can be made based on corresponding points of the first camera and the projector using the triangle DEF with the baseline B2. A third triangulation calculation can be made based on corresponding points of the second camera and the projector using the triangle CDF with the baseline B1. The optical axis of the first camera 210 is 216, and the optical axis of the second camera 230 is 236.
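The specification does not say how the three triangulation calculations are combined to improve accuracy, but one standard option is an inverse-variance weighted average of the three independent range estimates. The sketch below shows that option with illustrative names; it is not the patent's prescribed method.

```python
# Combine independent range estimates of the same point, e.g. from the
# camera-camera, first camera-projector, and second camera-projector
# triangulations. Weighting by inverse variance is one common choice.

def fuse_estimates(values, sigmas):
    """Inverse-variance weighted combination of independent estimates.

    values -- the individual estimates of the same quantity
    sigmas -- their one-sigma uncertainties
    Returns the weighted mean and its combined uncertainty.
    """
    weights = [1.0 / s ** 2 for s in sigmas]
    total = sum(weights)
    mean = sum(w * v for w, v in zip(weights, values)) / total
    return mean, (1.0 / total) ** 0.5
```

Because the combined uncertainty shrinks as estimates are added, three consistent triangulations yield a tighter result than any single one.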
FIG. 3 shows a 3D imager 300 having two cameras and a projector 350 arranged in a triangle A1-A2-A3. In an embodiment, the 3D imager 300 of FIG. 3 further includes a camera 390 that may be used to provide color (texture) information for incorporation into the 3D image. In addition, the camera 390 may be used to register multiple 3D images through the use of videogrammetry.
FIG. 2 . The additional information may be understood in reference toFIG. 4A , which explain the concept of epipolar constraints, andFIG. 4B , which explains how epipolar constraints are advantageously applied to the triangular arrangement of the3D imager 300. InFIG. 4A , a3D triangulation instrument 440 includes adevice 1 and adevice 2 on the left and right sides, respectively, ofFIG. 4A .Device 1 anddevice 2 may be two cameras ordevice 1 anddevice 2 may be one camera and one projector. Each of the two devices, whether a camera or a projector, has a perspective center, O1 and O2, and a reference plane, 430 or 410. The perspective centers are separated by a baseline distance B, which is the length of theline 402 between O1 and O2. The concept of perspective center is discussed in more detail in reference toFIGS. 13C, 13D, and 13E . The perspective centers O1, O2 are points through which rays of light may be considered to travel, either to or from a point on an object. These rays of light either emerge from an illuminated projector pattern, such as the pattern on illuminatedprojector pattern generator 112A ofFIG. 1A , or impinge on a photosensitive array, such as thephotosensitive array 122A ofFIG. 1A . As can be seen inFIG. 1A , thelens 114A lies between the illuminated object point 932 and plane of the illuminated objectprojector pattern generator 112A. Likewise, the lens 924A lies between theilluminated object point 132A and the plane of thephotosensitive array 122A, respectively. However, the pattern of the front surface planes ofdevices lenses FIG. 4A , which shows the reference planes 430, 410 between the object point and the perspective centers O1, O2. - In
FIG. 4A , for thereference plane 430 angled toward the perspective center O2 and thereference plane 410 angled toward the perspective center O1, aline 402 drawn between the perspective centers O1 and O2 crosses theplanes plane 430. Ifdevice 1 is a camera, it is known that an object point that produces the point UD on the image must lie on theline 438. The object point might be, for example, one of the points VA, VB, VC, or VD. These four object points correspond to the points WA, WB, WC, WD, respectively, on thereference plane 410 ofdevice 2. This is true whetherdevice 2 is a camera or a projector. It is also true that the four points lie on astraight line 412 in theplane 410. This line, which is the line of intersection of thereference plane 410 with the plane of O1-O2-UD, is referred to as theepipolar line 412. It follows that any epipolar line on thereference plane 410 passes through the epipole E2. Just as there is an epipolar line on the reference plane ofdevice 2 for any point on the reference plane ofdevice 1, there is also anepipolar line 434 on the reference plane ofdevice 1 for any point on the reference plane ofdevice 2. -
FIG. 4B illustrates the epipolar relationships for a 3D imager 490 corresponding to the 3D imager 300 of FIG. 3, in which two cameras and one projector are arranged in a triangular pattern. In general, device 1, device 2, and device 3 may be any combination of cameras and projectors as long as at least one of the devices is a camera. Each of the three devices has a perspective center and a reference plane 460, 470, or 480. Device 1 and device 2 have epipoles E12, E21 on the planes 460, 470, respectively. Device 1 and device 3 have epipoles E13, E31, respectively, on the planes 460, 480. Device 2 and device 3 have epipoles E23, E32 on the planes 470, 480, respectively. Each reference plane thus includes two epipoles. The reference plane for device 1 includes epipoles E12 and E13. The reference plane for device 2 includes epipoles E21 and E23. The reference plane for device 3 includes epipoles E31 and E32.
- Consider the situation of FIG. 4B in which device 3 is a projector, device 1 is a first camera, and device 2 is a second camera. Suppose that a projection point P3, a first image point P1, and a second image point P2 are obtained in a measurement. These results can be checked for consistency in the following way.
- To check the consistency of the image point P1, intersect the plane P3-E31-E13 with the reference plane 460 to obtain the epipolar line 464. Intersect the plane P2-E21-E12 with the reference plane 460 to obtain the epipolar line 462. If the image point P1 has been determined consistently, the observed image point P1 will lie on the intersection of the calculated epipolar lines 462 and 464.
- To check the consistency of the image point P2, intersect the plane P3-E32-E23 with the reference plane 470 to obtain the epipolar line 474. Intersect the plane P1-E12-E21 with the reference plane 470 to obtain the epipolar line 472. If the image point P2 has been determined consistently, the observed image point P2 will lie on the intersection of the calculated epipolar lines 472 and 474.
- To check the consistency of the projection point P3, intersect the plane P2-E23-E32 with the reference plane 480 to obtain the epipolar line 484. Intersect the plane P1-E13-E31 with the reference plane 480 to obtain the epipolar line 482. If the projection point P3 has been determined consistently, the projection point P3 will lie on the intersection of the calculated epipolar lines 482 and 484.
3D imager 300 having a triangular arrangement of projector and cameras may be used to reduce measurement time, to identify errors, and to automatically update compensation/calibration parameters. - The triangular arrangement of the
3D imager 300 may also be used to automatically update compensation/calibration parameters. Compensation parameters are numerical values stored in memory, for example, in an internal electrical system of an 3D measurement device or in another external computing unit. Such parameters may include the relative positions and orientations of the cameras and projector in the 3D imager. The compensation parameters may relate to lens characteristics such as lens focal length and lens aberrations. They may also relate to changes in environmental conditions such as temperature. Sometimes the term calibration is used in place of the term compensation. Often compensation procedures are performed by the manufacturer to obtain compensation parameters for a 3D imager. In addition, compensation procedures are often performed by a user. User compensation procedures may be performed when there are changes in environmental conditions such as temperature. User compensation procedures may also be performed when projector or camera lenses are changed or after then instrument is subjected to a mechanical shock. - Inconsistencies in results based on epipolar calculations for a 3D imager 1290 may indicate a problem in compensation parameters, which are numerical values stored in memory. Compensation parameters are used to correct imperfections or nonlinearities in the mechanical, optical, or electrical system to improve measurement accuracy. In some cases, a pattern of inconsistencies may suggest an automatic correction that can be applied to the compensation parameters. In other cases, the inconsistencies may indicate a need to perform user compensation procedures.
- It is often desirable to integrate color information into 3D coordinates obtained from a triangulations scanner (3D imager). Such color information is sometimes referred to as “texture” information since it may suggest the materials being imaged or reveal additional aspects of the scene such as shadows. Usually such color (texture) information is provided by a color camera separated from the camera in the triangulation scanner (i.e., the triangulation camera). An example of a separate color camera is the
camera 390 in the3D imager 300 ofFIG. 3 . - In some cases, it is desirable to supplement 3D coordinates obtained from a triangulation scanner with information from a two-dimensional (2D) camera covering a wider field-of-view (FOV) than the 3D imager. Such wide-FOV information may be used for example to assist in registration. For example, the wide-FOV camera may assist in registering together multiple images obtained with the triangulation camera by identifying natural features or artificial targets outside the FOV of the triangulation camera. For example, the
camera 390 in the3D imager 300 may serve as both a wide-FOV camera and a color camera. - If a triangulation camera and a color camera are connected together in a fixed relationship, for example, by being mounted onto a common base, then the position and orientation of the two cameras may be found in a common frame of reference. Position of each of the cameras may be characterized by three translational degrees-of-freedom (DOF), which might be for example x-y-z coordinates of the camera perspective center. Orientation of each of the cameras may be characterized by three orientational DOF, which might be for example roll-pitch-yaw angles. Position and orientation together yield the pose of an object. In this case, the three translational DOF and the three orientational DOF together yield the six DOF of the pose for each camera. A compensation procedure may be carried out by a manufacturer or by a user to determine the pose of a triangulation scanner and a color camera mounted on a common base, the pose of each referenced to a common frame of reference.
- In an embodiment, 3D coordinates are determined based at least in part on triangulation. A triangulation calculation requires knowledge of the relative position and orientation of at least one projector such as 1310A and one camera such as 1320A.
- In another embodiment, 3D coordinates are obtained by identifying features or targets on an object and noting changes in the features or target as the object 1330 moves. The process of identifying natural features of an object 1330 in a plurality of images is sometimes referred to as videogrammetry. There is a well-developed collection of techniques that may be used to determine points associated with features of objects as seen from multiple perspectives. Such techniques are generally referred to as image processing or feature detection. Such techniques, when applied to determination of 3D coordinates based on relative movement between the measuring device and the measured object, are sometimes referred to as videogrammetry techniques.
- The common points identified by the well-developed collection of techniques described above may be referred to as cardinal points. A commonly used but general category for finding the cardinal points is referred to as interest point detection, with the detected points referred to as interest points. According to the usual definition, an interest point has a mathematically well-founded definition, a well-defined position in space, an image structure around the interest point that is rich in local information content, and a variation in illumination level that is relatively stable over time. A particular example of an interest point is a corner point, which might be a point corresponding to an intersection of three planes, for example. Another example of signal processing that may be used is scale invariant feature transform (SIFT), which is a method well known in the art and described in U.S. Pat. No. 6,711,293 to Lowe. Other common feature detection methods for finding cardinal points include edge detection, blob detection, and ridge detection.
-
FIG. 5 is a perspective view of an aerial quadcopter drone 500 having a body 510, four legs, and four rotors 520. The quadcopter drone is capable of flying in any direction or hovering through the use of the four rotors 520. A two-dimensional (2D) camera 530 is mounted to the drone. Other drones may use more or fewer rotors. In other embodiments, the camera 530 may be mounted to a different type of aerial drone, for example, one having fixed wings that do not provide hovering capability.
FIG. 6 is a perspective view of a measurement scenario 600 involving a drone 610 and a projector system. The drone 610 includes a 2D camera 530 and a computing device 626. In an embodiment, the computing device 626 has the capability of a personal computer. In another embodiment, the computing device 626 provides some preprocessing but sends the preprocessed signal to a remote computer or computer network for further processing. The preprocessed signal may be sent to the remote computer wirelessly or through a wire attached to a tether.
projector system 630 includes aprojector 632 and astand 636. In an embodiment, thestand 636 includes a tripod and motorized wheels. In an embodiment, the motorized wheels respond to computer control, for example, to wireless signals from a remote computer or network. In an alternative embodiment illustrated in ameasurement scenario 600B, theprojector system 640 inFIG. 6 is replaced with a projector system 636B inFIG. 6B having a drone 636B in place of thestand 636. In an embodiment, the drone 636B is configured to fly theprojector 632 to a desired location before landing and projectinglight rays 634. In an embodiment, theprojector 632 is configured to project laser light through a diffraction grating to producerays 634 that end at anobject 640 in a collection ofspots 635. In an embodiment, the spots are approximately circular and are projected in a rectangular grid pattern. In another embodiment, the diffraction grating is configured to give some of the projected spots more power than the others, thereby enabling some of the spots to be distinguished from the others. In an embodiment, spots 635 on the surface of theobject 640 are viewed by the2D camera 530. - In an embodiment, the
camera 530 collects multiple 2D images, the images including some of the projectedspots 635. The projected spots provide a way to register multiple 2D data sets. If thecamera 530 collects the multiple 2D images from different positions and if a combination of natural features and projected spots provide registration features in each of the 2D data sets, it may be possible to determine 3D coordinates using triangulation based on the multiple 2D camera images. - Each of the images collected by the
2D camera 530 is said to be in a device frame of reference that changes with the position and orientation of the aerial drone. The device frame of reference 650 may be represented by three orthogonal axes xD, yD, and zD, as shown in FIGS. 6, 6B, 8, 9, 11, 12, 15, 16, and 17. The multiple 2D images obtained in the device frame of reference 650 are moved into a common frame of reference 660 that is fixed (stationary) with respect to the object 640. The common frame of reference 660 may be represented by three orthogonal axes xC, yC, and zC, as shown in FIGS. 6, 6B, 8, 9, 11, 12, 15, 16, and 17. - In a further embodiment, an
aerial quadcopter drone 700 includes a body 510, four legs, and four rotors 520. A 3D imager 300, discussed herein above in reference to FIG. 3, is attached to the drone 700. In an embodiment, the 3D imager 300 includes two cameras and a projector 350 arranged in a triangle. The projector projects a pattern of light onto a region 725 of the object 640, and the two cameras capture images of the illuminated region 725. The drone 700 also includes a camera 390 that may be used to provide color (texture) information for incorporation into the 3D image and to assist in registration. The 3D imager enables measurement of 3D coordinates through the combined use of triangulation and videogrammetry. A personal computer or similar computing device 626 cooperates with the 3D imager 300 to process collected data to obtain 3D coordinates. Spots of light 635 are projected onto the object 640 to provide registration markers for the 3D imager 300. In an embodiment, the spots 635 are at a different wavelength than the pattern of light projected by the projector 350. For example, in an embodiment, the light from the projector 350 is in the infrared range, while the spots of light from the projector 632 are in the visible range, which appear in images of the camera 390. In another embodiment, the projected spots of light - The determination of 3D coordinates of points on the object surface is greatly assisted by the geometry of the
3D imager 300 that includes the two cameras and the projector 350 arranged in a triangle. The multiple epipolar relationships provided by the triangular arrangement of the devices are explained herein above in reference to FIG. 4B. When solving for 3D coordinates on the object surface based on the projected pattern of light from the internal projector 350 and the images obtained by the first camera and the second camera, the epipolar constraints imposed by the geometry of FIG. 4B are used. The term “epipolar constraints” as used herein means all of the geometrical constraints among the cameras and the projector 350. These geometrical constraints include the six degree-of-freedom pose of each of the three devices relative to the others, the projector plane of the projector 350, and the image planes of the cameras. - Each of the images collected by the
cameras is said to be in a device frame of reference that changes with the position and orientation of the aerial drone. The device frame of reference 650 may be represented by three orthogonal axes xD, yD, and zD, as shown in FIGS. 6, 6B, 8, 9, 11, 12, 15, 16, and 17. The multiple 2D images and resulting 3D coordinates obtained in the device frame of reference 650 are moved into a common frame of reference 660 that is fixed (stationary) with respect to the object 640. The common frame of reference 660 may be represented by three orthogonal axes xC, yC, and zC, as shown in FIGS. 6, 6B, 8, 9, 11, 12, 15, 16, and 17. - In an embodiment, the
stand 636 includes motorized wheels that cause it to move nearer to or farther from the object 640 according to the needs of the measurement. Usually the projector system 630 will be moved closer to the object 640 when the drone 700 moves closer to the object 640. The objective is to maintain a relatively large number of projected spots in the region 725. As the drone 700 moves closer to the object 640, the area 725 decreases. It is desirable that the number of projected spots 635 decrease correspondingly. - In an embodiment, the
drone 700 is configured to adjust its position from far to near according to measurement needs. The drone may first measure the object 640 over a relatively large FOV and then move closer to the object 640 to measure features of interest in greater detail and with higher accuracy. The projector system 630 may likewise move closer to the object in response. In an embodiment, the stand 636 is replaced by the drone 636B, which can move the projector 632 to any needed location. - In an embodiment, the motorized mobile platform moves the external projector to put the projected pattern of light from the external projector into the FOV of the registration camera, which is the
camera shown in FIGS. 5, 7, and 14, respectively. In other words, a control system monitors the number of projected spots received by the registration camera and adjusts the position of the motorized mobile platform to ensure that the external projector is properly positioned to project spots within the FOV of the registration camera. - In an embodiment, the motorized mobile platform moves the external projector in such a way that the registration camera sees at least a portion of the pattern of light projected by the external projector before and after the movement of the motorized platform. An advantage of this approach is that these observations by the registration camera may assist in registering the object 3D coordinates determined before and after the movement.
- In an embodiment, the motorized mobile platform moves the external projector so as to keep the number of dots seen by the registration camera above a specified minimum number and below a specified maximum number. In an embodiment, the specified minimum number and the specified maximum number are user-adjustable values. The density of dots projected over a given region increases as the external projector moves closer to the object. Consequently, as the aerial drone moves closer to the object, it may be advantageous to move the external projector closer to the object. A rotation mechanism provided on the external projector may be used synergistically with the movement mechanism on the external projector to obtain a desired number of projected spots in the FOV of the registration camera, as explained further herein below.
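The minimum/maximum dot-count policy described above can be sketched as a simple control step. The function below is an illustrative assumption (the threshold names, step size, and sign convention are not specified in the text); it returns a signed move along the projector-to-object axis based on the spot count seen by the registration camera:

```python
def platform_step(spot_count, min_spots, max_spots, step_m=0.2):
    """Return a signed move (meters) along the projector-to-object axis.

    Moving the external projector closer to the object raises the spot
    density, so more spots fall inside the registration camera FOV.
    """
    if spot_count < min_spots:
        return -step_m   # too few spots visible: move toward the object
    if spot_count > max_spots:
        return +step_m   # too many spots visible: back away from the object
    return 0.0           # count within the user-set band: hold position
```

In practice such a step would be issued repeatedly by the control system as the drone repositions, with the band limits set to the user-adjustable minimum and maximum described above.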
- As the distance from the external projector to the object is changed and as the distance from the aerial drone to the object is changed, the optical power received by the photosensitive array of the camera in the 3D imager of the aerial drone likewise changes. Consequently, in an embodiment, the optical power from the external projector is changed in response to the amount of light (e.g., the optical power) received by pixels in the photosensitive array in the registration camera. Likewise, it may be desirable to change the exposure time of the registration camera in response to the amount of light received by the registration camera.
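One way to realize the exposure adjustment described above is a proportional update that scales the exposure time by the ratio of a target pixel level to the measured mean level. The target value and exposure bounds below are illustrative assumptions:

```python
def adjust_exposure(exposure_s, mean_pixel_level, target_level=0.5,
                    min_exp_s=1e-4, max_exp_s=1e-1):
    """Scale the registration-camera exposure time so the mean pixel level
    (normalized 0..1) approaches a target; received light is assumed to
    scale linearly with exposure time."""
    if mean_pixel_level <= 0.0:
        return max_exp_s  # no measurable signal: use the longest exposure
    new_exp = exposure_s * target_level / mean_pixel_level
    return max(min_exp_s, min(max_exp_s, new_exp))  # clamp to camera limits
```

The same proportional rule could drive the projector optical power instead of, or in addition to, the camera exposure.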
- In an embodiment, the
projector 632 in FIG. 6 is replaced by the rotatable projector 1010 in FIG. 10 on a rotatable projector system 1000. In an embodiment, the rotatable projector 1010 is configured to emit a patterned beam of light 1012 along a direction determined by a rotation about a horizontal axis 1022 and a rotation about a vertical axis 1020. In an embodiment, the stand 636 includes a tripod 1032, wheels 1034, and a motor drive system 1036. In an embodiment, the rotation of the projector about the axes 1020, 1022 is controlled by a computer in the rotatable projector system 1000 or a computer located elsewhere in the overall measurement environment. - In some embodiments, the projected spots serve only as landmarks. In this case, the exact direction of the projected
pattern 1012 may not be important. In other cases, the direction is important; the device may then include accurate angle transducers such as angular encoders that are, for example, accurate to 5 microradians. - A number of different beam steering mechanisms and angle transducers may be used in the
rotatable projector system 1000. In an embodiment, the beam steering mechanisms comprise a horizontal shaft and a vertical shaft, each shaft mounted on a pair of bearings and each driven by a frameless motor. The projector may be directly mounted to the horizontal shaft, but many other arrangements are possible. For example, a mirror may be mounted to the horizontal shaft to reflect projected light onto the object or reflect scattered light from the object onto a camera. In another embodiment, a mirror angled at 45 degrees rotates around a horizontal axis and receives or returns light along the horizontal axis. In other embodiments, galvanometer mirrors may be used to send or receive light along a desired direction. In another embodiment, a MEMS steering mirror is used to direct the light into a desired direction. Many other beam steering mechanisms are possible and may be used. In an embodiment, an angular encoder is used to measure the angle of rotation of the projector or camera along each of the two axes. Many other angle transducers are available and may be used. -
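For a two-axis steering mechanism of this kind, the beam direction follows from the two measured rotation angles. A minimal sketch, assuming rotation about the vertical axis sets azimuth and rotation about the horizontal axis sets elevation (the axis conventions are an assumption, not from the original):

```python
import math

def beam_direction(azimuth_rad, elevation_rad):
    """Unit vector of the steered beam: azimuth is the rotation about the
    vertical axis, elevation is the rotation about the horizontal axis."""
    ce = math.cos(elevation_rad)
    return (ce * math.cos(azimuth_rad),
            ce * math.sin(azimuth_rad),
            math.sin(elevation_rad))
```

With angular encoders reading the two shaft angles, this direction vector is what ties a projected spot to a known ray in the projector frame of reference.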
FIG. 11 illustrates a measurement situation 1100 in which a rotatable projector system 1000 rotates to place the projected spots from a projected beam 1010 into desired locations on the object 640 according to the position of the drone 700. In an embodiment, the rotation mechanism of the external projector is configured to rotate the projected pattern of light to place it in the FOV of the registration camera of the 3D imager in the aerial drone. The registration camera is the camera shown in FIGS. 5, 7, and 14, respectively.
- In an embodiment, the rotation mechanism is configured to rotate the pattern of light from the external projector so as to keep the number of dots seen by the registration camera above a specified minimum number and below a specified maximum number. For example, if the aerial drone moves closer to the object and at the same time rotates to point the 3D imager upward, it may be desirable to respond by moving the external projector closer to the object while at the same time rotating the projected light from the external projector upward. This keeps a relatively large number of projected spots from the external projector visible to the registration camera, which is usually a desirable result.
-
FIG. 12 illustrates a measurement situation 1200 in which a device such as the 3D imager 300 or the camera 530 is a part of an aerial drone 3D imager 1220 similar to the drone 500 or the drone 700 except that the 3D imager 1220 includes a rotatable steering mechanism. In an embodiment, the rotatable steering mechanism is substantially like the rotatable steering mechanism of the projector system 1000 except in being sized to hold the 3D imager elements. - In an embodiment, the drone flies to a desired location relative to an
object 640 to be measured and then lands before making the measurement. A rotatable steering mechanism of the 3D imager 1220 steers the projected pattern of light to the desired location on the object 640 before making a measurement. In an embodiment, the projected spots 1010 are used to assist in registration. In a further embodiment, accurately known rotation angles of the rotatable mechanism of the 3D imager 1220 also contribute to the determination of the 3D coordinates of the surface of the object 640. An advantage of this approach is that a drone that has landed does not experience vibration from the rotating propellers, thus enabling more accurate measurement of 3D coordinates. - A way to minimize, but not eliminate, the effect of vibration from rotating propellers is to add a servo-controlled gimbal camera mount, which is a device configured to keep the camera pointed in the same direction as the direction of the drone changes slightly. In an embodiment, a servo system makes use of signals from an inertial measurement unit to send signals to three brushless motors that keep the camera level.
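The servo loop described above can be sketched as a proportional controller that converts the IMU attitude error into rate commands for the three brushless motors. The gain and rate limit below are illustrative assumptions; a real gimbal controller would typically add integral and derivative terms:

```python
def gimbal_commands(imu_rpy_rad, target_rpy_rad, gain=4.0, max_rate=2.0):
    """Proportional rate commands (rad/s) for the three brushless gimbal
    motors, clamped to the motors' maximum slew rate."""
    commands = []
    for measured, target in zip(imu_rpy_rad, target_rpy_rad):
        rate = gain * (target - measured)          # drive the error to zero
        commands.append(max(-max_rate, min(max_rate, rate)))
    return commands
```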
- In an embodiment, the 3D imager device mounted to the mover further includes a laser line probe, also known as a line scanner. The operation of the laser line scanner is now described with reference to
FIG. 13. The line scanner system 1300 includes a projector 1320 and a camera 1340. The projector 1320 includes a source pattern of light 1321 and a projector lens 1322. The source pattern of light includes an illuminated pattern in the form of a line. The projector lens includes a projector perspective center and a projector optical axis that passes through the projector perspective center. In the example of FIG. 13, a central ray of the beam of light 1324 is aligned with the projector optical axis. The camera 1340 includes a camera lens 1342 and a photosensitive array 1341. The lens has a camera optical axis 1343 that passes through a camera lens perspective center 1344. In the exemplary system 1300, the projector optical axis, which is aligned to the beam of light 1324, and the camera lens optical axis 1343 are perpendicular to the line of light 1325 projected by the source pattern of light 1321. In other words, the line 1325 is in the direction perpendicular to the paper in FIG. 13. The line of light 1325 strikes an object surface, which at a first distance from the projector is object surface 1310A and at a second distance from the projector is object surface 1310B. It is understood that at different heights above or below the paper of FIG. 13, the object surface may be at a different distance from the projector than the distance to either object surface 1310A or 1310B. For a point that also lies in the paper of FIG. 13, the line of light intersects surface 1310A in a point 1326 and it intersects the surface 1310B in a point 1327. For the case of the intersection point 1326, a ray of light travels from the point 1326 through the camera lens perspective center 1344 to intersect the photosensitive array 1341 in an image point 1346. For the case of the intersection point 1327, a ray of light travels from the point 1327 through the camera lens perspective center to intersect the photosensitive array 1341 in an image point 1347.
By noting the position of the intersection point relative to the position of the camera lens optical axis 1343, the distance from the projector (and camera) to the object surface can be determined. The distance from the projector to other points on the intersection of the line of light 1325 with the object surface, that is, points on the line of light that do not lie in the plane of the paper of FIG. 13, may similarly be found. In the usual case, the pattern on the photosensitive array will be a line of light (in general, not a straight line), where each point in the line corresponds to a different position perpendicular to the plane of the paper, and the position within the plane of the paper contains the information about the distance from the projector to the object surface. Therefore, by evaluating the pattern of the line in the image of the photosensitive array, the three-dimensional coordinates of the object surface along the projected line can be found. Note that the information contained in the image on the photosensitive array for the case of a line scanner is contained in a (not generally straight) line. -
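The distance determination described above reduces to triangle geometry: the baseline between the projector and camera perspective centers, together with the two ray angles, fixes the object point. A minimal sketch (the angle convention, with each angle measured between the baseline and the outgoing ray, is an assumption):

```python
import math

def triangulate_depth(baseline_m, projector_angle_rad, camera_angle_rad):
    """Perpendicular distance from the baseline to the illuminated point.

    The law of sines applied to the projector-camera-object triangle gives
    the camera-to-point range; the depth is that range's component
    perpendicular to the baseline.
    """
    gamma = math.pi - projector_angle_rad - camera_angle_rad  # angle at object
    range_from_camera = baseline_m * math.sin(projector_angle_rad) / math.sin(gamma)
    return range_from_camera * math.sin(camera_angle_rad)
```

In a real scanner the camera angle is recovered per pixel from the calibrated lens model, so each point along the imaged line yields its own depth.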
FIG. 14 illustrates a drone 1400 configured to carry a 3D imager 1410 that includes a line projector 1412, a first camera 1414, and a second camera 1416. The first camera is configured to cooperate with the line projector to determine 3D coordinates of object points intersected by the projected line. In an embodiment, the projector 1412 corresponds to the projector 1320 of FIG. 13, and the camera 1414 corresponds to the camera 1340 of FIG. 13. In an embodiment, the second camera 1416 is a color camera configured to respond to visible light. In an embodiment, the camera 1416 is configured to respond to light from a stationary external projector, thereby enabling registration of the multiple 3D data sets collected by the laser scanner over different positions and orientations of the drone 1400. In an embodiment, the 3D imager 1410 is configured to work with a computer 626, which might be a personal computer. -
FIG. 15 illustrates a measurement scenario 1500 in which a drone 1400 that includes a 3D imager 1410 as shown in FIG. 14 projects a beam of light 1510 that produces a line of light 1512 on a surface of an object 640. The image of the line of light 1512 is captured by the camera 1414 and is used to determine 3D coordinates of the line of light on the surface of the object 640 in the frame of reference of the 3D imager 1410. In an embodiment, the color camera 1416 has a relatively large angular view 1520 that covers an area 1522 on the object surface. The color camera captures several spots from the stationary external projector 1000, which may be a rotatable projector. In an embodiment, the color camera captures the line of light projected by the scanner as well as the spots projected by the external projector 1000. In another embodiment, the color camera includes an optical filter to block out the wavelength of the projected line of light. In most cases, the drone will fly relatively close to the object 640 when projecting a line of laser light, as this permits maximum accuracy to be obtained. In an embodiment, the drone carrying the laser line probe flies along a smooth path, covering the surface with the line of light, which produces a collection of 3D coordinates over a surface area. - In another embodiment, the drone lands before projecting the
laser light 1512. In an embodiment, a high accuracy rotatable mount is provided to steer the beam of light 1510 from the 3D imager 1410. In an embodiment, the angular accuracy of the beam steering mechanism is high, for example, 5 microradians, thereby enabling high registration accuracy of measured 3D coordinates, even if registration spots from the beams 1010 are not provided. - In other embodiments, the
line projector 1412 may be replaced by an area projector. In an embodiment, the area projector projects single-shot coded patterns of light that are captured by a camera to determine 3D coordinate values of the object 640. In another embodiment, the drone lands before measurements begin and the drone projects a sequence of patterns that are evaluated to determine 3D coordinates of the object 640 to relatively high accuracy. One such sequential measurement method known in the art is the phase-shift method, in which the optical power of projected light is modulated sinusoidally along one direction and the phase of the sinusoidal pattern is shifted side-to-side at least three times. As is well known in the art, the resulting optical powers collected at each point for each of the three or more phase shifts are sufficient to enable determination of 3D coordinates. An advantage of this method is high rejection of background light. In an embodiment, a digital micromirror device (DMD) is used to produce desired patterns such as coded patterns or phase-shifted sinusoidal patterns. - Other methods are known to minimize the effects of background light. In an embodiment, all cameras and projectors include optical filters to pass desired wavelengths of light and block unwanted wavelengths. In some cases, such optical filters may be thin-film dielectric coatings applied to windows or lenses. Another method to minimize the effect of background light is to pulse the projected light and to reduce the camera exposure time in correspondence. Another possibility is to use a fast lock-in amplifier, in which the optical power of light is modulated and then filtered to extract the modulation frequency. Image chips with such a lock-in amplifying functionality are made, for example, by the company Heliotis.
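The sinusoidal phase-shift method described above recovers, at each pixel, the phase of the projected sinusoid from three or more equally spaced shifts. A minimal per-pixel sketch of the standard N-step estimator (the text does not specify this particular formula, which is the common synchronous-detection form):

```python
import math

def phase_from_shifts(intensities):
    """Recover the wrapped phase at one pixel from N >= 3 intensity samples
    taken with equally spaced phase shifts of 2*pi/N."""
    n = len(intensities)
    s = sum(i_k * math.sin(2 * math.pi * k / n) for k, i_k in enumerate(intensities))
    c = sum(i_k * math.cos(2 * math.pi * k / n) for k, i_k in enumerate(intensities))
    return math.atan2(-s, c)  # wrapped phase in (-pi, pi]
```

Because the constant background term sums to zero in both accumulators, ambient light cancels, which is the background-light rejection noted above. The wrapped phase must still be unwrapped and converted to distance by triangulation.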
- In an embodiment, the projected optical power is varied according to the distance from the object. For example, for the
projector 350 of the 3D imager 300, laser light may be sent through a diffraction grating to obtain spots that remain relatively small and sharply defined over relatively large distances. In this case, it is possible to project light within the eye-safe Class 1 laser safety limit while operating at distances of up to 6 meters from the object 640. In an embodiment, a diffraction grating is configured to produce some spots that are more powerful than others. - Although the figures described herein above have used outdoor measurements as examples, it is equally possible to make measurements indoors, for example, in factory environments. In some cases, a drone may be restricted to a safety region in which humans are not allowed. In some cases, a drone may have a parachute configured to open in the event of a rapid fall, which might be detected, for example, by an inertial measurement unit (IMU). In other embodiments, a drone may be attached to a tether.
-
FIG. 16 illustrates a measurement scenario 1600 in which a drone 1610 includes a 3D imager 1620 configured to measure 3D coordinates of an object 1630 in cooperation with a projector system 1010. In an embodiment, the 3D imager 1620 is configured to cooperate with a computing device 626 and may include, for example, any of the 3D measuring devices shown in FIGS. 5, 7, 12, and 14, respectively. In an embodiment, the projector system 1010 is configured to sit in a fixed position, for example, on a pillar 1030. In an embodiment, the projector system 1010 is configured to rotate about two axes, for example about the axes 1020, 1022, to project spots 635 in a grid pattern. Alternatively, the projector system 1010 may project any other pattern of light. In embodiments, the drones are temporarily mounted on magnetic mounts or other structures. -
FIG. 17 illustrates a measurement scenario 1700 in which a drone 700 that includes a 3D imager 300 and computing device 626 operates without imaging a pattern of light projected onto the object by an external projector 630. A 3D imager 300 described herein above in reference to FIGS. 3 and 7 has the capability of imaging an object at near and far distances, especially if provision is made to vary the optical power of the projected light. It may be possible, for example, for a measurement device to measure at distances of 6 meters at relatively high power and relatively low resolution while measuring at a distance of 0.5 meters at relatively lower power with improved resolution and accuracy. In an embodiment, the camera 390 is capable of measuring over a FOV of around 60 degrees, thereby enabling many target features to be captured as cardinal points for use in registering multiple images using videogrammetry techniques. In many cases, because of the advantages provided by the multiple epipolar constraints that must be satisfied simultaneously, as explained herein above with reference to FIG. 4B, it is possible to obtain relatively good registration and relatively good 3D accuracy even without the projection of a pattern of light by an external projector such as the projector 630. Because the registration is relatively good and because the scene may be imaged in 3D both near to the object and far from the object, the drone 700 may be able to provide its own guidance control, perhaps in conjunction with maps or CAD models provided to it through a computer or network interface. - In an embodiment, an additional 3D measuring device such as a laser tracker, total station, or a time-of-flight (TOF) laser scanner is used to determine 3D coordinates of reference markers having a fixed position relative to an object being measured. In this context, a TOF laser scanner is a device that measures a distance and two angles to a diffusely scattering target.
The term TOF in this instance means that the scanner measures the time required for an emitted light signal to reach the target and return to the scanner. The speed of light in air is used in combination with the determined time to calculate the distance to the target. This TOF method is distinguished from the method used by a triangulation scanner (3D imager) which is based on a triangulation calculation and does not depend directly on the speed of light in air.
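The round-trip calculation described here is direct: half the measured round-trip time multiplied by the speed of light in air gives the range. A minimal sketch (the speed value below is a typical sea-level approximation and is an assumption; real instruments apply environmental corrections for temperature and pressure):

```python
C_AIR_M_PER_S = 299_702_547.0  # approximate speed of light in air (assumption)

def tof_distance(round_trip_s):
    """Range from the measured round-trip time of the emitted light signal:
    half the round-trip optical path length."""
    return C_AIR_M_PER_S * round_trip_s / 2.0
```

At 10 meters the round trip is only about 67 nanoseconds, which is why the triangulation method, independent of precise timing, is attractive at short range.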
- A laser tracker typically has an accuracy of 100 micrometers or better, while total stations and TOF laser scanners typically have accuracies of a few millimeters. For example, a laser tracker may be used to determine that the distance between two parallel walls is 10 meters. This distance value is then probably known to an accuracy of about 100 micrometers. In contrast, a
3D imager 300 attached to a drone 700 may know this distance to within about a centimeter or perhaps a few centimeters over this 10 meter distance. The distance between the walls as measured by the tracker may then be transferred to the measurements made by the 3D imager 300 between the walls, and the resulting coordinate values rescaled according to the more accurate distance measured by the laser tracker. Likewise, a total station or a TOF laser scanner may be used to determine a distance between two parallel walls to an accuracy of a few millimeters, which is still usually expected to be several times better than the accuracy over the same distance as measured by a drone 700. In broad terms, a relatively accurate reference coordinate measuring device such as a laser tracker, total station, or TOF scanner may make measurements enabling improvement in measurements made by a drone that includes a 3D imager. In most cases this improvement is possible even if an external projector is used in combination with the drone-carried 3D imager. - In an embodiment, a
computer 626 establishes a Remote Desktop link to a personal computer in possession of an operator. The operator then sees whatever data is collected and processed by the computer 626. In another embodiment, data is sent to a remote computer or computer network over a high speed wireless data link for further processing. In a further embodiment, data is sent to a remote computer over a wired link such as an Ethernet link. A wired link is practical, for example, if the drone is attached to a tether. In some embodiments, detailed data are stored in memory on the drone for post-processing, while 2D and 3D data are transmitted to a remote computing device wirelessly but at a lower frame rate, for example, at 3 or 4 frames per second. -
FIG. 18 further describes the computing system associated with the drone systems described herein above. The drone system, which may be any of the drone systems described herein above, includes computing elements 626. In most cases, the computing elements 626 will include at least a processor, a memory, and a communication interface. In some cases, the computing elements 626 are equivalent to a personal computer so that a Remote Desktop link may be established with a remote computer. The computing elements 626 are part of an overall computing system 1800 that may optionally include one or more additional computing elements. The computing elements 626 may support wired or wireless communication between the computing elements 626 and the optional computing elements.
- In an embodiment, the 3D imaging device further comprises a first two-dimensional (2D) camera configured to capture a first 2D image from a first position of the aerial drone and a second 2D image from a second position of aerial drone, the second position being different than the first position. The 3D imaging device is configured to identify three cardinal points common to the first image and the second image, each of the three cardinal points derived from the first projected pattern of light on the object or from natural features of the object, at least one of the three cardinal points being derived from the first projected pattern of light. In this embodiment, the 3D imaging device is further configured to determine first 3D coordinates of a first object point, second 3D coordinates of a second object point, and third 3D coordinates of a third object, the determined first 3D coordinates, determined second 3D coordinates, and determined third 3D coordinates based at least in part on the three cardinal points in the first image and the three cardinal points in the second image.
- In another embodiment, the 3D coordinate measuring system of claim the external projector is configured to project a first collection of spots of light onto the object. In another embodiment, the external projector comprises a source of laser light and a diffraction grating, the external projector being configured to project laser light through the diffraction grating and onto the object. In another embodiment, the external projector is configured to adjust power of the laser light according to the amount of light received by pixels of the first 2D camera.
- In still another embodiment, the system further includes an internal projector configured to project a second pattern of projected light through a projector perspective center onto the object. A second 2D camera is configured to form a second camera image of the second pattern of projected light on the object, the second 2D camera having a camera perspective center, there being a baseline between the projector perspective center and the camera perspective center, the length of the baseline being a baseline distance. The 3D imaging device is further configured to determine the 3D coordinates of the object based at least in part on the second pattern of projected light, the second camera image, and the baseline distance.
- In an embodiment, the first 2D camera is a color camera. In an embodiment, the internal projector is configured to project light a pulsed pattern of light.
- In still another embodiment, the external projector is further configured to be moved by a motorized device. In an embodiment, the motorized device is a mobile platform having motorized wheels. In an embodiment the motorized device is an aerial drone.
- In still another embodiment the aerial drone is a helicopter. In an embodiment the aerial drone is a quadcopter. In still another embodiment, the aerial drone is a fixed wing aircraft. In an embodiment, the second pattern of projected light is a line of light.
- In still another embodiment, the external projector includes a first mechanism configured to steer light from the external projector into a plurality of directions. In an embodiment, the first mechanism is configured to steer the light from the external projector about two orthogonal axes. In an embodiment, a projector angular transducer is provided to measure an angle of rotation of the first mechanism. In an embodiment, the aerial drone device further includes a second mechanism configured to steer the 3D imaging device into a plurality of directions. In an embodiment, the second mechanism is further configured to steer the 3D imaging device about two orthogonal axes. In an embodiment, an angular transducer is provided to measure an angle of rotation of the second mechanism.
- In still another embodiment, the system is further configured to detect motion of the external projector relative to the object, the determination based at least in part on movement of the first projected pattern of light relative and on cardinal points that are based on natural features of the object. In an embodiment, the system is further configured to determine a pose of the external projector based at least in part on the first projected pattern of light and on those cardinal points that are obtained from natural features of the object. In an embodiment, the system is further configured to detect motion of the external projector relative to the object, the relative motion determined based at least in part on the first projected pattern of light, an observed movement of the first projected pattern of light, the second pattern of projected light, the second camera image, and the baseline distance. In an embodiment, the system is configured to determine a pose of the external projector based at least in part on the first projected pattern of light, the observed movement of the first projected pattern of light, the second pattern of projected light, the second camera image, and the baseline distance.
- In accordance with another embodiment, a three-dimensional (3D) coordinate measuring system is provided. The system includes an external projector configured to project a first projected pattern of light onto an object. The system further includes an aerial measuring device in a device frame of reference, the aerial measuring device including a triangulation scanner and a registration camera, the triangulation scanner including an internal projector and a first camera, the triangulation scanner configured to determine 3D coordinates of the object in the device frame of reference in a first instance and a second instance, the registration camera configured to image the first projected pattern of light in the first instance and the second instance, the aerial measuring device configured to register in a common frame of reference the determined 3D coordinates in the first instance and in the second instance based at least in part on the registration camera image of the first projected pattern of light in the first instance and the second instance.
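Registering the 3D coordinates from the first and second instances into a common frame amounts to estimating the rigid transform between the two device poses from corresponding points of the projected pattern. A minimal sketch using the Kabsch (SVD) algorithm is shown below; the algorithm choice and function signature are assumptions for illustration, not the claimed method.

```python
import numpy as np

def register_point_sets(p, q):
    """Rigid transform (R, t) aligning point set p onto q via the
    Kabsch algorithm; p and q are (N, 3) arrays of corresponding points,
    so that q ~= p @ R.T + t."""
    pc, qc = p.mean(axis=0), q.mean(axis=0)   # centroids
    h = (p - pc).T @ (q - qc)                 # cross-covariance matrix
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))    # guard against reflection
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = qc - r @ pc
    return r, t
```

With the transform in hand, every 3D point measured in the second instance can be mapped into the frame of the first, giving the common frame of reference the embodiment describes.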
- In another embodiment, the first projected pattern of light is projected from the external projector while the external projector is stationary. In another embodiment, the external projector is attached to a motorized mobile platform configured to move the external projector to multiple positions.
- In an embodiment, the motorized mobile platform includes motorized wheels. In an embodiment, the motorized mobile platform is a drone. In an embodiment, the motorized mobile platform is a helicopter. In an embodiment, the motorized mobile platform is a quadcopter. In an embodiment, the motorized mobile platform is configured to move the external projector in such a way that a portion of the first projected pattern is common to images obtained by the registration camera before and after movement of the external projector.
- In still another embodiment, a rotation mechanism is configured to rotate the first projected pattern of light about a first axis. In an embodiment, the rotation mechanism is configured to rotate the first projected pattern of light about a second axis. In an embodiment, an angle transducer is configured to measure an angle of rotation of the first projected pattern of light about the first axis. In an embodiment, the rotation mechanism is configured to rotate the first projected pattern of light about the first axis to put the first projected pattern of light in a field-of-view of the registration camera.
- In still another embodiment, the motorized mobile platform is configured to move the external projector to put the first projected pattern of light in a field-of-view of the registration camera.
- In an embodiment, the motorized mobile platform is configured to move the external projector in such a way that a portion of the first projected pattern is common to images obtained by the registration camera before and after movement of the external projector.
- In an embodiment, the rotation mechanism is configured to rotate the first projected pattern of light in such a way that a portion of the first projected pattern is common to images obtained by the registration camera before and after rotation of the first projected pattern of light.
- In still another embodiment, the first projected pattern of light includes a collection of illuminated dots. In an embodiment, the motorized mobile platform is configured to move the external projector so as to keep the number of dots seen by the registration camera above a specified minimum number and below a specified maximum number. In an embodiment, the rotation mechanism is configured to rotate the first projected pattern of light so as to keep the number of dots seen by the registration camera above a specified minimum number and below a specified maximum number. In an embodiment, the optical power of the projected dots is adjusted to keep the light level received by pixels in the registration camera within a specified range. In still another embodiment, the registration camera is a color camera.
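The dot-count and optical-power bounds described here suggest simple closed-loop rules. The hypothetical sketch below returns an abstract command that the platform or rotation mechanism would map to a motion, plus a power scale factor; all names, limits, and command strings are illustrative assumptions.

```python
def projector_control(dot_count, mean_pixel_level,
                      min_dots=50, max_dots=500,
                      low_level=0.25, high_level=0.75):
    """Return an abstract motion command and an optical power scale factor
    that keep the visible dot count and the received light level in range.
    mean_pixel_level is the normalized (0..1) mean level at the camera."""
    if dot_count < min_dots:
        motion = "raise_dot_count"   # move/rotate so more dots are imaged
    elif dot_count > max_dots:
        motion = "lower_dot_count"
    else:
        motion = "hold"
    if mean_pixel_level < low_level:      # too dim: boost optical power
        power_scale = low_level / max(mean_pixel_level, 1e-6)
    elif mean_pixel_level > high_level:   # near saturation: reduce power
        power_scale = high_level / mean_pixel_level
    else:
        power_scale = 1.0
    return motion, power_scale
```

Keeping the dot count bounded preserves enough pattern points for registration without overwhelming the matcher, while the power scale keeps the camera pixels out of the noise floor and out of saturation.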
- In accordance with another embodiment, a three-dimensional (3D) coordinate measuring system is provided. The system includes an aerial drone attached to a triangulation scanner and a registration camera. The triangulation scanner includes a first projector, a first camera, and a second camera, with the first projector, the first camera, and the second camera arranged in a triangle. The first projector, the first camera, and the second camera have first epipolar constraints. The first projector is configured to project a first pattern of light on the object. The first camera is configured to form a first image of the projected first pattern of light on the object. The second camera is configured to form a second image of the projected first pattern of light on the object. The system is further configured to determine 3D coordinates of an object based at least in part on the projected first pattern of light, the first image, the second image, and the first epipolar constraints, the registration camera being configured to obtain an image of the object and to extract from the image cardinal points based on natural features of the object. The system is further configured to register multiple sets of the 3D coordinates obtained from the triangulation scanner, the registration based at least in part on matching of common cardinal points present in successive images of the registration camera.
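The two-camera triangulation under epipolar constraints can be illustrated with the standard linear (DLT) method: each pixel observation contributes two linear equations on the homogeneous 3D point. The projection matrices and function below are textbook assumptions, not the patent's specific implementation.

```python
import numpy as np

def triangulate(p1, p2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from normalized pixel
    coordinates x1, x2 observed by two cameras with 3x4 projection
    matrices p1, p2."""
    a = np.vstack([
        x1[0] * p1[2] - p1[0],   # two equations from camera 1
        x1[1] * p1[2] - p1[1],
        x2[0] * p2[2] - p2[0],   # two equations from camera 2
        x2[1] * p2[2] - p2[1],
    ])
    _, _, vt = np.linalg.svd(a)  # null vector of a is the point
    xh = vt[-1]
    return xh[:3] / xh[3]        # dehomogenize
```

With two cameras plus a projector in a triangle, each pair of the three devices supplies an independent epipolar constraint, which is what lets the system cross-check correspondences before triangulating.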
- In still another embodiment, the aerial drone is configured to fly according to navigation signals, the navigation signals based at least in part on data from the registration camera. In another embodiment, the aerial drone is further configured to fly nearer to an object or farther from an object according to a level of feature detail present in the images of the registration camera. In another embodiment, the registration camera is a color camera and the 3D coordinate measuring system is further configured to place the observed colors on the registered 3D coordinates.
- In still another embodiment, the system further includes a drone computer configured to communicate with a remote computer by Remote Desktop mode, the drone computer configured to enable a remote user to view images obtained from the aerial drone.
- In still another embodiment, the system further includes a drone computer configured to communicate with a remote computer through wired or wireless signals.
- In still another embodiment, the system further includes constraints to keep the drone located in safety zones, the constraints being selected from the group consisting of a tether and a barrier.
- While the invention has been described in detail in connection with only a limited number of embodiments, it should be readily understood that the invention is not limited to such disclosed embodiments. Rather, the invention can be modified to incorporate any number of variations, alterations, substitutions or equivalent arrangements not heretofore described, but which are commensurate with the spirit and scope of the invention. Additionally, while various embodiments of the invention have been described, it is to be understood that aspects of the invention may include only some of the described embodiments. Accordingly, the invention is not to be seen as limited by the foregoing description, but is only limited by the scope of the appended claims.
Claims (19)
1. A three-dimensional (3D) coordinate measuring system, comprising:
an aerial measuring device that includes an aerial drone and a 3D measurement device, the aerial drone being configured to move from a first position to a stationary second position, and wherein the 3D measurement device is configured to optically measure points on the surface of an object;
one or more processors configured to execute nontransitory computer readable instructions, the computer readable instructions comprising:
moving the aerial measuring device from the first position;
landing the aerial measuring device at the second position;
optically measuring a first object point with the 3D measurement device; and
determining first 3D coordinates of the first object point.
2. The system of claim 1 , further comprising:
an external projector operable to project a projected pattern of light onto an object; and
wherein the 3D measurement device includes a camera configured to acquire an image of the pattern of light.
3. The system of claim 2 , wherein the external projector is mounted on a stand, the stand having a tripod and motorized wheels.
4. The system of claim 3 , further comprising a controller operably coupled to the motorized wheels, the motorized wheels being configured to move the stand.
5. The system of claim 2 , wherein the external projector is coupled to a second aerial drone.
6. The system of claim 1 , wherein the 3D measurement device is a 3D imager having a first camera, a second camera, and a projector.
7. The system of claim 6 , wherein the first camera, the second camera and the projector are arranged relative to each other in a triangle.
8. The system of claim 6 , wherein the 3D imager further includes a color camera.
9. The system of claim 1 , wherein the measuring of the first object point is performed through a combination of triangulation and videogrammetry.
10. The system of claim 3 , wherein the computer readable instructions further comprise:
measuring the object over a first field of view (FOV) with the 3D measurement device;
moving the aerial drone to a third position, the third position being closer to the object than the second position;
moving the projector in response to the movement of the aerial drone;
measuring the object over a second FOV with the aerial drone at the third position, the second FOV being smaller than the first FOV.
11. The system of claim 10 , wherein the movement of the projector is to a projector position that places a pattern of light emitted by the projector into the second FOV.
12. The system of claim 11 , wherein the movement of the projector is selected to keep the number of dots in the pattern of light that are imaged by the color camera above a predetermined first number of dots and below a predetermined second number of dots.
13. The system of claim 12 , wherein the computer readable instructions further include increasing a density of dots as the projector moves closer to the object.
14. The system of claim 10 , wherein the computer readable instructions further include changing an optical power of the projector in response to an amount of light received by the color camera.
15. The system of claim 2 , wherein the projector is rotatable about a first axis and a second axis, the first axis and second axis being perpendicular to each other.
16. The system of claim 8 , wherein the 3D imager includes a laser line scanner configured to project a line of light.
17. The system of claim 16 , wherein the color camera includes an optical filter configured to block the wavelength of light of the line of light.
18. The system of claim 17 , wherein the computer readable instructions further include landing the aerial drone between emissions of the line of light.
19. The system of claim 1 , wherein the 3D measuring device is selected from a group comprising: a laser tracker, a total station, an area scanner, and a time of flight scanner.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/272,291 US20190186905A1 (en) | 2015-09-09 | 2019-02-11 | Aerial device having a three-dimensional measurement device |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201562216021P | 2015-09-09 | 2015-09-09 | |
US201562216027P | 2015-09-09 | 2015-09-09 | |
US201562215978P | 2015-09-09 | 2015-09-09 | |
US15/250,324 US9989357B2 (en) | 2015-09-09 | 2016-08-29 | Aerial device that cooperates with an external projector to measure three-dimensional coordinates |
US15/991,433 US10234278B2 (en) | 2015-09-09 | 2018-05-29 | Aerial device having a three-dimensional measurement device |
US16/272,291 US20190186905A1 (en) | 2015-09-09 | 2019-02-11 | Aerial device having a three-dimensional measurement device |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/991,433 Continuation US10234278B2 (en) | 2015-09-09 | 2018-05-29 | Aerial device having a three-dimensional measurement device |
Publications (1)
Publication Number | Publication Date |
---|---|
US20190186905A1 true US20190186905A1 (en) | 2019-06-20 |
Family
ID=58190420
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/250,324 Active 2036-09-03 US9989357B2 (en) | 2015-09-09 | 2016-08-29 | Aerial device that cooperates with an external projector to measure three-dimensional coordinates |
US15/991,433 Active US10234278B2 (en) | 2015-09-09 | 2018-05-29 | Aerial device having a three-dimensional measurement device |
US16/272,291 Abandoned US20190186905A1 (en) | 2015-09-09 | 2019-02-11 | Aerial device having a three-dimensional measurement device |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/250,324 Active 2036-09-03 US9989357B2 (en) | 2015-09-09 | 2016-08-29 | Aerial device that cooperates with an external projector to measure three-dimensional coordinates |
US15/991,433 Active US10234278B2 (en) | 2015-09-09 | 2018-05-29 | Aerial device having a three-dimensional measurement device |
Country Status (4)
Country | Link |
---|---|
US (3) | US9989357B2 (en) |
DE (1) | DE112016004085T5 (en) |
GB (1) | GB2556802A (en) |
WO (1) | WO2017044344A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110926366A (en) * | 2019-12-13 | 2020-03-27 | 浙江省计量科学研究院 | Curved surface contour measuring method based on multi-station layout of laser tracker |
CN111307041A (en) * | 2020-03-20 | 2020-06-19 | 嘉兴方石科技有限公司 | Building measuring method |
Families Citing this family (63)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10021379B2 (en) | 2014-06-12 | 2018-07-10 | Faro Technologies, Inc. | Six degree-of-freedom triangulation scanner and camera for augmented reality |
US9402070B2 (en) | 2014-06-12 | 2016-07-26 | Faro Technologies, Inc. | Coordinate measuring device with a six degree-of-freedom handheld probe and integrated camera for augmented reality |
US10068506B2 (en) * | 2014-08-01 | 2018-09-04 | Philips Lighting Holding B.V. | System, device for creating an aerial image |
US10176625B2 (en) * | 2014-09-25 | 2019-01-08 | Faro Technologies, Inc. | Augmented reality camera for use with 3D metrology equipment in forming 3D images from 2D camera images |
US9506744B2 (en) | 2014-12-16 | 2016-11-29 | Faro Technologies, Inc. | Triangulation scanner and camera for augmented reality |
US10620300B2 (en) | 2015-08-20 | 2020-04-14 | Apple Inc. | SPAD array with gated histogram construction |
US9989357B2 (en) | 2015-09-09 | 2018-06-05 | Faro Technologies, Inc. | Aerial device that cooperates with an external projector to measure three-dimensional coordinates |
US10621433B1 (en) * | 2015-12-18 | 2020-04-14 | EControls Holdings, KKC | Multiscopic whitetail scoring game camera systems and methods |
CN105627926B (en) * | 2016-01-22 | 2017-02-08 | 尹兴 | Four-camera group planar array feature point three-dimensional measurement system and measurement method |
DE102016206982B4 (en) | 2016-04-25 | 2022-02-10 | Siemens Aktiengesellschaft | Airmobile for scanning an object and system for damage analysis of the object |
US10823826B2 (en) * | 2016-06-14 | 2020-11-03 | Stmicroelectronics, Inc. | Adaptive laser power and ranging limit for time of flight sensor |
US20180341009A1 (en) * | 2016-06-23 | 2018-11-29 | Apple Inc. | Multi-range time of flight sensing |
US11060853B2 (en) * | 2016-09-14 | 2021-07-13 | Scantech (Hangzhou) Co., Ltd. | Three-dimensional sensor system and three-dimensional data acquisition method |
US11303859B2 (en) * | 2016-09-29 | 2022-04-12 | Stmicroelectronics (Research & Development) Limited | Time of flight sensing for brightness and autofocus control in image projection devices |
US10275610B2 (en) | 2016-11-28 | 2019-04-30 | Stmicroelectronics, Inc. | Time of flight sensing for providing security and power savings in electronic devices |
US11430148B2 (en) * | 2016-12-28 | 2022-08-30 | Datalogic Ip Tech S.R.L. | Apparatus and method for pallet volume dimensioning through 3D vision capable unmanned aerial vehicles (UAV) |
US10559213B2 (en) * | 2017-03-06 | 2020-02-11 | Rosemount Aerospace Inc. | Method and system for aircraft taxi strike alerting |
US10720069B2 (en) * | 2017-04-17 | 2020-07-21 | Rosemount Aerospace Inc. | Method and system for aircraft taxi strike alerting |
US11055867B2 (en) * | 2017-05-02 | 2021-07-06 | Jdrf Electromag Engineering Inc. | Automatic light position detection system |
US11268804B2 (en) | 2017-05-02 | 2022-03-08 | Jdrf Electromag Engineering, Inc. | Automatic light position detection system |
US10189580B2 (en) * | 2017-06-16 | 2019-01-29 | Aerobo | Image stabilization and pointing control mechanization for aircraft imaging systems |
WO2019005260A1 (en) | 2017-06-29 | 2019-01-03 | Apple Inc. | Time-of-flight depth mapping with parallax compensation |
US10527711B2 (en) | 2017-07-10 | 2020-01-07 | Aurora Flight Sciences Corporation | Laser speckle system and method for an aircraft |
WO2019026169A1 (en) | 2017-08-01 | 2019-02-07 | J Think株式会社 | Operation system for working machine |
US10217371B1 (en) * | 2017-08-22 | 2019-02-26 | Rosemount Aerospace Inc. | Method and system for aircraft taxi strike alerting using adaptive field of view |
US10600194B2 (en) * | 2017-08-24 | 2020-03-24 | General Electric Company | Image and video capture architecture for three-dimensional reconstruction |
US11064184B2 (en) * | 2017-08-25 | 2021-07-13 | Aurora Flight Sciences Corporation | Aerial vehicle imaging and targeting system |
US10495421B2 (en) | 2017-08-25 | 2019-12-03 | Aurora Flight Sciences Corporation | Aerial vehicle interception system |
US10955552B2 (en) | 2017-09-27 | 2021-03-23 | Apple Inc. | Waveform design for a LiDAR system with closely-spaced pulses |
JP6775748B2 (en) * | 2017-09-28 | 2020-10-28 | 株式会社オプティム | Computer system, location estimation method and program |
EP4268757A3 (en) * | 2017-10-06 | 2023-12-06 | Advanced Scanners, Inc. | Generation of one or more edges of luminosity to form three-dimensional models of objects |
CN111492262B (en) * | 2017-10-08 | 2024-06-28 | 魔眼公司 | Distance measurement using warp grid patterns |
JP7179844B2 (en) | 2017-11-13 | 2022-11-29 | ヘキサゴン メトロロジー,インコーポレイテッド | Thermal management of optical scanning equipment |
CN111465870B (en) | 2017-12-18 | 2023-08-29 | 苹果公司 | Time-of-flight sensing using an array of addressable emitters |
CN110006392A (en) * | 2018-01-05 | 2019-07-12 | 中国移动通信有限公司研究院 | A kind of antenna for base station work ginseng measurement method, device and measuring device |
EP3803266A4 (en) | 2018-06-06 | 2022-03-09 | Magik Eye Inc. | Distance measurement using high density projection patterns |
FI128523B (en) * | 2018-06-07 | 2020-07-15 | Ladimo Oy | Modeling the topography of a three-dimensional surface |
USD875573S1 (en) | 2018-09-26 | 2020-02-18 | Hexagon Metrology, Inc. | Scanning device |
CN109658497B (en) * | 2018-11-08 | 2023-04-14 | 北方工业大学 | Three-dimensional model reconstruction method and device |
EP3911920B1 (en) | 2019-01-20 | 2024-05-29 | Magik Eye Inc. | Three-dimensional sensor including bandpass filter having multiple passbands |
EP3918424A4 (en) * | 2019-01-28 | 2022-09-14 | Stanley Electric Co., Ltd. | Ballistic light modulations for image enhancement through fog |
US11604261B2 (en) * | 2019-02-06 | 2023-03-14 | Lockheed Martin Corporation | Extended laser active ranging system, method and computer readable program product |
EP3887852A1 (en) | 2019-02-11 | 2021-10-06 | Apple Inc. | Depth sensing using a sparse array of pulsed beams |
CN109767490B (en) * | 2019-03-05 | 2023-07-18 | 盎锐(上海)信息科技有限公司 | Image analysis system and method for projection grating modeling |
US11474209B2 (en) | 2019-03-25 | 2022-10-18 | Magik Eye Inc. | Distance measurement using high density projection patterns |
US11500094B2 (en) | 2019-06-10 | 2022-11-15 | Apple Inc. | Selection of pulse repetition intervals for sensing time of flight |
US11555900B1 (en) | 2019-07-17 | 2023-01-17 | Apple Inc. | LiDAR system with enhanced area coverage |
EP4025868A1 (en) * | 2019-09-05 | 2022-07-13 | Plasser & Theurer Export Von Bahnbaumaschinen Gesellschaft m.b.H. | Method and measuring apparatus for measuring an object of a track |
JP2023504157A (en) | 2019-12-01 | 2023-02-01 | マジック アイ インコーポレイテッド | Improving triangulation-based 3D range finding using time-of-flight information |
US11733359B2 (en) | 2019-12-03 | 2023-08-22 | Apple Inc. | Configurable array of single-photon detectors |
US11644303B2 (en) * | 2019-12-16 | 2023-05-09 | Faro Technologies, Inc. | Three-dimensional coordinate measuring instrument coupled to a camera having a diffractive optical element |
TWI726536B (en) * | 2019-12-16 | 2021-05-01 | 財團法人工業技術研究院 | Image capturing method and image capturing apparatus |
US20210223038A1 (en) * | 2020-01-18 | 2021-07-22 | Magik Eye Inc. | Distance measurements including supplemental accuracy data |
CN111324145B (en) * | 2020-02-28 | 2022-08-16 | 厦门理工学院 | Unmanned aerial vehicle autonomous landing method, device, equipment and storage medium |
CN111442721B (en) * | 2020-03-16 | 2021-07-27 | 天目爱视(北京)科技有限公司 | Calibration equipment and method based on multi-laser ranging and angle measurement |
CN111781585B (en) * | 2020-06-09 | 2023-07-18 | 浙江大华技术股份有限公司 | Method for determining firework setting-off position and image acquisition equipment |
CN112033297A (en) * | 2020-08-10 | 2020-12-04 | 山东科技大学 | Derrick deformation monitoring method based on close-range photogrammetry technology |
CN112378336B (en) * | 2020-11-13 | 2023-02-17 | 南通中远海运川崎船舶工程有限公司 | Cabin capacity measuring system based on unmanned aerial vehicle and measuring method thereof |
US11681028B2 (en) | 2021-07-18 | 2023-06-20 | Apple Inc. | Close-range measurement of time of flight using parallax shift |
US20240168176A1 (en) * | 2021-08-31 | 2024-05-23 | James DOTAN | Precision positioning and pointing instrument |
CN114111639B (en) * | 2021-11-26 | 2024-04-30 | 凌云光技术股份有限公司 | Correction method and device of surface structured light three-dimensional measurement system |
US11869206B2 (en) * | 2021-12-28 | 2024-01-09 | Datalogic Ip Tech S.R.L. | Controllable laser pattern for eye safety and reduced power consumption for image capture devices |
CN114577180B (en) * | 2022-05-06 | 2022-07-15 | 成都纵横通达信息工程有限公司 | Geographic information mapping device, system and method based on unmanned aerial vehicle |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140046589A1 (en) * | 2011-04-14 | 2014-02-13 | Hexagon Technology Center Gmbh | Measuring system for determining 3d coordinates of an object surface |
US20150116693A1 (en) * | 2013-10-31 | 2015-04-30 | Kabushiki Kaisha Topcon | Three-Dimensional Measuring Method And Surveying System |
US20170122736A1 (en) * | 2015-11-03 | 2017-05-04 | Leica Geosystems Ag | Surface surveying device for determining 3d coordinates of a surface |
US20180035606A1 (en) * | 2016-08-05 | 2018-02-08 | Romello Burdoucci | Smart Interactive and Autonomous Robotic Property Maintenance Apparatus, System, and Method |
US10056001B1 (en) * | 2016-05-20 | 2018-08-21 | Amazon Technologies, Inc. | Three-dimensional representations of objects detected by an unmanned aerial vehicle |
US10234278B2 (en) * | 2015-09-09 | 2019-03-19 | Faro Technologies, Inc. | Aerial device having a three-dimensional measurement device |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6711293B1 (en) | 1999-03-08 | 2004-03-23 | The University Of British Columbia | Method and apparatus for identifying scale invariant features in an image and use of same for locating an object in an image |
US20130215433A1 (en) | 2010-10-12 | 2013-08-22 | Stephen James Crampton | Hover cmm |
WO2013132386A1 (en) * | 2012-03-08 | 2013-09-12 | Koninklijke Philips N.V. | Controllable high luminance illumination with moving light-sources |
JP6326237B2 (en) | 2014-01-31 | 2018-05-16 | 株式会社トプコン | Measuring system |
2016
- 2016-08-29 US US15/250,324 patent US9989357B2 (active)
- 2016-08-30 GB GB1803608.7A patent GB2556802A (withdrawn)
- 2016-08-30 DE DE112016004085.7T patent DE112016004085T5 (withdrawn)
- 2016-08-30 WO PCT/US2016/049372 patent WO2017044344A1 (application filing)
2018
- 2018-05-29 US US15/991,433 patent US10234278B2 (active)
2019
- 2019-02-11 US US16/272,291 patent US20190186905A1 (abandoned)
Also Published As
Publication number | Publication date |
---|---|
US10234278B2 (en) | 2019-03-19 |
US9989357B2 (en) | 2018-06-05 |
US20180274910A1 (en) | 2018-09-27 |
US20170067734A1 (en) | 2017-03-09 |
GB2556802A (en) | 2018-06-06 |
DE112016004085T5 (en) | 2018-06-14 |
WO2017044344A1 (en) | 2017-03-16 |
GB201803608D0 (en) | 2018-04-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10234278B2 (en) | Aerial device having a three-dimensional measurement device | |
US20190079522A1 (en) | Unmanned aerial vehicle having a projector and being tracked by a laser tracker | |
US10574963B2 (en) | Triangulation scanner and camera for augmented reality | |
US20200150217A1 (en) | Laser Speckle System and Method for an Aircraft | |
US20170094251A1 (en) | Three-dimensional imager that includes a dichroic camera | |
US10021379B2 (en) | Six degree-of-freedom triangulation scanner and camera for augmented reality | |
US9618620B2 (en) | Using depth-camera images to speed registration of three-dimensional scans | |
US9513107B2 (en) | Registration calculation between three-dimensional (3D) scans based on two-dimensional (2D) scan data from a 3D scanner | |
US8638446B2 (en) | Laser scanner or laser tracker having a projector | |
JP7025156B2 (en) | Data processing equipment, data processing method and data processing program | |
JP2019117188A (en) | System for surface analysis and associated method | |
CN105424006A (en) | Unmanned aerial vehicle hovering precision measurement method based on binocular vision | |
JP7011908B2 (en) | Optical information processing equipment, optical information processing method and optical information processing program | |
CN104503339A (en) | Multi-resolution indoor three-dimensional scene reconstitution device and method based on laser radar and quadrotor | |
WO2016025358A1 (en) | A six degree-of-freedom triangulation scanner and camera for augmented reality | |
WO2016089430A1 (en) | Using two-dimensional camera images to speed registration of three-dimensional scans | |
US10830889B2 (en) | System measuring 3D coordinates and method thereof | |
WO2016089431A1 (en) | Using depth-camera images to speed registration of three-dimensional scans | |
EP4134707A1 (en) | Construction site digital field book for three-dimensional scanners | |
JP7203935B2 (en) | Optical information processing device, optical information processing method, and optical information processing program | |
EP3943979A1 (en) | Indoor device localization | |
JP2017111118A (en) | Registration calculation between three-dimensional(3d)scans based on two-dimensional (2d) scan data from 3d scanner | |
JP2017111117A (en) | Registration calculation of three-dimensional scanner data performed between scans on the basis of measurements by two-dimensional scanner | |
WO2016089429A1 (en) | Intermediate two-dimensional scanning with a three-dimensional scanner to speed registration |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: FARO TECHNOLOGIES, INC., FLORIDA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HEIDEMANN, ROLF;WOHLFELD, DENIS;BRIDGES, ROBERT E.;AND OTHERS;SIGNING DATES FROM 20160823 TO 20160915;REEL/FRAME:048294/0545 |
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |