US20150085079A1 - Dynamic range of color camera images superimposed on scanned three-dimensional gray-scale images - Google Patents
- Publication number
- US20150085079A1 (U.S. application Ser. No. 14/494,639)
- Authority
- US
- United States
- Prior art keywords
- color
- image
- enhanced
- images
- sequence
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H04N13/0257—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/16—Spatio-temporal transformations, e.g. video cubism
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C15/00—Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00
- G01C15/002—Active optical surveying means
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
- G01C3/02—Details
- G01C3/06—Use of electric means to obtain final indication
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/51—Display arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/003—Navigation within 3D models or images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/38—Registration of image sequences
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/521—Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/579—Depth or shape recovery from multiple images from motion
-
- H04N13/0422—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/257—Colour aspects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/324—Colour aspects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/14—Picture signal circuitry for video frequency region
- H04N5/147—Scene change detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
- G06T2207/10021—Stereoscopic video; Stereoscopic image sequence
Definitions
- a method for optically scanning and measuring an object with a laser scanner, including: providing the laser scanner having integral components that include a light emitter, a light receiver, a first angle measuring device, a second angle measuring device, a control and evaluation unit, and a color camera; providing a color display; measuring a first angle with the first angle measuring device; measuring a second angle with the second angle measuring device; emitting with the light emitter an emission light beam; reflecting the emission light beam from the object to produce a reception light beam; receiving with the light receiver the reception light beam and obtaining a first electrical signal in response; determining with the control and evaluation unit distances to a plurality of measuring points on the object based at least in part on the first electrical signals for each of the plurality of measuring points and on a speed of light in air; determining with the control and evaluation unit gray-scale values for the plurality of measuring points; and capturing with the color camera a sequence of color images while the color camera is fixed in space, each image of the sequence captured with a different exposure time
- FIG. 1 is a schematic illustration of the optical, mechanical, and electrical components of the laser scanner
- FIG. 2 is a schematic illustration of the laser scanner in operation
- FIG. 3 is a perspective drawing of the laser scanner
- FIG. 4 is a flowchart of a method according to an embodiment.
- a laser scanner 10 is described in reference to FIGS. 1-3 .
- the laser scanner 10 is provided as a device for optically scanning and measuring an environment of the laser scanner 10 .
- the laser scanner 10 has a measuring head 12 and a base 14 .
- the measuring head 12 is mounted on the base 14 as a unit that can be rotated about a vertical axis.
- the measuring head 12 has a mirror 16 , which can be rotated about a horizontal axis.
- the intersection point of the two axes of rotation is designated center C 10 of the laser scanner 10 .
- the measuring head 12 is further provided with a light emitter 17 for emitting an emission light beam 18 .
- the emission light beam 18 is preferably a laser beam with a wavelength in the range of approximately 300 to 1600 nm, for example 1550 nm, 905 nm, 790 nm, or less than 400 nm. In principle, other electromagnetic waves having, for example, a greater wavelength can also be used.
- the emission light beam 18 is amplitude-modulated with a modulation signal.
- the emission light beam 18 is emitted by the light emitter 17 onto the rotary mirror 16 , where it is deflected and emitted to the environment.
- the direction of the emission light beam 18 and of the reception light beam 20 results from the angular positions of the rotary mirror 16 and the measuring head 12 , which depend on the positions of their corresponding rotary drives which, in turn, are registered by one encoder each.
- a control and evaluation unit 22 has a data connection to the light emitter 17 and to the light receiver 21 in the measuring head 12; parts of it can also be arranged outside the measuring head 12, for example as a computer connected to the base 14.
- the control and evaluation unit 22 is configured to determine, for a multitude of measuring points X, the distance d between the laser scanner 10 and the (illuminated point at) object O, from the propagation time of emission light beam 18 and reception light beam 20 . For this purpose, the phase shift between the two light beams 18 , 20 can be determined and evaluated, for example.
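The phase-shift evaluation just described can be sketched as follows: the round-trip phase shift of the amplitude-modulated beam, together with the modulation frequency, yields the one-way distance. The function name, the example modulation frequency, and the speed-of-light constant are illustrative assumptions, not taken from the patent.

```python
import math

# Illustrative sketch (not the patent's implementation) of determining the
# distance d from the phase shift between emission and reception of an
# amplitude-modulated light beam.
C_AIR = 299702547.0  # approximate speed of light in air, m/s (assumed value)

def distance_from_phase_shift(phase_shift_rad, modulation_freq_hz):
    """One-way distance from the measured round-trip phase shift.

    The beam travels to the object and back, so the round-trip path is 2*d;
    a phase shift of 2*pi corresponds to one modulation wavelength of travel.
    """
    wavelength = C_AIR / modulation_freq_hz  # modulation wavelength, m
    round_trip = (phase_shift_rad / (2.0 * math.pi)) * wavelength
    return round_trip / 2.0
```

For example, a pi/2 phase shift at 10 MHz modulation corresponds to roughly 3.75 m. In practice the ambiguity of the phase (it repeats every wavelength) is resolved with multiple modulation frequencies.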
- a display unit 24 is connected to the control and evaluation unit 22 .
- the display unit 24 in the present case is a display at the laser scanner 10 ; alternatively it can, however, also be the display of a computer which is connected to the base 14 .
- Scanning takes place along a circle by means of the relatively quick rotation of the mirror 16 .
- the whole space is scanned step by step, by way of the circles.
- the entirety of measuring points X of such a measurement defines a scan.
- the center C 10 of the laser scanner 10 defines the origin of the local stationary reference system.
- the base 14 rests in this local stationary reference system.
- each measuring point X comprises brightness information which is determined by the control and evaluation unit 22 as well.
- the brightness value is a gray-scale value which is determined, for example, by integration of the bandpass-filtered and amplified signal of the light receiver 21 over a measuring period which is assigned to the measuring point X.
- images can be generated optionally, by which colors (R,G,B) can be assigned to the measuring points as values.
- the laser scanner 10 is provided with a color camera 25 , which is connected to the control and evaluation unit 22 as well.
- the color camera 25 is configured, for example, as a CCD camera or a CMOS camera and provides a signal which is three-dimensional in color space, preferably an RGB signal, for an image which is two-dimensional in position space.
- the control and evaluation unit 22 concatenates the scan (which is three-dimensional in position space) of the laser scanner with the images (which are two-dimensional in position space) of the color camera 25, such concatenating being denoted "mapping." Concatenating takes place image by image for each of the captured color images so as to assign, as a final result, a color (as RGB values) to each measuring point X of the scan; that is, to color the scan.
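A minimal sketch of the per-point mapping idea, assuming a simple pinhole camera model: a 3D scan point is projected into the 2D color image, and the RGB value at the resulting pixel is assigned to the point. The intrinsics (fx, fy, cx, cy) are hypothetical; the actual registration depends on the scanner and camera geometry and is not reproduced here.

```python
# Hypothetical sketch of the "mapping" step: project a 3D point (in the
# camera frame) through a pinhole model and read off its RGB value.
def color_for_point(point_xyz, image_rgb, fx, fy, cx, cy):
    """Return the (r, g, b) of the pixel the point projects to, or None."""
    x, y, z = point_xyz
    if z <= 0:                        # point is behind the camera
        return None
    u = int(round(fx * x / z + cx))   # column in the image
    v = int(round(fy * y / z + cy))   # row in the image
    rows, cols = len(image_rgb), len(image_rgb[0])
    if 0 <= v < rows and 0 <= u < cols:
        return image_rgb[v][u]
    return None                       # point falls outside the camera FOV
```

Points that project outside the image (or behind the camera) keep only their gray-scale value until another camera image covers them.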
- the light receiver 21 is usually configured such that it does not receive the reception light beam 20 coming from the mirror 16 directly; instead, the mirror 16 deflects the reception light beam 20 to receiver optics 30.
- the receiver optics 30 forms on the light receiver 21 an image of the reception light beam 20 , coming from the mirror 16 .
- the mirror 16 has a small semiaxis which defines the diameter of the reception light beam 20 .
- the receiver optics 30 is provided with a reception lens 32, the diameter of which is at least as large as the small semiaxis of the mirror 16, so that it can completely receive the reception light beam 20 and project it onto the next optical element.
- the optical axis of the reception lens 32 is aligned to the mirror 16 .
- the receiver optics 30 reduces the diameter of the reception light beam 20 to the dimension of the light receiver.
- the direction of the reception light beam 20 passed to the light receiver is also the direction into the color camera 25 , which may be arranged behind the receiver optics 30 or within the receiver optics 30 .
- a preferred arrangement of the color camera 25 is, however, disclosed in the aforementioned U.S. Pat. No. 8,705,016.
- the color camera 25 is arranged in front of the receiver optics 30 .
- the light receiver 21 and the color camera 25 jointly use the mirror 16 , but the receiver optics 30 is used only by the light receiver 21 .
- An arrangement of the color camera 25 on the optical axis of the receiver lens 32 has the advantage of keeping aberrations at a low level; i.e., the receiver optics 30 and the color camera 25 view the same section of the environment.
- the color camera 25 can—with regard to the direction of the reception light beam 20 —be directly on the receiver lens 32 .
- the emission light beam 18 of the light emitter 17 can then be deflected, for example by a semitransparent mirror, to the optical axis of the receiver lens 32 , to further hit the mirror 16 .
- the color camera 25 can receive the reception light beam 20 at least partially, by a semitransparent mirror.
- the space directly on the receiver lens 32 can then be taken by the light emitter 17 .
- the light emitter 17 and the color camera 25 are in operation consecutively.
- the laser scanner 10, having its color camera 25 switched off, first scans the environment by the emission light beam 18 and receives the reception light beam 20, wherefrom a gray-scale scan is generated. It then captures the color images of the environment, with the light emitter 17 switched off, by the color camera 25.
- the control and evaluation unit 22 assigns colors to the measuring points X to color the gray-scale scan.
- the color camera 25 can increase the contrast of its images.
- the color camera 25 captures a sequence of images with a low dynamic range (LDR).
- dynamic range refers to a ratio of the maximum voltage level produced by any pixel to the minimum voltage level produced by any pixel.
- the dynamic range may be described by other equivalent measures, such as the ratio of the maximum number of electrons within any one pixel well to the minimum number of electrons in any pixel well.
- the dynamic range may also be thought of in terms of the lightness or brightness, as seen by the human eye, of the light captured by a pixel for each of the colors R, G, B.
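The voltage-ratio definition above can be sketched numerically; the decibel convention (20·log10 for a voltage ratio) is a common engineering assumption, not a term used in the patent.

```python
import math

# Sketch of the dynamic-range figure described above: the ratio of the
# largest to the smallest nonzero pixel signal, quoted in decibels.
def dynamic_range_db(pixel_levels):
    """20*log10 of max/min nonzero level (the voltage-ratio convention)."""
    nonzero = [v for v in pixel_levels if v > 0]
    return 20.0 * math.log10(max(nonzero) / min(nonzero))
```

For example, pixel levels spanning a factor of 1000 correspond to 60 dB of dynamic range.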
- multiple images are obtained with the scanner color camera 25 receiving light from a fixed part of the environment.
- Each of these images is said to be an LDR image because the maximum dynamic range cannot exceed a certain value for any particular camera array.
- the maximum level of light is limited by saturation level of the array, and the minimum level of light is limited by camera noise, especially camera electrical noise.
- as the camera exposure time is increased, some areas of the array begin to saturate, but the levels of other areas of the array that previously had very low levels are now somewhat higher.
- an image with a high dynamic range is generated, preferably in the control and evaluation unit 22 or in a suitable processing unit of the color camera 25 .
- the HDR image is then processed further. In this way, regions on the array that are completely dark or completely bright are avoided.
- Compared to LDR images, which may be encoded with only 8 bits per color channel, the HDR image has many more bits, for example 32 bits per color channel. Often, the values of the color channels for each pixel of the HDR image are represented by floating point numbers instead of integer numbers. Then, the values may be in similar ranges as the values of LDR images, but with finer gradations. If enough storage capacity is available, the complete sequence of LDR images, in addition to the resulting HDR image, may be retained to preserve full information.
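The generation of a floating-point HDR image from the exposure sequence can be sketched, for one color channel, roughly as below. Each LDR exposure gives a radiance estimate value/exposure_time, and the estimates are combined with a hat-shaped weight that distrusts near-dark and near-saturated pixels. The weighting function and the 8-bit input range are illustrative assumptions; practical implementations also calibrate the camera response curve.

```python
# Minimal HDR-merge sketch (single channel), assuming 8-bit LDR inputs.
def merge_hdr(ldr_images, exposure_times):
    """ldr_images: list of equally sized 2D lists of 0..255 ints.
    Returns a 2D list of floating-point radiance values (the HDR image)."""
    def weight(v):                        # hat function, largest at mid-scale
        return min(v, 255 - v) / 127.5
    rows, cols = len(ldr_images[0]), len(ldr_images[0][0])
    hdr = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            num = den = 0.0
            for img, t in zip(ldr_images, exposure_times):
                v = img[r][c]
                w = weight(v)
                num += w * (v / t)        # radiance estimate for this exposure
                den += w
            hdr[r][c] = num / den if den > 0 else 0.0
    return hdr
```

Pixels that are fully dark or fully saturated in one exposure contribute nothing there, but are recovered from the other exposures in the sequence.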
- the HDR image can be visualized even if the hardware can display only LDR images, for example on the display unit 24 .
- the process of mapping one set of colors to another to approximate the appearance of high dynamic range images in a medium that has a more limited dynamic range (for example, the display unit 24 ) is referred to as tone mapping.
- a slider control or dynamic compression (tone mapping) can be used for this purpose.
- in dynamic compression, the dynamic range of the HDR image is reduced to that of an LDR image by use of operators, particularly global operators, local operators, frequency-based operators, or gradient-based operators. Bright surfaces appear darker, and dark surfaces appear brighter. With local operators, maximum visibility of details is obtained, independently of the illumination situation.
- a smooth blending of local operators is constructed to generate continuous transitions without visible edges.
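As a sketch of tone mapping with a *global* operator, the following applies a Reinhard-style L/(1+L) compression that squeezes HDR radiances into the 0..1 range of an LDR display. This is a stand-in example, not an operator named in the patent, and the `key` constant is an assumed parameter; local, frequency-based, and gradient-based operators are more involved.

```python
import math

# Global tone-mapping sketch: scale by the log-average luminance, then
# compress each value with L/(1+L) so the output lies in [0, 1).
def tone_map_global(hdr, key=0.18):
    flat = [v for row in hdr for v in row]
    # log-average luminance of the HDR image (epsilon avoids log(0))
    log_avg = math.exp(sum(math.log(v + 1e-6) for v in flat) / len(flat))
    out = []
    for row in hdr:
        scaled = [key * v / log_avg for v in row]
        out.append([s / (1.0 + s) for s in scaled])
    return out
```

Because the same curve is applied everywhere, a global operator preserves the ordering of brightnesses: a brighter HDR value always maps to a brighter LDR value.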
- the required storage capacity may be reduced by applying tone mapping to enable keeping, not the entire sequence of LDR images, but instead a single, already processed image obtained using the methods described hereinabove.
- the resulting LDR image can then be used to color the gray-scale scan.
- the HDR image may be used to color the gray-scale scan, taking advantage of the finely graduated brightness levels to provide more object detail to enable more precise localization of objects.
- a HDR image may be used to enable display of an image in a preferred manner.
- a dynamically determined average brightness level can be taken into account, so that the required number of images to be shot can be limited.
- it can be determined whether there are bright areas or dark areas for which image details are not being extracted (because of overexposure or underexposure). Such a determination may be made, for example, based on brightness statistics.
- if a threshold value (which may be based at least in part on brightness statistics) is exceeded, the capturing of further images can be stopped without a loss of quality, minimizing the required time.
- quality can be traded off against speed.
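The quality-versus-speed trade-off above might be realized with a brightness-statistics test like the following sketch; the clipping thresholds and the allowed clipped fraction are hypothetical values, not from the patent.

```python
# Hypothetical stopping test: from the 8-bit brightness values of the most
# recent image, decide whether few enough pixels sit at the over/underexposed
# extremes that capturing further LDR images can stop.
def enough_detail(pixels, low=5, high=250, max_clipped_fraction=0.02):
    """True if at most max_clipped_fraction of pixels are clipped."""
    clipped = sum(1 for v in pixels if v <= low or v >= high)
    return clipped / len(pixels) <= max_clipped_fraction
```

Loosening `max_clipped_fraction` trades quality for speed: capture stops sooner, at the cost of some clipped regions.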
- HDR image and LDR image may be saved.
- the number of images can be reduced without sacrificing quality by observing averaged brightness values at pixels of the photosensitive array.
- the use of the term brightness in this context is understood to be related to the number of electrons created in R, G, B pixel wells in relation to the maximum number of electrons that the well will hold. This number of electrons is proportional to the optical power level passing from an object point through the camera lens before reaching an R, G, or B pixel in the camera photosensitive array.
- the pixel has a certain responsivity by which the integrated optical power is converted into a number of electrons in the pixel well.
- the electrons are extracted as an electrical current and converted into a voltage that is sampled with an analog-to-digital converter to provide a digital value.
- the voltage level for any particular pixel depends at least in part on (1) the optical power at the R, G, or B wavelength reflected from the object point into a corresponding pixel in the array, (2) the camera exposure time, and (3) responsivity of the pixel for the particular wavelength of light (R, G, or B).
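A toy model of the signal chain just described, from integrated optical power through responsivity to electrons clipped at the well capacity; all numerical values are illustrative assumptions.

```python
# Toy pixel-signal model: optical power integrated over the exposure time is
# converted to electrons via the pixel's responsivity, then saturates at the
# full-well capacity. Units and magnitudes are assumed for illustration.
def pixel_electrons(optical_power_w, exposure_s, responsivity_e_per_j,
                    well_capacity_e=20000.0):
    energy_j = optical_power_w * exposure_s       # integrated optical energy
    electrons = responsivity_e_per_j * energy_j   # photo-generated electrons
    return min(electrons, well_capacity_e)        # saturation at full well
```

Doubling the exposure time doubles the signal, but only until the well saturates, which is exactly why a single exposure cannot cover a high-dynamic-range scene.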
- Averaging of the brightness values can take place over a rotation of the mirror 16 .
- the averaged brightness values are then used to determine the different exposure times for the sequence of LDR images.
- the number of LDR images in the sequence may depend on the camera FOV and camera aperture.
- the rotation of the mirror 16 for averaging of the brightness values may be a rotation about the horizontal axis of the mirror 16 .
- this rotation may also be a rotation of the entire measuring head 12 about its vertical axis, thus also resulting in a rotation of the mirror 16 .
- the mirror 16 may be still with respect to the measuring head 12 (for example, by having no rotation of the mirror 16 around the horizontal axis).
- the mirror 16 may for example be aimed to the horizon (which defines the horizontal position of the mirror 16 ).
- the mirror 16 may be rotated about its horizontal axis in any of a number of different patterns, which might be full or partial rotations of the mirror.
- the mirror 16 performs an oscillatory rotation around the horizontal position, for example between the angles of −30° and +30° (a "waggling mirror").
- Vertical and horizontal rotations may be used separately or combined to obtain an average brightness value to use in determining the number of LDR images to collect and the exposure time for each.
- the averaging of the brightness values may also be determined from any previous image taken by the color camera 25 .
- the averaging of the brightness values may result in a single averaged brightness value used as a median for defining the exposure times.
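One plausible way to turn a single averaged brightness value into a set of exposure times, in the spirit of the bullet above, is sketched here. The mid-scale target of 128 and the power-of-two spacing of the bracket are assumptions for illustration, not details from the patent.

```python
# Sketch (assumed): use one averaged 0..255 brightness value as the median of
# an exposure bracket. The middle exposure is scaled so the average would land
# near mid-scale; the remaining exposures are spaced by powers of two.
def exposure_bracket(avg_brightness, base_exposure_s, n_images=5):
    """Return n_images exposure times (seconds), shortest first."""
    mid = base_exposure_s * 128.0 / max(avg_brightness, 1.0)
    half = n_images // 2
    return [mid * (2.0 ** k) for k in range(-half, half + 1)]
```

A brighter scene average shortens the whole bracket, and fewer images can be requested when the brightness statistics indicate a narrow range.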
- FIG. 4 shows a flowchart of the method of an embodiment of the present invention.
- in step 101, the laser scanner 10 generates a gray-scale scan over points X, each of the scan points obtained from several (e.g., 2000) samples of the propagation time of emission light beam 18 and reception light beam 20.
- in step 102, the color camera 25 captures a sequence of LDR color images with different exposure times. The order of step 101 and step 102 may be changed.
- in step 103, an HDR image is first generated from the sequence of LDR images with different exposure times; afterwards, a single LDR image is generated from the HDR image.
- Tone mapping is used, either to convert the HDR image still comprising the whole sequence of LDR images into the single LDR image, or to convert the sequence of LDR images into a processed HDR image.
- in step 104, the single LDR image (or the processed HDR image) is used to color the gray-scale scan into a color scan.
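The steps of the flowchart can be sketched as one pipeline in which each helper stands in for the corresponding operation described above; every parameter here is a hypothetical stand-in, not an interface from the patent.

```python
# High-level sketch of steps 101-104: the callables are stand-ins for the
# capture, HDR merge, tone mapping, and point-coloring operations.
def colorize_scan(scan_points, capture_ldr_sequence, merge_hdr, tone_map,
                  map_color_to_point):
    ldr_images, exposure_times = capture_ldr_sequence()          # step 102
    hdr = merge_hdr(ldr_images, exposure_times)                  # step 103a
    ldr = tone_map(hdr)                                          # step 103b
    return [(p, map_color_to_point(p, ldr)) for p in scan_points]  # step 104
```

Because steps 101 and 102 are independent, the gray-scale scan (`scan_points`) may be produced before or after the image sequence is captured.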
- Connections between the laser scanner and, where appropriate, parts of the control and evaluation unit arranged outside the measuring head, a display unit on a computer connected to the laser scanner, and further computers incorporated in the system can be made by wire or wirelessly, for example by means of WLAN.
Abstract
A laser scanner scans an object by measuring first and second angles with angle measuring devices, sending light onto the object and capturing the reflected light to determine distances and gray-scale values to points on the object, capturing a sequence of color images with a color camera at different exposure times, determining 3D coordinates and gray-scale values for points on the object, determining from the sequence of color images an enhanced color image having a higher dynamic range than is available from any single color image, and superimposing the enhanced color image on the 3D gray-scale image to obtain an enhanced 3D color image.
Description
- The present application claims the benefit of German Patent Application No. DE102013110580.7, filed on Sep. 24, 2013, and of U.S. Provisional Patent Application No. 61/926,461, filed on Jan. 13, 2014, the contents of both of which are hereby incorporated by reference in their entirety.
- U.S. Pat. No. 8,705,016 ('016) describes a laser scanner that colors collected scan data by assigning colors obtained from colored images, pixel by pixel, to the scan image. The hardware of the color camera determines the quality, brightness levels, and contrast of the colored three-dimensional (3D) images.
- Further described herein is a laser scanner that superimposes colors obtained from a color camera onto 3D gray-scale images obtained from a time-of-flight (TOF) laser scanner.
- A TOF scanner is any type of scanner in which the distance to a target point is based on the speed of light in air between the scanner and the target point. Laser scanners are typically used for scanning closed or open spaces such as interior areas of buildings, industrial installations and tunnels. They are used for many purposes, including industrial applications and accident reconstruction applications. A laser scanner can be used to optically scan and measure objects in a volume around the scanner through the acquisition of data points representing objects within the volume. Such data points are obtained by transmitting a beam of light onto the objects and collecting the reflected or scattered light to determine the distance, two angles (i.e., an azimuth and a zenith angle), and optionally a gray-scale value. This raw scan data is collected, stored and sent to a processor or processors to generate a 3D image representing the scanned area or object. To generate the image, at least three values are collected for each data point. These three values may include the distance and two angles, or may be transformed values, such as the x, y, z coordinates. In an embodiment, a fourth value collected by the 3D laser scanner is a gray-scale value for each point measured. Such a gray-scale value is related to the irradiance of scattered light returning to the scanner.
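The "transformed values" mentioned above (x, y, z from a distance and two angles) follow the usual spherical-to-Cartesian conversion. The axis convention below (zenith angle measured from the vertical axis, azimuth in the horizontal plane) is an assumption for illustration.

```python
import math

# Sketch: convert a measured distance d, azimuth angle, and zenith angle
# into x, y, z coordinates in the scanner's local reference system.
def spherical_to_cartesian(d, azimuth_rad, zenith_rad):
    x = d * math.sin(zenith_rad) * math.cos(azimuth_rad)
    y = d * math.sin(zenith_rad) * math.sin(azimuth_rad)
    z = d * math.cos(zenith_rad)
    return (x, y, z)
```

For example, a point measured straight up (zenith 0) at distance 2 maps to (0, 0, 2).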
- Angle measuring devices such as angular encoders are used to measure the two angles of rotation about the two axes of rotation. One type of angular encoder includes a disk and one or more readheads. In an embodiment, the disk is affixed to a rotating shaft, and the one or more read heads are affixed to a portion that is stationary with respect to the rotating shaft.
- Many contemporary laser scanners also include a camera mounted on the laser scanner for gathering camera digital images of the environment and for presenting the camera digital images to an operator of the laser scanner. By viewing the camera images, the operator of the scanner can determine the field of view (FOV) of the measured volume and adjust settings on the laser scanner to measure over a larger or smaller region of space if the FOV needs adjusting. In addition, the camera digital images may be transmitted to a processor to add color to the scanner image. To generate a color scanner image, at least six values (three positional coordinates such as x, y, z; and red value, green value, blue value or “RGB”) are collected for each data point.
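The patent does not specify how camera pixels are matched to scan points. Purely as an illustrative sketch, a simple pinhole projection could assign an RGB value to one 3D point in the camera frame; all parameter names here (fx, fy, cx, cy for focal lengths and principal point) are assumptions:

```python
def color_scan_point(point_xyz, image, fx, fy, cx, cy):
    """Assign an RGB color to one 3D measuring point by projecting it into
    a color image with a simple pinhole model (illustrative assumption;
    the patent does not specify a camera model). `image` is a row-major
    list of rows of (R, G, B) tuples. Returns None if the point lies
    behind the camera or projects outside the image."""
    x, y, z = point_xyz
    if z <= 0:
        return None  # point is behind the camera
    u = int(round(fx * x / z + cx))  # column index in the image
    v = int(round(fy * y / z + cy))  # row index in the image
    if 0 <= v < len(image) and 0 <= u < len(image[0]):
        return image[v][u]
    return None
```

Applied to every measuring point, this yields the six values per data point (x, y, z plus R, G, B) mentioned above.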
- The data collected by a laser scanner is often referred to as point cloud data because the data, which is typically relatively dense, may resemble a cloud. The term point cloud is taken herein to mean a collection of 3D values associated with scanned objects. The point cloud data may be used to produce 3D representations of the scene being scanned.
- A single color camera image provides red, green, and blue pixel values, each displayed on the final image with a varying degree of color, from zero to 100 percent red, green, or blue. However, the level of color displayed for a given pixel is generally limited by the need to avoid saturation of pixels throughout the entire camera photosensitive array. In other words, the maximum light received by any pixel in the array determines the maximum exposure time for the entire array. As a result, a colorized scanned image may have bright colors in one portion of the color image but dim colors at other parts of the image. Such an image is said to have relatively low dynamic range because those parts of the color image receiving relatively low light may not show details that would be desirable to see in a final color 3D image.
- Accordingly, while existing 3D scanners are suitable for their intended purposes, what is needed is a 3D scanner having certain features of embodiments of the present invention.
- According to one aspect of the invention, a method is provided for optically scanning and measuring an object with a laser scanner, the method including providing the laser scanner having integral components that include a light emitter, a light receiver, a first angle measuring device, a second angle measuring device, a control and evaluation unit, and a color camera; providing a color display; measuring a first angle with the first angle measuring device; measuring a second angle with the second angle measuring device; emitting with the light emitter an emission light beam; reflecting the emission light beam from the object to produce a reception light beam; receiving with the light receiver the reception light beam and obtaining a first electrical signal in response; determining with the control and evaluation unit distances to a plurality of measuring points on the object based at least in part on the first electrical signals for each of the plurality of measuring points and on a speed of light in air; determining with the control and evaluation unit gray-scale values for the plurality of measuring points; capturing with the color camera a sequence of color images while the color camera is fixed in space, each image of the sequence captured with a different exposure time and having an associated first dynamic range, the color images providing second electrical signals in response; determining with the control and evaluation unit a 3D gray-scale image based at least in part on the first angle, the second angle, the distances to and gray-scale values for the plurality of measuring points on the object; determining with the control and evaluation unit an enhanced color image having an enhanced dynamic range, the enhanced dynamic range being higher than any of the associated first dynamic ranges, the enhanced color image based at least in part on the second electrical signals; determining with the control and evaluation unit an enhanced 3D color image by 
superimposing the enhanced color image on the 3D gray-scale image; and displaying the enhanced 3D color image on the color display.
- The subject matter, which is regarded as the invention, is particularly pointed out and distinctly claimed in the claims at the conclusion of the specification. The foregoing and other features, and advantages of the invention are apparent from the following detailed description taken in conjunction with the accompanying drawings in which:
-
FIG. 1 is a schematic illustration of the optical, mechanical, and electrical components of the laser scanner; -
FIG. 2 is a schematic illustration of the laser scanner in operation; -
FIG. 3 is a perspective drawing of the laser scanner; and -
FIG. 4 is a flowchart of a method according to an embodiment. - The detailed description explains embodiments of the invention, together with advantages and features, by way of example with reference to the drawings.
- A
laser scanner 10 is described in reference to FIGS. 1-3. The laser scanner 10 is provided as a device for optically scanning and measuring an environment of the laser scanner 10. The laser scanner 10 has a measuring head 12 and a base 14. The measuring head 12 is mounted on the base 14 as a unit that can be rotated about a vertical axis. The measuring head 12 has a mirror 16, which can be rotated about a horizontal axis. The intersection point of the two axes of rotation is designated center C10 of the laser scanner 10. - The
measuring head 12 is further provided with a light emitter 17 for emitting an emission light beam 18. The emission light beam 18 is preferably a laser beam with a wavelength in the range of approximately 300 to 1600 nm, for example 1550 nm, 905 nm, 790 nm, or less than 400 nm; in principle, however, other electromagnetic waves with, for example, a greater wavelength can also be used. The emission light beam 18 is amplitude-modulated with a modulation signal. The emission light beam 18 is emitted by the light emitter 17 onto the rotary mirror 16, where it is deflected and emitted to the environment. A reception light beam 20, which is reflected in the environment by an object O or otherwise scattered, is captured again by the rotary mirror 16, deflected, and directed onto a light receiver 21. The direction of the emission light beam 18 and of the reception light beam 20 results from the angular positions of the rotary mirror 16 and the measuring head 12, which depend on the positions of their corresponding rotary drives, each of which is registered by an encoder. - A control and
evaluation unit 22 has a data connection to the light emitter 17 and to the light receiver 21 in the measuring head 12, whereby parts of it can also be arranged outside the measuring head 12, for example as a computer connected to the base 14. The control and evaluation unit 22 is configured to determine, for a multitude of measuring points X, the distance d between the laser scanner 10 and the (illuminated point at) object O from the propagation time of the emission light beam 18 and the reception light beam 20. For this purpose, for example, the phase shift between the two light beams 18 and 20 is determined and evaluated. - A
display unit 24 is connected to the control and evaluation unit 22. The display unit 24 in the present case is a display at the laser scanner 10; alternatively, it can also be the display of a computer which is connected to the base 14. - Scanning takes place along a circle by means of the relatively quick rotation of the
mirror 16. By virtue of the relatively slow rotation of the measuring head 12 relative to the base 14, the whole space is scanned step by step, by way of the circles. The entirety of measuring points X of such a measurement defines a scan. For such a scan, the center C10 of the laser scanner 10 defines the origin of the local stationary reference system. The base 14 rests in this local stationary reference system. - In addition to the distance d to the center C10 of the
laser scanner 10, each measuring point X comprises brightness information, which is determined by the control and evaluation unit 22 as well. The brightness value is a gray-scale value which is determined, for example, by integration of the bandpass-filtered and amplified signal of the light receiver 21 over a measuring period assigned to the measuring point X. Through use of a color camera, images can optionally be generated, by which colors (R, G, B) can be assigned to the measuring points as values. - The
laser scanner 10 is provided with a color camera 25, which is connected to the control and evaluation unit 22 as well. The color camera 25 is configured, for example, as a CCD camera or a CMOS camera and provides a signal which is three-dimensional in color space, preferably an RGB signal, for an image which is two-dimensional in position space. The control and evaluation unit 22 concatenates the scan (which is three-dimensional in position space) of the laser scanner with the images (which are two-dimensional in position space) of the color camera 25, such concatenating being denoted "mapping." Concatenating takes place image by image for each of the captured color images so as to assign, as a final result, a color (in RGB share) to each measuring point X of the scan; that is, to color the scan. - The
light receiver 21 usually is configured such that it does not receive the reception light beam 20 coming from the mirror 16 directly; rather, the mirror 16 deflects the reception light beam 20 to receiver optics 30. Through use of optical components, particularly lenses and/or mirrors, the receiver optics 30 forms on the light receiver 21 an image of the reception light beam 20 coming from the mirror 16. As a 45° sectional area of a cylinder, the mirror 16 has a small semiaxis which defines the diameter of the reception light beam 20. The receiver optics 30 is provided with a reception lens 32, the diameter of which is at least as big as the small semiaxis of the mirror 16, so that it can completely receive the reception light beam 20 and project it onto the next optical element. The optical axis of the reception lens 32 is aligned to the mirror 16. The receiver optics 30 reduces the diameter of the reception light beam 20 to the dimension of the light receiver 21. - The direction of the
reception light beam 20 passed to the light receiver is also the direction into the color camera 25, which may be arranged behind the receiver optics 30 or within the receiver optics 30. A preferred arrangement of the color camera 25 is, however, disclosed in the aforementioned U.S. Pat. No. 8,705,016. With regard to the direction of the reception light beam 20, the color camera 25 is arranged in front of the receiver optics 30. In other words, the light receiver 21 and the color camera 25 jointly use the mirror 16, but the receiver optics 30 is used only by the light receiver 21. - An arrangement of the
color camera 25 on the optical axis of the reception lens 32 has the advantage of keeping aberrations at a low level; i.e., the receiver optics 30 and the color camera 25 view the same section of the environment. The color camera 25 can, with regard to the direction of the reception light beam 20, be directly on the reception lens 32. The emission light beam 18 of the light emitter 17 can then be deflected, for example by a semitransparent mirror, to the optical axis of the reception lens 32, to further hit the mirror 16. Alternatively, the color camera 25 can receive the reception light beam 20 at least partially by a semitransparent mirror. The space directly on the reception lens 32 can then be taken by the light emitter 17. - Preferably, the
light emitter 17 and the color camera 25 are operated consecutively. In an embodiment, the laser scanner 10, having its color camera 25 switched off, first scans the environment by the emission light beam 18 and receives the reception light beam 20, wherefrom a gray-scale scan is generated. It then captures the color images of the environment, with the light emitter 17 switched off, by the color camera 25. The control and evaluation unit 22 assigns colors to the measuring points X to color the gray-scale scan. - The
color camera 25 can increase the contrast of its images. For this purpose, the color camera 25 captures a sequence of images with a low dynamic range (LDR). In this context, the term dynamic range refers to the ratio of the maximum voltage level produced by any pixel to the minimum voltage level produced by any pixel. The dynamic range may be described by other equivalent measures, such as the ratio of the maximum number of electrons within any one pixel well to the minimum number of electrons in any pixel well. Somewhat less precisely, the dynamic range may be considered a level of lightness or brightness of light at a point captured by a pixel, as seen by the human eye, for each of the colors R, G, B. - In an embodiment, multiple images are obtained with the
scanner color camera 25 receiving light from a fixed part of the environment. Each of these images is said to be an LDR image because the maximum dynamic range cannot exceed a certain value for any particular camera array. The maximum level of light is limited by the saturation level of the array, and the minimum level of light is limited by camera noise, especially camera electrical noise. As the camera exposure time is increased, some areas of the array begin to saturate, but the levels of other areas of the array that previously had very low levels are now somewhat higher. By collecting multiple LDR images, each having a different exposure time, each small area of the environment may be captured with an appropriate level of illumination and exposure. From the sequence of differently exposed LDR images, an image with a high dynamic range (HDR) is generated, preferably in the control and evaluation unit 22 or in a suitable processing unit of the color camera 25. The HDR image is then processed further. In this way, regions on the array that are completely dark or completely bright are avoided. - Compared to LDR images, which may be encoded with only 8 bits per color channel, the HDR image has many more bits, for example 32 bits per color channel. Often, the values of the color channels for each pixel of the HDR image are represented by floating-point numbers instead of integer numbers. The values may then lie in ranges similar to those of LDR images, but with finer gradations. If enough storage capacity is available, the complete sequence of LDR images, in addition to the resulting HDR image, may be retained to preserve full information.
- The HDR image can be visualized even if the hardware can display only LDR images, for example on the
display unit 24. The process of mapping one set of colors to another to approximate the appearance of high-dynamic-range images in a medium that has a more limited dynamic range (for example, the display unit 24) is referred to as tone mapping. A slide or a dynamic compression (tone mapping) can be used for this purpose. With dynamic compression, the dynamic range of the HDR image is reduced to an LDR image by use of operators, particularly global operators, local operators, frequency-based operators, or gradient-based operators. Bright surfaces appear darker, and dark surfaces appear brighter. With local operators, maximum visibility of details is obtained, independently of the illumination situation. In an embodiment, a fluent linking of local operators is constructed to generate continuous transitions without edges. The required storage capacity may be reduced by applying tone mapping so that, instead of the entire sequence of LDR images, only a single, already-processed image obtained using the methods described hereinabove needs to be kept. - The resulting LDR image can then be used to color the gray-scale scan. Alternatively, the HDR image may be used to color the gray-scale scan, taking advantage of the finely graduated brightness levels to provide more object detail and enable more precise localization of objects. Furthermore, an HDR image may be used to enable display of an image in a preferred manner.
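As one example of a global operator of the kind mentioned above, the Reinhard operator L/(1+L) compresses an HDR value into a displayable range. The choice of this particular operator is an assumption for illustration, not taken from the patent:

```python
def tone_map_reinhard(hdr_value):
    """Compress one HDR radiance value into [0, 1) with the Reinhard
    global operator L / (1 + L). Small values pass through almost
    linearly; very bright values are compressed toward 1, so bright
    surfaces appear darker while dark surfaces remain visible."""
    return hdr_value / (1.0 + hdr_value)
```

Scaling the result by 255 yields an 8-bit LDR value suitable for a display that cannot show HDR images directly.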
- When capturing images, a dynamically determined average brightness level can be taken into account, so that the required number of images to be shot can be limited. When capturing the images, it can be determined whether there are bright areas or dark areas for which image details are not being extracted (because of overexposure or underexposure). Such a determination may be made, for example, based on brightness statistics. When a threshold value (which may be based at least in part on brightness statistics) is exceeded, capturing of further images can be stopped without a loss of quality, minimizing the required time. Depending on the user settings, quality can be traded off against speed. To take full advantage of the collected information, the HDR image and the LDR image may be saved.
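A minimal sketch of how an averaged brightness value might bound the exposure bracket is given below; the bracketing scheme (fixed stops around a center exposure) and all parameter names are illustrative assumptions, since the patent only states that exposure times are based on averaged brightness values:

```python
def exposure_sequence(avg_brightness, base_time=0.01, target=128.0,
                      n_images=3, stops=2.0):
    """Derive a bracket of exposure times centered on the exposure that
    would bring the averaged brightness (0-255) to a mid-gray target.
    n_images is assumed odd so the bracket is symmetric. All defaults
    are illustrative assumptions, not values from the patent."""
    # Center exposure: scale base_time so the average lands on the target,
    # assuming brightness grows roughly linearly with exposure time.
    center = base_time * target / max(avg_brightness, 1.0)
    half = n_images // 2
    return [center * (stops ** k) for k in range(-half, half + 1)]
```

A scene that already averages mid-gray keeps the base exposure at the center of the bracket, while a dark scene shifts the whole bracket toward longer exposures.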
- As explained hereinabove, the number of images can be reduced without sacrificing quality by observing averaged brightness values at pixels of the photosensitive array. The use of the term brightness in this context is understood to be related to the number of electrons created in R, G, B pixel wells in relation to the maximum number of electrons that the well will hold. This number of electrons is proportional to the optical power level passing from an object point through the camera lens before reaching an R, G, or B pixel in the camera photosensitive array. The pixel has a certain responsivity by which the integrated optical power is converted into a number of electrons in the pixel well. The electrons are extracted as an electrical current and converted into a voltage that is sampled with an analog-to-digital converter. The voltage level for any particular pixel depends at least in part on (1) the optical power at the R, G, or B wavelength reflected from the object point into a corresponding pixel in the array, (2) the camera exposure time, and (3) the responsivity of the pixel for the particular wavelength of light (R, G, or B).
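The three dependencies listed above (reflected optical power, exposure time, responsivity) can be summarized in a toy model of the pixel signal chain; all constants and names here are illustrative assumptions:

```python
def pixel_voltage(optical_power_w, exposure_s, responsivity_e_per_j,
                  well_capacity_e=20000, volts_per_electron=5e-5):
    """Toy model of the pixel signal chain described above: integrated
    optical power (power x exposure time, in joules) is converted by the
    pixel responsivity into electrons, clipped at the well capacity
    (saturation), then read out as a voltage. All constants are
    illustrative assumptions, not sensor specifications."""
    electrons = min(optical_power_w * exposure_s * responsivity_e_per_j,
                    well_capacity_e)
    return electrons * volts_per_electron
```

The clipping term makes the saturation behavior explicit: beyond a certain exposure time, the voltage no longer grows, which is why a single exposure cannot capture both very bright and very dark parts of a scene.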
- Averaging of the brightness values can take place over a rotation of the
mirror 16. The averaged brightness values are then used to determine the different exposure times for the sequence of LDR images. The number of LDR images in the sequence may depend on the camera FOV and camera aperture. - The rotation of the
mirror 16 for averaging of the brightness values may be a rotation about the horizontal axis of the mirror 16. However, this rotation may also be a rotation of the entire measuring head 12 about its vertical axis, thus also resulting in a rotation of the mirror 16. During the rotation of the measuring head 12, the mirror 16 may be still with respect to the measuring head 12 (for example, by having no rotation of the mirror 16 around the horizontal axis). In this case, the mirror 16 may, for example, be aimed at the horizon (which defines the horizontal position of the mirror 16). Alternatively, the mirror 16 may be rotated about its horizontal axis in any of a number of different patterns, which might be full or partial rotations of the mirror. In an embodiment, the mirror 16 performs an oscillatory rotation around the horizontal position, for example between the angles of −30° and +30° (a "waggling mirror"). Vertical and horizontal rotations may be used separately or combined to obtain an average brightness value to use in determining the number of LDR images to collect and the exposure time for each. - As an alternative to an extra rotation (or other movement) of the
mirror 16, the averaged brightness values may also be determined from any previous image taken by the color camera 25. The averaging of the brightness values may result in a single averaged brightness value used as the median for defining the exposure times. -
FIG. 4 shows a flowchart of the method of an embodiment of the present invention. In step 101, the laser scanner 10 generates a gray-scale scan over points X, each of the scan points obtained from several (e.g., 2000) samples of the propagation time of the emission light beam 18 and the reception light beam 20. In step 102, the color camera 25 captures a sequence of LDR color images with different exposure times. The order of step 101 and step 102 may be changed. In step 103, an HDR image is first generated from the sequence of LDR images with different exposure times; afterwards, a single LDR image is generated from the HDR image. Tone mapping is used either to convert the HDR image, still comprising the whole sequence of LDR images, into the single LDR image, or to convert the sequence of LDR images into a processed HDR image. In step 104, the single LDR image (or the processed HDR image) is used to color the gray-scale scan into a color scan. - Connection between the laser scanner and, where appropriate, parts of the control and evaluation unit which are arranged outside the measuring head, and, where appropriate, a display unit on a computer which is connected to the laser scanner, and further computers which are incorporated in the system, can be carried out by wire or wirelessly, for example by means of WLAN.
- While the invention has been described in detail in connection with only a limited number of embodiments, it should be readily understood that the invention is not limited to such disclosed embodiments. Rather, the invention can be modified to incorporate any number of variations, alterations, substitutions or equivalent arrangements not heretofore described, but which are commensurate with the spirit and scope of the invention. Additionally, while various embodiments of the invention have been described, it is to be understood that aspects of the invention may include only some of the described embodiments. Accordingly, the invention is not to be seen as limited by the foregoing description, but is only limited by the scope of the appended claims.
Claims (7)
1. A method for optically scanning and measuring an object with a laser scanner, the method comprising:
providing the laser scanner having integral components that include a light emitter, a light receiver, a first angle measuring device, a second angle measuring device, a control and evaluation unit, and a color camera;
providing a color display;
measuring a first angle with the first angle measuring device;
measuring a second angle with the second angle measuring device;
emitting with the light emitter an emission light beam;
reflecting the emission light beam from the object to produce a reception light beam;
receiving with the light receiver the reception light beam and obtaining a first electrical signal in response;
determining with the control and evaluation unit distances to a plurality of measuring points on the object based at least in part on the first electrical signals for each of the plurality of measuring points and on a speed of light in air;
determining with the control and evaluation unit gray-scale values for the plurality of measuring points;
capturing with the color camera a sequence of color images while the color camera is fixed in space, each image of the sequence captured with a different exposure time and having an associated first dynamic range, the color images providing second electrical signals in response;
determining with the control and evaluation unit a three-dimensional (3D) gray-scale image based at least in part on the first angle, the second angle, the distances to and gray-scale values for the plurality of measuring points on the object;
determining with the control and evaluation unit an enhanced color image having an enhanced dynamic range, the enhanced dynamic range being higher than any of the associated first dynamic ranges, the enhanced color image based at least in part on the second electrical signals;
determining with the control and evaluation unit an enhanced 3D color image by superimposing the enhanced color image on the 3D gray-scale image; and
displaying the enhanced 3D color image on the color display.
2. The method of claim 1 , further including a step of applying dynamic compression to the enhanced color image to obtain a reduced dynamic range selected to match properties of the color display.
3. The method of claim 1 , wherein in the step of providing the laser scanner, the laser scanner further includes a rotatable mirror, the rotatable mirror configured to rotate about the first angle.
4. The method of claim 3 , wherein in the step of providing the laser scanner, the rotatable mirror reflects light from the object into the color camera.
5. The method of claim 4 , further including a step of determining an average value of second electrical signals over a plurality of the first angles obtained in response to rotation of the rotatable mirror.
6. The method of claim 5 , wherein, in the step of capturing with the color camera a sequence of color images, the sequence of color images has a number of color images in the sequence, the number of color images based at least in part on the average value of the second electrical signals.
7. The method of claim 6 , wherein, in the step of capturing with the color camera a sequence of color images, the different exposure time of each image of the sequence is based at least in part on the average value of the second electrical signals.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1606874.4A GB2533753A (en) | 2013-09-24 | 2014-09-24 | Improved dynamic range of color camera images superimposed on scanned three dimensional |
JP2016516903A JP2016537613A (en) | 2013-09-24 | 2014-09-24 | Improved dynamic range of color camera images superimposed on 3D grayscale scanned images |
PCT/US2014/057093 WO2015048053A1 (en) | 2013-09-24 | 2014-09-24 | Improved dynamic range of color camera images superimposed on scanned three dimensional gray scale images |
US14/494,639 US20150085079A1 (en) | 2013-09-24 | 2014-09-24 | Dynamic range of color camera images superimposed on scanned three-dimensional gray-scale images |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102013110580.7 | 2013-09-24 | ||
DE102013110580.7A DE102013110580B4 (en) | 2013-09-24 | 2013-09-24 | Method for optically scanning and measuring a scene and laser scanner designed to carry out the method |
US201461926461P | 2014-01-13 | 2014-01-13 | |
US14/494,639 US20150085079A1 (en) | 2013-09-24 | 2014-09-24 | Dynamic range of color camera images superimposed on scanned three-dimensional gray-scale images |
Publications (1)
Publication Number | Publication Date |
---|---|
US20150085079A1 true US20150085079A1 (en) | 2015-03-26 |
Family
ID=52623448
Family Applications (7)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/493,426 Active 2035-06-16 US9741093B2 (en) | 2013-09-24 | 2014-09-23 | Collecting and viewing three-dimensional scanner data in a flexible video format |
US14/494,639 Abandoned US20150085079A1 (en) | 2013-09-24 | 2014-09-24 | Dynamic range of color camera images superimposed on scanned three-dimensional gray-scale images |
US15/434,220 Active US9747662B2 (en) | 2013-09-24 | 2017-02-16 | Collecting and viewing three-dimensional scanner data in a flexible video format |
US15/685,527 Active US9965829B2 (en) | 2013-09-24 | 2017-08-24 | Collecting and viewing three-dimensional scanner data in a flexible video format |
US15/935,490 Active US10109033B2 (en) | 2013-09-24 | 2018-03-26 | Collecting and viewing three-dimensional scanner data in a flexible video format |
US16/146,002 Active US10475155B2 (en) | 2013-09-24 | 2018-09-28 | Collecting and viewing three-dimensional scanner data in a flexible video format |
US16/596,003 Active US10896481B2 (en) | 2013-09-24 | 2019-10-08 | Collecting and viewing three-dimensional scanner data with user defined restrictions |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/493,426 Active 2035-06-16 US9741093B2 (en) | 2013-09-24 | 2014-09-23 | Collecting and viewing three-dimensional scanner data in a flexible video format |
Family Applications After (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/434,220 Active US9747662B2 (en) | 2013-09-24 | 2017-02-16 | Collecting and viewing three-dimensional scanner data in a flexible video format |
US15/685,527 Active US9965829B2 (en) | 2013-09-24 | 2017-08-24 | Collecting and viewing three-dimensional scanner data in a flexible video format |
US15/935,490 Active US10109033B2 (en) | 2013-09-24 | 2018-03-26 | Collecting and viewing three-dimensional scanner data in a flexible video format |
US16/146,002 Active US10475155B2 (en) | 2013-09-24 | 2018-09-28 | Collecting and viewing three-dimensional scanner data in a flexible video format |
US16/596,003 Active US10896481B2 (en) | 2013-09-24 | 2019-10-08 | Collecting and viewing three-dimensional scanner data with user defined restrictions |
Country Status (3)
Country | Link |
---|---|
US (7) | US9741093B2 (en) |
DE (1) | DE102013110580B4 (en) |
WO (1) | WO2015048048A1 (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150096181A1 (en) * | 2013-10-09 | 2015-04-09 | Hexagon Technology Center Gmbh | Surveying device having a rotation mirror for optically scanning an environment |
USD756813S1 (en) * | 2014-05-21 | 2016-05-24 | Kongsberg Seatex As | Laser based positioning sensor |
US9531967B2 (en) | 2013-12-31 | 2016-12-27 | Faro Technologies, Inc. | Dynamic range of a line scanner having a photosensitive array that provides variable exposure |
US9658061B2 (en) | 2013-12-31 | 2017-05-23 | Faro Technologies, Inc. | Line scanner that uses a color image sensor to improve dynamic range |
WO2018031266A1 (en) * | 2016-08-12 | 2018-02-15 | Microvision, Inc. | Devices and methods for adjustable resolution depth mapping |
CN109013204A (en) * | 2018-10-26 | 2018-12-18 | 江苏科瑞恩自动化科技有限公司 | A kind of gluing process and device based on the movement of laser traces track |
US11513343B2 (en) | 2019-09-27 | 2022-11-29 | Faro Technologies, Inc. | Environmental scanning and image reconstruction thereof |
CN116418967A (en) * | 2023-04-13 | 2023-07-11 | 青岛图海纬度科技有限公司 | Color restoration method and device for laser scanning of underwater dynamic environment |
Families Citing this family (48)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9210288B2 (en) | 2009-11-20 | 2015-12-08 | Faro Technologies, Inc. | Three-dimensional scanner with dichroic beam splitters to capture a variety of signals |
DE102009057101A1 (en) | 2009-11-20 | 2011-05-26 | Faro Technologies, Inc., Lake Mary | Device for optically scanning and measuring an environment |
US9529083B2 (en) | 2009-11-20 | 2016-12-27 | Faro Technologies, Inc. | Three-dimensional scanner with enhanced spectroscopic energy detector |
DE102010020925B4 (en) | 2010-05-10 | 2014-02-27 | Faro Technologies, Inc. | Method for optically scanning and measuring an environment |
DE102012100609A1 (en) | 2012-01-25 | 2013-07-25 | Faro Technologies, Inc. | Device for optically scanning and measuring an environment |
US10139985B2 (en) * | 2012-06-22 | 2018-11-27 | Matterport, Inc. | Defining, displaying and interacting with tags in a three-dimensional model |
US10163261B2 (en) * | 2014-03-19 | 2018-12-25 | Matterport, Inc. | Selecting two-dimensional imagery data for display within a three-dimensional model |
US9786097B2 (en) | 2012-06-22 | 2017-10-10 | Matterport, Inc. | Multi-modal method for interacting with 3D models |
DE102012109481A1 (en) * | 2012-10-05 | 2014-04-10 | Faro Technologies, Inc. | Device for optically scanning and measuring an environment |
US10067231B2 (en) | 2012-10-05 | 2018-09-04 | Faro Technologies, Inc. | Registration calculation of three-dimensional scanner data performed between scans based on measurements by two-dimensional scanner |
US9652852B2 (en) | 2013-09-24 | 2017-05-16 | Faro Technologies, Inc. | Automated generation of a three-dimensional scanner video |
DE102014116904B4 (en) * | 2014-11-19 | 2016-11-24 | Faro Technologies, Inc. | Method for optically scanning and measuring a scene and automatically generating a video |
DE102013110580B4 (en) | 2013-09-24 | 2024-05-23 | Faro Technologies, Inc. | Method for optically scanning and measuring a scene and laser scanner designed to carry out the method |
US9934222B2 (en) | 2014-04-22 | 2018-04-03 | Google Llc | Providing a thumbnail image that follows a main image |
US9972121B2 (en) * | 2014-04-22 | 2018-05-15 | Google Llc | Selecting time-distributed panoramic images for display |
USD780777S1 (en) | 2014-04-22 | 2017-03-07 | Google Inc. | Display screen with graphical user interface or portion thereof |
USD781317S1 (en) | 2014-04-22 | 2017-03-14 | Google Inc. | Display screen with graphical user interface or portion thereof |
JP6387782B2 (en) * | 2014-10-17 | 2018-09-12 | ソニー株式会社 | Control device, control method, and computer program |
US10175360B2 (en) | 2015-03-31 | 2019-01-08 | Faro Technologies, Inc. | Mobile three-dimensional measuring instrument |
CN105241406B (en) * | 2015-09-29 | 2018-09-25 | 苏州金螳螂建筑装饰股份有限公司 | Accuracy inspection method for three-dimensional modeling of building decoration |
EP3377917A1 (en) * | 2015-11-18 | 2018-09-26 | Faro Technologies, Inc. | Automated generation of a three-dimensional scanner video |
DE102015122845A1 (en) * | 2015-12-27 | 2017-06-29 | Faro Technologies, Inc. | Method for optically scanning and measuring an environment by means of a 3D measuring device and evaluation in the network |
DE102015122846A1 (en) | 2015-12-27 | 2017-06-29 | Faro Technologies, Inc. | Method for optically scanning and measuring an environment by means of a 3D measuring device and near-field communication |
CA2962334C (en) * | 2016-03-29 | 2023-01-24 | Robert Lindsay Vanderbeck | Tunnel convergence detection apparatus and method |
WO2017199785A1 (en) * | 2016-05-17 | 2017-11-23 | コニカミノルタ株式会社 | Monitoring system setting method, and monitoring system |
US10482621B2 (en) * | 2016-08-01 | 2019-11-19 | Cognex Corporation | System and method for improved scoring of 3D poses and spurious point removal in 3D image data |
US10120075B2 (en) | 2016-08-19 | 2018-11-06 | Faro Technologies, Inc. | Using a two-dimensional scanner to speed registration of three-dimensional scan data |
US10380749B2 (en) | 2016-09-26 | 2019-08-13 | Faro Technologies, Inc. | Device and method for indoor mobile mapping of an environment |
US10282854B2 (en) | 2016-10-12 | 2019-05-07 | Faro Technologies, Inc. | Two-dimensional mapping system and method of operation |
EP3315907A1 (en) * | 2016-10-27 | 2018-05-02 | Leica Geosystems AG | Method for the visual representation of scan data |
CA172005S (en) * | 2016-12-01 | 2017-08-11 | Riegl Laser Measurement Systems Gmbh | Laser scanner for surveying, for topographical and distance measurement |
GB2563307A (en) * | 2017-01-30 | 2018-12-12 | Faro Tech Inc | Method for optically scanning and measuring an environment |
US10824773B2 (en) | 2017-03-28 | 2020-11-03 | Faro Technologies, Inc. | System and method of scanning an environment and generating two dimensional images of the environment |
US11686934B2 (en) * | 2017-08-31 | 2023-06-27 | Faro Technologies, Inc. | Remote control of a scanner using movement of a mobile computing device |
GB201717011D0 (en) | 2017-10-17 | 2017-11-29 | Nokia Technologies Oy | An apparatus a method and a computer program for volumetric video |
US11022434B2 (en) | 2017-11-13 | 2021-06-01 | Hexagon Metrology, Inc. | Thermal management of an optical scanning device |
US10712810B2 (en) | 2017-12-08 | 2020-07-14 | Telefonaktiebolaget Lm Ericsson (Publ) | System and method for interactive 360 video playback based on user location |
US11055532B2 (en) | 2018-05-02 | 2021-07-06 | Faro Technologies, Inc. | System and method of representing and tracking time-based information in two-dimensional building documentation |
USD875573S1 (en) | 2018-09-26 | 2020-02-18 | Hexagon Metrology, Inc. | Scanning device |
US11024050B2 (en) | 2018-11-05 | 2021-06-01 | Faro Technologies, Inc. | System and method of scanning an environment |
CN109619918A (en) | 2019-01-02 | 2019-04-16 | 京东方科技集团股份有限公司 | Showcase and its control method |
US11486701B2 (en) | 2019-02-06 | 2022-11-01 | Faro Technologies, Inc. | System and method for performing a real-time wall detection |
US11293748B2 (en) | 2019-03-07 | 2022-04-05 | Faro Technologies, Inc. | System and method for measuring three-dimensional coordinates |
US11501478B2 (en) | 2020-08-17 | 2022-11-15 | Faro Technologies, Inc. | System and method of automatic room segmentation for two-dimensional laser floorplans |
EP3982162A1 (en) * | 2020-10-09 | 2022-04-13 | Nabla Vision Sl | Lidar device |
CN112819877A (en) * | 2021-01-12 | 2021-05-18 | 深圳辰视智能科技有限公司 | Laser line point cloud generating method and device and computer readable storage medium |
US11790557B2 (en) | 2021-04-27 | 2023-10-17 | Faro Technologies, Inc. | Calibrating system for colorizing point-clouds |
WO2024072733A1 (en) * | 2022-09-26 | 2024-04-04 | Faro Technologies, Inc. | Generating graphical representations for viewing 3d data and/or image data |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030090646A1 (en) * | 2001-11-09 | 2003-05-15 | Johannes Riegl | Apparatus for taking up an object space |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6369812B1 (en) | 1997-11-26 | 2002-04-09 | Philips Medical Systems, (Cleveland), Inc. | Inter-active viewing system for generating virtual endoscopy studies of medical diagnostic data with a continuous sequence of spherical panoramic views and viewing the studies over networks |
US20060114251A1 (en) | 2004-02-11 | 2006-06-01 | Miller Jacob J | Methods for simulating movement of a computer user through a remote environment |
CA2649916A1 (en) | 2008-01-09 | 2009-07-09 | Tiltan Systems Engineering Ltd. | Apparatus and method for automatic airborne lidar data processing and mapping using data obtained thereby |
DE102009015922B4 (en) | 2009-03-25 | 2016-12-15 | Faro Technologies, Inc. | Method for optically scanning and measuring a scene |
GB0908200D0 (en) | 2009-05-13 | 2009-06-24 | Red Cloud Media Ltd | Method of simulation of a real physical environment |
DE102010020925B4 (en) | 2010-05-10 | 2014-02-27 | Faro Technologies, Inc. | Method for optically scanning and measuring an environment |
DE102010032726B3 (en) | 2010-07-26 | 2011-11-24 | Faro Technologies, Inc. | Device for optically scanning and measuring an environment |
DE102010033561B3 (en) | 2010-07-29 | 2011-12-15 | Faro Technologies, Inc. | Device for optically scanning and measuring an environment |
US9599715B2 (en) | 2010-08-03 | 2017-03-21 | Faro Technologies, Inc. | Scanner display |
WO2012037157A2 (en) | 2010-09-13 | 2012-03-22 | Alt Software (Us) Llc | System and method for displaying data having spatial coordinates |
US9026359B2 (en) | 2010-11-01 | 2015-05-05 | Nokia Corporation | Visually representing a three-dimensional environment |
JP5753409B2 (en) * | 2011-03-07 | 2015-07-22 | 株式会社トプコン | Panorama image creation method and three-dimensional laser scanner |
US9121724B2 (en) | 2011-09-30 | 2015-09-01 | Apple Inc. | 3D position tracking for panoramic imagery navigation |
DE102014116904B4 (en) | 2014-11-19 | 2016-11-24 | Faro Technologies, Inc. | Method for optically scanning and measuring a scene and automatically generating a video |
DE102013110580B4 (en) | 2013-09-24 | 2024-05-23 | Faro Technologies, Inc. | Method for optically scanning and measuring a scene and laser scanner designed to carry out the method |
2013
- 2013-09-24 DE DE102013110580.7A patent/DE102013110580B4/en active Active

2014
- 2014-09-23 US US14/493,426 patent/US9741093B2/en active Active
- 2014-09-24 WO PCT/US2014/057086 patent/WO2015048048A1/en active Application Filing
- 2014-09-24 US US14/494,639 patent/US20150085079A1/en not_active Abandoned

2017
- 2017-02-16 US US15/434,220 patent/US9747662B2/en active Active
- 2017-08-24 US US15/685,527 patent/US9965829B2/en active Active

2018
- 2018-03-26 US US15/935,490 patent/US10109033B2/en active Active
- 2018-09-28 US US16/146,002 patent/US10475155B2/en active Active

2019
- 2019-10-08 US US16/596,003 patent/US10896481B2/en active Active
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030090646A1 (en) * | 2001-11-09 | 2003-05-15 | Johannes Riegl | Apparatus for taking up an object space |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150096181A1 (en) * | 2013-10-09 | 2015-04-09 | Hexagon Technology Center Gmbh | Surveying device having a rotation mirror for optically scanning an environment |
US9341474B2 (en) * | 2013-10-09 | 2016-05-17 | Hexagon Technology Center Gmbh | Surveying device having a rotation mirror for optically scanning an environment |
US9531967B2 (en) | 2013-12-31 | 2016-12-27 | Faro Technologies, Inc. | Dynamic range of a line scanner having a photosensitive array that provides variable exposure |
US9658061B2 (en) | 2013-12-31 | 2017-05-23 | Faro Technologies, Inc. | Line scanner that uses a color image sensor to improve dynamic range |
US9909856B2 (en) | 2013-12-31 | 2018-03-06 | Faro Technologies, Inc. | Dynamic range of a line scanner having a photosensitive array that provides variable exposure |
USD756813S1 (en) * | 2014-05-21 | 2016-05-24 | Kongsberg Seatex As | Laser based positioning sensor |
WO2018031266A1 (en) * | 2016-08-12 | 2018-02-15 | Microvision, Inc. | Devices and methods for adjustable resolution depth mapping |
US9921056B2 (en) | 2016-08-12 | 2018-03-20 | Microvision, Inc. | Devices and methods for adjustable resolution depth mapping |
CN109013204A (en) * | 2018-10-26 | 2018-12-18 | 江苏科瑞恩自动化科技有限公司 | A gluing process and device based on laser trajectory tracking |
US11513343B2 (en) | 2019-09-27 | 2022-11-29 | Faro Technologies, Inc. | Environmental scanning and image reconstruction thereof |
CN116418967A (en) * | 2023-04-13 | 2023-07-11 | 青岛图海纬度科技有限公司 | Color restoration method and device for laser scanning of underwater dynamic environment |
Also Published As
Publication number | Publication date |
---|---|
US10475155B2 (en) | 2019-11-12 |
US20170352127A1 (en) | 2017-12-07 |
US10109033B2 (en) | 2018-10-23 |
US9965829B2 (en) | 2018-05-08 |
US20200051205A1 (en) | 2020-02-13 |
US10896481B2 (en) | 2021-01-19 |
DE102013110580B4 (en) | 2024-05-23 |
US20150085068A1 (en) | 2015-03-26 |
US20190035053A1 (en) | 2019-01-31 |
US20170161867A1 (en) | 2017-06-08 |
US9741093B2 (en) | 2017-08-22 |
DE102013110580A1 (en) | 2015-03-26 |
US9747662B2 (en) | 2017-08-29 |
US20180211361A1 (en) | 2018-07-26 |
WO2015048048A1 (en) | 2015-04-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20150085079A1 (en) | Dynamic range of color camera images superimposed on scanned three-dimensional gray-scale images | |
US10116920B2 (en) | Balancing colors in a scanned three-dimensional image | |
US9689972B2 (en) | Scanner display | |
US9835727B2 (en) | Method for optically scanning and measuring an environment | |
JP5816778B2 (en) | Laser scanner with additional detector | |
JP5753409B2 (en) | Panorama image creation method and three-dimensional laser scanner | |
US6989890B2 (en) | Apparatus for taking up an object space | |
EP3598174B1 (en) | Laser scanner with enhanced dynamic range imaging | |
US20100134596A1 (en) | Apparatus and method for capturing an area in 3d | |
JP7270702B2 (en) | Depth sensing system and method | |
WO2012073414A1 (en) | Image processing device | |
CN108346134B (en) | Method and measuring instrument for coloring three-dimensional point cloud | |
CN106471336A (en) | Method and measuring instrument for target detection and/or identification | |
JP2021076603A (en) | Photoelectric sensor and object detection method | |
CN110031830B (en) | Distance measurement method based on laser line scanning imaging | |
WO2015048053A1 (en) | Improved dynamic range of color camera images superimposed on scanned three dimensional gray scale images | |
Zollhöfer | Commodity RGB-D sensors: Data acquisition | |
CN113962875A (en) | Method and system for enhancing images using machine learning | |
US20210321016A1 (en) | Artificial intelligence scan colorization | |
CN115150545B (en) | Measurement system for acquiring three-dimensional measurement points | |
US20230153967A1 (en) | Removing reflection from scanned data | |
JP2024047931A (en) | Imaging device, image processing device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: FARO TECHNOLOGIES, INC., FLORIDA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GITTINGER, JURGEN;OSSIG, MARTIN;SIGNING DATES FROM 20140924 TO 20140925;REEL/FRAME:033850/0663
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |