US11568520B2 - Method and device for inpainting of colourised three-dimensional point clouds - Google Patents


Info

Publication number
US11568520B2
US11568520B2 (application US15/879,303 / US201815879303A)
Authority
US
United States
Prior art keywords
point cloud
camera
instrument
coordinate system
pixel value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US15/879,303
Other languages
English (en)
Other versions
US20180211367A1 (en
Inventor
Daniel Blaser
Richard FETZEL
Bianca Gordon
Marco SCHRÖDER
Bernd Walser
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Leica Geosystems AG
Original Assignee
Leica Geosystems AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Leica Geosystems AG filed Critical Leica Geosystems AG
Assigned to LEICA GEOSYSTEMS AG reassignment LEICA GEOSYSTEMS AG ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BLASER, DANIEL, Gordon, Bianca, Schröder, Marco, WALSER, BERND, FETZEL, Richard
Publication of US20180211367A1 publication Critical patent/US20180211367A1/en
Application granted granted Critical
Publication of US11568520B2 publication Critical patent/US11568520B2/en
Active legal-status Critical Current
Adjusted expiration legal-status Critical

Classifications

    • G06T5/005
    • G06T5/77 Retouching; Inpainting; Scratch removal
    • G01B11/002 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G01C1/04 Theodolites combined with cameras
    • G01C15/002 Active optical surveying means
    • G01S17/42 Simultaneous measurement of distance and other co-ordinates
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/10024 Color image
    • G06T2207/10028 Range image; Depth image; 3D point clouds
    • G06T2207/20068 Projection on vertical or horizontal image axis
    • G06T2207/30181 Earth observation

Definitions

  • the present invention relates generally to a surveying instrument and more particularly to systems and methods of inpainting colourised three-dimensional (3D) point clouds using a surveying instrument.
  • Generating three-dimensional point clouds is used to survey many different settings, such as construction sites, building facades, industrial facilities, interiors of houses, or any other applicable setting.
  • the surveys achieved therewith may be used to obtain accurate three-dimensional (3D) models of a setting, wherein the models consist of point clouds.
  • the points of such a cloud are stored by coordinates in a coordinate system, which may be defined by a surveying instrument which recorded the point cloud.
  • the surveying instrument constitutes the origin of the coordinate system by an instrument center, in particular by the so-called nodal point of the surveying instrument.
  • the points are usually surveyed by associating a distance measured with a laser beam (with the help of a time-of-flight method) with the alignment under which the distance was measured.
  • the coordinate system is a spherical coordinate system, such that a point is characterised by a distance value, an elevation angle and an azimuth angle with reference to the origin of the coordinate system.
  • Common surveying instruments comprise a unit for sending out a scanning beam and for receiving the reflected beam in order to measure the distance of a point the beam was directed at.
  • these surveying instruments furthermore comprise means to rotatably alter the direction of the beams, commonly a vertical rotation axis and a horizontal rotation axis, wherein both axes are sensed with angle sensors.
  • the rotation of the vertical axis is measured by an azimuth angle and the rotation of the horizontal axis is measured by an elevation angle.
  • one of said axes may be a slow axis and the other one a fast axis.
  • the distances may be calculated with the travel time measurement (time-of-flight) method by observing the time between sending out and receiving a signal.
  • the alignment angles are achieved with said angle sensors arranged at the vertical axis and at the horizontal axis.
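The spherical representation described above (distance value, elevation angle, azimuth angle) can be sketched as follows. This is an illustrative conversion only, not the instrument's firmware; the function name and the axis convention (elevation measured from the horizontal plane) are assumptions:

```python
import math

def spherical_to_cartesian(distance, elevation, azimuth):
    """Convert a surveyed point (distance, elevation angle, azimuth angle)
    into Cartesian coordinates with the instrument center as origin.
    Angles are in radians; elevation is measured from the horizontal plane."""
    x = distance * math.cos(elevation) * math.cos(azimuth)
    y = distance * math.cos(elevation) * math.sin(azimuth)
    z = distance * math.sin(elevation)
    return (x, y, z)
```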
  • the point cloud may be digitally colourised.
  • terrestrial surveying is hence supported by imaging data of at least one camera which is combined with a surveying instrument by including the camera in the instrument or mounting it on the same platform as the instrument.
  • 3D points and corresponding image pixels which are affected by parallax are detected using projective geometry.
  • the 3D points and corresponding image pixels may be corrected by filling them based on adjacent colours. These colours may be selected based on properties of the reflected laser light (e.g. intensity, distance, and Signal-to-Noise-Ratio) from the corresponding 3D points.
  • the detection of parallax points and the inpainting algorithm used allow having camera sensors outside the nodal point without falsely colourised scan points, and using cameras outside the nodal point eases the placing of the cameras and the optical design.
  • Some embodiments of the invention relate to a method for colourising a three-dimensional point cloud.
  • the method includes, with a surveying instrument, surveying a point cloud of a setting, wherein each point of said point cloud is characterised by coordinates within an instrument coordinate system, which has an instrument center as origin.
  • the method further includes, with a first camera comprised by the surveying instrument, capturing a first image of the setting, wherein each pixel value of the first image is assigned to coordinates within a first camera coordinate system, which has a first projection center as origin, wherein the first projection center has a first parallax shift relative to the instrument center.
  • transforming the point cloud from the instrument coordinate system into the first camera coordinate system, resulting in a first transformed point cloud.
  • detecting one or more uncovered points which are openly visible from the perspective of the first projection center.
  • uncovered points are those which have a direct line of sight with the first projection center.
  • to each uncovered point, assigning a pixel value which has corresponding coordinates in the first camera coordinate system.
  • the point cloud can be considered as colourised.
  • the colourised point cloud can then be re-transformed from the first camera coordinate system into the instrument coordinate system.
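The sequence of steps above (transform into the camera frame, colour the visible points, transform back) can be outlined in code. This is an illustrative sketch, not the patented implementation; all callables (`to_camera`, `to_instrument`, `visible`, `image_lookup`) are hypothetical placeholders for the operations described in the text:

```python
def colourise(points, image_lookup, to_camera, to_instrument, visible):
    """Illustrative pipeline: transform each point of the cloud into the
    camera coordinate system, assign a colour only to points visible from
    the camera's projection center, and transform back into the instrument
    coordinate system. Covered points keep colour None at this stage."""
    coloured = []
    for p in points:
        q = to_camera(p)                                  # instrument -> camera frame
        colour = image_lookup(q) if visible(q) else None  # uncovered points only
        coloured.append((to_instrument(q), colour))       # re-transform to instrument frame
    return coloured
```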
  • the computer may be incorporated into the surveying instrument, or be embodied as a cloud computer, a smart phone or a tablet PC.
  • the instrument center, which is the point of origin of the point cloud within the instrument coordinate system, may be referred to as the “nodal point” of the surveying instrument.
  • the nodal point may be defined by a crossing point of the azimuth axis and the elevation axis of the surveying instrument.
  • the instrument center (or “nodal point”) may be arbitrarily positioned, such that it is a virtual point not tied to a physical object.
  • the instrument center may be inside or on a beam directing unit of the surveying instrument, or it may be positioned “in the air” around or within the surveying instrument structure.
  • the method further comprises: With the computer, within the first transformed point cloud, detecting one or more covered points, which are non-visible from the perspective of the first projection center due to the first parallax shift, and to each covered point of the first transformed point cloud, assigning a substitute pixel value, which is determined based on a pixel value assigned to a point adjacent to the covered point of the first transformed point cloud.
  • Said point adjacent to the covered point may be an uncovered point, or it may itself be a covered point to which a substitute pixel value has already been assigned.
  • Detecting covered points takes into account what the visibility was like before the perspective shift (the transformation of the point cloud). For example, if a point cluster (recognised as a plane surface) would have to be “pierced” through to reach a specific point, then that specific point may be detected as a covered point.
  • the method may comprise the steps:
  • each pixel value of the second image is assigned to coordinates within a second camera coordinate system, which has a second projection center as origin, wherein the second projection center has a second parallax shift relative to the instrument center.
  • within the first transformed point cloud, detecting one or more covered points, which are non-visible from the perspective of the first projection center due to the first parallax shift.
  • At least one of the first camera and the second camera may be one of a wide angle camera, a panoramic camera, and a spherical camera.
  • At least one of the coordinates of the pixel values, the coordinates of the points of the point cloud, the coordinates of the points of the first transformed point cloud, and the coordinates of the points of the second transformed point cloud may comprise at least an elevation angle and an azimuth angle.
  • At least one of the coordinates of the points of the point cloud, the coordinates of the points of the first transformed point cloud, and the coordinates of the points of the second transformed point cloud may comprise an elevation angle, an azimuth angle and a distance value.
  • Each pixel value of the first image may be assigned to coordinates within the first camera coordinate system based at least on a focal length of the first camera. Accordingly, each pixel value of the second image may be assigned to coordinates within the second camera coordinate system based at least on a focal length of the second camera.
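The focal-length-based assignment of angles to pixels can be illustrated with a simple pinhole model. This sketch ignores lens distortion, and the function name, principal point parameters and sign conventions are assumptions, not the patent's calibration model:

```python
import math

def pixel_to_angles(u, v, cx, cy, focal_length_px):
    """Map a pixel (u, v) to (azimuth, elevation) in the camera coordinate
    system using a pinhole model: (cx, cy) is the principal point and
    focal_length_px is the focal length expressed in pixel units."""
    azimuth = math.atan2(u - cx, focal_length_px)
    elevation = math.atan2(cy - v, focal_length_px)  # image v axis grows downward
    return azimuth, elevation
```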
  • At least one of the uncovered points and the covered points may be detected based on a detection algorithm using one of 3D point projection, plane detection, feature detection and object detection.
  • the invention also relates to a surveying instrument for generating point clouds within an instrument coordinate system having an instrument center as origin.
  • the surveying instrument may comprise a base, a body mounted on the base such that the body is rotatable relative to the base about an azimuth axis, a beam directing unit mounted in the body such that the beam directing unit is rotatable relative to the body about an elevation axis, wherein the beam directing unit may be configured to direct a transmission beam towards a setting, and to receive a reception beam from the setting.
  • the reception beam may be considered the transmission beam reflected from the scene.
  • the surveying instrument may further comprise a first camera having a first projection center, which has a first parallax shift relative to the instrument center, and a computer for controlling the body, the beam directing unit and the first camera.
  • the surveying instrument is configured to perform a method as it is described herein.
  • the first camera may have a first focal length, based on which coordinates, in particular an elevation angle and an azimuth angle, may be assigned to each pixel value of the first camera.
  • the surveying instrument may comprise a second camera having a second projection center, which has a second parallax shift relative to the instrument center, wherein the computer may further be configured to control the second camera.
  • the surveying instrument according to the invention can be any surveying instrument configured to generate a three-dimensional point cloud, such as a total station, a theodolite or a laser scanner.
  • the beam directing unit comprises an emitting unit for providing the transmission beam, and a detection unit for detecting the reception beam.
  • the surveying instrument is embodied as a laser scanner
  • the body comprises an emitting unit for providing the transmission beam, and a detection unit for detecting the reception beam.
  • the beam directing unit is embodied as a deflector, in particular a mirror, which is configured to deflect the transmission beam from the emitting unit towards the setting, and to deflect the reception beam from the scene to the detection unit.
  • FIG. 1 shows one embodiment of a surveying instrument according to the invention, embodied as a laser scanner;
  • FIG. 2 shows a schematic drawing of the parallax problem solved by the invention;
  • FIGS. 3, 4 and 5 illustrate the different perspectives of camera and surveying instrument caused by the parallax shift;
  • FIG. 6 shows the steps of a method according to the invention;
  • FIGS. 7 and 8 show further exemplary embodiments of a surveying instrument according to the invention having multiple cameras;
  • FIG. 9 shows an embodiment of the surveying instrument according to the invention.
  • FIG. 1 shows an exemplary surveying instrument 1 embodied as laser scanner configured to perform a method according to the invention.
  • the surveying instrument comprises a body 2 and a base 3 , optionally mounted on a tripod 9 .
  • a controlled, motorised relative rotation between body 2 and base 3 is provided around axis V.
  • the body 2 comprises an emitting unit 4 , a receiving unit 5 , and a beam directing unit 6 , wherein emitting unit 4 and receiving unit 5 are combined as one part in this example; they may however be embodied by corresponding separate components, e.g. in combination with a beam splitter.
  • the beam directing unit 6 is mounted in the body such that it is rotatable around an elevation axis H by a motor (not shown).
  • At least one camera 7 is comprised by the housing 2 .
  • the one or more cameras may in particular be incorporated in the housing.
  • a computer 8 is comprised by the housing 2 .
  • the computer may, however, also be external to the laser scanner 1 , e.g. embodied by a cloud computer having permanent wireless connection to the laser scanner.
  • the computer 8 is configured to control the mentioned components of the surveying instrument and to perform the steps of the method according to the invention.
  • FIG. 2 shows the general problem of misassignment of a pixel to a point of a 3D point cloud.
  • when the surveying instrument 1 targets an exemplary point 12 of the setting 10 and the camera 7 tries to capture the same point, the camera fails to do so due to an obstacle 11 . Without the method according to the invention, a pixel of the obstacle 11 would be mistakenly assigned to the point 12 .
  • This misassignment problem is caused by the parallax shift between the instrument center 16 of the surveying instrument, which may also be referred to as “nodal point” and defined as the origin of an instrument coordinate system (within which the point cloud is recorded), on the one hand and the projection center 17 of the camera, which is the vertex of the camera's angle of view (also referred to as entrance pupil or “no-parallax point”) on the other hand.
  • the parallax shift may be defined by line segments a, b and c (c is not shown because it is deemed perpendicular to a and b). So in the shown example, the parallax shift is two-dimensional (a,b), but of course it may be three-dimensional (a,b,c).
  • the parallax shift may also be defined by a line segment connecting the nodal point 16 and the projection center 17 , wherein this connecting line segment may be expressed by a distance value, an azimuth angle and an elevation angle with reference to a coordinate system of the scanner that has the nodal point 16 as its origin.
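The two equivalent descriptions of the parallax shift given above — Cartesian offsets (a, b, c) versus a distance value with azimuth and elevation angles — can be related as follows. The axis conventions in this sketch are assumptions for illustration:

```python
import math

def parallax_as_polar(a, b, c):
    """Express the parallax shift between nodal point and projection center,
    given as Cartesian offsets (a, b, c), as a distance value, an azimuth
    angle and an elevation angle (the second description in the text)."""
    distance = math.sqrt(a * a + b * b + c * c)
    azimuth = math.atan2(b, a)
    elevation = math.atan2(c, math.hypot(a, b))
    return distance, azimuth, elevation
```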
  • FIG. 4 shows how the setting 10 is captured by the surveying instrument from its instrument center 16 .
  • the shaded surface seen in the right part of FIG. 4 is not captured by the point cloud and therefore unknown to the surveying instrument.
  • FIG. 5 shows how the same setting 10 is captured by the camera from its projection center 17 .
  • the shaded surface seen in the right part of FIG. 5 is not captured by the image and therefore unknown to the camera.
  • Step 20 is surveying a setting with a surveying instrument 1 in order to obtain a three-dimensional point cloud of the setting within a coordinate system that has the instrument center 16 of the surveying instrument 1 as its origin.
  • said instrument coordinate system may be a spherical coordinate system in which points are stored by two angles (elevation angle and azimuth angle) and a distance value.
  • the angles are detected by angle encoders comprised by the surveying instrument 1 .
  • the elevation angle expresses the rotational position of the beam directing unit 6 about the horizontal axis H.
  • the azimuth angle expresses the rotational position of the body 2 about the vertical axis V.
  • the distance value is measured with a time-of-flight method, i.e. by sending out a transmission beam T and receiving a reception beam R, which is the transmission beam T reflected from the setting.
  • the time between transmission and reception is measured, and a distance value is calculated from it with the help of the known speed of light.
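The time-of-flight calculation amounts to halving the round-trip distance travelled at the speed of light. A minimal sketch (function name assumed):

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_seconds):
    """Distance from the round-trip travel time of the transmission beam:
    the beam covers the instrument-to-target path twice, hence the
    division by two."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0
```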
  • Step 21 is capturing an image of the setting with a camera, in particular a same part of the setting of which the point cloud had been obtained.
  • the position of the surveying instrument remains unchanged between surveying the point cloud and capturing the image.
  • alignment angles may be assigned to each pixel of the image. These alignment angles are also coordinates of a camera coordinate system that has the projection center 17 as its origin.
  • Each pixel of the image may hence be assigned an elevation angle and an azimuth angle within said camera coordinate system. Since distances cannot be derived from the image itself, the coordinates of the pixels comprise only said angles.
  • Step 22 is transforming the point cloud from the instrument coordinate system to the camera coordinate system. This may be performed by the computer 8 , with knowledge of the parallax shift between the projection center 17 and the instrument center 16 . The actual (absolute) shape of the point cloud is not altered. The transformation changes the perspective from which the point cloud is “looked at”: the origin is no longer the nodal point 16 , but rather the projection center 17 .
  • Step 23 is assigning to each point of the point cloud which is openly, i.e. directly, visible from the new perspective (projection center 17 ) a pixel which has corresponding coordinates.
  • the openly visible uncovered points may be detected by a detection algorithm which may use plane detection, feature detection or object detection. Since a point cloud does not have “filled” walls or planes, but rather is porous, points may, so to speak, be “visible” while practically being located behind an obstacle. Therefore, said detection algorithm may take account of the features of the point cloud in order to detect what will cover a point and what will not when the change of perspective is performed. The detection algorithm may also take into account an image taken with the camera.
  • Corresponding coordinates means that a pixel and a point have the same or essentially the same elevation angle and the same azimuth angle with respect to the projection center 17 .
  • In step 23 , those points which are non-visible from the projection center 17 are not assigned a pixel value, such that they remain uncoloured at first. Only the points visible from the projection center 17 are colourised; those points of the point cloud which are covered from the camera's point of view are disregarded.
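One simple realisation of such a visibility test — not necessarily the patent's detection algorithm — is a z-buffer over an angular grid: bin the transformed points by azimuth and elevation as seen from the projection center and keep only the nearest point per cell. Function name and cell size are assumptions:

```python
import math

def uncovered_points(points, cell_deg=0.5):
    """Z-buffer style visibility test: bin points by (azimuth, elevation)
    as seen from the origin (the projection center) and keep, per angular
    cell, only the nearest point; all farther points in the same cell are
    treated as covered. `points` are (x, y, z) tuples in the camera
    coordinate system."""
    buckets = {}
    for p in points:
        x, y, z = p
        dist = math.sqrt(x * x + y * y + z * z)
        az = math.degrees(math.atan2(y, x))
        el = math.degrees(math.atan2(z, math.hypot(x, y)))
        cell = (round(az / cell_deg), round(el / cell_deg))
        if cell not in buckets or dist < buckets[cell][0]:
            buckets[cell] = (dist, p)
    return {p for _, p in buckets.values()}
```

A coarser `cell_deg` makes the test more conservative (more points classified as covered), which trades colour coverage against the risk of “piercing” a surface as described above.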
  • those non-visible (covered) points may be coloured nevertheless:
  • the computer 8 may determine a substitute pixel value to be assigned to the non-visible point, based on one of the pixel values assigned to points adjacent to the non-visible point. In particular, this determination may use at least one of: plane detection, feature detection, material detection, object surface detection (e.g. based on signal-to-noise analysis of the reception beam R and/or the transmission beam T), and image processing.
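A minimal version of this inpainting step is a nearest-neighbour fill: each covered point inherits the colour of its nearest already-coloured neighbour, and a filled point may itself serve as a source for later points, as stated above. This sketch omits the plane/feature/material detection the text also mentions; the function name is an assumption:

```python
def inpaint_covered(coloured, covered):
    """Assign to each covered point the pixel value of its nearest already
    coloured neighbour. `coloured` maps point -> pixel value and must be
    non-empty; `covered` is an iterable of points without a colour yet.
    Covered points are processed closest-first, so a filled covered point
    can pass its substitute value on to points farther inside the gap."""
    def dist2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    for p in sorted(covered, key=lambda p: min(dist2(p, q) for q in coloured)):
        nearest = min(coloured, key=lambda q: dist2(p, q))
        coloured[p] = coloured[nearest]
    return coloured
```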
  • FIGS. 7 and 8 show multiple cameras 7 comprised by the body 2 of the laser scanner 1 which are heading in different directions thereby providing a wide angle view (up to a 360 degree view) when the images are merged.
  • Each camera 7 constitutes its own camera coordinate system with their according projection centers as origins.
  • the steps 20 , 21 , 22 , 23 and further optional steps explained in this application may be performed for each of the multiple cameras.
  • the image data acquisition is an additional step on top of surveying a 3D point cloud. It is a time-consuming task which should be as fast as possible, while at the same time high-resolution imaging data and good image quality are desirable.
  • a surveying instrument according to the invention provides a high-resolution, high-quality image and is nevertheless very fast in taking all needed images. This can be accomplished by using more than one camera.
  • the cameras are placed in a way to cover the needed vertical field of view (e.g. 160°) and a horizontal field of view as large as possible.
  • the scanner can reduce the number of pointing directions that have to be taken by a factor equal to the number of cameras used.
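The reduction in pointing directions can be made concrete with a small coverage calculation. This is an illustrative sketch under assumed names and a simplified coverage model (uniform horizontal fields of view, optional overlap between adjacent images):

```python
import math

def pointing_directions(horizontal_fov_deg, num_cameras, overlap_deg=0.0):
    """Number of body orientations needed to cover 360 degrees horizontally
    when each of `num_cameras` cameras contributes `horizontal_fov_deg` of
    view per orientation; using N cameras divides the count by N, as noted
    in the text. `overlap_deg` reserves margin between adjacent images."""
    effective = (horizontal_fov_deg - overlap_deg) * num_cameras
    return math.ceil(360.0 / effective)
```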
  • FIG. 9 shows a surveying instrument according to the invention, embodied as a theodolite, having a base 3 on which a body 2 is mounted such that it is rotatable about an azimuth axis V.
  • a beam directing unit 6 is mounted in the body 2 such that it is rotatable about an elevation axis H.
  • a camera 7 may be attached to or integrated in the theodolite 1 .
  • the beam directing unit 6 may comprise an emitting unit 4 for providing a transmission beam T and a detection unit 5 for detecting a reception beam R.
  • a computer 8 for controlling the components of the theodolite 1 and for performing a method according to the present invention is comprised by the theodolite 1 , e.g. within the body 2 (as shown).

US15/879,303 2017-01-24 2018-01-24 Method and device for inpainting of colourised three-dimensional point clouds Active 2039-03-13 US11568520B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
EP17152842.5A EP3351899B1 (de) 2017-01-24 2017-01-24 Method and device for “colorising” three-dimensional point clouds
EP17152842 2017-01-24
EP17152842.5 2017-01-24

Publications (2)

Publication Number Publication Date
US20180211367A1 US20180211367A1 (en) 2018-07-26
US11568520B2 true US11568520B2 (en) 2023-01-31

Family

ID=57890698


Country Status (3)

Country Link
US (1) US11568520B2 (de)
EP (1) EP3351899B1 (de)
CN (1) CN108346134B (de)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3351899B1 (de) * 2017-01-24 2020-06-17 Leica Geosystems AG Method and device for “colorising” three-dimensional point clouds
GB2576548B * 2018-08-23 2021-11-03 Sony Interactive Entertainment Inc Method and system for reconstructing colour and depth information of a scene
CN109615594B (zh) * 2018-11-30 2020-10-23 四川省安全科学技术研究院 Laser point cloud hole repair and colourisation method
CN111369609B (zh) * 2020-03-04 2023-06-30 山东交通学院 Building local deformation analysis method based on point-cloud surface feature constraints
CN111784834A (zh) * 2020-06-24 2020-10-16 北京百度网讯科技有限公司 Point cloud map generation method, device and electronic device
CN112419400A (zh) * 2020-09-28 2021-02-26 广东博智林机器人有限公司 Robot position detection method, detection device, processor and electronic device
CN113362445B (zh) * 2021-05-25 2023-05-05 上海奥视达智能科技有限公司 Method and device for reconstructing an object based on point cloud data

Citations (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040150641A1 (en) * 2002-11-15 2004-08-05 Esc Entertainment Reality-based light environment for digital imaging in motion pictures
WO2008107720A1 (en) 2007-03-05 2008-09-12 Geospatial Research Limited Colour enhanced laser imagery
US20100157280A1 (en) * 2008-12-19 2010-06-24 Ambercore Software Inc. Method and system for aligning a line scan camera with a lidar scanner for real time data fusion in three dimensions
US20100239180A1 (en) * 2009-03-17 2010-09-23 Sehoon Yea Depth Reconstruction Filter for Depth Coding Videos
US20100239187A1 (en) * 2009-03-17 2010-09-23 Sehoon Yea Method for Up-Sampling Depth Images
US7860299B2 (en) * 2005-02-10 2010-12-28 Trimble Ab Methods and apparatus for image processing
US20110115812A1 (en) * 2009-11-13 2011-05-19 Harris Corporation Method for colorization of point cloud data based on radiometric imagery
US20120070077A1 (en) * 2009-03-25 2012-03-22 Faro Technologies, Inc. Method for optically scanning and measuring an environment
DE202013001538U1 (de) 2013-02-19 2013-03-19 Ulrich Clauss Anordnung zur Aufnahme geometrischer und photometrischer Objektdaten im Raum
US8411931B2 (en) * 2006-06-23 2013-04-02 Imax Corporation Methods and systems for converting 2D motion pictures for stereoscopic 3D exhibition
US20130293568A1 (en) * 2011-01-18 2013-11-07 Nikon Corporation Image processing apparatus and image processing program product
US20140063489A1 (en) * 2012-09-06 2014-03-06 Faro Technologies, Inc. Laser Scanner
US20140125767A1 (en) * 2012-02-24 2014-05-08 Matterport, Inc. Capturing and aligning three-dimensional scenes
US9070216B2 (en) * 2011-12-14 2015-06-30 The Board Of Trustees Of The University Of Illinois Four-dimensional augmented reality models for interactive visualization and automated construction progress monitoring
US20150193963A1 (en) * 2014-01-08 2015-07-09 Here Global B.V. Systems and Methods for Creating an Aerial Image
US20150341552A1 (en) * 2014-05-21 2015-11-26 Here Global B.V. Developing a Panoramic Image
US20160061954A1 (en) * 2014-08-27 2016-03-03 Leica Geosystems Ag Multi-camera laser scanner
US20160189348A1 (en) * 2010-02-05 2016-06-30 Trimble Navigation Limited Systems and methods for processing mapping and modeling data
US20160187486A1 (en) * 2014-12-30 2016-06-30 Nokia Technologies Oy Range Sensing Using a Hybrid Range Sensing Device
US20160266256A1 (en) * 2015-03-11 2016-09-15 The Boeing Company Real Time Multi Dimensional Image Fusing
EP3086283A1 (de) 2015-04-21 2016-10-26 Hexagon Technology Center GmbH Bereitstellung einer point-cloud unter verwendung eines vermessungsinstruments und einer kameravorrichtung
CN106780586A (zh) * 2016-11-14 2017-05-31 厦门大学 A solar-potential assessment method based on terrestrial laser point clouds
US20170188002A1 (en) * 2015-11-09 2017-06-29 The University Of Hong Kong Auxiliary data for artifacts - aware view synthesis
US20170307370A1 (en) * 2014-10-24 2017-10-26 Nikon-Trimble Co., Ltd Surveying instrument and program
US9870624B1 (en) * 2017-01-13 2018-01-16 Otsaw Digital Pte. Ltd. Three-dimensional mapping of an environment
US20180033160A1 (en) * 2016-03-30 2018-02-01 Panasonic Intellectual Property Management Co., Ltd. Position estimation apparatus and position estimation method
US20180033146A1 (en) * 2016-07-27 2018-02-01 Michael Bleyer Reflectivity map estimate from dot based structured light systems
US20180074203A1 (en) * 2016-09-12 2018-03-15 Delphi Technologies, Inc. Lidar Object Detection System for Automated Vehicles
US20180108176A1 (en) * 2016-10-19 2018-04-19 Fuji Xerox Co., Ltd. Data processing apparatus, three-dimensional object molding system, and non-transitory computer readable medium
US20180139431A1 (en) * 2012-02-24 2018-05-17 Matterport, Inc. Capturing and aligning panoramic image and depth data
EP3351899A1 (de) * 2017-01-24 2018-07-25 Leica Geosystems AG Method and device for colourising three-dimensional point clouds
US20180262737A1 (en) * 2017-03-07 2018-09-13 Trimble Ab Scan colorization with an uncalibrated camera
EP3425333A1 (de) * 2017-07-04 2019-01-09 Hexagon Technology Center GmbH Surveying instrument for scanning an object and image acquisition of the object
WO2019210360A1 (en) * 2018-05-01 2019-11-07 Commonwealth Scientific And Industrial Research Organisation Method and system for use in colourisation of a point cloud

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2602588A1 (de) * 2011-12-06 2013-06-12 Hexagon Technology Center GmbH Position and orientation determination in 6-DOF
CN103017739B (zh) * 2012-11-20 2015-04-29 Wuhan University Method for producing true orthophotos based on LiDAR point clouds and aerial images
CN104778691B (zh) * 2015-04-07 2017-05-17 North University of China A processing method for three-dimensional point cloud data
CN106204731A (zh) * 2016-07-18 2016-12-07 South China University of Technology A multi-view three-dimensional reconstruction method based on a binocular stereo vision system

Patent Citations (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040150641A1 (en) * 2002-11-15 2004-08-05 Esc Entertainment Reality-based light environment for digital imaging in motion pictures
US7860299B2 (en) * 2005-02-10 2010-12-28 Trimble Ab Methods and apparatus for image processing
US8411931B2 (en) * 2006-06-23 2013-04-02 Imax Corporation Methods and systems for converting 2D motion pictures for stereoscopic 3D exhibition
WO2008107720A1 (en) 2007-03-05 2008-09-12 Geospatial Research Limited Colour enhanced laser imagery
US20100157280A1 (en) * 2008-12-19 2010-06-24 Ambercore Software Inc. Method and system for aligning a line scan camera with a lidar scanner for real time data fusion in three dimensions
US20100239180A1 (en) * 2009-03-17 2010-09-23 Sehoon Yea Depth Reconstruction Filter for Depth Coding Videos
US20100239187A1 (en) * 2009-03-17 2010-09-23 Sehoon Yea Method for Up-Sampling Depth Images
US20120070077A1 (en) * 2009-03-25 2012-03-22 Faro Technologies, Inc. Method for optically scanning and measuring an environment
US20110115812A1 (en) * 2009-11-13 2011-05-19 Harris Corporation Method for colorization of point cloud data based on radiometric imagery
US20160189348A1 (en) * 2010-02-05 2016-06-30 Trimble Navigation Limited Systems and methods for processing mapping and modeling data
US20130293568A1 (en) * 2011-01-18 2013-11-07 Nikon Corporation Image processing apparatus and image processing program product
US9070216B2 (en) * 2011-12-14 2015-06-30 The Board Of Trustees Of The University Of Illinois Four-dimensional augmented reality models for interactive visualization and automated construction progress monitoring
US20140125767A1 (en) * 2012-02-24 2014-05-08 Matterport, Inc. Capturing and aligning three-dimensional scenes
US20180139431A1 (en) * 2012-02-24 2018-05-17 Matterport, Inc. Capturing and aligning panoramic image and depth data
US20140063489A1 (en) * 2012-09-06 2014-03-06 Faro Technologies, Inc. Laser Scanner
DE202013001538U1 (de) 2013-02-19 2013-03-19 Ulrich Clauss Arrangement for recording geometric and photometric object data in space
US20150193963A1 (en) * 2014-01-08 2015-07-09 Here Global B.V. Systems and Methods for Creating an Aerial Image
US20150341552A1 (en) * 2014-05-21 2015-11-26 Here Global B.V. Developing a Panoramic Image
US20160061954A1 (en) * 2014-08-27 2016-03-03 Leica Geosystems Ag Multi-camera laser scanner
US20170307370A1 (en) * 2014-10-24 2017-10-26 Nikon-Trimble Co., Ltd Surveying instrument and program
US20160187486A1 (en) * 2014-12-30 2016-06-30 Nokia Technologies Oy Range Sensing Using a Hybrid Range Sensing Device
US20160266256A1 (en) * 2015-03-11 2016-09-15 The Boeing Company Real Time Multi Dimensional Image Fusing
EP3086283A1 (de) 2015-04-21 2016-10-26 Hexagon Technology Center GmbH Providing a point cloud using a surveying instrument and a camera device
US20160314593A1 (en) * 2015-04-21 2016-10-27 Hexagon Technology Center Gmbh Providing a point cloud using a surveying instrument and a camera device
US20170188002A1 (en) * 2015-11-09 2017-06-29 The University Of Hong Kong Auxiliary data for artifacts - aware view synthesis
US20180033160A1 (en) * 2016-03-30 2018-02-01 Panasonic Intellectual Property Management Co., Ltd. Position estimation apparatus and position estimation method
US20180033146A1 (en) * 2016-07-27 2018-02-01 Michael Bleyer Reflectivity map estimate from dot based structured light systems
US20180074203A1 (en) * 2016-09-12 2018-03-15 Delphi Technologies, Inc. Lidar Object Detection System for Automated Vehicles
US20180108176A1 (en) * 2016-10-19 2018-04-19 Fuji Xerox Co., Ltd. Data processing apparatus, three-dimensional object molding system, and non-transitory computer readable medium
CN106780586A (zh) * 2016-11-14 2017-05-31 Xiamen University A solar potential assessment method based on terrestrial laser point clouds
US9870624B1 (en) * 2017-01-13 2018-01-16 Otsaw Digital Pte. Ltd. Three-dimensional mapping of an environment
EP3351899A1 (de) * 2017-01-24 2018-07-25 Leica Geosystems AG Method and device for colourising three-dimensional point clouds
US20180211367A1 (en) * 2017-01-24 2018-07-26 Leica Geosystems Ag Method and device for inpainting of colourised three-dimensional point clouds
US20180262737A1 (en) * 2017-03-07 2018-09-13 Trimble Ab Scan colorization with an uncalibrated camera
EP3425333A1 (de) * 2017-07-04 2019-01-09 Hexagon Technology Center GmbH Surveying instrument for scanning an object and image acquisition of the object
WO2019210360A1 (en) * 2018-05-01 2019-11-07 Commonwealth Scientific And Industrial Research Organisation Method and system for use in colourisation of a point cloud

Non-Patent Citations (14)

* Cited by examiner, † Cited by third party
Title
"Trimble Expands 3D Laser Scanning Portfolio with Addition of New TX6 and Improved TX8", 2016, https://www.trimble.com/news/release.aspx?id=101116a (Year: 2016). *
CN Search Report in Application No. 201810062536.4 dated Feb. 3, 2021.
EP Search Report dated Jul. 12, 2017 as received in Application No. 17152842.5.
Geospatial Media, The Trimble TX6 is a cost effective 3D scanning solution, Nov. 4, 2016, Geospatial world, Intergeo 2016, https://www.geospatialworld.net/videos/trimble-tx6-cost-effective-3d-scanning-solution/ (Year: 2016). *
Gregg Jackson and Gregory Lepere, Inside Trimble TX6 and TX8 Color Acquisition, Mar. 2, 2017, https://upgsolutions.com/docs/WP_Inside%20Trimble%20TX6.pdf, p. 1-9 (Year: 2017). *
Gregg Jackson and Gregory Lepere, Inside Trimble TX6 and TX8—Deep Dive into Lightning Technology, Dec. 8, 2016, https://upgsolutions.com/docs/WP_lnside%20Trimble%20TX8.pdf, p. 1-16 (Year: 2016). *
Ka et al., "Automatic generation of three dimensional colored point clouds based on multi-view image matching", Optics and Precision Engineering, vol. 21, No. 7, Jul. 2013.
Release Notes Trimble Realworks Software Version 10.2, Oct. 2016, https://trl.trimble.com/docushare/dsweb/Get/Document-827447/Trimble_RealWorks_10.2_RELEASE-NOTES_ENG_20161011.pdf (Year: 2016). *
Sun et al., "An automatic 3D point cloud registration method based on regional curvature maps", Image and Vision Computing, Oct. 27, 2016.
Trimble Geospatial, Trimble TX8 Laser Scanner—Customer Testimonial—ECO3D, Jan. 24, 2017, https://www.youtube.com/watch?v=nKekVXxpBzA, (Year: 2017). *
User Guide Trimble Realworks 10.2, Trimble, Oct. 11, 2016, p. 1-1678 (Year: 2016). *
User Guide Trimble TX6/TX8 3D Laser Scanner, Oct. 2016, p. 1-116 (Year: 2016). *
Wu et al., "Automatic cloud detection for high resolution satellite stereo images and its application in terrain extraction", ISPRS Journal of Photogrammetry and Remote Sensing, Oct. 12, 2016.
Yan et al., "Image dense matching technique assisted extraction of point cloud by 3D laser scanner", China Academic Journal Electronic Publishing House, Apr. 16, 2014.

Also Published As

Publication number Publication date
EP3351899B1 (de) 2020-06-17
CN108346134B (zh) 2022-04-05
EP3351899A1 (de) 2018-07-25
CN108346134A (zh) 2018-07-31
US20180211367A1 (en) 2018-07-26

Similar Documents

Publication Publication Date Title
US11568520B2 (en) Method and device for inpainting of colourised three-dimensional point clouds
US10401143B2 (en) Method for optically measuring three-dimensional coordinates and controlling a three-dimensional measuring device
US10830588B2 (en) Surveying instrument for scanning an object and image acquisition of the object
US10535148B2 (en) Scanner VIS
EP3665506B1 (de) Vorrichtung und verfahren zur erzeugung einer darstellung einer szene
US9693040B2 (en) Method for optically measuring three-dimensional coordinates and calibration of a three-dimensional measuring device
JP7300948B2 (ja) 測量データ処理装置、測量データ処理方法、測量データ処理用プログラム
US11015932B2 (en) Surveying instrument for scanning an object and image acquisition of the object
US20140063489A1 (en) Laser Scanner
CN108007344B (zh) 用于可视地表示扫描数据的方法、存储介质和测量系统
WO2016040229A1 (en) Method for optically measuring three-dimensional coordinates and calibration of a three-dimensional measuring device
WO2016040271A1 (en) Method for optically measuring three-dimensional coordinates and controlling a three-dimensional measuring device
WO2016089430A1 (en) Using two-dimensional camera images to speed registration of three-dimensional scans
US10254402B2 (en) Stereo range with lidar correction
JPWO2017199785A1 (ja) 監視システムの設定方法及び監視システム
Schneider et al. Integrated processing of terrestrial laser scanner data and fisheye-camera image data
US11805235B2 (en) Method and device for three-dimensional light detection and ranging (LiDAR), and three-dimensional measuring device thereof
US20240176025A1 (en) Generating a parallax free two and a half (2.5) dimensional point cloud using a high resolution image
GB2543658A (en) Registration calculation between three-dimensional (3D) scans based on two-dimensional (2D) scan data from a 3D scanner

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

AS Assignment

Owner name: LEICA GEOSYSTEMS AG, SWITZERLAND

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BLASER, DANIEL;FETZEL, RICHARD;GORDON, BIANCA;AND OTHERS;SIGNING DATES FROM 20171208 TO 20180130;REEL/FRAME:044805/0011

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE