WO2021075967A1 - Method and apparatus for optical measurement of environments


Info

Publication number: WO2021075967A1
Application number: PCT/NL2020/050639
Authority: WIPO (PCT)
Prior art keywords: distance, measurements, measurement, taking, directions
Other languages: French (fr)
Inventors: Jeroen Frederik LICHTENAUER, Horatiu Adrian ALEXE, Jan Herman KLUIVER
Original assignee: Xnr B.V.
Application filed by Xnr B.V.
Publication of WO2021075967A1


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 15/00 Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00
    • G01C 15/002 Active optical surveying means
    • G01C 25/00 Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S 17/88 Lidar systems specially adapted for specific applications
    • G01S 17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S 7/497 Means for monitoring or calibrating
    • G01S 7/4972 Alignment of sensor

Definitions

  • a higher accuracy of measurement directions during stationary placement than during movement of the apparatus is achieved, because said first means for distance measurement comprise a first distance sensor, such as a 2D LIDAR, 3D LIDAR, or structured-light 3D scanner, that can measure with a first accuracy of measurement directions, and said second or said third means for distance measurement comprise a second distance sensor, such as a 2D LIDAR, 3D LIDAR, or structured-light 3D scanner, that can measure with a second accuracy of measurement directions, whereby said second accuracy of measurement directions is higher than said first.
  • a higher accuracy of measurement directions during stationary placement than during movement of the apparatus is achieved, because said first means for distance measurement comprise first instructions which, when executed by a microprocessor, cause a distance sensor to measure distances at a first measurement speed, and said second or third means for distance measurement comprise second instructions which, when executed by a microprocessor, cause said distance sensor to measure distances at a second measurement speed, whereby said second measurement speed is slower than said first, allowing for a more accurate orientation, or orientation measurement, of the distance sensor.
  • a higher accuracy of measured distances during stationary placement than during movement of the apparatus is achieved, because said first means for distance measurement comprise a first distance sensor, such as a 2D LIDAR, 3D LIDAR, or structured-light 3D scanner, that can measure with a first accuracy of distance, and said second or third means for distance measurement comprise a second distance sensor, such as a 2D LIDAR, 3D LIDAR, or structured-light 3D scanner, that can measure with a second accuracy of distance, whereby said second accuracy of distance is higher than said first.
  • a higher accuracy of measured distances during stationary placement than during movement of the apparatus is achieved, because said first means for distance measurement comprise first instructions which, when executed by a microprocessor, cause a distance sensor to measure distances at a first measurement speed, and said second or third means for distance measurement comprise second instructions which, when executed by a microprocessor, cause said distance sensor to measure distances at a second measurement speed, whereby said second measurement speed is slower than said first, allowing for a more accurate distance measurement.
  • a higher accuracy of measured distances during stationary placement than during movement of the apparatus is achieved, because said first means for distance measurement comprise first instructions which, when executed by a microprocessor, cause a distance sensor of the time-of-flight/phase-shift type to measure distances with a first set of modulation frequencies, and said second or third means for distance measurement comprise second instructions which, when executed by a microprocessor, cause said distance sensor to measure distances with a second set of modulation frequencies, whereby said second set of modulation frequencies consists of more frequencies than said first, such that the resulting distance accuracy is higher.
  • Fig. 1 illustrates an example usage of one embodiment of the present invention to measure the interior of a building.
  • the actions by a human operator to conduct a scanning procedure may comprise 1) placing the apparatus at a stationary position as close as possible to the middle of a room, 2) commanding the apparatus to take stationary measurements using a mobile device with a wireless connection to the apparatus, 3) carrying the apparatus around the room to let it capture corners that were not visible from the stationary location, 4) placing the apparatus at the next stationary measurement location and repeating these steps until the apparatus has taken stationary measurements at at least one location in every room.
  • Contour 101 in fig. 1 is a 2D floor projection of the vertical surfaces of the interior of a building.
  • Dots 102 to 108 indicate the 2D floor-projected locations where the apparatus has taken measurements from stationary placements.
  • Curves 109 to 115 indicate the 2D floor-projected motion paths that the apparatus has travelled when it was carried from one stationary placement to the next. Along all motion paths, the apparatus has kept taking measurements to continuously estimate the changing location of the apparatus and further complete the building map.
  • a part of a cycle of range measurements taken from stationary placement location 103 is indicated by dashed arrows 116.
  • a closet 118 has completely obstructed the view from location 103 onto a wall 119 behind the closet, causing a gap in the stationary measurements.
  • in this example, the distances were measured by a clockwise, horizontally rotating laser-based distance sensor.
  • a part of the distance measurements taken along motion path 110 is indicated by arrows 117. These measurements include wall 119, which could not be measured from any of the stationary placement locations.
  • an operator may choose to take photographs at locations 104 and 105 with door 120 closed, making it impossible to match the range measurements taken from both locations. But since the operator carried the apparatus through the doorway of door 120, a part of the measurements 121 taken during motion path 112 include parts of the two spaces on either side of the doorway, including the inner side of the door post. Because measurements 121 can be matched to the set of stationary measurements taken from 104 as well as the set of stationary measurements taken from 105, it is possible to link both sets of measurements together into one complete building map nonetheless.
  • Taking measurements from a stationary placement has several advantages over measurements taken while the apparatus is in motion.
  • the height and pose of the sensors in the apparatus with respect to the floor can be known implicitly from the physical dimensions of the apparatus.
  • a range sensor that requires time to make a full scan of the environment has the opportunity to do so without its measurement position changing during the measurement time. This allows the use of, for example, a distance sensor that needs to be rotated around one or two axes to generate a point cloud of many distance measurements or a laser-based distance sensor that makes accurate measurements by measuring the same distance several times with different wavelengths of laser modulation.
  • the proposed method and apparatus overcome the shortcomings of conventional approaches by combining at least two different sensors, of which at least one sensor is particularly suitable for taking measurements from stationary placement, and at least one other sensor is particularly suitable for taking measurements during motion.
  • An apparatus that includes at least one of both kinds can make optimal use of the advantages possible in stationary measurement, while also making optimal use of the advantages of measurement during motion.
  • Fig. 2 illustrates an example of an embodiment of the proposed apparatus.
  • a laser-based distance sensor 201 that may optionally pivot up and down around a horizontal axle 203, or otherwise be fixed in a (near) horizontal direction, is attached to a rotor 204 that can pivot horizontally around a vertical axle 205. Due to being attached to horizontal rotor 204, laser-based distance sensor 201 can take range measurements at different horizontal angles to generate a 2D scan of the environment. In the optional case that laser-based distance sensor 201 can pivot up and/or down as well, it can even generate a 3D scan of the environment. An alternative way to achieve this is by fixing laser-based distance sensor 201 to rotor 204 and deflecting its measurement beam using a reflective surface attached to horizontal axle 203.
  • the laser light of distance sensor 201 preferably has a wavelength in the human-visible spectrum, such as red, green or blue, to allow for automatic calibration of a camera system that is only sensitive to the human-visible spectrum, as will be further explained below.
  • a second laser-based distance sensor 206 that pivots around a vertical axle 208 emits laser light that is invisible to the human eye due to its infrared wavelength.
  • This laser-based distance sensor can scan horizontally and generate a 2D point cloud at a rate that is fast enough to be used for SLAM. Because the laser 207 is not visible to the human eye, a human will not be disturbed by the continuously sweeping projections of the laser onto surfaces of the environment.
  • a frame 209 connects the top parts of the apparatus to the bottom part with minimal interference to the measurements of the range sensors mounted within. For instance, by using a transparent material or narrow vertical rods or slats. Frame 209 may also include electrical conductors to distribute electrical current and communication signals among the different electrical components of the apparatus.
  • An optional wide-angle camera system 214, preferably with a 360-degree horizontal viewing angle, may comprise one or more cameras and lenses. In case the camera system takes several photographs in multiple directions, the photographs can be combined into one wide-angle photograph using calibration parameters stored in memory module 230. The individual camera views may also be calibrated and used separately, in order to maximise accuracy.
  • Camera system 214 can serve at least two distinct purposes in the application for which the apparatus is intended. First of all, the camera system can be used to generate panoramic views and virtual tours of an environment for cinematic, promotional or inspection purposes. Secondly, a photograph from a well-calibrated camera system can be used to calculate the 3D location of any visible point on a plane whose 3D pose with respect to the camera is fully known. This can be used to take 3D measurements of wall distances, ceiling heights, doorways, window frames, etc. For example, knowing the height of the center of camera system 214 above floor 237, the distance of the intersection 224 of wall 238 with the floor can be derived from the angle of line 223 along which 224 is seen.
  • the angle of line 223 can be derived directly from the location within the photograph where 224 can be seen.
  • the height of ceiling 236 can be calculated using the angle of line 221 along which the intersection 222 of the ceiling with the wall is seen in the photograph.
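As an illustration of the trigonometry in the two preceding items, the following sketch (with illustrative numbers; reference numerals follow Fig. 2) computes the wall distance from the depression angle of the wall-floor intersection and the ceiling height from the elevation angle of the ceiling-wall intersection, assuming the camera's viewing angles have already been aligned to the horizontal:

```python
import math

def wall_distance(camera_height_m: float, depression_angle_rad: float) -> float:
    """Horizontal distance to the wall-floor intersection (point 224),
    seen at the given angle below the horizontal along line 223."""
    return camera_height_m / math.tan(depression_angle_rad)

def ceiling_height(camera_height_m: float, wall_distance_m: float,
                   elevation_angle_rad: float) -> float:
    """Ceiling height from the elevation angle of the ceiling-wall
    intersection (point 222, seen along line 221)."""
    return camera_height_m + wall_distance_m * math.tan(elevation_angle_rad)

# Example: optical centre 1.50 m above floor 237; the wall-floor intersection
# is seen 25 degrees below the horizon, the ceiling-wall line 20 degrees above.
d = wall_distance(1.50, math.radians(25.0))      # ~3.22 m to wall 238
h = ceiling_height(1.50, d, math.radians(20.0))  # ~2.67 m ceiling height
print(f"wall distance {d:.2f} m, ceiling height {h:.2f} m")
```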
  • Because laser-based distance sensor 201 uses a wavelength that is visible to camera system 214, the projection of laser 202 on the wall at 235 can be detected within a photograph along line 226 and used for automatic calibration of the alignment of photographs from the camera system.
  • the 3D location of the laser projection point 235 with respect to the camera can be derived from the distance measured with laser beam 202. Knowing 3D location 235 provides horizontal and vertical distances 216 and 225, respectively, between the laser projection 235 and the optical center of camera system 214.
  • Angle 219 between horizontal line 216 and the direction 226 at which the laser projection 235 is seen from the view of camera system 214 is calculated as the arctangent of vertical distance 225 divided by horizontal distance 216.
  • the angle 218 of the laser projection 235 to the horizontal line 215 in the misaligned imaging coordinate system can be derived from the image location of the laser projection 235 detectable in a photograph from camera system 214, using the intrinsic camera calibration parameters, such as focal length and radial lens distortion parameters. Now the camera system's misalignment angle 217 can be calculated as the difference between angle 218 and angle 219.
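A minimal sketch of this single-direction calibration step follows. It assumes a horizontal laser beam 202 mounted at a known, signed height offset from the camera's optical centre; obtaining the observed angle 218 from the pixel location and the intrinsic parameters is outside the sketch:

```python
import math

def misalignment_angle(horizontal_216_m: float, vertical_225_m: float,
                       observed_angle_218_rad: float) -> float:
    """Camera misalignment angle 217.

    horizontal_216_m, vertical_225_m: offsets between laser projection 235
    and the optical centre of camera system 214, derived from the distance
    measured along laser beam 202 and the known mounting geometry.
    observed_angle_218_rad: angle of projection 235 to the image's own
    horizontal line 215, obtained from its pixel location using the
    intrinsic parameters (focal length, radial distortion)."""
    expected_angle_219 = math.atan2(vertical_225_m, horizontal_216_m)
    return observed_angle_218_rad - expected_angle_219

# Example: the laser hits the wall 0.40 m below the optical centre at 3.0 m
# range, but the camera sees it 6.5 degrees below its image horizon.
theta_217 = misalignment_angle(3.0, -0.40, math.radians(-6.5))
print(f"misalignment: {math.degrees(theta_217):+.2f} degrees")  # ~ +1.09
```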
  • the 3D rotation matrix can be determined that correctly aligns images from camera system 214 to the frame of the apparatus.
  • Random Sample Consensus (RANSAC) can be applied to a sufficiently large number of laser projections in different directions to be robust against outliers.
  • laser range measurements, together with the detected image locations of their laser-surface projections, taken at different stationary locations may be combined in the above calibration procedure. If sufficient measurements are available, taken at different stationary locations in order to have sufficient variation in wall distances, it will also be possible to optimise for an unknown origin height and direction of laser-based distance sensor 201, and thus obtain these calibration parameters automatically as well.
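One way the full 3D alignment could be computed is sketched below, assuming numpy and, as inputs, unit direction vectors toward the same laser projections expressed in the apparatus frame (from the laser distances and known laser geometry) and in the camera frame (from the detected image locations). The Kabsch/SVD fit inside the RANSAC loop is a standard technique chosen here for illustration, not an algorithm prescribed by the patent:

```python
import numpy as np

def kabsch_rotation(dirs_frame: np.ndarray, dirs_cam: np.ndarray) -> np.ndarray:
    """Best-fit rotation R with dirs_frame[i] ~ R @ dirs_cam[i] (Kabsch/SVD).
    Inputs are (N, 3) arrays of unit direction vectors toward the same laser
    projections, expressed in the apparatus frame and the camera frame."""
    H = dirs_cam.T @ dirs_frame
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    return Vt.T @ np.diag([1.0, 1.0, d]) @ U.T

def ransac_alignment(dirs_frame: np.ndarray, dirs_cam: np.ndarray,
                     iters: int = 200, thresh_rad: float = 0.01) -> np.ndarray:
    """Fit rotations on minimal 2-direction samples, keep the largest set of
    angular inliers, then refit on all inliers (robust against occasional
    false detections of the laser projection in the images)."""
    rng = np.random.default_rng(0)
    best = None
    for _ in range(iters):
        idx = rng.choice(len(dirs_frame), size=2, replace=False)
        R = kabsch_rotation(dirs_frame[idx], dirs_cam[idx])
        cosines = np.sum(dirs_frame * (dirs_cam @ R.T), axis=1)
        inliers = np.arccos(np.clip(cosines, -1.0, 1.0)) < thresh_rad
        if best is None or inliers.sum() > best.sum():
            best = inliers
    return kabsch_rotation(dirs_frame[best], dirs_cam[best])
```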
  • the above automatic calibration procedure to align the camera system to the apparatus can also be performed with multiple, preferably three, fixed laser-based distance sensors, preferably with a horizontal angle of 90 degrees between their measurement directions.
  • These preferably three fixed distance sensors may be oriented in any direction, as long as their laser projections are visible in images of the camera system. Pointing all of them towards the ceiling or floor around the apparatus, however, may provide the additional benefit that their set of distance measurements may be used to calculate the pose of the ceiling or floor relative to the frame of the apparatus. This will facilitate accurate measurement of wall distances from photographs when the frame of the apparatus is not perfectly or consistently aligned with the floor.
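A sketch of that plane-pose calculation, assuming numpy and three fixed sensors whose origins and (roughly upward) unit directions in the apparatus frame are known; all numbers in the example are illustrative:

```python
import numpy as np

def ceiling_plane(origins: np.ndarray, directions: np.ndarray,
                  distances: np.ndarray):
    """Pose of the ceiling plane in the apparatus frame.

    origins (3, 3): laser origin of each fixed sensor.
    directions (3, 3): unit measurement directions (roughly upward).
    distances (3,): measured distances along those directions.
    Returns the plane's upward unit normal and its offset from the origin."""
    pts = origins + distances[:, None] * directions  # three ceiling points
    normal = np.cross(pts[1] - pts[0], pts[2] - pts[0])
    normal /= np.linalg.norm(normal)
    if normal[2] < 0:  # orient the normal upward
        normal = -normal
    return normal, float(normal @ pts[0])

# Example: three sensors spaced 90 degrees apart in azimuth, each tilted
# 10 degrees outward from vertical, with slightly different measured ranges.
az, tilt = np.radians([0.0, 90.0, 180.0]), np.radians(10.0)
origins = np.stack([0.1 * np.cos(az), 0.1 * np.sin(az), np.full(3, 0.3)], axis=1)
dirs = np.stack([np.sin(tilt) * np.cos(az), np.sin(tilt) * np.sin(az),
                 np.full(3, np.cos(tilt))], axis=1)
n, _ = ceiling_plane(origins, dirs, np.array([2.31, 2.29, 2.35]))
print(f"apparatus tilt relative to the ceiling: {np.degrees(np.arccos(n[2])):.2f} deg")
```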
  • An optional vertical laser-based distance sensor 210 can measure its vertical distance 211 from the ceiling.
  • Ceiling height can be used to generate a 3D model of the environment from horizontal 2D scans of walls and/or to facilitate measurement of wall distances from photographs in case the intersection of a wall with the floor is not clearly visible in the photographs.
  • An optional laser-based distance sensor 212 can measure its distance 213 from the floor. This can be useful when the height of stand 232 of the apparatus is not fixed, in order to know the height of the optical center of the camera system above the floor, as well as to derive the ceiling height from distance measurement 211 by laser-based distance sensor 210.
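A minimal sketch of that derivation, assuming both vertical sensors measure along the same vertical line and the apparatus stands straight; the gap value is illustrative:

```python
def ceiling_height_m(dist_up_m: float, dist_down_m: float,
                     sensor_gap_m: float) -> float:
    """Ceiling height from upward distance 211 (sensor 210), downward
    distance 213 (sensor 212) and the fixed vertical gap between the two
    sensors' origins, which the frame keeps constant even when the height
    of stand 232 is changed."""
    return dist_up_m + dist_down_m + sensor_gap_m

print(f"{ceiling_height_m(0.95, 1.45, 0.20):.2f} m")  # 2.60 m
```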
  • An optional IMU 227 can provide accelerometer data to determine and compensate for the angle of the apparatus with respect to the floor in case the apparatus is not standing perfectly straight, for instance due to an uneven floor surface underneath the base 233 of the apparatus. Furthermore, the SLAM algorithm performed by computing device 229 while the apparatus is moving can use the data from IMU 227 to improve localisation and pose estimation of the apparatus.
  • Computing device 229 may further comprise means to improve the 2D map or 3D model of the environment by combining SLAM data measured during motion with data from laser-based distance sensor 201 and/or laser-based distance sensor 206 measured during known stationary poses.
  • a memory module 230 stores data, such as photographic images, distance measurements and calibration parameters.
  • a communication device 231 with an antenna collects user input from and transmits data to a mobile device through a wireless connection. It can, for instance, receive an instruction from an operator to start a stationary measurement.
  • a hand grip 234 allows an operator to easily carry the apparatus around for moving it to the next stationary measurement location or take extra measurements of the building while moving.


Abstract

An apparatus for measuring shapes of environments comprises means to capture photographic images, preferably with a viewing angle of 360 degrees, and means to measure surface distances in multiple directions by making use of both visible light, preferably red, green or blue, and invisible light, preferably infrared, depending on the mode of operation. A computing device calculates calibration parameters to align images from the camera system with the frame of the apparatus on the basis of distances measured using visible light and the corresponding image locations of their light-reflections detectable in the images of the camera system.

Description

METHOD AND APPARATUS FOR OPTICAL MEASUREMENT OF ENVIRONMENTS
BACKGROUND OF THE INVENTION
The present invention relates to a method and apparatus for collecting data for generating a floor plan, point cloud or 3D model of an environment, such as the interior of a home, the interior of an office building or a cave, comprising capturing photographic images, using a wide-angle camera system in said apparatus, and taking a set of distance measurements in multiple directions using light with a wavelength that is nearly invisible to the human eye, such as near-infrared or ultraviolet, using a distance sensor in the apparatus, such as a 2D LIDAR, a 3D LIDAR or a structured-light 3D scanner.
While known embodiments of such a method or apparatus make use of a camera system to augment the measurements from a distance sensor with distance measurements from images, they do not include a way to automatically compute the calibration coefficients that align the camera view with the distance measurements, thus requiring a complex or time-consuming manual procedure to configure the apparatus each time its camera is (re)placed.
Furthermore, known embodiments use a single distance sensor in the same way for Simultaneous Localisation And Mapping (SLAM) during motion as well as for taking distance measurements during stationary placement, thus providing inferior results compared to using two different means for distance measurement that are specialised for the two different tasks, respectively.
The aim of the present invention is, first of all, to obtain such a method and apparatus that can automatically self-calibrate the alignment of images of the camera system to the frame of the apparatus, so that, for instance, a camera can be (re)placed by a user without requiring the user to go through a complex or time-consuming calibration procedure.
Secondly, another aim of the present invention is to obtain such an apparatus that can make optimal use of the superior possibilities for taking distance measurements during stationary placement of the apparatus while still being able to take useful distance measurements during movement of the apparatus, so that, for instance, the distance measurements taken during motion can be used by a SLAM algorithm to automatically merge and augment the distance measurements taken stationarily at different locations.
BRIEF SUMMARY OF THE INVENTION
According to the present invention, the automatic self-calibration of the alignment of images of the camera system to the frame of the apparatus can be achieved because, in one embodiment, the method and apparatus comprise fixating components of said apparatus with respect to each other, using a frame in said apparatus; capturing photographic images, using a wide-angle camera system in the apparatus, preferably with an effective horizontal viewing angle of 360 degrees; and taking a first set of distance measurements in multiple directions using light with a wavelength that is nearly invisible to the human eye, such as near-infrared or ultraviolet, using first means for distance measurement in the apparatus, such as a 2D LIDAR, a 3D LIDAR or a structured-light 3D scanner, characterized in, that the method and apparatus further comprise taking a second set of one or more distance measurements using light with a wavelength that is visible to the human eye, such as red, green or blue, using second means for distance measurement in the apparatus, such as a LIDAR, a 2D LIDAR, a 3D LIDAR or a structured-light 3D scanner.
It is further contemplated that in one embodiment, the method and apparatus may further comprise transmitting image or measurement data wirelessly to an external device or a data network, using communication means in the apparatus.
It is yet further contemplated that in one embodiment, the method and apparatus may further comprise storing data, such as photographic images, distance measurements or calibration parameters, using storage means in the apparatus.
It is yet further contemplated that in one embodiment, the method and apparatus may further comprise calculating calibration parameters to align images of said camera system with said frame on the basis of said second set of distance measurements and their corresponding light-reflections visible in the images of said camera system, using first data processing means in said apparatus.
It is yet further contemplated that in one embodiment, the method and apparatus may further comprise calculating movements of said frame in a fixed reference coordinate system, using second data processing means in the apparatus, on the basis of matches between landmark points detected in photographic images captured during movement of the apparatus from a first location to a second location, such as an adjoining space or another part of the same space.
Also according to the present invention, making optimal use of the superior possibilities for taking distance measurements during stationary placement of the apparatus, while still being able to take useful distance measurements during movement of the apparatus, can be achieved because in one embodiment, the method and apparatus further comprise taking a third set of measurements of distances in multiple directions, using said first means for distance measurement, whereby the measurement directions with respect to said frame are periodically repeated during said movement of the apparatus.
It is further contemplated that in one embodiment, the method and apparatus may further comprise taking, during stationary placement of the apparatus at said second location, a fourth set of measurements of distances in multiple directions, using said second means for distance measurement or third means for distance measurement in said apparatus, such as a 2D LIDAR, a 3D LIDAR or a structured-light 3D scanner, whereby one measurement cycle of the range of measurement directions required for said third set of measurements is completed in a shorter amount of time using said first means for distance measurement than one measurement cycle of the range of measurement directions required for said fourth set of measurements is completed using said second or said third means for distance measurement.
It is yet further contemplated that in one embodiment, the method and apparatus may further comprise calculating relative point locations corresponding to measured distances and their known directions within the coordinate system of said frame, the movements of said frame and the point locations of said relative point locations in a fixed reference coordinate system, using third data processing means in said apparatus.
BRIEF SUMMARY OF THE DRAWING
Fig. 1 illustrates an example usage of one embodiment of the present invention to measure the interior of a building.
Fig. 2 is a conceptual diagram of an embodiment of the apparatus to which the present invention relates.
DETAILED DESCRIPTION OF THE INVENTION
The following description is merely exemplary in nature and is not intended to limit the present disclosure, application, or uses. It should be understood that throughout the drawings, corresponding reference numerals indicate like or corresponding parts and features. Before the present invention is described in further detail, it is to be understood that the invention is not limited to the particular embodiments described, as such may, of course, vary. It is also to be understood that the terminology used herein is for the purpose of describing particular embodiments only, and is not intended to be limiting, since the scope of the present invention will be limited only by the appended claims.
Although the invention has been described with reference to a particular embodiment, this description is not meant to be construed in a limiting sense. Various modifications of the disclosed embodiments as well as alternative embodiments of the invention will become apparent to persons skilled in the art to which this invention belongs. It is therefore contemplated that the appended claims will cover any such modifications or embodiments that fall within the scope of the invention.
Unless otherwise noted, the drawings of the present application are not necessarily drawn to scale. They demonstrate the basic relationship of the constituent parts, but not necessarily their respective sizes.
It must be noted that, as used herein and in the appended claims, the singular forms "a", "an", and "the" include plural referents unless the context clearly dictates otherwise.
The publications discussed herein are provided solely for their disclosure prior to the filing date of the present application. Nothing herein is to be construed as an admission that the present invention is not entitled to antedate such publication by virtue of prior invention. Further, the dates of publication provided may be different from the actual publication dates which may need to be independently confirmed.
Hereafter the features of the present invention are described in more detail and their practical and economical significance is explained.
A desired property of an apparatus used for Simultaneous Localisation And Mapping (SLAM) is that it does not disturb people, such as a person who carries the apparatus through an environment. Therefore, an optical multi-directional distance sensor used for SLAM, such as a 2D LIDAR, 3D LIDAR or structured-light 3D scanner, often uses light with a wavelength outside of the spectrum that is easily visible to the human eye, for example near-infrared or ultraviolet.
On the other hand, if the projected light at the distance measurement locations in the environment were clearly detectable in the camera images, it would be possible to automatically compute the calibration parameters to align the images from the camera system with the optical distance measurements. And if the geometrical relationship between the optical distance measurements and the frame of the apparatus is known, the images of the camera system could then be aligned to the frame of the apparatus, as well as to any other part of the apparatus whose geometrical relationship to the frame is known.
Unfortunately, to make a camera system suitable for producing images that are visually realistic to humans, it is necessary to avoid sensitivity to infrared light, which is often achieved by incorporating an infrared-blocking filter in a camera. This makes it impossible to detect the projections from an infrared-light-based distance sensor in photographic images from such a camera.
This conflict between requirements is overcome in the presently disclosed method and apparatus because in one embodiment the method and apparatus do not only comprise taking a first set of distance measurements in multiple directions using light with a wavelength that is nearly invisible to the human eye, such as near-infrared or ultraviolet, using first means for distance measurement in said apparatus, such as a 2D LIDAR, a 3D LIDAR or a structured-light 3D scanner, but are characterized in, that the method and apparatus further comprise taking a second set of one or more distance measurements using light with a wavelength that is visible to the human eye (and thus is also detectable by a camera system that is meant for the visible light spectrum), such as red, green or blue, using second means for distance measurement in said apparatus, such as a LIDAR, a 2D LIDAR, a 3D LIDAR or a structured-light 3D scanner.
As long as no visible-light-based distance sensor is used for SLAM when the apparatus is carried around by a person, no one has to be disturbed by it. Once the calibration parameters to align images from the camera system to the frame of the apparatus are computed using said second set of distance measurements and their corresponding light-reflections visible in the images of said camera system, the parameters can be stored by storage means in the apparatus or elsewhere and used for as long as the camera system remains fixated by the frame of the apparatus.
In one embodiment of the present invention, said first means for distance measurement comprise a first optical distance sensor emitting invisible light, such as near-infrared or ultraviolet, and said second means for distance measurement comprise a second optical distance sensor emitting visible light, such as red, green or blue.
In another embodiment of the present invention, said first means for distance measurement comprise a first light source inside a LIDAR unit emitting invisible light, such as near-infrared or ultraviolet, and said second means for distance measurement comprise a second light source inside said LIDAR unit emitting visible light, such as red, green or blue.
In yet another embodiment of the present invention, said first means for distance measurement comprise a first light source inside a projector of a structured-light 3D scanner emitting invisible light, such as near-infrared or ultraviolet, and said second means for distance measurement comprise a second light source inside said projector of said structured-light 3D scanner emitting visible light, such as red, green or blue.
Another desired property of an apparatus used for SLAM is that it can make accurate geometric measurements of all relevant parts of the environment. During stationary placement, a room can be measured with a multi-directional distance sensor that provides the widest possible coverage of measurement directions, the largest possible range of distances, the highest possible density of measurement directions, the highest possible accuracy in measured distances and/or the highest possible accuracy in measurement directions. For example, a typical dual-axis laser scanner provides a point cloud of 3D measurements in nearly all directions. However, when a laser-based distance sensor scans around not one but two axes, the scanning time grows quadratically for the same density of measurement directions, so it can take a significant amount of time to finish one scanning cycle. It typically takes several minutes to make a high-density 3D scan from one stationary location. Similarly, a laser-based distance sensor of the time-of-flight type using the phase-shift measurement method may take several measurements of the same distance with different modulation frequencies in order to get the most accurate distance measurement with the largest possible range between the minimum and maximum distance that it can measure.
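Back-of-the-envelope arithmetic makes the quadratic growth concrete (the angular spacing and measurement rate below are assumptions, not values from the patent):

```python
# Same 0.1-degree angular spacing, one axis versus two, at an assumed rate
# of 50,000 distance measurements per second.
rate_per_s = 50_000
single_axis_points = 360 / 0.1                 # 3,600 per horizontal sweep
dual_axis_points = (360 / 0.1) * (180 / 0.1)   # 6,480,000 for a full sphere

print(f"single-axis cycle: {single_axis_points / rate_per_s * 1e3:.0f} ms")   # 72 ms
print(f"dual-axis cycle:   {dual_axis_points / rate_per_s / 60:.1f} minutes") # 2.2 minutes
```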
Unfortunately, for a laser-based distance sensor to be useful for SLAM it has to be able to take range measurements during movement. It is to be understood that by movement we mean changes in location and height, as well as in horizontal and vertical orientation. Measuring distances during movement excludes the possibility of taking several time-of-flight measurements at the same location in the same direction with different modulation frequencies. Furthermore, SLAM requires a full scan to be completed before the apparatus has moved too far to have enough data overlap with a previous scan to automatically merge subsequent measurements and estimate the new location. This limits the density or coverage of directions in which distances can be measured during one period of the repeated measurement cycle.
This is why multi-directional laser-based distance sensors used for SLAM in indoor environments typically only scan around one axis rather than two and/or use faster but inferior methods of measuring distances. Fast but inferior distance measurement methods may include, for example, time-of-flight phase-shift with only one modulation frequency (having lower accuracy of measured distances), or triangulation (which is unsuitable for large distances). A distance sensor using triangulation may either use a laser, to measure in one direction at a time, or use a structured light pattern to scan an entire area within one camera image capture. Besides the time it takes, another drawback of using 3D point clouds from a dual-axis range scanner for SLAM is that it requires significantly more data storage capacity and/or significantly more computational power to process the data than SLAM based on 2D data from single-axis scans. In rooms where most walls are vertical and the apparatus is carried in a near-vertical pose, a 2D SLAM approach can suffice to map a single floor and augment 3D measurements taken at stationary locations. An Inertial Measurement Unit (IMU) can be used to compensate for deviations in measured distances of points on vertical walls due to non-horizontal measurement angles.
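A sketch of such an IMU-based correction under a small-tilt approximation (the sign conventions and the approximation itself are illustrative assumptions, not the patent's prescribed formula):

```python
import math

def horizontal_distance(measured_m: float, pitch_rad: float, roll_rad: float,
                        beam_azimuth_rad: float) -> float:
    """Project a range measured in a tilted 2D scan plane onto the horizontal.

    For small tilts, a beam at the given azimuth within the scan plane sits
    at an elevation approximated from the IMU pitch and roll; for a point on
    a vertical wall, the horizontal distance is then the measured range
    times the cosine of that elevation."""
    elevation = (pitch_rad * math.cos(beam_azimuth_rad)
                 + roll_rad * math.sin(beam_azimuth_rad))
    return measured_m * math.cos(elevation)

# A 4.00 m reading taken while the apparatus pitches 5 degrees forward:
print(f"{horizontal_distance(4.00, math.radians(5.0), 0.0, 0.0):.3f} m")  # 3.985 m
```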
If 3D motion estimation is required, an alternative to a 3D range scanner is to use a camera-based SLAM method such as MonoSLAM or Semi-direct Visual Odometry (SVO).
These methods track sets of detected salient landmarks over multiple photographs from a moving camera. The changes in image locations of a set of points matched across two or more photographs can be used to simultaneously calculate the 3D locations of these points as well as the movement of the camera between those photographs.
One downside of these camera-based motion estimation methods is that they are based on visible details in the environment and thus do not measure point locations in the middle of evenly coloured surfaces, such as most walls and ceilings. Only the surface corners and corners of objects attached to even surfaces can be located. The rest of the surface has to be inferred by assuming flatness between localised corners. Another downside of camera-based SLAM methods is that they cannot solve for scale. This has to be provided by an additional sensor measuring physical distances or displacements. Camera-based SLAM can be useful to complement other distance sensors, but may not suffice on its own to measure certain environments such as the interiors of buildings.
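A sketch of how one physically measured distance can fix the scale of a camera-only reconstruction (the data are hypothetical; the physical distance could come, for example, from one of the apparatus's laser-based distance sensors):

```python
import numpy as np

def apply_metric_scale(points: np.ndarray, idx_a: int, idx_b: int,
                       measured_distance_m: float) -> np.ndarray:
    """Scale an up-to-scale camera-based reconstruction to metric units,
    given one physically measured distance between two reconstructed
    points."""
    reconstructed = np.linalg.norm(points[idx_a] - points[idx_b])
    return points * (measured_distance_m / reconstructed)

# Two landmarks reconstructed 0.8 "units" apart are measured to be 3.2 m
# apart, so every point in the map is scaled by a factor of 4.
pts = np.array([[0.0, 0.0, 0.0], [0.8, 0.0, 0.0], [0.4, 0.3, 0.1]])
print(apply_metric_scale(pts, 0, 1, 3.2))
```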
For greater accuracy, improved usability or cost reduction, the above-mentioned contradictions and trade-offs between sensor requirements for SLAM and high-quality distance measurements are overcome in one embodiment of the presently disclosed method and apparatus, because the method and apparatus further comprise taking a third set of measurements of distances in multiple directions, using said first means for distance measurement, whereby the measurement directions with respect to said frame are periodically repeated during movement of the apparatus from a first location to a second location, such as an adjoining space or another part of the same space; and taking, during stationary placement of the apparatus at said second location, a fourth set of measurements of distances in multiple directions, using said second means for distance measurement or third means for distance measurement in the apparatus, such as a 2D LIDAR, a 3D LIDAR or a structured-light 3D scanner, whereby one measurement cycle of the range of measurement directions required for said third set of measurements is completed in a shorter amount of time using said first means for distance measurement than one measurement cycle of the range of measurement directions required for said fourth set of measurements is completed using said second or said third means for distance measurement.
For example, said second or third means for distance measurement may support a wider coverage of measurement directions, a larger range of distances, a higher density of measurement directions, a higher accuracy in measured distances or a higher accuracy in measurement directions than said first means for distance measurement.
In one embodiment of the present invention, a wider coverage of measurement directions is achieved during stationary placement than during movement of the apparatus, because said first means for distance measurement comprise a first axle that rotates a distance sensor in one direction for a 2D point scan, and said second or third means for distance measurement comprise a second axle that, together with said first axle, rotates said distance sensor in two directions for a 3D point scan.
In another embodiment of the present invention, a wider coverage of measurement directions is achieved during stationary placement than during movement of the apparatus, because said first means for distance measurement comprise a first distance sensor, such as a LIDAR or a structured-light 3D scanner, that rotates around one axis for a 2D point scan or a narrow 3D scan, and said second or third means for distance measurement comprise a second distance sensor, such as a LIDAR or a structured-light 3D scanner, that rotates around two axes for a wider 3D scan.
In yet another embodiment of the present invention, a wider coverage of measurement directions is achieved during stationary placement than during movement of the apparatus, because said first means for distance measurement comprise first instructions which, when executed by a microprocessor, cause a distance sensor to repeatedly measure distances within a first range of directions, and said second or third means for distance measurement comprise second instructions which, when executed by a microprocessor, cause said distance sensor to measure distances within a second range of directions, whereby said second range of directions is wider than said first range of directions.
In yet another embodiment of the present invention, a larger measurable range of distances during stationary placement than during movement of the apparatus is achieved, because said first means for distance measurement comprise a first distance sensor, such as a 2D LIDAR, 3D LIDAR or structured-light 3D scanner, with a first range of measurable distances and said second or third means for distance measurement comprise a second distance sensor, such as a 2D LIDAR, 3D LIDAR, or structured-light 3D scanner, with a second range of measurable distances, whereby said second range of measurable distances is larger than said first range of measurable distances.
In yet another embodiment of the present invention, a larger measurable range of distances during stationary placement than during movement of the apparatus is achieved, because said first means for distance measurement comprise first instructions which, when executed by a microprocessor, cause a distance sensor of the time-of-flight/phase-shift type to measure distances with a first set of modulation frequencies, and said second or third means for distance measurement comprise second instructions which, when executed by a microprocessor, cause said distance sensor to measure distances with a second set of modulation frequencies, whereby said second set of modulation frequencies consists of more frequencies than said first, such that the resulting range of measurable distances is larger.
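To make the frequency trade-off concrete: with phase-shift ranging, the unambiguous range of a single modulation frequency f is c/(2f), so adding a lower (coarse) frequency extends the measurable range while a higher (fine) frequency preserves resolution. A simplified numeric sketch of this textbook two-frequency scheme (not the sensor's actual firmware):

```python
C = 299_792_458.0  # speed of light in m/s

def unambiguous_range(f_mod: float) -> float:
    """Maximum distance measurable without integer ambiguity at f_mod (Hz)."""
    return C / (2.0 * f_mod)

def distance_two_freq(phase_coarse: float, phase_fine: float,
                      f_coarse: float, f_fine: float) -> float:
    """Combine a coarse and a fine phase measurement into one distance.

    Phases are given as fractions of a full modulation cycle in [0, 1).
    The coarse (low) frequency gives a rough but unambiguous distance;
    it is used to pick the integer number of fine-frequency cycles.
    """
    d_coarse = phase_coarse * unambiguous_range(f_coarse)
    fine_period = unambiguous_range(f_fine)
    n = round((d_coarse - phase_fine * fine_period) / fine_period)
    return (n + phase_fine) * fine_period

# E.g. 1 MHz coarse modulation: ~150 m unambiguous range, low resolution;
# 100 MHz fine modulation: only ~1.5 m of unambiguous range, but one
# degree of phase corresponds to ~4 mm, hence the higher accuracy.
```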
In yet another embodiment of the present invention, a higher density of measurement directions during stationary placement than during movement of the apparatus is achieved, because said first means for distance measurement comprise a first distance sensor, such as a 2D LIDAR, 3D LIDAR, or structured-light 3D scanner, that can measure with a first density of measurement directions, and said second or third means for distance measurement comprise a second distance sensor, such as a 2D LIDAR, 3D LIDAR, or structured-light 3D scanner, that can measure with a second density of measurement directions, whereby said second density of measurement directions is higher than said first.

In yet another embodiment of the present invention, a higher density of measurement directions during stationary placement than during movement of the apparatus is achieved, because said first means for distance measurement comprise first instructions which, when executed by a microprocessor, cause a distance sensor to measure distances with a first density of measurement directions, and said second or third means for distance measurement comprise second instructions which, when executed by a microprocessor, cause said distance sensor to measure distances with a second density of measurement directions, whereby said second density of measurement directions is higher than said first.
In yet another embodiment of the present invention, a higher accuracy of measurement directions during stationary placement than during movement of the apparatus is achieved, because said first means for distance measurement comprise a first distance sensor, such as a 2D LIDAR, 3D LIDAR, or structured-light 3D scanner, that can measure with a first accuracy of measurement directions, and said second or said third means for distance measurement comprise a second distance sensor, such as a 2D LIDAR, 3D LIDAR, or structured-light 3D scanner, that can measure with a second accuracy of measurement directions, whereby said second accuracy of measurement directions is higher than said first.
In yet another embodiment of the present invention, a higher accuracy of measurement directions during stationary placement than during movement of the apparatus is achieved, because said first means for distance measurement comprise first instructions which, when executed by a microprocessor, cause a distance sensor to measure distances at a first measurement speed, and said second or third means for distance measurement comprise second instructions which, when executed by a microprocessor, cause said distance sensor to measure distances at a second measurement speed, whereby said second measurement speed is slower than said first, allowing for a more accurate orientation, or a more accurate orientation measurement, of the distance sensor.
In yet another embodiment of the present invention, a higher accuracy of measured distances during stationary placement than during movement of the apparatus is achieved, because said first means for distance measurement comprise a first distance sensor, such as a 2D LIDAR, 3D LIDAR, or structured-light 3D scanner, that can measure with a first accuracy of distance, and said second or third means for distance measurement comprise a second distance sensor, such as a 2D LIDAR, 3D LIDAR, or structured-light 3D scanner, that can measure with a second accuracy of distance, whereby said second accuracy of distance is higher than said first.
In yet another embodiment of the present invention, a higher accuracy of measured distances during stationary placement than during movement of the apparatus is achieved, because said first means for distance measurement comprise first instructions which, when executed by a microprocessor, cause a distance sensor to measure distances at a first measurement speed, and said second or third means for distance measurement comprise second instructions which, when executed by a microprocessor, cause said distance sensor to measure distances at a second measurement speed, whereby said second measurement speed is slower than said first, allowing for a more accurate distance measurement.
In yet another embodiment of the present invention, a higher accuracy of measured distances during stationary placement than during movement of the apparatus is achieved, because said first means for distance measurement comprise first instructions which, when executed by a microprocessor, cause a distance sensor of the time-of-flight/phase-shift type to measure distances with a first set of modulation frequencies, and said second or third means for distance measurement comprise second instructions which, when executed by a microprocessor, cause said distance sensor to measure distances with a second set of modulation frequencies, whereby said second set of modulation frequencies consists of more frequencies than said first, such that the resulting distance accuracy is higher.
Fig. 1 illustrates an example usage of one embodiment of the present invention to measure the interior of a building. The actions by a human operator to conduct a scanning procedure may comprise 1) placing the apparatus at a stationary position as close as possible to the middle of a room, 2) commanding the apparatus to take stationary measurements using a mobile device with a wireless connection to the apparatus, 3) carrying the apparatus around the room to let it capture corners that were not visible from the stationary location, 4) placing the apparatus at the next stationary measurement location and repeating these steps until the apparatus has taken stationary measurements at at least one location in every room.
Contour 101 in fig. 1 is a 2D floor projection of the vertical surfaces of the interior of a building. Dots 102 to 108 indicate the 2D floor-projected locations where the apparatus has taken measurements from stationary placements. Curves 109 to 115 indicate the 2D floor-projected motion paths that the apparatus has travelled when it was carried from one stationary placement to the next. Along all motion paths, the apparatus has kept taking measurements to continuously estimate the changing location of the apparatus and further complete the building map.
Such a method of combining subsequent geometric measurements taken during motion into one map is well known in the art to which this invention belongs as Simultaneous Localisation And Mapping (SLAM).
A part of a cycle of range measurements taken from stationary placement location 103 is indicated by dashed arrows 116. A closet 118 has completely obstructed the view from location 103 onto a wall 119 behind the closet, causing a gap in the stationary measurements. Along all motion paths 109 to 115, a clockwise horizontally rotating laser-based distance sensor has measured distances. A part of the distance measurements during motion path 110 is indicated by arrows 117. These measurements include wall 119, which could not be measured from any of the stationary placement locations. By carrying the apparatus around, an operator can quickly complete the building map with only a minimal number of stationary placements of the apparatus. Furthermore, an operator may choose to take photographs at locations 104 and 105 with door 120 closed, making it impossible to match the range measurements taken from those two locations to each other directly. But since the operator carried the apparatus through the doorway of door 120, a part of the measurements 121 taken during motion path 112 includes parts of the two spaces on either side of the doorway, including the inner side of the door post. Because measurements 121 can be matched to the set of stationary measurements taken from 104 as well as the set of stationary measurements taken from 105, it is possible to link both sets of measurements together into one complete building map nonetheless.
Taking measurements from a stationary placement has several advantages over measurements taken while the apparatus is in motion. First of all, when the apparatus is placed on the floor, the height and pose of the sensors in the apparatus, with respect to the floor, can be known implicitly from the physical dimensions of the apparatus. Secondly, a range sensor that requires time to make a full scan of the environment has the opportunity to do so without its measurement position changing during the measurement time. This allows the use of, for example, a distance sensor that needs to be rotated around one or two axes to generate a point cloud of many distance measurements, or a laser-based distance sensor that makes accurate measurements by measuring the same distance several times with different modulation frequencies of the laser light. Furthermore, low-noise photographs can be taken without any motion blur while the apparatus is standing still, even if the lighting conditions demand a long exposure. By taking several subsequent photographs with different exposure times from the exact same location, a High Dynamic Range (HDR) image can be generated as well, allowing all details of the room to be visible in one combined image, even if there is extreme contrast in the lighting conditions.
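As an illustration of the HDR step, a minimal sketch using OpenCV's Debevec calibration and merge (one standard recipe; the disclosure does not prescribe a particular HDR algorithm, and the file names and exposure times below are made-up example values):

```python
import cv2
import numpy as np

# Bracketed exposures taken from one stationary placement.
files = ["exp_short.jpg", "exp_mid.jpg", "exp_long.jpg"]
times = np.array([1 / 250, 1 / 60, 1 / 15], dtype=np.float32)
images = [cv2.imread(f) for f in files]

# Recover the camera response curve, then merge into a radiance map.
response = cv2.createCalibrateDebevec().process(images, times)
hdr = cv2.createMergeDebevec().process(images, times, response)

# Tone-map the radiance map to a displayable 8-bit image.
ldr = cv2.createTonemap(gamma=2.2).process(hdr)
cv2.imwrite("room_hdr.png", np.clip(ldr * 255, 0, 255).astype(np.uint8))
```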
Conversely, taking measurements while carrying the apparatus around has other advantages. Unlike with stationary measurements, no time is spent walking away from the apparatus to stay out of its view while it takes measurements and then walking back to pick it up again before moving to the next measurement location. By measuring while the apparatus is carried around, many different measurement viewpoints can be covered in a short amount of time, making it possible to quickly measure every wall and corner of a building. Furthermore, when subsequent measurements during motion are taken with only small differences in the location of the apparatus, it is more likely that the subsequent measurements have sufficient overlap with each other to automatically match them all together into one complete 2D map or 3D model of the entire environment.
The downside of measurements taken during motion is that many more matches between measured locations are needed to merge all measurements into one 2D map or 3D model. Due to measurement noise and matching ambiguities, each match inevitably has at least some small error. Many small errors together can accumulate into significant distortions of the estimated motion path and the reconstruction of the 2D map or 3D building model. Combining a few sets of stationary measurements (at least one in every room) with the measurements along the motion paths can result in a 2D map or 3D model of the interior of a building that is more complete, more accurate and/or faster to capture than what can be achieved with either approach separately.
The proposed method and apparatus overcome the shortcomings of conventional approaches by combining at least two different sensors, of which at least one sensor is particularly suitable for taking measurements from stationary placement, and at least one other sensor is particularly suitable for taking measurements during motion.
An apparatus that includes at least one sensor of each kind can make optimal use of the advantages possible in stationary measurement, while also making optimal use of the advantages of measurement during motion.
Fig. 2 illustrates an example of an embodiment of the proposed apparatus. A laser-based distance sensor 201, which may optionally pivot up and down around a horizontal axle 203 or otherwise be fixed in a (near) horizontal direction, is attached to a rotor 204 that can pivot horizontally around a vertical axle 205. Due to being attached to horizontal rotor 204, laser-based distance sensor 201 can take range measurements at different horizontal angles to generate a 2D scan of the environment. In the optional case that laser-based distance sensor 201 can pivot up and/or down as well, it can even generate a 3D scan of the environment. An alternative way to achieve this is by fixing laser-based distance sensor 201 to rotor 204 and deflecting its measurement beam using a reflective surface attached to horizontal axle 203.
The laser light of distance sensor 201 preferably has a wavelength in the human-visible spectrum, such as red, green or blue, to allow for automatic calibration of a camera system that is only sensitive to the human-visible spectrum, as will be further explained below.
A second laser-based distance sensor 206 that pivots around a vertical axle 208 emits laser light that is invisible to the human eye due to its infrared wavelength. This laser-based distance sensor can scan horizontally and generate a 2D point cloud at a rate that is fast enough to be used for SLAM. Because the laser 207 is not visible to the human eye, a human will not be disturbed by the continuously sweeping projections of the laser onto surfaces of the environment.
A frame 209 connects the top parts of the apparatus to the bottom part with minimal interference to the measurements of the range sensors mounted within, for instance by using a transparent material or narrow vertical rods or slats. Frame 209 may also include electrical conductors to distribute electrical current and communication signals among the different electrical components of the apparatus. An optional wide-angle camera system 214, preferably with a 360 degree horizontal viewing angle, may comprise one or more cameras and lenses. In case the camera system takes several photographs in multiple directions, the photographs can be combined into one wide-angle photograph using calibration parameters stored in memory module 230. The individual camera views may also be calibrated and used separately, in order to maximise accuracy.
Camera system 214 can serve at least two distinct purposes in the application for which the apparatus is intended. First of all, the camera system can be used to generate panoramic views and virtual tours of an environment for cinematic, promotional or inspection purposes. Secondly, a photograph from a well-calibrated camera system can be used to calculate the 3D location of any visible point on a plane whose 3D pose with respect to the camera is fully known. This can be used to take 3D measurements of wall distances, ceiling heights, doorways, window frames, etc. For example, knowing the height of the center of camera system 214 above the floor 237, the distance of the intersection 224 of wall 238 with the floor can be derived from the angle of line 223 along which 224 is seen. With a fully calibrated camera system, the angle of line 223 can be derived directly from the location within the photograph where 224 can be seen. Similarly, knowing the distance of wall 238 from the camera and assuming that the wall is vertical, the height of ceiling 236 can be calculated using the angle of line 221 along which the intersection 222 of the ceiling with the wall is seen in the photograph. By stacking several such measurements for surfaces that can be assumed perpendicular to each other, complex architectural structures can be measured from just a single photograph. This method is also known from patent JP2015125002(A).
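The two derivations in this paragraph amount to simple trigonometry. A minimal sketch under the stated assumptions (known camera height, vertical wall, horizontal floor; angles taken relative to the camera's horizontal plane; the numeric values are examples only):

```python
import math

def wall_distance(camera_height: float, depression_angle: float) -> float:
    """Horizontal distance to the wall/floor intersection (point 224),
    seen at depression_angle (radians) below the camera's horizontal."""
    return camera_height / math.tan(depression_angle)

def ceiling_height(camera_height: float, wall_dist: float,
                   elevation_angle: float) -> float:
    """Ceiling height from the elevation angle (radians) at which the
    wall/ceiling intersection (point 222) is seen, for a vertical wall."""
    return camera_height + wall_dist * math.tan(elevation_angle)

# Camera center 1.60 m above the floor; floor line seen 35 degrees below
# horizontal gives a wall ~2.29 m away; ceiling line seen 28 degrees
# above horizontal then gives a ceiling height of ~2.81 m.
d = wall_distance(1.60, math.radians(35.0))
h = ceiling_height(1.60, d, math.radians(28.0))
```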
Combining photographic data and data from other sensors, such as distance sensor 201 and IMU 227, requires an accurate alignment between the camera system and the frame of the apparatus. Unfortunately, due to unpredictable variations in production, calibration and/or attachment of camera system 214, the coordinate system of the photographs may not be perfectly nor consistently aligned with the coordinate system of the frame of the apparatus. This is illustrated in fig. 2, where the direction 215 of the optical horizontal plane in the imaging coordinate system has a different orientation than the horizontal plane 216 of the frame of the apparatus. If the parameters of this misalignment are known, it can be compensated for. Computing device 228 performs automatic calculation of these parameters as follows:
Because laser-based distance sensor 201 uses a wavelength that is visible to camera system 214, the projection of laser 202 on the wall at 235 can be detected within a photograph along line 226 and used for automatic calibration of the alignment of photographs from the camera system.
Assuming that the 3D direction 202 and origin 203 of laser-based distance sensor 201 are known with respect to the optical center of camera 214, then the 3D location of the laser projection point 235 with respect to the camera can be derived from the distance measured with laser beam 202. Knowing 3D location 235 provides horizontal and vertical distances 216 and 225, respectively, between the laser projection 235 and the optical center of camera system 214.
Angle 219 between horizontal line 216 and the direction 226 at which the laser projection 235 is seen from the view of camera system 214 is calculated as the arctangent of vertical distance 225 divided by horizontal distance 216. The angle 218 of the laser projection 235 to the horizontal line 215 in the misaligned imaging coordinate system can be derived from the image location of the laser projection 235 detectable in a photograph from camera system 214, using the intrinsic camera calibration parameters, such as focal length and radial lens distortion parameters. Now the camera system's misalignment angle 217 can be calculated as the difference between angle 218 and angle 219. If the misalignment angle of camera system 214 is calculated in this same way with wall-projections of laser-based distance sensor 201 in two or more different horizontal directions around the apparatus (preferably 90 degrees apart), the 3D rotation matrix can be determined that correctly aligns images from camera system 214 to the frame of the apparatus.
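A minimal sketch of this one-direction calculation (the coordinate convention, with y as the vertical axis relative to the camera's optical center, is an assumption of this example; the patent only fixes the angle relationships between 216, 217, 218, 219 and 225):

```python
import math

def misalignment_angle(laser_range: float,
                       laser_origin: tuple, laser_dir: tuple,
                       image_angle: float) -> float:
    """One-direction estimate of the camera misalignment (angle 217).

    laser_origin, laser_dir: known 3D origin and unit direction of the
    laser beam relative to the camera's optical center; laser_range:
    measured distance along the beam; image_angle: elevation (angle 218)
    of the detected laser dot relative to the image's own horizontal,
    derived from its pixel location and the intrinsic calibration.
    """
    # 3D location of the laser projection point (235).
    px = laser_origin[0] + laser_range * laser_dir[0]
    py = laser_origin[1] + laser_range * laser_dir[1]
    pz = laser_origin[2] + laser_range * laser_dir[2]
    horizontal = math.hypot(px, pz)          # horizontal distance (216)
    true_angle = math.atan2(py, horizontal)  # angle 219
    return image_angle - true_angle          # angle 217 = 218 - 219
```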
Automatic detection of the bright-coloured dot of the wall-projection of a laser visible within a photograph is a trivial matter for those skilled in the art of computer vision. In case there are false positive laser dot detections, Random Sample Consensus (RANSAC) can be applied to a sufficiently high number of laser projections in different directions.
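A minimal sketch of such an outlier-rejection step, assuming each detected laser dot yields one misalignment-angle estimate as computed above (the threshold and iteration count are arbitrary example values):

```python
import random

def ransac_consensus(angle_estimates, threshold=0.005, iterations=200):
    """Robust average of per-dot misalignment-angle estimates (radians).

    False laser-dot detections produce outlier estimates. Sample one
    estimate as a candidate model, count how many estimates agree with
    it, and return the mean of the largest agreeing (inlier) set.
    """
    best = []
    for _ in range(iterations):
        candidate = random.choice(angle_estimates)
        inliers = [a for a in angle_estimates
                   if abs(a - candidate) < threshold]
        if len(inliers) > len(best):
            best = inliers
    return sum(best) / len(best)
```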
For improved robustness and accuracy, laser range measurements together with the detected image locations of their laser-surface projections, taken at different stationary locations, may be combined in the above calibration procedure. If sufficient measurements are available, taken at different stationary locations in order to have sufficient variation in wall distances, it will also be possible to optimise for an unknown origin height and direction of the laser-based distance sensor 201, and thus obtain these calibration parameters automatically as well.
Instead of using one rotating range finder such as 201, the above automatic calibration procedure to align the camera system to the apparatus can also be performed with multiple, preferably three, fixed laser-based distance sensors, preferably with a horizontal angle of 90 degrees between their measurement directions. These preferably three fixed distance sensors may be oriented in any direction, as long as their laser projections are visible in images of the camera system. Pointing all of them towards the ceiling or floor around the apparatus, however, may provide the additional benefit that their set of distance measurements may be used to calculate the pose of the ceiling or floor relative to the frame of the apparatus. This will facilitate accurate measurement of wall distances from photographs when the frame of the apparatus is not perfectly or consistently aligned with the floor.
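For example, the pose of the floor or ceiling plane can be recovered from three such fixed-beam measurements once they are converted to 3D points in the frame coordinate system. A minimal sketch of the plane fit (the exact three-point formulation is an assumption of this example; more points would allow a least-squares fit):

```python
import numpy as np

def plane_from_points(p1, p2, p3):
    """Plane through three 3D points (e.g. three fixed-beam laser hits on
    the floor or ceiling, in frame coordinates). Returns a unit normal n
    and offset d such that n . x = d for every point x on the plane."""
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    n = np.cross(p2 - p1, p3 - p1)
    n /= np.linalg.norm(n)
    return n, float(n @ p1)

# The angle between n and the frame's vertical axis then gives the tilt
# of the apparatus relative to the floor or ceiling.
```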
An optional vertical laser-based distance sensor 210 can measure its vertical distance 211 from the ceiling. Ceiling height can be used to generate a 3D model of the environment from horizontal 2D scans of walls and/or to facilitate measurement of wall distances from photographs in case the intersection of a wall with the floor is not clearly visible in the photographs.
Instead of using multiple fixed laser-based distance sensors, similar results may be achieved using one or more structured-light 3D scanners.
An optional laser-based distance sensor 212 can measure its distance 213 from the floor. This can be useful when the height of stand 232 of the apparatus is not fixed, in order to know the height of the optical center of the camera system above the floor, as well as to derive the ceiling height from distance measurement 211 by laser-based distance sensor 210.
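Combining the optional upward and downward sensors, the ceiling height follows from simple addition. A sketch, assuming both beams are vertical and that the vertical separation between the two sensors' origins on the frame is a known constant (that separation parameter is an assumption of this example):

```python
def room_height(dist_up: float, dist_down: float,
                sensor_separation: float) -> float:
    """Ceiling height above the floor, assuming both beams are vertical.

    dist_up: reading of sensor 210 towards the ceiling; dist_down:
    reading of sensor 212 towards the floor; sensor_separation: known
    vertical distance between the origins of the two sensors.
    """
    return dist_down + sensor_separation + dist_up
```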
An optional IMU 227 can provide accelerometer data to determine and compensate for the angle of the apparatus with respect to the floor in case the apparatus is not standing perfectly straight, for instance due to an uneven floor surface underneath the base 233 of the apparatus. Furthermore, the SLAM algorithm performed by a computing device 229 while the apparatus is moving can use the data from IMU 227 to improve localisation and pose estimation of the apparatus.
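A minimal sketch of deriving that tilt from a static accelerometer reading (the axis convention, z along the frame's vertical, is an assumption; these are the standard gravity-vector formulas):

```python
import math

def tilt_from_accelerometer(ax: float, ay: float, az: float):
    """Roll and pitch (radians) from a static accelerometer reading.

    Valid while the apparatus stands still, so the accelerometer senses
    only gravity; assumes z is the frame's vertical axis.
    """
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return roll, pitch
```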
Computing device 229 may further comprise means to improve the 2D map or 3D model of the environment by combining SLAM data measured during motion with data from laser-based distance sensor 201 and/or laser-based distance sensor 206 measured during known stationary poses.
A memory module 230 stores data, such as photographic images, distance measurements and calibration parameters.
A communication device 231 with an antenna collects user input from, and transmits data to, a mobile device through a wireless connection. Through this connection, the apparatus can for instance receive an instruction from an operator to start a stationary measurement.
A hand grip 234 allows an operator to easily carry the apparatus around, either to move it to the next stationary measurement location or to take extra measurements of the building while moving.

Claims
1. Apparatus for collecting data for generating a floor plan, point cloud or 3D model of an environment, such as the interior of an office building or cave, comprising a frame that fixates components relative to each other, a wide-angle camera system, preferably with an effective horizontal viewing angle of 360 degrees, and first means for distance measurement, such as a 2D LIDAR, a 3D LIDAR or a structured-light 3D scanner, using light with a wavelength that is nearly invisible to humans, such as near-infrared or ultraviolet, for taking a first set of distance measurements in multiple directions, characterized in that the apparatus further comprises second means for distance measurement, such as a 2D LIDAR, a 3D LIDAR or a structured-light 3D scanner, using light with a wavelength visible to humans, such as red, green or blue, for taking a second set of measurements of one or more distances.
2. The apparatus of claim 1, further comprising means for wireless communication that are capable of transmitting image or measurement data to an external device or a data network.
3. The apparatus of claim 1 or 2, further comprising data storage means capable of storing data such as photographic images, distance measurements or calibration parameters.
4. The apparatus of claim 1, 2 or 3, further comprising first data processing means that are capable of calculating calibration parameters for aligning images from said camera system with said frame on the basis of said second set of distance measurements and the corresponding image locations of the light-reflections of these measurements detectable in images of said camera system.
5. The apparatus of one of claims 1 to 4, further comprising second data processing means that are capable of calculating movements of said frame within a fixed reference coordinate system on the basis of matches of landmarks between photographic images captured during displacement of said apparatus from a first location to a second location, such as an adjacent room or another part of the same room.
6. The apparatus of one of claims 1 to 5, further characterized in that said second means for distance measurement comprise means that are capable of scanning around two axes in order to generate a 3D point cloud.
7. The apparatus of one of claims 1 to 5, further characterized in that said second means for distance measurement comprise means that are capable of scanning around one axis.
8. The apparatus of claim 7, further characterized in that said second means for distance measurement comprise means that are capable of taking one or more distance measurements aimed at one or more locations on the floor around the apparatus, preferably at least three.
9. The apparatus of claim 7, further characterized in that said second means for distance measurement comprise means that are capable of taking one or more distance measurements aimed at one or more locations on the ceiling around the apparatus, preferably at least three.
10. The apparatus of one of claims 1 to 5, further characterized in that said second means for distance measurement comprise a combination of one or more fixed laser distance sensors placed at one or more angles, preferably three sensors, each of which can measure in a direction at an angle of, preferably, 90 degrees from the directions of the other two.
11. The apparatus of claim 10, further characterized in that said second means for distance measurement comprise means that are capable of taking one or more distance measurements directed at one or more locations on the floor around the apparatus, preferably at least three.
12. The apparatus of claim 10, further characterized in that said second means for distance measurement comprise means that are capable of taking one or more distance measurements directed at one or more locations on the ceiling around the apparatus, preferably at least three.
13. The apparatus of one of claims 1 to 12, further characterized in that said first means for distance measurement are capable of taking a third set of distance measurements in multiple directions whereby the measurement directions relative to said frame are periodically repeated during said displacement of said apparatus.
14. The apparatus of claim 13, further characterized in that said second means for distance measurement are capable of taking a fourth set of distance measurements in multiple directions during stationary placement of said apparatus at said second location, and that said first means for distance measurement are capable of completing one cycle of the range of measurement directions for said third set of measurements within a shorter amount of time than said second means for distance measurement can complete one cycle of the range of measurement directions for said fourth set of measurements.
15. The apparatus of claim 13, further characterized in that the apparatus further comprises third means for distance measurement, such as a 2D LIDAR, a 3D LIDAR or a structured-light 3D scanner, that are capable of taking a fourth set of distance measurements in multiple directions during stationary placement of said apparatus at said second location, and further characterized in that said first means for distance measurement are capable of completing one cycle of the range of measurement directions for said third set of measurements within a shorter amount of time than said third means for distance measurement can complete one cycle of the range of measurement directions for said fourth set of measurements.
16. The apparatus of claim 14 or 15, further comprising third data processing means that are capable of calculating relative point locations belonging to measured distances and their known directions within the coordinate system of said frame, the movements of said frame, and the point locations of said relative point locations within a fixed reference coordinate system.
17. Method for collecting data, by an apparatus, for generating a floor plan, point cloud or 3D model of an environment, such as the interior of an office building or cave, comprising fixating components of the apparatus relative to each other, by a frame in the apparatus, capturing photographic images, by a wide-angle camera system in said apparatus, preferably with an effective horizontal viewing angle of 360 degrees, and taking a first set of distance measurements in multiple directions using light with a wavelength that is nearly invisible to humans, such as near-infrared or ultraviolet, by first means for distance measurement in said apparatus, such as a 2D LIDAR, a 3D LIDAR or a structured-light 3D scanner, characterized in that the method further comprises taking a second set of measurements of one or more distances using light with a wavelength visible to humans, such as red, green or blue, by second means for distance measurement in said apparatus, such as a 2D LIDAR, a 3D LIDAR or a structured-light 3D scanner.
18. The method of claim 17, further comprising wirelessly transmitting image or measurement data to an external device or a data network, by communication means in said apparatus.
19. The method of claim 17 or 18, further comprising storing data, such as photographic images, distance measurements or calibration parameters, by storage means in said apparatus.
20. The method of claim 17, 18 or 19, further comprising calculating calibration parameters for aligning images captured by said camera system with said frame on the basis of said second set of distance measurements and the corresponding image locations of the light-reflections of these measurements detectable in images from said camera system, by first data processing means in said apparatus.
21. The method of one of claims 17 to 20, further comprising calculating movements of said frame within a fixed reference coordinate system, by second data processing means in said apparatus, on the basis of matches of landmarks between photographic images captured during displacement of said apparatus from a first location to a second location, such as an adjacent room or another part of the same room.
22. The method of one of claims 17 to 21, further characterized in that taking said second set of distance measurements is achieved by scanning around two axes in order to generate a 3D point cloud.
23. The method of one of claims 17 to 21, further characterized in that taking said second set of distance measurements is achieved by scanning around one axis.
24. The method of claim 23, further characterized in that taking said second set of distance measurements comprises measuring one or more distances to one or more locations on the floor around the apparatus, preferably at least three.
25. The method of claim 23, further characterized in that taking said second set of distance measurements comprises measuring one or more distances to one or more locations on the ceiling around the apparatus, preferably at least three.
26. The method of one of claims 17 to 21, further characterized in that taking said second set of distance measurements is done using one or more fixed laser distance sensors in said apparatus, placed at different angles, preferably three sensors, whereby each is used to measure in a direction at an angle of, preferably, 90 degrees from the directions in which the other two measure.
27. The method of claim 26, further characterized in that taking said second set of distance measurements comprises measuring one or more distances to one or more locations on the floor around said apparatus, preferably at least three.
28. The method of claim 26, further characterized in that taking said second set of distance measurements comprises measuring one or more distances to one or more locations on the ceiling around said apparatus, preferably at least three.
29. The method of one of claims 17 to 28, further comprising taking a third set of distance measurements in multiple directions, by said first means for distance measurement in said apparatus, whereby the measurement directions relative to said frame are periodically repeated during said displacement of said apparatus.
30. The method of claim 29, further comprising taking a fourth set of distance measurements in multiple directions during stationary placement of said apparatus at said second location, by said second means for distance measurement or third means for distance measurement in said apparatus, such as a 2D LIDAR, a 3D LIDAR or a structured-light 3D scanner, whereby one cycle of the range of measurement directions for said third set of measurements is completed within a shorter amount of time than one cycle of the range of measurement directions for said fourth set of measurements.
31. The method of claim 30, further comprising calculating relative point locations belonging to measured distances and their known directions within the coordinate system of said frame, the movements of said frame, and the point locations of said relative point locations within a fixed reference coordinate system, by third data processing means in said apparatus.
Patent Citations (5)

* Cited by examiner, † Cited by third party

Publication number | Priority date | Publication date | Assignee | Title
US20070047940A1 * | 2005-08-30 | 2007-03-01 | Kosei Matsumoto | Image input device and calibration method
US20070139262A1 * | 2005-12-15 | 2007-06-21 | Bruno Scherzinger | Managed traverse system and method to acquire accurate survey data in absence of precise GPS data
US20140267700A1 * | 2005-12-15 | 2014-09-18 | Trimble Navigation Limited | Method and apparatus for image-based positioning
JP2015125002A | 2013-12-25 | 2015-07-06 | 株式会社ズームスケープ | Photographing method for images for measuring use, and image measuring program
US20180180416A1 * | 2016-12-23 | 2018-06-28 | Topcon Positioning Systems, Inc. | Enhanced remote surveying systems and methods
