US11525923B2 - Real-time three-dimensional map building method and device using three-dimensional lidar - Google Patents

Info

Publication number
US11525923B2
Authority
US
United States
Prior art keywords
dimensional
transformation
voxel
eigenvector
eigenvalue
Legal status (assumption, not a legal conclusion)
Active, expires
Application number
US16/565,943
Other versions
US20200088882A1 (en)
Inventor
Yongdeuk SHIN
Current Assignee (listing may be inaccurate)
AMAutonomy Co Ltd
Original Assignee
AMAutonomy Co Ltd
Application filed by AMAutonomy Co Ltd
Assigned to A.M.AUTONOMY CO., LTD. (assignment of assignors interest; see document for details). Assignors: SHIN, YONGDEUK
Publication of US20200088882A1
Application granted
Publication of US11525923B2

Classifications

    • G01S17/894: 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • G06T7/33: Determination of transform parameters for the alignment of images (image registration) using feature-based methods
    • G01S17/89: Lidar systems specially adapted for mapping or imaging
    • G01S7/4817: Constructional features, e.g. arrangements of optical elements, relating to scanning
    • G06T17/05: Three-dimensional modelling; geographic models
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G01S2205/002: Transmission of position information to remote stations for traffic control, mobile tracking, guidance, surveillance or anti-collision
    • G01S7/4804: Auxiliary means for detecting or identifying lidar signals or the like, e.g. laser illuminators
    • G06T15/00: 3D image rendering
    • G06T2207/10028: Range image; depth image; 3D point clouds
    • G06T2207/20221: Image fusion; image merging
    • G06T2207/30252: Vehicle exterior; vicinity of vehicle
    • G06T2210/56: Particle system, point-based geometry or rendering

Abstract

Real-time three-dimensional (3D) map building method and device using a 3D lidar includes representing 3D map data of a surrounding environment acquired by using a 3D lidar attached to a moving object as voxels, acquiring an eigenvalue and an eigenvector for each voxel based on all 3D points in a 3D map represented as the voxels, detecting a 3D corresponding point in the voxel corresponding to all the 3D points of 3D data newly acquired by using the 3D lidar while the moving object travels, calculating a rotation transformation and a translation transformation for minimizing an error by minimizing an inner product value between the eigenvector weighted by the eigenvalue of the voxel to which the 3D corresponding point belongs and a vector generated from a 3D corresponding point, and updating the 3D map data based on the rotation transformation and the translation transformation.

Description

CROSS REFERENCE TO RELATED APPLICATION
This application claims priority to Korean Patent Application No. 10-2018-0109348, filed Sep. 13, 2018, the entire content of which is incorporated herein by reference.
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to real-time three-dimensional map building method and device using a three-dimensional lidar, and more particularly, to a method and a device of building a three-dimensional map in real time by using a three-dimensional lidar while moving.
2. Description of Related Art
Three-dimensional mapping enables a robot, including an autonomous vehicle, to perform a series of operations such as generating a route by recognizing an environment and planning an avoidance operation by detecting an obstacle. In the field of robotics, three-dimensional mapping belongs to the technical field of simultaneous localization and mapping (SLAM).
In the related art, in order to build a three-dimensional map of an environment, a motor was mounted on a sensor such as a two-dimensional laser scanner, and three-dimensional data was acquired after stopping at a place and registered by using an iterative closest points (ICP) algorithm (P. J. Besl and N. D. McKay, A Method for Registration of 3-D Shapes, IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 14, Issue 2, 1992). However, this method has a problem in that a map cannot be built with a three-dimensional lidar attached to a mobile robot, because movement severely distorts the measured data, and the three-dimensional mapping takes too much time.
M. Bosse and R. Zlot, Continuous 3D Scan-Matching with a Spinning 2D Laser, IEEE Int. Conf. on Robotics and Automation, 2009, discloses an algorithm to solve this problem, but since many feature vectors defining pairs of three-dimensional points are used, most of the algorithm execution time is consumed in finding the set of corresponding point pairs, and thus there is a disadvantage in that real-time performance is not guaranteed.
SUMMARY OF THE INVENTION
An object of the present invention for solving the above problems is to provide a real-time three-dimensional map building method using a three-dimensional lidar, capable of building a three-dimensional map in real time even while moving.
Another object of the present invention for solving the above problems is to provide a real-time three-dimensional map building device using a three-dimensional lidar, capable of building a three-dimensional map in real time even while moving.
In order to achieve the objects, the present invention provides real-time three-dimensional map building method and device using a three-dimensional lidar which represent three-dimensional map data of a surrounding environment acquired by using a three-dimensional lidar attached to a moving object as voxels, acquire an eigenvalue and an eigenvector for each voxel based on all three-dimensional points in a three-dimensional map represented as the voxels, detect a three-dimensional corresponding point in the voxel corresponding to all the three-dimensional points of three-dimensional data newly acquired by using the three-dimensional lidar while the moving object travels, calculate a rotation transformation and a translation transformation for minimizing an error by minimizing an inner product value between the eigenvector weighted by the eigenvalue of the voxel to which the three-dimensional corresponding point belongs and a vector generated from a three-dimensional corresponding point, and update the three-dimensional map data based on the rotation transformation and the translation transformation.
Here, the rotation transformation and the translation transformation may be acquired by the equation below.

$$\begin{bmatrix} \tfrac{1}{\lambda_1}\nu_1 & \tfrac{1}{\lambda_2}\nu_2 & \tfrac{1}{\lambda_3}\nu_3 \end{bmatrix}^T \big[(R p_i + t) - q_k\big] = \begin{bmatrix} e_{i1} & e_{i2} & e_{i3} \end{bmatrix}^T, \qquad \lambda_1 \le \lambda_2 \le \lambda_3$$

In the above equation, $\lambda$ is the eigenvalue, $\nu$ is the eigenvector, $R$ is the rotation transformation, $t$ is the translation transformation, $e_i$ is the error in each eigenvector direction, $p_i$ is the three-dimensional point in the newly acquired three-dimensional data, and $q_k$ is the three-dimensional corresponding point in the voxel.
Here, the updating of the three-dimensional map data may include acquiring a new eigenvalue and eigenvector if the three-dimensional point transformed based on the rotation transformation and the translation transformation falls in a new region, and updating the eigenvalue and eigenvector of the voxel if the three-dimensional point belongs to the voxel in the same space as the three-dimensional corresponding point.
Here, the calculating of the rotation transformation and the translation transformation may include determining that the error is minimum if the differences between the updated rotation and translation transformations and those calculated immediately before are less than a predetermined threshold.
The real-time three-dimensional map building method and device using a three-dimensional lidar according to the present invention have an advantage in that an accurate three-dimensional map may be created by estimating a translation and a rotation transformation in real time with a unique continuous motion method, even while the three-dimensional lidar moves. Thus, the present invention may be applied to a dynamic robot, a drone, an autonomous vehicle, and the like, and may be applied both indoors and outdoors.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a flowchart illustrating a process of performing a method of building a three-dimensional map according to an embodiment of the present invention.
FIG. 2 is a view illustrating eigenvectors by representing three-dimensional map data in voxels.
FIG. 3 is a diagram illustrating a change in an expected value when the voxels to which an objective function of the embodiment of the present invention is applied are arranged in a plane.
FIG. 4 is a diagram illustrating a change in the expected value when the voxels to which the objective function of the embodiment of the present invention is applied are arranged in a straight line.
FIG. 5 is a diagram illustrating a change in the expected value when the voxels to which the objective function of the embodiment of the present invention is applied are arranged in a cylinder.
FIG. 6 is a view illustrating a comparison of a change in a value to which the objective function of the embodiment of the present invention is applied and a change in a value to which an ICP objective function of related art is applied.
FIG. 7 is a view illustrating disadvantages of the ICP objective function of the related art.
FIG. 8 is a view illustrating results of creating the three-dimensional map in an internal space using a three-dimensional lidar by comparing an ICP algorithm with the objective function of the present invention.
FIG. 9 is an image illustrating a result of creating a three-dimensional map in an outdoor environment by using the objective function of the present invention.
FIG. 10 is a pseudo code representing an algorithm for building the three-dimensional map according to the embodiment of the present invention.
FIG. 11 is a block diagram illustrating each configuration of the three-dimensional map building device according to the embodiment of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
Since the present invention may be variously changed and have several embodiments, specific embodiments will be illustrated in the drawings and described in detail in the specification. However, this is not intended to limit the present invention to the specific embodiments, and it should be understood that the present invention includes all modifications, equivalents, and substitutes included in the spirit and scope of the present invention. In describing the drawings, similar reference numerals are used for similar configuration elements.
Terms such as first, second, A, and B may be used to describe various configuration elements, but the configuration elements should not be limited by the terms. The terms are used only for the purpose of distinguishing one configuration element from another configuration element. For example, without departing from the scope of the present invention, the first configuration element may be referred to as the second configuration element, and similarly, the second configuration element may also be referred to as the first configuration element. A term and/or includes a combination of a plurality of related items or any item of the plurality of related items.
When one configuration element is referred to as being "connected" or "coupled" to another configuration element, the one configuration element may be directly connected or coupled to the other configuration element, but it should be understood that still another configuration element may exist therebetween. Meanwhile, when one configuration element is referred to as being "directly connected" or "directly coupled" to another configuration element, it should be understood that no other configuration element exists therebetween.
The terminology used in the present application is for the purpose of describing the specific embodiments only and is not intended to limit the present invention. A singular expression includes plural expressions unless the context clearly indicates otherwise. In the present application, it should be understood that the terms “include”, “have”, and the like are intended to designate existence of a feature, a number, a step, an operation, a configuration element, a component, or a combination thereof described in the specification, and a possibility of existence or addition of one or more other features, a number, a step, an operation, a configuration element, a component, or a combination thereof is not excluded in advance.
Unless defined otherwise, all terms used herein, including technical or scientific terms, have the same meaning as commonly understood by those skilled in the art to which the present invention belongs. Terms such as those defined in commonly used dictionaries should be construed as having meanings consistent with their meanings in the context of the related art, and shall not be construed as having ideal or excessively formal meanings unless expressly defined in the present application.
Three-dimensional mapping is becoming an important axis of the Fourth Industrial Revolution, together with accurate three-dimensional mapping for autonomous vehicles, three-dimensional mapping for autonomous flying drones, and virtual reality (VR) services for indoor and outdoor spaces. Currently, the technology is largely divided into two types. One is a mobile mapping system (MMS) that matches three-dimensional lidar (3D LiDAR) data based on position and posture information acquired through the fusion of several sensors such as real-time kinematic (RTK), inertial measurement unit (IMU), and distance measurement indicator (DMI) sensors.
The other is a method of estimating the position and posture by using an algorithm such as scan matching on data acquired from the three-dimensional lidar, while using the RTK and IMU sensor data as auxiliary data. One advantage of the second method is that three-dimensional mapping is possible even in an indoor space without a GPS signal. The present invention relates to a matching method based on the scan matching of the three-dimensional lidar, i.e., the second method.
Meanwhile, a traditional method of creating a map by using the three-dimensional lidar adopts a go-stop motion method. This method has a problem that, if the lidar moves during a three-dimensional scan, the movement distorts the measured data and enlarges the error, so that accuracy decreases; in addition, mapping takes a lot of time because the data is scanned while repeating a process of acquiring a distance image in a stationary state and then moving the sensor to the next position.
In order to solve these problems, the present invention uses a unique continuous motion method, thereby generating an accurate three-dimensional map by estimating a translation and a rotation transformation in real time even while the three-dimensional lidar moves. Therefore, the present invention may be applied to a dynamic robot, a drone, an autonomous vehicle, and the like.
Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings.
FIG. 1 is a flowchart illustrating a process of performing a method of creating a three-dimensional map according to an embodiment of the present invention.
Referring to FIG. 1, the method of creating the three-dimensional map according to the embodiment of the present invention may include representing three-dimensional map data as voxels (S100), acquiring an eigenvalue and an eigenvector based on all three-dimensional points in the voxels (S200), detecting a three-dimensional corresponding point corresponding to a newly acquired three-dimensional point (S300), calculating a rotation transformation and a translation transformation (S400), and updating a map (S500).
Further, referring to FIG. 1 , each step of the method of creating the three-dimensional map according to the embodiment of the present invention may be described as follows.
First, the three-dimensional map data of a surrounding environment acquired by using a three-dimensional lidar (for example, a Velodyne lidar) is represented as voxels (S100), and an eigenvalue λ and an eigenvector ν are calculated based on all three-dimensional points in each voxel (S200).
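As an illustrative aside, steps S100 and S200 might be realized as in the following Python sketch; the voxel size, the function names, and the NumPy-dictionary layout are our assumptions, not the patent's.

```python
# Minimal sketch of S100/S200 (illustrative, not the patent's implementation):
# points are bucketed into an axis-aligned voxel grid, and each voxel stores
# the mean (center) and the eigen-decomposition of the covariance of its points.
import numpy as np
from collections import defaultdict

def voxel_key(p, voxel_size):
    """Integer grid index of the voxel containing point p."""
    return tuple(np.floor(np.asarray(p) / voxel_size).astype(int))

def voxel_stats(pts):
    """Center, eigenvalues, and eigenvectors of one voxel's point set."""
    pts = np.asarray(pts, dtype=float)
    cov = np.cov(pts.T) if len(pts) > 1 else np.eye(3)
    eigvals, eigvecs = np.linalg.eigh(cov)  # ascending: lambda1 <= lambda2 <= lambda3
    return {"points": list(pts), "center": pts.mean(axis=0),
            "eigvals": eigvals, "eigvecs": eigvecs}  # eigvecs[:, j] pairs with eigvals[j]

def build_voxels(points, voxel_size=0.5):
    """S100: bucket points into voxels; S200: eigen data per voxel."""
    buckets = defaultdict(list)
    for p in points:
        buckets[voxel_key(p, voxel_size)].append(p)
    return {k: voxel_stats(v) for k, v in buckets.items()}
```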
FIG. 2 illustrates the eigenvector for each voxel represented from the three-dimensional map data. Elements included in one voxel are represented in the same color and represent the eigenvector at the center of the voxel.
Next, while a moving object (for example, an autonomous vehicle or a drone) travels, a three-dimensional corresponding point in the voxel may be detected for all three-dimensional points of the three-dimensional data newly scanned by the three-dimensional lidar (S300). The Euclidean distance may be used for the corresponding point detection, as in the iterative closest points (ICP) algorithm, although the present embodiment is not limited thereto. Since three-dimensional corresponding point detection is a known method, a detailed description thereof will be omitted.
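A minimal sketch of this corresponding-point search, assuming SciPy's cKDTree as the nearest-neighbor structure (a KD-tree is mentioned later in the text as one option; this particular library is our choice):

```python
# Sketch of S300: nearest-neighbor correspondences by Euclidean distance, as in ICP.
import numpy as np
from scipy.spatial import cKDTree

def find_correspondences(map_centers, new_points):
    """For each new point p_i, return the index of the nearest map center q_k."""
    tree = cKDTree(np.asarray(map_centers))
    _, indices = tree.query(np.asarray(new_points))
    return indices
```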
At this time, the error calculated when there is a corresponding point $q_k$ after a point $p_i$ is transformed by a rotation transformation $R$ and a translation transformation $t$ is represented by Equation 1.

$$\begin{bmatrix} \tfrac{1}{\lambda_1}\nu_1 & \tfrac{1}{\lambda_2}\nu_2 & \tfrac{1}{\lambda_3}\nu_3 \end{bmatrix}^T \big[(R p_i + t) - q_k\big] = \begin{bmatrix} e_{i1} & e_{i2} & e_{i3} \end{bmatrix}^T \qquad \text{(Equation 1)}$$
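A direct transcription of Equation 1 might look as follows; the eps guard against a zero eigenvalue is our implementation detail, not the patent's.

```python
# Sketch of Equation 1: project the residual of a transformed point onto the
# voxel's eigenvectors, weighting each direction by the inverse eigenvalue so
# that low-variance directions (e.g. a plane normal) dominate the error.
import numpy as np

def equation1_error(R, t, p_i, q_k, eigvals, eigvecs, eps=1e-9):
    """Equation 1: e_i = [(1/lambda_j) * nu_j]^T [(R p_i + t) - q_k], j = 1..3."""
    residual = (R @ p_i + t) - q_k
    weighted = eigvecs / (eigvals + eps)  # column j becomes (1/lambda_j) * nu_j
    return weighted.T @ residual          # [e_i1, e_i2, e_i3]
```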
Next, for all three-dimensional points $p_i$ of the newly acquired data, the rotation transformation $R$ and the translation transformation $t$ that minimize the error according to Equation 1 are acquired as follows (S400). That is, the rotation transformation and the translation transformation that minimize the error are acquired by minimizing the inner product between the eigenvector weighted with the eigenvalue of the voxel to which the three-dimensional corresponding point belongs and the vector generated from the three-dimensional corresponding point.
Meanwhile, it is assumed that there are a plane, a straight line, and a cylindrical figure in order to explain Equation 2.
$$E_i = \begin{bmatrix} e_{i1} & e_{i2} & e_{i3} \end{bmatrix}^T, \qquad \min_{R,t} \sum_i E_i^T E_i \qquad \text{(Equation 2)}$$
A corresponding point of the three-dimensional map data closest in Euclidean distance to the point newly acquired by the three-dimensional lidar is found. At this time, the vector $P$ made by the two three-dimensional points $p_i$ and $q_k$ is decomposed by using the component values of the eigenvectors; that is, an error in each eigenvector direction is calculated. When the acquired eigenvalues are referred to as $\lambda_1$, $\lambda_2$, and $\lambda_3$, the relationship between the eigenvalues is as follows.

$$\lambda_1 \le \lambda_2 \le \lambda_3$$
The eigenvectors corresponding to the respective eigenvalues are ν1, ν2, and ν3.
At this time, for example, if the differences between the rotation and translation transformations updated by applying the Levenberg-Marquardt algorithm and those calculated immediately before are less than a predetermined threshold, the error is determined to be minimum, the algorithm ends, and the processing proceeds to the next step (S500).
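A sketch of this minimization step (S400) under assumed parameters: the transformation is encoded as a rotation vector plus a translation (our choice of parameterization), the Equation-1 residuals of all correspondences are stacked, and SciPy's Levenberg-Marquardt solver stands in for the optimizer named above; equation1_error is the sketch given earlier.

```python
# Sketch of S400: minimize Equation 2 with Levenberg-Marquardt ("lm" method).
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def fit_transformation(correspondences, x0=None):
    """correspondences: list of (p_i, q_k, eigvals, eigvecs) tuples."""
    def residuals(x):
        R = Rotation.from_rotvec(x[:3]).as_matrix()  # x = [rotation vector, translation]
        t = x[3:]
        return np.concatenate([equation1_error(R, t, p, q, lam, nu)
                               for p, q, lam, nu in correspondences])
    sol = least_squares(residuals, np.zeros(6) if x0 is None else x0, method="lm")
    return Rotation.from_rotvec(sol.x[:3]).as_matrix(), sol.x[3:]
```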
Next, the three-dimensional map data is updated based on the rotation transformation and the translation transformation calculated above (S500). At this time, if the three-dimensional point transformed based on the rotation transformation and the translation transformation falls in a new region, a new eigenvalue and eigenvector are acquired; if the three-dimensional point belongs to the voxel in the same space as the three-dimensional corresponding point, the eigenvalue and eigenvector of that voxel may be updated.
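Continuing the same sketch, the update step might be written as below; it reuses voxel_key and voxel_stats from the earlier sketch. Recomputing the statistics from stored raw points is a simplification chosen for clarity; an incremental mean/covariance update would avoid storing the points.

```python
# Sketch of S500: seed a new voxel for a new region, else update the voxel's eigen data.
def update_map(voxels, transformed_points, voxel_size=0.5):
    for p in transformed_points:
        key = voxel_key(p, voxel_size)
        if key not in voxels:                 # new region: new eigenvalue/eigenvector
            voxels[key] = voxel_stats([p])
        else:                                 # same space: recompute the voxel's eigen data
            voxels[key] = voxel_stats(voxels[key]["points"] + [p])
    return voxels
```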
FIG. 3 is a diagram illustrating a change in an expected value when a voxel to which an objective function is applied is in a plane, according to the embodiment of the present invention.
Referring to FIG. 3, in the case of a plane, the magnitude of the $\nu_1$ component of the vector $P$ is made the largest, and the contribution of the remaining two direction components is reduced. The graph of FIG. 3 illustrates how the error changes along each eigenvector; the x-axis represents the magnitude of the vector $P$. As the distance increases, the error of the $\nu_1$ component greatly increases, while the magnitudes of the components in the $\nu_2$ and $\nu_3$ directions are hardly affected by the distance value. By doing so, a newly scanned point $p_i$ is disposed so as to lie on the same plane as the corresponding point $q_k$ with the closest Euclidean distance on the plane; that is, the point $p_i$ produces only a small change in the error sum anywhere on the plane.
FIG. 4 is a diagram illustrating a change in an expected value when the voxels to which the objective function is applied are arranged in a straight line, according to the embodiment of the present invention.
If only a point cloud along a straight line exists in any one voxel and its eigenvalues are acquired, two eigenvalues may be small and the remaining eigenvalue may be relatively large. Therefore, the eigenvector-direction error values of the two corresponding vectors may have the shape illustrated in the graph of FIG. 4, and since newly scanned data has to lie on or near the straight line within the voxel, the error value is maintained at the smallest value there.
FIG. 5 is a diagram illustrating a change in an expected value when voxels to which an objective function is applied are arranged in a cylinder, according to the embodiment of the present invention.
If the eigenvalues of any one voxel are similar, a cylinder may be assumed. In this case, each eigenvector-direction error may be affected to some extent by all direction components, as in the plane of FIG. 3 and the straight line of FIG. 4. Basically, however, since the Euclidean distances are matched along the component with the smallest eigenvalue, the newly scanned data will converge to a certain point on the surface of the cylinder. The sum of errors is smallest near the point where the cylinder surface intersects the plane spanned by the two eigenvectors with the smaller eigenvalues, rather than along the height direction of the cylinder, which corresponds to the largest eigenvalue.
FIG. 6 is a view illustrating a comparison of a change in a value to which an objective function of the embodiment of the present invention is applied and a change in a value to which the ICP objective function of the related art is applied.
FIG. 6 illustrates test results acquired by applying the objective function of the present invention and the ICP objective function for matching by using two consecutively acquired three-dimensional lidar data.
In the test, the true rotation transformation and translation transformation were determined by changing the values little by little and visually confirming the alignment. The graph of (a) of FIG. 6 illustrates the changing value of the objective function of the present invention, and the graph of (b) of FIG. 6 illustrates the changing value of the objective function of the iterative closest points (ICP) algorithm. In the graphs of FIG. 6, the horizontal x-axis represents the number of iterations. Referring to the graphs, the closer the estimate gets to the visually confirmed true value, the larger the value of the ICP objective function becomes, while the value of the objective function of the present invention decreases rapidly. Accordingly, it can be confirmed that the values converge consistently.
A main reason for this tendency in the ICP results is that, in three-dimensional map data with a low density, ICP reduces the Euclidean distance between two different data sets by pulling points across empty space.
FIG. 7 is a view illustrating disadvantages of the ICP objective function of the related art, and illustrates the case described above.
In FIG. 7 , data represented in red is first acquired data, and data represented in black is second acquired data. Meanwhile, (b) of FIG. 7 is an enlarged view of a solid line rectangular portion 700 in (a) of FIG. 7 . It can be seen that transformation in a direction of arrows occurs to reduce the Euclidean distance between the first and second acquired data, and that an error in a direction of yellow arrows increases as the number of iterations of the ICP increases.
FIG. 8 is a view illustrating results of creating the three-dimensional map in an internal space using the three-dimensional lidar by comparing the ICP algorithm ((a) of FIG. 8 ) with the objective function ((b) of FIG. 8 ) of the present invention.
Referring to FIG. 8 , it can be seen that the three-dimensional mapping result ((b) of FIG. 8 ) acquired by the objective function of the present invention is more accurate than the result ((a) of FIG. 8 ) acquired by the ICP.
FIG. 9 shows test images illustrating results of creating the three-dimensional map in an outdoor environment by using the objective function of the present invention; (a) of FIG. 9 is a result image viewed from above, and (b) of FIG. 9 is a result image viewed from the side. These are the results of acquiring three-dimensional data for a wider space by carrying a VLP-16 lidar by hand and rotating it up and down.
Hereinafter, an algorithm for creating the three-dimensional map according to the embodiment of the present invention will be described in more detail.
As described above, a plurality of voxels $V_a$, $V_b$, and $V_c$ are used to perform the algorithm for creating the three-dimensional map according to the embodiment of the present invention. The subscript of the voxel $V$ represents the size of the voxel, with the relation $a > b > c$. The center $(c_x, c_y, c_z)$ of a voxel represents the average of all three-dimensional points included in that voxel.
Further, the upper voxels $V_a$ and $V_b$ are made by using the centers of the voxels $V_c$. The top voxel $V_a$ is made for real-time performance: when the three-dimensional map is created, the amount of data stored in computer memory increases considerably with the travel distance, so real-time performance may not otherwise be guaranteed. Therefore, the data required to perform the current algorithm is selected through the voxel $V_b$ extracted from the upper voxel $V_a$. The three-dimensional map data $M_{t-1}$ includes all three-dimensional data generated up to the previous time $t-1$, and $V_b$ is generated from a subset $m_{t-1} \subset M_{t-1}$. The elements included in the voxel $V_b$ are the center, the eigenvalues, and the eigenvectors of the voxel, represented as follows.
$$V_b = \begin{bmatrix} c_x & c_y & c_z & \lambda_1 & \lambda_2 & \lambda_3 & \nu_1^T & \nu_2^T & \nu_3^T \end{bmatrix}^T$$
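For illustration only, a hypothetical helper (ours, not the patent's) showing how one $V_b$ element packs into the 15-dimensional vector above:

```python
# Illustrative packing of one V_b element: center (3) + eigenvalues (3)
# + the three transposed eigenvectors (9) = a 15-dimensional column vector.
import numpy as np

def pack_vb(center, eigvals, eigvecs):
    """eigvecs[:, j] is the eigenvector paired with eigvals[j]."""
    return np.concatenate([center, eigvals,
                           eigvecs[:, 0], eigvecs[:, 1], eigvecs[:, 2]])
```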
An element $q$ of the voxel map $m_{t-1}$ is a 3×1 vector, $q \in m_{t-1}$, whose elements are the center of a voxel; likewise, $p \in Z_t$ is a 3×1 vector whose elements are the center of a voxel made from the raw data acquired from the sensor. Here, $\bar{Z}_t$ is the set of voxels $V_c$ acquired by voxelizing the raw data acquired from the sensor at time $t$; $\bar{m}_{t-1}$ is the set including all the voxel elements, namely the center value, eigenvalues, and eigenvectors, at time $t-1$; and $m_{t-1}$ is the set of the 3×1 vectors having only the center values among these.
Next, in order to acquire the correspondence relation between the two sets $m_{t-1}$ and $Z_t$, the correspondence-finding method of the ICP may be used. To find a more accurate correspondence relation, the centers, eigenvalues, and eigenvectors of the elements of the two sets could all be compared; however, in order to ensure real-time performance, only the centers are compared, and it is assumed that the two three-dimensional points $p_i$ and $q_k$ with the smallest Euclidean distance between them correspond to each other. At this time, Equation 1 above is calculated by using the eigenvalue and eigenvector of the voxel containing the two three-dimensional points $p_i$ and $q_k$. A KD-tree may be used to find the correspondence relation quickly.
Now, the rotation transformation and the translation transformation that have to be calculated to build the three-dimensional map may be applied to the set $Z_t$ to finally acquire the updated maps $m_t$ and $M_t$ from the three-dimensional map $m_{t-1}$. To this end, Equation 2 may be evaluated for all data for which the correspondence relation is found. Since Equation 2 cannot be solved directly, the Levenberg-Marquardt algorithm may be applied among optimization methods.
FIG. 10 is a pseudo code representing an algorithm for creating the three-dimensional map according to the embodiment of the present invention.
Referring to FIG. 10, a subset m̄t-1 of the three-dimensional map Mt-1 created so far, a rigid body transformation Tt-1, and a set Zt of three-dimensional points that store the centers of the voxels Vc built from data newly acquired from the three-dimensional lidar sensor are given as input data. Since the three-dimensional map m̄t-1 is given, the center, eigenvalues, and eigenvectors of each voxel are already calculated. The results of the algorithm are the rigid body transformation Tt at the current time and a newly updated three-dimensional map m̄t containing all elements of the voxels. There are two conditions for ending the algorithm: one is that a convergence condition is satisfied, and the other is that the given number of iterations has been fully performed. The convergence condition of the algorithm is assumed to be satisfied if the difference between the rigid body transformation updated by Levenberg-Marquardt and the rigid body transformation calculated immediately before is less than a set threshold.
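The loop structure of FIG. 10 may then be sketched as follows, reusing the illustrative helpers above; the convergence test on the pose parameter vector stands in for the threshold on the rigid body transformation described in the text:

```python
# Illustrative sketch of the outer loop: alternate correspondence search
# and Levenberg-Marquardt updates until convergence or until the given
# number of iterations is spent.
import numpy as np
from scipy.spatial.transform import Rotation

def apply_pose(x, pts):
    """Apply the 6-DoF pose x = [rx, ry, rz, tx, ty, tz] to (N, 3) points."""
    R = Rotation.from_euler('xyz', x[:3]).as_matrix()
    return pts @ R.T + x[3:]

def register_scan(map_centers, map_eigvals, map_eigvecs, scan_centers,
                  x0=np.zeros(6), max_iter=30, eps=1e-6):
    x = x0.copy()                             # warm start from T_{t-1}
    for _ in range(max_iter):
        idx, _ = find_correspondences(map_centers, apply_pose(x, scan_centers))
        x_new = estimate_transform(scan_centers, map_centers[idx],
                                   map_eigvals[idx], map_eigvecs[idx], x0=x)
        if np.linalg.norm(x_new - x) < eps:   # convergence condition met
            return x_new
        x = x_new
    return x                                  # iteration budget exhausted
```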
Hereinafter, a three-dimensional map building device according to another embodiment of the present invention will be described.
FIG. 11 is a block diagram illustrating each configuration of the three-dimensional map building device according to another embodiment of the present invention.
Referring to FIG. 11 , the three-dimensional map building device 100 according to another embodiment of the present invention operates to generate three-dimensional map data 20 based on three-dimensional data scanned by a three-dimensional lidar 10, and is configured to include a voxel representation unit 110, an eigenvalue and eigenvector acquisition unit 120, a corresponding point detection unit 130, a transformation calculation unit 140, and a map updating unit 150.
Further, each configuration of the three-dimensional map building device 100 according to another embodiment of the present invention may be described with reference to FIG. 10 as follows.
The voxel representation unit 110 may represent the three-dimensional map data 20 for a surrounding environment acquired by using the three-dimensional lidar 10 as voxels.
The eigenvalue and eigenvector acquisition unit 120 may calculate an eigenvalue λ and an eigenvector ν of each voxel based on all three-dimensional points in each voxel.
The corresponding point detection unit 130 may detect the three-dimensional corresponding points in the voxels for all the three-dimensional points of the data newly acquired by scanning with the three-dimensional lidar 10 while the moving object travels. At this time, the Euclidean distance may be used for corresponding point detection, as in the iterative closest point (ICP) algorithm.
The transformation calculation unit 140 may acquire the rotation transformation and the translation transformation for all the three-dimensional points of the newly acquired data according to Equation 1. That is, the rotation transformation and the translation transformation that minimize the error are acquired by minimizing the inner product value between the eigenvector weighted with the eigenvalue of the voxel to which the three-dimensional corresponding point belongs and the vector generated from the three-dimensional corresponding point.
At this time, for example, if the differences between the rotation and translation transformations updated by applying the Levenberg-Marquardt algorithm and those calculated immediately before are less than a predetermined threshold, the error is determined to be minimal and the algorithm may end.
The map updating unit 150 updates the three-dimensional map data based on the rotation transformation and the translation transformation calculated by the transformation calculation unit 140. At this time, if the three-dimensional point transformed based on the rotation transformation and the translation transformation falls in a new region, a new eigenvalue and eigenvector are acquired, and if the three-dimensional point belongs to a voxel in the same space as the three-dimensional corresponding point, the eigenvalue and eigenvector of that voxel may be updated.
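A rough sketch of this update rule is given below, under the assumption that the map is kept as a dictionary from integer voxel indices to point arrays; make_voxel_element from the earlier sketch could then refresh the eigen statistics of each touched voxel:

```python
# Illustrative sketch: insert transformed points into the voxel map,
# starting new voxels for new regions and merging into existing ones.
import numpy as np

def update_map(voxel_points, new_points, voxel_size):
    """voxel_points: dict mapping integer voxel index -> (N, 3) points
    already in the map; new_points: (M, 3) points after applying R, t."""
    for pt in new_points:
        key = tuple(np.floor(pt / voxel_size).astype(int))
        if key in voxel_points:                       # same space: merge points
            voxel_points[key] = np.vstack([voxel_points[key], pt])
        else:                                         # new region: new voxel
            voxel_points[key] = pt[None, :]
    # Eigen statistics of each touched voxel can then be refreshed,
    # e.g. with make_voxel_element(...) from the earlier sketch.
    return voxel_points
```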
Although the above description is made with reference to preferred embodiments of the present invention, it will be understood that those skilled in the art may variously modify and change the present invention without departing from the spirit and scope of the present invention described in the claims below.

Claims (6)

What is claimed is:
1. A real-time three-dimensional map building method using a three-dimensional lidar, comprising:
representing three-dimensional map data of a surrounding environment acquired by using a three-dimensional lidar attached to a moving object as voxels;
acquiring an eigenvalue and an eigenvector for each voxel based on all three-dimensional points in a three-dimensional map represented as the voxels;
detecting three-dimensional corresponding points in the voxels corresponding to all the three-dimensional points newly acquired by using the three-dimensional lidar while the moving object travels;
calculating a rotation transformation and a translation transformation for minimizing an error by minimizing an inner product value between the eigenvector weighted by the eigenvalue of the voxel to which the three-dimensional corresponding point belongs and a vector generated from a three-dimensional corresponding point; and
updating the three-dimensional map data based on the rotation transformation and the translation transformation,
wherein the rotation transformation and the translation transformation are acquired by an equation below,
$$\left[\frac{1}{\lambda_1}\nu_1 \;\; \frac{1}{\lambda_2}\nu_2 \;\; \frac{1}{\lambda_3}\nu_3\right]^T \left[(Rp_i + t) - q_k\right] = \left[e_{i1} \;\; e_{i2} \;\; e_{i3}\right]^T, \qquad \lambda_1 > \lambda_2 > \lambda_3$$
in the above equation, λ is the eigenvalue, ν is the eigenvector, R is the rotation transformation, t is the translation transformation, ei is the error in each eigenvector direction, pi is the three-dimensional point in the newly acquired three-dimensional data, qk is the three-dimensional corresponding point in the voxel, and T denotes the transpose of a matrix.
2. The real-time three-dimensional map building method of claim 1, wherein the updating of the three-dimensional map data includes acquiring a new eigenvalue and eigenvector if the three-dimensional point transformed based on the rotation transformation and the translation transformation is in a new region, and updating the eigenvalue and eigenvector of the voxel if the three-dimensional point belongs to the voxel in the same space as the three-dimensional corresponding point.
3. The real-time three-dimensional map building method of claim 1, wherein the calculating of the rotation transformation and the translation transformation includes determining that the error is minimum if differences between the rotation transformation and translation transformation which are updated and the rotation transformation and translation transformation which are calculated immediately before are less than a predetermined threshold.
4. A real-time three-dimensional map building device using a three-dimensional lidar, comprising:
a voxel representation unit that represents three-dimensional map data of a surrounding environment acquired by using a three-dimensional lidar attached to a moving object as voxels;
an eigenvalue and eigenvector acquisition unit that acquires an eigenvalue and an eigenvector for each voxel based on all three-dimensional points in a three-dimensional map represented as the voxels;
a corresponding point detection unit that detects a three-dimensional corresponding point in the voxel corresponding to all the three-dimensional points of three-dimensional data newly acquired by using the three-dimensional lidar while the moving object travels;
a transformation calculation unit that calculates a rotation transformation and a translation transformation for minimizing an error by minimizing an inner product value between the eigenvector weighted by the eigenvalue of the voxel to which the three-dimensional corresponding point belongs and a vector generated from a three-dimensional corresponding point; and
a map updating unit that updates the three-dimensional map data based on the rotation transformation and the translation transformation,
wherein the rotation transformation and the translation transformation are acquired by an equation below,
$$\left[\frac{1}{\lambda_1}\nu_1 \;\; \frac{1}{\lambda_2}\nu_2 \;\; \frac{1}{\lambda_3}\nu_3\right]^T \left[(Rp_i + t) - q_k\right] = \left[e_{i1} \;\; e_{i2} \;\; e_{i3}\right]^T, \qquad \lambda_1 > \lambda_2 > \lambda_3$$
in the above equation, λ is the eigenvalue, ν is the eigenvector, R is the rotation transformation, t is the translation transformation, ei is the error in each eigenvector direction, pi is the three-dimensional point in the newly acquired three-dimensional data, qk is the three-dimensional corresponding point in the voxel, and T denotes the transpose of a matrix.
5. The real-time three-dimensional map building device of claim 4, wherein the map updating unit acquires a new eigenvalue and eigenvector if the three-dimensional point transformed based on the rotation transformation and the translation transformation is in a new region, and updates the eigenvalue and eigenvector of the voxel if the three-dimensional point belongs to the voxel in the same space as the three-dimensional corresponding point.
6. The real-time three-dimensional map building device of claim 4, wherein the transformation calculation unit determines that the error is minimum if differences between the rotation transformation and translation transformation which are updated and the rotation transformation and translation transformation which are calculated immediately before are less than a predetermined threshold.
US16/565,943 2018-09-13 2019-09-10 Real-time three-dimensional map building method and device using three-dimensional lidar Active 2041-06-26 US11525923B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020180109348A KR101925862B1 (en) 2018-09-13 2018-09-13 Real-time 3d mapping using 3d-lidar
KR10-2018-0109348 2018-09-13

Publications (2)

Publication Number Publication Date
US20200088882A1 US20200088882A1 (en) 2020-03-19
US11525923B2 true US11525923B2 (en) 2022-12-13

Family

ID=64671403

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/565,943 Active 2041-06-26 US11525923B2 (en) 2018-09-13 2019-09-10 Real-time three-dimensional map building method and device using three-dimensional lidar

Country Status (2)

Country Link
US (1) US11525923B2 (en)
KR (1) KR101925862B1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220063659A1 (en) * 2019-07-09 2022-03-03 Refraction Ai, Inc. Method and system for autonomous vehicle control

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11835342B2 (en) * 2019-06-27 2023-12-05 Nvidia Corporation Calibration of inertial measurement units of vehicles using localization
WO2021132811A1 (en) * 2019-12-27 2021-07-01 주식회사 모빌테크 Method for correcting gps error by comparing high-definition 3d maps in overlapping region
KR102195527B1 (en) * 2020-08-06 2020-12-28 이성열 Intelligent smart crosswalk control method and system
KR102371852B1 (en) * 2020-10-16 2022-03-08 (주)컨트롤웍스 Method and Apparatus for Updating Map Based on Point Cloud Information
CN112668653A (en) * 2020-12-30 2021-04-16 北京百度网讯科技有限公司 Loop detection method, device, equipment and medium based on laser radar map
CN113393579B (en) * 2021-08-17 2021-11-12 天津云圣智能科技有限责任公司 Multi-machine cooperative scanning method and device and electronic equipment
KR20230102363A (en) 2021-12-30 2023-07-07 서울대학교산학협력단 Odometry method and apparatus robust to high-speed motion using spinning lidar
KR102524995B1 (en) * 2023-02-02 2023-04-25 국방과학연구소 Map generating method of electronic apparatus
CN116538996B (en) * 2023-07-04 2023-09-29 云南超图地理信息有限公司 Laser radar-based topographic mapping system and method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101551413B1 2014-05-13 2015-09-08 Kumoh National Institute of Technology Industry-Academic Cooperation Foundation A web-based EOC monitoring method of slopes

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030137646A1 (en) * 2002-01-23 2003-07-24 Quantapoint, Inc. Method and apparatus for generating structural data from laser reflectance images
US20070024611A1 (en) * 2005-07-27 2007-02-01 General Electric Company System and method for aligning three-dimensional models
KR101715780B1 (en) 2010-10-11 2017-03-13 삼성전자주식회사 Voxel Map generator And Method Thereof
US20180075643A1 (en) * 2015-04-10 2018-03-15 The European Atomic Energy Community (Euratom), Represented By The European Commission Method and device for real-time mapping and localization
US20190346271A1 (en) * 2016-03-11 2019-11-14 Kaarta, Inc. Laser scanner with real-time, online ego-motion estimation
US20200217666A1 (en) * 2016-03-11 2020-07-09 Kaarta, Inc. Aligning measured signal data with slam localization data and uses thereof
US20190049566A1 (en) * 2017-08-11 2019-02-14 Zoox, Inc. Vehicle sensor calibration and localization
US11274929B1 (en) * 2017-10-17 2022-03-15 AI Incorporated Method for constructing a map while performing work

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Bosse, Michael, and Robert Zlot. "Continuous 3D scan-matching with a spinning 2D laser." 2009 IEEE International Conference on Robotics and Automation. IEEE, 2009. (Year: 2009). *
Nüchter, Andreas. 3D robotic mapping: the simultaneous localization and mapping problem with six degrees of freedom. vol. 52. Springer, 2008. (Year: 2008). *
Besl, P. J., and N. D. McKay. "A method for registration of 3-D shapes." IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 14, no. 2, 1992.
Shin, Yong-Deuk, et al. "A study on reliability enhancement for laser and camera calibration." International Journal of Control, Automation and Systems 10.1 (2012): 109-116. (Year: 2012). *

Also Published As

Publication number Publication date
US20200088882A1 (en) 2020-03-19
KR101925862B1 (en) 2018-12-06

Similar Documents

Publication Publication Date Title
US11525923B2 (en) Real-time three-dimensional map building method and device using three-dimensional lidar
US10782137B2 (en) Methods, apparatus, and systems for localization and mapping
EP3919863A1 (en) Vslam method, controller, and mobile device
Minguez et al. Metric-based scan matching algorithms for mobile robot displacement estimation
Nieto et al. Recursive scan-matching SLAM
US8792726B2 (en) Geometric feature extracting device, geometric feature extracting method, storage medium, three-dimensional measurement apparatus, and object recognition apparatus
US20110205338A1 (en) Apparatus for estimating position of mobile robot and method thereof
US8824775B2 (en) Robot and control method thereof
JP5627325B2 (en) Position / orientation measuring apparatus, position / orientation measuring method, and program
KR20130022994A (en) Method for recognizing the self position of a mobile robot unit using arbitrary ceiling features on the ceiling image/feature map
KR102075844B1 (en) Localization system merging results of multi-modal sensor based positioning and method thereof
Hinduja et al. Degeneracy-aware factors with applications to underwater slam
Hochdorfer et al. 6 DoF SLAM using a ToF camera: The challenge of a continuously growing number of landmarks
CN112101160A (en) Binocular semantic SLAM method oriented to automatic driving scene
Bavle et al. Stereo visual odometry and semantics based localization of aerial robots in indoor environments
JP2010112836A (en) Self-position identification device and mobile robot provided with same
Lee et al. A real-time 3D workspace modeling with stereo camera
Ye et al. Model-based offline vehicle tracking in automotive applications using a precise 3D model
KR20180066668A (en) Apparatus and method constructing driving environment of unmanned vehicle
Leishman et al. Robust Motion Estimation with RBG-D Cameras
Cupec et al. Global localization based on 3d planar surface segments
Qiao et al. Online monocular lane mapping using catmull-rom spline
CN113554705A (en) Robust positioning method for laser radar in changing scene
Song et al. Scale Estimation with Dual Quadrics for Monocular Object SLAM
Thapa et al. A review on visual odometry techniques for mobile robots: Types and challenges

Legal Events

Date Code Title Description
AS Assignment

Owner name: A.M.AUTONOMY CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SHIN, YONGDEUK;REEL/FRAME:050327/0102

Effective date: 20190905

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE