EP3729225A1 - Interactive computer-implemented method, graphical user interface and computer program product for building a high-accuracy environment map - Google Patents

Interactive computer-implemented method, graphical user interface and computer program product for building a high-accuracy environment map

Info

Publication number
EP3729225A1
Authority
EP
European Patent Office
Prior art keywords
environment map
corrections
pose
user
user interface
Prior art date
Legal status
Withdrawn
Application number
EP17835936.0A
Other languages
German (de)
French (fr)
Inventor
Juraj ORSULIC
Damjan MIKLIC
Zdenko KOVACIC
Current Assignee
Zagreb Faculty Of Electrical Engineering And Computing, University of
Original Assignee
Zagreb Faculty Of Electrical Engineering And Computing, University of
Application filed by Zagreb Faculty Of Electrical Engineering And Computing, University of
Publication of EP3729225A1


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G01C21/206 Instruments for performing navigational calculations specially adapted for indoor navigation
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/40 Correcting position, velocity or attitude
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0044 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with a computer generated representation of the environment of the vehicle, e.g. virtual reality, maps
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning

Definitions

  • the present invention generally relates to an interactive computer-implemented method, a graphical user interface and a computer program product for building a high-accuracy environment map.
  • Such high-accuracy environment maps may be used for autonomous vehicle navigation, especially in GPS-denied indoor environments.
  • mapping inaccuracies that accumulate during the mapping procedure can be reduced when an area is re-visited, which is called loop closure.
  • significant inconsistencies can still occur in the built map.
  • the accuracy of the used mapping procedure is crucial for precise positioning of an autonomous mobile apparatus using the previously built map for localization. For example, autonomous forklifts (loading and unloading pallets in automated warehouses) typically require a positioning precision better than 1 cm and 0.5°.
  • the mapping procedure can sometimes produce defects such as fracturing or blurring of object contours, or superfluous multiple appearances of objects caused by incorrectly registering the same objects several times, at different locations. These defects cannot be corrected even by using different local warpings.
  • the present invention allows for storage and retrieval of all sensor measurements data and user corrections necessary to fully reconstruct the complete environment map building process, i.e. the SLAM process.
  • This enables long-term iterative map adjustments which can be performed occasionally or periodically and thus account for persistent changes in the environment.
  • a computer-implemented method according to the present invention allows for iterative map adjustments which can be performed by applying user corrections at any arbitrary point on a mobile device trajectory.
  • subsequent corrections do not need to be ordered chronologically, i.e., when applying corrections, the user can freely rewind forwards and backwards “in time” along the trajectory, employing a graphical user interface according to an embodiment of the present invention.
  • a computer-implemented method according to the present invention presents in detail a nontrivial method of applying user pose corrections with respect to the whole trajectory for modern SLAM implementations (for both graph and non-graph-based SLAM implementations). Furthermore, a computer-implemented method according to the present invention is not restricted to one independent run of only one mobile device, or even to any particular kind of a single autonomously moving device or apparatus.
  • the invention described herein provides specific advantages over prior art. Appropriate map alignment and correction of defects are achieved by iteratively inputting corrections during the SLAM process. This may also be facilitated by automatically aligning the rangefinder sensor measurements, e.g. from a laser scanner, with, for example, a reference CAD during the SLAM process, although the user may perform further tuning in case the automatic alignment with the reference CAD produces unsatisfactory results.
  • An object of the present invention is to provide an interactive computer-implemented method and an intuitive, easy-to-use graphical user interface enabling users without expert knowledge about SLAM to tune the final output, i.e. the built environment map and trajectories, by intervention into the SLAM process, i.e. the execution of the SLAM algorithm in the used SLAM implementation.
  • Another object of the present invention is to provide an interactive computer-implemented method by which the environment map building process i.e. the SLAM process is not restricted to data collected from a single mobile device. Sensor measurements collected from several mobile devices, or from several independent runs of a single mobile device, or any combination thereof can be fused seamlessly into a single environment map, and the multiple built trajectories may be inter-constrained together in any way and at any point the user desires. This is achieved in the preferred embodiment by employing and extending an existing graph SLAM implementation capable of handling multiple trajectories (i.e. tracking multiple mobile devices simultaneously).
  • the present method enables the use of a reference CAD, if available, as prior information in the SLAM process. Further, the present method allows for assisted inputting of user corrections by means of scan matching, making initial inputting of user corrections fast and accurate. User corrections are integrated directly into the environment map building process, allowing mapping accuracy to be increased at various deliberately selected points in the environment to a level where the built environment map is free of visible defects. The final result is a high-accuracy map of the environment, adequate for localization in industrial scenarios, with an accuracy better than 1 cm and 0.5°.
  • a task of building an environment map comprises collecting sensor measurements while a mobile device is moving throughout the environment, and processing the collected sensor measurements in order to obtain the final result, the built environment map, as well as one or more mobile device trajectories.
  • the operator employs at least one mobile device equipped with the following sensors: one or more rangefinder sensors (e.g., a laser scanner - LIDAR), and optionally one or more motion sensing devices (e.g. an odometry sensor such as wheel encoders, or an inertial measurement unit - IMU).
  • a mobile device is a mobile robot, equipped with one or more rangefinder sensors, and optionally one or more motion sensing devices and a data processing unit.
  • the mobile robot may be steered through the environment by an operator or driven autonomously.
  • a mobile device is a portable device such as a backpack, carried by the user, equipped with sensors such as one or more rangefinder sensors, one or more motion sensing devices and a data processing unit.
  • the map building software takes the sensor measurements collected by at least one mobile device and performs the SLAM process (Simultaneous Localization and Mapping), the output of which is a built map of the environment (usually in the form of an occupancy grid, i.e. an array of numbers representing the probability of the respective discrete map element (cell) being an obstacle or free space), and one or more built trajectories.
  • Each trajectory is a set of consecutive timestamped mobile device poses (2D or 3D).
  • a 2D pose consists of position, commonly expressed using planar coordinates (x, y), and orientation, expressed as a yaw angle θ.
  • a 3D pose consists of position, commonly expressed using 3D coordinates (x, y, z), and orientation, commonly expressed using Euler angles, a rotation matrix, or quaternions.
  • Another common pose representation (both for 2D and 3D) is a homogeneous transformation matrix, which includes both position and orientation.
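  • As an illustration of these pose representations (a minimal sketch, not part of the claimed method; function names are illustrative), a 2D pose (x, y, θ) can be converted to and from a 3x3 homogeneous transformation matrix, and poses can be composed by matrix multiplication:

```python
import numpy as np

def pose_to_matrix(x, y, theta):
    """Convert a 2D pose (x, y, theta) into a 3x3 homogeneous transformation matrix."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0.0, 0.0, 1.0]])

def matrix_to_pose(T):
    """Recover (x, y, theta) from a 3x3 homogeneous transformation matrix."""
    return T[0, 2], T[1, 2], np.arctan2(T[1, 0], T[0, 0])

# Pose composition is matrix multiplication, e.g. a node pose in the map frame
# followed by a sensor offset expressed in the node frame:
T_map_node = pose_to_matrix(2.0, 1.0, np.pi / 4)
T_node_sensor = pose_to_matrix(0.3, 0.0, 0.0)
T_map_sensor = T_map_node @ T_node_sensor
print(matrix_to_pose(T_map_sensor))
```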
  • a computer-implemented method according to the present invention is envisioned to be used with an arbitrary SLAM implementation.
  • the proposed interactive computer-implemented method and a graphical user interface enable user intervention into SLAM process by allowing the user to input corrections, for the purpose of obtaining a highly accurate, consistent and defect-free map of the environment.
  • the reference CAD may be used to guide the SLAM process, and the resulting environment map will be aligned thereto.
  • a graphical user interface according to the present invention allows the user to observe the progress of the SLAM process, to rewind forwards and backwards in time along built trajectories, to observe and assess collected sensor measurements, and to iteratively input corrections and view the effects thereof.
  • a graphical user interface according to the present invention thus enables the user to gain a high level of insight into the SLAM process, i.e. the SLAM algorithm execution, aiding them in diagnosing sensor and SLAM configuration/tuning issues.
  • Instructions for performing the aforementioned method and for providing the graphical user interface may be included in a computer program product configured for execution by a computing device 104; when executed by a computing device having a screen displaying the graphical user interface for interactively building a high-accuracy environment map, the computer program product causes the computing device to perform the computer-implemented method in accordance with the present invention.
  • Figure 1 shows an environment map illustrating an example of failure of a SLAM implementation to perform loop closure successfully and thus to eliminate accumulated errors in the environment map building process.
  • Figure 2 shows one example of an environment map defect such as fracturing and blurring of object contours in an environment map building process.
  • Figure 3 shows another example of an environment map defect such as superfluous double appearance i.e. ghosting of objects caused by incorrectly registering the same objects two times, at different locations, in an environment map building process.
  • Figure 4 shows yet another example of an environment map defect such as misalignment with a reference CAD in an environment map building process.
  • Figure 5 schematically illustrates an example of a mobile device system for implementing an interactive environment map building method, in accordance with an embodiment of the present invention.
  • Figure 6 shows an interactive environment map building workflow, in accordance with an embodiment of the present invention.
  • Figure 7 shows a software flow diagram illustrating a main data processing loop in a SLAM implementation, as well as the extensions which enable an interactive environment map building method in accordance with an embodiment of the present invention.
  • Figure 8 shows an example of an environment map free of visual defects, built using an interactive computer-implemented environment map building method, in accordance with an embodiment of the present invention.
  • Figure 9 schematically illustrates a graphical user interface enabling implementation of an interactive computer-implemented environment map building method, in accordance with an embodiment of the present invention.
  • Figure 10 illustrates a graphical user interface enabling the user to observe an environment map being built, to inspect and gain a high level of insight into the SLAM process, and to input corrections, in accordance with an embodiment of the present invention.
  • Figure 11 shows an enlarged view of a part of a main map window illustrating a section of a built environment map, a trajectory, a translation and a rotation marker, and a rewind point cloud, in accordance with an embodiment of the present invention.
  • Figure 12 shows an enlarged view of a part of a main map window from Figure 11 illustrating a pose-corrected rewind point cloud during inputting of a user pose correction, in accordance with an embodiment of the present invention.
  • Figure 13 illustrates applied user pose corrections, and a rebuilt trajectory honoring the inputted pose corrections, in accordance with an embodiment of the present invention.
  • the expression “at least one mobile device”, as used in the present patent application and patent claims, shall include independent runs of several mobile devices, or several independent runs of a single mobile device, or any combination thereof.
  • user correction shall include any kind of user intervention into the SLAM process, i.e. execution of the SLAM algorithm.
  • a user correction is typically given at a certain time in the sensor measurement dataset or at a certain trajectory node, and may include modifying any parameter of the SLAM implementation or part of the SLAM process state.
  • the effect of applying a user pose correction in the SLAM process is to force the pose of the desired trajectory node in the rebuilt trajectory to the one mandated by the pose correction.
  • linear SLAM, as used in the present patent application and patent claims, is related to the following: a SLAM implementation is referred to as linear if a user correction has to be applied exactly at the moment when the corresponding sensor measurement is being processed in the data processing loop, and it is not feasible to apply the correction later.
  • computing device may include, for example, one or more of: a desktop computing device, a laptop computing device, a tablet computing device, a computer, a computing device of a vehicle of the user, or a wearable apparatus of the user that includes a computing device.
  • SLAM implementation is related to one or more machine-readable program code portions configured to run a SLAM process, i.e. execute a SLAM algorithm on loaded sensor measurements, which may comprise algorithms for scan matching, localization, trajectory building and environment map building.
  • the computer program product described herein uses and extends an existing SLAM implementation as one of its components; another of its components implements a graphical user interface that displays the built or subsequently rebuilt environment map and provides tools enabling the user to perform a map building workflow using an interactive computer-implemented method according to the present invention.
  • the interactive method according to the present invention is a computer implemented method.
  • an embodiment of the present invention may utilize multiple different SLAM implementations and accordingly provide means in the graphical user interface for selecting the SLAM implementation.
  • SLAM implementations usually have many intricate parameters which influence their execution and the quality of the end result, therefore requiring a highly skilled operator and specialized knowledge to find an optimal tuning of these parameters for a particular combination of sensors and the environment.
  • the interactive environment map building method and the graphical user interface according to the present invention enable a less experienced user to mitigate causes of mapping errors such as suboptimally tuned SLAM parameters, imperfect sensor measurements and others, by allowing them to manually correct the mapping errors caused thereby.
  • Loop closure is an important concept in SLAM. It is an event which is triggered when at least one mobile device visits an area it has visited before. Performing loop closure reduces the positioning error along the whole loop which begins with the previous visit and ends with the revisit.
  • Particle filter-based SLAM uses a set of particles.
  • Each particle is a SLAM state hypothesis and contains a current pose estimate and a map that has been built along a trajectory which ends at that current pose estimate.
  • Incoming sensor measurements such as rangefinder and motion measurements are used to update every particle according to the mobile device measurement and motion models. All particles i.e. hypotheses are periodically evaluated, and the level of agreement with the incoming measurements is numerically calculated as particle weight. The best particles are assigned the highest weights.
  • the set of particles is periodically resampled, meaning that a new set is formed by sampling the best particles from the previous set. This ensures loop closure-like behavior of particle filter-based SLAM: when a loop is closed, the particles that ended up in correct poses, i.e. which successfully closed the loop, will have their weights rise drastically due to agreement with the newly available evidence (the range data of the revisited place).
  • a well-known particle filter-based SLAM implementation is GMapping.
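  • The weighting and resampling steps described above can be sketched as follows (a simplified illustration assuming the scan agreement is scored as a log-likelihood per particle; this is not the actual GMapping code):

```python
import copy
import numpy as np

rng = np.random.default_rng(0)

def update_weights(log_likelihoods):
    """Turn per-particle scan log-likelihoods into normalized weights."""
    w = np.asarray(log_likelihoods, dtype=float)
    w = np.exp(w - w.max())          # subtract the max for numerical stability
    return w / w.sum()

def resample(particles, weights):
    """Systematic resampling: draw a new set in proportion to the weights, so the
    best hypotheses (e.g. the ones that just closed a loop) get duplicated."""
    n = len(particles)
    positions = (rng.random() + np.arange(n)) / n
    cumulative = np.cumsum(weights)
    cumulative[-1] = 1.0             # guard against floating-point round-off
    indices = np.searchsorted(cumulative, positions)
    return [copy.deepcopy(particles[i]) for i in indices]

# Each particle is one SLAM hypothesis: a current pose estimate plus its own map.
particles = [{"pose": np.array([x, 0.0, 0.0]), "map": None} for x in (0.0, 0.1, 0.2)]
weights = update_weights([-3.0, -0.5, -8.0])   # particle 1 agrees best with the scan
particles = resample(particles, weights)
```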
  • a class of SLAM implementations known as graph SLAM internally uses a structure known as the pose graph to represent how poses of various trajectory nodes are constrained one to another. For example, consecutive nodes are typically locally constrained.
  • Graph SLAM implementations usually account for loop closure, when detected, by explicitly modeling the detected relationship between the trajectory nodes of the previous visit and the new trajectory node(s) which correspond to the revisit. The relationship is usually modeled with a loop closure constraint that is inserted in the pose graph when loop closure is detected.
  • graph SLAM usually runs an optimization process and produces optimized i.e. rebuilt trajectories, which honor the newly added constraints as well as all the constraints previously inserted into the pose graph.
  • a constraint is a measurement of one trajectory node $c_j$ from another node $c_i$'s position.
  • the measured offset between $c_i$ and $c_j$, expressed in $c_i$'s frame, is $\bar{z}_{ij}$, with precision matrix $\Lambda_{ij}$ (the inverse of the measurement covariance). For any actual poses of $c_i$ and $c_j$, their offset can be calculated as
$$h(c_i, c_j) = \begin{bmatrix} R_i^\top (t_j - t_i) \\ \theta_j - \theta_i \end{bmatrix}$$
where $R_i$ is the 2x2 rotation matrix of $\theta_i$, and $t_i$, $t_j$ are the position parts of the two poses. $h(c_i, c_j)$ is called the measurement equation. The error of a constraint is $e_{ij} = \bar{z}_{ij} - h(c_i, c_j)$ (with the angular component normalized), and the optimization minimizes the total weighted error $\sum_{ij} e_{ij}^\top \Lambda_{ij} e_{ij}$.
  • the constrained pose graph optimization problem is periodically re-solved with newly added constraints (which can be local and loop closure constraints).
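  • The measurement equation and the per-constraint error can be transcribed directly from the formulas above (a generic sketch, not tied to any particular graph SLAM library):

```python
import numpy as np

def measurement_equation(c_i, c_j):
    """h(c_i, c_j): offset of node c_j as observed from node c_i; poses are (x, y, theta)."""
    xi, yi, ti = c_i
    xj, yj, tj = c_j
    R_i = np.array([[np.cos(ti), -np.sin(ti)],
                    [np.sin(ti),  np.cos(ti)]])
    dt = R_i.T @ np.array([xj - xi, yj - yi])
    return np.array([dt[0], dt[1], tj - ti])

def constraint_error(z_ij, c_i, c_j):
    """e_ij = z_ij - h(c_i, c_j), with the angular component wrapped to (-pi, pi]."""
    e = np.asarray(z_ij, dtype=float) - measurement_equation(c_i, c_j)
    e[2] = np.arctan2(np.sin(e[2]), np.cos(e[2]))
    return e

# The cost contribution of one constraint with precision matrix Lambda_ij is
# e.T @ Lambda_ij @ e; the optimization minimizes the sum of these terms.
e = constraint_error([1.0, 0.0, 0.0], (0.0, 0.0, 0.0), (1.1, 0.1, 0.05))
print(e)
```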
  • the results of the optimization process are displayed to the user in real time.
  • the SPA method (as implemented in Google Cartographer using the Google Ceres nonlinear solver) can quickly optimize pose graphs containing thousands of nodes.
  • fast performance of the optimization process is instrumental in quickly processing inputted user corrections and displaying the result to the user, which facilitates a faster iterative workflow.
  • the used SLAM implementation also supports tracking multiple, possibly concurrent, trajectories.
  • Sensor measurements collected from at least one mobile device can be fused seamlessly into a single environment map.
  • Each independent run of one mobile device corresponds to one built trajectory, and each built trajectory is displayed within the built environment map in the graphical user interface.
  • the built environment map in accordance with the present invention comprises at least one built trajectory.
  • the data processing loop takes care of routing the sensor measurements data to the corresponding trajectories.
  • the multi-trajectory capability of the SLAM implementation, combined with the present invention, enables the user to iteratively build and correct an environment map from data collected from at least one mobile device.
  • snapshots of the SLAM state may be taken (typically in the data processing loop). This need not be done in each loop iteration, but rather sparsely, in order to avoid overly consuming computing device resources.
  • the SLAM state is restored from the most recent snapshot, taken at a time before the first (earliest) of the new user corrections is due to be applied. This avoids running the complete SLAM process from the beginning, thus reducing processing time of applying corrections.
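  • One possible realization of such sparse snapshotting is sketched below (the SLAM state is treated as an opaque, copyable object; the class and method names are illustrative assumptions):

```python
import bisect
import copy

class SnapshotStore:
    """Keeps sparse deep copies of the SLAM state, indexed by measurement timestamp."""

    def __init__(self, min_interval=30.0):
        self.min_interval = min_interval   # seconds of sensor data between snapshots
        self.times = []
        self.states = []

    def maybe_snapshot(self, timestamp, slam_state):
        """Store a copy only if enough time has passed since the last snapshot."""
        if not self.times or timestamp - self.times[-1] >= self.min_interval:
            self.times.append(timestamp)
            self.states.append(copy.deepcopy(slam_state))

    def latest_before(self, correction_time):
        """Return (timestamp, state copy) of the newest snapshot taken strictly before
        the earliest new user correction, or None if replay must start from the beginning."""
        i = bisect.bisect_left(self.times, correction_time)
        if i == 0:
            return None
        return self.times[i - 1], copy.deepcopy(self.states[i - 1])
```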
  • a user pose correction may be applied by forcing the entire set of particles into the pose mandated by the correction, by manipulating their pose hypotheses into the target pose.
  • the manipulation may be performed as a continuous interpolation to the target pose within a predetermined time interval in the SLAM process before the correction is due. Because this is clearly a correction which must be applied at that certain point in the SLAM process and not afterwards, particle filter-based SLAM is treated as linear SLAM with respect to user pose corrections.
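  • The continuous interpolation of the particle set toward the corrected pose can be sketched as follows (a simplified illustration assuming each particle pose is stored as (x, y, θ); alpha grows from 0 to 1 over the predetermined interval before the correction is due):

```python
import numpy as np

def interpolate_particles(particle_poses, target_pose, alpha):
    """Blend every particle pose toward the pose mandated by the user correction.

    particle_poses: array of shape (N, 3), rows are (x, y, theta)
    target_pose:    (x, y, theta) mandated by the user pose correction
    alpha:          interpolation factor in [0, 1]; 1.0 forces the exact target pose
    """
    poses = np.asarray(particle_poses, dtype=float)
    target = np.asarray(target_pose, dtype=float)
    out = poses.copy()
    out[:, :2] = (1.0 - alpha) * poses[:, :2] + alpha * target[:2]
    # interpolate the heading along the shortest angular path
    dtheta = np.arctan2(np.sin(target[2] - poses[:, 2]),
                        np.cos(target[2] - poses[:, 2]))
    out[:, 2] = poses[:, 2] + alpha * dtheta
    return out
```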
  • graph SLAM is retroactively correctable, easily enabling an iterative workflow without the need to re-execute the complete SLAM process. If the optimization process and rebuilding the environment map is quick enough (which is the case with the preferred embodiment that uses Google Cartographer), the user may quickly see the effect of applying the inputted corrections.
  • the user may also have the ability to provide relative pose corrections which are not in the global coordinate system, but rather tie together any two parts of the trajectory relative one to another by specifying a relative pose correction in a coordinate system tied to a trajectory node located in one of the respective trajectory parts. In other words, the user may select a reference node $c_i$ other than $c_0$.
  • FIG. 5 schematically illustrates one possible embodiment of a map building system 100 for implementing an interactive environment map 10 building method according to the present invention.
  • the map building system 100 includes at least one mobile device 102, a computing device 104, and optionally a dedicated control command system 105.
  • the mobile device 102 is equipped with sensors such as one or more rangefinder sensors, one or more motion sensing devices, and a data processing unit. It is responsive to motion commands generated by the user through the control command system 105 and conveys the sensor measurements data to the computing device 104 over a communication channel 106. All collected sensor measurements data is being stored on the computing device 104, and optionally also on the data processing unit of the mobile device 102.
  • the graphical user interface 20 provides to the user means for gaining a high level of insight and enables performing intervention into the SLAM process by iteratively inputting corrections, using a keyboard, a mouse, a 3D mouse, or any other human interface device known to the person skilled in the art.
  • the interactive environment map building method is performed online, meaning that the environment map 10 is being built and user corrections are being inputted and applied while the mobile device 102 is moving and streaming online sensor measurements to the computing device 104 via the communications channel 106.
  • one mobile device 102 is moving in the environment, controlled by the user through the control command system 105.
  • the sensor measurements data is also being displayed to the user on the screen of the computing device 104, but user corrections are not being inputted during motion of the mobile device 102.
  • the interactive map building method according to the present invention is performed offline, by processing at any later point in time on the computing device 104 the dataset containing the collected sensor measurements.
  • the interactive map building method according to the present invention can also be performed by simultaneously processing sensor measurements recorded in several independent runs of at least one mobile device 102.
  • a computer-implemented method for interactively building a high-accuracy environment map 10 comprises the steps of: moving at least one mobile device 102 in the environment; collecting sensor measurements from at least one mobile device 102; loading into a SLAM implementation 306 the collected sensor measurements in the form of offline stored sensor measurement datasets ordered by timestamps or in the form of online streamed data; building an environment map 10 and at least one trajectory 11 by running a SLAM process in the SLAM implementation 306; exporting the built environment map 10 and trajectories 11 to a persistent storage 212 of a computing device 104; displaying the built environment map 10 and at least one built trajectory 11 in a graphical user interface 20; and assessing by the user if the built environment map 10 has visible defects.
  • the method further comprises: iterative inputting of user corrections in the graphical user interface 20 and applying the inputted corrections in the SLAM implementation 306; subsequently rebuilding the environment map 10 and trajectories 11 , honoring the inputted user corrections and sensor measurement datasets, wherein after each inputted and applied user correction the subsequently rebuilt environment map 10 and trajectories 11 are displayed in the graphical user interface 20 for assessing by the user of visible defects; and, if the subsequently rebuilt map 10 is free of visible defects, exporting the subsequently rebuilt environment map 10 and trajectories 11 to the persistent storage 212.
  • Assessing by the user if the built or subsequently rebuilt environment map 10, displayed in the graphical user interface 20, contains defects is performed by tools for rewinding along at least one built or subsequently rebuilt trajectory 11. Iterative inputting and applying of user corrections, in accordance with present invention, includes inputting pose corrections 18 of trajectory nodes 13, and modifying SLAM implementation parameters or part of the SLAM process state.
  • an interactive computer-implemented method, hereinafter the method, can be performed in offline or online mode.
  • execution of the map building software is initiated by the user, by creating a new mapping project (block 201 ) and loading in (block 202) the sensor measurements data from at least one mobile device 102, whether offline in the form of a dataset ordered by timestamps, or online by setting up connections with at least one active mobile device 102.
  • each independent active mobile device 102 streams the collected sensor data to the map building software.
  • the method according to the present invention provides a possibility of using an existing reference CAD 14, in which case the reference CAD is also loaded (block 202).
  • a high-accuracy environment map 10 can be built, and the workflow enabled by the present invention can be performed even without the reference CAD 14 being provided.
  • the reference CAD 14, if available, is displayed under/overlaid with the built environment map 10.
  • the sensor measurements used for creating the environment map 10 may be collected during one or several independent runs of at least one mobile device 102.
  • the independent runs may or may not be concurrent.
  • Each independent run is closely associated with one trajectory 11 , and these two terms will be used herein interchangeably.
  • the elements of the set of timestamped poses that makes up a trajectory 11 are called trajectory nodes 13.
  • the preferred embodiment uses a graph SLAM implementation which is capable of tracking multiple concurrent trajectories 11. However, a significant portion of the present method, described herein, is applicable to simpler single-trajectory SLAM implementations as well. Thus, the second preferred embodiment uses a single-trajectory, particle filter-based linear SLAM implementation.
  • After loading into the SLAM implementation 306 one or more sensor measurement datasets (blocks 202; 301), the SLAM implementation 306 asynchronously starts the environment map building process (block 203) by performing the operations depicted in Figure 7.
  • the environment map building process takes place in the background and does not block the user from performing actions in the graphical user interface 20, such as rewinding (block 206) and inputting of user corrections (block 207).
  • the output of the environment map building process is displayed in the graphical user interface 20 as it progresses.
  • the user is free to examine in detail the built environment map 10 and one or more built trajectories 11 , as they are being built, or afterwards, by performing rewinding (block 206). Displaying the built environment map 10 in a graphical user interface 20 and rewinding along at least one built trajectory 11 enables the user to assess if the built or subsequently rebuilt environment map 10 has visible defects.
  • a rewinding tool provided in the graphical user interface 20 enables the user to rewind along each trajectory 11 and to perceive the sensor measurements related to the corresponding trajectory nodes 13 where the SLAM implementation 306 built the respective section of the built or subsequently rebuilt environment map 10.
  • the user can rewind along each built trajectory 11 and view the poses of the tracked mobile devices 102 at each trajectory node 13.
  • a rangefinder point cloud from the sensor measurements data which corresponds to the trajectory node 13 selected during rewinding is also displayed in the graphical user interface.
  • the rangefinder point cloud displayed on the built environment map 10 corresponds to the mobile device 102 pose at the respective trajectory node 13.
  • These displayed rangefinder point clouds are called herein rewind point clouds 15, and they have an important role in enabling the user to input corrections (illustrated in Figs. 10 to 12). Rewinding along a built or subsequently rebuilt trajectory 11 and displaying the corresponding rewind point clouds 15 enables the user to gain a high level of insight into the environment map building process, and enables the user to identify with high precision, both temporal and spatial, which scans were incorrectly inserted into the built environment map 10.
  • the iterative interactive environment map building method begins with the user perceiving one or more visible defects in the built environment map 10 (block 205).
  • defects may include: fracturing or blurring of object contours (illustrated in Figure 2); superfluous multiple appearances, also called ghosting, of objects in the built environment map 10 caused by incorrectly registering the same objects several times at different locations, e.g. due to improperly closed loops (illustrated in Figure 3); and misalignment with the reference CAD 14 (illustrated in Figure 4).
  • the user assesses visible defects in the built or subsequently rebuilt environment map 10.
  • the user is enabled to visually locate the built or subsequently rebuilt trajectory nodes 13 where incorrect handling of sensor measurements, imperfect sensor measurements or random environmental disturbances caused visible defects in the built environment map 10 (block 206), as for example illustrated in Figure 11.
  • the user employs the tools from the graphical user interface 20 to input corrections which will correct the visible defects (block 207).
  • the method of applying corrections to linear SLAM is suitable for other types of corrections as well.
  • the user may opt to change certain SLAM parameters. This would be another type of correction, a SLAM parameter correction, applied at the trajectory node 13 that corresponds to the point of entry into the new environment.
  • the user may input pose corrections by aligning the incorrectly registered objects 12 e.g. walls, or certain key features visible in the rewind point cloud 15 with the rest of the built environment map 10 or with the reference CAD 14, if available. Furthermore, after inputting an initial pose correction, the user may further tune it either manually or refine it by using a pose refinement tool.
  • Different means of enabling inputting a pose correction include: mouse drag-and-drop input of position and orientation; CAD-like numeric entry; and selecting pairs of matching points and computing the transformation between them.
  • Figures 10 to 12 illustrate inputting of a pose correction by aligning the incorrectly registered objects 12 such as walls; namely, visually aligning the rewind point cloud 15 with the reference CAD 14, or with the rest of the built environment map 10.
  • the result of alignment is a pose-corrected rewind point cloud 17, as illustrated in Figure 10.
  • the pose corresponding to the pose-corrected rewind point cloud 17 is applied thereafter as a pose correction for the respective trajectory node 13.
  • the method in accordance with present invention further comprises displaying and assessing in the graphical user interface 20 sensor measurements data which the SLAM implementation 306 used when building or subsequently rebuilding the environment map 10, the displayed sensor measurements data corresponding to trajectory nodes 13 selected during rewinding.
  • the inputted user correction is applied as soon as possible (blocks 208; 305) in the SLAM implementation 306; the trajectory 11 and the environment map 10 are rebuilt; the subsequently rebuilt environment map 10 and the rebuilt trajectory 11 , which honor the newly inputted correction, as well as all the previously inputted corrections and sensor measurements (blocks 208; 308) are displayed in the graphical user interface 20.
  • the user is now able to evaluate the effect of the inputted correction by assessing the subsequently rebuilt environment map 10 to see if applying the correction has resulted in a more accurate environment map 10 with respect to the previously observed visible defects, such as map blurring, fracturing, ghosting or CAD misalignment.
  • the user may also employ the insight-providing tools of the graphical user interface 20 such as rewinding. If the rebuilt environment map 10 is still visibly inaccurate (contains visible defects), the user can further tune the inputted correction. The correction is reverted to the stage where user input is performed (block 210), and the user can further tune the correction with additional input. If unsuccessful again, the user can choose to remove the correction altogether and try inputting another correction elsewhere, i.e. at another trajectory node 13 or in another section of the built or subsequently rebuilt environment map 10.
  • the user is enabled to perform an iterative workflow where the user observes one or more visible defects in the environment map 10, inputs corrections until the respective section of the environment map 10 is visibly accurate and consistent, and moves on to the remaining visible defects, if any. The process is repeated until the user obtains an environment map 10 that is consistent and free of visible defects.
  • the absence of visible defects in the built or subsequently rebuilt environment map 10 also implies that the corresponding one or more trajectories 11 represent a reasonably good estimate of the true trajectories 11 travelled by at least one mobile device through the environment.
  • the user may utilize the available editing-like facilities to cut/exclude (tool 23 in the graphical user interface 20) such redundant parts from the map building process.
  • the user may also inter-constrain the multiple trajectories 11 together in any way and at any point the user desires. Inter-constraining pairs of trajectories 11 together is achieved by allowing the user during inputting of pose corrections to select a coordinate system tied to any trajectory node 13, thus enabling inputting relative pose corrections between any two trajectories 11 at one or more trajectory nodes 13, as described earlier.
  • the method according to the present invention provides saving and loading of the user’s work done so far at any time as illustrated in block 204.
  • the SLAM state and all corrections inputted so far are preserved and saved in the persistent storage 212, enabling the user to restore the complete map building project and to continue work on the same project until the rebuilt environment map 10 and the trajectory 11 are free of visible defects.
  • the user may also load a new dataset recorded some time later, e.g. when the environment has changed, and use the available tools and the graphical user interface 20 according to the present invention to obtain an updated version of the environment map 10 that is consistent with the previously built or rebuilt environment map 10, which assumes the role of the reference CAD 14.
  • said rebuilt environment map 10 may be exported in the form of an occupancy grid (block 211 ) and saved in the persistent storage 212.
  • the built trajectories 11 may also be exported as sets of consecutive timestamped mobile device poses.
  • FIG. 7 shows a software flow diagram illustrating the main data processing loop in a SLAM implementation 306, as well as the extensions which enable the interactive environment map building method in accordance with an embodiment of the present invention.
  • Block 301 illustrates performing SLAM initialization, which includes loading offline sensor measurement datasets ordered by timestamps, or setting up online communication with at least one active mobile device 102.
  • the SLAM implementation 306 proceeds to build the environment map 10 and the trajectory 11 of at least one mobile device 102, as illustrated in the data processing loop consisting of blocks 302 to 308.
  • a series of operational steps is performed until all sensor measurements data are processed (block 302), the operational steps comprising: each sensor measurement is retrieved (block 303) and handled (block 307) according to its type and originating mobile device 102.
  • Handling of each sensor measurement may include performing scan matching, localization, building a trajectory node 13 and updating the built environment map 10, depending on the measurement type and the used SLAM implementation 306.
  • the current, i.e. updated, SLAM state, namely the currently built environment map 10, the poses of at least one mobile device 102 and the corresponding trajectories 11, is displayed in the graphical user interface 20.
  • the data processing loop in the SLAM implementation 306 is extended to support applying user corrections.
  • the user iteratively inputs one or more new corrections (block 207).
  • Blocks 304 and 305 illustrate applying due user corrections in the data processing loop of the SLAM implementation 306. Applying the due user corrections (blocks 304 and 305) is performed as soon as possible, i.e. as soon as the execution of the SLAM process permits.
  • the resulting updated SLAM state i.e. the subsequently rebuilt trajectories 11 and environment map 10, are afterwards displayed (blocks 208; 308) in the graphical user interface 20.
  • the user can as soon as possible observe the effect of the inputted corrections, which facilitates a faster iterative workflow.
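  • The extended data processing loop of Figure 7 can be summarized as follows (a hypothetical sketch: the slam, corrections, snapshots and gui objects stand for the SLAM implementation 306, the queue of inputted user corrections, the snapshot store and the graphical user interface 20, and all method names are illustrative assumptions rather than an actual API):

```python
def data_processing_loop(measurements, slam, corrections, snapshots, gui):
    """Blocks 302-308: process all sensor measurements, applying due user
    corrections as soon as possible and keeping the displayed map up to date."""
    for m in sorted(measurements, key=lambda m: m.timestamp):       # blocks 302/303
        for correction in corrections.pop_due(m.timestamp):         # blocks 304/305
            slam.apply_user_correction(correction)
        snapshots.maybe_snapshot(m.timestamp, slam.get_state())     # sparse SLAM-state snapshots
        slam.handle_measurement(m)    # block 307: scan matching, localization, map/trajectory update
        gui.display(slam.built_map(), slam.built_trajectories())    # block 308: display updated state
```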
  • a pose constraint for a certain trajectory node 13 may be visualized like a spring which pulls that trajectory node 13 towards a certain position. Rebuilding the trajectory 11 given new constraints is performed by solving the constrained pose graph optimization problem as described above, referred to herein as running an optimization process. Running the optimization process is analogous to finding a configuration of trajectory nodes 13 for which the springs are at their most relaxed position, i.e. all constraints are satisfied as much as possible.
  • the pose graph constraints are calculated from pose corrections inputted by the user, as illustrated in Figure 13.
  • the user pose corrections 18 were in this case given in the global map coordinate system with an origin 19.
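  • To make the spring analogy concrete, a toy optimization run can be sketched with a generic nonlinear least-squares solver (a simplified illustration using SciPy rather than the Ceres-based solver of the preferred embodiment; each constraint's precision matrix is folded into a single scalar weight):

```python
import numpy as np
from scipy.optimize import least_squares

def h(c_i, c_j):
    """Measurement equation: offset of node c_j as seen from node c_i; poses are (x, y, theta)."""
    ci, cj = np.asarray(c_i, dtype=float), np.asarray(c_j, dtype=float)
    c, s = np.cos(ci[2]), np.sin(ci[2])
    d = np.array([[c, s], [-s, c]]) @ (cj[:2] - ci[:2])   # R_i^T (t_j - t_i)
    return np.array([d[0], d[1], cj[2] - ci[2]])

def residuals(flat_poses, constraints, anchor):
    """Stack the weighted errors of all constraints; node 0 is anchored at `anchor`."""
    poses = np.vstack([anchor, flat_poses.reshape(-1, 3)])
    res = []
    for i, j, z_ij, weight in constraints:
        e = np.asarray(z_ij) - h(poses[i], poses[j])
        e[2] = np.arctan2(np.sin(e[2]), np.cos(e[2]))     # normalize the angular error
        res.append(np.sqrt(weight) * e)
    return np.concatenate(res)

# Toy three-node graph: two odometry-like constraints plus a heavily weighted
# user pose correction tying node 2 back to the anchored origin node (the "spring").
anchor = np.zeros(3)
constraints = [
    (0, 1, [1.0, 0.0, 0.0], 1.0),
    (1, 2, [1.0, 0.0, 0.0], 1.0),
    (0, 2, [2.0, 0.1, 0.0], 100.0),   # constraint derived from a user pose correction
]
x0 = np.array([[1.0, 0.0, 0.0], [2.0, 0.0, 0.0]]).ravel()
solution = least_squares(residuals, x0, args=(constraints, anchor))
print(solution.x.reshape(-1, 3))      # node 2 is pulled towards (2.0, 0.1, 0.0)
```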
  • constraints have weights (corresponding to elements of $\Lambda_{ij}$ in the formal description above) which control the impact of the constraints on the final optimized solution. Since it is desired that the constraints calculated from the inputted user pose corrections 18 have a strong impact on the optimization process (i.e. strongly influence the optimized solution, in order to force trajectory node poses to the ones mandated by constraints calculated from user corrections), larger weights are assigned to these constraints.
  • the assigned weights are predetermined to be equal to or larger than the weights of loop closure constraints automatically generated by the graph SLAM implementation 306, where the user is further enabled to adjust the weights assigned to pose constraints calculated from the inputted pose corrections 18. If necessary, the exact values of the assigned weights may be determined depending on the optimization process result.
  • the user can increase the values of assigned weights until the resulting optimized poses of the trajectory nodes 13 match the ones mandated by the inputted user corrections to a desired degree of precision, namely until the subsequently rebuilt environment map 10 is free of visible defects.
  • a graphical element such as a residual marker may be used to visualize the difference between the pose mandated by the inputted user correction and the resulting optimized trajectory node pose.
  • applying of inputted user pose corrections 18 is performed by excluding from the pose graph optimization process a subset of variables, and subsequently rebuilding the trajectories 11 and the environment map 10 by re-running the pose graph optimization process.
  • the variables excluded from the optimization process, i.e. a subset of global poses $c_i$ corresponding to poses of trajectory nodes 13 with pose corrections, are treated specially by having their values exactly fixed to the ones mandated by the corresponding user pose corrections 18. This would be analogous to having constraints with infinite weights.
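  • The "infinite weight" behaviour can be sketched in the same toy setting by removing the corrected nodes from the optimization vector and substituting them as constants (again an illustrative SciPy sketch, not the actual implementation):

```python
import numpy as np
from scipy.optimize import least_squares

def h(c_i, c_j):
    """Measurement equation, as in the previous sketch; poses are (x, y, theta)."""
    ci, cj = np.asarray(c_i, dtype=float), np.asarray(c_j, dtype=float)
    c, s = np.cos(ci[2]), np.sin(ci[2])
    d = np.array([[c, s], [-s, c]]) @ (cj[:2] - ci[:2])
    return np.array([d[0], d[1], cj[2] - ci[2]])

def make_residuals(constraints, fixed, free_ids):
    """`fixed` maps node id -> pose mandated by a user correction; those poses never
    enter the optimization vector, which is equivalent to a constraint of infinite weight."""
    def residuals(x):
        poses = dict(fixed)
        poses.update(zip(free_ids, x.reshape(-1, 3)))
        res = []
        for i, j, z_ij in constraints:
            e = np.asarray(z_ij) - h(poses[i], poses[j])
            e[2] = np.arctan2(np.sin(e[2]), np.cos(e[2]))
            res.append(e)
        return np.concatenate(res)
    return residuals

# Node 0 is the map origin and node 2 carries a user pose correction; only node 1 moves.
fixed = {0: np.zeros(3), 2: np.array([2.0, 0.1, 0.0])}
free_ids = [1]
constraints = [(0, 1, [1.0, 0.0, 0.0]), (1, 2, [1.0, 0.0, 0.0])]
solution = least_squares(make_residuals(constraints, fixed, free_ids),
                         np.array([1.0, 0.0, 0.0]))
print(solution.x.reshape(-1, 3))      # node 1 settles between the two fixed nodes
```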
  • linear SLAM does not lend itself naturally to applying additional pose corrections, which is required for an iterative map building workflow.
  • the SLAM process needs to be re-run from the point where the user correction is due to be applied.
  • applying of user pose corrections 18 is performed by interpolating the values of particle poses into values mandated by the user pose corrections 18 within a predetermined time interval in the SLAM process before the correction is due, where the user is further enabled to adjust the interpolation time interval.
  • the time needed to rebuild the environment map 10 after applying the correction can be reduced by regularly snapshotting the state of the SLAM process and rerunning the SLAM process in the SLAM implementation 306 from the latest possible snapshot before a correction is due to be applied, as described above.
  • Inputting of user corrections is performed exclusively through the graphical user interface 20, displayed on the screen of the computing device 104 and operably connected to the SLAM implementation 306.
  • the graphical user interface 20 according to the present invention enables performing the computer-implemented method for interactively building a high-accuracy environment map 10 by iterative inputting and applying of user corrections in the SLAM implementation 306.
  • the graphical user interface 20 is configured to display the built or subsequently rebuilt environment map 10 and trajectories 11 to enable the user to observe the progress of the environment map 10 building process, to rewind along the built or subsequently rebuilt trajectories 11 , to iteratively input and apply corrections, to store the complete mapping project at any point, and to export the built or subsequently rebuilt environment map 10 which is assessed to be free of visible defects.
  • the tools in the graphical user interface 20 available to the user include accurate controls for inspecting the built or subsequently rebuilt environment map 10 and trajectory 11. Furthermore, the tools for accurate inputting and refining of pose corrections are available to the user. These tools work by allowing the user to align the rewind point cloud 15 so that the visible defects in the environment map 10 will be removed.
  • the graphical user interface 20 for implementing an interactive environment map building method comprises, schematically illustrated in Figure 9, a zoomable and pannable main map window 400 configured to display the built environment map 10 and trajectories 11, the current pose and the currently observed point cloud of at least one mobile device 102, a zoomable and scrollable trajectories timeline window 405, and a user corrections management window 403, wherein the graphical user interface is configured to provide tools for manipulating the viewport of the main map window 400 such as zooming and panning, tools for rewinding along a selected built or subsequently rebuilt trajectory 11, and tools for iterative inputting and applying of user corrections.
  • the graphical user interface 20 comprises a sensor measurement datasets window 401 configured to list the loaded sensor measurement datasets when building the environment map 10 is performed offline, an active devices window 402 configured to list at least one tracked active mobile device 102 when building the environment map 10 is performed online, and a tool window 404 providing saving and loading the map building project data into/from a file on persistent storage 212.
  • the graphical user interface 20 is structured to provide a “map building project”-type interface.
  • the user may load one or more sensor measurement datasets or active devices into the map building project, and perform the interactive environment map 10 building method according to the present invention.
  • the user corrections management window 403 enables the user to perform actions such as inputting, applying, removing, editing, enabling and disabling user corrections.
  • the inputted corrections may also be displayed in the trajectories timeline window 405 by highlighting the keyframe points corresponding to respective trajectory nodes 13. Further, during rewinding, when a trajectory node 13 is selected, its current pose is displayed in the main map window 400, and the selected trajectory node 13 may be highlighted in the trajectories timeline window 405 as well.
  • the graphical user interface 20 displays the sensor measurement datasets window 401 , which lists the loaded offline sensor measurement datasets, or the active mobile devices window 402, which lists the tracked active mobile devices 102 when the environment map 10 building is performed online.
  • the main map window 400 of the graphical user interface 20 displays the built or subsequently rebuilt environment map 10 and trajectories 11, the current pose and the currently observed point cloud of at least one mobile device 102, the rewind point cloud 15 and the reference CAD 14, if available.
  • the reference CAD 14 may be displayed under/overlaid with the built or subsequently rebuilt environment map 10, which enables the user to observe and evaluate the alignment of the built or subsequently rebuilt environment map 10 with the reference CAD 14, and to identify sections of the built or subsequently rebuilt environment map 10 which need further corrections.
  • Means provided in the graphical user interface 20 enabling rewinding along a selected built or subsequently rebuilt trajectory 11 include tools for scrubbing, or directly selecting a point in the timeline, keyframe points or other graphical elements which correspond to trajectory nodes 13, user corrections, or timestamped sensor measurements.
  • the trajectories timeline window 405 is configured to display keyframe points corresponding to built trajectory nodes 13 and graphical elements corresponding to inputted user corrections or timestamped sensor measurements.
  • the trajectories timeline window 405 includes a graphical playhead element tool 22 configured to enable rewinding along a selected trajectory 11 and controlling the position in the timeline by the action of dragging using the mouse or another human interface device.
  • the trajectories timeline window 405 may further include: a set of buttons which control the position in the timeline, akin to a conventional media player interface; tools for selecting a point in the timeline by directly entering a numeric timecode or a numeric trajectory node ID; and a tool 23 for excluding parts of the sensor measurement dataset e.g. by cutting, akin to video or audio editing. If multiple trajectories 11 are built, they are displayed analogously to standard multi-track displays in video or audio editing interfaces. Additional means of rewinding through time are keys on the computer keyboard (for example, arrow keys), a swiping gesture on a touch screen or a similar human interface device.
  • an alternative tool is provided for selecting a trajectory node 13 and the corresponding point in the timeline by clicking or dragging the mouse on the displayed built or subsequently rebuilt trajectories 11 in the main map window 400.
  • the implementation of such a tool is slightly more involved because it requires locating the trajectory node 13 which is closest to the clicked point, for which an efficient point search data structure known to the person skilled in the art, such as a KD-tree or an Octree, should be used.
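  • Such a lookup can be sketched with a KD-tree, for example SciPy's cKDTree (the node coordinates, click position and pick radius below are illustrative):

```python
import numpy as np
from scipy.spatial import cKDTree

# (x, y) positions of built trajectory nodes in map coordinates, indexed by node id.
node_positions = np.array([[0.0, 0.0], [0.5, 0.1], [1.1, 0.2], [1.8, 0.1]])
tree = cKDTree(node_positions)

def node_under_cursor(click_xy, max_pick_distance=0.5):
    """Return the id of the trajectory node closest to the clicked map point, or
    None if the click is too far from the trajectory to count as a selection."""
    distance, index = tree.query(click_xy)
    return int(index) if distance <= max_pick_distance else None

print(node_under_cursor([1.0, 0.3]))   # -> 2
```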
  • the graphical user interface 20 displays the mobile device 102 pose of the corresponding node 13 of the selected trajectory 11 , as built by the SLAM implementation 306, and shows the point cloud observed by the rangefinder sensor at that selected point in time, namely the rewind point cloud 15.
  • the rewind point cloud 15 is displayed on the built or subsequently rebuilt environment map 10 in accordance with the pose of the corresponding built or subsequently rebuilt trajectory node 13.
  • Tools of the graphical user interface 20 for inputting of user pose corrections and for further tuning of the inputted corrections include a translation marker 16 and a rotation marker 21 , illustrated in Figures 10 to 12. Said markers are displayed in the main map window 400 and are configured to be dragged with the mouse or another human interface device.
  • the main map window 400 is configured to display a pose-corrected rewind point cloud 17.
  • the pose-corrected rewind point cloud 17 is translated and rotated in accordance with the pose of the dragged markers 16 and 21 which correspond to the pose of the mobile device 102.
  • the translation marker 16 and the rotation marker 21 enable accurate and precise alignment of the displayed rewind point clouds 15 with the reference CAD 14, if available, or with the rest of the built or subsequently rebuilt environment map 10.
  • the user may input a pose correction by moving the translation marker 16 and the rotation marker 21 which represent the pose of the mobile device 102 associated with the trajectory node 13 selected during rewinding.
  • the user may also specify a pivot point for the rotation, which facilitates aligning key points in the point cloud.
  • Another way of enabling the user to input the pose correction is to allow them to specify pairs of corresponding points, e.g. two key points from the rewind point cloud 15, and their corresponding key points in the built or subsequently rebuilt environment map 10 or in the reference CAD 14, if available. Subsequently, a transformation is computed which aligns the rewind point cloud 15 by matching the specified pairs of corresponding points, and used for the inputted pose correction. In case that the user specifies more than two pairs of corresponding points, thus overconstraining the problem of finding the transformation, a least squares solution may be used, or a more sophisticated regression which compensates for outliers, known to the person skilled in the art.
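  • The transformation computed from pairs of corresponding points can be sketched as a standard 2D rigid least-squares registration (a Kabsch/Umeyama-style illustration, not necessarily the exact implementation used):

```python
import numpy as np

def fit_rigid_2d(source_pts, target_pts):
    """Least-squares rotation R and translation t such that R @ source + t ~= target.

    source_pts, target_pts: arrays of shape (N, 2) with N >= 2 corresponding points.
    Returns (theta, t): the rotation angle and translation of the pose correction.
    """
    src = np.asarray(source_pts, dtype=float)
    dst = np.asarray(target_pts, dtype=float)
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)                       # 2x2 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, np.sign(np.linalg.det(Vt.T @ U.T))])    # guard against a reflection
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return np.arctan2(R[1, 0], R[0, 0]), t

# Key points picked in the rewind point cloud and their counterparts in the map or CAD:
theta, t = fit_rigid_2d([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]],
                        [[0.1, 0.2], [0.1, 1.2], [-0.9, 0.2]])
print(np.degrees(theta), t)            # -> ~90.0 degrees, translation ~[0.1, 0.2]
```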
  • the user is further provided with a "pose refinement" tool.
  • the user may input an initial pose correction by roughly aligning the rewind point cloud 15 with the built environment map 10 or with the reference CAD 14, if available.
  • the user may subsequently use the pose refinement tool which takes the initially inputted user pose correction and performs scan matching, the output of which is a refined pose.
  • the user may further tune the pose correction before confirming the final input.
  • the pose refinement tool thus facilitates fast and precise alignment of the displayed rewind point clouds 15 with the built or subsequently rebuilt environment map 10 or with the reference CAD 14, if available.
  • when the user inputs a correction, for example a pose correction using the aforementioned tools for accurate and precise alignment of the displayed rewind point clouds 15, it is applied as soon as the execution of the SLAM process permits, as described earlier.
  • This enables the user to see in the graphical user interface 20 the effects of the inputted correction on the subsequently rebuilt environment map 10 as soon as possible, wherein the delay of displaying to the user depends on the type of the used SLAM implementation 306.
  • Figure 10 illustrates the graphical user interface 20 where the trajectories timeline window 405 displays two trajectories 11, namely Trajectory 0 and Trajectory 1.
  • the two trajectory nodes 13 on Trajectory 0 where corrections were applied are highlighted, and the corresponding corrections are displayed in the user corrections management window 403.
  • Trajectory 1 is selected as the trajectory along which rewinding is performed, and the vertical playhead line passes through the highlighted keyframe point corresponding to the node that has been selected by rewinding.
  • the mobile device pose of the selected trajectory node 13 is also displayed in the main map window 400, and the user is inputting a pose correction for that trajectory node 13 by dragging the markers 16 and 21 , as described above.
  • extensions to a SLAM implementation according to the present invention include exposing all sensor measurements data, including all point clouds from the rangefinder sensor measurements.
  • the exposed offline sensor measurement datasets are listed in the graphical user interface 20 in the sensor measurement datasets window 401.
  • Point clouds from the rangefinder sensor measurements are exposed for the purpose of enabling in the graphical user interface 20 rewinding along the built or subsequently rebuilt trajectories 11 , and displaying the rewind point clouds 15.
  • the timestamps, types and content of exposed sensor measurements may also be displayed in the graphical user interface 20, e.g. in the trajectories timeline window 405 or in the main map window 400.
  • extensions to a SLAM implementation may further include exposing a scan matcher, if available, which enables the pose refinement functionality.
  • if the used SLAM implementation does not expose a scan matcher, an alternative one may be provided and used in the graphical user interface 20 for the pose refinement functionality.
  • the SLAM implementation may be extended to perform automatic alignment with the reference CAD during the SLAM process.
  • Some or all of the method steps may be executed by (or using) a hardware apparatus, like for example, a microprocessor, a programmable computing device or an electronic circuit. In some embodiments, one or more of the method steps may be executed by the mobile device 102. In the preferred embodiments of the invention, the method described herein is processor-implemented or computer-implemented.
  • embodiments of the invention can be implemented in hardware or in software.
  • the implementation can be performed using a non-transitory storage medium.
  • embodiments of the present invention can be implemented as a computer program product with a program code, the program code being operative for performing the method when the computer program product runs on a computing device.
  • the program code may, for example, be stored on a machine-readable carrier.
  • an embodiment of the inventive method is, therefore, a computer program product having a program code for performing the method described herein, when the computer program runs on a computing device.
  • a further embodiment comprises a processing means, for example, a computing device or a programmable logic device, programmed to, configured to, or adapted to, perform the method described herein.
  • in some embodiments, the processing means, for example a computing device or a programmable logic device, is programmed to, configured to, or adapted to perform the method described herein.
  • a further embodiment comprises a computer having installed thereon the computer program product for performing the method described herein.
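
As a purely illustrative aside, the following sketch shows how the trajectory node closest to a point clicked in the main map window could be located with a KD-tree, as mentioned in the list above; the node coordinates, the clicked point, and the use of scipy's cKDTree are assumptions made for the example, not part of the claimed method.

```python
# Illustrative sketch (not part of the claimed method): locating the trajectory
# node 13 closest to a point clicked in the main map window 400 using a KD-tree.
import numpy as np
from scipy.spatial import cKDTree

# Hypothetical trajectory: one (x, y) position per trajectory node.
node_positions = np.array([
    [0.0, 0.0],
    [0.5, 0.1],
    [1.1, 0.2],
    [1.6, 0.8],
])

# Build the point search structure once per (re)built trajectory.
tree = cKDTree(node_positions)

# Point clicked by the user, in map coordinates.
clicked_point = np.array([1.0, 0.3])

# Query the nearest trajectory node; its index selects the rewind point in time.
distance, node_index = tree.query(clicked_point)
print(f"closest trajectory node: {node_index}, distance: {distance:.2f} m")
```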


Abstract

A computer-implemented method for interactively building a high-accuracy environment map (10), an interactive graphical user interface (20), and a computer program product are disclosed. The computer-implemented method in accordance with the present invention comprises the steps of iteratively inputting user corrections and subsequently rebuilding the environment map (10) and at least one trajectory (11), honouring the inputted user corrections and the sensor measurement dataset, wherein after each inputted user correction the subsequently rebuilt environment map (10) and at least one rebuilt trajectory (11) are displayed in the graphical user interface (20); assessing by the user in the graphical user interface (20) whether there are still visible defects in the subsequently rebuilt environment map (10), and, if not, exporting the rebuilt environment map (10) to the persistent storage (212). The interactive graphical user interface (20) in accordance with the present invention comprises a zoomable/pannable main map window (400), a user corrections management window (403), and a zoomable/scrollable trajectories timeline window (405), wherein the graphical user interface (20) is provided with tools for rewinding the environment map (10) along a selected trajectory (11) and with tools for iteratively inputting user corrections. The interactive method and an intuitive, easy-to-use user interface (20) enable users without expert knowledge about SLAM algorithms to tune the final output, i.e. the built environment map (10) and trajectory (11), by intervening in the execution of the used SLAM algorithm.

Description

INTERACTIVE COMPUTER-IMPLEMENTED METHOD, GRAPHICAL USER INTERFACE AND COMPUTER PROGRAM PRODUCT FOR BUILDING A HIGH-ACCURACY ENVIRONMENT MAP
Technical field
The present invention generally relates to an interactive computer-implemented method, a graphical user interface and a computer program product for building a high-accuracy environment map. Such high-accuracy environment maps may be used for autonomous vehicle navigation, especially in GPS-denied indoor environments.
Description of the prior art
A procedure which enables a mobile apparatus equipped with sensors to localize itself in an environment and simultaneously build a map of this environment is called Simultaneous Localization and Mapping (SLAM) and has been intensively studied in the scientific community for over two decades. The different approaches to the SLAM problem can be grouped into three categories: Kalman filter-based [Thrun et al., 2005], particle filter-based [Grisetti et al., 2007], and graph-based [Hess et al., 2016, Lucidarme and Lagrange, 2014]. As the mobile apparatus moves through the environment, its pose estimate is subject to errors due to imperfect sensor measurements and random environmental disturbances (e.g. wheel slippage in the case of a wheeled device). These errors accumulate over time and cause mapping inaccuracies and errors. Some mapping inaccuracies that accumulate during the mapping procedure can be reduced when an area is re-visited, which is called loop closure. However, if such events are not frequent enough, or if scan matching produces inaccurate results due to imperfect sensor measurements or environment disturbances, or if SLAM parameters are suboptimally tuned, significant inconsistencies can still occur in the built map. The accuracy of the used mapping procedure is crucial for precise positioning of an autonomous mobile apparatus using the previously built map for localization. For example, the typical positioning precision required for the operation of autonomous forklifts (loading and unloading of pallets in automated warehouses) must be below 1 cm and 0.5°. Such accuracy is difficult to achieve in an automatic and consistent fashion with today's state-of-the-art SLAM algorithms. Therefore, artificial landmark solutions [nav, 2011] are considered the current industrial state of the art. These solutions incur significant installation and maintenance costs, and are not usable in some scenarios (e.g., markers installed close to the ground tend to get occluded or become too dirty and damaged to be viable). Some authors have considered increasing SLAM accuracy by using prior information (maps extracted from aerial photographs): [Kummerle et al., 2010], [Vysotska and Stachniss, 2016]. However, these solutions are mostly aimed at outdoor scenarios and cannot guarantee sufficient precision, especially in situations where there are significant deviations between the prior map and the on-site environment. Furthermore, simply aligning the finished map (built by SLAM) with a reference CAD using trivial transformations (scaling and rotation) may not ensure accurate alignment in all areas of the map. As the various parts of the built map are usually deformed in different ways, they would require different local warpings of the map in order to accurately align these parts to the reference CAD. Furthermore, the mapping procedure can sometimes produce defects such as fracturing or blurring of object contours, or superfluous multiple appearances of objects caused by incorrectly registering the same objects several times, at different locations. These defects cannot be corrected even by using different local warpings.
Another approach, allowing user intervention in the SLAM process, is described in [Jensfelt and Christensen, 2005] and also in the document EP1804149B1 [ABB Research Ltd., 2011]. User intervention in the SLAM process as disclosed in EP1804149B1 explicitly relies on loop closure, wherein the "starting representation" and "ending representation" of an object play a special role and the user may instruct that they are at the same place. Document EP1804149B1 is particularly vague in describing the application of user pose corrections with respect to the whole trajectory and only briefly mentions the desired corrective loop-closing effect, i.e. it only mandates that the end representations should correspond to the starting representations (paragraph [0037]). Document EP1804149B1 reveals a single installation mode and a subsequent maintenance mode, during which the map is used for navigation and not further updated.
The present invention allows for storage and retrieval of all sensor measurements data and user corrections necessary to fully reconstruct the complete environment map building process, i.e. the SLAM process. This enables long-term iterative map adjustments which can be performed occasionally or periodically and thus account for persistent changes in the environment. Further, a computer-implemented method according to the present invention allows for iterative map adjustments which can be performed by applying user corrections at any arbitrary point on a mobile device trajectory. Furthermore, subsequent corrections do not need to be ordered chronologically, i.e., when applying corrections, the user can freely rewind forwards and backwards "in time" along the trajectory, employing a graphical user interface according to an embodiment of the present invention. Effectively, this enables the user to achieve greater map accuracy and to correct map defects at various deliberately selected points in the environment. Further, a computer-implemented method according to the present invention presents in detail a nontrivial method of applying user pose corrections with respect to the whole trajectory for modern SLAM implementations (for both graph and non-graph-based SLAM implementations). Furthermore, a computer-implemented method according to the present invention is not restricted to one independent run of only one mobile device, or even to any particular kind of a single autonomously moving device or apparatus.
Therefore, the invention described herein provides specific advantages over prior art. Appropriate map alignment and correction of defects are achieved by iteratively inputting corrections during the SLAM process. This may also be facilitated by automatically aligning the rangefinder sensor measurements, e.g. from a laser scanner, with for example a reference CAD during the SLAM process, although the user may perform further tuning in case the automatic alignment with the reference CAD produces unsatisfactory results.
Summary of the invention
An object of the present invention is to provide an interactive computer-implemented method and an intuitive, easy-to-use, graphical user interface enabling users without expert knowledge about SLAM to tune the final output, i.e. the built environment map and trajectories, by intervention into the SLAM process, i.e. the execution of the SLAM algorithm in the used SLAM implementation.
Another object of the present invention is to provide an interactive computer-implemented method by which the environment map building process i.e. the SLAM process is not restricted to data collected from a single mobile device. Sensor measurements collected from several mobile devices, or from several independent runs of a single mobile device, or any combination thereof can be fused seamlessly into a single environment map, and the multiple built trajectories may be inter-constrained together in any way and at any point the user desires. This is achieved in the preferred embodiment by employing and extending an existing graph SLAM implementation capable of handling multiple trajectories (i.e. tracking multiple mobile devices simultaneously).
The present method enables the use of a reference CAD, if available, as prior information in the SLAM process. Further, the present method allows for assisted inputting of user corrections by means of scan matching, making initial inputting of user corrections fast and accurate. User corrections are integrated directly into the environment map building process, allowing mapping accuracy to be increased at various deliberately selected points in the environment to a level where the built environment map is free of visible defects. The final result is a high-accuracy map of the environment, adequate for localization in industrial scenarios, with an accuracy better than 1 cm and 0.5°.
A task of building an environment map (e.g. a two-dimensional building floor plan or a 3D map) comprises collecting sensor measurements while a mobile device is moving throughout the environment, and processing the collected sensor measurements in order to obtain the final result, the built environment map, as well as one or more mobile device trajectories. To collect sensor measurements, the operator employs at least one mobile device equipped with the following sensors: one or more rangefinder sensors (e.g., a laser scanner - LIDAR), and optionally one or more motion sensing devices (e.g. an odometry sensor such as wheel encoders, or an inertial measurement unit - IMU).
One possible embodiment of a mobile device is a mobile robot, equipped with one or more rangefinder sensors, and optionally one or more motion sensing devices and a data processing unit. The mobile robot may be steered through the environment by an operator or driven autonomously. Another possible embodiment of a mobile device is a portable device such as a backpack, carried by the user, equipped with sensors such as one or more rangefinder sensors, one or more motion sensing devices and a data processing unit.
In the collected sensor measurements processing step, which can be performed offline or online (i.e. in parallel with collection of sensor measurements), the map building software takes the sensor measurements collected by at least one mobile device and performs the SLAM process (Simultaneous Localization and Mapping), the output of which is a built map of the environment (usually in the form of an occupancy grid, i.e. an array of numbers representing the probability of the respective discrete map element (cell) being an obstacle or free space), and one or more built trajectories. Each trajectory is a set of consecutive timestamped mobile device poses (2D or 3D). A 2D pose consists of position, commonly expressed using planar coordinates (x, y), and orientation, expressed as a yaw angle θ. A 3D pose consists of position, commonly expressed using 3D coordinates (x, y, z), and orientation, commonly expressed using Euler angles, a rotation matrix, or quaternions. Another common pose representation (both for 2D and 3D) is a homogeneous transformation matrix, which includes both position and orientation.
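
As an illustrative aside, the following sketch shows one conventional way of turning a 2D pose (x, y, θ) into the homogeneous transformation matrix mentioned above; the function name and the sample pose are invented for the example.

```python
# Illustrative sketch: a 2D pose (x, y, theta) and its homogeneous transformation matrix.
import numpy as np

def pose_to_matrix(x: float, y: float, theta: float) -> np.ndarray:
    """Build the 3x3 homogeneous transformation matrix encoding position and orientation."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([
        [c, -s, x],
        [s,  c, y],
        [0,  0, 1],
    ])

# Example: a trajectory node pose, 1.5 m forward, 0.2 m left, rotated 30 degrees.
T = pose_to_matrix(1.5, 0.2, np.deg2rad(30.0))

# Transforming a rangefinder point from the mobile device frame into the map frame.
point_in_device_frame = np.array([2.0, 0.0, 1.0])  # homogeneous 2D point
point_in_map_frame = T @ point_in_device_frame
print(point_in_map_frame[:2])
```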
A computer-implemented method according to the present invention is envisioned to be used with an arbitrary SLAM implementation.
The proposed interactive computer-implemented method and a graphical user interface enable user intervention into SLAM process by allowing the user to input corrections, for the purpose of obtaining a highly accurate, consistent and defect-free map of the environment. If available, the reference CAD may be used to guide the SLAM process, and the resulting environment map will be aligned thereto.
A graphical user interface according to the present invention allows the user to observe the progress of the SLAM process, to rewind forwards and backwards in time along built trajectories, to observe and assess collected sensor measurements, and to iteratively input corrections and view the effects thereof. A graphical user interface according to the present invention thus enables the user to gain a high level of insight into the SLAM process, i.e. the SLAM algorithm execution, aiding them in diagnosing sensor and SLAM configuration/tuning issues.
Instructions for performing the aforementioned method and a graphical user interface may be included in a computer program product configured for execution by a computing device 104, which when executed by a computing device having a screen displaying the graphical user interface for interactively building a high-accuracy environment map causes the computing device to perform the computer-implemented method in accordance with the present invention.
It is the object of the present invention to provide a computer-implemented method, a graphical user interface which permit gaining a high level of insight and performing user intervention into the execution of the SLAM algorithm, and a computer program product.
This object is achieved by a computer-implemented method according to claim 1, an interactive graphical user interface according to claim 21, and a computer program product according to claim 32.
It is clear to those skilled in the art that a method and a graphical user interface explained and described in the present application may be implemented by means of respective computing devices or one or more processors configured and/or programmed to obtain the functionality described.
Brief description of the drawings
For a better understanding of the aforementioned embodiments of the invention as well as additional embodiments thereof, reference should be made to the description below, in conjunction with the following drawings in which reference numerals refer to corresponding parts throughout the figures.
Figure 1 shows an environment map illustrating an example of failure of a SLAM implementation to perform loop closure successfully and thus to eliminate accumulated errors in the environment map building process.
Figure 2 shows one example of an environment map defect such as fracturing and blurring of object contours in an environment map building process.
Figure 3 shows another example of an environment map defect such as superfluous double appearance i.e. ghosting of objects caused by incorrectly registering the same objects two times, at different locations, in an environment map building process.
Figure 4 shows yet another example of an environment map defect such as misalignment with a reference CAD in an environment map building process.
Figure 5 schematically illustrates an example of a mobile device system for implementing an interactive environment map building method, in accordance with an embodiment of the present invention.
Figure 6 shows an interactive environment map building workflow, in accordance with an embodiment of the present invention.
Figure 7 shows a software flow diagram illustrating a main data processing loop in a SLAM implementation, as well as the extensions which enable an interactive environment map building method in accordance with an embodiment of the present invention.
Figure 8 shows an example of an environment map free of visual defects, built using an interactive computer- implemented environment map building method, in accordance with an embodiment of the present invention.
Figure 9 schematically illustrates a graphical user interface enabling implementation of an interactive computer- implemented environment map building method, in accordance with an embodiment of the present invention.
Figure 10 illustrates a graphical user interface enabling the user to observe an environment map being built, to inspect and gain a high level of insight into the SLAM process, and to input corrections, in accordance with an embodiment of the present invention.
Figure 11 shows an enlarged view of a part of a main map window illustrating a section of a built environment map, a trajectory, a translation and a rotation marker, and a rewind point cloud, in accordance with an embodiment of the present invention.
Figure 12 shows an enlarged view of a part of a main map window from Figure 11 illustrating a pose-corrected rewind point cloud during inputting of a user pose correction, in accordance with an embodiment of the present invention.
Figure 13 illustrates applied user pose corrections, and a rebuilt trajectory honouring the inputted pose corrections, in accordance with an embodiment of the present invention.
Detailed description of the invention
Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be apparent to the person of ordinary skill in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
To facilitate the understanding of the present invention, a series of terms shall be defined. The terminology used in the present patent application is for describing specific embodiments of the invention; however, its use does not limit the invention, unless otherwise stated in the patent claims.
The expression "at least one mobile device", as used in the present patent application and patent claims, shall include independent runs of several mobile devices, or several independent runs of a single mobile device, or any combination thereof.
The term "user correction" or "correction", as used in the present patent application and patent claims, shall include any kind of user intervention into the SLAM process, i.e. execution of the SLAM algorithm. A user correction is typically given at a certain time in the sensor measurement dataset or at a certain trajectory node, and may include modifying any parameter of the SLAM implementation or part of the SLAM process state. In the preferred embodiments of the present invention, we primarily use a certain kind of user corrections - user corrections of trajectory node poses, herein referred to as user pose corrections. The effect of applying a user pose correction in the SLAM process is to force the pose of the desired trajectory node in the rebuilt trajectory to the one mandated by the pose correction.
The term "linear SLAM", as used in the present patent application and patent claims, is related to the following: a SLAM implementation is referred to as linear if a user correction has to be applied exactly at the moment when the corresponding sensor measurement is being processed in the data processing loop, and it is not feasible to apply the correction later.
The term "computing device", as used in the present patent application and patent claims, may include, for example, one or more of: a desktop computing device, a laptop computing device, a tablet computing device, a computer, a computing device of a vehicle of the user, or a wearable apparatus of the user that includes a computing device.
The term "SLAM implementation", as used in the present patent application and patent claims, is related to one or more machine-readable program code portions configured to run a SLAM process, i.e. execute a SLAM algorithm on loaded sensor measurements, which may comprise algorithms for scan matching, localization, trajectory building and environment map building. As used in the present patent application and patent claims, the terms 'a', 'an' and 'the' include both singular and plural referents, unless the context clearly dictates otherwise.
The computer program product described herein uses and extends an existing SLAM implementation as one of its components; another of its components implements and displays in a graphical user interface a built or subsequently rebuilt environment map and tools, where said component enables the user to perform a map building workflow using an interactive computer-implemented method according to the present invention. The interactive method according to the present invention is a computer-implemented method. Also, an embodiment of the present invention may utilize multiple different SLAM implementations and accordingly provide means in the graphical user interface for selecting the SLAM implementation.
SLAM implementations usually have many intricate parameters which influence their execution and the quality of the end result, therefore requiring a highly skilled operator and specialized knowledge to find an optimal tuning of these parameters for a particular combination of sensors and the environment. The interactive environment map building method and the graphical user interface according to the present invention enable a less experienced user to mitigate causes of mapping errors such as suboptimally tuned SLAM parameters, imperfect sensor measurements and others, by allowing them to manually correct the mapping errors caused thereby.
As at least one mobile device moves through the environment, the estimate of its pose is subject to errors due to imperfect sensor measurements and random environmental disturbances. These errors accumulate over time and cause inaccuracies in the built environment map 10. Some errors that accumulate during the SLAM process can be reduced when an area is re-visited, which is called loop closure. Loop closure is an important concept in SLAM. It is an event which is triggered when at least one mobile device visits an area it has visited before. Performing loop closure reduces the positioning error along the whole loop which begins with the previous visit and ends with the revisit. However, if such loop closure events are not frequent enough, or if scan matching produces inaccurate results due to imperfect sensor measurements and environment disturbances, or if the SLAM parameters are suboptimally tuned, or any combination of these and similar circumstances occurs, significant visible defects can occur in the built environment map 10, as illustrated in Figure 1.
Particle filter-based SLAM uses a set of particles. Each particle is a SLAM state hypothesis and contains a current pose estimate and a map that has been built along a trajectory which ends at that current pose estimate. Incoming sensor measurements such as rangefinder and motion measurements are used to update every particle according to the mobile device measurement and motion models. All particles i.e. hypotheses are periodically evaluated, and the level of agreement with the incoming measurements is numerically calculated as particle weight. The best particles are assigned the highest weights. The set of particles is periodically resampled, meaning that a new set is formed by sampling the best particles from the previous set. This ensures loop closure-like behavior of particle filter-based SLAM: when a loop is closed, the particles that ended up in correct poses, i.e. which successfully closed the loop, will have their weights rise drastically due to agreement with the newly available evidence (the range data of the revisited place). A well-known particle filter-based SLAM implementation is GMapping.
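For illustration only, a heavily simplified sketch of the weighting and resampling steps described above follows; the measurement model, particle count, and function names are assumptions invented for the example and are not taken from GMapping or from the claimed method.

```python
# Illustrative sketch: particle weighting and resampling in particle filter-based SLAM.
import numpy as np

rng = np.random.default_rng(0)
NUM_PARTICLES = 100

# Each particle is a SLAM state hypothesis; here only the pose (x, y, theta) is shown,
# although a full particle would also carry the map built along its trajectory.
particles = rng.normal(loc=[1.0, 2.0, 0.1], scale=0.05, size=(NUM_PARTICLES, 3))

def measurement_likelihood(pose: np.ndarray) -> float:
    # Placeholder for the agreement with the incoming rangefinder measurement;
    # a real implementation would score the scan against the particle's own map.
    expected_pose = np.array([1.0, 2.0, 0.1])
    return float(np.exp(-np.sum((pose - expected_pose) ** 2) / 0.01))

# Weighting: particles that agree with the newly available evidence get high weights.
weights = np.array([measurement_likelihood(p) for p in particles])
weights /= weights.sum()

# Resampling: a new set is formed by sampling the best particles from the previous set.
indices = rng.choice(NUM_PARTICLES, size=NUM_PARTICLES, p=weights)
particles = particles[indices]
```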
A class of SLAM implementations known as graph SLAM internally uses a structure known as the pose graph to represent how poses of various trajectory nodes are constrained one to another. For example, consecutive nodes are typically locally constrained. Graph SLAM implementations usually account for loop closure, when detected, by explicitly modeling the detected relationship between the trajectory nodes of the previous visit and the new trajectory node(s) which correspond to the revisit. The relationship is usually modeled with a loop closure constraint that is inserted in the pose graph when loop closure is detected. At some time afterwards, graph SLAM usually runs an optimization process and produces optimized i.e. rebuilt trajectories, which honour the newly added constraints as well as all the constraints previously inserted into the pose graph.
A recent graph SLAM implementation, used and extended in the preferred embodiment of the present invention, is Google Cartographer. The definition of the pose graph and the optimization problem currently used therein are based on the Sparse Pose Adjustment (SPA) method [Konolige, 2010], quoted verbatim: "The variables of the [optimization problem] are the set of global poses $c$ of the robot, parameterized by a translation and angle: $c_i = [t_i, \theta_i]^T = [x_i, y_i, \theta_i]^T$. A constraint is a measurement of one [trajectory] node $c_j$ from another's ($c_i$) position. The measured offset between $c_i$ and $c_j$, in $c_i$'s frame, is $\bar{z}_{ij}$, with precision matrix $\Lambda_{ij}$ (inverse of covariance). For any actual poses of $c_i$ and $c_j$, their offset can be calculated as

$$h(c_i, c_j) = \begin{bmatrix} R_i^T (t_j - t_i) \\ \theta_j - \theta_i \end{bmatrix}$$

Here $R_i$ is the $2 \times 2$ rotation matrix of $\theta_i$. $h(c_i, c_j)$ is called the measurement equation. The error function associated with a constraint, and the total error [where $\mathcal{P}$ is the set of constraints], are

$$e_{ij} \equiv \bar{z}_{ij} - h(c_i, c_j), \qquad \chi^2(c, \mathcal{P}) \equiv \sum_{ij \in \mathcal{P}} e_{ij}^T \Lambda_{ij} e_{ij}$$

" (end verbatim quote).
The constrained pose graph optimization problem, as defined above, is periodically re-solved with newly added constraints (which can be local and loop closure constraints). The results of the optimization process are displayed to the user in real time. The SPA method (as implemented in Google Cartographer using the Google Ceres nonlinear solver) can quickly optimize pose graphs containing thousands of nodes. In the preferred embodiment of the present invention, fast performance of the optimization process is instrumental in quickly processing inputted user corrections and displaying the result to the user, which facilitates a faster iterative workflow.
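To make the optimization concrete, here is a minimal sketch of relaxing a tiny pose graph with the SPA-style measurement equation and error function quoted above; the three nodes, the constraint values and the scalar weights are invented for illustration, and Google Cartographer itself performs this optimization in C++ with the Google Ceres solver rather than with scipy.

```python
# Illustrative sketch: solving a tiny 2D pose graph with an SPA-style error function.
import numpy as np
from scipy.optimize import least_squares

def wrap(angle):
    """Wrap an angle to (-pi, pi]."""
    return (angle + np.pi) % (2.0 * np.pi) - np.pi

def h(ci, cj):
    """Measurement equation: offset of node j seen from node i, in i's frame."""
    xi, yi, ti = ci
    xj, yj, tj = cj
    c, s = np.cos(ti), np.sin(ti)
    R_T = np.array([[c, s], [-s, c]])          # transpose of the 2x2 rotation matrix of theta_i
    dt = R_T @ np.array([xj - xi, yj - yi])
    return np.array([dt[0], dt[1], wrap(tj - ti)])

# Constraints: (i, j, measured offset z_ij, scalar weight standing in for the precision).
constraints = [
    (0, 1, np.array([1.0, 0.0, 0.0]), 1.0),    # odometry-like local constraint
    (1, 2, np.array([1.0, 0.0, 0.0]), 1.0),
    (0, 2, np.array([2.0, 0.1, 0.0]), 1.0),    # e.g. a loop closure constraint
]

def residuals(flat_poses):
    poses = flat_poses.reshape(-1, 3)
    res = []
    for i, j, z_ij, w in constraints:
        e = z_ij - h(poses[i], poses[j])
        e[2] = wrap(e[2])
        res.append(w * e)
    # Gauge constraint: pin the first node to the origin of the global map frame.
    res.append(1e3 * poses[0])
    return np.concatenate(res)

initial = np.array([[0.0, 0.0, 0.0], [0.9, 0.1, 0.0], [2.1, -0.1, 0.0]]).ravel()
solution = least_squares(residuals, initial)
print(solution.x.reshape(-1, 3))               # optimized, i.e. rebuilt, trajectory node poses
```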
According to one embodiment of the present invention, the used SLAM implementation also supports tracking multiple, possibly concurrent, trajectories. Sensor measurements collected from at least one mobile device can be fused seamlessly into a single environment map. Each independent run of one mobile device corresponds to one built trajectory, and each built trajectory is displayed within the built environment map in the graphical user interface. Accordingly, the built environment map in accordance with the present invention comprises at least one built trajectory.
In SLAM implementations, there is usually a main loop which processes and dispatches the incoming sensor measurements data, hereinafter the data processing loop. In a multi-trajectory capable SLAM implementation, typically the data processing loop takes care of routing the sensor measurements data to the corresponding trajectories. The multi-trajectory capability of the SLAM implementation, combined with the present invention, enables the user to iteratively build and correct an environment map from data collected from at least one mobile device.
In an embodiment of the present invention which uses a linear SLAM implementation, when applying user corrections, in order to mitigate long re-execution times, snapshots of the SLAM state may be taken (typically in the data processing loop). This need not be done in each loop iteration, but rather sparsely, in order to avoid overly consuming computing device resources. When applying one or more new user corrections, the SLAM state is restored from the most recent snapshot, taken at a time before the first (earliest) of the new user corrections is due to be applied. This avoids running the complete SLAM process from the beginning, thus reducing processing time of applying corrections.
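A minimal sketch of this snapshotting idea follows; the snapshot interval, the SLAM state object and the function names are assumptions invented for the example.

```python
# Illustrative sketch: sparse snapshotting of the SLAM state in a linear SLAM data
# processing loop, so that applying a user correction does not require re-running
# the complete SLAM process from the beginning.
import copy

SNAPSHOT_EVERY_N_MEASUREMENTS = 500   # sparse, to avoid overly consuming computing resources

snapshots = []                        # list of (measurement_index, slam_state) pairs

def process_dataset(slam_state, measurements, handle_measurement):
    for index, measurement in enumerate(measurements):
        if index % SNAPSHOT_EVERY_N_MEASUREMENTS == 0:
            snapshots.append((index, copy.deepcopy(slam_state)))
        handle_measurement(slam_state, measurement)

def restore_before(correction_index):
    """Return the most recent snapshot taken before the earliest new user correction is due."""
    candidates = [(i, s) for i, s in snapshots if i <= correction_index]
    index, state = candidates[-1]
    return index, copy.deepcopy(state)
```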
In an embodiment of the present invention which uses a particle filter-based SLAM implementation, a user pose correction may be applied by forcing the entire set of particles into the pose mandated by the correction, by manipulating their pose hypotheses into the target pose. The manipulation may be performed as a continuous interpolation to the target pose within a predetermined time interval in the SLAM process before the correction is due. Because this is clearly a correction which must be applied at that certain point in the SLAM process and not afterwards, particle filter-based SLAM is treated as linear SLAM with respect to user pose corrections.
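The pose interpolation mentioned above could, purely as an illustration, look like the following; the interpolation window and the data layout are assumed for the example.

```python
# Illustrative sketch: forcing the particle set into a user-corrected pose by
# continuously interpolating particle poses toward the target over a time window.
import numpy as np

def wrap(angle):
    return (angle + np.pi) % (2.0 * np.pi) - np.pi

def interpolate_particles(particles, target_pose, t, t_correction, window):
    """Blend every particle pose (x, y, theta) toward target_pose as t approaches t_correction."""
    alpha = np.clip(1.0 - (t_correction - t) / window, 0.0, 1.0)
    blended = particles + alpha * (target_pose - particles)
    # Handle the angle component separately so that wrapping is respected.
    blended[:, 2] = particles[:, 2] + alpha * wrap(target_pose[2] - particles[:, 2])
    return blended
```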
In an embodiment of the present invention which uses a graph SLAM implementation, there is a simpler avenue for applying user pose corrections than with linear SLAM. The order of insertion of constraints into the pose graph does not matter. A constraint calculated from a user pose correction may be inserted into the pose graph at any time. Afterwards, the optimization process is run to obtain a rebuilt trajectory and map. Thus, graph SLAM is retroactively correctable, easily enabling an iterative workflow without the need to re-execute the complete SLAM process. If the optimization process and rebuilding the environment map are quick enough (which is the case with the preferred embodiment that uses Google Cartographer), the user may quickly see the effect of applying the inputted corrections.
The constraint inserted during applying a user pose correction will typically tie a certain trajectory node (having a pose $c_j$) to the origin of the global map coordinate system, which will typically be aligned with the beginning of a specific, usually the first, trajectory ($c_i = c_0$). More precisely, this means that the constraints calculated from inputted user pose corrections will typically be given with respect to the first trajectory node, with the pose $c_0$. However, the user may also have the ability to provide relative pose corrections which are not in the global coordinate system, but rather tie together any two parts of the trajectory relative one to another by specifying a relative pose correction in a coordinate system tied to a trajectory node located in one of the respective trajectory parts. In other words, the user may select $c_i$ other than $c_0$.
Figure 5 schematically illustrates one possible embodiment of a map building system 100 for implementing an interactive environment map 10 building method according to the present invention. The map building system 100 includes at least one mobile device 102, a computing device 104, and optionally a dedicated control command system 105. The mobile device 102 is equipped with sensors such as one or more rangefinder sensors, one or more motion sensing devices, and a data processing unit. It is responsive to motion commands generated by the user through the control command system 105 and conveys the sensor measurements data to the computing device 104 over a communication channel 106. All collected sensor measurements data is being stored on the computing device 104, and optionally also on the data processing unit of the mobile device 102. The graphical user interface 20, displayed on a screen of the computing device 104, conveys to the user the sensor measurements data and provides for execution of the interactive environment map 10 building method according to the present invention. The graphical user interface 20 provides to the user means for gaining a high level of insight and enables performing intervention into the SLAM process by iteratively inputting corrections, using a keyboard, a mouse, a 3D mouse, or any other human interface device known to the person skilled in the art. In one embodiment, the interactive environment map building method is performed online, meaning that the environment map 10 is being built and user corrections are being inputted and applied while the mobile device 102 is moving and streaming online sensor measurements to the computing device 104 via the communications channel 106. In another embodiment, one mobile device 102 is moving in the environment, controlled by the user through the control command system 105. Optionally, the sensor measurements data is also being displayed to the user on the screen of the computing device 104, but user corrections are not being inputted during motion of the mobile device 102. Instead, the interactive map building method according to the present invention is performed offline, by processing at any later point in time on the computing device 104 the dataset containing the collected sensor measurements. The interactive map building method according to the present invention can also be performed by simultaneously processing sensor measurements recorded in several independent runs of at least one mobile device 102.
A computer-implemented method for interactively building a high-accuracy environment map 10 comprises the steps of: moving at least one mobile device 102 in the environment; collecting sensor measurements from at least one mobile device 102; loading into a SLAM implementation 306 the collected sensor measurements in the form of offline stored sensor measurement datasets ordered by timestamps or in the form of online streamed data; building an environment map 10 and at least one trajectory 11 by running a SLAM process in the SLAM implementation 306; exporting the built environment map 10 and trajectories 11 to a persistent storage 212 of a computing device 104; displaying the built environment map 10 and at least one built trajectory 11 in a graphical user interface 20; and assessing by the user if the built environment map 10 has visible defects. If the built environment map 10 has visible defects, the method further comprises: iterative inputting of user corrections in the graphical user interface 20 and applying the inputted corrections in the SLAM implementation 306; subsequently rebuilding the environment map 10 and trajectories 11, honouring the inputted user corrections and sensor measurement datasets, wherein after each inputted and applied user correction the subsequently rebuilt environment map 10 and trajectories 11 are displayed in the graphical user interface 20 for assessing by the user of visible defects; and, if the subsequently rebuilt map 10 is free of visible defects, exporting the subsequently rebuilt environment map 10 and trajectories 11 to the persistent storage 212. Assessing by the user if the built or subsequently rebuilt environment map 10, displayed in the graphical user interface 20, contains defects is performed by tools for rewinding along at least one built or subsequently rebuilt trajectory 11. Iterative inputting and applying of user corrections, in accordance with the present invention, includes inputting pose corrections 18 of trajectory nodes 13, and modifying SLAM implementation parameters or part of the SLAM process state.
Referring to Figures 6 and 7, the operations are illustrated that may be performed in order to obtain a high-accuracy environment map 10. As shown, an interactive computer-implemented method, hereinafter the method, can be performed in offline or online mode. In both cases, execution of the map building software is initiated by the user, by creating a new mapping project (block 201 ) and loading in (block 202) the sensor measurements data from at least one mobile device 102, whether offline in the form of a dataset ordered by timestamps, or online by setting up connections with at least one active mobile device 102. In the online use case, each independent active mobile device 102 streams the collected sensor data to the map building software.
The method according to the present invention provides a possibility of using an existing reference CAD 14, in which case the reference CAD is also loaded (block 202). A high-accuracy environment map 10 can be built, and the workflow enabled by the present invention can be performed even without the reference CAD 14 being provided. The reference CAD 14, if available, is displayed under/overlaid with the built environment map 10.
The sensor measurements used for creating the environment map 10 may be collected during one or several independent runs of at least one mobile device 102. The independent runs may or may not be concurrent. Each independent run is closely associated with one trajectory 11, and these two terms will be used herein interchangeably. The elements of the set of timestamped poses that makes up a trajectory 11 are called trajectory nodes 13.
Support for multiple trajectories depends on the type of the used SLAM implementation. The preferred embodiment uses a graph SLAM implementation which is capable of tracking multiple concurrent trajectories 11. However, a significant portion of the present method, described herein, is applicable to simpler single-trajectory SLAM implementations as well. Thus, the second preferred embodiment uses a single-trajectory, particle filter-based linear SLAM implementation. After loading into the SLAM implementation 306 one or more sensor measurement datasets (blocks 202; 301), the SLAM implementation 306 asynchronously starts the environment map building process (block 203) by performing operations depicted in Figure 7. More precisely, the environment map building process takes place in the background and does not block the user from performing actions in the graphical user interface 20, such as rewinding (block 206) and inputting of user corrections (block 207). The output of the environment map building process is displayed in the graphical user interface 20 as it progresses. Through the graphical user interface 20, the user is free to examine in detail the built environment map 10 and one or more built trajectories 11, as they are being built, or afterwards, by performing rewinding (block 206). Displaying the built environment map 10 in a graphical user interface 20 and rewinding along at least one built trajectory 11 enables the user to assess if the built or subsequently rebuilt environment map 10 has visible defects.
A rewinding tool provided in the graphical user interface 20 enables the user to rewind along each trajectory 11 and to perceive the sensor measurements related to the corresponding trajectory nodes 13 where the SLAM implementation 306 built the respective section of the built or subsequently rebuilt environment map 10. During rewinding, the user can rewind along each built trajectory 11 and view the poses of the tracked mobile devices 102 at each trajectory node 13. A rangefinder point cloud from the sensor measurements data which corresponds to the trajectory node 13 selected during rewinding is also displayed in the graphical user interface. The rangefinder point cloud displayed on the built environment map 10 corresponds to the mobile device 102 pose at the respective trajectory node 13. These displayed rangefinder point clouds are called herein rewind point clouds 15, and have an important role in enabling the user to input corrections (illustrated in Figs. 10 to 12). Rewinding along a built or subsequently rebuilt trajectory 11 and displaying the corresponding rewind point clouds 15 enables the user to gain a high level of insight into the environment map building process, and enables the user to identify with high precision, both temporal and spatial, which scans were incorrectly inserted into the built environment map 10.
The iterative interactive environment map building method begins with the user perceiving one or more visible defects in the built environment map 10 (block 205). Such defects may include: fracturing or blurring of object contours (illustrated in Figure 2); superfluous multiple appearances, also called ghosting, of objects in the built environment map 10 caused by incorrectly registering the same objects several times at different locations, e.g. due to improperly closed loops (illustrated in Figure 3); and misalignment with the reference CAD 14 (illustrated in Figure 4). At block 205, the user assesses visible defects in the built or subsequently rebuilt environment map 10.
By way of observing the relationship of the rewind point clouds 15 and the built or subsequently rebuilt environment map 10, the user is enabled to visually locate the built or subsequently rebuilt trajectory nodes 13 where incorrect handling of sensor measurements, imperfect sensor measurements or random environmental disturbances caused visible defects in the built environment map 10 (block 206), as for example illustrated in Figure 11. Following that, the user employs the tools from the graphical user interface 20 to input corrections which will correct the visible defects (block 207). In the preferred embodiments, we primarily use a certain type of corrections, namely trajectory node 13 pose corrections, hereinafter "pose corrections", which enable the user to force the SLAM implementation 306 to position the selected trajectory nodes 13 at given poses, exactly. However, the method of applying corrections to linear SLAM is suitable for other types of corrections as well. For example, when a tracked mobile device 102 enters a different type of environment, the user may opt to change certain SLAM parameters. This would be another type of correction, a SLAM parameter correction, applied at the trajectory node 13 that corresponds to the point of entry into the new environment.
The user may input pose corrections by aligning the incorrectly registered objects 12 e.g. walls, or certain key features visible in the rewind point cloud 15 with the rest of the built environment map 10 or with the reference CAD 14, if available. Furthermore, after inputting an initial pose correction, the user may further tune it either manually or refine it by using a pose refinement tool. Different means of enabling inputting a pose correction include: mouse drag-and-drop input of position and orientation; CAD-like numeric entry; and selecting pairs of matching points and computing the transformation between them. Figures 10 to 12 illustrate inputting of a pose correction by aligning the incorrectly registered objects 12 such as walls; namely, visually aligning the rewind point cloud 15 with the reference CAD 14, or with the rest of the built environment map 10. The result of alignment is a pose-corrected rewind point cloud 17, as illustrated in Figure 10. The pose corresponding to the pose-corrected rewind point cloud 17 is applied thereafter as a pose correction for the respective trajectory node 13. The method in accordance with present invention further comprises displaying and assessing in the graphical user interface 20 sensor measurements data which the SLAM implementation 306 used when building or subsequently rebuilding the environment map 10, the displayed sensor measurements data corresponding to trajectory nodes 13 selected during rewinding.
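To illustrate the last of the input means mentioned above (selecting pairs of matching points and computing the transformation between them), a sketch of a standard least-squares rigid alignment in 2D follows; the point values are invented, and the SVD-based solution is one common choice rather than the only admissible regression.

```python
# Illustrative sketch: computing the rigid 2D transformation (rotation + translation) that
# best aligns user-selected key points from the rewind point cloud 15 with their
# corresponding points in the environment map 10 or reference CAD 14 (least squares).
import numpy as np

def align_point_pairs(source: np.ndarray, target: np.ndarray):
    """Return (R, t) minimizing sum ||R @ source_k + t - target_k||^2 (Kabsch/Umeyama, 2D)."""
    src_mean, tgt_mean = source.mean(axis=0), target.mean(axis=0)
    src_c, tgt_c = source - src_mean, target - tgt_mean
    H = src_c.T @ tgt_c
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = tgt_mean - R @ src_mean
    return R, t

# Hypothetical corresponding key points (e.g. wall corners) picked by the user.
points_in_rewind_cloud = np.array([[0.0, 0.0], [2.0, 0.0], [2.0, 1.0]])
points_in_reference = np.array([[0.1, 0.05], [2.08, 0.15], [2.03, 1.17]])

R, t = align_point_pairs(points_in_rewind_cloud, points_in_reference)
theta = np.arctan2(R[1, 0], R[0, 0])
print(f"pose correction: translation {t}, rotation {np.degrees(theta):.2f} deg")
```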
When the user has completed inputting the correction and requested it to be applied (block 207), the inputted user correction is applied as soon as possible (blocks 208; 305) in the SLAM implementation 306; the trajectory 11 and the environment map 10 are rebuilt; the subsequently rebuilt environment map 10 and the rebuilt trajectory 11 , which honour the newly inputted correction, as well as all the previously inputted corrections and sensor measurements (blocks 208; 308) are displayed in the graphical user interface 20. The user is now able to evaluate the effect of the inputted correction by assessing the subsequently rebuilt environment map 10 to see if applying the correction has resulted in a more accurate environment map 10 with respect to the previously observed visible defects, such as map blurring, fracturing, ghosting or CAD misalignment. To perform the assessment, the user may also employ the insight-providing tools of the graphical user interface 20 such as rewinding. If the rebuilt environment map 10 is still visibly inaccurate (contains visible defects), the user can further tune the inputted correction. The correction is reverted to the stage where user input is performed (block 210), and the user can further tune the correction with additional input. If unsuccessful again, the user can choose to remove the correction altogether and try inputting another correction elsewhere, i.e. at another trajectory node 13 or in another section of the built or subsequently rebuilt environment map 10.
Therefore, the user is enabled to perform an iterative workflow where the user observes one or more visible defects in the environment map 10, inputs corrections until the respective section of the environment map 10 is visibly accurate and consistent, and moves on to the remaining visible defects, if any. The process is repeated until the user obtains an environment map 10 that is consistent and free of visible defects. Importantly, the absence of visible defects in the built or subsequently rebuilt environment map 10 also implies that the corresponding one or more trajectories 11 represent a reasonably good estimate of the true trajectories 11 travelled by at least one mobile device through the environment.
If the user deems a certain part of one or more trajectories 11, i.e. of the sensor measurement dataset, to be of too low quality or redundant, for example, if one of the built trajectories 11 already visited that area and produced the environment map 10 that is free of visible defects, the user may utilize the available editing-like facilities to cut/exclude (tool 23 in the graphical user interface 20) such redundant parts from the map building process.
In an embodiment of the present invention which uses a multi-trajectory capable SLAM implementation 306, the user may also inter-constrain the multiple trajectories 11 together in any way and at any point the user desires. Inter-constraining pairs of trajectories 11 together is achieved by allowing the user during inputting of pose corrections to select a coordinate system tied to any trajectory node 13, thus enabling inputting relative pose corrections between any two trajectories 11 at one or more trajectory nodes 13, as described earlier.
The method according to the present invention provides saving and loading of the user’s work done so far at any time as illustrated in block 204. The SLAM state and all corrections inputted so far are preserved and saved in the persistent storage 212, enabling the user to restore the complete map building project and to continue work on the same project until the rebuilt environment map 10 and the trajectory 11 are free of visible defects. The user may also load a new dataset recorded some time later, e.g. when the environment has changed, and use the available tools and the graphical user interface 20 according to the present invention to obtain an updated version of the environment map 10 that is consistent with the previously built or rebuilt environment map 10, which assumes the role of the reference CAD 14.
If the user assessed the rebuilt environment map 10 and the trajectory 11 as being free of visible defects (block 205), said rebuilt environment map 10 may be exported in the form of an occupancy grid (block 211) and saved in the persistent storage 212. The built trajectories 11 may also be exported as sets of consecutive timestamped mobile device poses.
Figure 7 shows a software flow diagram illustrating the main data processing loop in a SLAM implementation 306, as well as the extensions which enable the interactive environment map building method in accordance with an embodiment of the present invention. Block 301 illustrates performing SLAM initialization, which includes loading offline sensor measurement datasets ordered by timestamps, or setting up online communication with at least one active mobile device 102.
When all sensor measurements are loaded, the SLAM implementation 306 proceeds to build the environment map 10 and the trajectory 11 of at least one mobile device 102, as illustrated in the data processing loop consisting of blocks 302 to 308. A series of operational steps is performed until all sensor measurements data are processed (block 302), the operational steps comprising: each sensor measurement is retrieved (block 303) and handled (block 307) according to its type and originating mobile device 102. Handling of each sensor measurement may include performing scan matching, localization, building a trajectory node 13 and updating the built environment map 10, depending on the measurement type and the used SLAM implementation 306. Afterwards, at block 308 the current, i.e. updated, SLAM state, namely the currently built environment map 10, poses of at least one mobile device 102 and the corresponding trajectories 11 are displayed in the graphical user interface 20.
According to the present invention, the data processing loop in the SLAM implementation 306 is extended to support applying user corrections. As previously described in the interactive map building workflow, after assessing (blocks 205, 209) that the so far built or subsequently rebuilt environment map 10 still has visible defects, the user iteratively inputs one or more new corrections (block 207). When the user finishes inputting the corrections and requests them to be applied, the user corrections are scheduled to be applied. Blocks 304 and 305 illustrate applying due user corrections in the data processing loop of the SLAM implementation 306. Applying the due user corrections (blocks 304 and 305) is performed as soon as possible, i.e. at the beginning of each iteration of the data processing loop, before handling (block 307) the remaining sensor measurements data, if any. The resulting updated SLAM state, i.e. the subsequently rebuilt trajectories 11 and environment map 10, are afterwards displayed (blocks 208; 308) in the graphical user interface 20. Thus, the user can as soon as possible observe the effect of the inputted corrections, which facilitates a faster iterative workflow.
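A compact sketch of the extended data processing loop described above follows; the function and queue names are assumptions chosen for the example, not identifiers from any particular SLAM implementation.

```python
# Illustrative sketch: a SLAM data processing loop extended to apply due user corrections
# at the beginning of each iteration, before handling the next sensor measurement.
from collections import deque

pending_corrections = deque()     # user corrections scheduled from the graphical user interface

def data_processing_loop(slam, measurements, gui):
    for measurement in measurements:                 # until all sensor measurements are processed
        while pending_corrections:                   # apply due user corrections as soon as possible
            correction = pending_corrections.popleft()
            slam.apply_correction(correction)        # e.g. insert a pose graph constraint
        slam.handle(measurement)                     # scan matching, localization, map update
        gui.display(slam.current_map(), slam.current_trajectories())
```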
In the preferred embodiment which uses a graph SLAM implementation, the concept of pose corrections lends itself especially well into the graph SLAM-specific concept of pose constraints which make up the pose graph structure that is central to graph SLAM. A pose constraint for a certain trajectory node 13 may be visualized like a spring which pulls that trajectory node 13 towards a certain position. Rebuilding the trajectory 11 given new constraints is performed by solving the constrained pose graph optimization problem as described above, referred to herein as running an optimization process. Running the optimization process is analogous to finding a configuration of trajectory nodes 13 for which the springs are at their most relaxed position, i.e. all constraints are satisfied as much as possible. The pose graph constraints are calculated from pose corrections inputted by the user, as illustrated in Figure 13. The user pose corrections 18 were in this case given in the global map coordinate system with an origin 19.
In graph SLAM, constraints have weights (corresponding to elements of the information matrices Λ_ij in the formal description above) which control the impact of the constraints on the final optimized solution. Since it is desired that the constraints calculated from the inputted user pose corrections 18 strongly influence the optimized solution, in order to force trajectory node poses to the ones mandated by the constraints calculated from user corrections, larger weights are assigned to these constraints. In the preferred embodiment of the present invention, the assigned weights are predetermined to be equal to or larger than the weights of loop closure constraints automatically generated by the graph SLAM implementation 306, where the user is further enabled to adjust the weights assigned to pose constraints calculated from the inputted pose corrections 18. If necessary, the exact values of the assigned weights may be determined depending on the optimization process result. The user can increase the values of the assigned weights until the resulting optimized poses of the trajectory nodes 13 match the ones mandated by the inputted user corrections to a desired degree of precision, namely until the subsequently rebuilt environment map 10 is free of visible defects. Also, a graphical element such as a residual marker may be used to visualize the difference between the pose mandated by the inputted user correction and the resulting optimized trajectory node pose. In another embodiment of the present invention, applying of inputted user pose corrections 18 is performed by excluding a subset of variables from the pose graph optimization process, and subsequently rebuilding the trajectories 11 and the environment map 10 by re-running the pose graph optimization process. The variables excluded from the optimization process, i.e. the subset of global pose variables corresponding to poses of trajectory nodes 13 with pose corrections, are treated specially by having their values fixed exactly to the ones mandated by the corresponding user pose corrections 18. This is analogous to having constraints with infinite weights.
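The role of the weights can be illustrated with a deliberately simplified, one-dimensional toy problem (not the optimization actually performed by the SLAM implementation 306): constraint residuals are stacked into a weighted linear least-squares system, and increasing the weight of the user prior pulls the optimized value towards the pose mandated by the correction.

```python
import numpy as np

def optimize(user_weight):
    # Variables: x0, x1, x2 (one-dimensional "poses"); x0 is anchored at 0.
    # Constraints: odometry x1-x0=1, x2-x1=1, a conflicting loop closure
    # x2-x0=1.8, and a user prior x2=2.0 with an adjustable weight.
    rows, rhs, w = [], [], []
    rows += [[1, 0, 0]];  rhs += [0.0]; w += [1e6]          # anchor x0
    rows += [[-1, 1, 0]]; rhs += [1.0]; w += [1.0]          # odometry
    rows += [[0, -1, 1]]; rhs += [1.0]; w += [1.0]          # odometry
    rows += [[-1, 0, 1]]; rhs += [1.8]; w += [1.0]          # loop closure
    rows += [[0, 0, 1]];  rhs += [2.0]; w += [user_weight]  # user prior on x2
    # Scale each residual row by the square root of its weight and solve.
    A = np.array(rows, dtype=float) * np.sqrt(np.array(w))[:, None]
    b = np.array(rhs, dtype=float) * np.sqrt(np.array(w))
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

print(optimize(user_weight=1.0))    # x2 is pulled only partly towards 2.0
print(optimize(user_weight=100.0))  # x2 is close to 2.0, as mandated by the user
```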
In contrast to graph SLAM, linear SLAM does not lend itself naturally to applying additional pose corrections, which is required for an iterative map building workflow. Thus, in an embodiment which uses linear SLAM, the SLAM process needs to be re-run from the point where the user correction is due to be applied. When using a particle filter-based SLAM implementation, applying of user pose corrections 18 is performed by interpolating the values of particle poses into the values mandated by the user pose corrections 18 within a predetermined time interval in the SLAM process before the correction is due, where the user is further enabled to adjust the interpolation time interval.
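A minimal sketch of such an interpolation, assuming particle poses stored as an N x 3 array of (x, y, theta) values, is given below; the function name and array layout are illustrative assumptions.

```python
import numpy as np

def interpolate_particles(particle_poses, corrected_pose, t, t_start, t_end):
    """Blend particle poses towards the pose mandated by the user correction
    over the predetermined interval [t_start, t_end] preceding the correction."""
    poses = np.asarray(particle_poses, dtype=float)
    target = np.asarray(corrected_pose, dtype=float)
    alpha = float(np.clip((t - t_start) / (t_end - t_start), 0.0, 1.0))
    blended = poses.copy()
    blended[:, :2] += alpha * (target[:2] - poses[:, :2])
    # Interpolate headings along the shortest angular arc.
    dtheta = np.arctan2(np.sin(target[2] - poses[:, 2]),
                        np.cos(target[2] - poses[:, 2]))
    blended[:, 2] = poses[:, 2] + alpha * dtheta
    return blended
```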
For both the linear SLAM and the graph SLAM implementations, the time needed to rebuild the environment map 10 after applying the correction can be reduced by regularly snapshotting the state of the SLAM process and re-running the SLAM process in the SLAM implementation 306 from the latest possible snapshot before a correction is due to be applied, as described above.
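One possible snapshotting scheme is sketched below; the snapshot interval and the deep copy of the SLAM state are illustrative assumptions rather than requirements of the SLAM implementation 306.

```python
import bisect
import copy

class SnapshotStore:
    """Keep periodic snapshots of the SLAM state so that the SLAM process can
    be re-run from the latest snapshot taken before a due correction, instead
    of from the very beginning."""

    def __init__(self, every_n_measurements=100):
        self.every_n = every_n_measurements
        self.indices = []   # measurement indices at which snapshots were taken
        self.states = []    # the corresponding deep-copied SLAM states

    def maybe_snapshot(self, measurement_index, slam_state):
        """Store a snapshot every `every_n` processed measurements."""
        if measurement_index % self.every_n == 0:
            self.indices.append(measurement_index)
            self.states.append(copy.deepcopy(slam_state))

    def latest_before(self, correction_index):
        """Return (measurement_index, state) of the newest snapshot that
        precedes the point where the correction is due, or None."""
        i = bisect.bisect_left(self.indices, correction_index) - 1
        if i < 0:
            return None
        return self.indices[i], copy.deepcopy(self.states[i])
```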
Inputting of user corrections is performed exclusively through the graphical user interface 20, displayed on the screen of the computing device 104 and operably connected to the SLAM implementation 306. The graphical user interface 20 according to the present invention enables performing the computer-implemented method for interactively building a high-accuracy environment map 10 by iterative inputting and applying of user corrections in the SLAM implementation 306. The graphical user interface 20 is configured to display the built or subsequently rebuilt environment map 10 and trajectories 11 to enable the user to observe the progress of the environment map 10 building process, to rewind along the built or subsequently rebuilt trajectories 11, to iteratively input and apply corrections, to store the complete mapping project at any point, and to export the built or subsequently rebuilt environment map 10 which is assessed to be free of visible defects. The tools in the graphical user interface 20 available to the user include accurate controls for inspecting the built or subsequently rebuilt environment map 10 and trajectory 11. Furthermore, tools for accurate inputting and refining of pose corrections are available to the user. These tools work by allowing the user to align the rewind point cloud 15 so that the visible defects in the environment map 10 are removed.
The graphical user interface 20 for implementing an interactive environment map building method comprises, as schematically illustrated in Figure 9, a zoomable and pannable main map window 400 configured to display the built environment map 10 and trajectories 11, the current pose and the currently observed point cloud of at least one mobile device 102, a zoomable and scrollable trajectories timeline window 405, and a user corrections management window 403, wherein the graphical user interface is configured to provide tools for manipulating the viewport of the main map window 400 such as zooming and panning, tools for rewinding along a selected built or subsequently rebuilt trajectory 11, and tools for iterative inputting and applying of user corrections. Further, the graphical user interface 20 comprises a sensor measurement datasets window 401 configured to list the loaded sensor measurement datasets when building the environment map 10 is performed offline, an active devices window 402 configured to list at least one tracked active mobile device 102 when building the environment map 10 is performed online, and a tool window 404 providing saving and loading of the map building project data into/from a file on the persistent storage 212. Each of these windows is maintained, but not all are necessarily displayed in the graphical user interface 20.
The graphical user interface 20 is structured to provide a “map building project”-type interface. The user may load one or more sensor measurement datasets or active devices into the map building project, and perform the interactive environment map 10 building method according to the present invention. The user corrections management window 403 enables the user to perform actions such as inputting, applying, removing, editing, enabling and disabling user corrections. The inputted corrections may also be displayed in the trajectories timeline window 405 by highlighting the keyframe points corresponding to the respective trajectory nodes 13. Further, during rewinding, when a trajectory node 13 is selected, its current pose is displayed in the main map window 400, and the selected trajectory node 13 may be highlighted in the trajectories timeline window 405 as well.
The graphical user interface 20 displays the sensor measurement datasets window 401, which lists the loaded offline sensor measurement datasets, or the active mobile devices window 402, which lists the tracked active mobile devices 102 when the environment map 10 building is performed online. The main map window 400 of the graphical user interface 20 displays the built or subsequently rebuilt environment map 10 and trajectories 11, the current pose and the currently observed point cloud of at least one mobile device 102, the rewind point cloud 15 and the reference CAD 14, if available. The reference CAD 14 may be displayed under or overlaid with the built or subsequently rebuilt environment map 10, which enables the user to observe and evaluate the alignment of the built or subsequently rebuilt environment map 10 with the reference CAD 14, and to identify sections of the built or subsequently rebuilt environment map 10 which need further corrections.
Means provided in the graphical user interface 20 for rewinding along a selected built or subsequently rebuilt trajectory 11 include tools for scrubbing, or for directly selecting a point in the timeline, keyframe points or other graphical elements which correspond to trajectory nodes 13, user corrections, or timestamped sensor measurements. The trajectories timeline window 405 is configured to display keyframe points corresponding to built trajectory nodes 13 and graphical elements corresponding to inputted user corrections or timestamped sensor measurements. The trajectories timeline window 405 includes a graphical playhead element tool 22 configured to enable rewinding along a selected trajectory 11 and controlling the position in the timeline by the action of dragging using the mouse or another human interface device. The trajectories timeline window 405 may further include: a set of buttons which control the position in the timeline, akin to a conventional media player interface; tools for selecting a point in the timeline by directly entering a numeric timecode or a numeric trajectory node ID; and a tool 23 for excluding parts of the sensor measurement dataset, e.g. by cutting, akin to video or audio editing. If multiple trajectories 11 are built, they are displayed analogously to standard multi-track displays in video or audio editing interfaces. Additional means of rewinding through time are keys on the computer keyboard (for example, arrow keys), a swiping gesture on a touch screen, or a similar human interface device.
Additionally, an alternative tool is provided for selecting a trajectory node 13 and the corresponding point in the timeline by clicking or dragging the mouse on the displayed built or subsequently rebuilt trajectories 11 in the main map window 400. The implementation of such a tool is slightly more involved because it requires locating the trajectory node 13 which is closest to the clicked point, for which an efficient point search data structure known to the person skilled in the art, such as a KD-tree or an Octree, should be used.
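A minimal sketch of such a lookup, here using the cKDTree structure from scipy as one possible point search data structure, is given below; the node layout is an illustrative assumption.

```python
import numpy as np
from scipy.spatial import cKDTree

def build_node_index(node_positions):
    """node_positions: (N, 2) array of trajectory node positions in the map frame."""
    return cKDTree(np.asarray(node_positions, dtype=float))

def node_nearest_to_click(tree, click_xy):
    """Return the index of the trajectory node closest to the clicked point."""
    _, index = tree.query(np.asarray(click_xy, dtype=float))
    return int(index)
```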
As the user rewinds through time and selects a point in the timeline, the graphical user interface 20 displays the mobile device 102 pose of the corresponding node 13 of the selected trajectory 11, as built by the SLAM implementation 306, and shows the point cloud observed by the rangefinder sensor at that selected point in time, namely the rewind point cloud 15. The rewind point cloud 15 is displayed on the built or subsequently rebuilt environment map 10 in accordance with the pose of the corresponding built or subsequently rebuilt trajectory node 13.
Tools of the graphical user interface 20 for inputting of user pose corrections and for further tuning of the inputted corrections include a translation marker 16 and a rotation marker 21, illustrated in Figures 10 to 12. Said markers are displayed in the main map window 400 and are configured to be dragged with the mouse or another human interface device. The main map window 400 is configured to display a pose-corrected rewind point cloud 17. The pose-corrected rewind point cloud 17 is translated and rotated in accordance with the pose of the dragged markers 16 and 21 which correspond to the pose of the mobile device 102. The translation marker 16 and the rotation marker 21 enable accurate and precise alignment of the displayed rewind point clouds 15 with the reference CAD 14, if available, or with the built or subsequently rebuilt environment map 10. The user may input a pose correction by moving the translation marker 16 and the rotation marker 21 which represent the pose of the mobile device 102 associated with the trajectory node 13 selected during rewinding. The user may also specify a pivot point for the rotation, which facilitates aligning key points in the point cloud.
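The geometric effect of dragging the markers, including rotation about a user-specified pivot point, can be sketched as follows; the function is an illustrative stand-in for the rendering of the pose-corrected rewind point cloud 17, not an excerpt of the graphical user interface 20.

```python
import numpy as np

def transform_point_cloud(points, translation, angle, pivot):
    """Rotate `points` ((N, 2) array) by `angle` radians about `pivot`,
    then translate by `translation`, yielding the pose-corrected point cloud."""
    points = np.asarray(points, dtype=float)
    pivot = np.asarray(pivot, dtype=float)
    c, s = np.cos(angle), np.sin(angle)
    R = np.array([[c, -s], [s, c]])
    rotated = (points - pivot) @ R.T + pivot
    return rotated + np.asarray(translation, dtype=float)
```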
Another way of enabling the user to input the pose correction is to allow them to specify pairs of corresponding points, e.g. two key points from the rewind point cloud 15, and their corresponding key points in the built or subsequently rebuilt environment map 10 or in the reference CAD 14, if available. Subsequently, a transformation is computed which aligns the rewind point cloud 15 by matching the specified pairs of corresponding points, and used for the inputted pose correction. In case that the user specifies more than two pairs of corresponding points, thus overconstraining the problem of finding the transformation, a least squares solution may be used, or a more sophisticated regression which compensates for outliers, known to the person skilled in the art.
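A minimal sketch of such a least-squares fit for two or more pairs of corresponding 2-D points, using the well-known SVD-based solution and omitting the outlier-compensating regression, is given below; the function name is an illustrative assumption.

```python
import numpy as np

def fit_rigid_transform(source_pts, target_pts):
    """Return (R, t) such that R @ source + t best matches target in the
    least-squares sense; both inputs are (N, 2) arrays with N >= 2."""
    src = np.asarray(source_pts, dtype=float)
    dst = np.asarray(target_pts, dtype=float)
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    # Cross-covariance of the centred point sets, solved via SVD.
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t
```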
The user is further provided with a “pose refinement” tool. The user may input an initial pose correction by roughly aligning the rewind point cloud 15 with the built environment map 10 or with the reference CAD 14, if available. The user may subsequently use the pose refinement tool, which takes the initially inputted user pose correction and performs scan matching, the output of which is a refined pose. The user may further tune the pose correction before confirming the final input. The pose refinement tool thus facilitates fast and precise alignment of the displayed rewind point clouds 15 with the built or subsequently rebuilt environment map 10 or with the reference CAD 14, if available.
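As an illustration only, a very basic point-to-point ICP loop is sketched below as a possible stand-in for the scan matching step; an embodiment would typically reuse the scan matcher exposed by the SLAM implementation 306, so the sketch merely indicates the kind of refinement performed.

```python
import numpy as np
from scipy.spatial import cKDTree

def refine_pose(scan_pts, map_pts, iterations=20):
    """Iteratively align scan_pts ((N, 2), already placed at the user's rough
    correction) to map_pts ((M, 2)); returns the refined points and the
    accumulated rigid transform (R, t)."""
    map_pts = np.asarray(map_pts, dtype=float)
    tree = cKDTree(map_pts)
    pts = np.asarray(scan_pts, dtype=float)
    R_total, t_total = np.eye(2), np.zeros(2)
    for _ in range(iterations):
        _, idx = tree.query(pts)           # nearest map point for each scan point
        matched = map_pts[idx]
        # SVD-based rigid fit of the scan points onto their matches.
        pc, mc = pts.mean(axis=0), matched.mean(axis=0)
        U, _, Vt = np.linalg.svd((pts - pc).T @ (matched - mc))
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:
            Vt[-1, :] *= -1
            R = Vt.T @ U.T
        t = mc - R @ pc
        pts = pts @ R.T + t
        # Compose the incremental transform with the accumulated one.
        R_total, t_total = R @ R_total, R @ t_total + t
    return pts, (R_total, t_total)
```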
After the user inputs the correction, for example, a pose correction using the aforementioned tools for accurate and precise alignment of the displayed rewind point clouds 15, it is applied as soon as the execution of the SLAM process permits, as described earlier. This enables the user to see in the graphical user interface 20 the effects of the inputted correction on the subsequently rebuilt environment map 10 as soon as possible, wherein the delay of displaying to the user depends on the type of the used SLAM implementation 306.
Figure 10 illustrates the graphical user interface 20 where the trajectories timeline window 405 displays two trajectories 11, namely Trajectory 0 and Trajectory 1. The two trajectory nodes 13 on Trajectory 0 where corrections were applied are highlighted, and the corresponding corrections are displayed in the user corrections management window 403. Further, Trajectory 1 is selected as the trajectory along which rewinding is performed, and the vertical playhead line passes through the highlighted keyframe point corresponding to the node that has been selected by rewinding. The mobile device pose of the selected trajectory node 13 is also displayed in the main map window 400, and the user is inputting a pose correction for that trajectory node 13 by dragging the markers 16 and 21, as described above.
To facilitate observing the progress of the SLAM process in the graphical user interface 20, extensions to a SLAM implementation according to the present invention include exposing all sensor measurements data, including all point clouds from the rangefinder sensor measurements. The exposed offline sensor measurement datasets are listed in the graphical user interface 20 in the sensor measurement datasets window 401. Point clouds from the rangefinder sensor measurements are exposed for the purpose of enabling in the graphical user interface 20 rewinding along the built or subsequently rebuilt trajectories 11 , and displaying the rewind point clouds 15. The timestamps, types and content of exposed sensor measurements may also be displayed in the graphical user interface 20, e.g. in the trajectories timeline window 405 or in the main map window 400. Additionally, extensions to a SLAM implementation may further include exposing a scan matcher, if available, which enables the pose refinement functionality. Depending on the SLAM implementation, if such a scan matcher is not available, an alternative one may be provided and used in the graphical user interface 20 for the pose refinement functionality.
In order to facilitate alignment of the resulting built or subsequently rebuilt environment map 10 with the reference CAD 14, the SLAM implementation may be extended to perform automatic alignment with the reference CAD during the SLAM process.
Some or all of the method steps may be executed by (or using) a hardware apparatus, like for example, a microprocessor, a programmable computing device or an electronic circuit. In some embodiments, one or more of the method steps may be executed by the mobile device 102. In the preferred embodiments of the invention, the method described herein is processor-implemented or computer-implemented.
Depending on certain implementation requirements, embodiments of the invention can be implemented in hardware or in software. The implementation can be performed using a non-transitory storage medium.
Generally, embodiments of the present invention can be implemented as a computer program product with a program code, the program code being operative for performing the method when the computer program product runs on a computing device. The program code may, for example, be stored on a machine-readable carrier.
In other words, an embodiment of the inventive method is, therefore, a computer program product having a program code for performing the method described herein, when the computer program runs on a computing device.
A further embodiment comprises a processing means, for example, a computing device or a programmable logic device, programmed to, configured to, or adapted to, perform the method described herein.
A further embodiment comprises a computer having installed thereon the computer program product for performing the method described herein.
It is clear to those skilled in the art that a method and a graphical user interface explained and described in the present application may be implemented by means of respective computing devices or one or more processors configured and/or programmed to obtain the functionality described.
The above described embodiments are merely illustrative for the principles of the present invention. It is understood that modifications and variations of the arrangements and the details described herein will be apparent to others skilled in the art. It is the intent, therefore, to be limited only by the scope of the impending patent claims and not by the specific details presented by way of description and explanation of the embodiments herein.

Claims

1. A computer-implemented method for interactively building a high-accuracy environment map (10), comprising the steps of:
moving at least one mobile device (102) in the environment; collecting sensor measurements from at least one mobile device (102);
loading into a SLAM implementation (306) the collected sensor measurements in the form of offline stored sensor measurement datasets ordered by timestamps or in the form of online streamed data;
building an environment map (10) and at least one trajectory (11 ) by running a SLAM process in the SLAM implementation (306);
exporting the built environment map (10) and trajectories (11 ) to a persistent storage (212) of a computing device (104);
displaying the built environment map (10) and at least one built trajectory (11) in a graphical user interface (20);
assessing by the user if the built environment map (10) has visible defects, characterized in that the method comprises:
- iterative inputting of user corrections in the graphical user interface (20) and applying the inputted corrections in the SLAM implementation (306);
- subsequently rebuilding the environment map (10) and trajectories (11), honouring the inputted user corrections and sensor measurement datasets, wherein after each inputted and applied user correction the subsequently rebuilt environment map (10) and trajectories (11) are displayed in the graphical user interface (20) for assessing by the user of visible defects; and
- exporting the subsequently rebuilt environment map (10) and trajectories (11 ) to the persistent storage (212).
2. The computer-implemented method of claim 1, wherein assessing by the user of the built or subsequently rebuilt environment map (10) in the graphical user interface (20) is performed by tools for rewinding along at least one built or subsequently rebuilt trajectory (11).
3. The method of claim 2, further comprising displaying and assessing rewind point clouds (15) in the graphical user interface (20), the rewind point clouds being rangefinder point clouds from the sensor measurements data which correspond to trajectory nodes (13) selected during rewinding.
4. The computer-implemented method of claims 1 to 3, further comprising displaying and assessing in the graphical user interface (20) sensor measurements data which the SLAM implementation (306) used when building or subsequently rebuilding the environment map (10), the displayed sensor measurements data corresponding to trajectory nodes (13) selected during rewinding.
5. The computer-implemented method of claim 1 , wherein iterative inputting and applying of user corrections includes inputting pose corrections (18) of trajectory nodes (13).
6. The computer-implemented method according to claim 5, wherein iterative inputting and applying of user corrections further includes modifying the SLAM implementation parameters or state of the SLAM process.
7. The computer-implemented method of claim 6, comprising re-running the SLAM process in the SLAM implementation (306) from the beginning and applying user corrections exactly when the corresponding trajectory nodes (13) are being built or the corresponding sensor measurements are being handled.
8. The computer-implemented method of claim 7, further comprising regularly snapshotting the state of the SLAM process and re-running the SLAM process in the SLAM implementation (306) from the latest possible snapshot before the user corrections are due to be applied.
9. The computer-implemented method of claim 1, comprising fusing seamlessly all sensor measurements collected from at least one mobile device (102) into a single built environment map (10), where each independent run of one mobile device (102) corresponds to one built trajectory (11) and each built trajectory (11) is displayed within the built environment map (10) in the graphical user interface (20).
10. The computer-implemented method of claims 5 and 9, further comprising inter-constraining pairs of trajectories (11) together at one or more trajectory nodes (13) by inputting and applying relative pose corrections.
11. The computer-implemented method according to any of claims 1 to 10, wherein the SLAM implementation (306) is a graph SLAM implementation.
12. The computer-implemented method of claims 1 to 8, wherein the SLAM implementation (306) is a linear particle filter SLAM implementation.
13. The computer-implemented method of claim 11, wherein applying of inputted user pose corrections (18) is performed by calculating pose constraints from said pose corrections (18), inserting the calculated pose constraints into a pose graph, and rebuilding the trajectories (11) and the environment map (10) by re-running a pose graph optimization process.
14. The computer-implemented method of claim 13, wherein weights assigned to the calculated pose constraints are predetermined to be equal to or larger than the weights of loop closure constraints automatically generated by the graph SLAM implementation (306), where the user is further enabled to adjust the weights assigned to pose constraints calculated from the inputted pose corrections (18).
15. The computer-implemented method of claim 11 , wherein applying of inputted user pose corrections (18) is performed by excluding from a pose graph optimization process a subset of variables corresponding to global poses of trajectory nodes (13) and fixing the values of the excluded variables exactly to ones mandated by the corresponding user pose corrections (18), and subsequently rebuilding the trajectories (11 ) and the environment map (10) by re-running the pose graph optimization process.
16. The computer-implemented method of claim 12, wherein applying of user pose corrections is performed by interpolating the values of particle poses into values mandated by the user pose corrections within a predetermined time interval in the SLAM process before the correction is due, where the user is further enabled to adjust the interpolation time interval.
17. The computer-implemented method of any of preceding claims, wherein multiple SLAM implementations (306) are available to be used, where the user selects the used SLAM implementation.
18. The computer-implemented method of any of preceding claims, further comprising assessing by the user of the built or subsequently rebuilt environment map (10) with respect to a reference CAD (14), if available, where the reference CAD (14) is displayed under or overlaid with the built or subsequently rebuilt environment map (10) in the graphical user interface (20).
19. The computer-implemented method of claim 18, wherein the SLAM implementation (306) is extended for performing automatic alignment with the reference CAD (14) during the SLAM process.
20. The computer-implemented method of any of preceding claims, further comprising loading into the SLAM implementation (306) additional sensor measurements collected in a changed environment, building and exporting an updated environment map (10), where the updated environment map (10) is consistent with the previously built or rebuilt environment map (10) which assumes the role of the reference CAD (14).
21. A graphical user interface (20) for interactively building a high-accuracy environment map (10), comprising a zoomable and pannable main map window (400) configured to display the built environment map (10) and trajectories (11), the current pose and the currently observed point cloud of at least one mobile device (102), characterized in that the user interface (20) further comprises:
- a user corrections management window (403), and
- a zoomable/scrollable trajectories timeline window (405),
wherein the user interface (20) is configured to provide tools for rewinding along any built or subsequently rebuilt trajectory (11) selected by the user, and tools for iteratively inputting user corrections.
22. The graphical user interface (20) according to claim 21 , wherein the user interface (20) further comprises a sensor measurement datasets window (401 ) configured to list the loaded sensor measurement datasets when building the environment map (10) is performed offline; an active devices window (402) configured to list at least one tracked active mobile device (102) when building the environment map (10) is performed online; and a tool window (404) providing saving and loading the map building project into/from a file on the persistent storage (212).
23. The graphical user interface (20) according to claim 21 , wherein the main map window (400) is configured to display the built or subsequently rebuilt environment map (10) and trajectories (11 ), the rewind point cloud (15) and the reference CAD (14), if available.
24. The graphical user interface (20) according to claim 21, wherein the user corrections management window (403) is configured to enable the user to manage the inputted user corrections by performing actions such as adding, removing, editing, enabling and disabling user corrections.
25. The graphical user interface (20) according to claim 21 , wherein the trajectories timeline window (405) is configured to display keyframe points corresponding to built trajectory nodes (13) and graphical elements corresponding to inputted user corrections or timestamped sensor measurements.
26. The graphical user interface (20) according to claims 21 and 25, wherein the trajectories timeline window (405) comprises a graphical playhead element (22) configured to enable rewinding along a selected trajectory (11 ) and controlling the position in the timeline by the action of dragging using the mouse or another human interface device.
27. The graphical user interface (20) according to claim 26, wherein the trajectories timeline window (405) further includes a tool (23) for excluding parts of the loaded sensor measurement datasets from the map building process, and additional means for controlling the position in the trajectory timeline, the additional means comprising a set of standard media player control buttons, tools for direct entry of a numeric timecode or trajectory node ID, and tools for rewinding using the keyboard keys.
28. The graphical user interface (20) according to claim 21 , wherein the main map window (400) includes tools for selecting the corresponding point in the timeline by clicking on the corresponding trajectory node (13).
29. The graphical user interface (20) according to claim 21 , wherein the tools for inputting of user pose corrections include markers which correspond to the pose of the mobile device (102), the markers consisting of a translation marker (16) and a rotation marker (21 ), wherein said markers are displayed in the main map window (400) and are configured to be dragged with the mouse or another human interface device.
30. The graphical user interface (20) according to claim 29, wherein a pose-corrected rewind point cloud (17) is displayed in the main map window (400), where the pose-corrected rewind point cloud (17) is translated and rotated in accordance with the pose of the dragged markers (16) and (21 ) which correspond to the pose of the mobile device (102).
31. The graphical user interface (20) according to claims 21 and 29, wherein tools for inputting of user pose corrections in the main map window (400) further include: a tool for specifying pairs of corresponding points configured to compute a transformation which aligns the rewind point cloud (15) by matching the specified pairs of corresponding points, one point in a pair being in the rewind point cloud (15) and the other in the built or subsequently rebuilt environment map (10) or in the reference CAD (14); and a pose refinement tool, which refines the inputted user correction by performing scan matching.
32. A computer program product with computer-executable program instructions configured for execution by a computing device (104), which when executed by the computing device (104) with a screen displaying a graphical user interface (20) for interactively building a high-accuracy environment map (10) causes the computing device (104) to perform the method of any of claims 1 to 20.
EP17835936.0A 2017-12-21 2017-12-21 Interactive computer-implemented method, graphical user interface and computer program product for building a high-accuracy environment map Withdrawn EP3729225A1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/HR2017/000019 WO2019122939A1 (en) 2017-12-21 2017-12-21 Interactive computer-implemented method, graphical user interface and computer program product for building a high-accuracy environment map

Publications (1)

Publication Number Publication Date
EP3729225A1 true EP3729225A1 (en) 2020-10-28

Family

ID=61074467

Family Applications (1)

Application Number Title Priority Date Filing Date
EP17835936.0A Withdrawn EP3729225A1 (en) 2017-12-21 2017-12-21 Interactive computer-implemented method, graphical user interface and computer program product for building a high-accuracy environment map

Country Status (2)

Country Link
EP (1) EP3729225A1 (en)
WO (1) WO2019122939A1 (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110427027B (en) * 2019-07-18 2022-09-02 浙江吉利汽车研究院有限公司 Navigation route generation method and device for automatic driving and automatic driving system
CN111009036B (en) * 2019-12-10 2023-11-21 北京歌尔泰克科技有限公司 Grid map correction method and device in synchronous positioning and map construction
CN111551953A (en) * 2020-05-06 2020-08-18 天津博诺智创机器人技术有限公司 Indoor map construction optimization method based on SLAM algorithm
CN114253511A (en) * 2020-09-21 2022-03-29 成都睿芯行科技有限公司 SLAM hardware accelerator based on laser radar and implementation method thereof
CN112241002B (en) * 2020-10-11 2022-10-18 西北工业大学 Novel robust closed-loop detection method based on Karto SLAM
WO2022143713A1 (en) * 2020-12-31 2022-07-07 杭州海康机器人技术有限公司 V-slam map verification method and apparatus, and device
CN112837241A (en) * 2021-02-09 2021-05-25 贵州京邦达供应链科技有限公司 Method and device for removing image-building ghost and storage medium
CN113031620A (en) * 2021-03-19 2021-06-25 成都河狸智能科技有限责任公司 Robot complex environment positioning method
CN113763551B (en) * 2021-09-08 2023-10-27 北京易航远智科技有限公司 Rapid repositioning method for large-scale map building scene based on point cloud
CN115727854A (en) * 2022-11-28 2023-03-03 同济大学 VSLAM positioning method based on BIM structure information

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
ATE510248T1 (en) 2005-12-28 2011-06-15 Abb Research Ltd MOBILE ROBOT
US8577538B2 (en) * 2006-07-14 2013-11-05 Irobot Corporation Method and system for controlling a remote vehicle
JP5304128B2 (en) * 2008-09-16 2013-10-02 村田機械株式会社 Environmental map correction device and autonomous mobile device
US8510041B1 (en) * 2011-05-02 2013-08-13 Google Inc. Automatic correction of trajectory data
DE102013211126A1 (en) * 2013-06-14 2014-12-18 Robert Bosch Gmbh Method for modeling an environment of a vehicle
JP6506279B2 (en) * 2014-06-16 2019-04-24 株式会社日立製作所 Map generation system and map generation method

Also Published As

Publication number Publication date
WO2019122939A1 (en) 2019-06-27


Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20200624

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20201104

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20210316