EP3729225A1 - Interactive computer-implemented method, graphical user interface and computer program product for building a high-accuracy environment map - Google Patents
- Publication number
- EP3729225A1 (application EP17835936.0A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- environment map
- corrections
- pose
- user
- user interface
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0268—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
- G05D1/0274—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
- G01C21/206—Instruments for performing navigational calculations specially adapted for indoor navigation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/40—Correcting position, velocity or attitude
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0011—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
- G05D1/0044—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with a computer generated representation of the environment of the vehicle, e.g. virtual reality, maps
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
Definitions
- the present invention generally relates to an interactive computer-implemented method, a graphical user interface and a computer program product for building a high-accuracy environment map.
- Such high-accuracy environment maps may be used for autonomous vehicle navigation, especially in GPS-denied indoor environments.
- SLAM stands for Simultaneous Localization and Mapping.
- mapping inaccuracies that accumulate during the mapping procedure can be reduced when an area is re-visited, which is called loop closure.
- significant inconsistencies can still occur in the built map.
- the accuracy of the used mapping procedure is crucial for precise positioning of autonomous mobile apparatus using the previously built map for localization. For example, the typical positioning precision required for the operation of autonomous forklifts (loading and unloading of pallets in automated warehouses) must be below 1 cm and 0.5°.
- the mapping procedure can sometimes produce defects such as fracturing or blurring of object contours, or superfluous multiple appearances of objects caused by incorrectly registering the same objects several times at different locations. These defects cannot be corrected even by using different local warpings.
- the present invention allows for storage and retrieval of all sensor measurements data and user corrections necessary to fully reconstruct the complete environment map building process, i.e. the SLAM process.
- This enables long-term iterative map adjustments which can be performed occasionally or periodically and thus account for persistent changes in the environment.
- a computer-implemented method according to the present invention allows for iterative map adjustments which can be performed by applying user corrections at any arbitrary point on a mobile device trajectory.
- subsequent corrections do not need to be ordered chronologically, i.e., when applying corrections, the user can freely rewind forwards and backwards "in time" along the trajectory, employing a graphical user interface according to an embodiment of the present invention.
- a computer-implemented method according to the present invention presents in detail a nontrivial method of applying user pose corrections with respect to the whole trajectory for modern SLAM implementations (for both graph and non-graph-based SLAM implementations). Furthermore, a computer-implemented method according to the present invention is not restricted to one independent run of only one mobile device, or even to any particular kind of a single autonomously moving device or apparatus.
- the invention described herein provides specific advantages over prior art. Appropriate map alignment and correction of defects are achieved by iteratively inputting corrections during the SLAM process. This may also be facilitated by automatically aligning the rangefinder sensor measurements, e.g. from a laser scanner, with for example a reference CAD during the SLAM process, although the user may perform further tuning in case the automatic alignment with the reference CAD produces unsatisfactory results.
- An object of the present invention is to provide an interactive computer-implemented method and an intuitive, easy-to-use graphical user interface enabling users without expert knowledge about SLAM to tune the final output, i.e. the built environment map and trajectories, by intervention into the SLAM process, i.e. the execution of the SLAM algorithm in the used SLAM implementation.
- Another object of the present invention is to provide an interactive computer-implemented method by which the environment map building process i.e. the SLAM process is not restricted to data collected from a single mobile device. Sensor measurements collected from several mobile devices, or from several independent runs of a single mobile device, or any combination thereof can be fused seamlessly into a single environment map, and the multiple built trajectories may be inter-constrained together in any way and at any point the user desires. This is achieved in the preferred embodiment by employing and extending an existing graph SLAM implementation capable of handling multiple trajectories (i.e. tracking multiple mobile devices simultaneously).
- the present method enables the use of a reference CAD, if available, as prior information in the SLAM process. Further, the present method allows for assisted inputting of user corrections by means of scan matching, making initial inputting of user corrections fast and accurate. User corrections are integrated directly into the environment map building process, allowing mapping accuracy to be increased at various deliberately selected points in the environment to a level where the built environment map is free of visible defects. The final result is a high-accuracy map of the environment, adequate for localization in industrial scenarios, with an accuracy better than 1 cm and 0.5°.
- a task of building an environment map comprises collecting sensor measurements while a mobile device is moving throughout the environment, and processing the collected sensor measurements in order to obtain the final result, the built environment map, as well as one or more mobile device trajectories.
- the operator employs at least one mobile device equipped with the following sensors: one or more rangefinder sensors (e.g., a laser scanner - LIDAR), and optionally one or more motion sensing devices (e.g. an odometry sensor such as wheel encoders, or an inertial measurement unit - IMU).
- a mobile device is a mobile robot, equipped with one or more rangefinder sensors, and optionally one or more motion sensing devices and a data processing unit.
- the mobile robot may be steered through the environment by an operator or driven autonomously.
- a mobile device is a portable device such as a backpack, carried by the user, equipped with sensors such as one or more rangefinder sensors, one or more motion sensing devices and a data processing unit.
- the map building software takes the sensor measurements collected by at least one mobile device and performs the SLAM process (Simultaneous Localization and Mapping), the output of which is a built map of the environment (usually in the form of an occupancy grid, i.e. an array of numbers representing the probability of the respective discrete map element (cell) being an obstacle or free space), and one or more built trajectories.
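The occupancy-grid output described above can be illustrated with a minimal Python sketch (the patent specifies no code; `mark_scan` and all parameter names are hypothetical, and cells are simply set to probability 1.0 for brevity rather than updated probabilistically):

```python
import math

def mark_scan(grid, pose, ranges, angle_step, resolution):
    """Mark rangefinder returns in an occupancy grid.

    Each grid cell stores the probability of that discrete map element
    being an obstacle; here a hit simply sets the cell to 1.0.
    pose is (x, y, theta); ranges are distances of consecutive beams,
    angle_step radians apart, starting at the pose orientation.
    """
    x, y, theta = pose
    for i, r in enumerate(ranges):
        a = theta + i * angle_step
        # convert the beam endpoint from metric to grid coordinates
        cx = int((x + r * math.cos(a)) / resolution)
        cy = int((y + r * math.sin(a)) / resolution)
        if 0 <= cy < len(grid) and 0 <= cx < len(grid[0]):
            grid[cy][cx] = 1.0
    return grid
```

A real implementation would also trace the free space along each beam and fuse repeated observations; this sketch only shows the grid structure itself.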
- Each trajectory is a set of consecutive timestamped mobile device poses (2D or 3D).
- a 2D pose consists of position, commonly expressed using planar coordinates (x, y), and orientation, expressed as a yaw angle θ.
- a 3D pose consists of position, commonly expressed using 3D coordinates (x, y, z), and orientation, commonly expressed using Euler angles, a rotation matrix, or quaternions.
- Another common pose representation (both for 2D and 3D) is a homogeneous transformation matrix, which includes both position and orientation.
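The 2D pose representations above can be sketched in Python; `pose_to_matrix` and `matrix_to_pose` are hypothetical helper names, not part of the patent:

```python
import math

def pose_to_matrix(x, y, theta):
    """Convert a 2D pose (x, y, theta) into a 3x3 homogeneous
    transformation matrix combining rotation and translation."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, x],
            [s,  c, y],
            [0,  0, 1]]

def matrix_to_pose(m):
    """Recover (x, y, theta) from a homogeneous transformation matrix;
    atan2 of the rotation block yields the yaw angle."""
    return m[0][2], m[1][2], math.atan2(m[1][0], m[0][0])
```

Composing two such matrices by multiplication chains the corresponding poses, which is why this representation is convenient for trajectories.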
- a computer-implemented method according to the present invention is envisioned to be used with an arbitrary SLAM implementation.
- the proposed interactive computer-implemented method and a graphical user interface enable user intervention into SLAM process by allowing the user to input corrections, for the purpose of obtaining a highly accurate, consistent and defect-free map of the environment.
- the reference CAD may be used to guide the SLAM process, and the resulting environment map will be aligned thereto.
- a graphical user interface according to the present invention allows the user to observe the progress of the SLAM process, to rewind forwards and backwards in time along built trajectories, to observe and assess collected sensor measurements, and to iteratively input corrections and view the effects thereof.
- a graphical user interface according to the present invention thus enables the user to gain a high level of insight into the SLAM process, i.e. the SLAM algorithm execution, aiding them in diagnosing sensor and SLAM configuration/tuning issues.
- Instructions for performing the aforementioned method and a graphical user interface may be included in a computer program product configured for execution by a computing device 104; when executed by a computing device having a screen displaying the graphical user interface for interactively building a high-accuracy environment map, the computer program product causes the computing device to perform the computer-implemented method in accordance with the present invention.
- Figure 1 shows an environment map illustrating an example of failure of a SLAM implementation to perform loop closure successfully and thus to eliminate accumulated errors in the environment map building process.
- Figure 2 shows one example of an environment map defect such as fracturing and blurring of object contours in an environment map building process.
- Figure 3 shows another example of an environment map defect such as superfluous double appearance i.e. ghosting of objects caused by incorrectly registering the same objects two times, at different locations, in an environment map building process.
- Figure 4 shows yet another example of an environment map defect such as misalignment with a reference CAD in an environment map building process.
- Figure 5 schematically illustrates an example of a mobile device system for implementing an interactive environment map building method, in accordance with an embodiment of the present invention.
- Figure 6 shows an interactive environment map building workflow, in accordance with an embodiment of the present invention.
- Figure 7 shows a software flow diagram illustrating a main data processing loop in a SLAM implementation, as well as the extensions which enable an interactive environment map building method in accordance with an embodiment of the present invention.
- Figure 8 shows an example of an environment map free of visual defects, built using an interactive computer-implemented environment map building method, in accordance with an embodiment of the present invention.
- Figure 9 schematically illustrates a graphical user interface enabling implementation of an interactive computer-implemented environment map building method, in accordance with an embodiment of the present invention.
- Figure 10 illustrates a graphical user interface enabling the user to observe an environment map being built, to inspect and gain a high level of insight into the SLAM process, and to input corrections, in accordance with an embodiment of the present invention.
- Figure 11 shows an enlarged view of a part of a main map window illustrating a section of a built environment map, a trajectory, a translation and a rotation marker, and a rewind point cloud, in accordance with an embodiment of the present invention.
- Figure 12 shows an enlarged view of a part of a main map window from Figure 11 illustrating a pose-corrected rewind point cloud during inputting of a user pose correction, in accordance with an embodiment of the present invention.
- Figure 13 illustrates applied user pose corrections, and a rebuilt trajectory honoring the inputted pose corrections, in accordance with an embodiment of the present invention.
- the expression "at least one mobile device", as used in the present patent application and patent claims, shall include independent runs of several mobile devices, or several independent runs of a single mobile device, or any combination thereof.
- user correction shall include any kind of user intervention into the SLAM process, i.e. execution of the SLAM algorithm.
- a user correction is typically given at a certain time in the sensor measurement dataset or at a certain trajectory node, and may include modifying any parameter of the SLAM implementation or part of the SLAM process state.
- the effect of applying a user pose correction in the SLAM process is to force the pose of the desired trajectory node in the rebuilt trajectory to the one mandated by the pose correction.
- linear SLAM as used in the present patent application and patent claims, is related to the following: a SLAM implementation is referred to as linear if a user correction has to be applied exactly at the moment when the corresponding sensor measurement is being processed in the data processing loop, and it is not feasible to apply the correction later.
- computing device may include, for example, one or more of: a desktop computing device, a laptop computing device, a tablet computing device, a computer, a computing device of a vehicle of the user, or a wearable apparatus of the user that includes a computing device.
- SLAM implementation is related to one or more machine-readable program code portions configured to run a SLAM process, i.e. execute a SLAM algorithm on loaded sensor measurements, which may comprise algorithms for scan matching, localization, trajectory building and environment map building.
- the computer program product described herein uses and extends an existing SLAM implementation as one of its components; another component implements and displays in a graphical user interface a built or subsequently rebuilt environment map and tools, enabling the user to perform a map building workflow using an interactive computer-implemented method according to the present invention.
- the interactive method according to the present invention is a computer implemented method.
- an embodiment of the present invention may utilize multiple different SLAM implementations and accordingly provide means in the graphical user interface for selecting the SLAM implementation.
- SLAM implementations usually have many intricate parameters which influence their execution and the quality of the end result, therefore requiring a highly skilled operator and specialized knowledge to find an optimal tuning of these parameters for a particular combination of sensors and the environment.
- the interactive environment map building method and the graphical user interface according to the present invention enable a less experienced user to mitigate causes of mapping errors such as suboptimally tuned SLAM parameters, imperfect sensor measurements and others, by allowing them to manually correct the mapping errors caused thereby.
- Loop closure is an important concept in SLAM. It is an event which is triggered when at least one mobile device visits an area it has visited before. Performing loop closure reduces the positioning error along the whole loop which begins with the previous visit and ends with the revisit.
- Particle filter-based SLAM uses a set of particles.
- Each particle is a SLAM state hypothesis and contains a current pose estimate and a map that has been built along a trajectory which ends at that current pose estimate.
- Incoming sensor measurements such as rangefinder and motion measurements are used to update every particle according to the mobile device measurement and motion models. All particles, i.e. hypotheses, are periodically evaluated, and the level of agreement with the incoming measurements is numerically calculated as particle weight. The best particles are assigned the highest weights.
- the set of particles is periodically resampled, meaning that a new set is formed by sampling the best particles from the previous set. This ensures loop closure-like behavior of particle filter-based SLAM: when a loop is closed, the particles that ended up in correct poses, i.e. which successfully closed the loop, will have their weights rise drastically due to agreement with the newly available evidence (the range data of the revisited place).
- a well-known particle filter-based SLAM implementation is GMapping.
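The periodic resampling step described above might be sketched as follows. This uses a common systematic-resampling scheme; the patent does not prescribe a particular resampling algorithm, so this is only an illustrative choice:

```python
import random

def resample(particles, weights):
    """Systematic resampling: form a new particle set by sampling
    particles in proportion to their weights, so hypotheses agreeing
    best with the measurements survive (and duplicate), while poorly
    weighted hypotheses die out."""
    n = len(particles)
    total = sum(weights)
    step = total / n
    start = random.uniform(0, step)
    new_set, cum, i = [], weights[0], 0
    for k in range(n):
        u = start + k * step          # evenly spaced sampling points
        while u > cum:                # advance to the particle whose
            i += 1                    # cumulative weight covers u
            cum += weights[i]
        new_set.append(particles[i])
    return new_set
```

After a loop closure, the particles that successfully closed the loop carry most of the total weight, so nearly all samples in the new set are drawn from them.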
- a class of SLAM implementations known as graph SLAM internally uses a structure known as the pose graph to represent how poses of various trajectory nodes are constrained one to another. For example, consecutive nodes are typically locally constrained.
- Graph SLAM implementations usually account for loop closure, when detected, by explicitly modeling the detected relationship between the trajectory nodes of the previous visit and the new trajectory node(s) which correspond to the revisit. The relationship is usually modeled with a loop closure constraint that is inserted in the pose graph when loop closure is detected.
- graph SLAM usually runs an optimization process and produces optimized i.e. rebuilt trajectories, which honor the newly added constraints as well as all the constraints previously inserted into the pose graph.
- a constraint is a measurement of one trajectory node c_j from another node c_i's position.
- the measured offset between c_i and c_j, in c_i's frame, is z_ij, with precision matrix Λ_ij (the inverse of the covariance). For any actual poses of c_i and c_j, their offset can be calculated as h(c_i, c_j) = (R_i^T (t_j - t_i), θ_j - θ_i), where R_i is the 2x2 rotation matrix of θ_i and t_i, t_j are the translation parts of the poses. h(c_i, c_j) is called the measurement equation.
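The measurement equation and the corresponding constraint error can be sketched in Python for the 2D case (function names are hypothetical; the patent gives only the formula, and the optimizer would additionally weight this residual by the precision matrix):

```python
import math

def measurement_equation(ci, cj):
    """h(c_i, c_j): the offset of node c_j as seen from node c_i's
    frame. Each pose is a tuple (x, y, theta)."""
    xi, yi, ti = ci
    xj, yj, tj = cj
    c, s = math.cos(ti), math.sin(ti)
    dx, dy = xj - xi, yj - yi
    # R_i^T (t_j - t_i): rotate the translation difference into c_i's frame
    return (c * dx + s * dy, -s * dx + c * dy, tj - ti)

def residual(ci, cj, z_ij):
    """Error of a constraint: difference between the measured offset
    z_ij and the offset predicted by the current pose estimates."""
    hx, hy, ht = measurement_equation(ci, cj)
    zx, zy, zt = z_ij
    # wrap the angular error into (-pi, pi]
    dt = (ht - zt + math.pi) % (2 * math.pi) - math.pi
    return (hx - zx, hy - zy, dt)
```

Pose graph optimization minimizes the sum of these residuals (weighted by the precision matrices) over all constraints, which is what produces the rebuilt trajectories.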
- the constrained pose graph optimization problem is periodically re-solved with newly added constraints (which can be local and loop closure constraints).
- the results of the optimization process are displayed to the user in real time.
- the SPA method (as implemented in Google Cartographer using the Google Ceres nonlinear solver) can quickly optimize pose graphs containing thousands of nodes.
- fast performance of the optimization process is instrumental in quickly processing inputted user corrections and displaying the result to the user, which facilitates a faster iterative workflow.
- the used SLAM implementation also supports tracking multiple, possibly concurrent, trajectories.
- Sensor measurements collected from at least one mobile device can be fused seamlessly into a single environment map.
- Each independent run of one mobile device corresponds to one built trajectory, and each built trajectory is displayed within the built environment map in the graphical user interface.
- the built environment map in accordance with the present invention comprises at least one built trajectory.
- the data processing loop takes care of routing the sensor measurements data to the corresponding trajectories.
- the multi-trajectory capability of the SLAM implementation combined with the present invention, enables the user to iteratively build and correct an environment map from data collected from at least one mobile device.
- snapshots of the SLAM state may be taken (typically in the data processing loop). This need not be done in each loop iteration, but rather sparsely, in order to avoid overly consuming computing device resources.
- the SLAM state is restored from the most recent snapshot, taken at a time before the first (earliest) of the new user corrections is due to be applied. This avoids running the complete SLAM process from the beginning, thus reducing processing time of applying corrections.
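The sparse snapshotting and restore-before-correction logic might be sketched as follows. This is a simplified illustration; the class, its fields, and the snapshot interval are hypothetical, and a real SLAM state would be far richer than a dictionary:

```python
import copy

class SlamSession:
    """Sketch of sparse SLAM-state snapshotting: snapshots are taken
    only every Nth processed measurement to avoid overly consuming
    computing resources, and applying a user correction restores the
    latest snapshot taken before the correction's time."""

    def __init__(self, snapshot_every=100):
        self.snapshot_every = snapshot_every
        self.snapshots = []                 # list of (timestamp, state)
        self.state = {"t": 0, "nodes": []}

    def process(self, measurement, timestamp):
        """One iteration of the data processing loop."""
        self.state["t"] = timestamp
        self.state["nodes"].append(measurement)
        if len(self.state["nodes"]) % self.snapshot_every == 0:
            self.snapshots.append((timestamp, copy.deepcopy(self.state)))

    def restore_before(self, correction_time):
        """Restore the most recent snapshot older than the correction,
        avoiding a full re-run of the SLAM process from the beginning."""
        candidates = [(t, s) for t, s in self.snapshots if t < correction_time]
        if not candidates:
            return False
        _, snap = max(candidates, key=lambda pair: pair[0])
        self.state = copy.deepcopy(snap)
        return True
```

After restoring, only the measurements between the snapshot and the correction point need to be re-processed.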
- a user pose correction may be applied by forcing the entire set of particles into the pose mandated by the correction, by manipulating their pose hypotheses into the target pose.
- the manipulation may be performed as a continuous interpolation to the target pose within a predetermined time interval in the SLAM process before the correction is due. Because this is clearly a correction which must be applied at that certain point in the SLAM process and not afterwards, particle filter-based SLAM is treated as linear SLAM with respect to user pose corrections.
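The continuous interpolation of a particle pose toward the user-mandated target pose could look like this sketch (hypothetical helper; `alpha` is assumed to run from 0 to 1 over the predetermined time interval preceding the correction):

```python
import math

def interpolate_pose(start, target, alpha):
    """Linearly interpolate a particle's pose hypothesis toward the
    target pose mandated by the user correction. Poses are (x, y,
    theta); the angle is interpolated along the shortest arc."""
    sx, sy, st = start
    tx, ty, tt = target
    # wrap the angular difference into (-pi, pi]
    dt = (tt - st + math.pi) % (2 * math.pi) - math.pi
    return (sx + alpha * (tx - sx),
            sy + alpha * (ty - sy),
            st + alpha * dt)
```

At alpha = 1 every particle has been manipulated into the target pose, which forces the entire particle set as described above.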
- graph SLAM is retroactively correctable, easily enabling an iterative workflow without the need to re-execute the complete SLAM process. If the optimization process and rebuilding the environment map is quick enough (which is the case with the preferred embodiment that uses Google Cartographer), the user may quickly see the effect of applying the inputted corrections.
- the user may also have the ability to provide relative pose corrections which are not in the global coordinate system, but rather tie together any two parts of the trajectory relative to one another by specifying a relative pose correction in a coordinate system tied to a trajectory node located in one of the respective trajectory parts. In other words, the user may select a reference node c_i other than c_0.
- FIG. 5 schematically illustrates one possible embodiment of a map building system 100 for implementing an interactive environment map 10 building method according to the present invention.
- the map building system 100 includes at least one mobile device 102, a computing device 104, and optionally a dedicated control command system 105.
- the mobile device 102 is equipped with sensors such as one or more rangefinder sensors, one or more motion sensing devices, and a data processing unit. It is responsive to motion commands generated by the user through the control command system 105 and conveys the sensor measurement data to the computing device 104 over a communication channel 106. All collected sensor measurement data is stored on the computing device 104, and optionally also on the data processing unit of the mobile device 102.
- the graphical user interface 20 provides to the user means for gaining a high level of insight and enables performing intervention into the SLAM process by iteratively inputting corrections, using a keyboard, a mouse, a 3D mouse, or any other human interface device known to the person skilled in the art.
- the interactive environment map building method is performed online, meaning that the environment map 10 is being built and user corrections are being inputted and applied while the mobile device 102 is moving and streaming online sensor measurements to the computing device 104 via the communications channel 106.
- one mobile device 102 is moving in the environment, controlled by the user through the control command system 105.
- the sensor measurements data is also being displayed to the user on the screen of the computing device 104, but user corrections are not being inputted during motion of the mobile device 102.
- the interactive map building method according to the present invention is performed offline, by processing at any later point in time on the computing device 104 the dataset containing the collected sensor measurements.
- the interactive map building method according to the present invention can also be performed by simultaneously processing sensor measurements recorded in several independent runs of at least one mobile device 102.
- a computer-implemented method for interactively building a high-accuracy environment map 10 comprises the steps of: moving at least one mobile device 102 in the environment; collecting sensor measurements from at least one mobile device 102; loading into a SLAM implementation 306 the collected sensor measurements in the form of offline stored sensor measurement datasets ordered by timestamps or in the form of online streamed data; building an environment map 10 and at least one trajectory 11 by running a SLAM process in the SLAM implementation 306; exporting the built environment map 10 and trajectories 11 to a persistent storage 212 of a computing device 104; displaying the built environment map 10 and at least one built trajectory 11 in a graphical user interface 20; and assessing by the user if the built environment map 10 has visible defects.
- the method further comprises: iterative inputting of user corrections in the graphical user interface 20 and applying the inputted corrections in the SLAM implementation 306; subsequently rebuilding the environment map 10 and trajectories 11, honoring the inputted user corrections and sensor measurement datasets, wherein after each inputted and applied user correction the subsequently rebuilt environment map 10 and trajectories 11 are displayed in the graphical user interface 20 for assessing by the user of visible defects; and, if the subsequently rebuilt map 10 is free of visible defects, exporting the subsequently rebuilt environment map 10 and trajectories 11 to the persistent storage 212.
- Assessing by the user if the built or subsequently rebuilt environment map 10, displayed in the graphical user interface 20, contains defects is performed by tools for rewinding along at least one built or subsequently rebuilt trajectory 11. Iterative inputting and applying of user corrections, in accordance with the present invention, includes inputting pose corrections 18 of trajectory nodes 13, and modifying SLAM implementation parameters or part of the SLAM process state.
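The iterative assess-correct-rebuild workflow of the method can be sketched abstractly in Python (all objects and method names here are hypothetical placeholders standing in for the SLAM implementation, the graphical user interface, and the persistent storage):

```python
def build_map_interactively(slam, gui, storage):
    """Sketch of the workflow: build, let the user assess and correct,
    rebuild honoring all corrections until the map is free of visible
    defects, then export to persistent storage."""
    env_map, trajectories = slam.build()          # initial SLAM run
    gui.display(env_map, trajectories)
    while gui.user_sees_defects():                # user assessment step
        correction = gui.input_correction()       # pose correction, or a
        slam.apply(correction)                    # parameter/state change
        env_map, trajectories = slam.rebuild()    # honors all corrections
        gui.display(env_map, trajectories)        # show effect to the user
    storage.export(env_map, trajectories)
```

The loop structure mirrors the workflow of Figure 6: corrections are applied one at a time, and after each one the rebuilt map is shown for reassessment.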
- an interactive computer-implemented method hereinafter the method, can be performed in offline or online mode.
- execution of the map building software is initiated by the user, by creating a new mapping project (block 201) and loading in (block 202) the sensor measurements data from at least one mobile device 102, whether offline in the form of a dataset ordered by timestamps, or online by setting up connections with at least one active mobile device 102.
- each independent active mobile device 102 streams the collected sensor data to the map building software.
- the method according to the present invention provides a possibility of using an existing reference CAD 14, in which case the reference CAD is also loaded (block 202).
- a high-accuracy environment map 10 can be built, and the workflow enabled by the present invention can be performed even without the reference CAD 14 being provided.
- the reference CAD 14, if available, is displayed under/overlaid with the built environment map 10.
- the sensor measurements used for creating the environment map 10 may be collected during one or several independent runs of at least one mobile device 102.
- the independent runs may or may not be concurrent.
- Each independent run is closely associated with one trajectory 11 , and these two terms will be used herein interchangeably.
- the elements of the set of timestamped poses that makes up a trajectory 11 are called trajectory nodes 13.
- the preferred embodiment uses a graph SLAM implementation which is capable of tracking multiple concurrent trajectories 11. However, a significant portion of the present method, described herein, is applicable to simpler single-trajectory SLAM implementations as well. Thus, the second preferred embodiment uses a single-trajectory, particle filter-based linear SLAM implementation.
- after loading into the SLAM implementation 306 one or more sensor measurement datasets (blocks 202; 301), the SLAM implementation 306 asynchronously starts the environment map building process (block 203) by performing operations depicted in Figure 7.
- the environment map building process takes place in background and does not block the user from performing actions in the graphical user interface 20, such as rewinding (block 206) and inputting of user corrections (block 207).
- the output of the environment map building process is displayed in the graphical user interface 20 as it progresses.
- the user is free to examine in detail the built environment map 10 and one or more built trajectories 11 , as they are being built, or afterwards, by performing rewinding (block 206). Displaying the built environment map 10 in a graphical user interface 20 and rewinding along at least one built trajectory 11 enables the user to assess if the built or subsequently rebuilt environment map 10 has visible defects.
- a rewinding tool provided in the graphical user interface 20 enables the user to rewind along each trajectory 11 and to perceive the sensor measurements related to the corresponding trajectory nodes 13 where the SLAM implementation 306 built the respective section of the built or subsequently rebuilt environment map 10.
- the user can rewind along each built trajectory 11 and view the poses of the tracked mobile devices 102 at each trajectory node 13.
- a rangefinder point cloud from the sensor measurements data which corresponds to the trajectory node 13 selected during rewinding is also displayed in the graphical user interface.
- the rangefinder point cloud displayed on the built environment map 10 corresponds to the mobile device 102 pose at the respective trajectory node 13.
- These displayed rangefinder point clouds are called herein rewind point clouds 15, and have an important role in enabling the user to input corrections (illustrated in Figs. 10 to 12). Rewinding along a built or subsequently rebuilt trajectory 11 and displaying the corresponding rewind point clouds 15 enables the user to gain a high level of insight into the environment map building process, and enables the user to identify with high precision, both temporal and spatial, which scans were incorrectly inserted into the built environment map 10.
- the iterative interactive environment map building method begins with the user perceiving one or more visible defects in the built environment map 10 (block 205).
- defects may include: fracturing or blurring of object contours (illustrated in Figure 2); superfluous multiple appearances of objects in the built environment map 10, also called ghosting, caused by incorrectly registering the same objects several times at different locations, e.g. improperly closed loops (illustrated in Figure 3); and misalignment with the reference CAD 14 (illustrated in Figure 4).
- the user assesses visible defects in the built or subsequently rebuilt environment map 10.
- the user is enabled to visually locate the built or subsequently rebuilt trajectory nodes 13 where incorrect handling of sensor measurements, imperfect sensor measurements or random environmental disturbances caused visible defects in the built environment map 10 (block 206), as for example illustrated in Figure 11.
- the user employs the tools from the graphical user interface 20 to input corrections which will correct the visible defects (block 207).
- the method of applying corrections to linear SLAM is suitable for other types of corrections as well.
- the user may opt to change certain SLAM parameters. This would be another type of correction, a SLAM parameter correction, applied at the trajectory node 13 that corresponds to the point of entry into the new environment.
- the user may input pose corrections by aligning the incorrectly registered objects 12 e.g. walls, or certain key features visible in the rewind point cloud 15 with the rest of the built environment map 10 or with the reference CAD 14, if available. Furthermore, after inputting an initial pose correction, the user may further tune it either manually or refine it by using a pose refinement tool.
- Different means of enabling inputting a pose correction include: mouse drag-and-drop input of position and orientation; CAD-like numeric entry; and selecting pairs of matching points and computing the transformation between them.
- Figures 10 to 12 illustrate inputting of a pose correction by aligning the incorrectly registered objects 12 such as walls; namely, visually aligning the rewind point cloud 15 with the reference CAD 14, or with the rest of the built environment map 10.
- the result of alignment is a pose-corrected rewind point cloud 17, as illustrated in Figure 10.
- the pose corresponding to the pose-corrected rewind point cloud 17 is applied thereafter as a pose correction for the respective trajectory node 13.
- the method in accordance with present invention further comprises displaying and assessing in the graphical user interface 20 sensor measurements data which the SLAM implementation 306 used when building or subsequently rebuilding the environment map 10, the displayed sensor measurements data corresponding to trajectory nodes 13 selected during rewinding.
- the inputted user correction is applied as soon as possible (blocks 208; 305) in the SLAM implementation 306; the trajectory 11 and the environment map 10 are rebuilt; the subsequently rebuilt environment map 10 and the rebuilt trajectory 11 , which honor the newly inputted correction, as well as all the previously inputted corrections and sensor measurements (blocks 208; 308) are displayed in the graphical user interface 20.
- the user is now able to evaluate the effect of the inputted correction by assessing the subsequently rebuilt environment map 10 to see if applying the correction has resulted in a more accurate environment map 10 with respect to the previously observed visible defects, such as map blurring, fracturing, ghosting or CAD misalignment.
- the user may also employ the insight-providing tools of the graphical user interface 20 such as rewinding. If the rebuilt environment map 10 is still visibly inaccurate (contains visible defects), the user can further tune the inputted correction. The correction is reverted to the stage where user input is performed (block 210), and the user can further tune the correction with additional input. If unsuccessful again, the user can choose to remove the correction altogether and try inputting another correction elsewhere, i.e. at another trajectory node 13 or in another section of the built or subsequently rebuilt environment map 10.
- the user is enabled to perform an iterative workflow where the user observes one or more visible defects in the environment map 10, inputs corrections until the respective section of the environment map 10 is visibly accurate and consistent, and moves on to the remaining visible defects, if any. The process is repeated until the user obtains an environment map 10 that is consistent and free of visible defects.
- the absence of visible defects in the built or subsequently rebuilt environment map 10 also implies that the corresponding one or more trajectories 11 represent a reasonably good estimate of the true trajectories 11 travelled by at least one mobile device through the environment.
- the user may utilize the available editing-like facilities to cut/exclude (tool 23 in the graphical user interface 20) such redundant parts from the map building process.
- the user may also inter-constrain the multiple trajectories 11 together in any way and at any point the user desires. Inter-constraining pairs of trajectories 11 together is achieved by allowing the user during inputting of pose corrections to select a coordinate system tied to any trajectory node 13, thus enabling inputting relative pose corrections between any two trajectories 11 at one or more trajectory nodes 13, as described earlier.
- the method according to the present invention provides saving and loading of the user’s work done so far at any time as illustrated in block 204.
- the SLAM state and all corrections inputted so far are preserved and saved in the persistent storage 212, enabling the user to restore the complete map building project and to continue work on the same project until the rebuilt environment map 10 and the trajectory 11 are free of visible defects.
- the user may also load a new dataset recorded some time later, e.g. when the environment has changed, and use the available tools and the graphical user interface 20 according to the present invention to obtain an updated version of the environment map 10 that is consistent with the previously built or rebuilt environment map 10, which assumes the role of the reference CAD 14.
- said rebuilt environment map 10 may be exported in the form of an occupancy grid (block 211 ) and saved in the persistent storage 212.
- the built trajectories 11 may also be exported as sets of consecutive timestamped mobile device poses.
- FIG. 7 shows a software flow diagram illustrating the main data processing loop in a SLAM implementation 306, as well as the extensions which enable the interactive environment map building method in accordance with an embodiment of the present invention.
- Block 301 illustrates performing SLAM initialization, which includes loading offline sensor measurement datasets ordered by timestamps, or setting up online communication with at least one active mobile device 102.
- the SLAM implementation 306 proceeds to build the environment map 10 and the trajectory 11 of at least one mobile device 102, as illustrated in the data processing loop consisting of blocks 302 to 308.
- a series of operational steps is performed until all sensor measurements data are processed (block 302), the operational steps comprising: each sensor measurement is retrieved (block 303) and handled (block 307) according to its type and originating mobile device 102.
- Handling of each sensor measurement may include performing scan matching, localization, building a trajectory node 13 and updating the built environment map 10, depending on the measurement type and the used SLAM implementation 306.
- the current, i.e. updated, SLAM state, namely the currently built environment map 10, the poses of at least one mobile device 102 and the corresponding trajectories 11, is displayed in the graphical user interface 20.
- the data processing loop in the SLAM implementation 306 is extended to support applying user corrections.
- the user iteratively inputs one or more new corrections (block 207).
- Blocks 304 and 305 illustrate applying due user corrections in the data processing loop of the SLAM implementation 306. Applying the due user corrections (blocks 304 and 305) is performed as soon as possible, i.e. as soon as the execution of the SLAM process permits.
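The extended data processing loop described above (blocks 302 to 308) can be sketched as follows. This is a minimal illustration under stated assumptions: `SlamStub`, `Measurement` and `Correction` are hypothetical stand-ins, not the actual SLAM implementation 306 or its API; the stub merely records the order in which items are processed, showing that a due correction is applied as soon as the loop reaches its timestamp.

```python
from collections import deque
from dataclasses import dataclass, field

@dataclass
class Measurement:
    timestamp: float
    data: str

@dataclass
class Correction:
    timestamp: float
    pose: tuple

@dataclass
class SlamStub:
    # Hypothetical stand-in for the SLAM implementation 306;
    # records the order in which measurements and corrections are handled.
    log: list = field(default_factory=list)

    def apply_correction(self, c):      # blocks 304/305
        self.log.append(("correction", c.timestamp))

    def handle(self, m):                # block 307: scan matching, map update, ...
        self.log.append(("measurement", m.timestamp))

def run_slam_loop(measurements, slam, corrections):
    """Extended data processing loop (blocks 302-308): user corrections are
    applied as soon as the SLAM process reaches their timestamp."""
    pending = deque(sorted(corrections, key=lambda c: c.timestamp))
    for m in sorted(measurements, key=lambda m: m.timestamp):  # blocks 302/303
        while pending and pending[0].timestamp <= m.timestamp:
            slam.apply_correction(pending.popleft())
        slam.handle(m)  # block 308 would then display the updated SLAM state
```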
- the resulting updated SLAM state, i.e. the subsequently rebuilt trajectories 11 and environment map 10, is afterwards displayed (blocks 208; 308) in the graphical user interface 20.
- the user can as soon as possible observe the effect of the inputted corrections, which facilitates a faster iterative workflow.
- a pose constraint for a certain trajectory node 13 may be visualized like a spring which pulls that trajectory node 13 towards a certain position. Rebuilding the trajectory 11 given new constraints is performed by solving the constrained pose graph optimization problem as described above, referred to herein as running an optimization process. Running the optimization process is analogous to finding a configuration of trajectory nodes 13 for which the springs are at their most relaxed position, i.e. all constraints are satisfied as much as possible.
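The spring analogy can be made concrete with a toy one-dimensional pose graph. In this sketch, odometry springs constrain the spacing between consecutive trajectory nodes, and prior springs (standing in for constraints calculated from user pose corrections 18) pull individual nodes toward mandated positions; plain gradient descent stands in for the sparse least-squares solver a real graph SLAM implementation would use, and all names and weights are illustrative assumptions.

```python
def optimize_pose_graph(n, odom, priors, iters=5000, lr=0.004):
    """Toy 1-D pose graph optimization.
    odom:   [(i, j, d, w)] springs saying x[j] - x[i] should equal d, stiffness w
    priors: [(k, p, w)] springs pulling node k toward position p, stiffness w
    Minimizes the weighted sum of squared spring residuals by gradient descent;
    the solution is the configuration where all springs are most relaxed."""
    x = [0.0] * n
    for _ in range(iters):
        g = [0.0] * n
        for i, j, d, w in odom:
            e = x[j] - x[i] - d           # residual of the odometry spring
            g[j] += 2 * w * e
            g[i] -= 2 * w * e
        for k, p, w in priors:
            g[k] += 2 * w * (x[k] - p)    # residual of the correction spring
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x
```

With stiff priors (weight 100) at both ends and unit odometry springs, the middle node settles at the compromise position the springs dictate.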
- the pose graph constraints are calculated from pose corrections inputted by the user, as illustrated in Figure 13.
- the user pose corrections 18 were in this case given in the global map coordinate system with an origin 19.
- constraints have weights (corresponding to elements of the information matrices Λ in the formal description above) which control the impact of the constraints on the final optimized solution. Since it is desired that the constraints calculated from the inputted user pose corrections 18 have a strong impact on the optimization process (i.e. strongly influence the optimized solution, in order to force trajectory node poses to ones mandated by constraints calculated from user corrections), larger weights are assigned to these constraints.
- the assigned weights are predetermined to be equal to or larger than the weights of loop closure constraints automatically generated by the graph SLAM implementation 306, where the user is further enabled to adjust the weights assigned to pose constraints calculated from the inputted pose corrections 18. If necessary, the exact values of the assigned weights may be determined depending on the optimization process result.
- the user can increase the values of assigned weights until the resulting optimized poses of the trajectory nodes 13 match the ones mandated by the inputted user corrections to a desired degree of precision, namely until the subsequently rebuilt environment map 10 is free of visible defects.
- a graphical element such as a residual marker may be used to visualize the difference between the pose mandated by the inputted user correction and the resulting optimized trajectory node pose.
- applying of inputted user pose corrections 18 is performed by excluding from the pose graph optimization process a subset of variables, and subsequently rebuilding the trajectories 11 and the environment map 10 by re-running the pose graph optimization process.
- the variables excluded from the optimization process, i.e. a subset of global poses c corresponding to poses of trajectory nodes 13 with pose corrections, are treated specially by having their values exactly fixed to ones mandated by the corresponding user pose corrections 18. This would be analogous to having constraints with infinite weights.
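Excluding variables from the optimization can be illustrated with a variant of the same toy one-dimensional problem: corrected nodes are initialized to the mandated values and simply receive no optimization update, which behaves like springs of infinite stiffness. This is a sketch under stated assumptions, not the patent's actual implementation.

```python
def optimize_with_fixed(n, odom, fixed, iters=500, lr=0.1):
    """Toy 1-D pose graph where nodes in `fixed` (a {node: value} dict,
    representing user pose corrections 18) are excluded from optimization:
    their values are exactly pinned, and only free nodes are updated."""
    x = [fixed.get(k, 0.0) for k in range(n)]
    for _ in range(iters):
        g = [0.0] * n
        for i, j, d, w in odom:
            e = x[j] - x[i] - d
            g[j] += 2 * w * e
            g[i] -= 2 * w * e
        for k in fixed:
            g[k] = 0.0          # excluded variables receive no update
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x
```

Unlike the finite-weight version, the pinned end nodes stay exactly at their mandated values while the free node relaxes between them.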
- linear SLAM does not lend itself naturally to applying additional pose corrections, which is required for an iterative map building workflow.
- the SLAM process needs to be re-run from the point where the user correction is due to be applied.
- applying of user pose corrections 18 is performed by interpolating the values of particle poses into values mandated by the user pose corrections 18 within a predetermined time interval in the SLAM process before the correction is due, where the user is further enabled to adjust the interpolation time interval.
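Such interpolation can be sketched minimally as follows, assuming planar (x, y) particle poses and linear blending over the predetermined time interval; a full implementation would also blend the heading using angular interpolation, and the function name is illustrative.

```python
def interpolate_particle_poses(particles, target, t, t_start, t_end):
    """Blend particle poses toward the user-mandated pose within
    [t_start, t_end] before the correction is due.
    particles: list of (x, y) particle poses
    target:    (x, y) pose mandated by the user pose correction 18
    t:         current SLAM process time"""
    a = min(1.0, max(0.0, (t - t_start) / (t_end - t_start)))  # blend factor
    return [((1 - a) * x + a * target[0], (1 - a) * y + a * target[1])
            for x, y in particles]
```

At the start of the interval the particles are untouched; by the time the correction is due, every particle has converged to the mandated pose.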
- the time needed to rebuild the environment map 10 after applying the correction can be reduced by regularly snapshotting the state of the SLAM process and rerunning the SLAM process in the SLAM implementation 306 from the latest possible snapshot before a correction is due to be applied, as described above.
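The snapshotting scheme can be sketched as a wrapper that periodically deep-copies the process state, so that applying a correction only requires re-running from the latest snapshot before the correction is due. The class and method names, and the dictionary state, are hypothetical illustrations, not the patent's implementation.

```python
import copy

class SnapshottingSlam:
    """Hypothetical wrapper that snapshots the SLAM process state every
    `interval` measurements for fast partial re-runs."""

    def __init__(self, interval=100):
        self.interval = interval
        self.snapshots = []                 # (timestamp, deep-copied state)
        self.state = {"processed": []}

    def handle(self, timestamp):
        # snapshot the state *before* processing every interval-th measurement
        if len(self.state["processed"]) % self.interval == 0:
            self.snapshots.append((timestamp, copy.deepcopy(self.state)))
        self.state["processed"].append(timestamp)

    def restore_before(self, t):
        """Restore the latest snapshot taken before a correction due at time t;
        returns the timestamp to re-run the SLAM process from."""
        candidates = [s for s in self.snapshots if s[0] < t]
        self.state = copy.deepcopy(candidates[-1][1])
        return candidates[-1][0]
```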
- Inputting of user corrections is performed exclusively through the graphical user interface 20, displayed on the screen of the computing device 104 and operably connected to the SLAM implementation 306.
- the graphical user interface 20 according to the present invention enables performing the computer-implemented method for interactively building a high-accuracy environment map 10 by iterative inputting and applying of user corrections in the SLAM implementation 306.
- the graphical user interface 20 is configured to display the built or subsequently rebuilt environment map 10 and trajectories 11 to enable the user to observe the progress of the environment map 10 building process, to rewind along the built or subsequently rebuilt trajectories 11 , to iteratively input and apply corrections, to store the complete mapping project at any point, and to export the built or subsequently rebuilt environment map 10 which is assessed to be free of visible defects.
- the tools in the graphical user interface 20 available to the user include accurate controls for inspecting the built or subsequently rebuilt environment map 10 and trajectory 11. Furthermore, the tools for accurate inputting and refining of pose corrections are available to the user. These tools work by allowing the user to align the rewind point cloud 15 so that the visible defects in the environment map 10 will be removed.
- the graphical user interface 20 for implementing an interactive environment map building method comprises, schematically illustrated in Figure 9, a zoomable and pannable main map window 400 configured to display the built environment map 10 and trajectories 11, the current pose and the currently observed point cloud of at least one mobile device 102, a zoomable and scrollable trajectories timeline window 405, and a user corrections management window 403, wherein the graphical user interface is configured to provide tools for manipulating the viewport of the main map window 400 such as zooming and panning, tools for rewinding along a selected built or subsequently rebuilt trajectory 11, and tools for iterative inputting and applying of user corrections.
- the graphical user interface 20 comprises a sensor measurement datasets window 401 configured to list the loaded sensor measurement datasets when building the environment map 10 is performed offline, an active devices window 402 configured to list at least one tracked active mobile device 102 when building the environment map 10 is performed online, and a tool window 404 providing saving and loading the map building project data into/from a file on persistent storage 212.
- the graphical user interface 20 is structured to provide a "map building project"-type interface.
- the user may load one or more sensor measurement datasets or active devices into the map building project, and perform the interactive environment map 10 building method according to the present invention.
- the user corrections management window 403 enables the user to perform actions such as inputting, applying, removing, editing, enabling and disabling user corrections.
- the inputted corrections may also be displayed in the trajectories timeline window 405 by highlighting the keyframe points corresponding to respective trajectory nodes 13. Further, during rewinding, when a trajectory node 13 is selected, its current pose is displayed in the main map window 400, and the selected trajectory node 13 may be highlighted in the trajectories timeline window 405 as well.
- the graphical user interface 20 displays the sensor measurement datasets window 401 , which lists the loaded offline sensor measurement datasets, or the active mobile devices window 402, which lists the tracked active mobile devices 102 when the environment map 10 building is performed online.
- the main map window 400 of the graphical user interface 20 displays the built or subsequently rebuilt environment map 10 and trajectories 11, the current pose and the currently observed point cloud of at least one mobile device 102, the rewind point cloud 15 and the reference CAD 14, if available.
- the reference CAD 14 may be displayed under/overlaid with the built or subsequently rebuilt environment map 10, which enables the user to observe and evaluate the alignment of the built or subsequently rebuilt environment map 10 with the reference CAD 14, and to identify sections of the built or subsequently rebuilt environment map 10 which need further corrections.
- Means provided in the graphical user interface 20 enabling rewinding along a selected built or subsequently rebuilt trajectory 11 include tools for scrubbing, or directly selecting a point in timeline, keyframe points or other graphical elements which correspond to trajectory nodes 13, user corrections, or timestamped sensor measurements.
- the trajectories timeline window 405 is configured to display keyframe points corresponding to built trajectory nodes 13 and graphical elements corresponding to inputted user corrections or timestamped sensor measurements.
- the trajectories timeline window 405 includes a graphical playhead element tool 22 configured to enable rewinding along a selected trajectory 11 and controlling the position in the timeline by the action of dragging using the mouse or another human interface device.
- the trajectories timeline window 405 may further include: a set of buttons which control the position in the timeline, akin to a conventional media player interface; tools for selecting a point in the timeline by directly entering a numeric timecode or a numeric trajectory node ID; and a tool 23 for excluding parts of the sensor measurement dataset e.g. by cutting, akin to video or audio editing. If multiple trajectories 11 are built, they are displayed analogously to standard multi-track displays in video or audio editing interfaces. Additional means of rewinding through time are keys on the computer keyboard (for example, arrow keys), a swiping gesture on a touch screen or a similar human interface device.
- an alternative tool is provided for selecting a trajectory node 13 and the corresponding point in the timeline by clicking or dragging the mouse on the displayed built or subsequently rebuilt trajectories 11 in the main map window 400.
- the implementation of such a tool is slightly more involved because it requires locating the trajectory node 13 which is closest to the clicked point, for which an efficient point search data structure known to the person skilled in the art, such as a KD-tree or an Octree, should be used.
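A minimal 2-D KD-tree illustrating such a point search over trajectory node positions might look as follows. This is a didactic sketch: node tuples `(x, y, node_id)` and the dictionary-based tree are assumptions for illustration, and a production implementation would use a tuned library structure.

```python
import math

def build_kdtree(points, depth=0):
    """Build a 2-D KD-tree over (x, y, node_id) tuples, e.g. the planar
    positions of trajectory nodes 13."""
    if not points:
        return None
    axis = depth % 2                                   # alternate x / y splits
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return {
        "point": points[mid],
        "axis": axis,
        "left": build_kdtree(points[:mid], depth + 1),
        "right": build_kdtree(points[mid + 1:], depth + 1),
    }

def nearest(tree, query, best=None):
    """Return (distance, point) of the tree point closest to the (x, y) query,
    e.g. the map position the user clicked."""
    if tree is None:
        return best
    p, axis = tree["point"], tree["axis"]
    d = math.dist(query, p[:2])
    if best is None or d < best[0]:
        best = (d, p)
    diff = query[axis] - p[axis]
    near, far = (tree["left"], tree["right"]) if diff < 0 else (tree["right"], tree["left"])
    best = nearest(near, query, best)
    if abs(diff) < best[0]:        # other half-space may still hold a closer point
        best = nearest(far, query, best)
    return best
```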
- the graphical user interface 20 displays the mobile device 102 pose of the corresponding node 13 of the selected trajectory 11 , as built by the SLAM implementation 306, and shows the point cloud observed by the rangefinder sensor at that selected point in time, namely the rewind point cloud 15.
- the rewind point cloud 15 is displayed on the built or subsequently rebuilt environment map 10 in accordance with the pose of the corresponding built or subsequently rebuilt trajectory node 13.
- Tools of the graphical user interface 20 for inputting of user pose corrections and for further tuning of the inputted corrections include a translation marker 16 and a rotation marker 21 , illustrated in Figures 10 to 12. Said markers are displayed in the main map window 400 and are configured to be dragged with the mouse or another human interface device.
- the main map window 400 is configured to display a pose-corrected rewind point cloud 17.
- the pose-corrected rewind point cloud 17 is translated and rotated in accordance with the pose of the dragged markers 16 and 21 which correspond to the pose of the mobile device 102.
- the translation marker 16 and the rotation marker 21 enable accurate and precise alignment of the displayed rewind point clouds 15 with the reference CAD 14, if available, or with the rest of the built or subsequently rebuilt environment map 10.
- the user may input a pose correction by moving the translation marker 16 and the rotation marker 21 which represent the pose of the mobile device 102 associated with the trajectory node 13 selected during rewinding.
- the user may also specify a pivot point for the rotation, which facilitates aligning key points in the point cloud.
- Another way of enabling the user to input the pose correction is to allow them to specify pairs of corresponding points, e.g. two key points from the rewind point cloud 15, and their corresponding key points in the built or subsequently rebuilt environment map 10 or in the reference CAD 14, if available. Subsequently, a transformation is computed which aligns the rewind point cloud 15 by matching the specified pairs of corresponding points, and used for the inputted pose correction. In case that the user specifies more than two pairs of corresponding points, thus overconstraining the problem of finding the transformation, a least squares solution may be used, or a more sophisticated regression which compensates for outliers, known to the person skilled in the art.
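The transformation aligning the specified pairs of corresponding points can be computed in closed form. The following sketch implements the standard 2-D least-squares rigid alignment (a planar Kabsch/Umeyama estimate without scaling), which is one way, not necessarily the patent's, of realizing this step; it returns the rotation angle and translation mapping source points (e.g. key points from the rewind point cloud 15) onto their counterparts in the map or reference CAD 14.

```python
import math

def fit_rigid_2d(src, dst):
    """Least-squares rigid transform (theta, tx, ty) such that rotating each
    src[i] by theta and translating by (tx, ty) best matches dst[i].
    Handles two or more point pairs; extra pairs are averaged out in the
    least-squares sense."""
    n = len(src)
    cs = (sum(p[0] for p in src) / n, sum(p[1] for p in src) / n)  # src centroid
    cd = (sum(p[0] for p in dst) / n, sum(p[1] for p in dst) / n)  # dst centroid
    # cross-covariance of centered point sets: rotation angle is the atan2 of
    # its antisymmetric part over its symmetric part
    sxx = sum((s[0] - cs[0]) * (d[0] - cd[0]) + (s[1] - cs[1]) * (d[1] - cd[1])
              for s, d in zip(src, dst))
    sxy = sum((s[0] - cs[0]) * (d[1] - cd[1]) - (s[1] - cs[1]) * (d[0] - cd[0])
              for s, d in zip(src, dst))
    theta = math.atan2(sxy, sxx)
    c, si = math.cos(theta), math.sin(theta)
    tx = cd[0] - (c * cs[0] - si * cs[1])   # translation maps rotated src
    ty = cd[1] - (si * cs[0] + c * cs[1])   # centroid onto dst centroid
    return theta, tx, ty
```

For more than two pairs this is exactly the least-squares solution mentioned above; outlier-robust regression would replace the plain sums with a weighted or trimmed variant.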
- the user is further provided with a "pose refinement" tool.
- the user may input an initial pose correction by roughly aligning the rewind point cloud 15 with the built environment map 10 or with the reference CAD 14, if available.
- the user may subsequently use the pose refinement tool which takes the initially inputted user pose correction and performs scan matching, the output of which is a refined pose.
- the user may further tune the pose correction before confirming the final input.
- the pose refinement tool thus facilitates fast and precise alignment of the displayed rewind point clouds 15 with the built or subsequently rebuilt environment map 10 or with the reference CAD 14, if available.
- once the user inputs the correction, for example a pose correction using the aforementioned tools for accurate and precise alignment of the displayed rewind point clouds 15, it is applied as soon as the execution of the SLAM process permits, as described earlier.
- This enables the user to see in the graphical user interface 20 the effects of the inputted correction on the subsequently rebuilt environment map 10 as soon as possible, wherein the delay of displaying to the user depends on the type of the used SLAM implementation 306.
- Figure 10 illustrates the graphical user interface 20 where the trajectories timeline window 405 displays two trajectories 11, namely Trajectory 0 and Trajectory 1.
- the two trajectory nodes 13 on Trajectory 0 where corrections were applied are highlighted, and the corresponding corrections are displayed in the user corrections management window 403.
- Trajectory 1 is selected as the trajectory along which rewinding is performed, and the vertical playhead line passes through the highlighted keyframe point corresponding to the node that has been selected by rewinding.
- the mobile device pose of the selected trajectory node 13 is also displayed in the main map window 400, and the user is inputting a pose correction for that trajectory node 13 by dragging the markers 16 and 21 , as described above.
- extensions to a SLAM implementation according to the present invention include exposing all sensor measurements data, including all point clouds from the rangefinder sensor measurements.
- the exposed offline sensor measurement datasets are listed in the graphical user interface 20 in the sensor measurement datasets window 401.
- Point clouds from the rangefinder sensor measurements are exposed for the purpose of enabling in the graphical user interface 20 rewinding along the built or subsequently rebuilt trajectories 11 , and displaying the rewind point clouds 15.
- the timestamps, types and content of exposed sensor measurements may also be displayed in the graphical user interface 20, e.g. in the trajectories timeline window 405 or in the main map window 400.
- extensions to a SLAM implementation may further include exposing a scan matcher, if available, which enables the pose refinement functionality.
- if the SLAM implementation does not expose a scan matcher, an alternative one may be provided and used in the graphical user interface 20 for the pose refinement functionality.
- the SLAM implementation may be extended to perform automatic alignment with the reference CAD during the SLAM process.
- Some or all of the method steps may be executed by (or using) a hardware apparatus, like for example, a microprocessor, a programmable computing device or an electronic circuit. In some embodiments, one or more of the method steps may be executed by the mobile device 102. In the preferred embodiments of the invention, the method described herein is processor-implemented or computer-implemented.
- embodiments of the invention can be implemented in hardware or in software.
- the implementation can be performed using a non-transitory storage medium.
- embodiments of the present invention can be implemented as a computer program product with a program code, the program code being operative for performing the method when the computer program product runs on a computing device.
- the program code may, for example, be stored on a machine-readable carrier.
- an embodiment of the inventive method is, therefore, a computer program product having a program code for performing the method described herein, when the computer program runs on a computing device.
- a further embodiment comprises a processing means, for example, a computing device or a programmable logic device, programmed to, configured to, or adapted to, perform the method described herein.
- a processing means for example, a computing device or a programmable logic device, programmed to, configured to, or adapted to, perform the method described herein.
- a further embodiment comprises a computer having installed thereon the computer program product for performing the method described herein.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/HR2017/000019 WO2019122939A1 (en) | 2017-12-21 | 2017-12-21 | Interactive computer-implemented method, graphical user interface and computer program product for building a high-accuracy environment map |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3729225A1 true EP3729225A1 (en) | 2020-10-28 |
Family
ID=61074467
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP17835936.0A Withdrawn EP3729225A1 (en) | 2017-12-21 | 2017-12-21 | Interactive computer-implemented method, graphical user interface and computer program product for building a high-accuracy environment map |
Country Status (2)
Country | Link |
---|---|
EP (1) | EP3729225A1 (en) |
WO (1) | WO2019122939A1 (en) |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110427027B (en) * | 2019-07-18 | 2022-09-02 | 浙江吉利汽车研究院有限公司 | Navigation route generation method and device for automatic driving and automatic driving system |
CN111009036B (en) * | 2019-12-10 | 2023-11-21 | 北京歌尔泰克科技有限公司 | Grid map correction method and device in simultaneous localization and mapping |
CN111551953A (en) * | 2020-05-06 | 2020-08-18 | 天津博诺智创机器人技术有限公司 | Indoor map construction optimization method based on SLAM algorithm |
CN114253511A (en) * | 2020-09-21 | 2022-03-29 | 成都睿芯行科技有限公司 | SLAM hardware accelerator based on laser radar and implementation method thereof |
CN112241002B (en) * | 2020-10-11 | 2022-10-18 | 西北工业大学 | Novel robust closed-loop detection method based on Karto SLAM |
WO2022143713A1 (en) * | 2020-12-31 | 2022-07-07 | 杭州海康机器人技术有限公司 | V-slam map verification method and apparatus, and device |
CN112837241A (en) * | 2021-02-09 | 2021-05-25 | 贵州京邦达供应链科技有限公司 | Method and device for removing mapping ghosting, and storage medium |
CN113031620A (en) * | 2021-03-19 | 2021-06-25 | 成都河狸智能科技有限责任公司 | Robot complex environment positioning method |
CN113763551B (en) * | 2021-09-08 | 2023-10-27 | 北京易航远智科技有限公司 | Rapid repositioning method for large-scale map building scene based on point cloud |
CN115727854A (en) * | 2022-11-28 | 2023-03-03 | 同济大学 | VSLAM positioning method based on BIM structure information |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
ATE510248T1 (en) | 2005-12-28 | 2011-06-15 | Abb Research Ltd | MOBILE ROBOT |
US8577538B2 (en) * | 2006-07-14 | 2013-11-05 | Irobot Corporation | Method and system for controlling a remote vehicle |
JP5304128B2 (en) * | 2008-09-16 | 2013-10-02 | 村田機械株式会社 | Environmental map correction device and autonomous mobile device |
US8510041B1 (en) * | 2011-05-02 | 2013-08-13 | Google Inc. | Automatic correction of trajectory data |
DE102013211126A1 (en) * | 2013-06-14 | 2014-12-18 | Robert Bosch Gmbh | Method for modeling an environment of a vehicle |
JP6506279B2 (en) * | 2014-06-16 | 2019-04-24 | 株式会社日立製作所 | Map generation system and map generation method |
- 2017
- 2017-12-21 EP EP17835936.0A patent/EP3729225A1/en not_active Withdrawn
- 2017-12-21 WO PCT/HR2017/000019 patent/WO2019122939A1/en active Search and Examination
Also Published As
Publication number | Publication date |
---|---|
WO2019122939A1 (en) | 2019-06-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3729225A1 (en) | Interactive computer-implemented method, graphical user interface and computer program product for building a high-accuracy environment map | |
CN111024100B (en) | Navigation map updating method and device, readable storage medium and robot | |
CN114236552B (en) | Repositioning method and repositioning system based on laser radar | |
Zhou et al. | Road tracking in aerial images based on human–computer interaction and Bayesian filtering | |
US10803619B2 (en) | Method and system for efficiently mining dataset essentials with bootstrapping strategy in 6DOF pose estimate of 3D objects | |
Grisetti et al. | Improved techniques for grid mapping with Rao-Blackwellized particle filters | |
Alsadik et al. | Automated camera network design for 3D modeling of cultural heritage objects | |
CN109857123A (en) | Fusion method for an indoor SLAM map based on visual perception and laser detection | |
US20140316698A1 (en) | Observability-constrained vision-aided inertial navigation | |
CN108829116B (en) | Obstacle avoidance method and device based on a monocular camera | |
JP2023519641A (en) | Machine learning-based object identification using scale maps and 3D models | |
US11911921B1 (en) | Training of artificial intelligence model | |
JP2022093291A (en) | Induction inspection using object recognition model and navigation plan | |
Chen et al. | Improving completeness and accuracy of 3D point clouds by using deep learning for applications of digital twins to civil structures | |
CN112837241A (en) | Method and device for removing mapping ghosting, and storage medium | |
Andolfo et al. | Precise pose estimation of the NASA Mars 2020 Perseverance rover through a stereo‐vision‐based approach | |
CN114299192B (en) | Method, device, equipment and medium for positioning and mapping | |
Ehambram et al. | Interval-based visual-inertial LiDAR SLAM with anchoring poses | |
CN115034987A (en) | Map bidirectional updating adaptive device based on SLAM and scheduling system | |
CN113093715B (en) | Motion control method, device and equipment of unmanned equipment and storage medium | |
Li-Chee-Ming et al. | Augmenting ViSP's 3D model-based tracker with RGB-D SLAM for 3D pose estimation in indoor environments | |
CN110807818A (en) | RGB-D SLAM method and device based on key frame | |
Zhang et al. | Challenges of Automating Interior Construction Progress Monitoring | |
Lee et al. | A non-iterative pose-graph optimization algorithm for fast 2D SLAM | |
Pedersen et al. | Multiple target single cycle instrument placement |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: UNKNOWN |
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
17P | Request for examination filed |
Effective date: 20200624 |
AK | Designated contracting states |
Kind code of ref document: A1 |
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
AX | Request for extension of the european patent |
Extension state: BA ME |
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
17Q | First examination report despatched |
Effective date: 20201104 |
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
18D | Application deemed to be withdrawn |
Effective date: 20210316 |