CN110887493A - Trajectory estimation method, medium, terminal and device based on local map matching


Info

Publication number
CN110887493A
CN110887493A
Authority
CN
China
Prior art keywords
point cloud
frame
pose
relative
current
Prior art date
Legal status
Granted
Application number
CN201911205872.0A
Other languages
Chinese (zh)
Other versions
CN110887493B (en)
Inventor
蔡龙生
李国飞
Current Assignee
Shanghai Has A Robot Co Ltd
Original Assignee
Shanghai Has A Robot Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Has A Robot Co Ltd filed Critical Shanghai Has A Robot Co Ltd
Priority to CN201911205872.0A
Publication of CN110887493A
Application granted
Publication of CN110887493B
Active legal status
Anticipated expiration legal status

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30: Map- or contour-matching

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a trajectory estimation method, medium, terminal and device based on local map matching. The method uses only one laser sensor, which avoids the difficulty of multi-sensor calibration and fusion and makes it simpler to implement. The trajectory it produces is at least 10 times more accurate than that of existing inertial navigation or wheel odometry, with the position error controlled within 0.1%. At the same time, the time needed to compute each absolute pose is greatly reduced and can be kept within 20 ms, so the method is suitable for deployment on the robot body for real-time estimation.

Description

Trajectory estimation method, medium, terminal and device based on local map matching
[ technical field ]
The invention relates to the field of navigation and positioning, and in particular to a trajectory estimation method, medium, terminal and device based on local map matching.
[ background of the invention ]
With the progress of electronic hardware and the increase in processor computing power, mobile robots capable of responding in real time are gradually entering daily human life. Positioning capability is key to a robot achieving autonomous movement. Positioning is the process by which a robot estimates its own pose by sensing itself and the surrounding environment and performing a degree of data fusion. At present, robot positioning is generally divided into relative positioning and absolute positioning. Common absolute positioning methods include the Global Positioning System (GPS), ultrasonic positioning systems, and the like. Since the working area of a robot is usually fixed and limited, the robot's position on the earth is generally not considered, only its absolute position within the working area. Commonly used relative positioning refers to inertial navigation or odometry. Inertial navigation mainly uses the three-axis accelerometer, three-axis gyroscope, or three-axis magnetometer provided by an Inertial Measurement Unit (IMU): in a static state, the orientation is obtained from the magnetometer and the accelerometer; in motion, the position is obtained from the accelerometer and the attitude is corrected by the gyroscope. Wheel odometry mainly uses wheel encoders: while the robot is moving, its relative pose is calculated from the pulse counts of the encoders on the driving wheels, and its current absolute pose in the working area can be accumulated from the initial pose. In recent years there has been considerable progress in relative positioning, and odometry is no longer limited to wheel encoders. Instead of estimating its pose from its own motion alone, a robot can estimate its relative pose from environmental information, for example by deriving its relative displacement from adjacent laser point clouds or visual images. The relative displacement between adjacent point clouds or images is obtained by matching, and the absolute pose of the robot in the working area can then be accumulated from the initial pose.
Among absolute positioning methods, the Global Positioning System (GPS) cannot be used indoors and has a low update frequency. Although ultrasonic positioning is low-cost, compact and easy to integrate, it does not work over long distances, requires devices to be installed throughout the scene, and suffers from signal interference, so it cannot meet the positioning requirements of a robot. In relative positioning, computing the pose from Inertial Measurement Unit (IMU) data requires single or double integration, which, combined with the nonlinear temperature drift of the IMU, makes the estimate unstable and inaccurate. Wheel-encoder odometry is affected by the robot's own vibration and by wheel slip caused by the ground, so its pose estimate accumulates errors and is not accurate. In odometry based on point cloud or image matching, the accuracy of the pose estimate is limited by the matching quality and by the inherent bias of laser or visual perception of the environment, and the matching requires a certain overlap between frames, so the estimate can have a large error.
[ summary of the invention ]
The invention provides a trajectory estimation method, medium, terminal and device based on local map matching, which solve the above technical problems.
The technical solution adopted to solve the above technical problems is as follows: a trajectory estimation method based on local map matching comprises the following steps:
step 1, taking an initial frame point cloud obtained at an initial position as a reference point cloud;
step 2, converting the reference point cloud into a grid map;
step 3, acquiring a second frame of point cloud adjacent to the reference point cloud and taking the second frame of point cloud as a current point cloud;
step 4, matching the current point cloud with the reference point cloud to generate a first relative pose of the current point cloud relative to the reference point cloud;
step 5, accumulating the first relative pose to generate an absolute pose of a second frame of point cloud, and adding the second frame of point cloud to a grid map at the absolute pose of the second frame of point cloud;
step 6, acquiring a third frame of point cloud adjacent to the second frame of point cloud, matching the third frame of point cloud with the second frame of point cloud to generate a second relative pose of the third frame of point cloud relative to the second frame of point cloud, and adding the third frame of point cloud to the current grid map at the absolute pose of the second frame of point cloud;
step 7, extracting the angle of the second relative pose, and generating an absolute pose of a third frame of point cloud based on a map matching algorithm and a current grid map;
step 8, judging whether the distance between the absolute pose of the third frame of point cloud and the initial position is larger than a preset distance, if not, updating the third frame of point cloud to be the current point cloud, continuously acquiring point cloud data, and repeating the steps 4-8 to calculate the track of the robot; and if so, deleting the current grid map, taking the absolute pose of the third frame point cloud as the initial position, taking the third frame point cloud as the reference point cloud, repeating the steps 2-8, reestablishing the grid map and calculating the track of the robot.
In a preferred embodiment, an iterative closest point algorithm ICP is used to generate a first relative pose of the current point cloud with respect to the reference point cloud and a second relative pose of the third frame point cloud with respect to the second frame point cloud, the iterative closest point algorithm ICP is set to 2-5 iterations, and the initial transformation is set to (0, 0, 0).
In a preferred embodiment, the predetermined distance is 10-20 meters.
In a preferred embodiment, accumulating the first relative pose using a wheeled differential-drive model generates the absolute pose of the second frame point cloud.
A second aspect of embodiments of the present invention provides a computer-readable storage medium storing a computer program, which when executed by a processor, implements the trajectory estimation method based on local map matching described above.
A third aspect of the embodiments of the present invention provides a trajectory estimation terminal based on local map matching, including the computer-readable storage medium and a processor, where the processor implements the steps of the trajectory estimation method based on local map matching when executing a computer program on the computer-readable storage medium.
The fourth aspect of the embodiment of the invention provides a trajectory estimation device based on local map matching, which comprises a point cloud acquisition module, a first grid map transformation module, a relative pose generation module, a first absolute pose generation module, a second grid map transformation module, a second absolute pose generation module and a judgment module,
the point cloud acquisition module is used for acquiring point cloud data;
the first grid map transformation module is used for taking an initial frame point cloud as a reference point cloud and transforming the reference point cloud into a grid map;
the relative pose generation module is used for matching the current point cloud with the reference point cloud to generate a first relative pose of the current point cloud relative to the reference point cloud, and for matching the third frame point cloud with the second frame point cloud to generate a second relative pose of the third frame point cloud relative to the second frame point cloud;
the first absolute pose generation module is used for accumulating the first relative pose to generate an absolute pose of the second frame point cloud;
the second grid map transformation module is used for adding the second frame point cloud and the third frame point cloud to the grid map at the absolute pose of the second frame point cloud;
the second absolute pose generation module is used for extracting the angle of the second relative pose and generating the absolute pose of a third frame of point cloud based on a map matching algorithm and a current grid map;
the judging module is used for judging whether the distance between the absolute pose of the third frame of point cloud and the initial position is larger than the preset distance, if not, updating the third frame of point cloud to be the current point cloud, continuously acquiring point cloud data, and repeating the steps 4-8 to calculate the track of the robot; and if so, deleting the current grid map, taking the absolute pose of the third frame point cloud as the initial position, taking the third frame point cloud as the reference point cloud, repeating the steps 2-8, reestablishing the grid map and calculating the track of the robot.
In a preferred embodiment, an iterative closest point algorithm ICP is used to generate a first relative pose of the current point cloud with respect to the reference point cloud and a second relative pose of the third frame point cloud with respect to the second frame point cloud, the iterative closest point algorithm ICP is set to 2-5 iterations, and the initial transformation is set to (0, 0, 0).
In a preferred embodiment, the predetermined distance is 10-20 meters.
In a preferred embodiment, the first absolute pose generation module is specifically configured to generate the absolute pose of the second frame point cloud by accumulating the first relative pose using a wheeled differential-drive model.
The invention provides a simple, effective, stable and accurate method for calculating the relative pose and the absolute pose. It uses only one laser sensor, which avoids the difficulty of multi-sensor calibration and fusion and makes the method simpler to implement. The trajectory it produces is at least 10 times more accurate than that of existing inertial navigation or wheel odometry, with the position error controlled within 0.1%. At the same time, the time needed to compute each absolute pose is greatly reduced and can be kept within 20 ms, so the method is suitable for deployment on the robot body for real-time estimation.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
[ description of the drawings ]
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present invention and therefore should not be regarded as limiting its scope; for those skilled in the art, other related drawings can be obtained from these drawings without inventive effort.
Fig. 1 is a schematic flowchart of the trajectory estimation method based on local map matching provided in embodiment 1;
Fig. 2 is a schematic structural diagram of the trajectory estimation device based on local map matching provided in embodiment 2;
Fig. 3 is a schematic structural diagram of the trajectory estimation terminal based on local map matching provided in embodiment 3;
Fig. 4 is the movement trajectory of a mobile robot in a 90 m x 60 m scene;
Fig. 5 is the grid map created by the mobile robot in the same 90 m x 60 m scene.
[ detailed description of the embodiments ]
In order to make the objects, technical solutions and advantageous effects of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and specific embodiments. It should be understood that the specific embodiments described here are intended only to illustrate the invention and not to limit its scope.
Fig. 1 is a schematic flowchart of a trajectory estimation method based on local map matching according to embodiment 1 of the present invention, and as shown in fig. 1, the method includes the following steps:
Step 1, taking an initial frame of point cloud obtained at the initial position as the reference point cloud.
and 2, acquiring the absolute pose of the point cloud and the preset size of the grid map at the moment, and converting the reference point cloud into the grid map.
Step 3, acquiring a second frame of point cloud adjacent to the initial frame of point cloud as the current point cloud.
Step 4, matching the current point cloud with the reference point cloud using the iterative closest point (ICP) algorithm to generate a first relative pose of the current point cloud relative to the reference point cloud. An initial transformation must be given; in this embodiment it is (0, 0, 0). Since the first relative pose is only used as the initial transformation of the subsequent map matching algorithm, a relatively rough ICP result is sufficient, so to improve computational efficiency the number of iterations is set to 2-5, preferably 3. Of course, in other embodiments other iterative algorithms may be used to generate the first relative pose of the current point cloud relative to the reference point cloud; the specific calculation process is not described in detail here, but it falls within the scope of the present application.
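The following is a minimal sketch of the kind of rough 2-D point-to-point ICP step 4 describes: identity (0, 0, 0) initial transform and only a few iterations, since the result is merely an initial guess for the later map matching. The SciPy k-d tree and the helper name are assumptions; the patent does not prescribe a particular ICP implementation.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp_2d(current, reference, n_iter=3):
    """Rough 2-D point-to-point ICP: returns (dx, dy, dtheta) aligning `current` to `reference`.

    current, reference: N x 2 arrays. Starts from the identity transform (0, 0, 0)
    and runs only a few iterations, because the result only seeds the map matching step.
    """
    tree = cKDTree(reference)
    src = np.asarray(current, dtype=float)
    R_total = np.eye(2)
    t_total = np.zeros(2)
    for _ in range(n_iter):
        _, idx = tree.query(src)            # nearest-neighbour correspondences
        tgt = np.asarray(reference)[idx]
        mu_s, mu_t = src.mean(axis=0), tgt.mean(axis=0)
        H = (src - mu_s).T @ (tgt - mu_t)   # cross-covariance of the centered point sets
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:            # guard against a reflection solution
            Vt[1, :] *= -1
            R = Vt.T @ U.T
        t = mu_t - R @ mu_s
        src = src @ R.T + t                 # apply the incremental transform
        R_total = R @ R_total
        t_total = R @ t_total + t
    return t_total[0], t_total[1], np.arctan2(R_total[1, 0], R_total[0, 0])
```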
Step 5, accumulating the first relative pose using a wheeled differential-drive model to generate the absolute pose of the second frame of point cloud, and adding the second frame of point cloud to the grid map at that absolute pose. In other embodiments, other robot kinematic models may be used to accumulate the first relative pose and generate the absolute pose of the second frame of point cloud. In a preferred embodiment, the current point cloud is added to the grid map at the current absolute pose according to a grid map update rule, such as a grid occupancy probability model.
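A hedged sketch of step 5: the relative pose is composed onto the previous absolute pose on SE(2) (for a wheeled differential-drive robot the accumulation reduces to this planar composition), and the cells hit by the new cloud are raised with a simple log-odds occupancy update. The log-odds constants and helper names are illustrative assumptions, not values from the patent.

```python
import numpy as np

L_OCC = 0.85   # log-odds increment for a cell observed as occupied (assumed value)
L_MAX = 5.0    # clamp to keep cells from saturating

def compose_pose(abs_pose, rel_pose):
    """Accumulate a relative pose (dx, dy, dtheta) onto an absolute pose (x, y, theta) on SE(2)."""
    x, y, th = abs_pose
    dx, dy, dth = rel_pose
    return (x + np.cos(th) * dx - np.sin(th) * dy,
            y + np.sin(th) * dx + np.cos(th) * dy,
            (th + dth + np.pi) % (2 * np.pi) - np.pi)   # keep the angle in [-pi, pi)

def add_cloud_to_grid(log_odds, points, pose, map_origin, resolution=0.05):
    """Project a cloud taken at `pose` into the grid and raise the log-odds of the hit cells."""
    points = np.asarray(points, dtype=float)
    x, y, th = pose
    c, s = np.cos(th), np.sin(th)
    wx = c * points[:, 0] - s * points[:, 1] + x
    wy = s * points[:, 0] + c * points[:, 1] + y
    cols = np.floor((wx - map_origin[0]) / resolution).astype(int)
    rows = np.floor((wy - map_origin[1]) / resolution).astype(int)
    h, w = log_odds.shape
    valid = (rows >= 0) & (rows < h) & (cols >= 0) & (cols < w)
    np.add.at(log_odds, (rows[valid], cols[valid]), L_OCC)
    np.clip(log_odds, -L_MAX, L_MAX, out=log_odds)
    return log_odds
```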
Step 6, acquiring a third frame of point cloud adjacent to the second frame of point cloud, matching the third frame of point cloud with the second frame of point cloud to generate a second relative pose of the third frame relative to the second frame, and adding the third frame of point cloud to the current grid map at the absolute pose of the second frame of point cloud.
Step 7, extracting the angle of the second relative pose and generating the absolute pose of the third frame of point cloud based on a map matching algorithm and the current grid map. In the map matching algorithm, occupied cells are searched and scores are computed from the rasterized point cloud data and the current grid map, and the pose is adjusted step by step according to the scores. In this embodiment, the occupied-cell search range is set to within 5 cells around each rasterized point, which balances computational efficiency and accuracy.
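The sketch below illustrates the kind of scoring-and-adjustment loop step 7 describes: the cloud is rasterized at a candidate pose, scored by how many of its points fall on or near occupied cells (searching up to 5 cells around each point, as in this embodiment), and the pose is refined by greedy hill-climbing over small perturbations. The step sizes and scoring rule are assumptions rather than the patent's exact map matching algorithm.

```python
import numpy as np
from itertools import product

def score_pose(grid, points, pose, map_origin, resolution, search_cells=5):
    """Score a candidate pose: count points whose cell, or a neighbour within
    `search_cells` cells, is occupied in the current grid map."""
    points = np.asarray(points, dtype=float)
    x, y, th = pose
    c, s = np.cos(th), np.sin(th)
    cols = np.floor((c * points[:, 0] - s * points[:, 1] + x - map_origin[0]) / resolution).astype(int)
    rows = np.floor((s * points[:, 0] + c * points[:, 1] + y - map_origin[1]) / resolution).astype(int)
    h, w = grid.shape
    score = 0
    for r, cc in zip(rows, cols):
        r0, r1 = max(r - search_cells, 0), min(r + search_cells + 1, h)
        c0, c1 = max(cc - search_cells, 0), min(cc + search_cells + 1, w)
        if r1 > r0 and c1 > c0 and (grid[r0:r1, c0:c1] > 0).any():
            score += 1
    return score

def match_to_map(grid, points, init_pose, map_origin, resolution=0.05,
                 lin_step=0.05, ang_step=0.01, n_rounds=10):
    """Greedy hill-climbing over small pose perturbations around `init_pose`."""
    best_pose = init_pose
    best_score = score_pose(grid, points, best_pose, map_origin, resolution)
    for _ in range(n_rounds):
        improved = False
        for dx, dy, dth in product((-lin_step, 0, lin_step),
                                   (-lin_step, 0, lin_step),
                                   (-ang_step, 0, ang_step)):
            cand = (best_pose[0] + dx, best_pose[1] + dy, best_pose[2] + dth)
            sc = score_pose(grid, points, cand, map_origin, resolution)
            if sc > best_score:
                best_pose, best_score, improved = cand, sc, True
        if not improved:
            break
    return best_pose
```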
Step 8, judging whether the distance between the absolute pose of the third frame of point cloud and the initial position is greater than a preset distance. If not, the third frame of point cloud is updated to be the current point cloud (i.e., a new second frame of point cloud), point cloud data continue to be acquired as a new third frame of point cloud, and steps 4-8 are repeated to calculate the robot trajectory. If so, the current grid map is deleted, the absolute pose of the third frame of point cloud is taken as the initial position, the third frame of point cloud is taken as the reference point cloud, and steps 2-8 are repeated to re-establish the grid map and calculate the robot trajectory. In a preferred embodiment, the preset distance is 10-20 meters, for example 15 meters. Fig. 4 and Fig. 5 show, respectively, the movement trajectory of the mobile robot and the grid map it created in a 90 m x 60 m scene under this preferred embodiment.
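Putting the pieces together, the outer loop of steps 1-8 with the local-map reset might look like the sketch below, reusing the illustrative helpers from the previous snippets (`icp_2d`, `compose_pose`, `add_cloud_to_grid`, `match_to_map`) and the 15 m reset threshold of the preferred embodiment. As a simplification it applies map matching from the second frame onward, and the local-map extent is an assumed value.

```python
import numpy as np

def estimate_trajectory(cloud_stream, reset_dist=15.0, resolution=0.05):
    """Trajectory estimation loop with local map matching and periodic local-map reset.

    cloud_stream: iterable of N x 2 laser point clouds (one frame per scan).
    Returns the list of absolute poses (x, y, theta) of every frame.
    """
    clouds = iter(cloud_stream)
    ref_cloud = next(clouds)                       # step 1: reference point cloud
    origin = (0.0, 0.0, 0.0)                       # initial position
    map_origin = (-20.0, -20.0)                    # assumed 40 m x 40 m local map around the origin
    grid = np.zeros((800, 800), dtype=float)
    grid = add_cloud_to_grid(grid, ref_cloud, origin, map_origin, resolution)  # step 2
    trajectory = [origin]
    prev_cloud, prev_pose = ref_cloud, origin

    for cloud in clouds:                           # steps 3 and 6: next adjacent frame
        rel = icp_2d(cloud, prev_cloud)            # steps 4/6: rough relative pose
        guess = compose_pose(prev_pose, rel)       # place the frame in the local map
        pose = match_to_map(grid, cloud, guess, map_origin, resolution)  # step 7: correct by map matching
        grid = add_cloud_to_grid(grid, cloud, pose, map_origin, resolution)
        trajectory.append(pose)

        # Step 8: rebuild the local map once we are far enough from where it was started.
        if np.hypot(pose[0] - origin[0], pose[1] - origin[1]) > reset_dist:
            origin = pose
            map_origin = (pose[0] - 20.0, pose[1] - 20.0)
            grid = np.zeros((800, 800), dtype=float)
            grid = add_cloud_to_grid(grid, cloud, pose, map_origin, resolution)
        prev_cloud, prev_pose = cloud, pose
    return trajectory
```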
The preferred embodiment provides a simple, effective, stable and accurate method for calculating the relative pose and the absolute pose: the relative pose is obtained by matching adjacent point clouds, the current point cloud is transformed into the local map using the relative pose as the initial transformation, and the absolute pose is then obtained by correcting it with the map matching algorithm. The invention uses only one laser sensor, which avoids the difficulty of multi-sensor calibration and fusion and makes the method simpler to implement. The trajectory it produces is at least 10 times more accurate than that of existing inertial navigation or wheel odometry, with the position error controlled within 0.1%. At the same time, the time needed to compute each absolute pose is greatly reduced and can be kept within 20 ms, so the method is suitable for deployment on the robot body for real-time estimation.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
An embodiment of the present invention further provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the above-mentioned trajectory estimation method based on local map matching is implemented.
Fig. 2 is a schematic structural diagram of a trajectory estimation device based on local map matching according to embodiment 2 of the present invention, as shown in fig. 2, including a point cloud obtaining module 100, a first grid map transformation module 200, a relative pose generation module 300, a first absolute pose generation module 400, a second grid map transformation module 500, a second absolute pose generation module 600, and a judgment module 700,
the point cloud obtaining module 100 is configured to obtain point cloud data;
the first grid map transformation module 200 is configured to use an initial frame point cloud as a reference point cloud, and transform the reference point cloud into a grid map;
the relative pose generating module 300 is configured to match the current point cloud with the reference point cloud to generate a first relative pose of the current point cloud with respect to the reference point cloud, and to match the third frame point cloud with the second frame point cloud to generate a second relative pose of the third frame point cloud with respect to the second frame point cloud;
the first absolute pose generation module 400 is configured to accumulate the first relative pose to generate an absolute pose of the second frame point cloud;
the second grid map transformation module 500 is configured to add the second frame point cloud and the third frame point cloud to the grid map at the absolute pose of the second frame point cloud;
the second absolute pose generating module 600 is configured to extract an angle of the second relative pose, and generate an absolute pose of a third frame of point cloud based on a map matching algorithm and a current grid map;
the judging module 700 is configured to judge whether the distance between the absolute pose of the third frame of point cloud and the initial position is greater than a preset distance, if not, update the third frame of point cloud to the current point cloud, continue to acquire point cloud data, and repeat steps 4-8 to calculate the robot track; and if so, deleting the current grid map, taking the absolute pose of the third frame point cloud as the initial position, taking the third frame point cloud as the reference point cloud, repeating the steps 2-8, reestablishing the grid map and calculating the track of the robot.
In a preferred embodiment, the relative pose generation module 300 generates the first relative pose of the current point cloud with respect to the reference point cloud and the second relative pose of the third frame point cloud with respect to the second frame point cloud using an iterative closest point (ICP) algorithm; the number of ICP iterations is set to 2-5, and the initial transformation is set to (0, 0, 0).
In a preferred embodiment, the predetermined distance is 10-20 meters.
In a preferred embodiment, the first absolute pose generation module 400 is specifically configured to generate the absolute pose of the second frame point cloud by accumulating the first relative pose using a wheeled differential-drive model.
The embodiment of the invention also provides a track estimation terminal based on local map matching, which comprises the computer readable storage medium and a processor, wherein the processor realizes the steps of the track estimation method based on local map matching when executing the computer program on the computer readable storage medium. Fig. 3 is a schematic structural diagram of a trajectory estimation terminal based on local map matching according to embodiment 3 of the present invention, and as shown in fig. 3, the trajectory estimation terminal 8 based on local map matching according to this embodiment includes: a processor 80, a readable storage medium 81 and a computer program 82 stored in said readable storage medium 81 and executable on said processor 80. The processor 80, when executing the computer program 82, implements the steps in the various method embodiments described above, such as steps 1 through 8 shown in fig. 1. Alternatively, the processor 80, when executing the computer program 82, implements the functions of the modules in the above-described device embodiments, such as the functions of the modules 100 to 700 shown in fig. 2.
Illustratively, the computer program 82 may be partitioned into one or more modules that are stored in the readable storage medium 81 and executed by the processor 80 to implement the present invention. The one or more modules may be a series of computer program instruction segments capable of performing specific functions, which are used for describing the execution process of the computer program 82 in the trajectory estimation terminal 8 based on local map matching.
The trajectory estimation terminal 8 based on local map matching may include, but is not limited to, a processor 80 and a readable storage medium 81. Those skilled in the art will understand that fig. 3 is only an example of the trajectory estimation terminal 8 based on local map matching and does not constitute a limitation on it; the terminal may include more or fewer components than those shown in the figure, combine some components, or have different components. For example, the trajectory estimation terminal based on local map matching may further include a power management module, an arithmetic processing module, an input-output device, a network access device, a bus, and the like.
The Processor 80 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware component, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The readable storage medium 81 may be an internal storage unit of the trajectory estimation terminal 8 based on local map matching, such as a hard disk or a memory of the trajectory estimation terminal 8 based on local map matching. The readable storage medium 81 may also be an external storage device of the trajectory estimation terminal 8 based on local map matching, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are equipped on the trajectory estimation terminal 8 based on local map matching. Further, the readable storage medium 81 may include both an internal storage unit and an external storage device of the trajectory estimation terminal 8 based on local map matching. The readable storage medium 81 is used for storing the computer program and other programs and data required for the trajectory estimation terminal based on the local map matching. The readable storage medium 81 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and method steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The invention is not limited solely to that described in the specification and embodiments, and additional advantages and modifications will readily occur to those skilled in the art, so that the invention is not limited to the specific details, representative apparatus, and illustrative examples shown and described herein, without departing from the spirit and scope of the general concept as defined by the appended claims and their equivalents.

Claims (10)

1. A trajectory estimation method based on local map matching is characterized by comprising the following steps:
step 1, taking an initial frame point cloud obtained at an initial position as a reference point cloud;
step 2, converting the reference point cloud into a grid map;
step 3, acquiring a second frame of point cloud adjacent to the reference point cloud and taking the second frame of point cloud as a current point cloud;
step 4, matching the current point cloud with the reference point cloud to generate a first relative pose of the current point cloud relative to the reference point cloud;
step 5, accumulating the first relative pose to generate an absolute pose of a second frame of point cloud, and adding the second frame of point cloud to a grid map at the absolute pose of the second frame of point cloud;
step 6, acquiring a third frame of point cloud adjacent to the second frame of point cloud, matching the third frame of point cloud with the second frame of point cloud to generate a second relative pose of the third frame of point cloud relative to the second frame of point cloud, and adding the third frame of point cloud to the current grid map at the absolute pose of the second frame of point cloud;
step 7, extracting the angle of the second relative pose, and generating an absolute pose of a third frame of point cloud based on a map matching algorithm and a current grid map;
step 8, judging whether the distance between the absolute pose of the third frame of point cloud and the initial position is larger than a preset distance, if not, updating the third frame of point cloud to be the current point cloud, continuously acquiring point cloud data, and repeating the steps 4-8 to calculate the track of the robot; and if so, deleting the current grid map, taking the absolute pose of the third frame point cloud as the initial position, taking the third frame point cloud as the reference point cloud, repeating the steps 2-8, reestablishing the grid map and calculating the track of the robot.
2. The trajectory estimation method based on local map matching according to claim 1, wherein an iterative closest point (ICP) algorithm is used to generate the first relative pose of the current point cloud with respect to the reference point cloud and the second relative pose of the third frame point cloud with respect to the second frame point cloud, the number of ICP iterations is set to 2-5, and the initial transformation is set to (0, 0, 0).
3. The local map matching-based trajectory estimation method according to claim 1 or 2, wherein the preset distance is 10-20 meters.
4. The local map matching based trajectory estimation method of claim 3, wherein accumulating the first relative pose using a wheeled differential-drive model generates the absolute pose of the second frame point cloud.
5. A computer-readable storage medium, in which a computer program is stored, which, when being executed by a processor, implements the trajectory estimation method based on local map matching according to any one of claims 1 to 4.
6. A trajectory estimation terminal based on local map matching, comprising the computer-readable storage medium of claim 5 and a processor, wherein the processor, when executing the computer program on the computer-readable storage medium, implements the steps of the trajectory estimation method based on local map matching according to any one of claims 1 to 4.
7. A trajectory estimation device based on local map matching for implementing the trajectory estimation method based on local map matching of any one of claims 1 to 4, characterized by comprising a point cloud acquisition module, a first grid map transformation module, a relative pose generation module, a first absolute pose generation module, a second grid map transformation module, a second absolute pose generation module and a judgment module,
the point cloud acquisition module is used for acquiring point cloud data;
the first grid map transformation module is used for taking an initial frame point cloud as a reference point cloud and transforming the reference point cloud into a grid map;
the relative pose generation module is used for matching the current point cloud with the reference point cloud to generate a first relative pose of the current point cloud relative to the reference point cloud, and for matching the third frame point cloud with the second frame point cloud to generate a second relative pose of the third frame point cloud relative to the second frame point cloud;
the first absolute pose generation module is used for accumulating the first relative pose to generate an absolute pose of the second frame point cloud;
the second grid map transformation module is used for adding the second frame point cloud and the third frame point cloud to the grid map at the absolute pose of the second frame point cloud;
the second absolute pose generation module is used for extracting the angle of the second relative pose and generating the absolute pose of a third frame of point cloud based on a map matching algorithm and a current grid map;
the judging module is used for judging whether the distance between the absolute pose of the third frame of point cloud and the initial position is larger than the preset distance, if not, updating the third frame of point cloud to be the current point cloud, continuously acquiring point cloud data, and repeating the steps 4-8 to calculate the track of the robot; and if so, deleting the current grid map, taking the absolute pose of the third frame point cloud as the initial position, taking the third frame point cloud as the reference point cloud, repeating the steps 2-8, reestablishing the grid map and calculating the track of the robot.
8. The trajectory estimation device based on local map matching according to claim 7, wherein an iterative closest point algorithm (ICP) is used to generate a first relative pose of the current point cloud with respect to the reference point cloud and a second relative pose of the third frame point cloud with respect to the second frame point cloud, the number of iterations of the ICP is set to 2-5 times, and the initial transformation is set to (0, 0, 0).
9. The trajectory estimation device based on local map matching according to claim 7 or 8, characterized in that the preset distance is 10-20 meters.
10. The trajectory estimation device based on local map matching as claimed in claim 9, wherein the first absolute pose generation module is specifically configured to generate the absolute pose of the second frame point cloud by accumulating the first relative pose using a wheeled differential-drive model.
CN201911205872.0A 2019-11-29 2019-11-29 Track calculation method, medium, terminal and device based on local map matching Active CN110887493B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911205872.0A CN110887493B (en) 2019-11-29 2019-11-29 Track calculation method, medium, terminal and device based on local map matching

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911205872.0A CN110887493B (en) 2019-11-29 2019-11-29 Track calculation method, medium, terminal and device based on local map matching

Publications (2)

Publication Number Publication Date
CN110887493A true CN110887493A (en) 2020-03-17
CN110887493B CN110887493B (en) 2023-05-05

Family

ID=69749672

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911205872.0A Active CN110887493B (en) 2019-11-29 2019-11-29 Track calculation method, medium, terminal and device based on local map matching

Country Status (1)

Country Link
CN (1) CN110887493B (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107144292A (en) * 2017-06-08 2017-09-08 杭州南江机器人股份有限公司 The odometer method and mileage counter device of a kind of sports equipment
WO2019169540A1 (en) * 2018-03-06 2019-09-12 斯坦德机器人(深圳)有限公司 Method for tightly-coupling visual slam, terminal and computer readable storage medium
CN108917759A (en) * 2018-04-19 2018-11-30 电子科技大学 Mobile robot pose correct algorithm based on multi-level map match
US20190323843A1 (en) * 2018-07-04 2019-10-24 Baidu Online Network Technology (Beijing) Co., Ltd. Method for generating a high precision map, apparatus and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
纪嘉文; 杨明欣: "An indoor mapping and localization algorithm based on multi-sensor fusion" (一种基于多传感融合的室内建图和定位算法)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111398984A (en) * 2020-03-22 2020-07-10 华南理工大学 Self-adaptive laser radar point cloud correction and positioning method based on sweeping robot
CN111398984B (en) * 2020-03-22 2022-03-29 华南理工大学 Self-adaptive laser radar point cloud correction and positioning method based on sweeping robot
CN111461982A (en) * 2020-03-30 2020-07-28 北京百度网讯科技有限公司 Method and device for splicing point clouds
CN111461982B (en) * 2020-03-30 2023-09-22 北京百度网讯科技有限公司 Method and apparatus for splice point cloud
CN111881233A (en) * 2020-06-28 2020-11-03 广州文远知行科技有限公司 Distributed point cloud map construction method and device, server and computer readable storage medium
CN111881233B (en) * 2020-06-28 2022-01-18 广州文远知行科技有限公司 Distributed point cloud map construction method and device, server and computer readable storage medium
CN113094462A (en) * 2021-04-30 2021-07-09 腾讯科技(深圳)有限公司 Data processing method and device and storage medium
CN113094462B (en) * 2021-04-30 2023-10-24 腾讯科技(深圳)有限公司 Data processing method and device and storage medium
CN113744301A (en) * 2021-08-05 2021-12-03 深圳供电局有限公司 Motion trajectory estimation method and device for mobile robot and storage medium
CN114019977A (en) * 2021-11-03 2022-02-08 诺力智能装备股份有限公司 Path control method and device for mobile robot, storage medium and electronic device
CN114019977B (en) * 2021-11-03 2024-06-04 诺力智能装备股份有限公司 Path control method and device for mobile robot, storage medium and electronic equipment

Also Published As

Publication number Publication date
CN110887493B (en) 2023-05-05

Similar Documents

Publication Publication Date Title
CN110887493B (en) Track calculation method, medium, terminal and device based on local map matching
US10852139B2 (en) Positioning method, positioning device, and robot
CN109506642B (en) Robot multi-camera visual inertia real-time positioning method and device
JP7482978B2 (en) Vehicle Sensor Calibration and Localization
US20210183100A1 (en) Data processing method and apparatus
CN112304330B (en) Method for displaying running state of vehicle and electronic equipment
CN110986988B (en) Track calculation method, medium, terminal and device integrating multi-sensor data
CN111380514A (en) Robot position and posture estimation method and device, terminal and computer storage medium
CN110969649A (en) Matching evaluation method, medium, terminal and device of laser point cloud and map
CN111862214B (en) Computer equipment positioning method, device, computer equipment and storage medium
CN109724597B (en) Inertial navigation resolving method and system based on function iteration integral
CN111862215B (en) Computer equipment positioning method and device, computer equipment and storage medium
CN112835064B (en) Mapping positioning method, system, terminal and medium
CN112967392A (en) Large-scale park mapping and positioning method based on multi-sensor contact
CN104848861A (en) Image vanishing point recognition technology based mobile equipment attitude measurement method
CN111707262B (en) Point cloud matching method, medium, terminal and device based on closest point vector projection
CN113218389B (en) Vehicle positioning method, device, storage medium and computer program product
CN113252023A (en) Positioning method, device and equipment based on odometer
CN116380039A (en) Mobile robot navigation system based on solid-state laser radar and point cloud map
CN112729349B (en) Method and device for on-line calibration of odometer, electronic equipment and storage medium
CN115560744A (en) Robot, multi-sensor-based three-dimensional mapping method and storage medium
CN113960614A (en) Elevation map construction method based on frame-map matching
CN113034538B (en) Pose tracking method and device of visual inertial navigation equipment and visual inertial navigation equipment
Hu et al. Efficient Visual-Inertial navigation with point-plane map
CN109387187B (en) Sweeper positioning method and system based on visual feature points and sweeping robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant