WO2022110767A1 - Positioning and map construction method and apparatus, robot, and computer storage medium - Google Patents
- Publication number: WO2022110767A1 (application PCT/CN2021/100300)
- Authority: WO - WIPO (PCT)
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
- G01C21/30—Map- or contour-matching
- G01C21/32—Structuring or formatting of map data
Definitions
- the present application belongs to the field of autonomous navigation, and specifically relates to a positioning and map construction method and apparatus, a robot, and a computer storage medium.
- during operation, a mobile robot needs to continuously acquire data so as to localize itself in real time, and it also needs to construct a map of the surrounding environment from the acquired data.
- SLAM (Simultaneous Localization And Mapping) refers to real-time localization and map construction.
- the SLAM algorithms used in the related art have problems such as a large amount of calculation and low computational efficiency, which in turn affects the efficiency of real-time positioning and map construction.
- the purpose of the present application is to provide a positioning and map construction method and apparatus, a robot, and a computer storage medium, which can improve at least one of the above problems.
- An embodiment of the present application provides a method for positioning and building a map.
- the method includes: performing linear feature extraction on the point cloud data acquired at time t to obtain a plurality of corresponding line segments; estimating the relative posture of the robot at time t compared with time t-1; performing, according to the relative posture, line segment matching on the plurality of line segments corresponding to time t and the plurality of line segments obtained by linear feature extraction at time t-1, to obtain the straight line to which each line segment belongs; and jointly optimizing the posture of the robot at each moment and the two-line parameterized representations of all straight lines, to obtain the posture of the robot at each moment and the line parameters of all straight lines.
- in some embodiments, performing line segment matching on the multiple line segments corresponding to time t and the multiple line segments obtained by linear feature extraction at time t-1 includes: determining the robot coordinate system formed by the robot at time t-1 as the reference coordinate system; converting, according to the relative posture, the coordinates of the two end points of each line segment corresponding to time t into the reference coordinate system to obtain the expression of the line segment corresponding to time t in the reference coordinate system; and matching the expression of the line segment corresponding to time t in the reference coordinate system with the expressions of the multiple line segments corresponding to time t-1.
- in some embodiments, the method further includes: for each line segment corresponding to time t, filtering out difference line segments from all line segments corresponding to time t-1 according to a predetermined attribute feature and a pre-stored attribute feature threshold, and determining the remaining line segments corresponding to time t-1 as the candidate line segments corresponding to that line segment;
- matching the expression of the line segment corresponding to time t in the reference coordinate system with the expressions of the multiple line segments corresponding to time t-1 then includes: for each line segment corresponding to time t, matching the expression of the line segment in the reference coordinate system with the expressions of the candidate line segments corresponding to the line segment.
- in some embodiments, the method further includes: determining whether the number of candidate line segments corresponding to the line segment is zero; if it is zero, determining that the line segment belongs to a new straight line.
- in some embodiments, the matching includes: for each line segment corresponding to time t, forming a line segment pair from that line segment and each single line segment corresponding to time t-1, calculating the loss value of each line segment pair according to a pre-built loss function, and determining the line segment pair corresponding to the minimum loss value as belonging to the same straight line; wherein d1 and d2 are the distances from the two end points of the line segment to the other line segment in the pair, d0 is the length of the overlapping area of the two line segments in the pair, the length of the line segment is also used, λ is a preset control factor parameter, and d1, d2, d0 and the segment length are all determined from the expressions of the two line segments in the pair.
- in some embodiments, jointly optimizing the posture of the robot at each moment and the two-line parameterized representations of all straight lines to obtain the posture of the robot at each moment and the line parameters of all straight lines includes: setting the expressions of the posture of the robot at the various moments; setting the expressions of all straight lines obtained after line segment matching; constructing a residual vector from the posture expressions and the expressions of all straight lines; obtaining, by minimizing the sum of squares of the residual vector, an optimal estimation formula over the postures at each moment and all straight lines; and iteratively solving the optimal estimation formula to obtain the posture at each moment and the line parameters of all straight lines.
- the embodiment of the present application provides a positioning and map construction device, which includes: an extraction module, an estimation module, a matching module, and an optimization module.
- the extraction module is configured to perform linear feature extraction on the point cloud data acquired at time t to obtain a plurality of corresponding line segments;
- the estimation module is configured to estimate the relative posture of the robot at time t compared with time t-1;
- the matching module is configured to perform line segment matching, according to the relative posture, on the plurality of line segments corresponding to time t and the plurality of line segments obtained by linear feature extraction at time t-1, to obtain the straight line to which each line segment belongs;
- the optimization module is configured to jointly optimize the posture of the robot at each moment and the two-line parameterized representations of all straight lines, to obtain the posture of the robot at each moment and the line parameters of all straight lines.
- the matching module is configured to: determine the robot coordinate system formed by the robot at time t-1 as the reference coordinate system; convert, according to the relative posture, the coordinates of the two end points of each line segment corresponding to time t into the reference coordinate system, and obtain the expression of the line segment corresponding to time t in the reference coordinate system; and match the expression of the line segment corresponding to time t in the reference coordinate system with the expressions of the multiple line segments corresponding to time t-1.
- the device further includes a filtering module configured to, for each line segment corresponding to time t, filter out the difference line segments from all line segments corresponding to time t-1 according to a predetermined attribute feature and a pre-stored attribute feature threshold, and determine the remaining line segments corresponding to time t-1 as the candidate line segments corresponding to the line segment;
- the matching module is configured to, for each line segment corresponding to the time t, match the expression of the line segment in the reference coordinate system with the expression of the candidate line segment corresponding to the line segment.
- the device further includes a determination module configured to determine whether the number of candidate line segments corresponding to the line segment is zero; when the determination module determines that it is zero, an adding module determines that the line segment belongs to a new straight line.
- the matching module is configured to, for each line segment corresponding to time t, form a line segment pair from the line segment and each single line segment corresponding to time t-1, calculate the loss value of each line segment pair according to the pre-built loss function, and determine the line segment pair corresponding to the minimum loss value as belonging to the same straight line; wherein d1 and d2 are the distances from the two end points of the line segment to the other line segment in the pair, d0 is the length of the overlapping area of the two line segments in the pair, the length of the line segment is also used, λ is a preset control factor parameter, and all of these quantities are determined from the expressions of the two line segments in the pair.
- the optimization module is configured to: set the expressions of the posture of the robot at the various moments; set the expressions of all straight lines obtained after line segment matching; construct a residual vector from the posture expressions and the expressions of all straight lines; obtain, by minimizing the sum of squares of the residual vector, an optimal estimation formula over the postures at each moment and all straight lines; and iteratively solve the optimal estimation formula to obtain the posture at each moment and the line parameters of all straight lines.
- An embodiment of the present application further provides a robot, including a memory and a processor connected to each other; the memory is configured to store a program, and the processor calls the program stored in the memory to perform any of the possible positioning and map construction methods above.
- Embodiments of the present application further provide a non-volatile computer-readable storage medium (hereinafter referred to as a computer storage medium), on which a computer program is stored; when executed, the computer program performs any of the possible positioning and map construction methods above.
- FIG. 1 shows a schematic structural diagram of a robot provided by an embodiment of the present application.
- FIG. 2 shows a flowchart of the method for positioning and building a map provided by an embodiment of the present application.
- FIG. 3 shows a schematic diagram of linear feature extraction according to a point cloud provided by an embodiment of the present application.
- FIG. 4 shows a schematic diagram of line segment matching provided by an embodiment of the present application.
- FIG. 5 shows a structural block diagram of the positioning and map building apparatus provided by the embodiment of the present application.
- Reference numerals: 100 - robot; 110 - processor; 120 - memory; 130 - laser emitting component; 400 - positioning and map construction apparatus; 410 - extraction module; 420 - estimation module; 430 - matching module; 440 - optimization module.
- the embodiments of the present application provide a positioning and map construction method, device, robot, and computer storage medium, which can reduce the amount of calculation in the calculation process, thereby improving the calculation efficiency.
- the technology can be implemented by corresponding software, by hardware, or by a combination of software and hardware.
- the embodiments of the present application are described in detail below.
- referring to FIG. 1, a robot 100 for implementing the positioning and map construction method and apparatus according to the embodiments of the present application is described first.
- the robot 100 may include: a processor 110 , a memory 120 , and a laser emitting component 130 .
- the components and structures of the robot 100 shown in FIG. 1 are only exemplary and not restrictive, and the robot 100 may also have other components and structures as required.
- the processor 110 , the memory 120 , the laser emitting component 130 and other components that may be present in the robot 100 are directly or indirectly electrically connected to each other to realize data transmission or interaction.
- the processor 110, the memory 120, the laser emitting component 130 and other possible components may be electrically connected to each other through one or more communication buses or signal lines.
- after the laser emitting component 130 is powered on, it can emit laser light at a fixed frequency and receive the laser light reflected back by foreign objects, so as to generate point cloud data corresponding to each laser emission moment.
- the laser emitting component 130 may be a multi-line laser component for emitting multi-line laser light; alternatively, the laser emitting component 130 may be a single-line laser component for emitting single-line laser light.
- since the data complexity of point cloud data obtained by a single-line laser is lower than that of point cloud data obtained by a multi-line laser, a single-line laser may be used directly as the laser emitting component 130 for the purpose of reducing the amount of calculation and data complexity, so that the subsequently obtained point cloud data consists of two-dimensional discrete points.
- the memory 120 is used for storing programs, such as a program corresponding to the positioning and map construction method appearing later, or the positioning and map construction apparatus appearing later.
- the location and map construction apparatus includes at least one software function module that can be stored in the memory 120 in the form of software or firmware.
- the software function modules included in the positioning and map construction apparatus may also be solidified in the operating system (OS) of the robot 100.
- the processor 110 is configured to execute executable modules stored in the memory 120, such as the software function modules or computer programs included in the positioning and map construction apparatus.
- the processor 110 can execute the computer program, for example, to perform: extracting linear features from the point cloud data acquired at time t to obtain a plurality of corresponding line segments; estimating the relative posture of the robot at time t compared with time t-1; performing, according to the relative posture, line segment matching on the plurality of line segments corresponding to time t and the plurality of line segments obtained by linear feature extraction at time t-1, to obtain the straight line to which each line segment belongs; and jointly optimizing the posture of the robot at each moment and the two-line parameterized representations of all straight lines to obtain the posture of the robot at each moment and the line parameters of all straight lines.
- the method of any embodiment of the present application may be applied to the processor 110 or implemented by the processor 110.
- an embodiment of the present application provides a positioning and map construction method applied to the above-mentioned robot 100 .
- the embodiments of the present application will be described below with reference to steps S110 to S140 shown in FIG. 2 , taking the point cloud data composed of two-dimensional discrete points as an example.
- Step S110: extract linear features from the point cloud data acquired at time t to obtain a plurality of corresponding line segments.
- the laser emitting component 130 can emit laser light to the outside, so as to acquire point cloud data used to reflect the environmental conditions at the current moment.
- the robot 100 can successively obtain the point cloud data of the current moment through the laser emitting component 130 .
- the two-dimensional discrete points produced by a single-line laser can be represented by the formula P = (x, y), where x represents the abscissa and y the ordinate of the two-dimensional discrete point in the robot coordinate system o at the current moment; the expression P representing a two-dimensional discrete point thus includes two real variables, namely x and y.
- a first identification point and a second identification point are preset on the robot 100 .
- the robot 100 can determine the robot coordinate system corresponding to the current moment by acquiring the position of the first identification point at the current moment and the position of the second identification point at the current moment.
- the robot 100 takes the position of the first identification point as the coordinate origin, takes the line connecting the position of the second identification point and the position of the first identification point as the y-axis, and takes the straight line perpendicular to the y-axis as the x-axis, thereby determining the robot coordinate system corresponding to the current moment.
- the robot coordinate system corresponding to each moment may be different.
- the robot coordinate system determined at the initial time of startup of the robot 100 is used as the global world coordinate system w.
- the first identification point may be a position where the geometric center point of the robot 100 is located, or may be a point corresponding to other positions of the robot 100, which is not specifically limited in this embodiment of the present application.
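the coordinate-frame construction above can be sketched as follows. This is a minimal illustration assuming 2D points; the function name and the sign chosen for the perpendicular x-axis are assumptions, not taken from the application:

```python
import math

def robot_frame(p1, p2):
    """Build the robot coordinate system at the current moment.

    p1: position of the first identification point (the coordinate origin).
    p2: position of the second identification point; the line p1 -> p2
        defines the y-axis, and the x-axis is perpendicular to it.
    Returns (origin, x_axis, y_axis) with unit axis vectors.
    """
    ox, oy = p1
    dx, dy = p2[0] - ox, p2[1] - oy
    n = math.hypot(dx, dy)               # distance between the two points
    y_axis = (dx / n, dy / n)            # unit vector toward the second point
    # assumed convention: rotate the y-axis by -90 degrees to get the x-axis
    x_axis = (y_axis[1], -y_axis[0])
    return (ox, oy), x_axis, y_axis
```

for example, with identification points at (0, 0) and (0, 2), the frame has y-axis (0, 1) and x-axis (1, 0).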
- linear feature extraction may be performed using a pre-saved, known linear feature extraction algorithm, such as the least squares algorithm.
- ⁇ represents the angle between the normal vector of the line where the line segment is located and the x-axis in the robot coordinate system o at the current moment
- s i , e i represent the two endpoints of the line segment i
- n is a positive integer .
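the application names only the least squares algorithm without reproducing its details here; the sketch below shows one common variant, a total-least-squares fit of a run of points to the normal form x·cos θ + y·sin θ = ρ, matching the θ definition above. Segmentation of the scan into point runs is omitted, and the function name is a placeholder:

```python
import math

def fit_line(points):
    """Total-least-squares line fit in normal form:
    x*cos(theta) + y*sin(theta) = rho,
    where theta is the angle between the line's normal and the x-axis."""
    n = len(points)
    mx = sum(p[0] for p in points) / n   # centroid
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points)
    syy = sum((p[1] - my) ** 2 for p in points)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points)
    # closed-form minimizer of the sum of squared perpendicular distances
    theta = 0.5 * math.atan2(-2.0 * sxy, syy - sxx)
    rho = mx * math.cos(theta) + my * math.sin(theta)
    return theta, rho
```

for collinear points along y = x, the fit returns a normal at 45 degrees to the axes and rho = 0.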
- Step S120: estimate the relative posture of the robot at time t compared with time t-1.
- time t-1 can also be expressed as time (t-1).
- the position and orientation of the robot 100 at time t are collectively referred to as the posture of the robot 100 at time t.
- correspondingly, the posture of the robot 100 at the moment before time t is its posture at time t-1.
- the expression T that represents a posture includes three real variables, namely x, y, and the heading angle.
- the estimation algorithm for the relative posture of the robot at different times may include, but is not limited to: the posture estimation algorithm of the robot wheel encoder, the matching algorithm of the point cloud iterative closest point (ICP, Iterative Closest Point), etc.
- the embodiments of the present application do not limit the specific type of the estimation algorithm for the relative pose.
- the robot can calculate the relative attitude at time t compared to the above time t-1.
- v is the linear velocity of the robot 100 when moving forward, as measured by the built-in wheel encoder; ω is the corresponding angular velocity, also measured by the built-in wheel encoder; and Δt is the time interval between time t and time t-1.
- the aforementioned encoder is an angular displacement sensor, which determines the change of the robot pose by detecting the number of radians rotated by the robot wheel in a certain period of time. It can be mainly divided into three types: photoelectric type, contact type, and electromagnetic type. Specifically, the type of the selected encoder can be flexibly set according to the actual situation.
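with v, ω and Δt from the wheel encoder as defined above, the relative posture can be dead-reckoned with a standard unicycle arc model. The extract does not reproduce the exact formula, so this is a sketch under the assumption that x is the forward direction of the robot frame at time t-1:

```python
import math

def relative_pose(v, w, dt):
    """Relative pose (dx, dy, dtheta) of the robot at time t, expressed
    in the robot frame of time t-1, from wheel-encoder linear velocity v,
    angular velocity w, and time interval dt (unicycle arc model)."""
    dtheta = w * dt
    if abs(w) < 1e-9:                    # straight-line motion
        return v * dt, 0.0, dtheta
    r = v / w                            # turning radius of the arc
    dx = r * math.sin(dtheta)
    dy = r * (1.0 - math.cos(dtheta))
    return dx, dy, dtheta
```

driving straight at 1 m/s for 2 s gives (2, 0, 0); a half-turn arc ends displaced sideways by twice the turning radius.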
- in some embodiments, step S120 may be executed before step S110; in other embodiments, step S120 may be executed after step S110; and in still other embodiments, step S120 and step S110 may be executed simultaneously.
- Step S130: perform line segment matching on the line segments corresponding to time t and the line segments obtained by linear feature extraction at time t-1 according to the relative posture, to obtain the straight line to which each line segment belongs.
- in general, the laser emitting component 130 collects point cloud data in the environment at a rate of 10-50 frames per second; therefore, for the same spatial scene, the differences between the point cloud data collected in several adjacent frames are small.
- when a map needs to be constructed, for the purpose of reducing the amount of calculation, the line segments extracted at adjacent moments need to be matched with each other, so as to associate the line segments belonging to the same straight line and determine that straight line.
- the process of line segment matching may refer to the following method.
- Step S131: according to the relative posture, convert the coordinates of the two end points of each line segment corresponding to time t into the reference coordinate system, and obtain the expression of the line segment corresponding to time t in the reference coordinate system.
- Step S132: match the expression of the line segment corresponding to time t in the reference coordinate system with the expressions of the multiple line segments corresponding to time t-1.
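step S131 amounts to a rigid 2D transform of each endpoint into the reference frame; the (dx, dy, dθ) layout of the relative posture below is an assumed convention:

```python
import math

def to_reference(point, rel_pose):
    """Transform a line-segment endpoint observed in the robot frame at
    time t into the reference frame (the robot frame at time t-1), given
    the relative pose (dx, dy, dtheta) of frame t expressed in frame t-1."""
    x, y = point
    dx, dy, dth = rel_pose
    c, s = math.cos(dth), math.sin(dth)
    # rotate by dtheta, then translate by (dx, dy)
    return (c * x - s * y + dx, s * x + c * y + dy)
```

with a pure 90-degree rotation, the point (1, 0) maps to (0, 1); with a pure translation (1, 1), the point (2, 3) maps to (3, 4).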
- the line segment corresponding to the current moment after the coordinate system transformation may be directly matched with the line segment corresponding to the previous moment.
- the line segments obtained at the previous moment may also be filtered, so as to filter out the difference line segments in the line segments corresponding to the previous moment, and determine the remaining line segments as candidate line segments.
- that is, the line segments corresponding to the previous moment are screened against each line segment corresponding to the current moment.
- the line segment corresponding to the current moment and the candidate line segment corresponding to the previous moment after the coordinate system transformation can be directly matched, thereby reducing the amount of data when performing the matching operation.
- the candidate line segments can be screened by referring to the following methods.
- specifically, for each line segment corresponding to the current moment, the difference line segments can be filtered out of all line segments corresponding to the previous moment by comparing the predetermined attribute features against the pre-stored attribute feature thresholds, and the remaining line segments of the previous moment are determined as the candidate line segments corresponding to that line segment.
- to this end, the expression of the current line segment in the reference coordinate system corresponding to the previous moment r can be aligned with each line segment observed by the robot 100 at the previous moment r, so as to determine the attribute features between the two line segments to be compared.
- the attribute features include, but are not limited to, at least one of the angle difference between the two line segments to be compared, the distance between the two line segments to be compared, and the degree of coincidence between the two line segments to be compared.
- the attribute feature thresholds corresponding to the above-listed attribute features are an angle difference threshold, a distance threshold, and a coincidence threshold, respectively.
- the angle difference between the two line segments to be compared is the difference between the angle of the current line segment in the reference coordinate system and the angle of the previous-moment line segment in the reference coordinate system; if the angle difference exceeds the preset angle difference threshold (for example, 5°), the probability that the two line segments belong to the same straight line is considered small, and the previous-moment line segment is a difference line segment.
- the distances between the two line segments to be compared are the distances d1 and d2 from the two end points of the current line segment, expressed in the reference coordinate system, to the previous-moment line segment in the reference coordinate system; if these distances exceed a preset distance threshold (for example, 10 cm), the previous-moment line segment is likewise a difference line segment.
- the degree of coincidence d0 between the two line segments to be compared is obtained by projecting the end points of the current line segment onto the line of the previous-moment line segment; d0 is the length of the overlapping region of the two segments. if d0 is less than the coincidence threshold (for example, 10 cm), the probability that the two line segments belong to the same straight line is considered small, and the previous-moment line segment is a difference line segment.
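the three attribute-feature tests above can be combined into one candidate filter. The thresholds are the example values from the text (5°, 10 cm, 10 cm); how d1, d2 and the overlap are computed from the segment expressions is described above, so they are assumed precomputed here:

```python
import math

# example thresholds taken from the description: 5 degrees, 10 cm, 10 cm
ANGLE_THRESH = math.radians(5.0)
DIST_THRESH = 0.10
OVERLAP_THRESH = 0.10

def is_candidate(angle_diff, d1, d2, overlap):
    """Keep a previous-moment segment as a candidate only if the angle
    difference, the endpoint distances and the overlap with the current
    segment all pass the pre-stored attribute-feature thresholds."""
    if angle_diff > ANGLE_THRESH:
        return False
    if d1 > DIST_THRESH or d2 > DIST_THRESH:
        return False
    if overlap < OVERLAP_THRESH:
        return False
    return True
```

any segment failing a single test is discarded as a difference line segment, shrinking the set matched by the loss function.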
- a line segment pair is formed between the line segment and each single line segment corresponding to time t-1, and the loss value of each line segment pair corresponding to the line segment is calculated according to the pre-built loss function, and then The pair of line segments corresponding to the smallest loss value is determined to belong to the same straight line.
- in the pre-built loss function, d1 and d2 are the distances from the two end points of the line segment to the other line segment in the pair, d0 is the length of the overlapping area of the two line segments in the pair, the length of the line segment is also used, and λ is a preset control factor parameter (for example, it can be set to 0.5).
- in this way, the loss value between a certain line segment observed at the current moment c and each line segment observed at the previous moment r (or, if line segment filtering is performed, each candidate line segment observed at the previous moment r) can be obtained. then, by comparing the loss values, the previous-moment line segment corresponding to the minimum loss value and the current line segment are determined to belong to the same straight line. at this point, the same line identifier (e.g., line ID) can be set for the two line segments to indicate that they belong to the same straight line, which can then be represented by the same expression.
- if the final number of candidate line segments is 0, the line segment belongs to a new straight line; a new line ID can be assigned to it and inherited by subsequent line segments that belong to the same straight line.
- in this way, the line segments belonging to the same straight line at the previous moment and at the current moment can be associated.
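the exact form of the loss function is not reproduced in this extract, so `segment_loss` below is a hypothetical stand-in that penalizes the endpoint distances d1, d2 and rewards the overlap ratio d0/length with the control factor λ; the minimum-loss selection and the empty-candidate (new line) case follow the text:

```python
def segment_loss(d1, d2, d0, length, lam=0.5):
    """Hypothetical loss: small endpoint distances and a large overlap
    ratio d0/length mean the pair likely lies on the same straight line.
    The exact formula from the application is not recoverable here."""
    return d1 + d2 + lam * (1.0 - d0 / length)

def best_match(pair_features):
    """pair_features: list of (segment_id, d1, d2, d0, length) tuples,
    one per candidate pairing of the current segment. Returns the id of
    the minimum-loss candidate, or None when there are no candidates
    (the current segment then starts a new straight line)."""
    if not pair_features:
        return None
    return min(pair_features,
               key=lambda f: segment_loss(f[1], f[2], f[3], f[4]))[0]
```

the matched pair receives the same line ID; a `None` result triggers assignment of a new line ID.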
- Step S140: jointly optimize the posture of the robot at each moment and the two-line parameterized representations of all straight lines, to obtain the posture of the robot at each moment and the line parameters of all straight lines.
- for each observed line, a residual vector consisting of two residuals can be constructed.
- the posture at each moment and the straight line parameters of all straight lines can be obtained, so as to complete the trajectory estimation and map construction of the robot 100 .
- the nonlinear least squares problem can be solved iteratively using existing calculation methods, such as the Gauss-Newton algorithm, the Levenberg-Marquardt algorithm, or the steepest descent method.
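as a minimal illustration of the iterative solving step only (not the application's full joint pose-and-line problem, which stacks all postures and line parameters into one large vector), a scalar Gauss-Newton iteration with a numeric Jacobian can be sketched as:

```python
def gauss_newton(residual, x0, iters=20, eps=1e-6):
    """Minimal single-parameter Gauss-Newton iteration for
    min sum(r_i(x)^2), where residual(x) returns a list of residuals."""
    x = x0
    for _ in range(iters):
        r = residual(x)
        # forward-difference Jacobian (one column, since x is scalar)
        j = [(ri2 - ri) / eps for ri, ri2 in zip(r, residual(x + eps))]
        jtj = sum(ji * ji for ji in j)              # J^T J
        jtr = sum(ji * ri for ji, ri in zip(j, r))  # J^T r
        if jtj == 0.0:
            break
        x -= jtr / jtj                              # Gauss-Newton update
    return x
```

for a linear residual the iteration converges in a single step; the Levenberg-Marquardt variant would add a damping term to J^T J.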
- an embodiment of the present application further provides a positioning and map construction apparatus 400.
- the positioning and map construction apparatus 400 may be installed in the robot 100 described above.
- the positioning and map construction apparatus 400 may include: an extraction module 410, an estimation module 420 , a matching module 430 and an optimization module 440 .
- the extraction module 410 is configured to perform linear feature extraction on the acquired point cloud data at time t to obtain a plurality of corresponding line segments;
- Estimation module 420 configured to estimate the relative posture of the robot at time t compared to time t-1;
- the matching module 430 is configured to perform line segment matching on a plurality of line segments corresponding to the time t and a plurality of line segments obtained by performing linear feature extraction at the time t-1 according to the relative posture, to obtain the belonging of each line segment. straight line;
- the optimization module 440 is configured to jointly optimize the posture of the robot at each moment and the parametric representation of two straight lines of all straight lines, to obtain the posture of the robot at each moment and the straight line parameters of all straight lines.
- the estimation module 420 is configured to set the posture of the robot at time t and the posture of the robot at time t-1.
- the matching module 430 is configured to: determine the robot coordinate system formed by the robot at time t-1 as the reference coordinate system; convert, according to the relative posture, the coordinates of the two endpoints of each line segment corresponding to time t into the reference coordinate system, obtaining the expression of each such line segment in the reference coordinate system; and match the expressions of the line segments corresponding to time t in the reference coordinate system with the expressions of the plurality of line segments corresponding to time t-1.
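The endpoint conversion into the reference coordinate system can be sketched in 2D as follows. The relative posture is assumed here to be a planar rotation θ plus a translation (tx, ty); this parameterization and the sample coordinates are illustrative only:

```python
import numpy as np

def endpoints_to_reference(p1, p2, theta, tx, ty):
    """Transform a segment's two endpoints from the robot frame at time t
    into the reference frame (the robot frame at time t-1), given the
    relative posture (theta, tx, ty) of time t w.r.t. time t-1."""
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    t = np.array([tx, ty])
    return R @ np.asarray(p1) + t, R @ np.asarray(p2) + t

# Example: a pure translation of (1, 0) shifts both endpoints by (1, 0).
q1, q2 = endpoints_to_reference((0.0, 0.0), (1.0, 0.0), 0.0, 1.0, 0.0)
```

Once both time steps' segments are expressed in the same frame, their expressions can be compared directly during matching.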
- the device further includes a filtering module configured to, for each line segment corresponding to time t, filter out dissimilar line segments from all line segments corresponding to time t-1 according to predetermined attribute features and pre-stored attribute feature thresholds, and determine the remaining line segments corresponding to time t-1 as the candidate line segments for that line segment;
- the matching module 430 is configured to, for each line segment corresponding to time t, match the expression of the line segment in the reference coordinate system with the expressions of the candidate line segments corresponding to that line segment.
- the device further includes a determination module configured to determine whether the number of candidate line segments corresponding to the line segment is zero; for the case where the determination module determines that it is zero, the device further includes an adding module configured to determine that the line segment belongs to a new straight line.
- the matching module 430 is configured to, for each line segment corresponding to time t, form a line segment pair from that line segment and each single line segment corresponding to time t-1; calculate the loss value of each such line segment pair according to a pre-built loss function; and determine the two line segments of the pair with the minimum loss value as belonging to the same straight line; wherein d1 and d2 are the distances from the two endpoints of one line segment of the pair to the other line segment, d0 is the length of the overlapping area of the two line segments in the pair, λ is a preset control factor parameter, and d1, d2, d0 and the length of the line segment are all determined by the expressions of the two line segments in the pair.
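Since the concrete loss formula is elided in this text, the sketch below only illustrates how d1, d2 and d0 could be computed from two segments and combined with the control factor λ. The specific combination used here (sum of endpoint distances plus λ times the non-overlap ratio) is an assumption for illustration, not the patented formula:

```python
import numpy as np

def point_to_line_distance(p, a, b):
    """Perpendicular distance from point p to the infinite line through a, b."""
    p, a, b = map(np.asarray, (p, a, b))
    d = b - a
    v = p - a
    # 2D cross-product magnitude over the direction's length.
    return abs(d[0] * v[1] - d[1] * v[0]) / np.linalg.norm(d)

def overlap_length(a1, b1, a2, b2):
    """Length of segment (a2, b2) projected onto segment (a1, b1),
    clipped to the extent of (a1, b1)."""
    a1, b1, a2, b2 = map(np.asarray, (a1, b1, a2, b2))
    seg_len = np.linalg.norm(b1 - a1)
    u = (b1 - a1) / seg_len
    s = sorted([np.dot(a2 - a1, u), np.dot(b2 - a1, u)])
    lo, hi = max(0.0, s[0]), min(seg_len, s[1])
    return max(0.0, hi - lo)

def pair_loss(a1, b1, a2, b2, lam=1.0):
    """Hypothetical loss for a segment pair: small endpoint-to-line
    distances (d1, d2) and a large overlap (d0) give a small loss."""
    d1 = point_to_line_distance(a1, a2, b2)
    d2 = point_to_line_distance(b1, a2, b2)
    length = np.linalg.norm(np.asarray(b1) - np.asarray(a1))
    d0 = overlap_length(a1, b1, a2, b2)
    return d1 + d2 + lam * (1.0 - d0 / length)
```

Under this assumed form, two identical collinear segments score a loss of 0, and parallel segments at a small offset score the sum of the offsets, so picking the pair with minimum loss behaves as the matching step requires.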
- the optimization module 440 is configured to: set the expressions of the posture of the robot at each moment; set the expressions of all straight lines obtained after the line segment matching; construct a residual vector from the posture expressions and the line expressions; obtain the optimal estimation formula between the postures at each moment and all straight lines by minimizing the sum of squares of the residual vector; and iteratively solve the optimal estimation formula to obtain the posture at each moment and the line parameters of all straight lines.
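A minimal instance of the "minimize the sum of squares of the residual vector" step, under the simplifying assumption that only one line's parameters (m, c) are unknown. In the patent the unknown vector also contains the robot postures, which makes the problem nonlinear and requires the iterative solvers mentioned earlier; here the residuals are linear in (m, c), so the minimization reduces to one linear least-squares solve:

```python
import numpy as np

# Hypothetical observations: points known to lie on the line y = m*x + c
# (generated here from m = 0.5, c = -1.0).
xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
ys = 0.5 * xs - 1.0

# Residual vector r(m, c) = m*xs + c - ys, one entry per observation.
# Minimizing |r|^2 over (m, c) is the linear least-squares problem
# A @ [m, c] = ys with the design matrix below.
A = np.stack([xs, np.ones_like(xs)], axis=1)
(m, c), *_ = np.linalg.lstsq(A, ys, rcond=None)
```

Stacking the residuals of every observed segment (and every posture) into one long vector in the same way yields the joint problem that the optimization module solves.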
- the positioning and map construction device 400 provided by the embodiments of the present application has the same implementation principle and technical effects as the foregoing method embodiments.
- For parts not described in the device embodiments, reference may be made to the corresponding content of the foregoing method embodiments.
- an embodiment of the present application further provides a computer storage medium, where a computer program is stored on the computer storage medium, and when the computer program is run by a computer, the steps included in the above positioning and map construction method are executed.
- An embodiment of the present application also provides a robot, which includes a processor and a memory connected to the processor, where a computer program is stored in the memory; when the computer program is executed by the processor, the robot is caused to perform the steps included in the positioning and map construction method described above.
- the schematic diagram of the structure of the robot can be seen in FIG. 1 .
- The positioning and map construction method, device, robot and computer storage medium proposed in the embodiments of the present application perform linear feature extraction on the point cloud data acquired at time t to obtain a plurality of corresponding line segments; estimate the relative posture of the robot at time t compared to time t-1; perform, according to the relative posture, line segment matching between the plurality of line segments corresponding to time t and the plurality of line segments obtained by linear feature extraction at time t-1, to obtain the straight line to which each line segment belongs; and jointly optimize the posture of the robot at each moment and the two-line parameterized representations of all straight lines, to obtain the posture of the robot at each moment and the line parameters of all straight lines. Since a two-line parameterized representation is used to represent each straight line, the amount of computation is reduced as much as possible, thereby improving data processing efficiency and shortening computation time.
- Each block in the flowchart or block diagrams may represent a module, segment, or portion of code that contains one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures.
- Each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by dedicated hardware-based systems that perform the specified functions or actions, or by a combination of dedicated hardware and computer instructions.
- each functional module in the embodiments of the present application may be integrated together to form an independent part, or each module may exist alone, or two or more modules may be integrated to form an independent part.
- If the functions are implemented in the form of software function modules and sold or used as independent products, they can be stored in a computer storage medium.
- In essence, or in terms of the part that contributes to the prior art, the technical solution of the present application can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a notebook computer, a server, or a network device, etc.) to execute all or part of the steps of the methods described in the various embodiments of the present application.
- The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other media that can store program code.
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Image Analysis (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
Description
Claims (16)
- A positioning and map construction method, characterized in that the method comprises: performing linear feature extraction on acquired point cloud data at time t to obtain a plurality of corresponding line segments; estimating the relative posture of the robot at time t compared to time t-1; performing, according to the relative posture, line segment matching between the plurality of line segments corresponding to time t and the plurality of line segments obtained by linear feature extraction at time t-1, to obtain the straight line to which each line segment belongs; and jointly optimizing the posture of the robot at each moment and the two-line parameterized representations of all straight lines, to obtain the posture of the robot at each moment and the line parameters of all straight lines.
- The method according to claim 1, characterized in that estimating the relative posture of the robot at time t compared to time t-1 comprises:
- The method according to claim 1 or 2, characterized in that performing line segment matching between the plurality of line segments corresponding to time t and the plurality of line segments obtained by linear feature extraction at time t-1 comprises: determining the robot coordinate system formed by the robot at time t-1 as the reference coordinate system; converting, according to the relative posture, the coordinates of the two endpoints of the line segment corresponding to time t into the reference coordinate system, to obtain the expression of the line segment corresponding to time t in the reference coordinate system; and matching the expression of the line segment corresponding to time t in the reference coordinate system with the expressions of the plurality of line segments corresponding to time t-1.
- The method according to claim 3, characterized in that, before matching the expression of the line segment corresponding to time t in the reference coordinate system with the expressions of the plurality of line segments corresponding to time t-1, the method further comprises: for each line segment corresponding to time t, filtering out dissimilar line segments from all line segments corresponding to time t-1 according to predetermined attribute features and pre-stored attribute feature thresholds, and determining the remaining line segments corresponding to time t-1 as the candidate line segments for that line segment; correspondingly, the matching comprises: for each line segment corresponding to time t, matching the expression of the line segment in the reference coordinate system with the expressions of the candidate line segments corresponding to that line segment.
- The method according to claim 4, characterized in that, before matching the expression of the line segment in the reference coordinate system with the expressions of the candidate line segments corresponding to that line segment, the method further comprises: determining that the number of candidate line segments corresponding to the line segment is not zero; otherwise, the method further comprises: determining that the line segment belongs to a new straight line.
- The method according to any one of claims 1 to 6, characterized in that jointly optimizing the posture of the robot at each moment and the two-line parameterized representations of all straight lines, to obtain the posture of the robot at each moment and the line parameters of all straight lines, comprises: setting the expressions of the posture of the robot at each moment; setting the expressions of all straight lines obtained after the line segment matching; constructing a residual vector from the posture expressions and the line expressions; obtaining the optimal estimation formula between the postures at each moment and all straight lines by minimizing the sum of squares of the residual vector; and iteratively solving the optimal estimation formula to obtain the posture at each moment and the line parameters of all straight lines.
- A positioning and map construction apparatus, characterized in that the apparatus comprises: an extraction module configured to perform linear feature extraction on acquired point cloud data at time t to obtain a plurality of corresponding line segments; an estimation module configured to estimate the relative posture of the robot at time t compared to time t-1; a matching module configured to perform, according to the relative posture, line segment matching between the plurality of line segments corresponding to time t and the plurality of line segments obtained by linear feature extraction at time t-1, to obtain the straight line to which each line segment belongs; and an optimization module configured to jointly optimize the posture of the robot at each moment and the two-line parameterized representations of all straight lines, to obtain the posture of the robot at each moment and the line parameters of all straight lines.
- The apparatus according to claim 8, characterized in that the estimation module is configured to:
- The apparatus according to claim 8 or 9, characterized in that the matching module is configured to: determine the robot coordinate system formed by the robot at time t-1 as the reference coordinate system; convert, according to the relative posture, the coordinates of the two endpoints of the line segment corresponding to time t into the reference coordinate system, to obtain the expression of the line segment corresponding to time t in the reference coordinate system; and match the expression of the line segment corresponding to time t in the reference coordinate system with the expressions of the plurality of line segments corresponding to time t-1.
- The apparatus according to claim 10, characterized in that the apparatus further comprises a filtering module configured to: for each line segment corresponding to time t, filter out dissimilar line segments from all line segments corresponding to time t-1 according to predetermined attribute features and pre-stored attribute feature thresholds, and determine the remaining line segments corresponding to time t-1 as the candidate line segments for that line segment; correspondingly, the matching module is configured to: for each line segment corresponding to time t, match the expression of the line segment in the reference coordinate system with the expressions of the candidate line segments corresponding to that line segment.
- The apparatus according to claim 11, characterized in that the apparatus further comprises a determination module configured to determine that the number of candidate line segments corresponding to the line segment is not zero; for the case where the determination module determines that it is zero, the apparatus further comprises an adding module configured to determine that the line segment belongs to a new straight line.
- The apparatus according to any one of claims 8 to 13, characterized in that the optimization module is configured to: set the expressions of the posture of the robot at each moment; set the expressions of all straight lines obtained after the line segment matching; construct a residual vector from the posture expressions and the line expressions; obtain the optimal estimation formula between the postures at each moment and all straight lines by minimizing the sum of squares of the residual vector; and iteratively solve the optimal estimation formula to obtain the posture at each moment and the line parameters of all straight lines.
- A robot, characterized by comprising: a memory and a processor, the memory being connected to the processor; the memory is used to store a program; and the processor calls the program stored in the memory to execute the method according to any one of claims 1-7.
- A computer storage medium, characterized in that a computer program is stored thereon, and when the computer program is run by a computer, the method according to any one of claims 1-7 is executed.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2023552378A JP2023549298A (ja) | 2020-11-27 | 2021-06-16 | 自己位置推定・地図作成方法、ロボット及びコンピュータ記憶媒体 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011367335.9 | 2020-11-27 | ||
CN202011367335.9A CN112577500A (zh) | 2020-11-27 | 2020-11-27 | 定位与地图构建方法、装置、机器人及计算机存储介质 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022110767A1 true WO2022110767A1 (zh) | 2022-06-02 |
Family
ID=75126501
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2021/100300 WO2022110767A1 (zh) | 2020-11-27 | 2021-06-16 | 定位与地图构建方法、装置、机器人及计算机存储介质 |
Country Status (3)
Country | Link |
---|---|
JP (1) | JP2023549298A (zh) |
CN (1) | CN112577500A (zh) |
WO (1) | WO2022110767A1 (zh) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112577500A (zh) * | 2020-11-27 | 2021-03-30 | 北京迈格威科技有限公司 | 定位与地图构建方法、装置、机器人及计算机存储介质 |
CN113984071B (zh) * | 2021-09-29 | 2023-10-13 | 云鲸智能(深圳)有限公司 | 地图匹配方法、装置、机器人和计算机可读存储介质 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110052043A1 (en) * | 2009-08-25 | 2011-03-03 | Samsung Electronics Co., Ltd. | Method of mobile platform detecting and tracking dynamic objects and computer-readable medium thereof |
CN107655473A (zh) * | 2017-09-20 | 2018-02-02 | 南京航空航天大学 | 基于slam技术的航天器相对自主导航系统 |
CN110310331A (zh) * | 2019-06-18 | 2019-10-08 | 哈尔滨工程大学 | 一种基于直线特征与点云特征结合的位姿估计方法 |
CN110866927A (zh) * | 2019-11-21 | 2020-03-06 | 哈尔滨工业大学 | 一种基于垂足点线特征结合的ekf-slam算法的机器人定位与构图方法 |
CN111590595A (zh) * | 2020-06-30 | 2020-08-28 | 深圳市银星智能科技股份有限公司 | 一种定位方法、装置、移动机器人及存储介质 |
CN112577500A (zh) * | 2020-11-27 | 2021-03-30 | 北京迈格威科技有限公司 | 定位与地图构建方法、装置、机器人及计算机存储介质 |
-
2020
- 2020-11-27 CN CN202011367335.9A patent/CN112577500A/zh active Pending
-
2021
- 2021-06-16 JP JP2023552378A patent/JP2023549298A/ja active Pending
- 2021-06-16 WO PCT/CN2021/100300 patent/WO2022110767A1/zh active Application Filing
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110052043A1 (en) * | 2009-08-25 | 2011-03-03 | Samsung Electronics Co., Ltd. | Method of mobile platform detecting and tracking dynamic objects and computer-readable medium thereof |
CN107655473A (zh) * | 2017-09-20 | 2018-02-02 | 南京航空航天大学 | 基于slam技术的航天器相对自主导航系统 |
CN110310331A (zh) * | 2019-06-18 | 2019-10-08 | 哈尔滨工程大学 | 一种基于直线特征与点云特征结合的位姿估计方法 |
CN110866927A (zh) * | 2019-11-21 | 2020-03-06 | 哈尔滨工业大学 | 一种基于垂足点线特征结合的ekf-slam算法的机器人定位与构图方法 |
CN111590595A (zh) * | 2020-06-30 | 2020-08-28 | 深圳市银星智能科技股份有限公司 | 一种定位方法、装置、移动机器人及存储介质 |
CN112577500A (zh) * | 2020-11-27 | 2021-03-30 | 北京迈格威科技有限公司 | 定位与地图构建方法、装置、机器人及计算机存储介质 |
Also Published As
Publication number | Publication date |
---|---|
CN112577500A (zh) | 2021-03-30 |
JP2023549298A (ja) | 2023-11-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110261870B (zh) | 一种用于视觉-惯性-激光融合的同步定位与建图方法 | |
CN108764048B (zh) | 人脸关键点检测方法及装置 | |
WO2022110767A1 (zh) | 定位与地图构建方法、装置、机器人及计算机存储介质 | |
CN112734852A (zh) | 一种机器人建图方法、装置及计算设备 | |
Bosse | ATLAS: a framework for large scale automated mapping and localization | |
WO2019196476A1 (zh) | 基于激光传感器生成地图 | |
US20130094706A1 (en) | Information processing apparatus and processing method thereof | |
CN112541423A (zh) | 一种同步定位与地图构建方法和系统 | |
CN114565668A (zh) | 即时定位与建图方法及装置 | |
CN114241050B (zh) | 一种基于曼哈顿世界假设及因子图的相机位姿优化方法 | |
CN113420590B (zh) | 弱纹理环境下的机器人定位方法、装置、设备及介质 | |
Heinemann et al. | A combined monte-carlo localization and tracking algorithm for robocup | |
He et al. | Observation‐driven Bayesian filtering for global location estimation in the field area | |
CN112652020A (zh) | 一种基于AdaLAM算法的视觉SLAM方法 | |
CN116577801A (zh) | 一种基于激光雷达和imu的定位与建图方法及系统 | |
Choi et al. | Metric SLAM in home environment with visual objects and sonar features | |
KR101054520B1 (ko) | 실내 이동 로봇의 위치 및 방향 인식 방법 | |
WO2023072269A1 (zh) | 对象跟踪 | |
CN115239776B (zh) | 点云的配准方法、装置、设备和介质 | |
Cupec et al. | Global localization based on 3d planar surface segments | |
CN110399892B (zh) | 环境特征提取方法和装置 | |
Han et al. | Uav vision: Feature based accurate ground target localization through propagated initializations and interframe homographies | |
Chella et al. | Automatic place detection and localization in autonomous robotics | |
WO2024108753A1 (zh) | 基于激光雷达的移动机器人高效鲁棒全局定位方法 | |
Zeng et al. | Entropy-based Keyframe Established and Accelerated Fast LiDAR Odometry and Mapping |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21896276 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023552378 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 27/09/2023) |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 21896276 Country of ref document: EP Kind code of ref document: A1 |