WO2022110767A1 - Positioning and map construction method and apparatus, robot, and computer storage medium - Google Patents

Positioning and map construction method and apparatus, robot, and computer storage medium

Info

Publication number
WO2022110767A1
WO2022110767A1 PCT/CN2021/100300
Authority
WO
WIPO (PCT)
Prior art keywords
time
line segment
robot
line
coordinate system
Prior art date
Application number
PCT/CN2021/100300
Other languages
English (en)
French (fr)
Inventor
贺一家
赖文芊
刘骁
沈毅
Original Assignee
北京迈格威科技有限公司 (Beijing Megvii Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京迈格威科技有限公司 (Beijing Megvii Technology Co., Ltd.)
Priority to JP2023552378A priority Critical patent/JP2023549298A/ja
Publication of WO2022110767A1 publication Critical patent/WO2022110767A1/zh

Links

Images

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 — Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/26 — Navigation specially adapted for navigation in a road network
    • G01C21/28 — Navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30 — Map- or contour-matching
    • G01C21/32 — Structuring or formatting of map data

Definitions

  • The present application belongs to the field of autonomous navigation, and specifically relates to a positioning and map construction method and apparatus, a robot, and a computer storage medium.
  • During operation, a mobile robot needs to continuously acquire data so as to locate itself in real time; it also needs to construct a map of the surrounding environment from the acquired data.
  • SLAM (Simultaneous Localization And Mapping) refers to real-time localization and map construction.
  • However, the SLAM algorithms used in the related art suffer from problems such as a large amount of calculation and low computational efficiency, which in turn affects the efficiency of real-time positioning and map construction.
  • the purpose of the present application is to provide a positioning and map construction method, device, robot and computer storage medium, which can improve at least one of the above problems.
  • An embodiment of the present application provides a method for positioning and building a map.
  • The method includes: performing linear feature extraction on the point cloud data acquired at time t to obtain a plurality of corresponding line segments; estimating the relative posture of the robot at time t compared with time t-1; and, according to the relative posture, performing line segment matching between the plurality of line segments corresponding to time t and the plurality of line segments obtained by linear feature extraction at time t-1, to obtain the straight line to which each line segment belongs.
  • The method further includes jointly optimizing the posture of the robot at each moment and the two-parameter representations of all the straight lines, to obtain the posture of the robot at each moment and the straight-line parameters of all the straight lines.
  • Performing line segment matching between the plurality of line segments corresponding to time t and the plurality of line segments obtained by linear feature extraction at time t-1 includes: determining the robot coordinate system formed by the robot at time t-1 as the reference coordinate system;
  • according to the relative posture, converting the coordinates of the two endpoints of each line segment corresponding to time t into the reference coordinate system, to obtain the expression of that line segment in the reference coordinate system;
  • and matching the expression of each line segment corresponding to time t in the reference coordinate system against the expressions of the plurality of line segments corresponding to time t-1.
  • The method further includes: for each line segment corresponding to time t, filtering out the difference line segments from all the line segments corresponding to time t-1 according to predetermined attribute features and pre-stored attribute feature thresholds, and determining the remaining line segments corresponding to time t-1 as the candidate line segments corresponding to that line segment.
  • Matching the expression of each line segment corresponding to time t in the reference coordinate system against the expressions of the plurality of line segments corresponding to time t-1 then includes: for each line segment corresponding to time t, matching the expression of that line segment in the reference coordinate system against the expressions of the candidate line segments corresponding to it.
  • Before the matching, the method further includes: determining whether the number of candidate line segments corresponding to the line segment is zero; if the number is not zero, performing the matching; otherwise, determining that the line segment belongs to a new straight line.
  • The matching may also include: for each line segment corresponding to time t, forming a line segment pair from that line segment and each single line segment corresponding to time t-1, calculating the loss value of each line segment pair according to a pre-built loss function, and determining the line segment pair corresponding to the minimum loss value as belonging to the same straight line.
  • In the loss function, d 1 and d 2 are the distances from the two endpoints of the line segment to the other line segment in the pair, d 0 is the length of the overlapping region of the two line segments in the pair, λ is a preset control factor parameter, and d 1 , d 2 , d 0 and the length of the line segment are all determined from the expressions of the two line segments in the pair.
  • The joint optimization of the posture of the robot at each moment and the two-parameter representations of all the straight lines includes: setting expressions for the posture of the robot at each moment; setting expressions for all the straight lines obtained after line segment matching; constructing a residual vector from the posture expressions and the straight-line expressions; obtaining, by minimizing the sum of squares of the residual vector, the optimal estimation formula relating the postures at each moment and all the straight lines; and iteratively solving the optimal estimation formula to obtain the posture at each moment and the straight-line parameters of all the straight lines.
  • the embodiment of the present application provides a positioning and map construction device, the device includes: an extraction module, an estimation module, a matching module, and an optimization module.
  • the extraction module is configured to perform linear feature extraction on the acquired point cloud data at time t to obtain a plurality of corresponding line segments;
  • The estimation module is configured to estimate the relative posture of the robot at time t compared with time t-1;
  • the matching module is configured to perform, according to the relative posture, line segment matching between the plurality of line segments corresponding to time t and the plurality of line segments obtained by linear feature extraction at time t-1, to obtain the straight line to which each line segment belongs;
  • the optimization module is configured to jointly optimize the posture of the robot at each moment and the two-parameter representations of all the straight lines, to obtain the posture of the robot at each moment and the straight-line parameters of all the straight lines.
  • The matching module is configured to: determine the robot coordinate system formed by the robot at time t-1 as the reference coordinate system; convert, according to the relative posture, the coordinates of the two endpoints of each line segment corresponding to time t into the reference coordinate system, to obtain the expression of that line segment in the reference coordinate system; and match the expression of each line segment corresponding to time t in the reference coordinate system against the expressions of the plurality of line segments corresponding to time t-1.
  • The device further includes a filtering module configured to, for each line segment corresponding to time t, filter out the difference line segments from all the line segments corresponding to time t-1 according to predetermined attribute features and pre-stored attribute feature thresholds, and determine the remaining line segments corresponding to time t-1 as the candidate line segments corresponding to that line segment.
  • the matching module is configured to, for each line segment corresponding to the time t, match the expression of the line segment in the reference coordinate system with the expression of the candidate line segment corresponding to the line segment.
  • The device further includes a determination module configured to determine whether the number of candidate line segments corresponding to a line segment is zero; when the determination module determines that it is, an adding module of the device determines that the line segment belongs to a new straight line.
  • The matching module is configured to, for each line segment corresponding to time t, form a line segment pair from that line segment and each single line segment corresponding to time t-1, calculate the loss value of each line segment pair according to the pre-built loss function, and determine the line segment pair corresponding to the minimum loss value as belonging to the same straight line.
  • In the loss function, d 1 and d 2 are the distances from the two endpoints of the line segment to the other line segment in the pair, d 0 is the length of the overlapping region of the two line segments in the pair, λ is a preset control factor parameter, and d 1 , d 2 , d 0 and the length of the line segment are all determined from the expressions of the two line segments in the pair.
  • The optimization module is configured to: set expressions for the posture of the robot at each moment; set expressions for all the straight lines obtained after line segment matching; construct a residual vector from the posture expressions and the straight-line expressions; obtain, by minimizing the sum of squares of the residual vector, the optimal estimation formula relating the postures at each moment and all the straight lines; and iteratively solve the optimal estimation formula to obtain the posture at each moment and the straight-line parameters of all the straight lines.
  • An embodiment of the present application further provides a robot, including a memory and a processor connected to each other; the memory is configured to store a program, and the processor calls the program stored in the memory to perform any of the possible positioning and map construction methods above.
  • Embodiments of the present application further provide a non-volatile computer-readable storage medium (hereinafter referred to as a computer storage medium), on which a computer program is stored; when executed, the computer program performs any of the above possible positioning and map construction methods.
  • FIG. 1 shows a schematic structural diagram of a robot provided by an embodiment of the present application.
  • FIG. 2 shows a flowchart of the method for positioning and building a map provided by an embodiment of the present application.
  • FIG. 3 shows a schematic diagram of linear feature extraction according to a point cloud provided by an embodiment of the present application.
  • FIG. 4 shows a schematic diagram of line segment matching provided by an embodiment of the present application.
  • FIG. 5 shows a structural block diagram of the positioning and map building apparatus provided by the embodiment of the present application.
  • Reference numerals: 100-robot; 110-processor; 120-memory; 130-laser emitting component; 400-positioning and map construction apparatus; 410-extraction module; 420-estimation module; 430-matching module; 440-optimization module.
  • the embodiments of the present application provide a positioning and map construction method, device, robot, and computer storage medium, which can reduce the amount of calculation in the calculation process, thereby improving the calculation efficiency.
  • The technology can be implemented by corresponding software, hardware, or a combination of software and hardware.
  • the embodiments of the present application are described in detail below.
  • Referring to FIG. 1, a robot 100 for implementing the positioning and map construction method and apparatus according to the embodiment of the present application will be described.
  • the robot 100 may include: a processor 110 , a memory 120 , and a laser emitting component 130 .
  • the components and structures of the robot 100 shown in FIG. 1 are only exemplary and not restrictive, and the robot 100 may also have other components and structures as required.
  • the processor 110 , the memory 120 , the laser emitting component 130 and other components that may be present in the robot 100 are directly or indirectly electrically connected to each other to realize data transmission or interaction.
  • the processor 110, the memory 120, the laser emitting component 130 and other possible components may be electrically connected to each other through one or more communication buses or signal lines.
  • After the laser emitting component 130 is powered on, it can emit laser light at a fixed frequency and receive the laser light reflected by foreign objects, so as to generate point cloud data corresponding to each laser emission moment.
  • The laser emitting component 130 may be a multi-line laser component for emitting multi-line laser light; alternatively, it may be a single-line laser component for emitting single-line laser light.
  • Since the complexity of point cloud data obtained from a single-line laser is lower than that of point cloud data obtained from a multi-line laser, for the purpose of reducing the amount of calculation and the data complexity, a single-line laser may be used directly as the laser emitting component 130, so that the subsequently obtained point cloud data consists of two-dimensional discrete points.
  • The memory 120 is used for storing programs, such as the program corresponding to the positioning and map construction method or the positioning and map construction apparatus described later.
  • The positioning and map construction apparatus includes at least one software function module that can be stored in the memory 120 in the form of software or firmware.
  • The software function modules included in the positioning and map construction apparatus may also be solidified in the operating system (OS) of the robot 100.
  • The processor 110 is configured to execute executable modules stored in the memory 120, such as the software function modules or computer programs included in the positioning and map construction apparatus.
  • For example, the processor 110 can execute the computer program to perform: extracting linear features from the point cloud data acquired at time t to obtain a plurality of corresponding line segments; estimating the relative posture of the robot at time t compared with time t-1; performing, according to the relative posture, line segment matching between the plurality of line segments corresponding to time t and the plurality of line segments obtained by linear feature extraction at time t-1, to obtain the straight line to which each line segment belongs; and jointly optimizing the posture of the robot at each moment and the two-parameter representations of all the straight lines, to obtain the posture of the robot at each moment and the straight-line parameters of all the straight lines.
  • The method disclosed in any embodiment of the present application may be applied to, or implemented by, the processor 110.
  • an embodiment of the present application provides a positioning and map construction method applied to the above-mentioned robot 100 .
  • the embodiments of the present application will be described below with reference to steps S110 to S140 shown in FIG. 2 , taking the point cloud data composed of two-dimensional discrete points as an example.
  • Step S110: extract linear features from the point cloud data acquired at time t to obtain a plurality of corresponding line segments.
  • the laser emitting component 130 can emit laser light to the outside, so as to acquire point cloud data used to reflect the environmental conditions at the current moment.
  • the robot 100 can successively obtain the point cloud data of the current moment through the laser emitting component 130 .
  • The two-dimensional discrete points in the plane produced by a single-line laser can be represented by the formula P = (x, y).
  • Here, x represents the abscissa of the two-dimensional discrete point in the robot coordinate system o at the current moment, and y represents its ordinate in the same coordinate system.
  • The expression P representing a two-dimensional discrete point therefore includes two real variables, namely x and y.
  • a first identification point and a second identification point are preset on the robot 100 .
  • the robot 100 can determine the robot coordinate system corresponding to the current moment by acquiring the position of the first identification point at the current moment and the position of the second identification point at the current moment.
  • Specifically, the robot 100 takes the position of the first identification point as the coordinate origin, the line connecting the positions of the second and first identification points as the y-axis, and the straight line perpendicular to the y-axis as the x-axis, thereby determining the robot coordinate system corresponding to the current moment.
  • the robot coordinate system corresponding to each moment may be different.
  • the robot coordinate system determined at the initial time of startup of the robot 100 is used as the global world coordinate system w.
  • the first identification point may be a position where the geometric center point of the robot 100 is located, or may be a point corresponding to other positions of the robot 100, which is not specifically limited in this embodiment of the present application.
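The frame construction described above (origin at the first identification point, y-axis toward the second, x-axis perpendicular) can be sketched as follows; the function name and tuple representation are illustrative and not from the patent.

```python
import math

def robot_frame(p1, p2):
    """Build a 2D robot frame from two identification points (a sketch).
    p1 becomes the origin; the y-axis is the unit vector from p1 toward p2;
    the x-axis is obtained by rotating the y-axis 90 degrees clockwise."""
    ox, oy = p1
    dx, dy = p2[0] - ox, p2[1] - oy
    norm = math.hypot(dx, dy)
    y_axis = (dx / norm, dy / norm)   # unit vector toward the second point
    x_axis = (y_axis[1], -y_axis[0])  # perpendicular to the y-axis
    return (ox, oy), x_axis, y_axis
```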
  • Linear feature extraction can be performed using a pre-saved, known algorithm such as the least squares algorithm, yielding n line segments, where n is a positive integer.
  • For each line segment i, its expression records the angle between the normal vector of the line on which the segment lies and the x-axis of the robot coordinate system o at the current moment, and s i , e i denote the two endpoints of segment i.
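A least-squares line fit of the kind named above can be sketched as follows for one cluster of 2D points; this is an illustrative total-least-squares fit, not the patent's implementation, and the names are assumptions.

```python
import math

def fit_segment(points):
    """Fit a line to a cluster of 2D points by total least squares (PCA on
    the scatter), returning the angle of the line's normal vector relative
    to the x-axis and the two extreme points as the segment endpoints."""
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    sxx = sum((p[0] - cx) ** 2 for p in points)
    syy = sum((p[1] - cy) ** 2 for p in points)
    sxy = sum((p[0] - cx) * (p[1] - cy) for p in points)
    # direction of largest spread; the normal is perpendicular to it
    theta_dir = 0.5 * math.atan2(2.0 * sxy, sxx - syy)
    theta_normal = theta_dir + math.pi / 2.0
    # endpoints: points with extreme projections onto the line direction
    proj = [((p[0] - cx) * math.cos(theta_dir)
             + (p[1] - cy) * math.sin(theta_dir), p) for p in points]
    s_i = min(proj)[1]
    e_i = max(proj)[1]
    return theta_normal, s_i, e_i
```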
  • Step S120: estimate the relative posture of the robot at time t compared with time t-1.
  • time t-1 can also be expressed as time (t-1).
  • The position and orientation of the robot 100 at time t are referred to as the posture of the robot 100 at time t.
  • Correspondingly, the posture at time t-1 is the posture of the robot 100 at the moment before time t.
  • The expression T that represents a posture includes three real variables, namely x, y, and θ.
  • The estimation algorithm for the relative posture of the robot at different times may include, but is not limited to, the posture estimation algorithm based on the robot's wheel encoder, the point cloud Iterative Closest Point (ICP) matching algorithm, and the like.
  • This embodiment of the present application does not limit the specific type of estimation algorithm for the relative pose.
  • Taking the wheel encoder as an example, the robot can calculate the relative posture at time t compared with the above time t-1.
  • Here, v is the linear velocity of the robot 100 while moving forward, as measured by the built-in wheel encoder;
  • ω is the angular velocity of the robot 100 while moving forward, as measured by the built-in wheel encoder;
  • Δt is the time interval between time t-1 and time t.
  • The aforementioned encoder is an angular displacement sensor that determines the change of the robot pose by detecting the number of radians the robot wheel rotates in a certain period of time. Encoders can mainly be divided into three types: photoelectric, contact, and electromagnetic; the type selected can be set flexibly according to the actual situation.
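Given v, ω, and Δt from the wheel encoder, the relative pose between the two moments can be sketched with a simple unicycle motion model; the patent names the encoder-based estimate but not this exact formula, so the form below is an illustrative assumption.

```python
import math

def relative_pose(v, w, dt):
    """Relative pose (dx, dy, dtheta) of the robot at time t with respect
    to time t-1, from linear velocity v, angular velocity w, and interval
    dt, under a unicycle model (a sketch, not the patent's formula)."""
    dtheta = w * dt
    if abs(w) < 1e-9:
        # straight-line motion: no rotation, displacement v * dt forward
        dx, dy = v * dt, 0.0
    else:
        # circular arc of radius v / w
        dx = (v / w) * math.sin(dtheta)
        dy = (v / w) * (1.0 - math.cos(dtheta))
    return dx, dy, dtheta
```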
  • Step S120 may be executed before step S110, after step S110, or simultaneously with step S110, depending on the embodiment.
  • Step S130: perform, according to the relative posture, line segment matching between the line segments corresponding to time t and the line segments obtained by linear feature extraction at time t-1, to obtain the straight line to which each line segment belongs.
  • In general, the laser emitting component 130 collects point cloud data at a rate of 10-50 frames per second; therefore, for the same spatial scene, the differences between the point cloud data collected in adjacent frames are small.
  • Since a map needs to be constructed, and for the purpose of reducing the amount of calculation, the line segments extracted at adjacent moments need to be matched against each other, so as to associate the line segments belonging to the same straight line and determine that straight line.
  • the process of line segment matching may refer to the following method.
  • Step S131: according to the relative posture, convert the coordinates of the two endpoints of each line segment corresponding to time t into the reference coordinate system, and obtain the expression of that line segment in the reference coordinate system.
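The endpoint conversion in this step is a rigid 2D (SE(2)) transform by the relative pose; a minimal sketch, with illustrative names, is:

```python
import math

def to_reference(point, rel_pose):
    """Express an endpoint observed at time t in the reference coordinate
    system of time t-1, given the relative pose (dx, dy, dtheta) of the
    robot at t with respect to t-1 (a sketch of the stated conversion)."""
    dx, dy, dtheta = rel_pose
    x, y = point
    c, s = math.cos(dtheta), math.sin(dtheta)
    # rotate by dtheta, then translate by (dx, dy)
    return (c * x - s * y + dx, s * x + c * y + dy)
```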
  • Step S132: match the expression of each line segment corresponding to time t in the reference coordinate system against the expressions of the plurality of line segments corresponding to time t-1.
  • the line segment corresponding to the current moment after the coordinate system transformation may be directly matched with the line segment corresponding to the previous moment.
  • the line segments obtained at the previous moment may also be filtered, so as to filter out the difference line segments in the line segments corresponding to the previous moment, and determine the remaining line segments as candidate line segments.
  • That is, for each line segment corresponding to the current moment, the line segments corresponding to the previous moment are screened.
  • The line segment corresponding to the current moment, after the coordinate system transformation, can then be matched directly against the candidate line segments corresponding to the previous moment, thereby reducing the amount of data in the matching operation.
  • the candidate line segments can be screened by referring to the following methods.
  • For each line segment, the difference line segments can be filtered out from all the line segments corresponding to the previous moment by comparing the predetermined attribute features against the pre-stored attribute feature thresholds, and the remaining line segments of the previous moment are determined as the candidate line segments corresponding to that line segment.
  • Specifically, the expression of a current line segment in the reference coordinate system corresponding to the previous time r can be compared against each line segment observed by the robot 100 at the previous time r, so as to determine the attribute features between the two line segments being compared.
  • the attribute features include, but are not limited to, at least one of the angle difference between the two line segments to be compared, the distance between the two line segments to be compared, and the degree of coincidence between the two line segments to be compared.
  • the attribute feature thresholds corresponding to the above-listed attribute features are an angle difference threshold, a distance threshold, and a coincidence threshold, respectively.
  • Angle difference: the difference between the angles of the two line segments being compared, both taken in the reference coordinate system. If this angle difference exceeds a preset angle difference threshold (for example, 5°), the probability that the two line segments belong to the same straight line is considered small, and the previous-time line segment is a difference line segment.
  • Distance: the distances from the two endpoints of one line segment, expressed in the reference coordinate system, to the other line segment are d 1 and d 2 , respectively. If either exceeds a preset distance threshold (for example, 10 cm), the probability that the two line segments belong to the same straight line is considered small, and the previous-time line segment is a difference line segment.
  • Degree of coincidence d 0 : project the endpoints of one line segment onto the line of the other to obtain the projected points; d 0 is the length of the overlapping region of the two segments. If d 0 is less than a preset coincidence threshold (for example, 10 cm), the probability that the two line segments belong to the same straight line is considered small, and the previous-time line segment is a difference line segment.
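The three attribute checks above can be combined into a candidate filter; the sketch below takes the geometry computations as passed-in functions, and all names and thresholds are illustrative, not from the patent.

```python
def filter_candidates(seg, prev_segs, angle_thresh, dist_thresh,
                      overlap_thresh, angle_diff, endpoint_dists,
                      overlap_len):
    """Keep only the previous-time segments that pass all three attribute
    checks (angle difference, endpoint distances d1/d2, overlap d0)."""
    candidates = []
    for prev in prev_segs:
        if angle_diff(seg, prev) > angle_thresh:
            continue                      # e.g. more than 5 degrees apart
        d1, d2 = endpoint_dists(seg, prev)
        if d1 > dist_thresh or d2 > dist_thresh:
            continue                      # e.g. an endpoint beyond 10 cm
        if overlap_len(seg, prev) < overlap_thresh:
            continue                      # overlapping region too short
        candidates.append(prev)
    return candidates
```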
  • Specifically, a line segment pair is formed between the line segment and each single line segment corresponding to time t-1, the loss value of each pair corresponding to the line segment is calculated according to the pre-built loss function, and the pair corresponding to the smallest loss value is determined to belong to the same straight line.
  • In the pre-built loss function, d 1 and d 2 are the distances from the two endpoints of the line segment to the other line segment in the pair, d 0 is the length of the overlapping region of the two line segments in the pair, the length of the line segment also enters the function, and λ is a preset control factor parameter (for example, it can be set to 0.5).
  • In this way, the loss value between a certain line segment observed at the current moment c and each line segment observed at the previous moment r (or, if line segment filtering is performed, each candidate line segment at the previous moment r) can be obtained. By comparing the loss values, the previous-moment line segment corresponding to the minimum loss value and the current line segment are determined to belong to the same straight line. The same line identifier (e.g., a line ID) can then be set for the two line segments to indicate that they belong to the same straight line and can be represented by the same expression.
  • If the number of final candidate line segments is 0, the line segment belongs to a new straight line.
  • A new line ID is then assigned to it, and this line ID can be inherited by subsequent line segments that belong to the same straight line.
  • In this way, the line segments at the previous moment and the line segments at the current moment that belong to the same straight line can be associated.
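Picking the minimum-loss pair can be sketched as below. The patent's exact loss expression is not reproduced in this text, so the form used here (endpoint distances plus a penalty for short overlap, weighted by the control factor λ) is purely an assumption for illustration.

```python
def best_match(segment_len, pairs, lam=0.5):
    """Among candidate pairs (candidate_id, d1, d2, d0) for one current
    segment, return the candidate with the smallest loss. The loss form
    is assumed, not the patent's: distances to the other segment plus a
    lam-weighted penalty for the non-overlapping part of the segment."""
    def loss(d1, d2, d0):
        return d1 + d2 + lam * (segment_len - d0)  # assumed form
    best = min(pairs, key=lambda p: loss(p[1], p[2], p[3]))
    return best[0]
```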
  • Step S140: jointly optimize the posture of the robot at each moment and the two-parameter representations of all the straight lines, to obtain the posture of the robot at each moment and the straight-line parameters of all the straight lines.
  • For each straight line, a residual vector consisting of two residuals can be constructed from the posture expressions and the straight-line expressions.
  • the posture at each moment and the straight line parameters of all straight lines can be obtained, so as to complete the trajectory estimation and map construction of the robot 100 .
  • The resulting nonlinear least squares problem can be solved iteratively using existing methods such as the Gauss-Newton algorithm, the Levenberg-Marquardt algorithm, or the steepest descent method.
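A minimal Gauss-Newton iteration of the kind named above can be sketched as follows, with a numeric Jacobian; this is an illustrative toy solver for small dense problems, whereas real SLAM back ends use sparse solvers (e.g., Ceres or g2o).

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c]
                              for c in range(r + 1, n))) / M[r][r]
    return x

def gauss_newton(residual, x0, iters=20, eps=1e-6):
    """Minimize ||r(x)||^2 by Gauss-Newton with a finite-difference
    Jacobian: repeatedly solve (J^T J) dx = -J^T r and update x."""
    x = list(x0)
    n = len(x)
    for _ in range(iters):
        r = residual(x)
        m = len(r)
        J = [[0.0] * n for _ in range(m)]
        for j in range(n):
            xp = list(x)
            xp[j] += eps
            rp = residual(xp)
            for i in range(m):
                J[i][j] = (rp[i] - r[i]) / eps
        JTJ = [[sum(J[i][a] * J[i][b] for i in range(m))
                for b in range(n)] for a in range(n)]
        JTr = [sum(J[i][a] * r[i] for i in range(m)) for a in range(n)]
        dx = solve(JTJ, [-g for g in JTr])
        x = [xi + di for xi, di in zip(x, dx)]
    return x
```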
  • an embodiment of the present application further provides a positioning and map construction apparatus 400.
  • The positioning and map construction apparatus 400 may also be installed in the robot.
  • the positioning and map construction apparatus 400 may include: an extraction module 410, an estimation module 420 , a matching module 430 and an optimization module 440 .
  • the extraction module 410 is configured to perform linear feature extraction on the acquired point cloud data at time t to obtain a plurality of corresponding line segments;
  • The estimation module 420 is configured to estimate the relative posture of the robot at time t compared with time t-1;
  • the matching module 430 is configured to perform, according to the relative posture, line segment matching between the plurality of line segments corresponding to time t and the plurality of line segments obtained by linear feature extraction at time t-1, to obtain the straight line to which each line segment belongs;
  • the optimization module 440 is configured to jointly optimize the posture of the robot at each moment and the two-parameter representations of all the straight lines, to obtain the posture of the robot at each moment and the straight-line parameters of all the straight lines.
  • In some embodiments, the estimation module 420 is configured to set an expression for the posture of the robot at time t and an expression for the posture of the robot at time t-1.
  • The matching module 430 is configured to: determine the robot coordinate system formed by the robot at time t-1 as the reference coordinate system; convert, according to the relative posture, the coordinates of the two endpoints of each line segment corresponding to time t into the reference coordinate system, to obtain the expression of that line segment in the reference coordinate system; and match the expression of each line segment corresponding to time t in the reference coordinate system against the expressions of the plurality of line segments corresponding to time t-1.
  • The device further includes a filtering module configured to, for each line segment corresponding to time t, filter out the difference line segments from all the line segments corresponding to time t-1 according to predetermined attribute features and pre-stored attribute feature thresholds, and determine the remaining line segments corresponding to time t-1 as the candidate line segments corresponding to that line segment.
  • the matching module 430 is configured to, for each line segment corresponding to the time t, match the expression of the line segment in the reference coordinate system with the expression of the candidate line segment corresponding to the line segment.
  • the device further includes a determination module configured to determine whether the number of candidate line segments for a segment is non-zero; when the determination module determines that it is zero, an adding module further included in the device is configured to determine that the segment belongs to a new straight line.
  • the matching module 430 is configured to, for each line segment corresponding to time t, pair that segment with each individual line segment corresponding to time t-1, compute the loss value of each resulting segment pair according to a pre-constructed loss function F(d1, d2, d0, l̂; α) (given as a formula image in the original), and determine the segment pair with the minimum loss value as belonging to the same straight line; where d1 and d2 are the distances from the two endpoints of the segment to the other segment in the pair, d0 is the length of the overlapping region of the two segments in the pair, l̂ is the length of the segment, α is a preset control-factor parameter, and d1, d2, d0 and l̂ are all determined from the expressions of the two segments in the pair.
  • the optimization module 440 is configured to: set expressions for the pose of the robot at each time; set expressions for all straight lines obtained after the segment matching; construct a residual vector from the pose expressions and the line expressions; minimize the sum of squares of the residual vector to obtain the optimal estimation formula relating the poses at each time and all the straight lines; and solve the optimal estimation formula iteratively to obtain the pose at each time and the line parameters of all straight lines.
  • the positioning and map construction device 400 provided by the embodiments of the present application has the same implementation principle and technical effects as the foregoing method embodiments; for anything not mentioned in the device embodiments, reference may be made to the corresponding content of the foregoing method embodiments.
  • an embodiment of the present application further provides a computer storage medium on which a computer program is stored; when the computer program is run by a computer, the steps of the above positioning and map construction method are executed.
  • an embodiment of the present application further provides a robot, which includes a processor and a memory connected to the processor; a computer program is stored in the memory, and when the computer program is executed by the processor, the robot is caused to perform the steps of the positioning and map construction method described above.
  • a schematic structural diagram of the robot is shown in FIG. 1.
  • in summary, the positioning and map construction method, device, robot and computer storage medium proposed in the embodiments of the present application perform line feature extraction on the point cloud data acquired at time t to obtain a corresponding plurality of line segments; estimate the relative pose of the robot at time t with respect to time t-1; match, according to the relative pose, the line segments corresponding to time t against the line segments obtained by line feature extraction at time t-1, to determine the straight line to which each segment belongs; and jointly optimize the pose of the robot at each time and the two-parameter representations of all straight lines, to obtain the pose of the robot at each time and the line parameters of all straight lines. Since each straight line is represented by only two parameters, the amount of computation is reduced as far as possible, which improves data-processing efficiency and shortens computation time.
  • each block in the flowcharts or block diagrams may represent a module, program segment, or portion of code that contains one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in a block may occur out of the order noted in the figures; for example, two consecutive blocks may in fact be executed substantially in parallel, or sometimes in the reverse order, depending on the functions involved.
  • each block of the block diagrams and/or flowcharts, and combinations of blocks therein, can be implemented by a dedicated hardware-based system that performs the specified functions or actions, or by a combination of dedicated hardware and computer instructions.
  • each functional module in the embodiments of the present application may be integrated with others to form an independent part, each module may exist separately, or two or more modules may be integrated to form an independent part.
  • if the functions are implemented in the form of software function modules and sold or used as independent products, they may be stored in a computer storage medium. Based on this understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a notebook computer, a server, a network device, or the like) to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.


Abstract

A positioning and map construction method, a device, a robot (100) and a computer storage medium. The method includes: performing line feature extraction on point cloud data acquired at time t to obtain a corresponding plurality of line segments (S110); estimating the relative pose of the robot (100) at time t with respect to time t-1 (S120); matching, according to the relative pose, the line segments corresponding to time t against the line segments obtained by line feature extraction at time t-1, to determine the straight line to which each segment belongs (S130); and jointly optimizing the pose of the robot (100) at each time and the two-parameter representations of all straight lines, to obtain the pose of the robot (100) at each time and the line parameters of all straight lines (S140). Since each straight line is represented by only two parameters, the amount of computation is reduced as far as possible, which improves computational efficiency and shortens computation time.

Description

Positioning and map construction method, device, robot and computer storage medium

Cross-reference to related application

This application claims priority to Chinese Patent Application No. 202011367335.9, filed with the China Patent Office on November 27, 2020 and entitled "Positioning and map construction method, device, robot and computer storage medium", the entire contents of which are incorporated herein by reference.

Technical field

The present application belongs to the field of autonomous navigation, and specifically relates to a positioning and map construction method, a device, a robot and a computer storage medium.

Background

During operation, a mobile robot needs to continuously acquire data in order to locate itself in real time, and it also needs to construct a map of the surrounding environment from the acquired data.

In general, a mobile robot acquires and determines the data used for real-time localization and mapping through a SLAM (Simultaneous Localization And Mapping) algorithm. The quality of the SLAM algorithm therefore directly determines the localization accuracy and stability of the mobile robot while it moves, so it is necessary to propose a stable, efficient and highly accurate SLAM algorithm.

The SLAM algorithms adopted in the related art suffer from a large computational load and low computational efficiency, which further affects the efficiency of real-time localization and mapping.

Summary

In view of this, an object of the present application is to provide a positioning and map construction method, a device, a robot and a computer storage medium that can improve at least one of the above problems.

The embodiments of the present application are implemented as follows:

An embodiment of the present application provides a positioning and map construction method, the method including: performing line feature extraction on point cloud data acquired at time t to obtain a corresponding plurality of line segments; estimating the relative pose of a robot at time t with respect to time t-1; matching, according to the relative pose, the line segments corresponding to time t against the line segments obtained by line feature extraction at time t-1, to determine the straight line to which each segment belongs; and jointly optimizing the pose of the robot at each time and the two-parameter representations of all straight lines, to obtain the pose of the robot at each time and the line parameters of all straight lines.
Optionally, estimating the relative pose of the robot at time t with respect to time t-1 includes: setting the pose of the robot at time t to T_t^w = (x_t, y_t, θ_t); setting the pose of the robot at time t-1 to T_{t-1}^w = (x_{t-1}, y_{t-1}, θ_{t-1}); setting the rotation matrix and the translation vector of the robot coordinate system at time t relative to the world coordinate system to R_{o_t}^w and t_{o_t}^w respectively; and computing, according to a pre-stored robot wheel-encoder pose estimation algorithm, the relative pose T_{t-1,t} of time t with respect to time t-1; where T_t^w, T_{t-1}^w and T_{t-1,t} satisfy the formulas x_t = x_{t-1} + vΔt·cos(θ_{t-1}), y_t = y_{t-1} + vΔt·sin(θ_{t-1}), θ_t = θ_{t-1} + ωΔt; (x_k, y_k) denotes the coordinates of the robot relative to the world coordinate system w at time t=k, θ_k denotes the heading of the robot relative to the world coordinate system w at time t=k, R_{o_k}^w is the rotation matrix, t_{o_k}^w is the translation vector, v is the linear velocity of the robot, ω is the angular velocity of the robot, and Δt is the time interval between time t and time t-1.

Optionally, matching the line segments corresponding to time t against the line segments obtained by line feature extraction at time t-1 includes: taking the robot coordinate system formed by the robot at time t-1 as the reference coordinate system; transforming, according to the relative pose, the coordinates of the two endpoints of each line segment corresponding to time t into the reference coordinate system, to obtain the expression of that segment in the reference coordinate system; and matching the expressions, in the reference coordinate system, of the segments corresponding to time t against the expressions of the segments corresponding to time t-1.

Optionally, before the matching of the expressions, the method further includes: for each line segment corresponding to time t, filtering out dissimilar segments from all segments corresponding to time t-1 according to predetermined attribute features and pre-stored attribute feature thresholds, and determining the remaining segments corresponding to time t-1 as the candidate segments for that segment; accordingly, the matching of the expressions includes: for each segment corresponding to time t, matching the expression of that segment in the reference coordinate system against the expressions of its candidate segments.

Optionally, before matching a segment against its candidate segments, the method further includes: determining that the number of candidate segments for that segment is not zero; otherwise, the method further includes: determining that the segment belongs to a new straight line.

Optionally, the matching of the expressions includes: for each line segment corresponding to time t, pairing that segment with each individual segment corresponding to time t-1, and computing the loss value of each resulting segment pair according to a pre-constructed loss function F(d1, d2, d0, l̂; α) (given as a formula image in the original); and determining the segment pair with the minimum loss value as belonging to the same straight line; where d1 and d2 are the distances from the two endpoints of the segment to the other segment in the pair, d0 is the length of the overlapping region of the two segments in the pair, l̂ is the length of the segment, α is a preset control-factor parameter, and d1, d2, d0 and l̂ are all determined from the expressions of the two segments in the pair.

Optionally, jointly optimizing the pose of the robot at each time and the two-parameter representations of all straight lines includes: setting expressions for the pose of the robot at each time; setting expressions for all straight lines obtained after the segment matching; constructing a residual vector from the pose expressions and the line expressions; minimizing the sum of squares of the residual vector to obtain the optimal estimation formula relating the poses at each time and all the straight lines; and solving the optimal estimation formula iteratively to obtain the pose at each time and the line parameters of all straight lines.

An embodiment of the present application provides a positioning and map construction device, the device including an extraction module, an estimation module, a matching module and an optimization module. The extraction module is configured to perform line feature extraction on point cloud data acquired at time t, to obtain a corresponding plurality of line segments; the estimation module is configured to estimate the relative pose of the robot at time t with respect to time t-1; the matching module is configured to match, according to the relative pose, the segments corresponding to time t against the segments obtained by line feature extraction at time t-1, to determine the straight line to which each segment belongs; the optimization module is configured to jointly optimize the pose of the robot at each time and the two-parameter representations of all straight lines, to obtain the pose of the robot at each time and the line parameters of all straight lines.
An embodiment of the present application further provides a robot, including a memory and a processor connected to each other; the memory is configured to store a program, and the processor calls the program stored in the memory to perform any one of the possible positioning and map construction methods described above.

An embodiment of the present application further provides a non-volatile computer-readable storage medium (hereinafter referred to as a computer storage medium) on which a computer program is stored; when run by a computer, the computer program performs any one of the possible positioning and map construction methods described above.

Other features and advantages of the present application will be set forth in the following description and will in part become apparent from the description or be understood by implementing the embodiments of the present application. The objects and other advantages of the present application can be realized and obtained by the structures particularly pointed out in the written description and the drawings.
Brief description of the drawings

In order to explain the technical solutions of the embodiments of the present application or of the prior art more clearly, the drawings needed in the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application; other drawings can be obtained from them by a person of ordinary skill in the art without creative effort. The above and other objects, features and advantages of the present application will become clearer from the drawings. The same reference numerals indicate the same parts throughout the drawings. The drawings are not deliberately drawn to scale; the emphasis is on showing the gist of the present application.

FIG. 1 is a schematic structural diagram of the robot provided by an embodiment of the present application.

FIG. 2 is a flowchart of the positioning and map construction method provided by an embodiment of the present application.

FIG. 3 is a schematic diagram of line feature extraction from a point cloud, provided by an embodiment of the present application.

FIG. 4 is a schematic diagram of line segment matching provided by an embodiment of the present application.

FIG. 5 is a structural block diagram of the positioning and map construction device provided by an embodiment of the present application.

Reference numerals: 100 - robot; 110 - processor; 120 - memory; 130 - laser emitting assembly; 400 - positioning and map construction device; 410 - extraction module; 420 - estimation module; 430 - matching module; 440 - optimization module.
Detailed description

The technical solutions in the embodiments of the present application will be described below with reference to the drawings.

It should be noted that similar reference numerals and letters denote similar items in the following drawings; therefore, once an item is defined in one drawing, it does not need to be further defined and explained in subsequent drawings. In the description of the present application, relational terms such as "first" and "second" are used only to distinguish one entity or operation from another and do not necessarily require or imply any such actual relationship or order between those entities or operations. Moreover, the term "comprise" or any variant thereof is intended to cover a non-exclusive inclusion, so that a process, method, article or device that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article or device. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the existence of other identical elements in the process, method, article or device that includes it.

Furthermore, the term "and/or" in the present application merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, "A and/or B" may mean: A alone, both A and B, or B alone.

In the related art, a SLAM algorithm has been proposed that uses graph optimization to optimize line features and the robot pose in real time. However, the applicant found through research that in that algorithm a line is represented by 6 parameters, so the description of a line is not a minimal parameterization; this causes a large subsequent computational load and low computational efficiency, which further affects the efficiency of real-time localization and mapping.

The defects of the real-time localization and mapping solutions in the related art (large computational load, low computational efficiency) were identified by the applicant after practice and careful study; therefore, the discovery process of these defects and the solutions proposed below by the embodiments of the present application should all be regarded as the applicant's contributions to the present application.

To improve at least one of the above problems, embodiments of the present application provide a positioning and map construction method, a device, a robot and a computer storage medium, which can reduce the amount of computation and thereby improve computational efficiency. The technique can be implemented with corresponding software, hardware, or a combination of the two. The embodiments of the present application are described in detail below.

First, a robot 100 for implementing the positioning and map construction method and device of the embodiments of the present application is described with reference to FIG. 1. The robot 100 may include a processor 110, a memory 120 and a laser emitting assembly 130. It should be noted that the components and structure of the robot 100 shown in FIG. 1 are only exemplary, not restrictive; the robot 100 may have other components and structures as required.

The processor 110, the memory 120, the laser emitting assembly 130 and other components that may be present in the robot 100 are electrically connected to each other, directly or indirectly, to realize data transmission or interaction; for example, they may be electrically connected through one or more communication buses or signal lines.

After power-on, the laser emitting assembly 130 can emit laser light at a fixed frequency and receive the laser light reflected by external objects, thereby generating point cloud data corresponding to each laser emission time. Optionally, the laser emitting assembly 130 may be a multi-line laser assembly for emitting multi-line laser light, or a single-line laser assembly for emitting single-line laser light. Optionally, since the data complexity of point cloud data obtained from a single-line laser is lower than that from a multi-line laser, a single-line laser may be used directly as the laser emitting assembly 130 for the purpose of reducing the amount of computation and the data complexity, so that the subsequently obtained point cloud data consists of two-dimensional discrete points.

The memory 120 is used to store a program, for example a program corresponding to the positioning and map construction method described below, or the positioning and map construction device described below. Optionally, when the positioning and map construction device is stored in the memory 120, it includes at least one software function module that can be stored in the memory 120 in the form of software or firmware. Optionally, the software function modules included in the positioning and map construction device may also be built into the operating system (OS) of the robot 100.

The processor 110 is used to execute executable modules stored in the memory 120, such as the software function modules or computer programs included in the positioning and map construction device. After receiving an execution instruction, the processor 110 can execute the computer program, for example to perform: performing line feature extraction on point cloud data acquired at time t to obtain a corresponding plurality of line segments; estimating the relative pose of the robot at time t with respect to time t-1; matching, according to the relative pose, the segments corresponding to time t against the segments obtained by line feature extraction at time t-1, to determine the straight line to which each segment belongs; and jointly optimizing the pose of the robot at each time and the two-parameter representations of all straight lines, to obtain the pose of the robot at each time and the line parameters of all straight lines.

Of course, the method disclosed in any embodiment of the present application can be applied to, or implemented by, the processor 110.

The positioning and map construction method provided by the present application is introduced below.
Referring to FIG. 2, an embodiment of the present application provides a positioning and map construction method applied to the robot 100 described above. The embodiment is introduced below with reference to steps S110 to S140 shown in FIG. 2, taking point cloud data consisting of two-dimensional discrete points as an example.

Step S110: perform line feature extraction on the point cloud data acquired at time t, to obtain a corresponding plurality of line segments.

After the robot 100 is powered on and started, it can emit laser light through the laser emitting assembly 130 to acquire point cloud data reflecting the environment at the current time, and at each subsequent time it can successively acquire the point cloud data of that time. The point cloud data includes a number of two-dimensional discrete points. The in-plane two-dimensional discrete points produced by a single-line laser can be expressed by the formula P = (x, y) ∈ R², where x is the abscissa and y the ordinate of the discrete point in the robot coordinate system o at the current time; the expression P contains 2 real variables, namely x and y.

In general, a first marker point and a second marker point are preset on the robot 100. At any time, the robot 100 can determine the robot coordinate system corresponding to the current time by acquiring the positions of the first marker point and the second marker point at that time. For example, the robot 100 takes the position of the first marker point as the coordinate origin, the straight line through the first and second marker points as the y-axis, and the straight line perpendicular to the y-axis as the x-axis, thereby determining the robot coordinate system corresponding to the current time.

It is worth pointing out that, since the robot 100 keeps moving, the robot coordinate systems corresponding to the various times may all differ. In addition, the robot coordinate system determined at the initial start-up time of the robot 100 is generally taken as the global world coordinate system w. The first marker point may be the geometric center of the robot 100 or a point at another position on the robot 100; this is not specifically limited in the embodiments of the present application.

After the point cloud data is obtained, as shown in FIG. 3, the robot 100 can extract a number of line segments L_i = {n_i, s_i, e_i}, i = 1...n, from the point cloud data through a pre-stored known line feature extraction algorithm (such as a least-squares algorithm), where n_i = (cos θ_i, sin θ_i) denotes the normal vector of the straight line on which segment i lies, θ denotes the angle between that normal vector and the x-axis of the robot coordinate system o at the current time, s_i and e_i denote the two endpoints of segment i, and n is a positive integer.
Step S120: estimate the relative pose of the robot at time t with respect to time t-1 (time t-1 may also be written as (t-1)).

The position of the robot 100 at time t is referred to as the pose of the robot 100 at time t. Suppose the pose of the robot 100 at time t is T_t^w = (x_t, y_t, θ_t), and the pose at the time preceding t (referred to as time t-1 in the embodiments of the present application) is T_{t-1}^w = (x_{t-1}, y_{t-1}, θ_{t-1}), where (x_k, y_k) denotes the coordinates of the robot relative to the world coordinate system w at time t=k, θ_k denotes the heading of the robot 100 relative to the world coordinate system w at time t=k, and the pose expression T contains 3 real variables, namely x, y and θ. In addition, suppose the rotation matrix and translation vector of the robot coordinate system o at time t=k relative to the world coordinate system w are R_{o_k}^w = [[cos θ_k, -sin θ_k], [sin θ_k, cos θ_k]] and t_{o_k}^w = (x_k, y_k)ᵀ respectively.

With the above parameters, the relative pose T_{t-1,t} of the robot 100 at time t (t=k) with respect to time t-1 can be estimated according to a pre-stored algorithm for estimating the relative pose of the robot between different times. Such algorithms include, but are not limited to, the pose estimation algorithm of a robot wheel encoder, the ICP (Iterative Closest Point) point-cloud matching algorithm, and so on; the embodiments of the present application do not restrict the specific type of algorithm used to estimate T_{t-1,t}.

Taking the pose estimation algorithm of a robot wheel encoder as an example, according to this algorithm the robot can compute the relative pose T_{t-1,t} of time t with respect to time t-1, where T_t^w, T_{t-1}^w and T_{t-1,t} satisfy the formulas x_t = x_{t-1} + vΔt·cos(θ_{t-1}), y_t = y_{t-1} + vΔt·sin(θ_{t-1}), θ_t = θ_{t-1} + ωΔt; v is the linear velocity of the robot 100 while moving forward, measured by its built-in wheel encoder, ω is the angular velocity of the robot 100 measured by the same encoder, and Δt is the time interval between time t and time t-1. The aforementioned encoder is an angular displacement sensor that determines the change in the robot's pose by detecting the angle through which the robot's wheels turn within a certain time; it can mainly be divided into photoelectric, contact and electromagnetic types, and the type of encoder can be chosen flexibly according to the actual situation.
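The wheel-encoder propagation and the relative pose T_{t-1,t} described above can be sketched in Python as follows; `propagate_pose` and `relative_pose` are illustrative names, and the unicycle model matches the v, ω, Δt relations given in the text.

```python
import math

def propagate_pose(pose, v, omega, dt):
    """One wheel-odometry step: pose = (x, y, heading) in the world frame."""
    x, y, th = pose
    return (x + v * dt * math.cos(th),
            y + v * dt * math.sin(th),
            th + omega * dt)

def relative_pose(prev, curr):
    """Express curr in the robot frame of prev: T_{t-1,t} = (dx, dy, dtheta)."""
    x0, y0, th0 = prev
    x1, y1, th1 = curr
    c, s = math.cos(th0), math.sin(th0)
    dxw, dyw = x1 - x0, y1 - y0
    # rotate the world-frame displacement into the previous robot frame
    return (c * dxw + s * dyw, -s * dxw + c * dyw, th1 - th0)
```

For a robot heading along +y (θ = π/2) driving forward 1 m, the relative pose in its previous frame is (1, 0, 0): pure forward motion.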
It is worth pointing out that there is no strict order of execution between step S110 and step S120. It can be understood that in some implementations step S120 may be executed before step S110, in other implementations after it, and in still others the two steps may be executed simultaneously.
Step S130: according to the relative pose, match the line segments corresponding to time t against the line segments obtained by line feature extraction at time t-1, to determine the straight line to which each segment belongs.

In general, the laser emitting assembly 130 collects point cloud data at a rate of 10-50 frames per second, so for the same spatial scene the point cloud data collected in several adjacent frames differ little. Under this premise, when a map needs to be built, segments extracted at adjacent times need to be matched against each other for the purpose of reducing the amount of computation, so that segments belonging to the same straight line are associated and determined as the same line. This also helps form a complete figure in subsequent map construction, improving its accuracy and completeness.

Optionally, the segment matching process may proceed as follows.

Step S131: according to the relative pose, transform the coordinates of the two endpoints of each segment corresponding to time t into the reference coordinate system, and obtain the expression of that segment in the reference coordinate system.

Suppose the current time is t = c, the previous time is t-1 = r, the relative pose of the current time with respect to the previous time is T_{r,c}, and the robot coordinate system formed by the robot 100 at the previous time (time t-1) is taken as the reference coordinate system.

Suppose a point in the robot coordinate system at the current time is p^c. Through the rotation matrix R^r_c and the translation vector t^r_c of the robot 100 at the current time relative to the previous time, p^c can be transformed into the reference coordinate system by the formula p^r = R^r_c · p^c + t^r_c, where p^r is the expression of the point p^c in the reference coordinate system.

In the same way, the two endpoints s^c and e^c of a segment observed at the current time can be transformed into the reference coordinate system to obtain s^r and e^r. Accordingly, the expression, after transformation into the reference coordinate system (the robot coordinate system of the previous time), of a segment L^c = {n^c, s^c, e^c} extracted by the robot 100 at the current time is L^r = {n^r, s^r, e^r}, where the transformation equation of the segment's normal vector is n^r = R^r_c · n^c.
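The endpoint transformation p^r = R·p^c + t described above can be sketched as below, assuming the relative pose is given as (Δx, Δy, Δθ) of time t expressed in the t-1 frame; function names are illustrative.

```python
import math

def to_reference(point, rel):
    """Map a point from the current robot frame into the t-1 reference frame,
    given the relative pose rel = (dx, dy, dtheta) of t w.r.t. t-1."""
    dx, dy, dth = rel
    c, s = math.cos(dth), math.sin(dth)
    x, y = point
    return (c * x - s * y + dx, s * x + c * y + dy)

def segment_to_reference(seg, rel):
    """seg = (theta, s, e) with the normal angle and endpoints in the current
    frame; returns the same segment expressed in the reference frame (the
    normal vector simply rotates by dtheta)."""
    theta, s_pt, e_pt = seg
    dth = rel[2]
    return (theta + dth, to_reference(s_pt, rel), to_reference(e_pt, rel))
```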
Step S132: match the expressions, in the reference coordinate system, of the segments corresponding to time t against the expressions of the segments corresponding to time t-1.

Optionally, the segments corresponding to the current time may, after the coordinate-system transformation, be matched directly against the segments corresponding to the previous time. Optionally, before segment matching, the segments obtained at the previous time may first be filtered, so as to screen out the dissimilar segments among them and determine the remaining segments as candidate segments. For each segment, a dissimilar segment is one that differs considerably from it and has only a small probability of lying on the same straight line. If, before matching, such filtering has been carried out for the current-time segment, then the transformed current-time segment only needs to be matched against the candidate segments of the previous time, which reduces the amount of data in the matching operation.

The candidate segments may be screened as follows. For each segment corresponding to the current time, dissimilar segments can be filtered out of all segments of the previous time by comparing predetermined attribute features against pre-stored attribute feature thresholds, and the remaining previous-time segments are determined as the candidate segments for that segment.

Optionally, for a segment L^c observed by the robot 100 at the current time c, with expression L^r in the reference coordinate system of the previous time r, L^r may be compared with every segment observed by the robot 100 at the previous time r, so as to determine the attribute features between the two segments to be compared. The attribute features include, but are not limited to, at least one of: the angle difference between the two segments, the distance between the two segments, and the degree of overlap between the two segments. Correspondingly, the attribute feature thresholds for the above features are an angle-difference threshold, a distance threshold and an overlap threshold.

Optionally, the angle difference between the two segments to be compared is the difference between the angle of L^r in the reference coordinate system and the angle, in the reference coordinate system, of the previous-time segment. If this angle difference exceeds a preset angle-difference threshold (for example 5°), the probability that the two segments belong to the same straight line is considered small, and the previous-time segment is a dissimilar segment of L^r.

Optionally, the distance between the two segments to be compared is given by the distances d1 and d2 from the two endpoints of L^c, expressed in the reference coordinate system, to the previous-time segment; for a line with normal angle θ and distance ρ to the origin, the distance from a point (x, y) is |cos(θ)x + sin(θ)y − ρ|. If either of d1 and d2 exceeds a preset distance threshold (for example 10 cm), the probability that the two segments belong to the same straight line is considered small, and the previous-time segment is a dissimilar segment of L^r.

Optionally, for the overlap d0 between the two segments to be compared, the endpoints s^r and e^r are projected onto the line of the previous-time segment to obtain projection points; d0 is then the length of the overlapping region of the two segments. If d0 is smaller than the overlap threshold (for example 10 cm), the probability that the two segments belong to the same straight line is considered small, and the previous-time segment is a dissimilar segment of L^r.

The intuitive meanings of d1, d2 and d0 are shown in FIG. 4.
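A minimal Python sketch of the candidate filtering above: it checks the angle-difference and endpoint-distance thresholds (the overlap test is omitted here for brevity), with segments given as (θ, ρ, s, e) in the shared reference frame. The function names are our own; the default thresholds (5°, 10 cm) follow the examples in the text.

```python
import math

def point_line_distance(p, theta, rho):
    """Distance from point p to the line cos(theta)*x + sin(theta)*y = rho."""
    return abs(math.cos(theta) * p[0] + math.sin(theta) * p[1] - rho)

def is_candidate(seg_c, seg_r, max_angle=math.radians(5), max_dist=0.10):
    """seg = (theta, rho, s, e) in the shared reference frame.  Keep seg_r
    as a candidate for seg_c unless some attribute difference exceeds its
    threshold (overlap check omitted in this sketch)."""
    dth = abs(seg_c[0] - seg_r[0])
    dth = min(dth, 2 * math.pi - dth)      # wrap the angle difference
    if dth > max_angle:
        return False
    d1 = point_line_distance(seg_c[2], seg_r[0], seg_r[1])
    d2 = point_line_distance(seg_c[3], seg_r[0], seg_r[1])
    return d1 <= max_dist and d2 <= max_dist
```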
The segment matching process is introduced below.

For each segment corresponding to time t, that segment and each individual segment corresponding to time t-1 form a segment pair; the loss value of each pair is computed according to a pre-constructed loss function, and the pair with the minimum loss value is determined as belonging to the same straight line.

Suppose a segment observed at the current time c is L^c, with expression L^r in the reference coordinate system, and a segment observed at the previous time r is L'; for the present example, L^r and L' are combined into a segment pair. The pre-constructed loss function is F(d1, d2, d0, l̂; α) (given as a formula image in the original), where d1 and d2 are the distances from the two endpoints of L^c, expressed in the reference coordinate system, to the other segment L' of the pair, d0 is the length of the overlapping region of the two segments in the pair, l̂ is the length of the segment, and α is a preset control-factor parameter (which may, for example, be set to 0.5).

By repeating the above loss computation, the loss values between the segment L^c observed at the current time and each segment observed at the previous time r (or each candidate segment, if segment filtering was carried out) can be obtained. Then, by comparing the loss values, the previous-time segment corresponding to the minimum loss value and L^c are determined as belonging to the same straight line. At this point the two segments can be given the same line identifier (for example a line ID), indicating that they belong to the same line and can be represented by one and the same expression.

In addition, if, for a segment observed at the current time, the number of candidate segments finally obtained when filtering the segments observed at the previous time r is 0, this indicates that the segment is part of a newly appearing straight line; in this case a brand-new line identifier can be assigned to it. When a segment belonging to the same straight line is produced at a later time, this brand-new line identifier is inherited by that later segment.
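The loss function itself appears only as a formula image in the source, so the concrete form below is an assumption: it penalizes the endpoint distances d1 and d2 and the non-overlapping length (l̂ − d0), weighted by the control factor α. Only the arguments and the minimum-loss selection rule are taken from the text.

```python
def match_loss(d1, d2, d0, seg_len, alpha=0.5):
    """ASSUMED form of the loss: the patent gives the formula only as an
    image.  Penalize the endpoint-to-segment distances d1, d2 and the part
    of the segment not covered by the overlap (seg_len - d0)."""
    return d1 + d2 + alpha * (seg_len - d0)

def best_match(candidates):
    """candidates: list of (segment_id, d1, d2, d0, seg_len); return the
    id of the candidate with the smallest loss value."""
    return min(candidates, key=lambda c: match_loss(*c[1:]))[0]
```

A perfectly aligned, fully overlapping pair scores 0, so it wins against any partial match.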
Once, at every time, the segments observed at the current time have been iteratively matched against the segments observed at the previous time in the above way, the segments from the first time up to the current time that belong to the same straight line can be associated.

Furthermore, a two-dimensional line has only 2 degrees of freedom in the plane, i.e. y = kx + b; given the orientation of the line and its distance to the origin, the line is uniquely determined, so 2 parameters suffice to determine a line uniquely.

In the embodiments of the present application, in order to reduce the amount of computation, each obtained line is represented in a two-parameter form l = (θ, ρ), where θ is the angle corresponding to the normal vector of the line and ρ is the distance from the origin of the coordinate system to the line; a point on the line satisfies the equation cos(θ)x + sin(θ)y = ρ.
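Converting a segment's endpoints into the two-parameter form (θ, ρ) with cos(θ)x + sin(θ)y = ρ can be sketched as follows; the sign normalization making ρ a non-negative distance is our own convention.

```python
import math

def line_params(s, e):
    """Two-parameter line (theta, rho) through endpoints s and e, satisfying
    cos(theta)*x + sin(theta)*y = rho with rho >= 0."""
    dx, dy = e[0] - s[0], e[1] - s[1]
    theta = math.atan2(dy, dx) + math.pi / 2   # normal = direction + 90 deg
    rho = math.cos(theta) * s[0] + math.sin(theta) * s[1]
    if rho < 0:                                 # keep rho a distance
        rho, theta = -rho, theta - math.pi
    return theta, rho
```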
Step S140: jointly optimize the pose of the robot at each time and the two-parameter representations of all straight lines, to obtain the pose of the robot at each time and the line parameters of all straight lines.

Suppose the robot trajectory to be estimated at the current time includes the poses of n times, T_1, ..., T_n, and suppose there are m lines obtained after segment matching at the current time, denoted l_1, ..., l_m. Suppose that at time t=k the relative pose of the robot coordinate system with respect to the world coordinate system is T_k, and that the i-th map line l_i = (θ_i, ρ_i) is observed by the robot 100 at time k; l_i then has a corresponding observation in the robot coordinate system O.

In the ideal case, a point p^o on the line, transformed from the robot coordinate system into the world coordinate system to give the point p^w = (x^w, y^w), should satisfy the line equation cos(θ_i)x^w + sin(θ_i)y^w = ρ_i. However, since the point cloud measurements are noisy and the robot pose carries estimation error, a residual constraint can be constructed between the observed coordinates and the map line: r = cos(θ_i)x^w + sin(θ_i)y^w − ρ_i.

Considering that each segment has two endpoints s and e, one segment therefore contributes a residual vector composed of two residuals, r = (r_s, r_e). By minimizing the sum of squares of the residual vectors over all observations, the optimal estimation formula between the poses at each time and all the lines is obtained: the poses T_1, ..., T_n and line parameters l_1, ..., l_m that minimize the total squared residual. By solving this optimal estimation formula iteratively, the pose at each time and the line parameters of all lines are obtained, thereby completing the trajectory estimation and map construction of the robot 100.

This nonlinear least-squares problem can be solved iteratively by existing methods, such as the Gauss-Newton algorithm, the Levenberg-Marquardt algorithm or the steepest-descent method; the embodiments of the present application do not limit the specific solution method.
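The per-endpoint residual r = cos(θ_i)x^w + sin(θ_i)y^w − ρ_i can be sketched as below; in a full system these residual vectors over all poses and lines would be stacked and handed to a Gauss-Newton or Levenberg-Marquardt solver, which is omitted here. Function names are illustrative.

```python
import math

def endpoint_residual(pose, point, line):
    """Residual of one observed endpoint against a map line.
    pose = (x, y, th): robot pose in the world frame at observation time;
    point = endpoint in the robot frame; line = (theta, rho) in the world
    frame."""
    x, y, th = pose
    c, s = math.cos(th), math.sin(th)
    # transform the observed endpoint into the world frame
    xw = c * point[0] - s * point[1] + x
    yw = s * point[0] + c * point[1] + y
    th_l, rho = line
    return math.cos(th_l) * xw + math.sin(th_l) * yw - rho

def residual_vector(pose, segment, line):
    """One segment contributes two residuals, one per endpoint (s, e)."""
    s_pt, e_pt = segment
    return [endpoint_residual(pose, s_pt, line),
            endpoint_residual(pose, e_pt, line)]
```

At the true pose the residuals of a segment lying exactly on its map line vanish; a pose error shifts them by the corresponding point-to-line distance.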
In summary, the positioning and map construction method provided by the embodiments of the present application has at least the following advantages:

(1) Since a two-parameter representation is used for lines, the amount of computation needed to determine the data used for subsequent localization and map construction is reduced as far as possible, improving the efficiency of localization and mapping.

(2) By reducing the number of segments that need to be matched during segment matching, the amount of computation is reduced and computational efficiency can be further improved.
As shown in FIG. 5, an embodiment of the present application further provides a positioning and map construction device 400, which may also be provided at the robot; the device 400 may include an extraction module 410, an estimation module 420, a matching module 430 and an optimization module 440.
The above are only specific embodiments of the present application, but the protection scope of the present application is not limited thereto; any change or substitution that can readily be conceived by a person skilled in the art within the technical scope disclosed in the present application shall be covered by the protection scope of the present application.
Industrial applicability

In the technical solution proposed by the present application, since a two-parameter representation is used for lines, the amount of computation needed to determine the data used for subsequent localization and map construction can be reduced as far as possible, thereby improving the efficiency of localization and map construction and, in turn, the efficiency of subsequent real-time localization and mapping.

Claims (16)

  1. A positioning and map construction method, characterized in that the method comprises:
    performing line feature extraction on point cloud data acquired at time t, to obtain a corresponding plurality of line segments;
    estimating a relative pose of a robot at time t with respect to time t-1;
    matching, according to the relative pose, the plurality of line segments corresponding to time t against a plurality of line segments obtained by line feature extraction at time t-1, to determine the straight line to which each line segment belongs;
    jointly optimizing the pose of the robot at each time and two-parameter representations of all straight lines, to obtain the pose of the robot at each time and line parameters of all the straight lines.
  2. The method according to claim 1, characterized in that estimating the relative pose of the robot at time t with respect to time t-1 comprises:
    setting the pose of the robot at time t to T_t^w = (x_t, y_t, θ_t);
    setting the pose of the robot at time t-1 to T_{t-1}^w = (x_{t-1}, y_{t-1}, θ_{t-1});
    setting the rotation matrix and the translation vector of the robot coordinate system at time t relative to the world coordinate system to R_{o_t}^w and t_{o_t}^w respectively;
    computing, according to a pre-stored robot wheel-encoder pose estimation algorithm, the relative pose T_{t-1,t} of time t with respect to time t-1;
    where T_t^w, T_{t-1}^w and T_{t-1,t} satisfy the formulas x_t = x_{t-1} + vΔt·cos(θ_{t-1}), y_t = y_{t-1} + vΔt·sin(θ_{t-1}), θ_t = θ_{t-1} + ωΔt; (x_k, y_k) denotes the coordinates of the robot relative to the world coordinate system w at time t=k, θ_k denotes the heading of the robot relative to the world coordinate system w at time t=k, R_{o_k}^w is the rotation matrix, t_{o_k}^w is the translation vector, v is the linear velocity of the robot, ω is the angular velocity of the robot, and Δt is the time interval between time t and time t-1.
  3. The method according to claim 1 or 2, characterized in that matching the plurality of line segments corresponding to time t against the plurality of line segments obtained by line feature extraction at time t-1 comprises:
    taking the robot coordinate system formed by the robot at time t-1 as the reference coordinate system;
    transforming, according to the relative pose, the coordinates of the two endpoints of each line segment corresponding to time t into the reference coordinate system, to obtain the expression of that line segment in the reference coordinate system;
    matching the expressions, in the reference coordinate system, of the line segments corresponding to time t against the expressions of the plurality of line segments corresponding to time t-1.
  4. The method according to claim 3, characterized in that, before the matching of the expressions, the method further comprises:
    for each line segment corresponding to time t, filtering out dissimilar line segments from all line segments corresponding to time t-1 according to predetermined attribute features and pre-stored attribute feature thresholds, and determining the remaining line segments corresponding to time t-1 as candidate line segments for that segment;
    accordingly, the matching of the expressions comprises:
    for each line segment corresponding to time t, matching the expression of that segment in the reference coordinate system against the expressions of its candidate line segments.
  5. The method according to claim 4, characterized in that, before matching a segment against its candidate line segments, the method further comprises:
    determining that the number of candidate line segments for that segment is not zero;
    otherwise, the method further comprises:
    determining that the segment belongs to a new straight line.
  6. The method according to claim 3 or 4, characterized in that the matching of the expressions comprises:
    for each line segment corresponding to time t, pairing that segment with each individual line segment corresponding to time t-1, and computing the loss value of each resulting segment pair according to a pre-constructed loss function F(d1, d2, d0, l̂; α) (given as a formula image in the original);
    determining the segment pair with the minimum loss value as belonging to the same straight line;
    where d1 and d2 are the distances from the two endpoints of the segment to the other segment in the pair, d0 is the length of the overlapping region of the two segments in the pair, l̂ is the length of the segment, α is a preset control-factor parameter, and d1, d2, d0 and l̂ are all determined from the expressions of the two segments in the pair.
  7. The method according to any one of claims 1 to 6, characterized in that jointly optimizing the pose of the robot at each time and the two-parameter representations of all straight lines comprises:
    setting expressions for the pose of the robot at each time;
    setting expressions for all straight lines obtained after the segment matching;
    constructing a residual vector from the pose expressions and the line expressions;
    minimizing the sum of squares of the residual vector to obtain an optimal estimation formula relating the poses at each time and all the straight lines;
    solving the optimal estimation formula iteratively to obtain the pose at each time and the line parameters of all straight lines.
  8. A positioning and map construction device, characterized in that the device comprises:
    an extraction module configured to perform line feature extraction on point cloud data acquired at time t, to obtain a corresponding plurality of line segments;
    an estimation module configured to estimate a relative pose of a robot at time t with respect to time t-1;
    a matching module configured to match, according to the relative pose, the plurality of line segments corresponding to time t against a plurality of line segments obtained by line feature extraction at time t-1, to determine the straight line to which each line segment belongs;
    an optimization module configured to jointly optimize the pose of the robot at each time and two-parameter representations of all straight lines, to obtain the pose of the robot at each time and line parameters of all the straight lines.
  9. The device according to claim 8, characterized in that the estimation module is configured to:
    set the pose of the robot at time t to T_t^w = (x_t, y_t, θ_t);
    set the pose of the robot at time t-1 to T_{t-1}^w = (x_{t-1}, y_{t-1}, θ_{t-1});
    set the rotation matrix and the translation vector of the robot coordinate system at time t relative to the world coordinate system to R_{o_t}^w and t_{o_t}^w respectively;
    compute, according to a pre-stored robot wheel-encoder pose estimation algorithm, the relative pose T_{t-1,t} of time t with respect to time t-1;
    where T_t^w, T_{t-1}^w and T_{t-1,t} satisfy the formulas x_t = x_{t-1} + vΔt·cos(θ_{t-1}), y_t = y_{t-1} + vΔt·sin(θ_{t-1}), θ_t = θ_{t-1} + ωΔt; (x_k, y_k) denotes the coordinates of the robot relative to the world coordinate system w at time t=k, θ_k denotes the heading of the robot relative to the world coordinate system w at time t=k, R_{o_k}^w is the rotation matrix, t_{o_k}^w is the translation vector, v is the linear velocity of the robot, ω is the angular velocity of the robot, and Δt is the time interval between time t and time t-1.
  10. The device according to claim 8 or 9, characterized in that the matching module is configured to:
    take the robot coordinate system formed by the robot at time t-1 as the reference coordinate system;
    transform, according to the relative pose, the coordinates of the two endpoints of each line segment corresponding to time t into the reference coordinate system, to obtain the expression of that line segment in the reference coordinate system;
    match the expressions, in the reference coordinate system, of the line segments corresponding to time t against the expressions of the plurality of line segments corresponding to time t-1.
  11. The device according to claim 10, characterized in that the device further comprises a filtering module configured to:
    for each line segment corresponding to time t, filter out dissimilar line segments from all line segments corresponding to time t-1 according to predetermined attribute features and pre-stored attribute feature thresholds, and determine the remaining line segments corresponding to time t-1 as candidate line segments for that segment;
    accordingly, the matching module is configured to: for each line segment corresponding to time t, match the expression of that segment in the reference coordinate system against the expressions of its candidate line segments.
  12. The device according to claim 11, characterized in that the device further comprises a determination module configured to:
    determine whether the number of candidate line segments for a segment is non-zero; when the determination module determines that it is zero, the device further comprises an adding module configured to determine that the segment belongs to a new straight line.
  13. The device according to claim 10 or 11, characterized in that the matching module is configured to:
    for each line segment corresponding to time t, pair that segment with each individual line segment corresponding to time t-1, and compute the loss value of each resulting segment pair according to a pre-constructed loss function F(d1, d2, d0, l̂; α) (given as a formula image in the original);
    determine the segment pair with the minimum loss value as belonging to the same straight line;
    where d1 and d2 are the distances from the two endpoints of the segment to the other segment in the pair, d0 is the length of the overlapping region of the two segments in the pair, l̂ is the length of the segment, α is a preset control-factor parameter, and d1, d2, d0 and l̂ are all determined from the expressions of the two segments in the pair.
  14. The device according to any one of claims 8 to 13, characterized in that the optimization module is configured to:
    set expressions for the pose of the robot at each time;
    set expressions for all straight lines obtained after the segment matching;
    construct a residual vector from the pose expressions and the line expressions;
    minimize the sum of squares of the residual vector to obtain an optimal estimation formula relating the poses at each time and all the straight lines;
    solve the optimal estimation formula iteratively to obtain the pose at each time and the line parameters of all straight lines.
  15. A robot, characterized by comprising: a memory and a processor, the memory and the processor being connected;
    the memory being configured to store a program;
    the processor calling the program stored in the memory to perform the method according to any one of claims 1-7.
  16. A computer storage medium, characterized in that a computer program is stored thereon, and the computer program, when run by a computer, performs the method according to any one of claims 1-7.
PCT/CN2021/100300 2020-11-27 2021-06-16 Positioning and map construction method, device, robot and computer storage medium WO2022110767A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2023552378A JP2023549298A (ja) 2020-11-27 2021-06-16 Self-position estimation and map creation method, robot and computer storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011367335.9 2020-11-27
CN202011367335.9A CN112577500A (zh) 2020-11-27 2021-03-30 Positioning and map construction method, device, robot and computer storage medium

Publications (1)

Publication Number Publication Date
WO2022110767A1 true WO2022110767A1 (zh) 2022-06-02

Family

ID=75126501

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/100300 WO2022110767A1 (zh) 2020-11-27 2021-06-16 定位与地图构建方法、装置、机器人及计算机存储介质

Country Status (3)

Country Link
JP (1) JP2023549298A (zh)
CN (1) CN112577500A (zh)
WO (1) WO2022110767A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112577500A (zh) * 2020-11-27 2021-03-30 北京迈格威科技有限公司 定位与地图构建方法、装置、机器人及计算机存储介质
CN113984071B (zh) * 2021-09-29 2023-10-13 云鲸智能(深圳)有限公司 地图匹配方法、装置、机器人和计算机可读存储介质

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110052043A1 (en) * 2009-08-25 2011-03-03 Samsung Electronics Co., Ltd. Method of mobile platform detecting and tracking dynamic objects and computer-readable medium thereof
CN107655473A (zh) * 2017-09-20 2018-02-02 南京航空航天大学 SLAM-based relative autonomous navigation system for spacecraft
CN110310331A (zh) * 2019-06-18 2019-10-08 哈尔滨工程大学 Pose estimation method combining straight-line features with point-cloud features
CN110866927A (zh) * 2019-11-21 2020-03-06 哈尔滨工业大学 Robot localization and mapping method based on an EKF-SLAM algorithm combining foot-of-perpendicular point and line features
CN111590595A (zh) * 2020-06-30 2020-08-28 深圳市银星智能科技股份有限公司 Localization method and apparatus, mobile robot, and storage medium
CN112577500A (zh) * 2020-11-27 2021-03-30 北京迈格威科技有限公司 Positioning and map construction method and device, robot, and computer storage medium

Also Published As

Publication number Publication date
CN112577500A (zh) 2021-03-30
JP2023549298A (ja) 2023-11-22

Similar Documents

Publication Publication Date Title
CN110261870B (zh) Synchronous localization and mapping method for visual-inertial-laser fusion
CN108764048B (zh) Face keypoint detection method and apparatus
WO2022110767A1 (zh) Positioning and map construction method and device, robot, and computer storage medium
CN112734852A (zh) Robot mapping method, apparatus, and computing device
Bosse ATLAS: a framework for large scale automated mapping and localization
WO2019196476A1 (zh) Map generation based on a laser sensor
US20130094706A1 Information processing apparatus and processing method thereof
CN112541423A (zh) Simultaneous localization and mapping method and system
CN114565668A (zh) Real-time localization and mapping method and apparatus
CN114241050B (zh) Camera pose optimization method based on the Manhattan-world assumption and factor graphs
CN113420590B (zh) Robot localization method, apparatus, device, and medium for weakly textured environments
Heinemann et al. A combined monte-carlo localization and tracking algorithm for robocup
He et al. Observation-driven Bayesian filtering for global location estimation in the field area
CN112652020A (zh) Visual SLAM method based on the AdaLAM algorithm
CN116577801A (zh) LiDAR- and IMU-based localization and mapping method and system
Choi et al. Metric SLAM in home environment with visual objects and sonar features
KR101054520B1 (ko) Method for recognizing the position and orientation of an indoor mobile robot
WO2023072269A1 (zh) Object tracking
CN115239776B (zh) Point cloud registration method, apparatus, device, and medium
Cupec et al. Global localization based on 3d planar surface segments
CN110399892B (zh) Environmental feature extraction method and apparatus
Han et al. Uav vision: Feature based accurate ground target localization through propagated initializations and interframe homographies
Chella et al. Automatic place detection and localization in autonomous robotics
WO2024108753A1 (zh) Efficient and robust LiDAR-based global localization method for mobile robots
Zeng et al. Entropy-based Keyframe Established and Accelerated Fast LiDAR Odometry and Mapping

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21896276

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023552378

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 27/09/2023)

122 Ep: pct application non-entry in european phase

Ref document number: 21896276

Country of ref document: EP

Kind code of ref document: A1