CN114627182B - Positioning method and device of robot, electronic equipment and storage medium - Google Patents


Info

Publication number
CN114627182B
CN114627182B (application number CN202210096019.5A)
Authority
CN
China
Prior art keywords
global
local
map
feature points
feature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210096019.5A
Other languages
Chinese (zh)
Other versions
CN114627182A (en)
Inventor
董海青
陈波
刘冬
奉飞飞
唐剑
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Midea Group Co Ltd
Midea Group Shanghai Co Ltd
Original Assignee
Midea Group Co Ltd
Midea Group Shanghai Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Midea Group Co Ltd, Midea Group Shanghai Co Ltd filed Critical Midea Group Co Ltd
Priority to CN202210096019.5A priority Critical patent/CN114627182B/en
Publication of CN114627182A publication Critical patent/CN114627182A/en
Application granted granted Critical
Publication of CN114627182B publication Critical patent/CN114627182B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/22: Matching criteria, e.g. proximity measures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/23: Clustering techniques
    • G06F 18/232: Non-hierarchical techniques
    • G06F 18/2321: Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F 18/23213: Non-hierarchical techniques using statistics or function optimisation with fixed number of clusters, e.g. K-means clustering

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Probability & Statistics with Applications (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The application discloses a positioning method and apparatus for a robot, an electronic device, and a storage medium. The positioning method comprises: constructing a local map of the area where the robot is located and acquiring a plurality of local feature points in the local map; constructing the local feature points into at least one local feature graph, with the local feature points serving as vertices of the graph; matching a global feature graph with the local feature graph to determine matched global and local feature points, the global feature graph being constructed from a plurality of global feature points on a global map; and determining the pose of the robot on the global map based on the matched global and local feature points. The application addresses the problem that robot positioning in the prior art is time-consuming.

Description

Positioning method and device of robot, electronic equipment and storage medium
Technical Field
The present invention relates to the field of positioning technologies, and in particular, to a method and an apparatus for positioning a robot, an electronic device, and a storage medium.
Background
Robot relocalization is currently a hot problem in the industry. Traditional methods either search the global map by brute-force matching or use additional sensors to provide prior information that accelerates initialization. Global search over a probability map is prohibitively slow in large scenes, which is unacceptable in some application scenarios, while adding extra sensors to obtain prior information increases cost and debugging difficulty, which is likewise unacceptable to some extent.
Disclosure of Invention
The application provides a positioning method and apparatus for a robot, an electronic device, and a storage medium, aiming to solve the problem that robot positioning in the prior art is time-consuming.
To solve the above technical problem, the present application provides a positioning method of a robot, including: constructing a local map of the area where the robot is located, and acquiring a plurality of local feature points in the local map; constructing the local feature points into at least one local feature graph, the local feature points serving as vertices of the local feature graph; matching a global feature graph with the local feature graph to determine matched global feature points and local feature points, the global feature graph being constructed from a plurality of global feature points on a global map; and determining the pose of the robot on the global map based on the matched global feature points and local feature points.
In an embodiment, the method further comprises: acquiring a global map and a plurality of global feature points in the global map; and constructing the plurality of global feature points into at least one global feature graph, wherein the global feature points serve as vertexes of the global feature graph.
In an embodiment, acquiring the plurality of global feature points in the global map and acquiring the plurality of local feature points in the local map each include: acquiring a rasterized map and converting it into a directed distance field map; and differentiating the directed distance field map and determining feature points based on the derivative result.
In an embodiment, differentiating the directed distance field map and determining feature points based on the derivative result includes: computing second-order derivatives of the directed distance field map to obtain its second-order matrix; and determining feature points according to the second-order matrix.
In an embodiment, constructing the plurality of global feature points into at least one global feature graph and constructing the plurality of local feature points into at least one local feature graph each include: constructing a plurality of feature points into at least one feature graph, where the feature graph is a polygon in which every interior angle has a different degree measure.
In an embodiment, the matching the global feature pattern and the local feature pattern to determine a matched global feature point and a local feature point includes: determining matched global feature patterns and local feature patterns, and vertices corresponding to angles with the same degree in the matched global feature patterns and local feature patterns; and taking the determined vertexes as matched global characteristic points and local characteristic points.
In an embodiment, the determining the pose of the robot on the global map based on the matched global feature point and the local feature point includes: calculating the conversion relation between the global map and the local map according to the matched global feature points and local feature points; and determining the pose of the robot on the global map according to the conversion relation.
In an embodiment, the positioning method further comprises: removing abnormal points in the matched global characteristic points and local characteristic points; and determining the pose of the robot on the global map based on the global characteristic points and the local characteristic points which are matched after the abnormal points are removed.
In order to solve the technical problems, the application provides a positioning device of a robot, which comprises a map acquisition module, a characteristic graph module, a characteristic matching module and a pose positioning module; the map acquisition module is used for constructing a local map for the area where the robot is located and acquiring a plurality of local feature points in the local map; the feature pattern module is used for constructing the local feature points into at least one local feature pattern, and the local feature points are used as vertexes of the local feature pattern; the feature matching module is used for matching the global feature graph with the local feature graph to determine matched global feature points and local feature points, and the global feature graph is constructed by a plurality of global feature points on a global map; the pose positioning module is used for determining the pose of the robot on the global map based on the matched global feature points and the local feature points.
To solve the above technical problem, the present application proposes an electronic device comprising a processor and a memory, wherein the memory is configured to store a computer program and the processor is configured to execute the computer program to implement the above method.
To solve the above technical problem, the present application also proposes a computer storage medium for storing a computer program, the computer program being executable to implement the above method.
To solve the above technical problem, the present application also proposes a computer program product comprising a computer program which, when executed by a processor, implements the above method.
According to the application, a global feature graph formed by a plurality of feature points on the global map is matched with a local feature graph formed by a plurality of feature points on the local map, and the pose of the robot in the global map is determined from the graph matching result. Matching graphs instead of individual points improves feature point matching efficiency, reduces the matching time in the positioning process, and thus shortens the overall robot positioning time.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute a limitation on the application. In the drawings:
FIG. 1 is a flow chart of an embodiment of a positioning method of a robot according to the present application;
FIG. 2 is a schematic flow chart of an embodiment of a positioning method of a robot according to the present application;
FIG. 3 is a schematic diagram of a map after SDF in the positioning method of the robot of the present application;
FIG. 4 is a schematic view showing the distribution of feature points in a map in the positioning method of the robot according to the present application;
FIG. 5 is a schematic diagram of matching of feature points in the positioning method of the robot of the present application;
FIG. 6 is a schematic diagram of an embodiment of an electronic device of the present application;
FIG. 7 is a schematic diagram of a computer storage medium according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application is made clearly and fully with reference to the accompanying drawings; evidently, the embodiments described are only some, not all, embodiments of the application. All other embodiments obtained by those skilled in the art based on these embodiments without inventive effort fall within the scope of the application. In addition, the term "or" as used herein refers to a non-exclusive "or" (i.e., "and/or") unless otherwise indicated. Also, the various embodiments described herein are not necessarily mutually exclusive, as some embodiments may be combined with one or more other embodiments to form new embodiments.
Existing robot positioning methods generally match the feature points of the global map directly against the feature points of the local map. Because the number of candidate point-to-point matches is large, such methods spend a great deal of time on matching.
Based on the above, the application provides a robot positioning method that matches a global feature graph formed by a plurality of feature points on the global map with a local feature graph formed by a plurality of feature points on the local map. This improves matching efficiency, reduces the matching time in the positioning process, and shortens the overall robot positioning time.
As shown in fig. 1 and 2, the robot positioning method of the present application includes the following steps. Note that the step numbers below are only for simplifying the description and do not limit the execution order; each step of this embodiment may be rearranged without departing from the technical idea of the present application.
S11: a global map is acquired, and a plurality of global feature points in the global map.
The global map and a plurality of global feature points in the global map can be obtained so as to construct the plurality of global feature points in the global map into at least one global feature pattern, then the global feature pattern formed by the plurality of feature points on the global map and the local feature pattern formed by the plurality of feature points on the local map can be matched to obtain a matching result, and finally the pose of the robot in the global map is determined based on the matching result.
The global map may be preset in an execution body of the positioning method of the robot, so that the execution body may call the global map stored by itself when executing the positioning method of the robot. In other alternative embodiments, the positioning method execution body of the robot of the present application may download the global map from the server.
Alternatively, the global map obtained based on the above method may be a rasterized map, or a directed distance field (SDF) map.
After the global map is obtained, global feature point extraction can be performed on the global map in the following manner.
For example, in the case where the global map is a grid map, SDF may be performed on the global map to obtain an SDF-processed map as shown in fig. 3; the SDF feature points on the global map are then taken as global feature points of the global map.
As shown in fig. 3, the closer a pixel is to the open central area of the SDF map, the larger its SDF value (the darker its color). In this case, the SDF-converted global map may be differentiated, SDF feature points on the global map may then be determined based on the derivative result, and the feature points shown in fig. 4 may be extracted.
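As an illustration of the grid-to-SDF conversion described above, the following is a minimal numpy sketch. The brute-force nested loop, the sign convention (positive in free space, negative inside obstacles), and the function name are assumptions for clarity, not taken from the patent; a real implementation would use a fast distance-transform algorithm.

```python
import numpy as np

def sdf_from_grid(occ):
    """Brute-force signed distance field for a small binary occupancy grid.

    occ: 2-D array with 1 = obstacle cell, 0 = free cell.
    Free cells receive +distance to the nearest obstacle; obstacle
    cells receive -distance to the nearest free cell (assumed sign
    convention; the patent does not fix one).
    """
    h, w = occ.shape
    obst = np.argwhere(occ == 1)   # obstacle cell coordinates
    free = np.argwhere(occ == 0)   # free cell coordinates
    sdf = np.zeros((h, w))
    for y in range(h):
        for x in range(w):
            targets = obst if occ[y, x] == 0 else free
            if len(targets) == 0:
                sdf[y, x] = np.inf
                continue
            d = np.min(np.hypot(targets[:, 0] - y, targets[:, 1] - x))
            sdf[y, x] = d if occ[y, x] == 0 else -d
    return sdf
```

Larger positive values mark open central areas, consistent with the darker regions described for fig. 3.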
Illustratively, second-order derivatives may be computed on the SDF-converted global map to obtain its second-order matrix, and SDF feature points on the global map may then be determined from this matrix. Because the second-order derivatives single out abrupt-change points on the global map, an accurate feature graph can be constructed from the extracted feature points, which facilitates subsequent matching. Specifically, a second-order matrix may be constructed for each pixel of the SDF global map; the determinant of each pixel's second-order matrix is then evaluated and/or a singular value decomposition is performed on it, and whether the pixel is a global feature point is decided from the determinant value and/or the singular values. For example, pixels whose second-order matrix has a determinant greater than a first threshold and/or singular values greater than a second threshold may be taken as global feature points of the global map.
The second-order matrix of the global map may be a Jacobian matrix or a Hessian matrix of the global map.
Taking the Hessian matrix as an example, the second-order matrix of each pixel point can be expressed as:

    H(x, σ) = [ I_xx(x, σ)   I_xy(x, σ)
                I_xy(x, σ)   I_yy(x, σ) ]

where I_xx(x, σ) is the second partial derivative of the pixel (x, σ) in the X direction, I_yy(x, σ) is the second partial derivative in the Y direction, and I_xy(x, σ) is the mixed second partial derivative in the X and Y directions.
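The determinant-thresholding step above can be sketched as follows. Finite differences via numpy.gradient stand in for whatever derivative scheme the patent intends, and the threshold value is an assumed free parameter.

```python
import numpy as np

def hessian_features(sdf, det_thresh):
    """Return pixels whose 2x2 Hessian has |determinant| above det_thresh.

    The Hessian entries I_xx, I_yy, I_xy are approximated with finite
    differences; abrupt-change points of the SDF map yield large
    determinant magnitudes.
    """
    Iy, Ix = np.gradient(sdf)    # first derivatives (axis 0 = y, axis 1 = x)
    Ixy, Ixx = np.gradient(Ix)   # derivatives of I_x along y and x
    Iyy, Iyx = np.gradient(Iy)   # derivatives of I_y along y and x
    det = Ixx * Iyy - Ixy * Iyx  # det(H) at every pixel
    pts = np.argwhere(np.abs(det) > det_thresh)
    return [tuple(int(v) for v in p) for p in pts]
```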
For another example, line3Dpp processing is performed on the global map to obtain global feature points on the global map.
It should be noted that the global feature points may be feature points with strong robustness, which have a certain invariance, so that a global feature pattern formed by a plurality of global feature points on the global map can show the features of the global map, so as to ensure the matching rate of the global feature pattern and the local feature pattern.
S12: the plurality of global feature points are constructed as at least one global feature pattern.
After obtaining the plurality of global feature points on the global map based on step S11, the plurality of global feature points may be constructed into at least one global feature pattern.
Alternatively, a plurality of global feature points may be input to the graph construction module to obtain at least one constructed global feature graph.
Or in other alternative embodiments, the global feature points with similar distances in the global feature points can be clustered through a clustering method such as DBSCAN or k-means to obtain at least one feature point cluster; and sequentially connecting all the global feature points of each feature point cluster to obtain a global feature graph formed by all the global feature points of each feature point cluster.
Each global feature point can be used as the vertex of the global feature graph to which the global feature point belongs.
Preferably, the global feature graph constructed by the above method is an asymmetric graph, so that there is exactly one way to match it against the corresponding local feature graph; this reduces the number of matching traversals and improves matching efficiency. For example, the global feature graph may be a polygon in which every interior angle has a different degree measure.
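The cluster-then-connect construction of step S12 can be sketched as below. A simple distance-threshold grouping stands in for DBSCAN or k-means, and ordering each cluster's points by angle around the centroid is one assumed way to "sequentially connect" them into a polygon; `eps` is an assumed parameter.

```python
import numpy as np

def cluster_points(points, eps):
    """Group points whose mutual distances chain together under eps
    (a simplified stand-in for DBSCAN)."""
    points = [tuple(p) for p in points]
    unvisited, clusters = set(range(len(points))), []
    while unvisited:
        stack, cluster = [unvisited.pop()], []
        while stack:
            i = stack.pop()
            cluster.append(points[i])
            near = [j for j in unvisited
                    if np.hypot(points[i][0] - points[j][0],
                                points[i][1] - points[j][1]) < eps]
            for j in near:
                unvisited.remove(j)
            stack.extend(near)
        clusters.append(cluster)
    return clusters

def to_polygon(cluster):
    """Order a cluster's points counter-clockwise around their centroid,
    giving the vertices of one feature polygon."""
    cx = sum(p[0] for p in cluster) / len(cluster)
    cy = sum(p[1] for p in cluster) / len(cluster)
    return sorted(cluster, key=lambda p: np.arctan2(p[1] - cy, p[0] - cx))
```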
In addition, in other embodiments, the global feature pattern may be generated in advance and stored in the memory, so that when the robot is positioned, the global feature pattern is acquired from the memory, and the acquired global feature pattern and the local feature pattern are matched to determine the position of the robot.
S13: and constructing a local map for the area where the robot is located, and acquiring a plurality of local feature points in the local map.
The local map can be constructed for the area where the robot is located, a plurality of local feature points in the local map are obtained, so that the local feature points can be subsequently constructed into at least one local feature pattern, then the global feature pattern formed by the feature points on the global map can be subsequently matched with the local feature pattern formed by the feature points on the local map, a matching result is obtained, and finally the pose of the robot in the global map is determined based on the matching result.
Alternatively, the robot may be provided with a laser sensor or an image pickup device, and the environmental information of the robot may be collected by the laser sensor or the image pickup device of the robot, and a local map may be constructed based on the environmental information of the robot. The constructed local map may be a local map with the current position of the robot as an origin. Alternatively, the constructed local map may be a rasterized map, or a directed distance field (SDF) map.
In addition, when constructing the local map, the robot can rotate in situ so that the robot can acquire environmental information of different directions.
After the local map is obtained, local feature point extraction can be performed on the local map in the following manner.
In one implementation, in the case where the local map is a grid map, the local map may be SDF-converted, and the SDF feature points on the converted map may then be taken as local feature points of the local map.
The SDF-converted local map may be differentiated, and SDF feature points on the local map may then be determined based on the derivative result.
Illustratively, second-order derivatives may be computed on the SDF-converted local map to obtain its second-order matrix, and SDF feature points on the local map may then be determined from this matrix. Because the second-order derivatives single out abrupt-change points on the local map, an accurate feature graph can be constructed from the extracted feature points, which facilitates subsequent matching. Specifically, a second-order matrix may be constructed for each pixel of the SDF local map; the determinant of each pixel's second-order matrix is then evaluated and/or a singular value decomposition is performed on it, and whether the pixel is a local feature point is decided from the determinant value and/or the singular values. For example, pixels whose second-order matrix has a determinant greater than the first threshold and/or singular values greater than the second threshold may be taken as local feature points of the local map.
The second-order matrix of the local map may be a Jacobian matrix or a Hessian matrix of the local map.
Taking the Hessian matrix as an example, the second-order matrix of each pixel point can be expressed as:

    H(x, σ) = [ I_xx(x, σ)   I_xy(x, σ)
                I_xy(x, σ)   I_yy(x, σ) ]

where I_xx(x, σ) is the second partial derivative of the pixel (x, σ) in the X direction, I_yy(x, σ) is the second partial derivative in the Y direction, and I_xy(x, σ) is the mixed second partial derivative in the X and Y directions.
In another implementation, line3Dpp processing is performed on the local map to obtain local feature points on the local map.
S14: the plurality of local feature points are constructed as at least one local feature pattern.
After obtaining the plurality of local feature points on the local map based on step S13, the plurality of local feature points may be constructed into at least one local feature pattern.
Alternatively, a plurality of local feature points may be input to the graph construction module to obtain the constructed at least one local feature graph.
Or in other alternative embodiments, clustering the feature points with similar distances in the local feature points by using a clustering method such as DBSCAN or k-means to obtain at least one class; and sequentially connecting all the local feature points of each class to obtain a local feature graph formed by all the local feature points of each class.
Each local feature point can be used as the vertex of the local feature graph to which the local feature point belongs.
Preferably, the local feature graph constructed by the above method is an asymmetric graph, so that there is exactly one way to match it against the corresponding global feature graph; this reduces the number of matching traversals and improves matching efficiency. For example, the local feature graph may be a polygon in which every interior angle has a different degree measure.
S15: and matching the global feature pattern with the local feature pattern to determine matched global feature points and local feature points.
After the global feature pattern formed by the plurality of feature points on the global map and the local feature pattern formed by the plurality of feature points on the local map are determined based on the steps, the global feature pattern and the local feature pattern can be matched to obtain a matching result, so that the pose of the robot in the global map can be determined based on the matching result.
In an application scenario, the matched global feature pattern and local feature pattern may be determined first, and then feature points matched with each other in the matched global feature pattern and local feature pattern may be determined.
Alternatively, in the case where a plurality of global feature graphs or a plurality of local feature graphs are constructed in the above steps, the global feature graph with the highest similarity to each local feature graph may be taken as the one matching that local feature graph. If only one global feature graph and one local feature graph are constructed, they can be directly determined to match.
After the matched global feature pattern and the local feature pattern are determined, the matching relationship between each local feature point in the local feature pattern and each global feature point in the global feature pattern matched with the local feature pattern can be determined based on the position of each feature point in the pattern.
For example, when the matched global feature graph and local feature graph are polygons whose interior angles all have different degree measures, the vertices corresponding to angles of the same degree in the two graphs may be taken as the matched global and local feature points.
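The angle-based vertex pairing can be sketched as below. The interior-angle computation and the degree tolerance `tol` are illustrative assumptions; uniqueness of the pairing relies on the patent's requirement that all angles of a feature polygon differ.

```python
import numpy as np

def interior_angles(poly):
    """Interior angle (degrees) at each vertex of a polygon given as an
    ordered list of (x, y) vertices."""
    n, out = len(poly), []
    for i in range(n):
        p = np.array(poly[i - 1], dtype=float)      # previous vertex
        q = np.array(poly[i], dtype=float)          # this vertex
        r = np.array(poly[(i + 1) % n], dtype=float)  # next vertex
        u, v = p - q, r - q
        cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        out.append(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))
    return out

def match_by_angle(local_poly, global_poly, tol=1.0):
    """Pair vertices whose interior angles agree within tol degrees."""
    la, ga = interior_angles(local_poly), interior_angles(global_poly)
    pairs = []
    for i, a in enumerate(la):
        for j, b in enumerate(ga):
            if abs(a - b) < tol:
                pairs.append((local_poly[i], global_poly[j]))
    return pairs
```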
For example, as shown in fig. 5, based on the positions of the feature points in each graph, it can be determined that the local feature points J1, J2, and J3 in the local feature graph of fig. 5(a) match the global feature points Q1, Q2, and Q3 in the global feature graph of fig. 5(b), respectively.
In another application scenario, after the global feature graph is constructed based on step S12, a graph description (for example, a graph shape description, a dimension description, and/or a location description of the feature points in the graph, etc.) may be added to each feature point in the global feature graph; after the local feature pattern is constructed in step S14, a pattern description (e.g., a pattern shape description, a size description, and/or a description of the position of the feature point in the pattern, etc.) may be added to each feature point in the local feature pattern; in step S15, the global feature points and the local feature points may be matched based on the graphic description, for example, it may be determined that the local feature points match the global feature points whose graphic description is the same or substantially the same.
S16: and determining the pose of the robot on the global map based on the matched global feature points and the local feature points.
After the matched global feature points and the local feature points are determined based on the steps, the pose of the robot in the global map can be determined based on the matched global feature points and the local feature points.
Alternatively, the conversion relationship between the global map and the local map may be calculated according to the matched global feature points and local feature points; and determining the pose of the robot on the global map according to the conversion relation. Illustratively, a coordinate conversion formula (i.e., the conversion relation described above) for converting from the local map to the global map may be determined based on the matched global feature points and the local feature points; substituting the pose of the robot in the local map into a coordinate conversion formula to obtain the pose of the robot in the global map.
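One standard way to realize the "conversion relation" described above is a closed-form 2-D Kabsch/Umeyama fit over the matched point pairs; the patent does not name a solver, so this choice is an assumption.

```python
import numpy as np

def fit_rigid_2d(local_pts, global_pts):
    """Least-squares rotation R and translation t such that
    global ~= R @ local + t, from matched point pairs."""
    L = np.asarray(local_pts, dtype=float)
    G = np.asarray(global_pts, dtype=float)
    lc, gc = L.mean(axis=0), G.mean(axis=0)   # centroids
    H = (L - lc).T @ (G - gc)                 # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflection
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = gc - R @ lc
    return R, t
```

The robot's pose in the local map can then be pushed through (R, t) to obtain its pose on the global map.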
In other alternative embodiments, the matched global feature points and local feature points may be input to a pose estimation module, and the pose estimation module may be used to determine the pose of the robot in the global map based on the matching relationship between the global feature points and the local feature points. The pose estimation module can solve the problem of optimization with constraint by adopting a Lagrangian multiplier method, so that the pose of the robot in the global map is determined based on the matching relation of the global feature points and the local feature points by the Lagrangian multiplier method.
In addition, before determining the pose of the robot in the global map from the matched feature points, outliers among the matched global and local feature points may first be removed, and the pose then determined from the remaining matches; this prevents outliers from degrading the positioning accuracy of the robot. For example, matched points may be rejected as outliers when the spread between the global feature points and the local feature points exceeds a threshold. Alternatively, a chi-square test may be used to reject a fixed proportion of outliers among the matched global and local feature points.
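The outlier-removal step can be sketched as a residual trim: after a provisional (R, t) fit, drop the worst-fitting fraction of pairs and re-estimate. The fixed `keep` fraction is an assumed simplification of the chi-square criterion the patent mentions.

```python
import numpy as np

def reject_outliers(local_pts, global_pts, R, t, keep=0.8):
    """Keep the `keep` fraction of matched pairs with the smallest
    residuals under the provisional transform (R, t)."""
    L = np.asarray(local_pts, dtype=float)
    G = np.asarray(global_pts, dtype=float)
    res = np.linalg.norm((R @ L.T).T + t - G, axis=1)  # per-pair residual
    order = np.argsort(res)
    n = max(2, int(round(keep * len(order))))          # keep at least 2 pairs
    idx = sorted(order[:n])
    return [local_pts[i] for i in idx], [global_pts[i] for i in idx]
```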
In this embodiment, a global map and a plurality of global feature points in it are acquired; the global feature points are constructed into at least one global feature graph with the points as its vertices; a local map is constructed for the area where the robot is located and a plurality of local feature points in it are acquired; the local feature points are constructed into at least one local feature graph with the points as its vertices; the global and local feature graphs are matched to determine matched global and local feature points; and the pose of the robot on the global map is determined from the matched points. In other words, a global feature graph formed by feature points on the global map is matched with a local feature graph formed by feature points on the local map, and the pose of the robot in the global map is determined from the graph matching result. Matching graphs instead of individual points improves feature point matching efficiency, reduces the matching time during positioning, and shortens the overall robot positioning time.
As shown in fig. 2, a second embodiment of the robot positioning method of the present application includes the following steps. It should be noted that the step numbers below (the step numbers are not labeled in fig. 2; each step can be matched to the corresponding content in fig. 2) are used only to simplify the description and are not intended to limit the execution order of the steps; the steps of this embodiment may be rearranged without departing from the technical idea of the present application.
S201: and initializing parameters.
The relevant parameters required by the system may be initialized.
S202: and initializing a Gaussian kernel function template.
S203: the global map is initialized.
If the global map is successfully initialized, the process proceeds to step S204; otherwise, the process proceeds to step S220.
Alternatively, in step S203, the global map may be loaded and converted into an SDF (signed distance field) map, SDF feature points may be extracted as global feature points (the extracted feature points need to be sufficiently robust) and input to the graph building module, and a descriptor based on graph features may be added for each global feature point.
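Assuming the global map starts as a binary occupancy grid, the SDF conversion can be sketched with two Euclidean distance transforms (the function name and the sign convention, positive in free space, are illustrative assumptions):

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def grid_to_sdf(occupancy, resolution=1.0):
    """Convert a binary occupancy grid (True = occupied) into a signed
    distance field: positive in free space, negative inside obstacles."""
    outside = distance_transform_edt(~occupancy)  # distance to nearest obstacle
    inside = distance_transform_edt(occupancy)    # distance to nearest free cell
    return (outside - inside) * resolution
```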
S204: a local map is collected.
The robot can be controlled to rotate in place to construct a local map of its current location.
S205: the local map is processed by an image processing unit and an SDF processing unit.
The image processing unit may be used to perform Gaussian deblurring on the local map; the SDF processing unit then converts the Gaussian-deblurred local grid map into an SDF local map.
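The patent does not say whether the Gaussian stage is plain smoothing or true deconvolution; the minimal sketch below assumes simple Gaussian low-pass filtering with the kernel template initialized in S202 (the function name and the sigma value are illustrative assumptions):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def smooth_local_map(grid, sigma=1.0):
    """Smooth a local grid map with a Gaussian kernel so that the later
    Hessian-based feature detection is less sensitive to sensor noise."""
    return gaussian_filter(grid.astype(float), sigma=sigma)
```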
S206: and utilizing the characteristic point detection unit to solve hessian values of the current pixel point in the partial image.
S207: it is determined whether hessian values of the current pixel point in the partial image are greater than a first threshold.
If the Hessian value of the current pixel point in the local map is greater than the first threshold, step S208 is performed to further determine whether the current pixel point is a local feature point; otherwise, it can be determined that the current pixel point is not a local feature point, the next pixel point is taken as the current pixel point, and the process returns to step S206 to determine whether the Hessian value and/or the feature value of that pixel point satisfy the preset conditions, i.e., whether it is a local feature point.
S208: and determining whether the characteristic value of the current pixel point in the local image meets the edge requirement.
If the feature value (which may be a singular value) of the current pixel point in the local map meets the edge requirement, the current pixel point is confirmed as a local feature point, and the process enters step S209 to save the local feature point on the local map; otherwise, it is determined that the current pixel point is not a local feature point, the next pixel point is taken as the current pixel point, and the process returns to step S206 to determine whether the Hessian value and the feature value of that pixel point satisfy the preset conditions, i.e., whether it is a local feature point.
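Steps S206 to S208 can be sketched together. The patent does not give the exact Hessian response or edge test; the version below assumes a determinant-of-Hessian gate and a SIFT-style eigenvalue-ratio edge rejection, with `hessian_thresh` and `edge_ratio` as illustrative parameters:

```python
import numpy as np

def detect_sdf_features(sdf, hessian_thresh=0.05, edge_ratio=10.0):
    """Hypothetical sketch of S206-S208: flag pixels whose Hessian
    determinant is large (strong curvature in both directions) while
    rejecting edge-like responses via the eigenvalue ratio, as in SIFT."""
    gy, gx = np.gradient(sdf)
    gyy, gyx = np.gradient(gy)
    gxy, gxx = np.gradient(gx)
    det = gxx * gyy - gxy * gyx   # product of the two eigenvalues
    tr = gxx + gyy                # sum of the two eigenvalues
    strong = det > hessian_thresh # S207: Hessian response gate
    # S208: reject edges, where one eigenvalue dominates the other
    r = edge_ratio
    not_edge = tr ** 2 < ((r + 1) ** 2 / r) * np.maximum(det, 1e-12)
    return np.argwhere(strong & not_edge)
```

The eigenvalue-ratio test uses only the trace and determinant, so the eigenvalues themselves never need to be computed explicitly.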
S209: and saving the local feature points.
After the currently confirmed local feature point is saved, step S210 may be performed to confirm whether feature detection is complete.
S210: whether the feature is detected.
Step S210 confirms whether feature detection is complete. If it is, step S211 may be entered to add a graph descriptor for each feature point on the local map; that is, the local feature points on the local map are constructed into at least one local feature graph, and each feature point on the local map is given information on the local feature graph to which it belongs and/or its position in that graph. If feature detection is not complete, the process returns to step S206 to determine whether the Hessian value and the feature value of the next pixel point satisfy the preset conditions, i.e., whether the next pixel point is a local feature point.
Whether the current pixel point is the last pixel point on the local map can be checked: if it is, feature detection is complete; otherwise, it is not. Of course, completion can be confirmed in other ways, for example, by checking whether all pixel points on the local map have been examined: if they have, feature detection is confirmed to be complete; otherwise, it is not.
S211: the feature points employ descriptors based on graph features.
S212: whether feature matching was successful.
The local feature points and the global feature points are matched using the graph descriptors; if matching succeeds, step S213 is entered; otherwise, the process returns to step S204.
S213: whether the maximum number of iterations is reached.
If the maximum number of iterations is reached, step S214 is entered; otherwise, the process returns to step S204.
S214: and (5) pose estimation.
The pose of the robot is estimated using the successfully matched local and global feature points.
Alternatively, the Lagrange multiplier method may be used to solve the constrained optimization problem of pose estimation.
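The patent names the Lagrange multiplier method; as a hedged alternative sketch, the same rotation-constrained least-squares problem (the constraint that R be orthogonal is what the multipliers would enforce) has the classical closed-form SVD solution, known as the Kabsch or orthogonal Procrustes solution:

```python
import numpy as np

def estimate_pose_2d(local_pts, global_pts):
    """Closed-form solution of min ||R p_local + t - p_global||^2
    subject to R^T R = I, via SVD (Kabsch / orthogonal Procrustes).
    Returns the 2x2 rotation R and the translation t."""
    mu_l, mu_g = local_pts.mean(0), global_pts.mean(0)
    H = (local_pts - mu_l).T @ (global_pts - mu_g)  # 2x2 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = mu_g - R @ mu_l
    return R, t
```

The returned (R, t) is exactly the conversion relation between the local map and the global map that the later pose output relies on.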
S215: and calculating the spread.
The dispersion of the matched global and local feature points can be calculated, so that it can later be used to confirm whether any of the matched global and local feature points are outliers that need to be removed.
S216: whether the spread is less than a threshold.
If the dispersion of the global and local feature points is less than the threshold, step S218 may be entered; otherwise, step S217 is performed to remove outliers using the chi-square distribution.
S217: and removing outliers by using chi-square distribution.
If the dispersion of the global and local feature points is greater than the threshold, it can be confirmed that there are outliers among them; in this case, outliers can be removed using the chi-square distribution to improve the matching accuracy of the global and local feature points and thereby the positioning accuracy of the robot.
After a certain proportion of outliers has been removed using the chi-square distribution, the process may return to step S213.
S218: outputting the pose.
If the dispersion of the global and local feature points is less than the threshold, the global and local feature points are matched relatively accurately, i.e., the robot is positioned relatively accurately, and the estimated repositioning pose of the robot can be output.
S219: and (5) finishing initialization.
S220: and (5) ending.
Optionally, the application provides a robot positioning device, which comprises a map acquisition module, a feature graph module, a feature matching module and a pose positioning module.
The map acquisition module is used for constructing a local map for the area where the robot is located and acquiring a plurality of local feature points in the local map;
the feature pattern module is used for constructing the local feature points into at least one local feature pattern, and the local feature points are used as vertexes of the local feature pattern;
the feature matching module is used for matching the global feature graph with the local feature graph to determine matched global feature points and local feature points, and the global feature graph is constructed by a plurality of global feature points on a global map;
The pose positioning module is used for determining the pose of the robot on the global map based on the matched global feature points and the local feature points.
The map acquisition module is used for acquiring a global map and a plurality of global feature points in the global map. The feature graph module is used for constructing a plurality of global feature points into at least one global feature graph, and the global feature points serve as vertexes of the global feature graph.
The map acquisition module is used for acquiring a rasterized map and converting the rasterized map into a directed distance field map; and deriving the directed distance field map, and determining the characteristic points based on the derivation result.
The map acquisition module is used for carrying out second-order derivation on the directed distance field map to obtain a second-order matrix of the directed distance field map; and determining the characteristic points according to the second-order matrix.
The feature pattern module is used for constructing a plurality of feature points into at least one feature pattern, wherein the feature pattern is a polygon with different degrees of each angle.
The feature matching module is used for determining matched global feature patterns and local feature patterns and vertexes corresponding to angles with the same degree in the matched global feature patterns and local feature patterns; and taking the determined vertexes as matched global characteristic points and local characteristic points.
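A hedged sketch of this angle-based vertex matching (the polygon representation, the use of interior angles, and the tolerance are illustrative; the document only specifies that the feature graph is a polygon whose angles all differ in degree):

```python
import numpy as np

def interior_angles(polygon):
    """Interior angle at each vertex of a polygon given as an (N, 2)
    array of vertices in order."""
    n = len(polygon)
    angles = np.empty(n)
    for i in range(n):
        a = polygon[(i - 1) % n] - polygon[i]
        b = polygon[(i + 1) % n] - polygon[i]
        cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        angles[i] = np.arccos(np.clip(cos, -1.0, 1.0))
    return angles

def match_vertices_by_angle(global_poly, local_poly, tol=np.radians(2)):
    """Pair each local vertex with the global vertex whose interior
    angle is (nearly) the same; this works because every angle in the
    feature graph is distinct by construction."""
    ga, la = interior_angles(global_poly), interior_angles(local_poly)
    pairs = []
    for i, ang in enumerate(la):
        j = int(np.argmin(np.abs(ga - ang)))
        if abs(ga[j] - ang) < tol:
            pairs.append((j, i))  # (global vertex index, local vertex index)
    return pairs
```

Because interior angles are invariant to rotation and translation, the pairing survives the unknown pose offset between the two maps.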
The pose positioning module is used for calculating the conversion relation between the global map and the local map according to the matched global feature points and the local feature points; and determining the pose of the robot on the global map according to the conversion relation.
The pose positioning module is used for removing abnormal points in the matched global characteristic points and local characteristic points; and determining the pose of the robot on the global map based on the global feature points and the local feature points which are matched after the abnormal points are removed.
Referring to fig. 6, fig. 6 is a schematic structural diagram of an electronic device 200 according to an embodiment of the application. The electronic device 200 of the application comprises a processor 21, the processor 21 being adapted to execute a computer program to implement the method provided by any of the above method embodiments of the application, and any non-conflicting combination thereof.
The processor 21 may also be referred to as a CPU (Central Processing Unit). The processor 21 may be an integrated circuit chip with signal processing capabilities. The processor 21 may also be a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components. A general purpose processor may be a microprocessor, or the processor 21 may be any conventional processor or the like.
The electronic device 200 may further comprise a memory 22 for storing the computer program needed for the operation of the processor 21.
Referring to fig. 7, fig. 7 is a schematic structural diagram of a computer storage medium according to an embodiment of the application. The computer storage medium 300 of an embodiment of the present application stores a computer program that, when executed, implements the method provided by any of the above method embodiments of the present application, and any non-conflicting combination thereof. The computer program may form a program file stored in the above-mentioned storage medium 300 in the form of a software product, for causing a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium 300 includes: a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or other various media capable of storing program code, or a device such as a computer, a server, a mobile phone, or a tablet.
Further, the present application also provides a computer program product comprising a computer program. The computer program may be stored on a non-transitory computer readable storage medium and, when executed by a processor, is capable of performing the method provided by the above method embodiments, the method comprising: constructing a local map for the area where the robot is located, and acquiring a plurality of local feature points in the local map; constructing the local feature points into at least one local feature graph, with the local feature points serving as vertices of the local feature graph; matching a global feature graph with the local feature graph to determine matched global and local feature points, wherein the global feature graph is constructed from a plurality of global feature points on a global map; and determining the pose of the robot on the global map based on the matched global and local feature points.
In the several embodiments provided in the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of elements is merely a logical functional division, and there may be additional divisions of actual implementation, e.g., multiple elements or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The foregoing is only the embodiments of the present application, and therefore, the patent scope of the application is not limited thereto, and all equivalent structures or equivalent processes using the descriptions of the present application and the accompanying drawings, or direct or indirect application in other related technical fields, are included in the scope of the application.

Claims (8)

1. A method of positioning a robot, the method comprising:
acquiring a global map and a plurality of global feature points in the global map;
Constructing the global feature points into at least one global feature graph, wherein the global feature points are used as vertexes of the global feature graph;
Constructing a local map for the area where the robot is located, and acquiring a plurality of local feature points in the local map;
Constructing the local feature points into at least one local feature graph, wherein the local feature points serve as vertexes of the local feature graph;
matching the global feature pattern with the local feature pattern to determine matched global feature points and local feature points;
determining the pose of the robot on the global map based on the matched global feature points and local feature points;
The obtaining the plurality of global feature points in the global map, and the obtaining the plurality of local feature points in the local map, each includes:
acquiring a rasterized map, and converting the rasterized map into a directed distance field map;
and deriving the directed distance field map, and determining characteristic points based on a derivation result.
2. The positioning method according to claim 1, wherein before said matching the global feature pattern and the local feature pattern, the method comprises:
Constructing a plurality of global feature points as at least one global feature graph, wherein the plurality of global feature points are a plurality of feature points on the global map;
The constructing a plurality of global feature points into at least one global feature pattern, and the constructing the plurality of local feature points into at least one local feature pattern, each includes:
the method comprises the steps of constructing a plurality of feature points into at least one feature graph, wherein the feature graph is a polygon with different degrees of each angle.
3. The positioning method according to claim 2, wherein said matching the global feature pattern and the local feature pattern to determine matched global feature points and local feature points includes:
Determining matched global feature patterns and local feature patterns, and vertices corresponding to angles with the same degree in the matched global feature patterns and local feature patterns;
and taking the determined vertexes as matched global characteristic points and local characteristic points.
4. The positioning method according to claim 1, wherein determining the pose of the robot on the global map based on the matched global feature points and local feature points comprises:
calculating the conversion relation between the global map and the local map according to the matched global feature points and local feature points;
and determining the pose of the robot on the global map according to the conversion relation.
5. A positioning device of a robot, characterized in that the positioning device comprises:
The map acquisition module is used for acquiring a global map and a plurality of global feature points in the global map, constructing a local map for the area where the robot is located and acquiring a plurality of local feature points in the local map; the map acquisition module is used for acquiring a rasterized map, converting the rasterized map into a directed distance field map, deriving the directed distance field map, and determining characteristic points based on a derivation result;
the feature graph module is used for constructing the global feature points into at least one global feature graph, wherein the global feature points are used as vertexes of the global feature graph, the local feature points are used for constructing the local feature points into at least one local feature graph, and the local feature points are used as vertexes of the local feature graph;
The feature matching module is used for matching the global feature graph with the local feature graph to determine matched global feature points and local feature points, and the global feature graph is constructed by a plurality of global feature points on a global map;
and the pose positioning module is used for determining the pose of the robot on the global map based on the matched global feature points and the local feature points.
6. An electronic device comprising a processor and a memory, the memory for storing a computer program, the processor for executing the computer program to implement the method of any of claims 1-4.
7. A computer storage medium, characterized in that it stores a computer program for being executed to implement the method of any one of claims 1-4.
8. A computer program product comprising a computer program which, when executed by a processor, implements the method according to any one of claims 1 to 4.
CN202210096019.5A 2022-01-26 2022-01-26 Positioning method and device of robot, electronic equipment and storage medium Active CN114627182B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210096019.5A CN114627182B (en) 2022-01-26 2022-01-26 Positioning method and device of robot, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN114627182A (en) 2022-06-14
CN114627182B (en) 2024-08-13

Family

ID=81897636


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113256712A (en) * 2021-06-01 2021-08-13 北京有竹居网络技术有限公司 Positioning method, positioning device, electronic equipment and storage medium
CN113298871A (en) * 2021-05-14 2021-08-24 视辰信息科技(上海)有限公司 Map generation method, positioning method, system thereof, and computer-readable storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110148170A (en) * 2018-08-31 2019-08-20 北京初速度科技有限公司 A kind of positioning initialization method and car-mounted terminal applied to vehicle location
CN110084853A (en) * 2019-04-22 2019-08-02 北京易达图灵科技有限公司 A kind of vision positioning method and system
WO2021253430A1 (en) * 2020-06-19 2021-12-23 深圳市大疆创新科技有限公司 Absolute pose determination method, electronic device and mobile platform




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant