CN112428271B - Robot real-time motion planning method based on multi-mode information feature tree
- Publication number: CN112428271B (application number CN202011262159.2A)
- Authority: CN (China)
- Prior art keywords: feature, node, map, point set, points
- Prior art date: 2020-11-12
- Legal status: Active
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The invention discloses a robot real-time motion planning method based on a multi-modal information feature tree. The method acquires an environment map, extracts the pose information of the robot in real time, and judges whether the characterization of the environment map by the feature points is complete. If it is not complete, a first feature extraction is performed on the robot node poses using angular velocity and distance fusion until the characterization by the feature points is complete, yielding the optimal feature points and the final feature point set, and the feature map is updated; if it is complete, the environment map is updated using the Euclidean distance to obtain the feature map. A feature matrix is then obtained from the feature point set, and a multi-modal information feature tree of the feature points is generated from the starting point, the end point and the feature matrix to obtain a heuristic path. By constructing the multi-modal information feature tree and extracting feature points in real time, the invention quickly finds candidate nodes in the feasible region to optimize random-sampling-based path planning, thereby resolving the trap-space problem and improving the intelligence of the mobile robot.
Description
Technical Field
The invention relates to the technical field of robot motion planning, in particular to a robot real-time motion planning method based on a multi-mode information feature tree.
Background
The development of artificial intelligence technology has created opportunities for research on mobile service robots. Guide robots, floor-sweeping robots, shopping-guide robots, goods-handling robots and other mobile service robots have been successfully deployed in airports, supermarkets, museums, homes and many other environments. Path planning for a mobile robot means finding, without human intervention, a collision-free path between a given start point and a given target point that satisfies specified constraints. The main path planning methods can currently be divided roughly into four types: grid-based path planning, artificial-potential-field-based path planning, reward-based path planning, and random-sampling-based motion planning. Among these four, the random-sampling-based path planning algorithms require little computation, avoid obstacles flexibly, and need no model of the environment, which greatly reduces planning time and memory cost, guarantees real-time performance, and makes them better suited to path planning in dynamic environments. However, in a complex, dynamic human-robot fusion environment, for example a maze, S-curve or slit scene with dynamic obstacles such as pedestrians, the robot does not simply move in a purely static environment but must also account for pedestrians moving randomly. Because the proportion of the whole map occupied by the feasible region drops sharply, the probability of randomly sampling the feasible region decreases, and a local planning algorithm easily falls into a trap space, which lengthens the planning time and may even cause planning to fail. Moreover, the robot has no concept of brain-like memory: even after traversing the same scene many times, it must plan the path from scratch the next time, wasting computing resources.
Disclosure of Invention
The invention aims to solve the technical problem of providing a robot real-time motion planning method based on a multi-mode information feature tree, which solves the trap space problem, realizes node multiplexing, ensures real-time performance and saves time cost.
In order to solve the technical problem, the invention provides a robot real-time motion planning method based on a multi-mode information feature tree, which comprises the following steps:
step 1: acquiring an environment map, extracting node pose information of the robot in real time, judging whether the representation degree of the feature point to the environment map is complete, if not, executing the step 2, and if so, executing the step 4;
step 2: performing primary feature extraction on the node pose of the robot by using an angular velocity channel and distance fusion to obtain an initial feature point set, and updating a feature map;
step 3: further extracting the initial feature point set by using distance fusion again until the degree of characterization of the feature points is complete to obtain the optimal feature points, and obtaining a final feature point set;
step 4: updating the environment map by using the Euclidean distance to obtain a feature map, and finishing the real-time feature extraction;
step 5: obtaining a feature matrix according to the feature point set, generating a multi-modal information feature tree of the feature points according to the starting point, the end point and the feature matrix, and obtaining a heuristic path on the feature map.
Further, the method for judging in step 1 whether the degree of characterization of the environment map by the feature points is complete is as follows: before the terminal is opened for path planning, check whether the current environment map has a corresponding feature file; if it does, judge whether the degree of characterization is complete from the feature points in the feature file; if it does not, judge the degree of characterization to be incomplete; and when the robot finishes path planning on the environment map and the terminal is closed, generate the corresponding feature file of the environment map.
Further, the specific process of step 1 is as follows:
step 1-1: giving a navigation target, starting path planning by the robot based on the Risk-RRT algorithm, and acquiring the node pose information G_m = (x_m, y_m, ω_m) of the robot during path planning in real time, wherein x and y represent the position of a node and ω represents the angular velocity of the node;
step 1-2: judging whether the feature file exists each time the terminal is opened, and computing, over the points of the obstacle-free area on the map, the ratio of the number of points having a corresponding feature point number to all points of the obstacle-free area through the feature file; this ratio, denoted representation, is used to represent the degree of characterization;
if no feature file exists, letting representation = 0, initializing the feature map M_F as a two-dimensional zero matrix with the same size as the environment map, initializing the feature point set G_FP as an empty set, and executing step 2;
if the feature file exists but the degree of characterization is not complete, storing the nodes G_new = (x_n, y_n) saved in the feature file into the feature point set G_FP, and executing step 2;
if the feature file exists and the degree of characterization is complete, storing the nodes G_new = (x_n, y_n) saved in the feature file into the feature point set G_FP, and executing step 4.
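For illustration, a minimal Python sketch of the characterization-degree check of step 1-2 is given below. It assumes the file layout described later in the embodiment (fp.txt for the feature points, feature_map.txt for the feature map) and an illustrative convention that unassigned cells of the feature map hold -1; all function and variable names are hypothetical and not part of the claimed method.

```python
import os
import numpy as np

def check_representation(map_dir, env_map):
    """Return (feature_points, representation) for the current environment map.

    env_map: 2D array with 0 for obstacle-free cells and 255 for obstacles.
    The feature map file stores, for every cell, the number of the feature
    point assigned to it (-1 is used here to mean "not yet assigned").
    """
    fp_file = os.path.join(map_dir, "fp.txt")
    fmap_file = os.path.join(map_dir, "feature_map.txt")

    if not os.path.exists(fp_file):            # first run on this map
        return np.empty((0, 2)), 0.0

    feature_points = np.loadtxt(fp_file).reshape(-1, 2)

    if not os.path.exists(fmap_file):          # points exist, coverage unknown
        return feature_points, 0.0

    feature_map = np.loadtxt(fmap_file)
    free = (env_map == 0)
    covered = np.logical_and(free, feature_map >= 0)
    return feature_points, covered.sum() / max(free.sum(), 1)

# representation == 1.0 means the characterization is complete (go to step 4);
# any smaller value is treated as incomplete (go to step 2).
```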
Further, the specific process of step 2 is as follows:
step 2-1: using the angular velocity channel to extract nodes whose angular velocity is greater than a preset threshold, denoted G_new = (x_n, y_n);
step 2-2: if the degree of characterization is not complete, executing step 2-3; if the degree of characterization is complete, executing step 2-5;
step 2-3: judging whether the feature point set G_FP is empty; if it is empty, directly storing the node G_new into the feature point set G_FP and returning to execute step 2-1; if not, executing step 2-4;
step 2-4: using the Euclidean distance channel, sequentially calculating the Euclidean distance between the node G_new and every node G_i = (x_i, y_i) in the feature point set G_FP, and judging whether it is less than or equal to a preset first distance fusion threshold; if so, returning to execute step 2-1;
if the distances between the node G_new and all the nodes in the feature point set G_FP are greater than the preset first distance fusion threshold, adding the node G_new to the feature point set G_FP and carrying G_FP into step 3;
step 2-5: sequentially calculating the Euclidean distance between the node G_new and every node G_i = (x_i, y_i) in the feature point set G_FP, and judging whether it is less than or equal to the preset first distance fusion threshold; if so, executing step 2-6, and if not, executing step 4;
step 2-6: sequentially performing collision detection between the grid points in the feature map M_F and the node G_new; if all collision detections pass and the sum of the distances between the grid points in the feature map M_F and the node G_new is less than the sum of the distances between those grid points and the original feature node, replacing the original feature node G_i = (x_i, y_i) with the node G_new, updating the feature map, and carrying G_FP into step 4; otherwise, executing step 5;
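A minimal sketch of the angular-velocity channel and the first distance fusion of step 2 follows, assuming a pose stream supplied by the caller; the fusion threshold and sampling interval are taken from the embodiment, while the angular-velocity threshold and all names are assumptions for illustration.

```python
import math

FIRST_FUSION_THRESH = 50   # grid cells; 2.7 m / 0.054 m per cell in the embodiment
OMEGA_THRESH = 0.3         # rad/s, assumed angular-velocity threshold
SAMPLE_INTERVAL = 15       # keep every 15th pose message, as in the embodiment

def first_feature_extraction(pose_stream, feature_points):
    """Angular-velocity channel followed by the first distance fusion.

    pose_stream: iterable of (x, y, omega) robot poses G_m.
    feature_points: list of (x, y) points already in G_FP (modified in place).
    """
    for k, (x, y, omega) in enumerate(pose_stream):
        if k % SAMPLE_INTERVAL != 0:           # thin out near-duplicate poses
            continue
        if abs(omega) <= OMEGA_THRESH:         # keep only sharp-turn nodes G_new
            continue
        g_new = (x, y)
        if not feature_points:                 # step 2-3: empty set, accept directly
            feature_points.append(g_new)
            continue
        dists = [math.hypot(x - fx, y - fy) for fx, fy in feature_points]
        if min(dists) > FIRST_FUSION_THRESH:   # step 2-4: far from every existing point
            feature_points.append(g_new)
    return feature_points
```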
Further, when the angular velocity channel is used in step 2-1 to extract the nodes whose angular velocity is greater than the preset threshold, the nodes are sampled at a fixed sequence-number interval to reduce node redundancy.
Further, the specific process of step 3 is as follows:
step 3-1: initializing the count c = 0, and initializing the feature point set G_R to be saved as an empty set; extracting the set of neighboring feature nodes of each node G_i = (x_i, y_i) in the feature point set G_FP;
step 3-2: initializing i = 1;
step 3-3: judging whether i is less than or equal to the number n of the feature point set G_FP; if so, executing step 3-4, otherwise executing step 3-6;
step 3-4: judging whether the number of neighboring feature nodes of the i-th node is less than or equal to 3; if so, performing step 3-5, otherwise letting i = i + 1 and returning to execute step 3-3;
step 3-5: adding the node G_i = (x_i, y_i) itself and the nodes in its neighboring feature node set to the point set G_R; judging whether the number of its neighboring feature nodes is less than or equal to 1, and if so letting the count c = c + 1; if i is not equal to the number n of the feature point set G_FP, letting i = i + 1 and returning to execute step 3-3, otherwise executing step 3-6;
step 3-6: performing deduplication on the point set G_R; if the number of points in the point set G_R is not less than n − 1, where n is the number of the feature point set G_FP, updating the feature map and carrying the feature point set G_FP into step 4; otherwise executing step 3-7;
step 3-7: adding all nodes of the feature point set G_FP other than those in the point set G_R to the point set G_D, performing the second distance fusion to delete feature points and updating the feature map; the feature point set G_FP at this moment is the final feature point set G_FP', and G_FP' is carried into step 4;
Further, the specific method of distance fusion is as follows:
connecting the node G_x in the point set with each remaining node G_x1 in the point set, performing collision detection and Euclidean distance calculation on the map, and, if the collision detection passes and the distance is less than the manually set distance fusion threshold, deleting the node G_x1 from the point set, until all points in the set have been detected.
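The distance fusion just described can be sketched as follows. The straight-line collision check is implemented here by sampling the segment on a numpy occupancy grid, which is one possible realization rather than necessarily the one used by the invention, and all names are illustrative.

```python
import math
import numpy as np

def line_is_free(grid, p, q):
    """Collision detection: sample the segment p->q and require every cell to be free (0)."""
    steps = int(max(abs(q[0] - p[0]), abs(q[1] - p[1]))) + 1
    for t in np.linspace(0.0, 1.0, steps):
        r = int(round(p[0] + t * (q[0] - p[0])))
        c = int(round(p[1] + t * (q[1] - p[1])))
        if grid[r, c] != 0:
            return False
    return True

def distance_fusion(points, grid, thresh):
    """Delete every point that is visible from, and closer than thresh to, a point kept before it."""
    kept = []
    for p in points:
        redundant = any(
            line_is_free(grid, p, q) and math.hypot(p[0] - q[0], p[1] - q[1]) < thresh
            for q in kept)
        if not redundant:
            kept.append(p)
    return kept
```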
Further, the specific process of step 4 is as follows:
step 4-1: initializing a flag bit to 0, which indicates whether all points of the obstacle-free area on the map have found their corresponding feature point numbers, and converting all node coordinates in G_FP' into grid coordinates, denoted G_i1 = (x_i1, y_i1);
step 4-2: traversing all grid points (i_k, j_k) of the obstacle-free area of the two-dimensional grid map M_P, initializing count = 0, and initializing min_dist = map length × map width; connecting grid point (i_k, j_k) in turn with the feature nodes G_i1 of G_FP' and performing collision detection on the map along the connecting line; if the collision detection passes, calculating the Euclidean distance dist_1 between the two points, and if dist_1 is less than min_dist, updating the value of grid point (i_k, j_k) in the feature map M_F to the number i of the feature node and updating min_dist to the value of dist_1; if the collision detection fails, letting count = count + 1; continuing until all nodes in G_FP' have been traversed; if count equals the number n of G_FP', setting the flag bit to 1, exiting the loop and returning to step 2; if count ≠ n, repeating step 4-2 until all grid points of the obstacle-free area of the two-dimensional grid map M_P have been traversed, obtaining the feature map M_F;
step 4-3: if the flag bit equals 0, the degree of characterization is complete and representation = 1; otherwise representation = 0.
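A possible sketch of the feature map construction of step 4 is shown below; it reuses a line_is_free collision check such as the one sketched above, and the convention that -1 marks a cell whose feature node has not yet been found is introduced here purely for illustration.

```python
import numpy as np

def build_feature_map(env_map, feature_cells, line_is_free):
    """Assign every obstacle-free grid cell the index of the nearest visible feature node.

    env_map: 2D array, 0 = free, 255 = obstacle.
    feature_cells: list of (row, col) grid coordinates of the points in G_FP'.
    Returns (feature_map, complete); complete becomes False as soon as one free
    cell cannot be connected to any feature node, mirroring steps 4-2 and 4-3.
    """
    feature_map = np.full(env_map.shape, -1, dtype=int)
    for (r, c) in zip(*np.nonzero(env_map == 0)):
        best, best_dist = -1, np.inf
        for i, f in enumerate(feature_cells):
            if not line_is_free(env_map, (r, c), f):
                continue
            d = np.hypot(r - f[0], c - f[1])
            if d < best_dist:
                best, best_dist = i, d
        if best < 0:          # every collision check failed: characterization incomplete
            return feature_map, False
        feature_map[r, c] = best
    return feature_map, True
```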
Further, the specific process of step 5 is as follows:
step 5-1: selecting a node G_FPi of G_FP', connecting each remaining node G_FPx with the node G_FPi, and performing collision detection on the map along the connecting line of the two points; if the collision detection passes, storing the Euclidean distance between the two points into the feature matrix entry P_F(i, x), otherwise storing 0 into P_F(i, x);
step 5-2: finding in the feature map M_F the feature node number corresponding to the position of the starting point, adding it to the multi-modal information feature tree T to complete the initialization of T, and initializing the open node set T_open = T;
step 5-3: judging whether the number of open nodes is greater than zero; if so, performing step 5-4; if not, performing step 5-10;
step 5-4: taking the first open node t_open and removing t_open from the open node set T_open;
step 5-5: extracting the area of interest AOI of t_open from the feature map M_F;
step 5-6: extracting all feature nodes X_new adjacent to t_open within the area of interest AOI;
step 5-7: judging whether the set of adjacent feature nodes X_new is empty; if so, executing step 5-3; if not, executing step 5-8;
step 5-8: taking the first node x_new of X_new, finding the nearest neighbor x_parent of x_new on the multi-modal information feature tree T, and judging whether x_parent exists; if it does not exist, removing x_new from X_new and returning to execute step 5-7; if it exists, performing step 5-9;
step 5-9: adding x_new to the multi-modal information feature tree T and the open node set T_open, removing x_new from X_new, and returning to execute step 5-7;
step 5-10: finding in the feature map M_F the feature node corresponding to the position of the end point, and finding a path from the starting point to the end point by searching the multi-modal information feature tree T backwards;
step 5-11: returning to execute step 2-1.
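For illustration only, the sketch below grows a simplified multi-modal information feature tree and reads a heuristic path off it. It replaces the area-of-interest neighbour search of steps 5-5 and 5-6 with the connectivity already recorded in the feature matrix P_F and reduces the nearest-neighbour test of step 5-8 to a membership test, so it is a simplification of the claimed procedure rather than a faithful implementation; all names are hypothetical.

```python
import numpy as np

def grow_feature_tree(feature_map, feat_matrix, start, goal):
    """Grow a simplified feature tree from the start node and trace the goal back to it.

    feature_map: grid of feature-node numbers from step 4.
    feat_matrix:  P_F from step 5-1 (distance > 0 means the two nodes are connectable).
    start, goal:  grid cells (row, col) of the start and end positions.
    """
    start_node = int(feature_map[start])     # step 5-2: node number under the start cell
    goal_node = int(feature_map[goal])
    parent = {start_node: None}              # tree T stored as parent pointers
    open_nodes = [start_node]                # open node set T_open

    while open_nodes:                        # steps 5-3 to 5-9
        t_open = open_nodes.pop(0)
        for x_new in range(len(feat_matrix)):
            if feat_matrix[t_open, x_new] > 0 and x_new not in parent:
                parent[x_new] = t_open
                open_nodes.append(x_new)

    if goal_node not in parent:              # goal not reachable through the tree
        return []
    path, node = [], goal_node               # step 5-10: backward search from the end point
    while node is not None:
        path.append(node)
        node = parent[node]
    return path[::-1]
```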
Further, the specific method in step 5-5 of extracting the area of interest AOI of t_open from the feature map M_F is as follows:
in the feature map M_F, searching for all points whose value equals t_open, recording the extreme coordinate values x_min, x_max, y_min, y_max among those points, and selecting the rectangular area bounded by these coordinate extremes as the area of interest AOI.
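The area-of-interest extraction amounts to taking the bounding box of all cells labelled with t_open, as in the short sketch below; the function name is illustrative.

```python
import numpy as np

def extract_aoi(feature_map, t_open):
    """Return the bounding box (x_min, x_max, y_min, y_max) of all cells whose value is t_open."""
    rows, cols = np.nonzero(feature_map == t_open)
    if rows.size == 0:
        return None
    return rows.min(), rows.max(), cols.min(), cols.max()
```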
By constructing the multi-modal information feature tree and extracting feature points in real time, the method quickly finds candidate nodes in the feasible region to optimize random-sampling-based path planning, resolves the trap-space problem, improves the intelligence of the mobile robot, and can efficiently and quickly find the optimal motion trajectory of the mobile robot in a complex, dynamic human-robot fusion environment.
The foregoing description is only an overview of the technical solutions of the present invention, and in order to make the technical solutions of the present invention more clearly understood and to implement them in accordance with the contents of the description, the following detailed description is given with reference to the preferred embodiments of the present invention and the accompanying drawings.
Drawings
FIG. 1 is a flow chart of the present invention.
Fig. 2 is a schematic diagram of feature point extraction through angular velocity channel and distance fusion in the embodiment of the present invention.
FIG. 3 is a schematic diagram of a second distance fusion in an embodiment of the present invention.
Fig. 4 is a schematic diagram of feature points finally extracted in the embodiment of the present invention.
FIG. 5 is a diagram of a multi-modal information feature tree and heuristic paths in an embodiment of the invention.
Detailed Description
The present invention is further described below in conjunction with the following figures and specific examples so that those skilled in the art may better understand the present invention and practice it, but the examples are not intended to limit the present invention.
In the description of the present invention, it is to be understood that the terms "first", "second" and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying any number of technical features indicated. Thus, a feature defined as "second" or "first" may explicitly or implicitly include one or more of that feature. The terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements does not include a limitation to the listed steps or elements but may alternatively include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Referring to fig. 1 to 5, an embodiment of a robot real-time motion planning method based on a multi-modal information feature tree according to the present invention includes the following steps:
step 1: acquire the environment map, extract the node pose information of the robot in real time, and judge whether the characterization of the environment map by the feature points is complete (characterization is complete only when every obstacle-free point of the map has found a corresponding feature point; the feature map is recomputed every time a qualifying feature point is obtained, and as long as some map point still has no corresponding feature point the characterization remains incomplete); if it is not complete, execute step 2, and if it is complete, execute step 4. Whether the characterization of the environment map by the feature points is complete is judged as follows: before the terminal is opened for path planning, check whether the environment map has a corresponding feature file; if it does, judge whether the characterization is complete from the feature file; if it does not, the characterization is judged incomplete. When the robot finishes path planning on the environment map and the terminal is closed, the corresponding feature file of the environment map is generated.
In this embodiment three feature files are used: fp.txt, feature_map.txt and feature_matrix.txt. fp.txt is the feature point file and stores the x and y coordinates of the feature points; it has n rows and two columns, where n is the number of feature points, the first column being the x coordinate and the second column the y coordinate of each feature point. feature_map.txt is the feature map file and stores, for every grid point of the environment map, the number of the nearest feature point it can be connected to. feature_matrix.txt is the feature matrix file and stores the connectivity of the n feature points and the distance between each pair of adjacent nodes. Each time the terminal is opened, whether the three files fp.txt, feature_map.txt and feature_matrix.txt exist is checked. If no corresponding fp.txt feature point file exists, the robot is performing motion planning on this environment map for the first time and the characterization is incomplete; if the feature point file fp.txt exists but the corresponding feature_map.txt and feature_matrix.txt do not, the robot is not planning on this map for the first time but the characterization is still incomplete; if fp.txt, feature_map.txt and feature_matrix.txt all exist, the characterization is complete. The feature node set G_FP is written into the fp.txt file only when the terminal is closed, which reduces the amount of computation at run time and improves speed. Only when the characterization is complete are the feature map file and the feature matrix file corresponding to the environment map generated on closing the terminal, i.e. the feature_map.txt and feature_matrix.txt files are then updated and written each time the terminal is closed.
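A minimal sketch of writing the three feature files on shutdown, under the file layout of this embodiment, could look as follows; the function name and the use of numpy text files are assumptions for illustration.

```python
import numpy as np

def save_feature_files(map_dir, points, feature_map, feat_matrix, complete):
    """Write fp.txt on every shutdown; write feature_map.txt and feature_matrix.txt
    only once the characterization of the map is complete."""
    np.savetxt(f"{map_dir}/fp.txt", np.asarray(points), fmt="%.3f")           # n rows of x, y
    if complete:
        np.savetxt(f"{map_dir}/feature_map.txt", feature_map, fmt="%d")       # nearest node number per cell
        np.savetxt(f"{map_dir}/feature_matrix.txt", feat_matrix, fmt="%.3f")  # pairwise connectable distances
```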
Step 1-1: given the navigation objective, the robot starts path planning based on the Risk-RRT method (see the documents "C.Fulgezi, A.Spalanzani, C.Laugier, and C.Tay," Risk based movement planning and navigation planning in the environmental dynamics environment, "Res.Rep., pp.1-14,2010."), and the method for extracting the node pose information of the robot in real time is to acquire the node pose information of the robot in the path planning process in real time through subscription/analog _ position and/robot _ 0/object topic. The/amcl _ position topic records real-time x and y coordinate information of the robot; the/robot _0/odom topic records the corresponding topic at the positionAngular velocity information of the robot, denoted Gm=(xm,ym,ωm) Where x and y represent the positions of the nodes, and ω represents the angular velocities of the nodes.
Step 1-2: under an ROS (Robot Operating System, ROS) which is a software architecture with high flexibility for compiling a Robot software program, judging whether a feature file exists or not when a terminal is opened every time, and calculating points of an unobstructed area on a map through the feature file to find the ratio representation of the number of the feature point numbers corresponding to the points to all the points of the unobstructed area for representing the representation degree; for simplicity in calculating the representation, all cases where the ratio is less than 1 are set to represent 0; when the ratio is equal to 1, the representation degree is complete, at this time, the representation activity is set to 1, and only when the representation degree is complete, the feature map and the feature matrix are generated and written into the feature _ map.
If the three files fp.txt, feature _ map.txt and feature _ matrix.txt do not exist, which means that motion planning is performed on the map for the first time, the representation is made to be 0, and a feature map M is initializedFInitializing a feature point set G for a two-dimensional zero matrix (pixels of the environment map are 504 × 434, and pixels of the initialized feature map are two-dimensional matrices of 504 × 434) with the same size as the environment mapFPStep 2 is executed for the empty set, and step 2-1 is executed in this embodiment;
if the feature file exists but the representation degree is not complete, namely the fpnew=(xn,yn) Storing feature point set GFPIn the method, characteristic points are visualized in an Rviz (the Rviz is a graphical tool carried by the ROS, and graphical operation can be conveniently carried out on a program of the ROS, the method is mainly used for carrying out simulation experiments on an Rviz platform), and a step 2 is executed, wherein the step 2-2 is executed in the embodiment;
if there is anyAnd the characteristic files are complete in degree of representation, namely fpnew=(xn,yn) Storing feature point set GFPAnd step 4 is executed.
Step 2: perform the first feature extraction on the node poses of the robot using the angular velocity channel and distance fusion to obtain an initial feature point set G_FP, and update the feature map at the same time.
Step 2-1: use the angular velocity channel to extract nodes whose angular velocity is greater than a preset threshold, denoted G_new = (x_n, y_n); when extracting these nodes, they are sampled at a fixed sequence-number interval to reduce node redundancy. In this embodiment the interval is set to 15: the message frequency of the /amcl_pose and /robot_0/odom topics is high and consecutive points differ very little, so extracting node data every 15 messages yields nodes with meaningful differences and reduces node redundancy.
Step 2-2: if the representation degree is not complete, executing the step 2-3; and if the representation degree is complete, executing the step 2-5, namely replacing the original feature points according to the optimal distance.
Step 2-3: judge whether the feature point set G_FP is empty; if it is, store node G_new directly into the feature point set G_FP and return to step 2-1; if not, perform step 2-4.
Step 2-4: using the Euclidean distance channel, compute in turn the Euclidean distance, denoted d_once, between node G_new and every node G_i = (x_i, y_i) (i = 1, 2, 3, ..., n, where n is the number of points in the feature point set G_FP) in G_FP (to distinguish it from the distance fusion performed again in step 3, the distance fusion in step 2 is called the first distance fusion and that in step 3 the second distance fusion; both use the Euclidean distance channel). Judge whether d_once is less than or equal to the preset first distance fusion threshold (50 in this embodiment, which is 2.7 m divided by the map resolution of 0.054); if so, return to step 2-1. If the distance between node G_new and every node in G_FP is greater than the preset first distance fusion threshold, add node G_new to the feature point set G_FP and carry G_FP into step 3.
Step 2-5: this step adjusts the positions of the feature points according to the optimal distance and is executed only when the characterization of the environment map by the feature point set G_FP is complete, i.e. representation = 1. Compute in turn the Euclidean distance d_once between node G_new and every node G_i = (x_i, y_i) (i = 1, 2, 3, ..., n) in G_FP, and judge whether d_once is less than or equal to the preset first distance fusion threshold; if so, execute step 2-6, otherwise execute step 4.
Adjusting the position of a feature point according to the optimal distance means selecting the optimal feature point by comparing distance sums: the newly obtained node is not simply added; instead the position of the original feature point is adjusted. Specifically, the sum of the distances between the original feature point and all map points corresponding to it, and the sum of the distances between the new node and those same map points, are both computed, and whichever point gives the smaller sum replaces the original node as the optimal feature point. The precondition for computing the distance sum is that the new node can be connected without collision to every map point corresponding to the original feature point, so once the characterization is complete, adjusting feature point positions can never make it incomplete again; instead, the optimal feature point representing a given area is found iteratively (a sketch of this replacement test follows below).
Step 2-6: perform collision detection in turn between node G_new and every grid point whose value in the feature map M_F is i (if there is an obstacle on the line connecting the two points the collision detection fails, otherwise it passes). If all collision detections pass and the sum of the distances between the grid points with value i in M_F and node G_new is less than the sum of the distances between those grid points and the original feature node, node G_new replaces the original feature node G_i = (x_i, y_i), the feature map is updated, and G_FP is carried into step 4; otherwise step 5 is executed. If a better feature point is found it replaces the original one and the feature map is updated; if no better feature point is found, a heuristic path is generated from the original feature points and step 5 is executed.
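The optimal-distance replacement test of steps 2-5 and 2-6 is sketched below; it assumes a line_is_free collision check like the one sketched earlier, and the names are illustrative.

```python
import numpy as np

def try_replace_feature(feature_map, env_map, feat_idx, old_pt, new_pt, line_is_free):
    """Replace feature node feat_idx with new_pt only if new_pt can be connected to every
    grid cell labelled feat_idx and the total distance to those cells decreases."""
    cells = list(zip(*np.nonzero(feature_map == feat_idx)))
    new_sum, old_sum = 0.0, 0.0
    for cell in cells:
        if not line_is_free(env_map, cell, new_pt):   # the new point must see every cell
            return old_pt
        new_sum += np.hypot(cell[0] - new_pt[0], cell[1] - new_pt[1])
        old_sum += np.hypot(cell[0] - old_pt[0], cell[1] - old_pt[1])
    return new_pt if new_sum < old_sum else old_pt
```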
As shown in fig. 2, the trajectory of the robot is indicated by the dotted line, the flag marks indicate the feature points extracted by the angular velocity channel and the first distance fusion, and the black rectangles indicate obstacle regions.
Step 3: use distance fusion again to further refine the initial feature point set G_FP until the characterization by the feature points is complete, obtaining the optimal feature points and the final feature point set G_FP'. The first distance fusion adds new feature points, while the second distance fusion deletes redundant points from among the newly added feature points.
Step 3-1: initialize the count c = 0 (c counts the feature points whose number of neighboring feature points is less than or equal to 1) and initialize the feature point set G_R to be saved as an empty set (G_R stores the feature points that must be kept); extract the set of neighboring feature nodes of each node G_i = (x_i, y_i) (i = 1, 2, 3, ..., n, where n is the number of points in the feature point set G_FP) in G_FP.
Step 3-2: initialize the loop index i = 1.
Step 3-3: judge whether i is less than or equal to the number n of the feature point set G_FP; if so, execute step 3-4, otherwise execute step 3-6.
Step 3-4: judge whether the number of neighboring feature nodes of the i-th node is less than or equal to 3 (if a feature point has fewer than 1 neighboring feature point it is an isolated point and cannot take part in generating the feature tree; to guarantee connectivity between points, feature points with at most 3 neighbors, together with their neighbors, are preserved empirically and do not take part in the second distance fusion, while the remaining points do). If so, perform step 3-5; otherwise let i = i + 1 and return to step 3-3.
Step 3-5: add node G_i = (x_i, y_i) itself and the nodes in its neighboring feature node set to the point set G_R; judge whether the number of its neighboring feature nodes is less than or equal to 1, and if so let the count c = c + 1. If i is not equal to the number n of the feature point set G_FP, let i = i + 1 and return to step 3-3; otherwise execute step 3-6.
Step 3-6: deduplicate the point set G_R (some feature points are added to G_R repeatedly in step 3-5, so duplicates must be removed; in this embodiment the duplicates are detected with a C++ iterator). If the number of points in G_R is not less than n − 1, where n is the number of the feature point set G_FP (meaning at most one feature point could take part in the second distance fusion, which is equivalent to not performing it), update the feature map and carry the feature point set G_FP into step 4; otherwise perform step 3-7.
Step 3-7: add all nodes of the feature point set G_FP other than those in the point set G_R to the point set G_D (i.e. the feature points that take part in the second distance fusion) and perform the second distance fusion to delete feature points. The second distance fusion takes the x-th node G_Dx of G_D (x = 1, 2, ..., n1 − 1, where n1 is the number of points in G_D), connects it in turn with the nodes G_Dx1 (x1 = x + 1, x + 2, ..., n1) of G_D, performs collision detection and Euclidean distance calculation along the connecting line on the map, and, if the collision detection passes and the distance is less than the manually set second fusion threshold (80 in this embodiment, slightly larger than the first-fusion threshold of 50; 80 is likewise the actual distance in metres divided by 0.054), deletes node G_Dx1 from the feature point set G_FP, until all points in G_D have been examined. After the feature points have been deleted by the second distance fusion the feature map is updated; the feature point set G_FP at this moment is the final feature point set G_FP', which is carried into step 4.
As shown in FIG. 3, r_1 denotes the threshold of the first distance fusion (50), r_2 denotes the threshold of the second distance fusion (80), crosses denote the redundant feature points deleted by the second distance fusion, solid circles denote the finally extracted feature points, and rectangles denote obstacle regions.
Step 4: update the environment map using the Euclidean distance to obtain the feature map, finishing the real-time feature extraction. When the terminal is opened, if the characterization is complete, the feature points stored in fp.txt are used as the updated feature point set G_FP'.
Step 4-1: initialize the flag bit grid_not_find = 0, which indicates whether every point of the obstacle-free area on the map has found its corresponding feature-point number, and convert all node coordinates in G_FP' into grid coordinates, denoted G_i1 = (x_i1, y_i1) (i1 = 1, 2, ..., n, where n is the number of points in G_FP'). The conversion formulas are:
grid coordinate x_i1 = (x coordinate − origin x coordinate of the map) / map resolution,
grid coordinate y_i1 = (y coordinate − origin y coordinate of the map) / map resolution.
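The coordinate conversion of step 4-1 amounts to the following small helper, using the 0.054 m resolution of this embodiment as the default; the function name is illustrative.

```python
def world_to_grid(x, y, origin_x, origin_y, resolution=0.054):
    """Convert a world coordinate (x, y) into grid coordinates (x_i1, y_i1)."""
    return int((x - origin_x) / resolution), int((y - origin_y) / resolution)
```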
Step 4-2: traverse all grid points (i_k, j_k) of the obstacle-free area of the two-dimensional grid map M_P (p = 0, 1, ..., p1, where p1 is the size of the two-dimensional grid map). Initialize count = 0 (count records how many feature nodes fail the collision check for grid point (i_k, j_k): whenever the corresponding feature point number cannot be found, count = count + 1) and initialize min_dist = map length × map width. Connect grid point (i_k, j_k) in turn with the feature nodes G_i1 of G_FP' and perform collision detection along the connecting line on the map; if the collision detection passes, compute the Euclidean distance dist_1 between the two points, and if dist_1 is less than min_dist, update the value of grid point (i_k, j_k) in the feature map M_F to the number i of that feature node and update min_dist to the value of dist_1; if the collision detection fails, let count = count + 1; continue until all nodes in G_FP' have been traversed. If count equals the number n of G_FP', let grid_not_find = 1 and exit the loop, returning to step 2 (as soon as one map point fails the collision detection against every point in G_FP' the loop is exited, which saves computation: the characterization is not complete enough and a new feature point must be added in step 2). If count ≠ n, repeat step 4-2 until all grid points of the obstacle-free area of the two-dimensional grid map M_P have been traversed.
Step 4-3: if grid_not_find equals 0, the characterization is complete (every grid point on the map has found the number of its corresponding feature point) and representation = 1; otherwise representation = 0.
As shown in fig. 4, the dotted lines represent nodes extracted during the robot path planning, and the solid dots and crosses represent feature points extracted by the first distance fusion, where the crosses represent the redundant feature points deleted by the second distance fusion and the solid dots represent the feature nodes finally retained after the second distance fusion; through the angular velocity channel and the two distance fusions, the number of feature nodes is significantly reduced.
Step 5: obtain the feature matrix P_F from the feature point set G_FP', generate the multi-modal information feature tree of the feature points from the starting point, the end point and the feature matrix P_F, and obtain a heuristic path on the feature map.
Step 5-1: select a node G_FPi (i = 1, 2, ..., n, where n is the number of points in the feature point set G_FP') of G_FP', connect each remaining node G_FPx (x = 1, 2, ..., n and x ≠ i) with node G_FPi, and perform collision detection along the connecting line on the map; if the collision detection passes, store the Euclidean distance between the two points into the feature matrix entry P_F(i, x), otherwise store 0 into P_F(i, x).
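Step 5-1 can be sketched as a double loop over the feature points; line_is_free stands for a collision check such as the one sketched earlier, and the names are illustrative.

```python
import numpy as np

def build_feature_matrix(points, env_map, line_is_free):
    """P_F[i, x]: Euclidean distance between feature points i and x if they are
    connectable without collision, 0 otherwise."""
    n = len(points)
    p_f = np.zeros((n, n))
    for i in range(n):
        for x in range(n):
            if x != i and line_is_free(env_map, points[i], points[x]):
                p_f[i, x] = np.hypot(points[i][0] - points[x][0],
                                     points[i][1] - points[x][1])
    return p_f
```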
Step 5-2: find in the feature map M_F the feature node number corresponding to the position of the starting point, add it to the multi-modal information feature tree T to complete the initialization of T, and initialize the open node set T_open = T.
Step 5-3: judge whether the number of open nodes is greater than zero; if so, perform step 5-4; if not, go to step 5-10.
Step 5-4: take the first open node t_open and remove t_open from the open node set T_open.
Step 5-5: extract the area of interest AOI of t_open from the feature map M_F: search M_F for all points whose value equals t_open, record the extreme coordinate values x_min, x_max, y_min, y_max among those points, and select the rectangular area bounded by these coordinate extremes as the area of interest AOI.
Step 5-6: extract all feature nodes X_new adjacent to t_open within the area of interest AOI.
Step 5-7: judge whether the set of adjacent feature nodes X_new is empty; if so, execute step 5-3; if not, perform step 5-8.
Step 5-8: take the first node x_new of X_new, find the nearest neighbor x_parent of x_new on the multi-modal information feature tree T, and judge whether x_parent exists; if it does not exist, remove x_new from X_new and return to step 5-7; if it exists, execute step 5-9.
Step 5-9: add x_new to the multi-modal information feature tree T and the open node set T_open, remove x_new from X_new, and return to step 5-7.
Step 5-10: find in the feature map M_F the feature node corresponding to the position of the end point, and find the path from the starting point to the end point by searching the multi-modal information feature tree T backwards. The original environment map has only two values, 0 and 255: 0 marks an obstacle-free area and 255 an obstacle area. The feature map replaces each grid value of 0 with the number of the corresponding feature point, while 255 remains unchanged. In this way, when an end point is given, the feature point number stored at that position in the feature map can be found immediately, and the same holds for the starting point. Only on the order of ten feature points therefore need to be traversed instead of the hundreds of thousands of grid values of the traditional method, which greatly reduces the traversal complexity and guarantees the real-time performance of path planning.
Step 5-11: return to step 2-1 to keep updating the feature point set. When the terminal is closed, the stored feature file is updated with the adjusted feature points, and the next time this map is opened step 5 can be performed directly to generate a heuristic path from the updated feature file.
As shown in fig. 5, given an end point and taking the real-time position of the robot as the starting point, candidate nodes are continuously added to the multi-modal information feature tree through the collision detection module until the tree has been created, yielding the search tree shown by the dotted lines and, from it, the heuristic path shown shaded.
The invention has the beneficial effects that:
(1) The real-time feature point extraction method realizes a representation of the environment structure; it not only overcomes the drawback of the traditional GVD, in which feature points must be extracted offline in advance, but also greatly saves time. The feature point information is stored in files and is checked and updated before and after path planning, so the memory footprint is small, nodes are reused and computing resources are saved; once extraction and creation are complete, the feature point information can be reused in path planning between different start and end points, which is equivalent to endowing the robot with a brain-like memory.
(2) The map is more simply expressed by extracting the feature points, the expressed feature map retains the structural information of the environment, redundant information is removed, and in the search process of path planning, the feature points are only traversed instead of all map grid points, so that the traversal complexity is greatly reduced, and the real-time performance of the path planning is ensured.
(3) On the basis of a Risk-RRT motion planning method, a multi-modal information feature tree is generated, the multi-modal information feature tree is composed of feature points extracted from nodes at all moments based on Risk-RRT, and the feature points do not contain timestamp information. The method is not only suitable for dynamic complex environments, but also simple in nodes and strong in representativeness, and can represent structural information of the environments. A heuristic path can be rapidly generated by giving a starting point and an end point, the robot can sequentially carry out motion planning according to sub-targets on the heuristic path to finally reach a target point, and the problem of trap space in the motion planning process based on random sampling is solved.
The above-mentioned embodiments are merely preferred embodiments for fully illustrating the present invention, and the scope of the present invention is not limited thereto. The equivalent substitution or change made by the technical personnel in the technical field on the basis of the invention is all within the protection scope of the invention. The protection scope of the invention is subject to the claims.
Claims (7)
1. A robot real-time motion planning method based on a multi-mode information feature tree is characterized by comprising the following steps:
step 1: acquiring an environment map, extracting node pose information of the robot in real time, judging whether the representation degree of the feature point to the environment map is complete, if not, executing the step 2, and if so, executing the step 4;
step 2: performing primary feature extraction on the node pose of the robot by using an angular velocity channel and distance fusion to obtain an initial feature point set, and updating a feature map;
step 2-1: using the angular velocity channel to extract nodes whose angular velocity is greater than a preset threshold, denoted G_new = (x_n, y_n);
step 2-2: if the degree of characterization is not complete, executing step 2-3; if the degree of characterization is complete, executing step 2-5;
step 2-3: judging whether the feature point set G_FP is empty; if it is empty, directly storing the node G_new into the feature point set G_FP and returning to execute step 2-1; if not, executing step 2-4;
step 2-4: using the Euclidean distance channel, sequentially calculating the Euclidean distance between the node G_new and every node G_i = (x_i, y_i) in the feature point set G_FP, and judging whether it is less than or equal to a preset first distance fusion threshold; if so, returning to execute step 2-1;
if the distances between the node G_new and all the nodes in the feature point set G_FP are greater than the preset first distance fusion threshold, adding the node G_new to the feature point set G_FP and carrying G_FP into step 3;
step 2-5: sequentially calculating the Euclidean distance between the node G_new and every node G_i = (x_i, y_i) in the feature point set G_FP, and judging whether it is less than or equal to the preset first distance fusion threshold; if so, executing step 2-6, and if not, executing step 4;
step 2-6: sequentially performing collision detection between the grid points in the feature map M_F and the node G_new; if all collision detections pass and the sum of the distances between the grid points in the feature map M_F and the node G_new is less than the sum of the distances between those grid points and the original feature node, replacing the original feature node G_i = (x_i, y_i) with the node G_new, updating the feature map, and carrying G_FP into step 4; otherwise, executing step 5;
step 3: further extracting the initial feature point set by using distance fusion again until the degree of characterization of the feature points is complete to obtain the optimal feature points, and obtaining a final feature point set, which specifically comprises:
step 3-1: initializing the count c = 0, and initializing the feature point set G_R to be saved as an empty set; extracting the set of neighboring feature nodes of each node G_i = (x_i, y_i) in the feature point set G_FP;
step 3-2: initializing i = 1;
step 3-3: judging whether i is less than or equal to the number n of the feature point set G_FP; if so, executing step 3-4, if not, executing step 3-6;
step 3-4: judging whether the number of neighboring feature nodes of the i-th node is less than or equal to 3; if so, performing step 3-5, otherwise letting i = i + 1 and returning to execute step 3-3;
step 3-5: adding the node G_i = (x_i, y_i) itself and the nodes in its neighboring feature node set to the point set G_R; judging whether the number of its neighboring feature nodes is less than or equal to 1, and if so letting the count c = c + 1; if i is not equal to the number n of the feature point set G_FP, letting i = i + 1 and returning to execute step 3-3, otherwise executing step 3-6;
step 3-6: performing deduplication on the point set G_R; if the number of points in the point set G_R is not less than n − 1, where n is the number of the feature point set G_FP, updating the feature map and carrying the feature point set G_FP into step 4; otherwise executing step 3-7;
step 3-7: adding all nodes of the feature point set G_FP other than those in the point set G_R to the point set G_D, performing the second distance fusion to delete feature points and updating the feature map; the feature point set G_FP at this moment is the final feature point set G_FP', and G_FP' is carried into step 4;
step 4: updating the environment map by using the Euclidean distance to obtain a feature map, and finishing the real-time feature extraction;
step 5: obtaining a feature matrix according to the feature point set, generating a multi-modal information feature tree of the feature points according to the starting point, the end point and the feature matrix, and obtaining a heuristic path on the feature map, which comprises:
step 5-1: selecting a node G_FPi of G_FP', connecting each remaining node G_FPx with the node G_FPi, and performing collision detection on the map along the connecting line of the two points; if the collision detection passes, storing the Euclidean distance between the two points into the feature matrix entry P_F(i, x), otherwise storing 0 into P_F(i, x);
step 5-2: finding in the feature map M_F the feature node number corresponding to the position of the starting point, adding it to the multi-modal information feature tree T to complete the initialization of T, and initializing the open node set T_open = T;
step 5-3: judging whether the number of open nodes is greater than zero; if so, performing step 5-4; if not, performing step 5-10;
step 5-4: taking the first open node t_open and removing t_open from the open node set T_open;
step 5-5: extracting the area of interest AOI of t_open from the feature map M_F;
step 5-6: extracting all feature nodes X_new adjacent to t_open within the area of interest AOI;
step 5-7: judging whether the set of adjacent feature nodes X_new is empty; if so, executing step 5-3; if not, executing step 5-8;
step 5-8: taking the first node x_new of X_new, finding the nearest neighbor x_parent of x_new on the multi-modal information feature tree T, and judging whether x_parent exists; if it does not exist, removing x_new from X_new and returning to execute step 5-7; if it exists, performing step 5-9;
step 5-9: adding x_new to the multi-modal information feature tree T and the open node set T_open, removing x_new from X_new, and returning to execute step 5-7;
step 5-10: finding in the feature map M_F the feature node corresponding to the position of the end point, and finding a path from the starting point to the end point by searching the multi-modal information feature tree T backwards;
step 5-11: returning to execute step 2-1.
2. The real-time robot motion planning method based on the multi-modal information feature tree of claim 1, wherein: the method for judging whether the degree of characterization of the environment map by the feature points is complete in the step 1 is to check whether the environment map has a corresponding feature file before opening the terminal for path planning each time, judge whether the degree of characterization is complete through the feature points in the feature files if the environment map has the corresponding feature files, and judge that the degree of characterization is incomplete if the environment map has no corresponding feature files; and when the robot finishes the path planning on the environment map and closes the terminal, generating a corresponding feature file of the environment map.
3. The real-time robot motion planning method based on the multi-modal information feature tree according to claim 2, wherein the specific process of the step 1 is as follows:
step 1-1: giving a navigation target, starting path planning by the robot based on the Risk-RRT algorithm, and acquiring the node pose information G_m = (x_m, y_m, ω_m) of the robot during path planning in real time, wherein x and y represent the position of a node and ω represents the angular velocity of the node;
step 1-2: judging whether the feature file exists each time the terminal is opened, and computing, over the points of the obstacle-free area on the map, the ratio of the number of points having a corresponding feature point number to all points of the obstacle-free area through the feature file, this ratio being used to represent the degree of characterization;
if no feature file exists, letting representation = 0, initializing the feature map M_F as a two-dimensional zero matrix with the same size as the environment map, initializing the feature point set G_FP as an empty set, and executing step 2;
if the feature file exists but the degree of characterization is not complete, storing the nodes G_new = (x_n, y_n) saved in the feature file into the feature point set G_FP, and executing step 2;
if the feature file exists and the degree of characterization is complete, storing the nodes G_new = (x_n, y_n) saved in the feature file into the feature point set G_FP, and executing step 4.
4. The real-time robot motion planning method based on the multi-modal information feature tree of claim 1, wherein: when the angular velocity channel is used in step 2-1 to extract the nodes whose angular velocity is greater than the preset threshold, the nodes are sampled at a fixed sequence-number interval to reduce node redundancy.
5. The real-time robot motion planning method based on the multi-modal information feature tree of claim 1, wherein: the specific method for distance fusion comprises the following steps:
connecting the node G_x in the point set with each remaining node G_x1 in the point set, performing collision detection and Euclidean distance calculation on the map, and, if the collision detection passes and the distance is less than the manually set distance fusion threshold, deleting the node G_x1 from the point set, until all points in the set have been detected.
6. The real-time robot motion planning method based on the multi-modal information feature tree of claim 1, wherein: the specific process of the step 4 is as follows:
step 4-1: initialization targetThe flag bit is 0 and is used for indicating whether all points of the area without obstacles on the map find the corresponding characteristic point number, and G is setFP' all node coordinates in this document are converted to grid coordinates, denoted Gi1=(xi1,yi1);
Step 4-2: traversing two-dimensional grid map MPAll grid points (i) of the middle obstacle-free areak,jk) Initializing count to be 0, and initializing min _ dist to be the map length multiplied by the map width; grid point (i)k,jk) In turn with GFP' feature node G ini1The connecting line is used for collision detection on the map, and if the collision detection is passed, the Euclidean distance dist between the two points is calculated1If dist1Less than min _ dist, then in the feature map MFIn updating grid point (i)k,jk) Is the number i of the feature node and min _ dist is updated to dist1A value of (d); if the collision detection fails, count +1 is counted; up to GFP' all nodes in the set go through end if count is GFPIf the number n of the' is equal to 1, the loop is exited, and the step 2 is returned; if count ≠ GFP' n, repeating the step 4-2 until the two-dimensional grid map MpAll grid points of the middle barrier-free area are traversed and ended to obtain a feature map MF;
step 4-3: if the flag bit is equal to 0, the degree of representation is complete and representation is set to 1; otherwise representation is set to 0.
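The update loop of steps 4-2 and 4-3 is sketched below in Python; it reuses the collision_free helper from the claim 5 sketch, and the 0 = free / 1-based-index conventions are assumptions rather than the patent's concrete data layout.

```python
import math
import numpy as np

def build_feature_map(grid_map, feature_nodes):
    """Assign each free grid cell the number of its nearest visible feature node.

    grid_map: 2-D occupancy grid (0 = free); feature_nodes: list of (x, y) grid
    coordinates Gi1. Returns (feature_map, flag); flag stays 0 only if every
    free cell found at least one visible feature node.
    """
    h, w = grid_map.shape
    feature_map = np.zeros((h, w), dtype=int)
    for jk in range(h):
        for ik in range(w):
            if grid_map[jk, ik] != 0:
                continue                          # skip obstacle cells
            count, min_dist = 0, float(h * w)     # min_dist starts at length x width
            for i, node in enumerate(feature_nodes, start=1):
                if collision_free((ik, jk), node, grid_map):
                    dist1 = math.hypot(ik - node[0], jk - node[1])
                    if dist1 < min_dist:
                        feature_map[jk, ik] = i   # store the feature node number
                        min_dist = dist1
                else:
                    count += 1
            if count == len(feature_nodes):
                # No feature node can see this cell: mark incomplete and stop,
                # so the caller returns to step 2 to extract more feature points.
                return feature_map, 1
    return feature_map, 0
```

A returned flag of 0 corresponds to step 4-3 setting representation to 1; a flag of 1 sends the method back to step 2.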
7. The real-time robot motion planning method based on the multi-modal information feature tree of claim 1, wherein: the specific method of extracting, in step 5-5, the area of interest AOI of topen from the feature map MF is as follows:
in the feature map MF, searching for all points whose value equals topen, recording the limit values xmin, xmax, ymin and ymax of the coordinates among these points, and selecting the rectangular area bounded by these coordinate limit values as the area of interest AOI.
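A minimal sketch of this AOI extraction, assuming the feature map is stored as a NumPy array of feature point numbers (an assumption for illustration):

```python
import numpy as np

def extract_aoi(feature_map, t_open):
    """Return (xmin, xmax, ymin, ymax) bounding all cells whose value equals t_open."""
    ys, xs = np.nonzero(feature_map == t_open)
    if xs.size == 0:
        return None  # t_open does not occur in the feature map
    return int(xs.min()), int(xs.max()), int(ys.min()), int(ys.max())
```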
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011262159.2A CN112428271B (en) | 2020-11-12 | 2020-11-12 | Robot real-time motion planning method based on multi-mode information feature tree |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112428271A CN112428271A (en) | 2021-03-02 |
CN112428271B (en) | 2022-07-12 |
Family
ID=74700520
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011262159.2A Active CN112428271B (en) | 2020-11-12 | 2020-11-12 | Robot real-time motion planning method based on multi-mode information feature tree |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112428271B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113485373B (en) * | 2021-08-12 | 2022-12-06 | Soochow University | Robot real-time motion planning method based on a Gaussian mixture model
CN117870653B (en) * | 2024-03-13 | 2024-05-14 | University of Science and Technology of China | Method for establishing and updating a two-dimensional differential Euclidean signed distance field map
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108582073A (en) * | 2018-05-02 | 2018-09-28 | Beijing University of Posts and Telecommunications | Rapid obstacle avoidance method for a mechanical arm based on an improved random roadmap method
CN108983780A (en) * | 2018-07-24 | 2018-12-11 | Wuhan University of Technology | Mobile robot path planning method based on an improved RRT* algorithm
CN110119144A (en) * | 2019-04-19 | 2019-08-13 | Soochow University | Multi-robot SLAM algorithm based on sub-map feature matching
CN110962130A (en) * | 2019-12-24 | 2020-04-07 | Naval University of Engineering of the Chinese People's Liberation Army | Heuristic RRT mechanical arm motion planning method based on target deviation optimization
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102165437B1 (en) * | 2014-05-02 | 2020-10-14 | Hanwha Defense Co., Ltd. | Path planning apparatus of mobile robot |
Also Published As
Publication number | Publication date |
---|---|
CN112428271A (en) | 2021-03-02 |
Similar Documents
Publication | Title |
---|---|
Francis et al. | Long-range indoor navigation with PRM-RL |
Jeong et al. | Image preprocessing for efficient training of YOLO deep learning networks |
CN109034077B (en) | Three-dimensional point cloud marking method and device based on multi-scale feature learning |
Huang et al. | Deep learning driven visual path prediction from a single image |
CN112428271B (en) | Robot real-time motion planning method based on multi-mode information feature tree |
Brahmbhatt et al. | Deepnav: Learning to navigate large cities |
CN113485373B (en) | Robot real-time motion planning method based on Gaussian mixture model |
CN111397598A (en) | Mobile robot path planning and sampling method and system in man-machine co-fusion environment |
CN111916144B (en) | Protein classification method based on self-attention neural network and coarsening algorithm |
CN109163722A (en) | Humanoid robot path planning method and device |
KR20210108044A (en) | Video analysis system for digital twin technology |
Bastani et al. | Machine-assisted map editing |
US20120076426A1 (en) | Method and system for matching panoramic images using a graph structure, and computer-readable recording medium |
CN114859375A (en) | Autonomous exploration mapping system and method based on multi-robot cooperation |
Zuo et al. | Graph-based visual manipulation relationship reasoning network for robotic grasping |
CN114442629A (en) | Mobile robot path planning method based on image processing |
Xu et al. | An efficient algorithm for environmental coverage with multiple robots |
Soni et al. | Multi-robot unknown area exploration using frontier trees |
Wang et al. | DelvMap: Completing Residential Roads in Maps Based on Couriers’ Trajectories and Satellite Imagery |
CN117115911A (en) | Hypergraph learning action recognition system based on attention mechanism |
CN113723180B (en) | Method and system for constructing service robot active object detection model data set |
CN109977455A (en) | Ant colony optimization path construction method suitable for three-dimensional space with terrain obstacles |
CN107301659B (en) | High-resolution image and GIS data registration method and device |
Landi et al. | Spot the difference: A novel task for embodied agents in changing environments |
CN113837049B (en) | Intelligent pavement cleaning method based on convolutional neural network and genetic algorithm |
Legal Events
Code | Title |
---|---|
PB01 | Publication |
SE01 | Entry into force of request for substantive examination |
GR01 | Patent grant |