CN114545923A - Robot mapping method, electronic device and computer storage medium - Google Patents
- Publication number
- CN114545923A (application number CN202111630176.1A)
- Authority
- CN
- China
- Prior art keywords
- node
- robot
- explored
- grid
- skeleton
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0223—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
- G05D1/0214—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
- G05D1/0221—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
- G05D1/0276—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
Landscapes
- Engineering & Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Manipulator (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The application discloses a robot mapping method, an electronic device and a computer storage medium. The mapping method comprises the following steps: determining nodes in a skeleton based on the skeleton of a grid map, wherein the grid map is constructed based on the environment area where the robot is located and comprises a boundary to be explored; determining a root node based on the position of the robot, and determining a node tree from the root node and the nodes in the skeleton; and controlling the robot to explore and build the map according to the positional relationship between the leaf nodes in the node tree and the boundary to be explored. In this way, the robot can build a map autonomously, which raises its level of intelligence and improves safety.
Description
Technical Field
The present application relates to the field of intelligent mapping technologies, and in particular, to a robot mapping method, an electronic device, and a computer storage medium.
Background
When a robot operates in different unknown environments, it relies on positioning, map building and planning technologies, and it must first complete the construction of an environment map before subsequent operations such as navigation can be carried out.
Robot mapping technology means that the robot senses the external environment through its sensors and completes self-positioning. The traditional mapping approach controls the robot's movement in a manual-control or following mode to complete the map; this requires human intervention, so the level of intelligence still needs to be improved.
Disclosure of Invention
The application provides a robot mapping method, an electronic device and a computer storage medium, so as to realize autonomous robot mapping, raise the level of intelligence and improve safety.
In order to solve the above technical problem, the application provides a robot mapping method. The robot mapping method comprises the following steps: determining nodes in a skeleton based on the skeleton of a grid map, wherein the grid map is constructed based on the environment area where the robot is located and comprises a boundary to be explored; determining a root node based on the position of the robot, and determining a node tree from the root node and the nodes in the skeleton; and controlling the robot to explore and build the map according to the positional relationship between the leaf nodes in the node tree and the boundary to be explored.
Wherein, determining the nodes in the skeleton includes: determining the grids corresponding to corners, branches and/or ends in the skeleton as nodes.
Wherein, determining the nodes in the skeleton includes: traversing the grids on the skeleton, and judging whether each grid is a node according to the positional relationship between the neighborhood grids of that grid and the skeleton.
Wherein, judging whether a grid is a node according to the positional relationship between its neighborhood grids and the skeleton includes: judging that the grid is a leaf node if only one of its neighborhood grids is on the skeleton; judging that the grid is a turning node if only two of its neighborhood grids are on the skeleton and the two neighborhood grids and the grid are not on the same straight line; and judging that the grid is a branch node if at least three of its neighborhood grids are on the skeleton.
Wherein, the grid map includes grids to be explored; determining the positional relationship between a leaf node and the boundary to be explored includes: judging that the leaf node is located on the boundary to be explored if at least three of the neighborhood grids of the grid where the leaf node is located are grids to be explored.
Wherein, controlling the robot to explore and build the map according to the positional relationship between the leaf nodes in the node tree and the boundary to be explored includes: taking the parent node of a leaf node located on the boundary to be explored as a node to be explored; and controlling the robot to move to the node to be explored to explore and build the map.
Wherein, the mapping method further comprises: taking the position of the robot as the current node, and judging whether any leaf node of the current node is a node to be explored; if so, controlling the robot to move to that node to be explored; and if not, controlling the robot to move to the parent node of the current node.
Wherein, the mapping method further comprises: calculating the length of the boundary to be explored, which includes starting from a leaf node located on the boundary to be explored and moving along the skeleton to determine the length of the boundary to be explored; and, if the length is greater than a threshold value, moving the robot based on the leaf node on the boundary to be explored to perform the exploring and mapping steps.
In order to solve the above technical problem, the application provides an electronic device. The electronic device includes a memory and a processor coupled to each other, wherein the processor is configured to execute program data stored in the memory so as to implement the robot mapping method described above.
In order to solve the above technical problem, the present application provides a computer storage medium. The computer storage medium has program data stored thereon, and the program data can be executed to implement the robot mapping method described above.
In the robot mapping method, nodes in the skeleton are first determined based on the skeleton of a grid map; because the grid map contains a boundary to be explored, the skeleton extends to the boundary to be explored, i.e., the skeleton includes the skeleton of the boundary to be explored. Further, a node tree is built from the nodes in the skeleton, and the mobile robot is controlled to explore and build the map based on the positional relationship between the leaf nodes in the node tree and the boundary to be explored. In this way, the robot can build the map autonomously, which raises the level of intelligence of mapping; in addition, because the exploration points are computed from the skeleton of the grid map, they stay away from obstacles, which improves safety.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and other drawings can be obtained from them by those skilled in the art without inventive effort, wherein:
FIG. 1 is a schematic flow chart diagram of an embodiment of a robot mapping method of the present application;
FIG. 2 is a schematic diagram of the structure of a grid map constructed by the embodiment of FIG. 1;
FIG. 3 is a schematic diagram of the grid map and its skeleton according to the embodiment of FIG. 2;
FIG. 4 is a schematic structural diagram of a portion of the structure of the embodiment of FIG. 3;
FIG. 5 is a schematic structural diagram of a portion of the structure of the embodiment of FIG. 3;
FIG. 6 is a schematic structural diagram of a portion of the structure of the embodiment of FIG. 3;
FIG. 7 is a flowchart illustrating a specific implementation of step S13 in the embodiment of FIG. 1;
FIG. 8 is a flowchart illustrating another specific implementation of step S13 in the embodiment of FIG. 1;
FIG. 9 is a schematic flow chart diagram illustrating an embodiment of a robot mapping method of the present application;
FIG. 10 is a schematic structural diagram of an embodiment of the mapping apparatus of the present application;
FIG. 11 is a schematic structural diagram of an embodiment of an electronic device of the present application;
FIG. 12 is a schematic structural diagram of an embodiment of a computer storage medium according to the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be noted that the following examples are only illustrative of the present application, and do not limit the scope of the present application. Likewise, the following examples are only some examples and not all examples of the present application, and all other examples obtained by a person of ordinary skill in the art without any inventive step are within the scope of the present application.
In the description of the embodiments of the present application, it should be noted that the term "connected" is to be interpreted broadly unless explicitly stated or limited otherwise; for example, it may be a fixed connection, a detachable connection or an integral connection; it may be a mechanical or an electrical connection; and it may be a direct connection or an indirect connection through an intermediary. The specific meanings of the above terms in the embodiments of the present application can be understood by those of ordinary skill in the art according to the specific situation.
In the embodiments of the present application, unless otherwise explicitly specified or limited, a first feature being "on" or "under" a second feature may mean that the first and second features are in direct contact, or in indirect contact through an intermediary. Also, a first feature being "on", "over" or "above" a second feature may mean that the first feature is directly or obliquely above the second feature, or may simply indicate that the first feature is at a higher level than the second feature. A first feature being "under", "below" or "beneath" a second feature may mean that the first feature is directly or obliquely below the second feature, or may simply indicate that the first feature is at a lower level than the second feature.
In the description herein, references to the terms "one embodiment", "some embodiments", "an example", "a specific example", or "some examples", etc., mean that a particular feature, structure, material or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, such terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Moreover, the various embodiments or examples and the features of different embodiments or examples described in this specification can be combined by those skilled in the art without contradiction.
The robot mapping method, the electronic device and the computer storage medium provided by the present application are described in detail below with reference to embodiments.
The present application first proposes a robot mapping method, as shown in fig. 1. Fig. 1 is a schematic flowchart of an embodiment of the robot mapping method of the present application. The mapping method of this embodiment specifically comprises the following steps:
step S11: and determining nodes in the skeleton based on the skeleton of the grid map, wherein the grid map is constructed based on the environment area where the robot is located, and the grid map comprises a boundary to be explored.
The robot is provided with a detection device capable of detecting spatial obstacles and a moving device for locomotion. The robot also has a built-in processor that can receive and send instructions and process data.
In this embodiment, the robot may construct a grid map of the environment area through Simultaneous Localization And Mapping (SLAM), where the grid map includes a boundary to be explored. The robot then moves to the boundary to be explored based on the information in the grid map, explores and maps the boundary, and thereby expands the constructed grid map.
SLAM map construction means that the robot creates a map in a completely unknown environment while its own position is uncertain, and at the same time uses the map for autonomous positioning and navigation. The SLAM problem can be described as follows: the robot moves from an unknown position in an unknown environment, performs self-positioning from position estimates and sensor data during the movement, and simultaneously builds an incremental map.
When the robot constructs a map, the robot needs to know the position of the robot in the environment and record the positions of features in the environment, and the robot establishes the environment map while positioning.
The grid map is composed of grids, and each grid represents the state of the environment at its position. There are three states: an occupied state, indicating the presence of an obstacle; a free state, indicating that there is no obstacle and the robot can pass freely; and an unknown state, for grids the robot has not yet detected (it is not known whether an obstacle is present). The robot can plan its route according to the grid map, and the grid map is continuously detected and updated as the robot advances.
The robot can determine the grid size according to parameters such as the required environment resolution, the environment-information storage capacity and the decision speed. If high environment resolution is required, a small grid can be selected, at the cost of a larger amount of stored environment information and a slower decision speed; if a lower resolution suffices, a larger grid can be selected, which reduces storage and speeds up decisions, but weakens the ability to find a path in a densely cluttered obstacle environment.
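For illustration only, a minimal sketch of the three-state grid representation described above is given below; the state encoding, map size and resolution value are assumptions for the example and are not values taken from this disclosure.

```python
import numpy as np

# Assumed encoding of the three grid states described above.
FREE, OCCUPIED, UNKNOWN = 0, 1, -1

# An assumed 40 x 40 grid map at 0.05 m per cell, initialised to the
# unknown state before the robot has sensed anything.
resolution = 0.05
grid_map = np.full((40, 40), UNKNOWN, dtype=np.int8)

# As the robot senses its surroundings, it marks free and occupied cells.
grid_map[10:30, 10:30] = FREE      # detected free space
grid_map[10, 10:30] = OCCUPIED     # an obstacle (e.g. a wall) along one edge
```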
Of course, in other embodiments, the robot may also employ other techniques to construct a grid map of the environmental area.
In an application scenario, the grid map constructed in this embodiment is shown in fig. 2, wherein the area inside the dashed rectangle is the environment area of the robot; the solid lines indicate obstacle areas (the corresponding grids are in the occupied state); the area enclosed by the solid lines is the free area (the corresponding grids are in the free state); the areas inside the dashed rectangle other than the obstacle area and the free area are unknown areas (the corresponding grids are in the unknown state); and the boundary to be explored is the boundary between the free area and the unknown area (part of it is marked by the dotted line inside the dashed rectangle).
The robot performs skeleton extraction on the grid map using a skeleton-extraction method, so that the center line of the free area in the grid map is obtained and the skeleton is guaranteed to lie midway between the obstacles on both sides, i.e., the grids on the skeleton are at the positions farthest from the obstacles.
Of course, in other embodiments, in order to shorten the skeleton of the grid map, the skeleton in the free area need not lie at its center, as long as it is ensured that the robot does not touch the obstacles on either side when moving along the skeleton.
Specifically, the robot may first project its initial position, the obstacle area and the boundary to be explored into the grid map, determine the passable area of the robot in the grid map, and then perform skeleton extraction on the passable area using, for example, the Zhang-Suen algorithm.
The skeleton extracted from the grid map of fig. 2 is shown by the bold black line in fig. 3.
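As a sketch of this skeleton-extraction step, assuming the passable area has already been projected into a boolean mask, scikit-image's skeletonize (a Zhang-style thinning) is used here as a stand-in for the Zhang-Suen procedure mentioned above; the function name extract_skeleton is illustrative only.

```python
import numpy as np
from skimage.morphology import skeletonize

def extract_skeleton(grid_map, free=0):
    """Thin the passable (free) area of the grid map down to a one-cell-wide
    skeleton that stays roughly midway between the surrounding obstacles."""
    passable = (grid_map == free)     # boolean mask of the free area
    return skeletonize(passable)      # boolean array, True on skeleton cells
```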
Further, this embodiment may determine the grids corresponding to corners, branches and/or ends in the skeleton as nodes of the skeleton, because corners, branches and ends of the skeleton reflect changes in the robot's direction of motion.
Optionally, the robot may traverse the grids on the skeleton and judge whether each grid is a node according to the positional relationship between its neighborhood grids and the skeleton, for example, whether the neighborhood grids are on the skeleton.
The robot traverses each grid on the skeleton, taking its initial (starting) position as the root node, and judges whether each grid is a node (including leaf nodes, turning nodes and branch nodes).
In some embodiments, the neighborhood grids are the eight-neighborhood grids.
Specifically, the present embodiment may implement step S11 by the following method:
(1) Judging that a grid is a leaf node if only one of its neighborhood grids is on the skeleton.
If only one grid in the neighborhood of the grid is on the skeleton, the grid is considered to be at the end of the skeleton and is determined to be a leaf node of the skeleton.
For example, as shown in fig. 4, if one and only one grid, C1, in the eight-neighborhood of grid A1 is on skeleton B while the other neighborhood grids are not, then grid A1 is determined to be a leaf node of the skeleton.
(2) Judging that a grid is a turning node if only two of its neighborhood grids are on the skeleton and the two neighborhood grids and the grid are not on the same straight line.
If two of the neighborhood grids of the grid are on the skeleton, the grid is considered to be in the middle of the skeleton, and it is further judged whether the two neighborhood grids and the grid lie on the same straight line; if so, the grid is not a node of the skeleton; if not, the grid is determined to be a turning node located at a turn of the skeleton.
For example, as shown in fig. 5, if there are exactly two grids, C2 and C3, in the eight-neighborhood of grid A2 on skeleton B, and neighborhood grids C2, C3 and grid A2 are not on the same straight line, then grid A2 is determined to be a turning node of skeleton B.
(3) Judging that a grid is a branch node if at least three of its neighborhood grids are on the skeleton.
If at least three of the grid's neighborhood grids are on the skeleton, the grid is considered to be in the middle of the skeleton and is determined to be a branch node located at a branching point of the skeleton.
For example, as shown in fig. 6, if three grids C4, C5 and C6 in the eight-neighborhood of grid A3 are on skeleton B, grid A3 is determined to be a branch node at a branching point of skeleton B.
The nodes in the skeleton of the grid map can be determined by the method.
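A compact sketch of the eight-neighborhood test in (1)-(3) above follows; the function and label names are illustrative and not taken from this disclosure.

```python
def classify_node(skeleton, r, c):
    """Classify the skeleton cell (r, c) by how many of its eight neighbours
    lie on the skeleton: 'leaf', 'turn', 'branch', or None for a plain cell."""
    rows, cols = skeleton.shape
    on_skel = [(dr, dc)
               for dr in (-1, 0, 1) for dc in (-1, 0, 1)
               if (dr, dc) != (0, 0)
               and 0 <= r + dr < rows and 0 <= c + dc < cols
               and skeleton[r + dr, c + dc]]
    if len(on_skel) == 1:
        return "leaf"                                   # end of the skeleton
    if len(on_skel) == 2:
        (dr1, dc1), (dr2, dc2) = on_skel
        collinear = (dr1 + dr2 == 0) and (dc1 + dc2 == 0)
        return None if collinear else "turn"            # turn of the skeleton
    if len(on_skel) >= 3:
        return "branch"                                  # junction of branches
    return None
```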
Step S12: and determining a root node based on the position of the robot, and determining a node tree according to the root node and the nodes in the skeleton.
The position of the robot may be its initial position or its current position (since the robot keeps moving).
The skeleton nodes obtained in step S11 are stored in a tree structure with the position of the robot as the root node, thereby forming a node tree.
Specifically, starting from the root node, the traversal moves along the grids on the skeleton to the next-level nodes of the root node, and then to their next-level nodes, until the last-level nodes, i.e., the leaf nodes. A next-level node and its previous-level node are in a parent-child relationship: the next-level node is a child of the previous-level node, and the previous-level node is the parent of the next-level node.
The distance between a child node and its parent can be obtained during this traversal. The node tree is built from the hierarchical relationships and the distances between the nodes.
The leaf nodes are marked based on the node tree to obtain the set of leaf nodes of the node tree; the node tree contains the root node and the nodes at all levels between the root node and the leaf nodes, and it embodies the parent-child relationships between the nodes.
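The traversal and parent-child bookkeeping described above might look like the following sketch, which walks outward along the skeleton from the root cell with a breadth-first search; the data layout (dictionaries keyed by cell coordinates) and the names are assumptions for illustration.

```python
import math
from collections import deque

def build_node_tree(skeleton, nodes, root):
    """Walk the skeleton outward from `root`, recording for every node cell
    its parent node and the distance travelled along the skeleton to reach it.
    `nodes` is a set of (row, col) cells already classified as skeleton nodes."""
    parent, dist_to_parent = {root: None}, {root: 0.0}
    queue = deque([(root, root, 0.0)])     # (cell, last node passed, distance)
    visited = {root}
    while queue:
        (r, c), last_node, dist = queue.popleft()
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                nr, nc = r + dr, c + dc
                if (dr, dc) == (0, 0) or (nr, nc) in visited:
                    continue
                if 0 <= nr < skeleton.shape[0] and 0 <= nc < skeleton.shape[1] \
                        and skeleton[nr, nc]:
                    visited.add((nr, nc))
                    step = math.hypot(dr, dc)   # 1 or sqrt(2) between cells
                    if (nr, nc) in nodes:       # reached a child node
                        parent[(nr, nc)] = last_node
                        dist_to_parent[(nr, nc)] = dist + step
                        queue.append(((nr, nc), (nr, nc), 0.0))
                    else:                       # ordinary skeleton cell
                        queue.append(((nr, nc), last_node, dist + step))
    return parent, dist_to_parent
```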
Step S13: and controlling the robot to explore and build a graph according to the position relation between the leaf nodes in the node tree and the boundary to be explored.
The grid map further comprises grids to be explored, namely grids with unknown grid states, including grids corresponding to boundaries to be explored and grids corresponding to unknown areas.
Optionally, in this embodiment, if at least three of the neighbor grids of the grid where the leaf node is located are to-be-explored grids, it is determined that the position relationship between the leaf node and the to-be-explored boundary is that the leaf node is located at the to-be-explored boundary.
The robot traverses each leaf node, and aiming at the grid where each leaf node is located, the state of the grid in the eight neighborhoods of the grid is obtained and counted; and if the states of at least three grids in the eight neighborhood grids of the grid where the leaf node is located are unknown, namely at least three grids in the neighborhood grids of the grid where the leaf node is located are grids to be explored, determining that the leaf node is located at the boundary to be explored.
And the robot traverses each leaf node in the node tree according to the method to obtain a series of leaf nodes at the boundary to be explored.
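The eight-neighborhood count just described can be sketched as follows, reusing the UNKNOWN encoding assumed in the earlier grid-map example; leaf_nodes is assumed to come from the node tree built above.

```python
def is_on_boundary(grid_map, cell, unknown=-1, min_unknown=3):
    """Return True if at least `min_unknown` of the eight neighbours of the
    leaf node's cell are still in the unknown (to-be-explored) state."""
    r, c = cell
    count = 0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if (dr, dc) == (0, 0):
                continue
            nr, nc = r + dr, c + dc
            if (0 <= nr < grid_map.shape[0] and 0 <= nc < grid_map.shape[1]
                    and grid_map[nr, nc] == unknown):
                count += 1
    return count >= min_unknown

# leaf_nodes is assumed to be the list of leaf cells from the node tree.
boundary_leaves = [leaf for leaf in leaf_nodes if is_on_boundary(grid_map, leaf)]
```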
In the robot mapping method of this embodiment, nodes in the skeleton are first determined based on the skeleton of the grid map; because the grid map contains a boundary to be explored, the skeleton extends to the boundary to be explored, i.e., the skeleton includes the skeleton of the boundary to be explored. Further, this embodiment builds a node tree from the nodes in the skeleton, and controls the mobile robot to explore and build the map based on the positional relationship between the leaf nodes in the node tree and the boundary to be explored. In this way, the robot can build the map autonomously, which raises the level of intelligence of mapping. In addition, the exploration points are computed from the skeleton of the grid map, so they stay away from obstacles, which improves safety; meanwhile, the points to be explored are ordered in a definite sequence, which guarantees their exploration order.
Alternatively, the present embodiment may implement step S13 by the method shown in fig. 7. The method of the present embodiment includes steps S71 and S72.
Step S71: and taking the parent node of the leaf node at the boundary to be explored as the node to be explored.
Step S72: and controlling the robot to move to the node to be explored to explore and build a graph.
Because the entire boundary to be explored may be a long straight line with leaf nodes appearing only at its two ends, the robot takes the parent node of a leaf node on the boundary to be explored as the node to be explored; this allows the whole boundary to be explored and improves mapping accuracy and efficiency.
In another embodiment, step S13 may be implemented by a method as shown in fig. 8. The method of the present embodiment includes steps S81 through S85.
Step S81: and taking the parent node of the leaf node at the boundary to be explored as the node to be explored.
Step S82: and judging whether a leaf node of the current node has a node to be explored or not by taking the position of the robot as the current node.
In some application scenarios, the position of the robot may not be located on a node of the skeleton, and a node closest to the position of the robot may be selected as a current node of the robot.
Step S83: and if so, controlling the robot to move to the node to be explored.
If the leaf node of the current node of the robot has the node to be explored, the robot is controlled to preferentially explore the node to be explored of the current node and build a graph, the motion path of the robot can be shortened, and the graph building efficiency is improved.
Step S84: and if the current node does not exist, controlling the robot to move to the father node of the current node.
If the leaf node of the current node of the robot does not have the leaf node at the boundary to be explored, the robot is considered not to move to the leaf node along the current node in order to construct a graph of the unknown area; at this time, the searching priorities of the nodes to be searched can be sorted after tracing back to the parent node of the current node.
And finally, obtaining a node to be explored with the highest priority as the next exploration point position, and controlling the robot to reach the designated position. Namely, the robot can be ensured to search towards the current direction preferentially.
Step S85: and controlling the robot to explore and build a map at the designated position.
The robot is controlled to preferentially explore the nodes to be explored of the current nodes and build the graph, the motion path of the robot can be shortened, and the graph building efficiency is improved.
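The prefer-children-then-backtrack behaviour of steps S82-S85 can be sketched as a simple loop over the node tree. The node_tree object with children()/parent() accessors and the move_to()/update_map() robot interfaces are assumptions standing in for motion and sensing details this disclosure leaves open.

```python
def explore(node_tree, to_explore, current, move_to, update_map):
    """Visit to-explore nodes, preferring leaves under the current node and
    backtracking to the parent node when none of them needs exploring."""
    while to_explore:
        # Nodes to explore among the children (leaf side) of the current node.
        candidates = [n for n in node_tree.children(current) if n in to_explore]
        if candidates:
            target = candidates[0]                  # stay in the current branch
        elif node_tree.parent(current) is not None:
            current = node_tree.parent(current)     # backtrack one level
            continue
        else:
            target = next(iter(to_explore))         # root reached: take any left
        move_to(target)                             # drive the robot there
        update_map()                                # sense and extend the map
        to_explore.discard(target)
        current = target
```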
The present application further provides a robot mapping method according to another embodiment, as shown in fig. 9. Fig. 9 is a schematic flowchart of an embodiment of the robot mapping method of the present application. The mapping method of this embodiment specifically comprises the following steps:
step S91: and determining nodes in the skeleton based on the skeleton of the grid map, wherein the grid map is constructed based on the environment area where the robot is located, and the grid map comprises a boundary to be explored.
Step S91 is similar to step S11 described above and is not repeated here.
Step S92: and determining a root node based on the position of the robot, and determining a node tree according to the root node and the nodes in the skeleton.
Step S92 is similar to step S12 described above and is not repeated here.
Step S93: and according to the position relation between the leaf nodes in the node tree and the boundary to be explored.
Step S93 is similar to step S13 described above and is not repeated here.
Step S94: the length of the boundary to be explored is calculated.
Optionally, this embodiment starts from a leaf node located on the boundary to be explored and moves along the skeleton to determine the length of the boundary to be explored.
Specifically, starting from a leaf node on the boundary to be explored, nodes on the boundary are searched for along the skeleton to form the boundary to be explored. When the length of the boundary to be explored (the distances between adjacent skeleton nodes can be computed while building the node tree) is greater than a certain threshold, the unexplored area is large and needs to be explored; in this case the parent node of the leaf node is added to the set of nodes to be explored.
Step S95: and if the length is larger than the threshold value, controlling the robot to explore and build a map.
Step S95 is similar to step S13 described above and is not repeated here.
If the length is less than or equal to the threshold, the unexplored area is small and does not need to be explored, and the leaf node is discarded.
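A sketch of the length test in steps S94-S95 follows, accumulating cell-to-cell distances while walking along the skeleton from a boundary leaf; the threshold value and the surrounding variable names (path, to_explore, parent_of_leaf) are assumptions for the example.

```python
import math

def boundary_length(path):
    """Sum the step distances along a list of skeleton cells that trace the
    boundary to be explored, starting from a leaf node located on it."""
    return sum(math.hypot(r2 - r1, c2 - c1)
               for (r1, c1), (r2, c2) in zip(path, path[1:]))

LENGTH_THRESHOLD = 10.0                    # assumed threshold, in cell units
if boundary_length(path) > LENGTH_THRESHOLD:
    to_explore.add(parent_of_leaf)         # large unexplored area: explore it
# otherwise the unexplored area is small and the leaf node is discarded
```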
In one embodiment, as shown in fig. 10, a mapping apparatus is provided. The apparatus may be part of a computer device as a software module, a hardware module, or a combination of the two, and specifically includes a skeleton node extraction module 111, a node tree determination module 112 and an exploration mapping module 113, wherein the node tree determination module 112 is connected to the skeleton node extraction module 111 and the exploration mapping module 113, respectively. The skeleton node extraction module 111 is configured to determine nodes in the skeleton based on the skeleton of a grid map, where the grid map is constructed based on the environment area where the robot is located and includes a boundary to be explored; the node tree determination module 112 is configured to determine a root node based on the position of the robot and determine a node tree from the root node and the nodes in the skeleton; and the exploration mapping module 113 is configured to control the robot to explore and build the map according to the positional relationship between the leaf nodes in the node tree and the boundary to be explored.
The mapping device of the embodiment is also used for realizing the mapping method of the robot.
For the specific definition of the mapping apparatus, reference may be made to the definition of the robot mapping method above, which is not repeated here. All or part of the modules in the mapping apparatus can be implemented by software, by hardware, or by a combination of the two. The modules can be embedded in or independent of a processor in the computer device in hardware form, or stored in a memory in the computer device in software form, so that the processor can call and execute the operations corresponding to the modules.
The present application further provides an electronic device, as shown in fig. 11, fig. 11 is a schematic structural diagram of an embodiment of the electronic device of the present application. The electronic device 100 of the present embodiment includes a processor 101, a memory 102 coupled to the processor 101, an input/output device 103, and a bus 104.
The processor 101, the memory 102, and the input/output device 103 are respectively connected to the bus 104, the memory 102 stores program data, and the processor 101 is configured to execute the program data to implement the robot mapping method.
The controller in the above embodiments may be integrated within the processor 101.
In the present embodiment, the processor 101 may also be referred to as a CPU (Central Processing Unit). The processor 101 may be an integrated circuit chip having signal processing capabilities. The processor 101 may also be a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components. A general purpose processor may be a microprocessor or the processor 101 may be any conventional processor or the like.
The electronic apparatus 100 of this embodiment may be a robot, which further includes a detection device capable of detecting spatial obstacles and a moving device.
In the robot mapping method and electronic device of the present application, the skeleton of the grid map of the current environment area is extracted, and the skeleton extends to the map boundary; the points to be explored are computed from the skeleton, so they stay away from obstacles, which maximizes safety; meanwhile, the points to be explored are ordered in a definite sequence, which guarantees their exploration order.
The present application further proposes a computer-readable storage medium, as shown in fig. 12, the computer-readable storage medium 160 of the present embodiment is used for storing the program data 161 of the above-mentioned embodiment, and the program data 161 can be executed to implement the mapping method of the robot. The program data 161 has been described in detail in the above method embodiments, and is not described herein.
The computer readable storage medium 160 of this embodiment may be, but is not limited to, a usb disk, an SD card, a PD optical drive, a removable hard disk, a high-capacity floppy drive, a flash memory, a multimedia memory card, a server, etc.
In one embodiment, a computer program product or computer program is provided that includes computer instructions stored in a computer-readable storage medium. The computer instructions are read by a processor of a computer device from a computer-readable storage medium, and the computer instructions are executed by the processor to cause the computer device to perform the steps in the above-mentioned method embodiments.
Different from the prior art, in the present application nodes in the skeleton are determined based on the skeleton of the grid map; since the grid map contains a boundary to be explored, the skeleton extends to the boundary to be explored, i.e., the skeleton includes the skeleton of the boundary to be explored. Further, a node tree is built from the nodes in the skeleton, and the mobile robot is controlled to explore and build the map based on the positional relationship between the leaf nodes in the node tree and the boundary to be explored. In this way, the robot can build the map autonomously, which raises the level of intelligence of mapping; in addition, the exploration points are computed from the skeleton of the grid map, so they stay away from obstacles, which improves safety; meanwhile, the points to be explored are ordered in a definite sequence, which guarantees their exploration order.
In addition, if the above functions are implemented in the form of software functional units and sold or used as an independent product, they may be stored in a storage medium readable by a mobile terminal. That is, the present application also provides a storage device storing program data, and the program data can be executed to implement the methods of the above embodiments; the storage device may be, for example, a USB flash drive, an optical disc, a server, etc. In other words, the present application may be embodied as a software product, which includes several instructions for causing an intelligent terminal to perform all or part of the steps of the methods described in the embodiments.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing mechanisms, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present application includes additional implementations in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
The logic and/or steps represented in the flowcharts or otherwise described herein, such as an ordered listing of executable instructions that can be viewed as implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device (e.g., a personal computer, server, network device, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions). For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
The above description is only for the purpose of illustrating embodiments of the present application and is not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application or are directly or indirectly applied to other related technical fields, are also included in the scope of the present application.
Claims (10)
1. A robot mapping method is characterized by comprising the following steps:
determining nodes in a skeleton based on the skeleton of a grid map, wherein the grid map is constructed based on an environment area where a robot is located, and the grid map comprises a boundary to be explored;
determining a root node based on the position of the robot, and determining a node tree according to the root node and nodes in the skeleton;
and controlling the robot to explore and build a graph according to the position relation between the leaf nodes in the node tree and the boundary to be explored.
2. The method of claim 1, wherein the determining the nodes in the skeleton comprises:
and determining grids corresponding to corners, branches and/or grids corresponding to tail ends in the skeleton as nodes.
3. The method of claim 1, wherein the determining the nodes in the skeleton comprises:
and traversing the grids on the framework, and judging whether the grids are nodes or not according to the position relation between the neighborhood grids of the grids and the framework.
4. The mapping method according to claim 3, wherein the judging whether the grid is a node according to the positional relationship between the neighborhood grids of the grid and the skeleton comprises:
determining that the grid is a leaf node based on only one of the neighborhood grids of the grid being on the skeleton;
determining that the grid is a turning node based on only two of the neighborhood grids of the grid being on the skeleton and the two neighborhood grids and the grid not being on the same straight line;
and determining that the grid is a branch node based on at least three of the neighborhood grids of the grid being on the skeleton.
5. The mapping method of claim 1, wherein the grid map comprises a grid to be explored; determining the position relation between the leaf nodes in the node tree and the boundary to be explored, wherein the step of determining the position relation comprises the following steps:
and based on that at least three grids in the neighborhood of the grid where the leaf node is located are grids to be explored, judging the position relation between the leaf node and the boundary to be explored to be that the leaf node is located at the boundary to be explored.
6. The mapping method according to claim 1, wherein the controlling the robot to explore and build the map according to the positional relationship between the leaf nodes in the node tree and the boundary to be explored comprises:
taking the parent node of the leaf node located on the boundary to be explored as the node to be explored;
and controlling the robot to move to the node to be explored to explore and build the map.
7. The mapping method according to claim 6, wherein the mapping method further comprises:
taking the position of the robot as the current node, and judging whether any leaf node of the current node is a node to be explored;
if so, controlling the robot to move to the node to be explored;
and if not, controlling the robot to move to the parent node of the current node.
8. The mapping method according to claim 1, wherein the mapping method further comprises:
calculating the length of the boundary to be explored, comprising: starting from a leaf node located on the boundary to be explored, moving along the skeleton to determine the length of the boundary to be explored;
and if the length is larger than the threshold value, executing a step of controlling the robot to explore and establish a map.
9. An electronic device, comprising a memory and a processor coupled to each other, wherein the processor is configured to execute program data stored in the memory so as to implement the robot mapping method of any one of claims 1 to 8.
10. A computer storage medium having stored thereon program data executable to implement the method of mapping a robot of any of claims 1 to 8.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111630176.1A CN114545923A (en) | 2021-12-28 | 2021-12-28 | Robot mapping method, electronic device and computer storage medium |
PCT/CN2022/107503 WO2023124035A1 (en) | 2021-12-28 | 2022-07-22 | Mapping method for robot, electronic device and nonvolatile readable storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111630176.1A CN114545923A (en) | 2021-12-28 | 2021-12-28 | Robot mapping method, electronic device and computer storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114545923A true CN114545923A (en) | 2022-05-27 |
Family
ID=81668955
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111630176.1A Pending CN114545923A (en) | 2021-12-28 | 2021-12-28 | Robot mapping method, electronic device and computer storage medium |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN114545923A (en) |
WO (1) | WO2023124035A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023124035A1 (en) * | 2021-12-28 | 2023-07-06 | 美的集团(上海)有限公司 | Mapping method for robot, electronic device and nonvolatile readable storage medium |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110956161A (en) * | 2019-12-17 | 2020-04-03 | 中新智擎科技有限公司 | Autonomous map building method and device and intelligent robot |
US20210089040A1 (en) * | 2016-02-29 | 2021-03-25 | AI Incorporated | Obstacle recognition method for autonomous robots |
CN113009916A (en) * | 2021-03-08 | 2021-06-22 | 珠海市一微半导体有限公司 | Path planning method, chip and robot based on global map exploration |
CN113050632A (en) * | 2021-03-11 | 2021-06-29 | 珠海市一微半导体有限公司 | Map exploration method and chip for robot to explore unknown area and robot |
CN113465612A (en) * | 2021-07-02 | 2021-10-01 | 南京航空航天大学 | Parallel path planning method and system based on double-layer index |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11314262B2 (en) * | 2016-08-29 | 2022-04-26 | Trifo, Inc. | Autonomous platform guidance systems with task planning and obstacle avoidance |
CN109993064B (en) * | 2019-03-06 | 2022-04-29 | 东南大学 | Method for extracting connection path between road network nodes in picture |
CN110850871B (en) * | 2019-10-21 | 2020-07-17 | 深圳市银星智能科技股份有限公司 | Machine path planning method and mobile robot |
CN111721280B (en) * | 2020-05-25 | 2022-05-31 | 科沃斯机器人股份有限公司 | Area identification method, self-moving equipment and storage medium |
CN112000754B (en) * | 2020-08-11 | 2024-06-07 | 珠海格力电器股份有限公司 | Map construction method, device, storage medium and computer equipment |
CN111966097B (en) * | 2020-08-12 | 2024-06-14 | 深圳华芯信息技术股份有限公司 | Map building method, system and terminal based on grid map regional exploration |
CN113189988B (en) * | 2021-04-21 | 2022-04-15 | 合肥工业大学 | Autonomous path planning method based on Harris algorithm and RRT algorithm composition |
CN113310491A (en) * | 2021-05-17 | 2021-08-27 | 北京航空航天大学 | Unmanned aerial vehicle navigation network automatic generation method considering specific structure |
CN114545923A (en) * | 2021-12-28 | 2022-05-27 | 美的集团(上海)有限公司 | Robot mapping method, electronic device and computer storage medium |
- 2021-12-28 CN CN202111630176.1A patent/CN114545923A/en active Pending
- 2022-07-22 WO PCT/CN2022/107503 patent/WO2023124035A1/en unknown
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20210089040A1 (en) * | 2016-02-29 | 2021-03-25 | AI Incorporated | Obstacle recognition method for autonomous robots |
CN110956161A (en) * | 2019-12-17 | 2020-04-03 | 中新智擎科技有限公司 | Autonomous map building method and device and intelligent robot |
CN113009916A (en) * | 2021-03-08 | 2021-06-22 | 珠海市一微半导体有限公司 | Path planning method, chip and robot based on global map exploration |
CN113050632A (en) * | 2021-03-11 | 2021-06-29 | 珠海市一微半导体有限公司 | Map exploration method and chip for robot to explore unknown area and robot |
CN113465612A (en) * | 2021-07-02 | 2021-10-01 | 南京航空航天大学 | Parallel path planning method and system based on double-layer index |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023124035A1 (en) * | 2021-12-28 | 2023-07-06 | 美的集团(上海)有限公司 | Mapping method for robot, electronic device and nonvolatile readable storage medium |
Also Published As
Publication number | Publication date |
---|---|
WO2023124035A1 (en) | 2023-07-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9207678B2 (en) | Method and apparatus for constructing map for mobile robot | |
CN112000754B (en) | Map construction method, device, storage medium and computer equipment | |
JP6802331B2 (en) | Lane processing method and equipment | |
CN110936383B (en) | Obstacle avoiding method, medium, terminal and device for robot | |
CN111680747B (en) | Method and apparatus for closed loop detection of occupancy grid subgraphs | |
EP3987995B1 (en) | Method for expanding working area based on laser map, chip and robot | |
CN110146098A (en) | A kind of robot map enlargement method, device, control equipment and storage medium | |
CN109163722A (en) | A kind of anthropomorphic robot paths planning method and device | |
US5978520A (en) | Method of recognizing image data and apparatus therefor | |
CN112561859B (en) | Monocular vision-based steel belt drilling and anchor net identification method and device for anchoring and protecting | |
CN111295666A (en) | Lane line detection method, device, control equipment and storage medium | |
CN111329398A (en) | Robot control method, robot, electronic device, and readable storage medium | |
CN114545923A (en) | Robot mapping method, electronic device and computer storage medium | |
CN114545922A (en) | Path planning method, electronic device and computer storage medium | |
CN112146668A (en) | Unmanned vehicle autonomous navigation method based on ROS | |
KR102230362B1 (en) | Cleaning Robot Apparatus Using Rectangular Map Decomposition and Method for Planning Coverage Path Using the Same | |
CN114690769A (en) | Path planning method, electronic device, storage medium and computer program product | |
JP2022513781A (en) | Target attribute detection, neural network training and intelligent driving methods, equipment | |
CN117008589A (en) | Mobile robot autonomous exploration mapping method based on improved RRT algorithm | |
CN114343507A (en) | Map data generation method and device and sweeping robot | |
CN115519586A (en) | Cliff detection method for robot, and storage medium | |
CN113791610A (en) | Global path planning method for mobile robot | |
Jebari et al. | Combined vision and frontier-based exploration strategies for semantic mapping | |
Loo et al. | Scene Action Maps: Behavioural Maps for Navigation without Metric Information | |
CN110705519B (en) | Autonomous mobile robot, map splicing method and device thereof, and readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
TA01 | Transfer of patent application right |
Effective date of registration: 20220706 Address after: Room 1152, area B, 11 / F, building 1, 158 Shuanglian Road, Huangpu District, Shanghai, 201702 Applicant after: Midea Group (Shanghai) Co.,Ltd. Applicant after: MIDEA GROUP Co.,Ltd. Address before: 201799 room 1152, block B, floor 11, building 1, No. 158 Shuanglian Road, Qingpu District, Shanghai Applicant before: Midea Group (Shanghai) Co.,Ltd. |
|
TA01 | Transfer of patent application right |