CN110968083B - Method for constructing grid map, method, device and medium for avoiding obstacles - Google Patents
- Publication number
- CN110968083B (application CN201811158962.4A / CN201811158962A)
- Authority
- CN
- China
- Prior art keywords
- obstacle
- grid
- information
- map
- grid map
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L11/00—Machines for cleaning floors, carpets, furniture, walls, or wall coverings
- A47L11/24—Floor-sweeping machines, motor-driven
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L11/00—Machines for cleaning floors, carpets, furniture, walls, or wall coverings
- A47L11/40—Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
- A47L11/4011—Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L11/00—Machines for cleaning floors, carpets, furniture, walls, or wall coverings
- A47L11/40—Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
- A47L11/4061—Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
- G01C21/30—Map- or contour-matching
- G01C21/32—Structuring or formatting of map data
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0257—Control of position or course in two dimensions specially adapted to land vehicles using a radar
Abstract
The embodiments of the present application provide a method for constructing a grid map, an obstacle avoidance method, a device, and a medium. In the embodiments, an environment image of the surrounding environment is acquired; information of at least one obstacle contained in the environment image is identified; the grid layers matched with the at least one obstacle are determined in the corresponding grids of the grid map, where each grid in the grid map has multiple layers and each layer stores the information of one obstacle; and the corresponding grid layers in the grids matched with the at least one obstacle are updated according to the information of the at least one obstacle, so as to obtain an updated grid map. This reduces the influence of missed detections and false detections on grid map construction, improves the accuracy of the obstacles output by the grid map, and improves the obstacle avoidance performance of a device that avoids obstacles using a map constructed by this method.
Description
Technical Field
The present application relates to the field of artificial intelligence technologies, and in particular, to a method for constructing a grid map, a method for avoiding obstacles, a device, and a medium.
Background
When the floor sweeping robot is used for sweeping the ground, obstacles need to be avoided so as to better perform cleaning work.
The obstacle avoidance function of a sweeping robot is generally realized by combining distance sensors, such as infrared, laser, and ultrasonic sensors, with a spring baffle: after a distance sensor detects an obstacle ahead, or the spring baffle touches an obstacle, the robot retreats or goes around the obstacle according to an obstacle avoidance control instruction.
Disclosure of Invention
Aspects of the present application provide a method for constructing a grid map, a method for avoiding obstacles, a device and a medium.
The embodiment of the application provides a method for constructing a grid map, which is suitable for self-moving equipment and comprises the following steps:
acquiring an environment image of a surrounding environment;
identifying information of at least one obstacle contained in the environment image; determining the grid layers in the grids respectively matched with the at least one obstacle in the grid map; the grid in the grid map has multiple layers, and each layer stores the information of one obstacle;
and updating corresponding grid layers in grids respectively matched with the at least one obstacle in the grid map according to the information of the at least one obstacle so as to obtain an updated grid map.
The embodiment of the application also provides an obstacle avoidance method, which is suitable for self-moving equipment, and the method comprises the following steps: determining a travel path in the moving process of the self-moving equipment;
determining an obstacle avoidance area with obstacles on the travelling path according to the obstacle information recorded in a plurality of grid layers of each grid in the grid map;
and carrying out obstacle avoidance processing on the obstacle avoidance area.
An embodiment of the present application further provides a self-moving device, including: the machine body is provided with a sensor, one or more processors and one or more memories for storing computer programs;
the sensor is used for acquiring an environment image of the surrounding environment;
the one or more processors are configured to execute the computer program to:
identify information of at least one obstacle contained in the environment image;
determining grid layers in grids respectively matched with the at least one obstacle in the grid map; the grid in the grid map has multiple layers, and each layer stores information of one obstacle;
and updating corresponding grid layers in grids respectively matched with the at least one obstacle in the grid map according to the information of the at least one obstacle so as to obtain an updated grid map.
Embodiments of the present application also provide a computer-readable storage medium storing a computer program, which, when executed by one or more processors, causes the one or more processors to perform actions comprising:
acquiring an environment image of a surrounding environment;
identifying information of at least one obstacle contained in the environment image;
determining grid layers in grids respectively matched with the at least one obstacle in the grid map; the grid in the grid map has multiple layers, and each layer stores information of one obstacle;
and updating corresponding grid layers in grids respectively matched with the at least one obstacle in the grid map according to the information of the at least one obstacle so as to obtain an updated grid map.
An embodiment of the present application further provides a self-moving device, including: the machine body is provided with one or more processors and one or more memories for storing computer programs;
the one or more processors are configured to execute the computer program to:
determining a travel path in the moving process of the self-moving equipment;
determining an obstacle avoidance area with obstacles on the travelling path according to the obstacle information recorded in a plurality of grid layers of each grid in the grid map;
and carrying out obstacle avoidance processing on the obstacle avoidance area.
Embodiments of the present application also provide a computer-readable storage medium storing a computer program that, when executed by one or more processors, causes the one or more processors to perform actions comprising:
determining a travel path in the moving process of the self-moving equipment;
determining an obstacle avoidance area with obstacles on the travelling path according to the obstacle information recorded in a plurality of grid layers of each grid in the grid map;
and carrying out obstacle avoidance processing on the obstacle avoidance area.
According to the embodiments of the present application, the grid layer of the grid matched with an obstacle in the existing map is updated according to the information of that obstacle. This reduces the influence of missed detections and false detections on grid map construction, improves the accuracy of the obstacles output by the grid map, and improves the obstacle avoidance performance of a device that avoids obstacles using a map constructed by this method.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a flowchart of a method for constructing a grid map according to an exemplary embodiment of the present application;
FIG. 2 is a flowchart of a method for calculating position information for each obstacle according to an exemplary embodiment of the present application;
FIG. 3 is a probability grid map and a partially enlarged illustration thereof provided by exemplary embodiments of the present application;
fig. 4 is a flowchart of a method for updating probability values of a certain grid layer for each grid in a grid map provided by an exemplary embodiment of the present application;
FIG. 5 is a detailed method flowchart of a method for constructing a grid map according to an exemplary embodiment of the present disclosure;
fig. 6 is a schematic flowchart of an obstacle avoidance method according to an exemplary embodiment of the present application;
fig. 7 is a block diagram of a self-moving device according to an exemplary embodiment of the present application;
FIG. 8 is a block diagram of a robot according to an exemplary embodiment of the present disclosure;
FIG. 9 is a block diagram of a self-propelled device according to an exemplary embodiment of the present application;
fig. 10 is a block diagram of another robot according to an exemplary embodiment of the present disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only a few embodiments of the present application, and not all embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application without making any creative effort belong to the protection scope of the present application.
At present, the obstacle avoidance function of a sweeping robot is generally realized by combining an infrared sensor, a laser sensor, an ultrasonic sensor, and a spring baffle: after a distance sensor detects an obstacle ahead, or the spring baffle touches an obstacle, the robot retreats or goes around the obstacle according to an obstacle avoidance control instruction. However, such a sweeping robot cannot detect low objects or narrow regions, which may cause faults such as the side brush becoming entangled with an obstacle or the robot getting stuck in a narrow area, affecting subsequent cleaning.
To solve the above technical problems in the prior art, an embodiment of the present application provides a solution, the basic idea of which is: acquire an environment image of the surrounding environment, identify information of at least one obstacle in the environment image, update the grid layers of the grids matched with each obstacle in the grid map according to the obstacle information to obtain an updated grid map, and have the self-moving device avoid obstacles according to the updated grid map.
The terminology used in the embodiments of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the examples of the present invention and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, and "a" and "an" generally include at least two, but do not exclude at least one, unless the context clearly dictates otherwise.
The word "if", as used herein, may be interpreted as "when" or "upon" or "in response to a determination" or "in response to a detection", depending on the context. Similarly, the phrases "if determined" or "if detected (a stated condition or event)" may be interpreted as "when determined", "in response to a determination", "when (a stated condition or event) is detected", or "in response to detecting (a stated condition or event)", depending on the context.
It is also noted that the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a good or system that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such good or system. Without further limitation, an element defined by the phrase "comprising a(n)..." does not exclude the presence of additional like elements in a good or system comprising that element.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
Fig. 1 is a flowchart of a method for constructing a grid map according to an exemplary embodiment of the present application, where as shown in fig. 1, the method includes:
s101: acquiring an environment image of the surrounding environment;
s102: identifying information of at least one obstacle contained in the environment image;
s103: determining the grid layers in the grids respectively matched with the at least one obstacle in the grid map; the grid in the grid map has multiple layers, and each layer stores the information of one obstacle;
s104: and updating corresponding grid layers in grids respectively matched with the at least one obstacle in the grid map according to the information of the at least one obstacle so as to obtain an updated grid map.
The execution subject of the method in the embodiments of the present application may be a self-moving device, such as an unmanned vehicle or a robot. The types of robot and unmanned vehicle are not limited; the robot may be a sweeping robot, a following robot, a welcoming robot, and the like. Different devices acquire environment images in their respective working environments. For example, while sweeping a household, a sweeping robot may acquire environment images of areas such as the living room, kitchen, bathroom, and bedroom as it travels; a shopping-guide robot in a mall may acquire environment images of areas such as pedestrian passages and shops while guiding customers; and a following robot may acquire images of the followed target and the surrounding environment as it travels.
In this embodiment, a visual sensor is arranged on the self-moving device to collect images of the surrounding environment in real time. The visual sensor has an object recognition function; a monocular camera is preferred, but the embodiments of the present application place no particular limit on the type of visual sensor.
The grid map in the embodiments of the present application is a probability grid map: each grid stores the probability value of an object, reflecting whether the grid is passable, with the probability value between 0 and 1. A probability tending to 0 (blank) indicates that the coordinate location corresponding to the grid is passable, and a probability tending to 1 (occupied) indicates that the coordinate location corresponding to the grid is impassable.
In the above embodiment, information of at least one obstacle contained in the current environment image is identified, where the information of an obstacle includes its category information, position information, classification confidence, bounding box, and the like.
In the above embodiment, the category information of each obstacle contained in the current environment image may be identified by performing image recognition on the current environment image, for example using a target detection algorithm.
In the above embodiment, the position information of each obstacle may be calculated from the pixel position of each obstacle in the current environment image. Fig. 2 is a flowchart of a method for calculating the position information of each obstacle according to an exemplary embodiment of the present application. As shown in fig. 2, the method includes:
s201: determining the pixel position of each obstacle in the environment image;
s202: inquiring a mapping relation between the pixel position of each obstacle in the image and the distance from the obstacle to the self-moving equipment according to the pixel position of each obstacle in the environment image so as to obtain the distance from each obstacle to the self-moving equipment;
s203: and determining the position information of each obstacle according to the distance from each obstacle to the self-moving equipment and the position information of the self-moving equipment.
In this embodiment, the current position information of the self-moving device must first be determined for positioning: the device starts moving from an unknown position, performs self-positioning according to position estimation and sensor data during movement, and constructs the grid map.
In this embodiment, the pixel position of each obstacle in the current environment image is determined; optionally, the boundary line where each obstacle intersects the ground is identified from the current environment image as the obstacle's pixel position. Since the grid map is a two-dimensional plane map, the two end points of the boundary line where an obstacle intersects the ground, acquired at the same moment, are connected into a straight line, and the obstacle is ultimately reflected in the grid map as a single line segment of a certain length. The mapping relation between an obstacle's pixel position in the image and its distance from the self-moving device is preset in the visual sensor, so the distance from each obstacle to the device can be determined from its pixel position in the current environment image. Once that distance is determined, the position information of each obstacle can be calculated by combining it with the position information of the self-moving device.
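A minimal sketch of steps S201-S203 in Python follows. The calibration table, its values, and the function name are illustrative assumptions: the patent only states that such a pixel-to-distance mapping is preset in the visual sensor.

```python
import math

# Hypothetical calibration table mapping an obstacle's bottom-edge pixel row in
# the image to its ground distance from the robot (meters). The values here are
# made up for illustration; a real table comes from camera calibration.
PIXEL_ROW_TO_DISTANCE = {400: 2.0, 500: 1.2, 600: 0.6}

def obstacle_position(pixel_row, robot_x, robot_y, robot_heading_rad):
    """Estimate an obstacle's world position (steps S201-S203)."""
    # S202: look up the distance for the nearest calibrated pixel row
    nearest = min(PIXEL_ROW_TO_DISTANCE, key=lambda r: abs(r - pixel_row))
    distance = PIXEL_ROW_TO_DISTANCE[nearest]
    # S203: project that distance along the robot's current heading to get
    # the obstacle's position from the robot's own position
    return (robot_x + distance * math.cos(robot_heading_rad),
            robot_y + distance * math.sin(robot_heading_rad))
```

For example, an obstacle whose ground boundary appears near pixel row 500 while the robot sits at the origin facing along the x-axis would be placed 1.2 m ahead of the robot.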
False detections and missed detections may occur during object recognition, and the same obstacle may be misidentified when it is recognized multiple times. The embodiments of the present application therefore improve the existing grid map: each grid in the grid map has multiple layers, each layer of a grid stores the information of one obstacle, so each grid can store the information of several obstacles simultaneously, and during an update of the grid map only the grid layers matched with the detected obstacles are updated. Finally, each grid of the grid map presents probability values for a plurality of objects.
For example, if the object a is recognized several times and the object a is recognized as the object B several times in the middle, due to the improvement of the grid map in the present application, the grid layer matching the object a in the grid map is updated successively for the number of times the object a is recognized correctly, and the grid layer matching the object B in the grid map is updated successively for the number of times the object a is recognized as the object B. In the situation that the object A is mistakenly identified as the object B in the situation of updating the grid map after the object A is identified for multiple times, the grid layer corresponding to the object B of the grid matched with the object B in the grid map is updated, and the influence of false detection and missed detection on the construction of the final grid map in the process of identifying the object is reduced.
Fig. 3 is a probability grid map and a partially enlarged view thereof according to an embodiment of the present disclosure: the top shows a schematic diagram of a probability grid map, and the bottom shows an enlarged view of the dotted-line selection C. Grids with higher probability values are marked with darker colors.
After the grid layers in the grids respectively matched with each obstacle are determined, the corresponding grid layers are updated according to the information of the at least one obstacle. Optionally, for each of the at least one obstacle, local grid information of the obstacle is generated according to its category information and position information, and the corresponding grid layer in the grids matched with the obstacle is updated using that local grid information.
In the above embodiment, the original probability values of the corresponding grid layers in the grids matched with the obstacle in the grid map may be updated by using the observation probability values in each grid in the local grid information of the obstacle, so as to obtain the updated probability values of the corresponding grid layers in the grids matched with the obstacle.
Fig. 4 is a flowchart of a method for updating probability values of a certain grid layer for each grid in a grid map according to an exemplary embodiment of the present application, where the method includes the following steps:
s401: carrying out logarithm operation on the observation probability value in each grid in the local grid information of the obstacle to obtain an observation probability logarithm value;
s402: carrying out logarithm operation on the original probability values of the corresponding grid layers in the grids matched with the obstacles in the grid map to obtain original probability logarithm values;
s403: obtaining an update probability value of a corresponding grid layer of a grid matched with the obstacle according to an observation probability logarithm value corresponding to local grid information of the obstacle and an original probability logarithm value corresponding to the corresponding grid layer in the grid matched with the obstacle in the grid map;
s404: and performing antilog operation on the update probability logarithm value of the corresponding grid layer of the corresponding grid in the updated grid map to obtain the update probability value of the corresponding grid layer of the grid matched with the obstacle.
Wherein the updated probability value Pn of the corresponding grid layer of each grid matched with the obstacle is calculated by the log-odds formula:

log(Pn / (1 − Pn)) = log(P1 / (1 − P1)) + log(P2 / (1 − P2)), equivalently Pn = P1·P2 / (P1·P2 + (1 − P1)(1 − P2)),

where P1 is the original probability value, P2 is the observation probability value, and Pn is the updated probability value.
According to the embodiments of the present application, the problem of updating the corresponding grid layer in the grids matched with each obstacle is thus converted into a mathematical calculation, turning the construction of the grid map into a mathematical problem.
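Steps S401-S404 amount to the standard log-odds update and can be sketched as follows; the function names are illustrative:

```python
import math

def log_odds(p):
    """Log-odds transform of a probability (steps S401-S402)."""
    return math.log(p / (1.0 - p))

def fuse(p_original, p_observed):
    """Fuse an observation into a grid layer: take the log-odds of both
    probabilities (S401-S402), add them (S403), then apply the inverse
    transform to recover the updated probability (S404)."""
    l_total = log_odds(p_original) + log_odds(p_observed)
    return 1.0 / (1.0 + math.exp(-l_total))
```

For example, fusing an observation of 0.8 into a neutral prior of 0.5 leaves the layer at 0.8, since a 0.5 prior contributes zero log-odds; a second 0.8 observation then pushes the layer above 0.9.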
During the construction of the grid map, the robot can acquire information about the same obstacle from multiple angles, and can therefore present 3D information about the same obstacle in the grid map.
Fig. 5 is a more detailed method flowchart of a method for constructing a grid map according to an exemplary embodiment of the present application, and as shown in fig. 5, the method includes:
s501: acquiring images of the surrounding environment from a sensor of the mobile equipment;
s502: acquiring a frame of environment image;
s503: carrying out image recognition on the environment image, and judging whether the environment image contains information of obstacles;
s504: when no obstacle is identified in the environment image, continuing to advance along the planned path; when at least one obstacle is identified in the environment image, determining the grid layers in the grids respectively matched with the at least one obstacle in the grid map;
s505: and updating corresponding grid layers in grids respectively matched with the at least one obstacle in the grid map according to the information of the at least one obstacle so as to obtain an updated grid map.
S506: and avoiding the obstacle according to the updated grid map.
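The S501-S505 loop can be sketched as follows. The frame representation and the hit-count map are illustrative stand-ins for the raw image stream and the probability grid layers of the patent, and the obstacle avoidance step S506 is omitted:

```python
def recognize(frame):
    # Stand-in for the image-recognition step S503: here a "frame" is simply
    # a list of (category, grid_cell) detections rather than raw pixels.
    return frame

def build_map(frames):
    """Sketch of the S501-S505 construction loop over a stream of frames."""
    grid_map = {}  # grid_cell -> {category: detection count}
    for frame in frames:                     # S501-S502: one frame at a time
        obstacles = recognize(frame)         # S503: identify obstacles
        if not obstacles:                    # S504: none found, keep moving
            continue
        for category, cell in obstacles:     # S504-S505: update only the
            grid_map.setdefault(cell, {})    # layers matched with detections
            grid_map[cell][category] = grid_map[cell].get(category, 0) + 1
    return grid_map
```

A frame with no detections leaves the map untouched, so repeated sightings of the same obstacle accumulate only in its own matched layer.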
In the foregoing embodiment, each step of the method for constructing a grid map of the present application has been described in detail, and therefore, for each step of this embodiment, reference may be made to the description of the corresponding part of the foregoing embodiment, and beneficial effects of the foregoing embodiments may also be produced in this embodiment of the present application, which are not repeated herein.
According to the embodiments of the present application, the grid layer of the grid matched with an obstacle in the existing map is updated according to the information of that obstacle. This reduces the influence of missed detections and false detections on grid map construction, improves the accuracy of the obstacles output by the grid map, and improves the obstacle avoidance performance of a device that avoids obstacles using a map constructed by the grid map construction method of the present application.
The following describes a method for constructing a grid map according to the present application with reference to embodiments of different scenarios.
Application scenario 1: in a floor-sweeping scene, a sweeping robot collects environment images of objects on the indoor floor while cleaning. The robot walks to a certain position, acquires a current environment image, performs image recognition on it, and recognizes that the frame contains a trash can and a shoe. The position information of the trash can and the shoe in the grid map is determined from their pixel positions in the image. Local grid information is generated for the trash can and for the shoe respectively; the original probability value of the corresponding grid layer in the grid matched with the trash can is updated with the trash can's local grid information to obtain an updated probability value, and the original probability value of the corresponding grid layer in the grid matched with the shoe is updated with the shoe's local grid information in the same way. As the robot travels to other positions and angles, the trash can is recognized multiple times, and once it is mistakenly recognized as a chair, so the grid layer corresponding to the chair in the matched grid is updated once. Because the repeated correct recognitions dominate the accumulated probabilities, the influence of this single false recognition on the final grid map is reduced, the probability value output for the trash can is more reasonable, and the obstacle avoidance performance of the sweeping robot is improved accordingly.
Application scenario 2: in a shopping mall guide scene, a shopping guide robot collects environment images in the mall while guiding a user. The robot walks to a certain position, acquires a current environment image, performs image recognition on it, and recognizes that the frame contains a step. The position information of the step in the grid map is determined from the pixel position of the step in the image. Local grid information of the step is generated, and the original probability value of the corresponding grid layer in the grid matched with the step is updated with this local grid information to obtain an updated probability value. As the robot travels to other positions and angles, the step is recognized multiple times, and once it is mistakenly recognized as an electric wire, so the grid layer corresponding to the electric wire in the matched grid is updated once. The influence of this single false recognition on the final grid map is reduced, the probability value output for the step is more reasonable, and the obstacle avoidance performance of the shopping guide robot is improved accordingly.
Fig. 6 is a schematic flowchart of an obstacle avoidance method according to an exemplary embodiment of the present application. As shown in fig. 6, the method includes:
S601: determining a travel path in the moving process of the self-moving equipment;
S602: determining an obstacle avoidance area with obstacles on the travel path according to the obstacle information recorded in a plurality of grid layers of each grid in the grid map;
S603: performing obstacle avoidance processing for the obstacle avoidance area.
In this embodiment, the grid map may be a global grid map that has already been constructed, or a grid map that is still being constructed. Before the self-moving equipment travels, a travel path is planned according to the task created for it. Then, according to the obstacle information recorded in the plurality of grid layers of each grid in the grid map, it is determined whether an obstacle exists on the travel path; if so, the obstacle avoidance area of the obstacle is determined, and obstacle avoidance processing is performed for that area.
In the obstacle avoidance method, the obstacle avoidance area with obstacles on the travel path is determined according to the obstacle information recorded in the plurality of grid layers of each grid in the grid map. In an alternative embodiment, the obstacle information recorded in each grid layer includes category information and probability information of the obstacle. A target grid on the travel path is determined in the grid map. Whether obstacle avoidance processing is required is determined according to the category information of the obstacles recorded in each grid layer of the target grid; for example, when the sweeping robot recognizes that the obstacle in the target grid is a charger, it does not need to avoid the charger but should instead approach it for charging. When obstacle avoidance processing is required, the obstacle avoidance threshold corresponding to the category information of the obstacle is acquired, where the obstacle avoidance thresholds of different obstacles may differ. The probability information of the obstacles recorded in each grid layer of the target grid is compared with the corresponding obstacle avoidance threshold; if the target grid contains a grid layer whose probability exceeds its threshold, the area corresponding to the target grid is determined to be an obstacle avoidance area.
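The per-cell decision described above can be sketched as follows. The category whitelist and the threshold values are illustrative assumptions for the sketch, not values prescribed by the application.

```python
# Categories that should not be avoided (e.g. a charger the robot should dock with).
NO_AVOID_CATEGORIES = {"charger"}
# Per-category obstacle avoidance thresholds; different obstacles may differ.
AVOID_THRESHOLDS = {"chair": 0.6, "wire": 0.4}
DEFAULT_THRESHOLD = 0.5

def is_avoidance_area(cell_layers):
    """cell_layers maps obstacle category -> recorded probability for one grid cell.
    Returns True if any layer exceeds its category's obstacle avoidance threshold."""
    for category, probability in cell_layers.items():
        if category in NO_AVOID_CATEGORIES:
            continue                                   # no avoidance for this layer
        threshold = AVOID_THRESHOLDS.get(category, DEFAULT_THRESHOLD)
        if probability > threshold:
            return True
    return False
```

For instance, a cell whose only layer is a confidently detected charger is never an avoidance area, while a chair layer above its threshold makes the cell one.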
After the obstacle avoidance area is determined, obstacle avoidance processing needs to be performed for it. In an optional embodiment, an obstacle avoidance path is determined according to the obstacle avoidance area; when the self-moving equipment travels to the obstacle avoidance area, it switches from the current travel path to the obstacle avoidance path to continue traveling, and after passing through the obstacle avoidance area, it switches back from the obstacle avoidance path to the travel path. Alternatively, the self-moving equipment may re-plan an entirely new travel path that does not coincide with the original one at all.
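The switch-onto-detour, switch-back behaviour above can be sketched as a small decision function. The state flags and path names are assumptions made for illustration only.

```python
def next_path(on_detour, at_avoidance_area, past_avoidance_area):
    """Return which path the self-moving device should follow next.
    on_detour: currently following the obstacle avoidance path.
    at_avoidance_area: the device has reached the avoidance area.
    past_avoidance_area: the device has passed through the avoidance area."""
    if not on_detour and at_avoidance_area:
        return "obstacle_avoidance_path"   # switch from travel path onto the detour
    if on_detour and past_avoidance_area:
        return "travel_path"               # rejoin the original travel path
    # Otherwise keep following whatever path the device is already on.
    return "obstacle_avoidance_path" if on_detour else "travel_path"
```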
The following describes a method for avoiding obstacles according to an embodiment of the present application with reference to embodiments of different scenarios.
Application scenario 1: in a floor-cleaning scene, the sweeping robot plans in advance a travel path for completing the cleaning of the floor, and determines in the grid map the target grids of obstacles on that path. When the sweeping robot recognizes that the obstacle in a target grid is a charger, it does not need to avoid the charger but should approach it for charging. When it recognizes that the obstacle in a target grid is a chair, it needs to walk around the chair; the probability information of the grid layer corresponding to the chair in the target grid is compared with the obstacle avoidance threshold corresponding to chairs. If the probability value of the chair's grid layer in the target grid is greater than the obstacle avoidance threshold, the grid corresponding to the chair is determined to be an obstacle avoidance area, and the travel path of the sweeping robot is re-planned.
Application scenario 2: in a shopping mall guide scene, after a shopping guide robot receives a user's request to reach a destination, it plans in advance a travel path for guiding the user there, and determines in the grid map the target grids of obstacles on that path. The shopping guide robot recognizes that the obstacle in a target grid is a person, who must be avoided; the probability information of the grid layer corresponding to the person in the target grid is compared with the obstacle avoidance threshold corresponding to persons. If the probability value of the person's grid layer in the target grid is greater than the obstacle avoidance threshold, the grid corresponding to the person is determined to be an obstacle avoidance area, and the travel path of the shopping guide robot is re-planned.
Fig. 7 is a block diagram of a self-moving device according to an exemplary embodiment of the present application. The self-moving device includes one or more processors 702, one or more memories 703 storing computer programs, and a sensor 705. The sensor 705 is a visual sensor used for acquiring an environment image of the surrounding environment; necessary components such as an audio component 701 and a power supply component 704 may also be included.
One or more processors 702 to execute computer programs to:
information identifying at least one obstacle contained in the environmental image;
determining grid layers in grids respectively matched with at least one obstacle in the grid map; the grid in the grid map has multiple layers, and each layer stores the information of one obstacle;
and updating corresponding grid layers in grids respectively matched with the at least one obstacle in the grid map according to the information of the at least one obstacle so as to obtain an updated grid map.
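One way to realize "each grid has multiple layers, and each layer stores the information of one obstacle" is a per-cell mapping from obstacle category to that layer's probability. The class below is an assumed concrete layout for illustration; the application does not prescribe a particular data structure.

```python
class LayeredGridMap:
    """Grid map where each cell holds one probability layer per obstacle category."""

    def __init__(self, width, height, prior=0.5):
        self.prior = prior  # probability assumed for a layer with no observations yet
        # cells[(x, y)] maps obstacle category -> occupancy probability for that layer
        self.cells = {(x, y): {} for x in range(width) for y in range(height)}

    def probability(self, cell, category):
        """Probability recorded in the layer for `category`, or the prior if unseen."""
        return self.cells[cell].get(category, self.prior)

    def set_probability(self, cell, category, p):
        """Write an updated probability into the matching layer of the matching cell."""
        self.cells[cell][category] = p
```

Because layers are independent, updating the "shoe" layer of a cell leaves its "trash_can" layer untouched, which is what lets repeated observations of one category accumulate separately from another.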
Optionally, the one or more processors 702, information identifying at least one obstacle contained in the environmental image, may be configured to: performing image recognition on the environment image, and recognizing the category information of each obstacle contained in the environment image; and calculating the position information of each obstacle according to the pixel position of each obstacle in the environment image.
Optionally, the one or more processors 702, calculating the position information of each obstacle according to the pixel position of each obstacle in the environment image, may be configured to: determining the pixel position of each obstacle in the environment image; inquiring a mapping relation between the pixel position of each obstacle in the image and the distance from the obstacle to the self-moving equipment according to the pixel position of each obstacle in the environment image so as to obtain the distance from each obstacle to the self-moving equipment; and determining the position information of each obstacle according to the distance from each obstacle to the self-moving equipment and the position information of the self-moving equipment.
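The pixel-position-to-distance lookup described above can be sketched as follows, for an obstacle seen straight ahead of the device. The table values are made-up calibration data and the heading-based projection is an assumption for illustration.

```python
import math

# Hypothetical calibrated mapping: image row of the ground boundary -> distance (m).
PIXEL_ROW_TO_DISTANCE = {400: 0.5, 350: 1.0, 300: 2.0, 250: 4.0}

def obstacle_position(pixel_row, robot_x, robot_y, robot_heading_rad):
    """World position of an obstacle seen directly ahead at the given image row:
    query the mapping for the distance, then offset from the device's own pose."""
    distance = PIXEL_ROW_TO_DISTANCE[pixel_row]
    return (robot_x + distance * math.cos(robot_heading_rad),
            robot_y + distance * math.sin(robot_heading_rad))
```

For example, with the robot at the origin facing along the x-axis, an obstacle whose boundary appears at row 350 is placed 1 m ahead.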
Optionally, the one or more processors 702, determining the pixel location of each obstacle in the ambient image, may be configured to: and identifying a boundary line of each obstacle, which intersects with the ground, from the environment image as the pixel position of each obstacle in the environment image.
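Taking the boundary line where the obstacle meets the ground can be approximated, for a binary obstacle mask, as the bottom-most image row containing obstacle pixels. The list-of-rows mask representation is an assumption for this sketch.

```python
def ground_boundary_row(mask):
    """mask is a list of image rows (top to bottom) of 0/1 values.
    Return the index of the lowest row containing obstacle pixels,
    i.e. where the obstacle intersects the floor, or None if empty."""
    boundary = None
    for row_index, row in enumerate(mask):
        if any(row):
            boundary = row_index  # keep overwriting: the last hit is the lowest row
    return boundary
```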
Optionally, the one or more processors 702, according to the information of the at least one obstacle, update the corresponding grid layer in the grid respectively matched with the at least one obstacle in the grid map, and may be configured to: generating local grid information of the obstacles according to the type information and the position information of the obstacles aiming at each obstacle in at least one obstacle; and updating the corresponding grid layer in the grid matched with the obstacle in the grid map by using the local grid information of the obstacle.
Optionally, the one or more processors 702, using the local grid information of the obstacle, to update a corresponding grid layer in a grid matching the obstacle in the grid map, may be configured to: and updating the original probability value of the corresponding grid layer in the grid matched with the obstacle in the grid map by using the observation probability value in each grid in the local grid information of the obstacle to obtain the updated probability value of the corresponding grid layer of the grid matched with the obstacle.
Optionally, the one or more processors 702 update the original probability values of the corresponding grid layers in the grids matched with the obstacle in the grid map by using the observed probability value in each grid in the local grid information of the obstacle, to obtain updated probability values of the corresponding grid layers of the grids matched with the obstacle, and may be configured to: carrying out logarithm operation on the observation probability value in each grid in the local grid information of the obstacle to obtain an observation probability logarithm value; carrying out logarithm operation on the original probability values of the corresponding grid layers in the grids matched with the obstacles in the grid map to obtain original probability logarithm values; obtaining an update probability value of a corresponding grid layer of a grid matched with the obstacle according to an observation probability logarithm value corresponding to local grid information of the obstacle and an original probability logarithm value corresponding to the corresponding grid layer in the grid matched with the obstacle in the grid map; and performing inverse logarithm operation on the update probability logarithm value of the corresponding grid layer of the corresponding grid in the updated grid map to obtain the update probability value of the corresponding grid layer of the grid matched with the obstacle.
Optionally, the one or more processors 702, performing an inverse logarithm operation on the update probability logarithm values of the corresponding grid layers of the corresponding grids in the updated grid map to obtain update probability values of the corresponding grid layers of the grids matched with the obstacle, may be configured to: calculate the update probability value Pn of the corresponding grid layer of each grid matched with the obstacle by the formula:

Pn = (P1 × P2) / (P1 × P2 + (1 − P1) × (1 − P2))

where P1 is the original probability value, P2 is the observation probability value, and Pn is the update probability value.
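The sequence of steps above (take the log-odds of the observation probability and of the original probability, sum them, then invert) is the standard occupancy-grid log-odds update; a minimal sketch, with helper names chosen here for illustration:

```python
import math

def log_odds(p):
    """Logarithm step: probability -> log-odds."""
    return math.log(p / (1.0 - p))

def inverse_log_odds(l):
    """Inverse logarithm step: log-odds -> probability."""
    return 1.0 / (1.0 + math.exp(-l))

def updated_probability(original_p, observed_p):
    """Sum the original and observation log-odds, then invert; algebraically
    equivalent to Pn = P1*P2 / (P1*P2 + (1-P1)*(1-P2))."""
    return inverse_log_odds(log_odds(original_p) + log_odds(observed_p))
```

Note that an original value of 0.5 is neutral: the update then simply returns the observation probability, and a contradicting observation (e.g. 0.7 followed by 0.3) cancels back to 0.5.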
Optionally, the one or more processors 702, after obtaining the updated grid map, may be further configured to: and in the moving process, obstacle avoidance is carried out according to the updated grid map.
According to the method and device of the present application, the grid layer of the grid matched with an obstacle in the existing map is updated according to the information of that obstacle. This reduces the influence of missed detections and false detections on grid map construction, improves the accuracy of the obstacles output by the grid map, and improves the obstacle avoidance performance of a device that avoids obstacles using a map constructed by this method.
Correspondingly, an embodiment of the present application also provides a computer-readable storage medium storing a computer program which, when executed by the one or more processors 702, causes the one or more processors 702 to perform the steps in the corresponding method embodiment shown in fig. 1.
The self-moving equipment can be a robot, an unmanned vehicle and the like. Fig. 8 is a block diagram of a robot according to an exemplary embodiment of the present disclosure. As shown in fig. 8, the robot includes: a machine body 801; the machine body 801 is provided with one or more processors 803 and one or more memories 804 storing computer instructions. In addition, a sensor 802 is provided on the machine body 801. The sensor 802 is a vision sensor 802, such as a camera, etc., for acquiring an environmental image of the surrounding environment during the operation of the robot.
In addition to one or more processors 803 and one or more memories 804, the machine body 801 is provided with some basic components of the robot, such as audio components, power supply components, odometers, drive components, and the like. An audio component, which may be configured to output and/or input an audio signal. For example, the audio component includes a Microphone (MIC) configured to receive an external audio signal when the device in which the audio component is located is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signal may further be stored in a memory or transmitted via a communication component. In some embodiments, the audio assembly further comprises a speaker for outputting audio signals. The sensors 802 may also include a lidar sensor 802, a humidity sensor 802, and the like. Alternatively, the drive assembly may include drive wheels, drive motors, universal wheels, and the like. Alternatively, the sweeping assembly may include a sweeping motor, a sweeping brush, a dusting brush, a dust suction fan, and the like. The basic components and the structures of the basic components included in different robots are different, and the embodiments of the present application are only some examples.
It is noted that the audio components, the sensors 802, the one or more processors 803, and the one or more memories 804 may be disposed inside the machine body 801 or on the surface of the machine body 801.
The machine body 801 is the execution mechanism by which the robot performs its task, and can execute operations designated by the processor 803 in a given environment. The machine body reflects, to a certain extent, the appearance of the robot. In the present embodiment, the appearance of the robot is not limited and may be, for example, circular, elliptical, triangular, or a convex polygon.
The one or more memories 804 are primarily for storing computer programs that are executable by the one or more processors 803 to cause the one or more processors 803 to perform grid mapping operations. In addition to storing computer programs, the one or more memories 804 may also be configured to store other various data to support operations on the robot.
One or more processors 803, which may be considered control systems for the robot, may be used to execute computer programs stored in one or more memories 804 to perform grid mapping operations on the robot.
For example, the one or more memories 804 store computer programs that the one or more processors 803 may execute to:
information identifying at least one obstacle contained in the environmental image;
determining grid layers in grids respectively matched with at least one obstacle in the grid map; the grid in the grid map has multiple layers, and each layer stores the information of one obstacle;
and updating corresponding grid layers in grids respectively matched with the at least one obstacle in the grid map according to the information of the at least one obstacle so as to obtain an updated grid map.
Optionally, the one or more processors 803, identifying information of at least one obstacle contained in the environment image, may be configured to: performing image recognition on the environment image, and recognizing the category information of each obstacle contained in the environment image; and calculating the position information of each obstacle according to the pixel position of each obstacle in the environment image.
Optionally, the one or more processors 803, which calculate the position information of each obstacle according to the pixel position of each obstacle in the environment image, may be configured to: determining the pixel position of each obstacle in the environment image; inquiring a mapping relation between the pixel position of each obstacle in the image and the distance from the obstacle to the self-moving equipment according to the pixel position of each obstacle in the environment image so as to obtain the distance from each obstacle to the self-moving equipment; and determining the position information of each obstacle according to the distance from each obstacle to the self-moving equipment and the position information of the self-moving equipment.
Optionally, the one or more processors 803, which determine the pixel position of each obstacle in the environment image, may be configured to: and identifying boundary lines of each obstacle, which intersect with the ground, from the environment image as pixel positions of each obstacle in the environment image.
Optionally, the one or more processors 803, according to the information of the at least one obstacle, update the corresponding grid layers in the grids respectively matched with the at least one obstacle in the grid map, and may be configured to: generating local grid information of the obstacles according to the type information and the position information of the obstacles aiming at each obstacle in at least one obstacle; and updating the corresponding grid layer in the grid matched with the obstacle in the grid map by using the local grid information of the obstacle.
Optionally, the one or more processors 803, using the local grid information of the obstacle, update the corresponding grid layer in the grid matched with the obstacle in the grid map, and may be configured to: and updating the original probability value of the corresponding grid layer in the grid matched with the obstacle in the grid map by using the observation probability value in each grid in the local grid information of the obstacle to obtain the updated probability value of the corresponding grid layer of the grid matched with the obstacle.
Optionally, the one or more processors 803 update the original probability values of the corresponding grid layers in the grids matched with the obstacle in the grid map by using the observed probability value in each grid in the local grid information of the obstacle, to obtain updated probability values of the corresponding grid layers of the grids matched with the obstacle, which may be used to: carrying out logarithm operation on the observation probability value in each grid in the local grid information of the obstacle to obtain an observation probability logarithm value; carrying out logarithmic operation on the original probability values of the corresponding grid layers in the grids matched with the obstacles in the grid map to obtain original probability logarithmic values; obtaining an update probability value of a corresponding grid layer of the grid matched with the obstacle according to an observation probability logarithm value corresponding to the local grid information of the obstacle and an original probability logarithm value corresponding to the corresponding grid layer in the grid matched with the obstacle in the grid map; and performing inverse logarithm operation on the update probability logarithm value of the corresponding grid layer of the corresponding grid in the updated grid map to obtain the update probability value of the corresponding grid layer of the grid matched with the obstacle.
Optionally, the one or more processors 803 perform an inverse logarithm operation on the update probability logarithm values of the corresponding grid layers of the corresponding grids in the updated grid map to obtain the update probability values of the corresponding grid layers of the grids matched with the obstacle, which can be used to: calculate the update probability value Pn of the corresponding grid layer of each grid matched with the obstacle by the formula:

Pn = (P1 × P2) / (P1 × P2 + (1 − P1) × (1 − P2))

where P1 is the original probability value, P2 is the observation probability value, and Pn is the update probability value.
Optionally, the one or more processors 803, after obtaining the updated grid map, may further be configured to: and in the moving process, obstacle avoidance is carried out according to the updated grid map.
According to the method and device of the present application, the grid layer of the grid matched with an obstacle in the existing map is updated according to the information of that obstacle. This reduces the influence of missed detections and false detections on grid map construction, improves the accuracy of the obstacles output by the grid map, and improves the obstacle avoidance performance of a device that avoids obstacles using a map constructed by this method.
Correspondingly, an embodiment of the present application also provides a computer-readable storage medium storing a computer program which, when executed by the one or more processors 803, causes the one or more processors 803 to perform the steps of the corresponding method embodiment shown in fig. 1.
Fig. 9 is a block diagram of a self-moving device according to an exemplary embodiment of the present application. The self-moving device includes one or more processors 902, one or more memories 903 storing computer programs, and a sensor 905. The sensor 905 is a visual sensor used for acquiring an environment image of the surrounding environment; necessary components such as an audio component 901 and a power supply component 904 may also be included.
One or more processors 902 for executing a computer program for:
determining a travel path in the moving process of the self-moving equipment;
determining an obstacle avoidance area with obstacles on a travelling path according to obstacle information recorded in a plurality of grid layers of each grid in a grid map;
and carrying out obstacle avoidance processing aiming at the obstacle avoidance area.
Optionally, the one or more processors 902, where the obstacle information recorded in each grid layer includes category information and probability information of an obstacle, and determine, according to the obstacle information recorded in the multiple grid layers of each grid in the grid map, an obstacle avoidance area where an obstacle exists on a travel path, and may be configured to:
determining a target grid on a travel path in the grid map;
determining whether obstacle avoidance processing is needed or not according to the type information of the obstacles recorded by each grid layer in the target grid;
when obstacle avoidance processing is required, acquiring an obstacle avoidance threshold corresponding to the type information of the obstacle;
comparing probability information of the obstacles recorded by each grid layer in the target grid with an obstacle avoidance threshold;
and if the grid layer larger than the obstacle avoidance threshold exists in the target grid, determining that the area corresponding to the target grid is the obstacle avoidance area.
Optionally, the one or more processors 902, performing obstacle avoidance processing on the obstacle avoidance area, may be configured to:
determining an obstacle avoidance path according to the obstacle avoidance area;
when the mobile equipment travels to the obstacle avoidance area, the current travel path is switched to the obstacle avoidance path to continue traveling;
and after the self-moving equipment passes through the obstacle avoidance area, switching from the current obstacle avoidance path to the traveling path to continue traveling.
Correspondingly, an embodiment of the present application also provides a computer-readable storage medium storing a computer program which, when executed by the one or more processors 902, causes the one or more processors 902 to perform the steps in the corresponding method embodiment of fig. 6.
The self-moving equipment can be a robot, an unmanned vehicle, and the like. Fig. 10 is a block diagram of another robot according to an exemplary embodiment of the present disclosure. As shown in fig. 10, the robot includes: a machine body 1001; the machine body 1001 is provided with one or more processors 1003 and one or more memories 1004 which store computer instructions. In addition, a sensor 1002 is provided on the machine body 1001. The sensor 1002 is a visual sensor, such as a camera, used for acquiring a plurality of environment images while the robot is in motion during its operation.
In addition to one or more processors 1003 and one or more memories 1004, the machine body 1001 is provided with some basic components of the robot, such as an audio component, a power supply component, an odometer, a drive component, and the like. An audio component, which may be configured to output and/or input an audio signal. For example, the audio component includes a Microphone (MIC) configured to receive an external audio signal when the device in which the audio component is located is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signal may further be stored in a memory or transmitted via a communication component. In some embodiments, the audio assembly further comprises a speaker for outputting audio signals. The sensors 1002 may also include a lidar sensor 1002, a humidity sensor 1002, and the like, and the vision sensor 1002 may be a camera, a video camera, and the like. Alternatively, the drive assembly may include drive wheels, drive motors, universal wheels, and the like. Alternatively, the sweeping assembly may include a sweeping motor, a sweeping brush, a dusting brush, a dust suction fan, and the like. The basic components and the structures of the basic components included in different robots are different, and the embodiments of the present application are only some examples.
It should be noted that the audio component, the sensor 1002, the one or more processors 1003, and the one or more memories 1004 may be disposed inside the machine body 1001, or may be disposed on the surface of the machine body 1001.
The machine body 1001 is the actuator by which the robot performs its task, and can execute operations designated by the processor 1003 in a given environment. The machine body reflects, to a certain extent, the appearance of the robot. In the present embodiment, the appearance of the robot is not limited and may be, for example, circular, elliptical, triangular, or a convex polygon.
The one or more memories 1004 are primarily for storing computer programs that are executable by the one or more processors 1003 to cause the one or more processors 1003 to perform obstacle avoidance operations. In addition to storing computer programs, the one or more memories 1004 may also be configured to store other various data to support operations on the robot.
The one or more processors 1003, which may be considered the control system of the robot, may be configured to execute the computer programs stored in the one or more memories 1004 to perform obstacle avoidance operations on the robot.
identifying information of at least one obstacle contained in the environment image;
determining grid layers in grids respectively matched with at least one obstacle in a grid map; the grid in the grid map has multiple layers, and each layer stores the information of one obstacle;
and updating corresponding grid layers in grids respectively matched with the at least one obstacle in the grid map according to the information of the at least one obstacle so as to obtain an updated grid map.
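The multi-layer grid structure described above can be sketched as follows. This is an illustrative Python sketch rather than code from the patent; the class name, category labels, and default prior are assumptions:

```python
# Illustrative sketch of a grid map in which each cell holds one layer per
# obstacle category, and each layer stores that category's probability value.
from collections import defaultdict

class LayeredGridMap:
    def __init__(self, prior=0.5):
        self.prior = prior                 # assumed prior for unseen layers
        # (x, y) -> {category: probability}; each category is one "layer"
        self.cells = defaultdict(dict)

    def update(self, x, y, category, probability):
        """Record an obstacle's information in the matching grid layer only,
        leaving the other layers of the same grid untouched."""
        self.cells[(x, y)][category] = probability

    def layer(self, x, y, category):
        """Read one layer of one grid, falling back to the prior."""
        return self.cells[(x, y)].get(category, self.prior)

gmap = LayeredGridMap()
gmap.update(3, 4, "wire", 0.7)   # one layer of grid (3, 4) stores wire info
gmap.update(3, 4, "sock", 0.2)   # another layer of the same grid stores sock info
```

Because each category writes to its own layer, updating the wire probability of a grid does not overwrite the sock probability stored in the same grid.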
Optionally, the one or more processors 1003, when determining, according to the obstacle information recorded in the multiple grid layers of each grid in the grid map, an obstacle avoidance area in which an obstacle exists on the travel path, where the obstacle information includes type information and probability information of the obstacle, may be configured to:
determining a target grid on the travel path in the grid map;
determining whether obstacle avoidance processing is needed or not according to the type information of the obstacles recorded by each grid layer in the target grid;
when obstacle avoidance processing is required, acquiring an obstacle avoidance threshold corresponding to the type information of the obstacle;
comparing probability information of the obstacles recorded by each grid layer in the target grid with an obstacle avoidance threshold;
and if the grid layer larger than the obstacle avoidance threshold exists in the target grid, determining that the area corresponding to the target grid is the obstacle avoidance area.
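The per-category threshold comparison in the steps above might look like the following sketch; the category names and threshold values are hypothetical, not taken from the patent:

```python
# Hypothetical per-category obstacle avoidance thresholds. A threshold above 1
# (e.g. for "carpet") can never be exceeded, so that category is never avoided.
AVOID_THRESHOLDS = {"wire": 0.3, "trash": 0.6, "carpet": 1.1}

def is_avoidance_area(target_cell_layers, thresholds=AVOID_THRESHOLDS):
    """target_cell_layers: {category: probability} for one grid on the path.
    The grid's area is an obstacle avoidance area if any layer's probability
    exceeds the threshold configured for that obstacle category."""
    for category, probability in target_cell_layers.items():
        threshold = thresholds.get(category)
        if threshold is None:        # category requires no avoidance processing
            continue
        if probability > threshold:  # compare layer probability with threshold
            return True
    return False

print(is_avoidance_area({"wire": 0.45, "carpet": 0.9}))  # True: wire layer > 0.3
```

Looking up the threshold by obstacle type mirrors the idea that different obstacles tolerate different certainty levels before a detour is worthwhile.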
Optionally, the one or more processors 1003, performing obstacle avoidance processing on the obstacle avoidance area, may be configured to:
determining an obstacle avoidance path according to the obstacle avoidance area;
when the self-moving device travels to the obstacle avoidance area, switching from the travel path to the obstacle avoidance path to continue traveling;
and after the self-moving equipment passes through the obstacle avoidance area, switching from the obstacle avoidance path to the traveling path to continue traveling.
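A minimal sketch of the path-switching behavior, assuming paths are lists of grid cells (an illustration under those assumptions, not the patented implementation):

```python
def plan_route(travel_path, avoidance_cells, detour):
    """travel_path: ordered grid cells; avoidance_cells: set of cells forming
    the obstacle avoidance area; detour: cells routed around that area.
    Returns the executed route: the original path with the blocked stretch
    replaced by the detour, resuming the travel path afterwards."""
    route = []
    inserted = False
    for cell in travel_path:
        if cell in avoidance_cells:
            if not inserted:
                route.extend(detour)   # switch to the obstacle avoidance path
                inserted = True
            continue                   # skip cells inside the avoidance area
        route.append(cell)             # back on the original travel path
    return route

path = [(0, 0), (1, 0), (2, 0), (3, 0)]
print(plan_route(path, {(1, 0), (2, 0)}, [(1, 1), (2, 1)]))
# [(0, 0), (1, 1), (2, 1), (3, 0)]
```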
Correspondingly, an embodiment of the present application further provides a computer-readable storage medium storing a computer program that, when executed by the one or more processors 1003, causes the one or more processors 1003 to perform the steps of the method embodiment corresponding to fig. 6.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media, such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.
Claims (17)
1. A method for constructing a grid map, applicable to a self-moving device, characterized by comprising the following steps:
acquiring an environment image of a surrounding environment;
identifying information of at least one obstacle contained in the environment image;
determining grid layers in grids respectively matched with the at least one obstacle in the grid map; the grid in the grid map has multiple layers, and each layer stores information of one obstacle;
for each obstacle in the at least one obstacle, generating local grid information of the obstacle according to the information of the obstacle, wherein each grid in the local grid information of the obstacle comprises an observation probability value;
and updating the original probability value of the corresponding grid layer in the grid matched with the obstacle in the grid map by using the observation probability value to obtain the updated probability value of the corresponding grid layer of the grid matched with the obstacle so as to obtain the updated grid map.
2. The method of claim 1, wherein identifying information of at least one obstacle contained in the environmental image comprises:
carrying out image recognition on an environment image, and recognizing the category information of each obstacle contained in the environment image;
and calculating the position information of each obstacle according to the pixel position of each obstacle in the environment image.
3. The method of claim 2, wherein calculating the position information of each obstacle according to the pixel position of each obstacle in the environment image comprises:
determining the pixel position of each obstacle in the environment image;
inquiring a mapping relation between the pixel position of each obstacle in the image and the distance from the obstacle to the self-moving equipment according to the pixel position of each obstacle in the environment image so as to obtain the distance from each obstacle to the self-moving equipment;
and determining the position information of each obstacle according to the distance from each obstacle to the self-moving equipment and the position information of the self-moving equipment.
4. The method of claim 3, wherein determining the pixel location of each obstacle in the environmental image comprises:
and identifying boundary lines of each obstacle, which intersect with the ground, from the environment image as pixel positions of each obstacle in the environment image.
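Claims 3 and 4 describe mapping the pixel position of the obstacle/ground boundary line to a distance from the self-moving device. A sketch of such a lookup is given below; the calibration table values are purely hypothetical and would in practice come from calibrating the fixed monocular camera:

```python
import bisect
import math

# Hypothetical calibration: pixel row of the obstacle/ground boundary in a
# 480-row image -> distance (m) from the camera. Rows nearer the image
# bottom (larger row index) correspond to closer obstacles.
CALIB_ROWS = [330, 350, 380, 420, 479]       # ascending boundary rows
CALIB_DIST = [3.00, 1.50, 0.80, 0.40, 0.15]  # matching distances in meters

def distance_from_pixel_row(row):
    """Query the mapping from boundary pixel row to obstacle distance,
    interpolating linearly between neighboring calibration entries."""
    i = bisect.bisect_left(CALIB_ROWS, row)
    if i == 0:
        return CALIB_DIST[0]
    if i >= len(CALIB_ROWS):
        return CALIB_DIST[-1]
    t = (row - CALIB_ROWS[i - 1]) / (CALIB_ROWS[i] - CALIB_ROWS[i - 1])
    return CALIB_DIST[i - 1] + t * (CALIB_DIST[i] - CALIB_DIST[i - 1])

def obstacle_position(robot_xy, heading_rad, row):
    """Combine the obstacle distance with the self-moving device's own pose
    to obtain the obstacle's position information in the map frame."""
    d = distance_from_pixel_row(row)
    return (robot_xy[0] + d * math.cos(heading_rad),
            robot_xy[1] + d * math.sin(heading_rad))
```

This assumes the camera is rigidly mounted, so a given boundary row always corresponds to the same ground distance, which is what makes a precomputed mapping table workable.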
5. The method of claim 2, wherein generating, for each of the at least one obstacle, local grid information for the obstacle from the information for the obstacle comprises:
and generating local grid information of the obstacles according to the type information and the position information of the obstacles aiming at each obstacle in the at least one obstacle.
6. The method of claim 5, wherein updating the original probability values of the respective grid layers in the grid map that matches the obstacle with the observation probability values to obtain updated probability values of the respective grid layers of the grid that matches the obstacle comprises:
carrying out logarithm operation on the observation probability value in each grid in the local grid information of the obstacle to obtain an observation probability logarithm value;
carrying out logarithmic operation on the original probability values of the corresponding grid layers in the grids matched with the obstacles in the grid map to obtain original probability logarithm values;
obtaining an update probability value of a corresponding grid layer of the grid matched with the obstacle according to an observation probability logarithm value corresponding to the local grid information of the obstacle and an original probability logarithm value corresponding to a corresponding grid layer in the grid matched with the obstacle in the grid map;
and performing antilog operation on the update probability logarithm value of the corresponding grid layer of the corresponding grid in the updated grid map to obtain the update probability value of the corresponding grid layer of the grid matched with the obstacle.
7. The method of claim 6, wherein performing an anti-log operation on the update probability logarithm of the respective grid layer of the corresponding grid in the updated grid map to obtain the update probability value of the respective grid layer of the grid matching the obstacle comprises:
The updated probability value P_n of the respective grid layer of each grid matched with the obstacle is calculated as follows:

P_n = (P_1 × P_2) / (P_1 × P_2 + (1 − P_1) × (1 − P_2))

wherein P_1 is the original probability value, P_2 is the observation probability value, and P_n is the updated probability value.
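The log-odds update of claims 6 and 7 can be sketched as follows; the function names are illustrative:

```python
import math

def log_odds(p):
    """Logarithm operation on a probability value (log odds)."""
    return math.log(p / (1.0 - p))

def inv_log_odds(l):
    """Antilog operation: recover a probability from its log odds."""
    return 1.0 / (1.0 + math.exp(-l))

def update_probability(p_original, p_observed):
    """Claim 6's procedure: convert the original and observation probability
    values to log odds, sum them, and apply the antilog operation.
    Algebraically this equals P1*P2 / (P1*P2 + (1-P1)*(1-P2))."""
    return inv_log_odds(log_odds(p_original) + log_odds(p_observed))

print(round(update_probability(0.5, 0.8), 3))  # 0.8: a 0.5 prior is neutral
```

Working in log odds turns the multiplicative fusion of probabilities into a simple addition per observation, which is why occupancy-grid updates are commonly stored in that form.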
8. The method of any of claims 1-7, further comprising, after obtaining the updated grid map:
and in the moving process, obstacle avoidance is carried out according to the updated grid map.
9. An obstacle avoidance method, applicable to a self-moving device, characterized by comprising the following steps:
determining a target grid corresponding to a traveling path in the moving process of the self-moving equipment in a grid map;
acquiring obstacle avoidance threshold values corresponding to the obstacle category information according to the obstacle category information recorded by each grid layer in the target grid;
determining a region corresponding to the target grid as an obstacle avoidance region according to the probability information of the obstacle recorded by each grid layer in the target grid and the obstacle avoidance threshold;
and carrying out obstacle avoidance processing on the obstacle avoidance area.
10. The method of claim 9, wherein obtaining an obstacle avoidance threshold corresponding to the category information of the obstacle according to the category information of the obstacle recorded in each grid layer of the target grid comprises:
determining whether obstacle avoidance processing is needed or not according to the type information of the obstacles recorded by each grid layer in the target grid;
and when obstacle avoidance processing is required, acquiring an obstacle avoidance threshold corresponding to the type information of the obstacle.
11. The method of claim 9, wherein determining, according to the probability information of the obstacle recorded in each grid layer of the target grid and the obstacle avoidance threshold, that the area corresponding to the target grid is an obstacle avoidance area includes:
comparing the probability information of the obstacles recorded by each grid layer in the target grid with the obstacle avoidance threshold;
and if the grid layer larger than the obstacle avoidance threshold exists in the target grid, determining that the area corresponding to the target grid is the obstacle avoidance area.
12. The method according to claim 9, wherein performing obstacle avoidance processing on the obstacle avoidance area includes:
determining an obstacle avoidance path according to the obstacle avoidance area;
when the self-moving device travels to the obstacle avoidance area, switching from the current travel path to the obstacle avoidance path to continue traveling;
and after the self-moving device passes through the obstacle avoidance area, switching from the current obstacle avoidance path back to the travel path to continue traveling.
13. An autonomous mobile device, comprising: the machine body is provided with a sensor, one or more processors and one or more memories for storing computer programs;
the sensor is used for acquiring an environment image of the surrounding environment;
the one or more processors to execute the computer program to:
identifying information of at least one obstacle contained in the environment image;
determining grid layers in grids respectively matched with the at least one obstacle in the grid map; the grid in the grid map has multiple layers, and each layer stores information of one obstacle;
for each obstacle in the at least one obstacle, generating local grid information of the obstacle according to the information of the obstacle, wherein each grid in the local grid information of the obstacle comprises an observation probability value;
and updating the original probability value of the corresponding grid layer in the grid matched with the obstacle in the grid map by using the observation probability value to obtain the updated probability value of the corresponding grid layer of the grid matched with the obstacle so as to obtain the updated grid map.
14. The self-moving device according to claim 13, wherein the self-moving device is a sweeping robot, the sensor is a monocular camera, a bumper is mounted on the front side wall of the machine body, and the monocular camera is arranged in the bumper.
15. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by one or more processors, causes the one or more processors to perform acts comprising:
acquiring an environment image of the surrounding environment;
identifying information of at least one obstacle contained in the environment image;
determining grid layers in grids respectively matched with the at least one obstacle in the grid map; the grid in the grid map has multiple layers, and each layer stores information of one obstacle;
for each obstacle in the at least one obstacle, generating local grid information of the obstacle according to the information of the obstacle, wherein each grid in the local grid information of the obstacle comprises an observation probability value;
and updating the original probability value of the corresponding grid layer in the grid matched with the obstacle in the grid map by using the observation probability value to obtain the updated probability value of the corresponding grid layer of the grid matched with the obstacle so as to obtain the updated grid map.
16. An autonomous mobile device, comprising: the machine body is provided with one or more processors and one or more memories for storing computer programs;
the one or more processors to execute the computer program to:
determining a target grid corresponding to a traveling path in the moving process of the self-moving equipment in a grid map;
acquiring obstacle avoidance threshold values corresponding to the obstacle category information according to the obstacle category information recorded by each grid layer in the target grid;
determining a region corresponding to the target grid as an obstacle avoidance region according to the probability information of the obstacle recorded by each grid layer in the target grid and the obstacle avoidance threshold;
and carrying out obstacle avoidance processing aiming at the obstacle avoidance area.
17. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by one or more processors, causes the one or more processors to perform acts comprising:
determining, in a grid map, a target grid corresponding to a traveling path of the self-moving device during movement;
acquiring obstacle avoidance threshold values corresponding to the obstacle category information according to the obstacle category information recorded by each grid layer in the target grid;
determining a region corresponding to the target grid as an obstacle avoidance region according to the probability information of the obstacle recorded by each grid layer in the target grid and the obstacle avoidance threshold;
and carrying out obstacle avoidance processing aiming at the obstacle avoidance area.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811158962.4A CN110968083B (en) | 2018-09-30 | 2018-09-30 | Method for constructing grid map, method, device and medium for avoiding obstacles |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811158962.4A CN110968083B (en) | 2018-09-30 | 2018-09-30 | Method for constructing grid map, method, device and medium for avoiding obstacles |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110968083A CN110968083A (en) | 2020-04-07 |
CN110968083B true CN110968083B (en) | 2023-02-10 |
Family
ID=70029057
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811158962.4A Active CN110968083B (en) | 2018-09-30 | 2018-09-30 | Method for constructing grid map, method, device and medium for avoiding obstacles |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110968083B (en) |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113573236B (en) * | 2020-04-29 | 2024-04-05 | 亚信科技(中国)有限公司 | Method and device for evaluating confidence of positioning result |
CN113865598A (en) * | 2020-06-30 | 2021-12-31 | 华为技术有限公司 | Positioning map generation method, positioning method and positioning device |
CN111781936B (en) * | 2020-08-07 | 2024-06-28 | 深圳中智永浩机器人有限公司 | Robot path planning method, robot path planning device, robot and computer readable storage medium |
CN112107257B (en) * | 2020-09-30 | 2022-09-20 | 北京小狗吸尘器集团股份有限公司 | Intelligent cleaning equipment and obstacle avoidance path planning method and device thereof |
CN112380942B (en) * | 2020-11-06 | 2024-08-06 | 北京石头创新科技有限公司 | Method, device, medium and electronic equipment for identifying obstacle |
CN112506196B (en) * | 2020-12-07 | 2022-09-20 | 合肥工业大学 | Robot obstacle avoidance method and system based on priori knowledge |
CN113520246B (en) * | 2021-07-30 | 2023-04-04 | 珠海一微半导体股份有限公司 | Mobile robot compensation cleaning method and system |
CN113670292B (en) * | 2021-08-10 | 2023-10-20 | 追觅创新科技(苏州)有限公司 | Map drawing method and device, sweeper, storage medium and electronic device |
CN113848943B (en) * | 2021-10-18 | 2023-08-08 | 追觅创新科技(苏州)有限公司 | Grid map correction method and device, storage medium and electronic device |
CN116088489B (en) * | 2021-11-05 | 2024-02-27 | 北京三快在线科技有限公司 | Grid map updating method and device |
CN116148879B (en) * | 2021-11-22 | 2024-05-03 | 珠海一微半导体股份有限公司 | Method for improving obstacle marking precision by robot |
CN114779787A (en) * | 2022-05-23 | 2022-07-22 | 杭州萤石软件有限公司 | Grid map construction method, robot and machine-readable storage medium |
CN115469312A (en) * | 2022-09-15 | 2022-12-13 | 重庆长安汽车股份有限公司 | Method and device for detecting passable area of vehicle, electronic device and storage medium |
CN115797817B (en) * | 2023-02-07 | 2023-05-30 | 科大讯飞股份有限公司 | Obstacle recognition method, obstacle display method, related equipment and system |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006205348A (en) * | 2005-01-31 | 2006-08-10 | Sony Corp | Obstacle avoiding device, obstacle avoiding method, obstacle avoiding program, and movable robot device |
JP5560794B2 (en) * | 2010-03-16 | 2014-07-30 | ソニー株式会社 | Control device, control method and program |
JP5962137B2 (en) * | 2012-03-29 | 2016-08-03 | 富士通株式会社 | Guide route search method, guide route search device, and guide route search program |
CN103544496B (en) * | 2012-07-12 | 2016-12-21 | 同济大学 | The robot scene recognition methods merged with temporal information based on space |
CN108344414A (en) * | 2017-12-29 | 2018-07-31 | 中兴通讯股份有限公司 | A kind of map structuring, air navigation aid and device, system |
- 2018-09-30 CN CN201811158962.4A patent/CN110968083B/en active Active
Also Published As
Publication number | Publication date |
---|---|
CN110968083A (en) | 2020-04-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110968083B (en) | Method for constructing grid map, method, device and medium for avoiding obstacles | |
CN111197985B (en) | Area identification method, path planning method, device and storage medium | |
US20220095872A1 (en) | System for spot cleaning by a mobile robot | |
CN109682368B (en) | Robot, map construction method, positioning method, electronic device and storage medium | |
EP3872528B1 (en) | Travel control method, device, and storage medium | |
CN113284240B (en) | Map construction method and device, electronic equipment and storage medium | |
US10102429B2 (en) | Systems and methods for capturing images and annotating the captured images with information | |
KR102275300B1 (en) | Moving robot and control method thereof | |
KR102298582B1 (en) | Artificial intelligence robot for determining cleaning route using sensor data and method for the same | |
US20200019156A1 (en) | Mobile Robot Cleaning System | |
US11960297B2 (en) | Robot generating map based on multi sensors and artificial intelligence and moving based on map | |
CN111103875B (en) | Method, apparatus and storage medium for avoiding | |
US20190184569A1 (en) | Robot based on artificial intelligence, and control method thereof | |
KR20210029586A (en) | Method of slam based on salient object in image and robot and cloud server implementing thereof | |
US20220257074A1 (en) | Mobile robot using artificial intelligence and controlling method thereof | |
JP7539949B2 (en) | Autonomous Mobile Robot for Coverage Path Planning | |
US20210405650A1 (en) | Robot generating map and configuring correlation of nodes based on multi sensors and artificial intelligence, and moving based on map, and method of generating map | |
CN110946511A (en) | Method, apparatus and storage medium for determining slippage | |
JP2020502675A (en) | Navigation and self-locating method for autonomously moving processing device | |
WO2021246170A1 (en) | Information processing device, information processing system and method, and program | |
CN112987716A (en) | Operation control method, device and system and robot | |
CN116661458A (en) | Robot travel control method, robot, and storage medium | |
JP2024529082A (en) | NAVIGATION METHOD AND SELF-MOBILITATING DEVICE - Patent application | |
KR20200054694A (en) | Cleaning apparatus and controlling method thereof | |
CN114942644A (en) | Method for controlling robot to clean and robot |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||