CN116539022A - Automatic robot mapping method and automatic robot mapping device - Google Patents

Automatic robot mapping method and automatic robot mapping device

Info

Publication number
CN116539022A
Authority
CN
China
Prior art keywords
robot
boundary point
navigation
candidate boundary
map
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310387292.8A
Other languages
Chinese (zh)
Inventor
杜昌源 (Du Changyuan)
陈明亮 (Chen Mingliang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou On Bright Electronics Co Ltd
Original Assignee
Guangzhou On Bright Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou On Bright Electronics Co Ltd filed Critical Guangzhou On Bright Electronics Co Ltd
Priority to CN202310387292.8A priority Critical patent/CN116539022A/en
Publication of CN116539022A publication Critical patent/CN116539022A/en
Pending legal-status Critical Current


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/38: Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804: Creation or updating of map data
    • G01C21/20: Instruments for performing navigational calculations


Abstract

The present application provides an automatic robot mapping method and an automatic robot mapping device. The automatic robot mapping method includes the following steps: acquiring data of a robot map and pose information of a robot; determining at least one candidate boundary point based on the data of the robot map, wherein a candidate boundary point is a point representing the junction between an unknown zone and a passable zone; determining, based on the pose information of the robot, which of the at least one candidate boundary point the robot can move to and reach; determining, as an optimal boundary point, the candidate boundary point with the optimal navigation path from the robot among the reachable candidate boundary points; and constructing a navigation map during navigation with the optimal boundary point as a navigation target point.

Description

Automatic robot mapping method and automatic robot mapping device
Technical Field
The present application relates to the field of robotics, and more particularly, to an automatic robot mapping method and an automatic robot mapping apparatus.
Background
Most traditional autonomous mobile robots are only capable of navigating and avoiding obstacles on an existing map, so the mapping work itself still requires a human to control the robot's movement. Current automatic mapping schemes adopt the potential field method or the wall-following method; these require preset conditions to handle the various working scenes, are complex, and yet deliver poor actual performance, reliability, and universality. There are also simple, crude methods, such as boustrophedon ("ox-plowing") coverage and collision-based wandering, which move the robot along relatively fixed paths with a low degree of intelligence. Existing automatic robot mapping solutions therefore cannot meet practical application requirements, and most of them lack navigation and obstacle avoidance capabilities during the automatic mapping process.
Disclosure of Invention
One aspect of the present application provides an automatic robot mapping method, including: acquiring data of a robot map and pose information of a robot; determining at least one candidate boundary point based on the data of the robot map, wherein a candidate boundary point is a point representing the junction between an unknown zone and a passable zone; determining, based on the pose information of the robot, which of the at least one candidate boundary point the robot can move to and reach; determining, as an optimal boundary point, the candidate boundary point with the optimal navigation path from the robot among the reachable candidate boundary points; and constructing a navigation map during navigation with the optimal boundary point as a navigation target point.
Another aspect of the present application provides an automatic robot mapping apparatus, including: a data acquisition module configured to acquire data of a robot map and pose information of the robot; a candidate boundary point determination module configured to determine at least one candidate boundary point based on the data of the robot map, wherein a candidate boundary point is a point representing the junction between an unknown zone and a passable zone; a reachability determination module configured to determine, based on the pose information of the robot, which of the at least one candidate boundary point the robot can move to and reach; an optimal boundary point determination module configured to determine, as an optimal boundary point, the candidate boundary point with the optimal navigation path from the robot among the reachable candidate boundary points; and a map construction module configured to construct a navigation map during navigation with the optimal boundary point as a navigation target point.
Another aspect of the present application provides an automatic robot mapping apparatus, including: a processor; and a memory storing computer readable instructions, wherein the processor is configured to execute the computer readable instructions to perform the automatic robot mapping method according to an embodiment of the present application.
Yet another aspect of the present application provides a computer-readable storage medium comprising: computer readable instructions, which when executed by a processor, cause the processor to perform a method of robot auto-mapping according to an embodiment of the present application.
According to the automatic robot mapping method and apparatus of the present application, automatic robot mapping can be achieved without human participation, based on the boundary-point idea, with the advantages of low computing power requirements, high mapping efficiency, good mapping completeness, and low cost.
Drawings
The aspects of the present application are best understood from the following detailed description when read with the accompanying drawing figures. Note that the various features are not necessarily drawn to scale according to industry standard practices. Like reference numerals describe like components throughout the several views. Like numerals having different letter suffixes may represent different instances of similar components. In the drawings:
FIG. 1 shows a flow chart of a method of robotic auto-mapping according to an embodiment of the present application;
FIG. 2 illustrates a schematic diagram of a relationship of a picture pixel coordinate system to a navigational world coordinate system according to an embodiment of the present application;
FIG. 3 shows a more specific schematic flow chart of a method of robotic auto-mapping according to an embodiment of the present application;
fig. 4 shows an application example of a method for robot auto-mapping according to an embodiment of the present application;
FIG. 5 shows an example block diagram of an automatic robot mapping apparatus according to an embodiment of the present application; and
fig. 6 shows a schematic block diagram of a computer system that may implement a method of robotic auto-mapping according to an embodiment of the present application.
Detailed Description
Features and exemplary embodiments of various aspects of the present application are described in detail below. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced without some of these specific details. The following description of the embodiments is merely intended to provide a better understanding of the present application by showing an example of the present application. The present application is in no way limited to any particular configuration and algorithm set forth below, but rather covers any modification, substitution, or improvement of elements, components, and algorithms without departing from the spirit of the present application. In the drawings and following description, well-known structures and techniques are not shown in order to avoid unnecessarily obscuring the present application.
The process of robot mapping is in fact a process of continuously moving the robot into unknown areas and updating the map through lidar scanning. According to the automatic robot mapping method and apparatus of the present application, in the mapping operating mode, boundary points on the robot map, i.e., points representing the junction between the unknown area and the passable area, are sent as target points to the robot's existing navigation function package, so that the robot builds the map while moving and navigating, without manual participation.
The robot automatic mapping method and the robot automatic mapping apparatus in the present application will be described in detail according to embodiments of the present application with reference to the accompanying drawings.
Fig. 1 shows a flow chart of a method for robot auto-mapping according to an embodiment of the present application. As shown in fig. 1, a method 100 for robot auto-mapping according to an embodiment of the present application includes S102-S110.
At S102, data of a robot map and pose information of a robot are acquired.
The robot map is an environment map captured by the robot with a scanning device (for example, a lidar). It is a static two-dimensional grid map encoding occupancy states, and its data consist of pixel values. In general, pixels of the unknown region may be represented by a negative value, pixels of the passable region by zero, and pixels of the obstacle-occupied region by a positive value, where the absolute value of the negative value is smaller than the positive value. As an example, there may be three pixel values, -1, 0, and 100, respectively representing the spatial state of a grid cell: unknown zone, passable zone, and obstacle-occupied zone. It should be appreciated that maps in other assignment formats can be initialized and reassigned to this convention. Pose information of the robot includes, but is not limited to, the robot's position and heading angle.
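As an illustration of such reassignment (a sketch only; the rule that any positive raw value marks an obstacle is an assumption for the example):

```python
import numpy as np

def normalize_grid(raw: np.ndarray) -> np.ndarray:
    """Reassign a grid in another format to the -1 / 0 / 100 convention."""
    grid = np.full(raw.shape, -1, dtype=np.int8)  # default: unknown zone
    grid[raw == 0] = 0                            # passable zone
    grid[raw > 0] = 100                           # obstacle-occupied zone
    return grid
```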
At S104, at least one candidate boundary point is determined based on the data of the robot map, wherein the candidate boundary point is a point representing a boundary between the unknown zone and the passable zone.
Boundary points lie on the junction between the unknown area and the passable area; once the robot moves to such a point, the surrounding unknown area is scanned by the robot's sensors and merged into the map. When no boundary points remain on the map, there is no unknown area left: the robot has completed its exploration of the unknown environment and has constructed complete map information.
In some embodiments, determining at least one candidate boundary point based on the data of the robot map includes: for each pixel whose value is zero (i.e., a passable-area pixel) in the data of the robot map, summing the pixel values of its two adjacent pixels along the X direction and, separately, along the Y direction of the picture pixel coordinate system; and determining the pixel as a candidate boundary point if the sum obtained in the X direction or in the Y direction is negative.
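A minimal sketch of this neighbor-sum test under the -1/0/100 convention described above (an illustration, not the patent's reference implementation):

```python
import numpy as np

def find_candidate_boundary_points(grid: np.ndarray):
    """Return (x, y) pixels of passable cells adjoining the unknown zone.

    Because |-1| < 100, a negative neighbor sum can only occur when an
    unknown cell (-1) is adjacent and no obstacle cell (100) outweighs it.
    """
    candidates = []
    h, w = grid.shape
    for y in range(1, h - 1):            # skip the border to keep indexing valid
        for x in range(1, w - 1):
            if grid[y, x] != 0:          # only passable cells can be boundary points
                continue
            sum_x = int(grid[y, x - 1]) + int(grid[y, x + 1])
            sum_y = int(grid[y - 1, x]) + int(grid[y + 1, x])
            if sum_x < 0 or sum_y < 0:
                candidates.append((x, y))
    return candidates
```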
Typically, there will be many such candidate boundary points. The automatic robot mapping method therefore also screens the candidate boundary points and computes the optimal boundary point, i.e., the one that completes the mapping task most efficiently, for use in robot navigation.
At S106, a candidate boundary point to which the robot can move is determined from at least one candidate boundary point based on pose information of the robot.
Reachability means that the robot entity must be able to move to the candidate boundary point. In some embodiments, a "pixel merging" approach is employed to determine the candidate boundary points the robot can reach. Specifically, determining, based on the pose information of the robot, which of the at least one candidate boundary point the robot can move to includes: acquiring the pixel resolution of the robot map and the physical size of the robot; for a candidate boundary point of the at least one candidate boundary point, determining an area including the candidate boundary point based on the acquired pixel resolution of the robot map and the physical size of the robot; determining whether an obstacle-occupied area exists within that area; and determining that the robot can reach the candidate boundary point if no obstacle-occupied area exists within the area.
In some embodiments, determining the area including the candidate boundary point includes: determining the number of pixels covered by the physical size of the robot on the robot map by dividing the physical size of the robot by the pixel resolution of the robot map; determining a total pixel count equal to twice the number of pixels covered by the physical size of the robot, plus one for the candidate boundary point itself; and determining, as the area including the candidate boundary point, a square area on the robot map centered on the candidate boundary point whose side length is the length occupied by that total number of pixels.
As an example, the pixel resolution of the robot map is 0.05 m/pix and the physical size of the robot is about 0.4 m in diameter. In this case, the physical size of the robot covers 8 pixels on the robot map, so 17×17 pixels including the candidate boundary point are merged (i.e., 16 pixels around the candidate boundary point in the X direction (or Y direction) plus the candidate boundary point itself), a pixel grid space of 0.85 m × 0.85 m. If the merged pixel grid space contains no obstacle-occupied area, the robot is considered able to reach the candidate boundary point smoothly (possibly after pose adjustment) while keeping a certain distance from obstacles.
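A hedged sketch of this "pixel merging" test, using the example's 0.4 m robot diameter and 0.05 m/pix resolution as assumed defaults:

```python
import numpy as np

def is_reachable(grid: np.ndarray, point,
                 robot_diameter_m: float = 0.4,
                 resolution_m_per_pix: float = 0.05) -> bool:
    """True when the merged window around `point` holds no obstacle cell (100).

    The window side is 2 * n + 1 pixels, where n is the number of pixels the
    robot's physical size covers on the map (8 in the example, giving 17x17).
    """
    x, y = point
    n = int(robot_diameter_m / resolution_m_per_pix)
    x0, x1 = max(0, x - n), min(grid.shape[1], x + n + 1)
    y0, y1 = max(0, y - n), min(grid.shape[0], y + n + 1)
    return not np.any(grid[y0:y1, x0:x1] == 100)
```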
At S108, a candidate boundary point having an optimal navigation path from the robot is determined as an optimal boundary point from among the candidate boundary points that the robot can move to.
In some embodiments, determining the optimal boundary point from among the candidate boundary points the robot can reach may include: planning a navigation path for each candidate boundary point the robot can reach; discarding the candidates for which no navigation path can be planned; and taking, as the optimal boundary point, the candidate with the shortest navigation path among the reachable candidates for which navigation path planning succeeds.
If a boundary point too close to the robot is selected, time is wasted repeatedly mapping the already-known region around the robot. If a boundary point too far from the robot is selected, the robot moves back and forth between distant boundary points, likewise wasting time re-traversing regions whose map information is already known.
In some embodiments, determining the optimal boundary point from among the candidate boundary points the robot can reach may further include: excluding, from the reachable candidates for which navigation path planning succeeds, candidates identical to a previously determined optimal boundary point and candidates for which multiple navigation attempts have failed.
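A sketch of this screening step; here plan_path stands in for the robot's own global planner (its name and return convention are assumptions for illustration), and visited/failed hold the exclusions described above:

```python
def select_optimal_boundary_point(reachable_points, plan_path, visited, failed):
    """Return the reachable candidate with the shortest plannable path."""
    best_point, best_length = None, float("inf")
    for p in reachable_points:
        if p in visited or p in failed:
            continue                     # excluded: already chosen or repeatedly failed
        path = plan_path(p)              # assumed to return None when planning fails
        if path is None:
            continue                     # discard candidates with no plannable path
        if len(path) < best_length:
            best_point, best_length = p, len(path)
    return best_point
```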
At S110, a navigation map is constructed during navigation with the optimal boundary point as a navigation target point.
The coordinates of boundary points in the robot map are pixel coordinates in the picture pixel coordinate system, whereas the robot navigates using coordinates in the map world coordinate system. The pixel coordinates must therefore be converted into coordinates in the map world coordinate system.
In some embodiments, constructing the navigation map during navigation with the optimal boundary point as the navigation target point may include: multiplying the picture width and picture height of the robot map data in the picture pixel coordinate system by a preset image resolution to determine the corresponding picture width and picture height in the navigation world coordinate system; determining, as the X coordinate of the navigation target point in the navigation world coordinate system, the product of the navigation target point's X coordinate in the picture pixel coordinate system and the preset image resolution, plus the X offset between the origin of the navigation world coordinate system and the origin of the picture pixel coordinate system; and determining the Y coordinate analogously from the navigation target point's Y coordinate in the picture pixel coordinate system, the preset image resolution, and the Y offset between the two origins.
As an example, in the picture pixel coordinate system, the origin is located at the lower-left corner of the picture, the positive X axis points from right to left, the positive Y axis points from top to bottom, and the whole picture lies in the third quadrant of the coordinate system. The map data are read in order from left to right and from bottom to top. In the navigation world coordinate system, the origin defaults to the coordinate center of the robot at the moment mapping starts, and the coordinates of this origin relative to the origin of the picture pixel coordinate system are origin[X_origin, Y_origin, Z_origin]. The positive X axis points from left to right and the positive Y axis from bottom to top; see the schematic of the relationship between the picture pixel coordinate system and the navigation world coordinate system according to an embodiment of the present application shown in FIG. 2.
In one example, the picture width Image_Width of the robot map data in the picture pixel coordinate system is 512 pix, the picture height Image_Height is 480 pix, the preset image resolution Image_Resolution set by the mapping algorithm is 0.05 m/pix, and the coordinate difference between the origin of the navigation world coordinate system and the origin of the picture pixel coordinate system is origin[X_origin, Y_origin, Z_origin] = [-12.2 m, -12.2 m, 0].
In this example, the corresponding picture width in the navigation world coordinate system is denoted Map_Width, and the corresponding picture height is denoted Map_Height. Map_Width and Map_Height are derived by:
Map_Width=Image_Width*Image_Resolution=512pix*0.05m/pix=25.6m,
Map_Height=Image_Height*Image_Resolution=480pix*0.05m/pix=24.0m。
that is, the width of the corresponding picture in the navigation world coordinate system is 25.6m, and the height of the corresponding picture in the navigation world coordinate system is 24.0m.
Any pixel point (X_Image, Y_Image) in the picture pixel coordinate system is converted to a coordinate point (X_map, Y_map) in the navigation world coordinate system by:
X_map = X_origin + (X_Image * Map_Width / Image_Width);
Y_map = Y_origin + (Y_Image * Map_Height / Image_Height).
Namely:
X_map = X_origin + (X_Image * Image_Resolution);
Y_map = Y_origin + (Y_Image * Image_Resolution).
As an example, the pixel point (152 pix, 11 pix) in the picture pixel coordinate system is converted to a coordinate point in the navigation world coordinate system as:
X_map = -12.2 m + (152 pix * 25.6 m / 512 pix) = -12.2 m + 7.6 m = -4.60 m;
Y_map = -12.2 m + (11 pix * 24.0 m / 480 pix) = -12.2 m + 0.55 m = -11.65 m,
or equivalently:
X_map = -12.2 m + (152 pix * 0.05 m/pix) = -12.2 m + 7.6 m = -4.60 m;
Y_map = -12.2 m + (11 pix * 0.05 m/pix) = -12.2 m + 0.55 m = -11.65 m.
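The simplified conversion maps directly to code; a minimal sketch (illustrative only) using the example's origin offset and resolution as assumed defaults:

```python
def pixel_to_world(x_img: int, y_img: int,
                   origin=(-12.2, -12.2),    # assumed X/Y origin offset from the example
                   resolution: float = 0.05  # Image_Resolution in m/pix
                   ):
    """X_map = X_origin + X_Image * Image_Resolution, and likewise for Y."""
    return (origin[0] + x_img * resolution,
            origin[1] + y_img * resolution)

# Reproduces the worked example: pixel (152, 11) -> (-4.60 m, -11.65 m)
print(tuple(round(v, 2) for v in pixel_to_world(152, 11)))
```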
After the pixel coordinates in the picture pixel coordinate system have been converted into coordinates in the navigation world coordinate system, the world coordinates are sent to the robot's navigation function package, so that the robot navigates to the coordinate point, building the map while moving, without manual participation.
After the navigation target point is reached, the automatic robot mapping method according to the embodiment of the present application is repeated, and so on until no new navigation target point exists relative to the previous iteration. At that point the mapping is complete, the created navigation map is saved, and the robot returns to the origin.
It should be understood that the map data are continuously updated while the robot moves. To reduce computational load and improve efficiency, during movement only the data are read and the map is updated; the optimal boundary point is determined only once the robot has reached the navigation target point. In addition, to obtain better lidar scanning coverage, the robot rotates two full turns in place after reaching the navigation target point, and the lidar detection distance is set as far as possible without affecting accuracy, thereby improving mapping efficiency.
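Tying the steps together, the overall loop implied by the description can be sketched as follows; the robot.* helpers are placeholders for the robot's navigation stack and map interface (assumptions, not an API from the patent), and the other functions refer to the sketches above:

```python
def auto_map(robot):
    """Explore until no boundary point remains, then save the map and return."""
    visited, failed = set(), set()
    while True:
        grid = robot.read_map()                        # map is updated continuously while moving
        candidates = find_candidate_boundary_points(grid)
        reachable = [p for p in candidates if is_reachable(grid, p)]
        goal = select_optimal_boundary_point(reachable, robot.plan_path,
                                             visited, failed)
        if goal is None:                               # no boundary points left:
            break                                      # exploration is complete
        visited.add(goal)
        if robot.navigate_to(pixel_to_world(*goal)):   # build the map while navigating
            robot.spin_in_place(turns=2)               # two turns improve lidar coverage
        else:
            failed.add(goal)
    robot.save_map()
    robot.return_to_origin()
```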
According to the automatic robot mapping method provided by the embodiments of the present application, automatic robot mapping can be realized based on the boundary-point idea without human participation, with the advantages of low computing power requirements, high mapping efficiency, good mapping completeness, and low cost.
In some embodiments, the pose information of the robot is obtained by fusing the robot's odometer sensor and inertial measurement unit (IMU sensor).
During the robot's autonomous mapping movement, no separate localization algorithm performs position correction; meanwhile, uneven ground, accumulated odometry error, and similar factors degrade mapping precision in large, complex environments. The robot's IMU sensor and odometer sensor can then be fused to improve accuracy. Specifically, the heading angle information acquired by the IMU sensor may be fused with the heading angle information acquired by the odometer sensor.
In some embodiments, the pose information of the robot includes heading angle information of the robot, and the automatic robot mapping method may further include: filtering out, as noise, heading angle data from the odometer sensor that jump sharply within a short time.
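A minimal sketch of such a jump filter (the 0.5 rad per-update threshold is an assumed tuning value, not taken from the text):

```python
import math

class HeadingFilter:
    """Reject odometer heading samples that jump too far in one update."""

    def __init__(self, max_jump_rad: float = 0.5):
        self.last = None
        self.max_jump = max_jump_rad

    def update(self, yaw_rad: float) -> float:
        if self.last is not None:
            # wrap the difference to [-pi, pi] before comparing
            diff = math.atan2(math.sin(yaw_rad - self.last),
                              math.cos(yaw_rad - self.last))
            if abs(diff) > self.max_jump:
                return self.last          # treat the sample as noise; keep the old value
        self.last = yaw_rad
        return yaw_rad
```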
In some embodiments, the automatic robot mapping method according to the embodiments of the present application may further include: adjusting the pose of the robot based on data obtained by the robot's lidar sensor to avoid becoming stuck against obstacles.
During navigation mapping, the robot builds the map while navigating. When an obstacle lies in the navigation path, navigation may fail repeatedly, and the robot may even become stuck against the obstacle. By evaluating the blocking situation (for example, the blocking azimuth) from the data obtained by the lidar sensor, the robot can be controlled to adjust its pose and escape, preventing it from being trapped by obstacles.
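One possible realization of this evaluation (an assumption, not the patent's prescribed logic) treats the lidar scan as equal-angle range readings over a full circle and steers toward the widest clearance:

```python
import numpy as np

def escape_heading(ranges: np.ndarray, blocked_below_m: float = 0.3):
    """Return the bearing (rad, robot frame) with the most clearance,
    or None when nothing is close enough to count as blocked.

    `ranges` is assumed to be one full scan, equally spaced over [-pi, pi).
    The 0.3 m blocking threshold is an assumed tuning value.
    """
    if float(np.min(ranges)) > blocked_below_m:
        return None                                   # not actually stuck
    angles = np.linspace(-np.pi, np.pi, len(ranges), endpoint=False)
    return float(angles[int(np.argmax(ranges))])      # turn toward maximum clearance
```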
According to the automatic robot mapping method provided by the embodiments of the present application, automatic robot mapping can be realized based on the boundary-point idea without human participation; the automatic mapping is integrated into the robot's existing navigation function, and an escape mechanism is designed around the robot's existing lidar sensor, so that the robot combines automatic mapping, navigation with obstacle avoidance, and escape from entrapment. In addition, the method has the advantages of low computing power requirements, high mapping efficiency, good mapping completeness, and low cost.
Fig. 3 shows a more specific schematic flow chart of an automatic robot mapping method according to an embodiment of the present application. The automatic robot mapping method 300 according to an embodiment of the present application includes S202-S220.
At S202, a navigation function package and an automatic mapping package are started.
At S204, data of a robot map and pose information of the robot are acquired.
At S206, the map raster data is traversed to determine candidate boundary points.
At S208, the reachability of the candidate boundary points is evaluated by the "pixel merging" method. If there is no candidate boundary point that the robot can move to, the method returns to S206 to further traverse the map raster data and determine candidate boundary points. If there is a candidate boundary point that the robot can move to, the method proceeds to S210.
At S210, the coordinates of the pixel points of the candidate boundary points in the picture pixel coordinate system are converted into coordinates in the navigation world coordinate system.
At S212, it is determined whether other candidate boundary points exist. If only duplicate points remain, this is likewise treated as there being no other candidate boundary points, and the method proceeds to S216. If other candidate boundary points exist, the method proceeds to S214, where the path planning of the candidate boundary points is evaluated, and the candidate boundary point with the shortest navigation path beyond a predetermined distance from the robot, among the candidates for which navigation path planning succeeds, is taken as the optimal boundary point.
At S216, the navigation package issues the navigation world coordinates of the optimal boundary point (when only one candidate boundary point exists, that candidate is used), and the robot builds the map while navigating.
At S218, after reaching the navigation target point (i.e., the determined optimal boundary point), the robot spins in place to improve mapping efficiency.
At S220, the robot returns to the origin after the autonomous mapping is completed and the map is saved.
It should be appreciated that various data acquired during the method (such as planned path information, map data) and various boundary point information (such as candidate boundary points and optimal boundary points) as well as constructed maps may be stored in memory.
Fig. 4 shows an application example of the automatic robot mapping method according to an embodiment of the present application, comprising FIGS. 4(a)-4(d). In FIG. 4(a), the robot starts automatic mapping from the position of coordinate 1; owing to the limited lidar detection distance, an unknown area lies ahead of the robot as shown, and optimal boundary point 1 is calculated. The robot then navigates autonomously to the position of optimal boundary point 1, i.e., the position of robot coordinate 2 in FIG. 4(b); the map is continuously updated and refined while the robot moves, and the robot spins after reaching coordinate 2 to further improve the laser-scanning mapping effect. Optimal boundary point 3 is then obtained, and the robot navigates autonomously to the position of coordinate 3 in FIG. 4(c) and subsequently to the position of robot coordinate 4 in FIG. 4(d). This continues until no boundary point meeting the conditions remains, i.e., the robot has completed the mapping task in FIG. 4(d) and prepares to return to the origin.
Fig. 5 shows an example block diagram of a robotic auto mapping apparatus according to an embodiment of the application. As shown in fig. 5, the robot automatic mapping apparatus 500 according to the embodiment of the present application includes a data acquisition module 501, a candidate boundary point determination module 502, a reachability determination module 503, an optimal boundary point determination module 504, and a map construction module 505.
The data acquisition module 501 is configured to acquire data of a robot map and pose information of the robot.
The candidate boundary point determination module 502 is configured to determine at least one candidate boundary point based on the data of the robot map, wherein a candidate boundary point is a point representing the junction between the unknown zone and the passable zone.
The reachability determination module 503 is configured to determine a candidate boundary point to which the robot can move from among at least one candidate boundary point based on pose information of the robot.
The optimal boundary point determination module 504 is configured to determine, as the optimal boundary point, the candidate boundary point whose navigation path from the robot is optimal, from among the candidate boundary points the robot can reach.
The map construction module 505 is configured to construct a navigation map during navigation with the optimal boundary point as a navigation target point.
In some embodiments, the data of the robot map includes a negative value representing pixels of the unknown region, a zero value representing pixels of the passable region, and a positive value representing pixels of the obstacle occupancy region, wherein the negative value is less than the positive value in absolute value, and the candidate boundary point determination module 502 is configured to: for a pixel point corresponding to a zero value of a pixel representing a passable area in the data of the robot map, summing up pixel values of two pixel points adjacent to the pixel point in the X direction and the Y direction in a picture pixel coordinate system respectively; and determining the pixel point as a candidate boundary point in the case where the sum obtained in the X direction or the Y direction is a negative value.
In some embodiments, the data acquisition module 501 is further configured to: acquiring the pixel resolution of a robot map and the physical size of the robot; and reachability determination module 503 is configured to: for a candidate boundary point of the at least one candidate boundary point, determining an area including the candidate boundary point based on the acquired pixel resolution of the robot map and the physical size of the robot; determining whether an obstacle occupation area exists in the area; and determining that the robot is able to reach the candidate boundary point in the case where there is no obstacle occupying area in the area.
In some embodiments, the reachability determination module 503 is configured to: determine the number of pixels covered by the physical size of the robot on the robot map by dividing the physical size of the robot by the pixel resolution of the robot map; determine a total pixel count equal to twice that number of pixels, plus one for the candidate boundary point itself; and determine, as the area including the candidate boundary point, a square area on the robot map centered on the candidate boundary point whose side length is the length occupied by that total number of pixels.
In some embodiments, the optimal boundary point determination module 504 is configured to: plan a navigation path for each candidate boundary point the robot can reach; discard the candidates for which no navigation path can be planned; and take, as the optimal boundary point, the candidate with the shortest navigation path among the reachable candidates for which navigation path planning succeeds.
In some embodiments, the optimal boundary point determination module 504 is further configured to: exclude, from the reachable candidates for which navigation path planning succeeds, candidates identical to a previously determined optimal boundary point and candidates for which multiple navigation attempts have failed.
In some embodiments, the map construction module 505 is configured to: multiply the picture width and picture height of the robot map data in the picture pixel coordinate system by a preset image resolution to determine the corresponding picture width and picture height in the navigation world coordinate system; determine, as the X coordinate of the navigation target point in the navigation world coordinate system, the product of the navigation target point's X coordinate in the picture pixel coordinate system and the preset image resolution, plus the X offset between the origin of the navigation world coordinate system and the origin of the picture pixel coordinate system; and determine the Y coordinate analogously from the Y coordinate in the picture pixel coordinate system, the preset image resolution, and the Y offset between the two origins.
In some embodiments, the data acquisition module 501 acquires the pose information of the robot by fusing the robot's odometer sensor and inertial sensor. In some embodiments, the pose information of the robot includes heading angle information of the robot, and the automatic robot mapping apparatus 500 according to the present application further includes a filtering module configured to filter out, as noise, heading angle data from the odometer sensor that jump sharply within a short time. In some embodiments, the automatic robot mapping apparatus 500 according to the present application further includes an obstacle avoidance module configured to adjust the pose of the robot based on data obtained by the robot's lidar sensor to avoid becoming stuck against obstacles.
According to the automatic robot mapping apparatus provided by the embodiments of the present application, automatic robot mapping can be realized based on the boundary-point idea without human participation; the automatic mapping is integrated into the robot's existing navigation function, and an escape mechanism is designed around the robot's existing lidar sensor, so that the robot combines automatic mapping, navigation with obstacle avoidance, and escape from entrapment. In addition, the apparatus has the advantages of low computing power requirements, high mapping efficiency, good mapping completeness, and low cost.
Fig. 6 illustrates an example block diagram of a computer system that may implement the automatic robot mapping method shown in FIG. 1, according to an embodiment of this application. It should be appreciated that the computer system 600 illustrated in FIG. 6 is only an example and should not limit the functionality or scope of use of the embodiments of the present disclosure. The computer system may be implemented in a control device in communication with the robot, or in the robot itself.
As shown in fig. 6, a computer system 600 may include a processing device (e.g., a central processing unit, a graphics processor, etc.) 601, which may perform various suitable actions and processes according to programs stored in a Read Only Memory (ROM) 602 or loaded from a storage device 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data required for the operation of the computer system 600 are also stored. The processing device 601, the ROM 602, and the RAM 603 are connected to each other through a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
In general, the following devices may be connected to the I/O interface 605: input devices 606 including, for example, touch screens, touch pads, cameras, accelerometers, gyroscopes, and other sensors; output devices 607 including, for example, liquid crystal displays (LCDs), speakers, vibrators, motors, and electronic governors; storage devices 608 including, for example, flash memory; and a communication device 609. The communication device 609 may allow the computer system 600 to communicate wirelessly or by wire with other devices to exchange data. While FIG. 6 illustrates a computer system 600 with various devices, it should be understood that not all illustrated devices need be implemented or provided; more or fewer devices may be implemented or provided instead. Each block shown in FIG. 6 may represent one device or multiple devices as needed.
In particular, according to embodiments of the present invention, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure provide a computer-readable storage medium storing a computer program containing program code for performing the process S106 shown in fig. 1. In such an embodiment, the computer program may be downloaded and installed from a network via communication means 609, or from storage means 608, or from ROM 602. When the computer program is executed by the processing device 601, the process S106 shown in fig. 1 is implemented.
It should be noted that, the computer readable medium according to the embodiments of the present invention may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. A computer readable storage medium according to an embodiment of the present invention may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. Additionally, a computer-readable signal medium according to an embodiment of the present invention may include a data signal propagated in baseband or as part of a carrier wave, with computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (Radio Frequency), and the like, or any suitable combination thereof.
Computer program code for carrying out operations in accordance with embodiments of the present invention may be written in one or more programming languages, including object oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The present invention may be embodied in other specific forms without departing from its spirit or essential characteristics. For example, the algorithms described in particular embodiments may be modified without departing from the basic spirit of the invention. The present embodiments are, therefore, to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.

Claims (22)

1. A method for automatically mapping a robot, comprising:
acquiring data of a robot map and pose information of a robot;
determining at least one candidate boundary point based on data of the robot map, wherein the candidate boundary point is a point representing a junction between an unknown zone and a passable zone;
determining a candidate boundary point which can be reached by the robot in the at least one candidate boundary point based on the pose information of the robot;
determining, as an optimal boundary point, a candidate boundary point with the optimal navigation path from the robot from among the candidate boundary points that the robot can move to and reach; and
constructing a navigation map during navigation with the optimal boundary point as a navigation target point.
2. The method of robotic auto mapping of claim 1, wherein the data of the robotic map includes a negative value representing pixels of an unknown zone, a zero value representing pixels of a passable zone, and a positive value representing pixels of an obstacle-occupying zone, wherein an absolute value of the negative value is less than the positive value, and determining at least one candidate boundary point based on the data of the robotic map includes:
for the pixel points corresponding to the zero values of the pixels representing the passable area in the data of the robot map, summing the pixel values of two pixel points adjacent to the pixel points in the X direction and the Y direction in the pixel coordinate system of the picture; and
determining the pixel point as the candidate boundary point in the case where the sum obtained in the X direction or the Y direction is a negative value.
3. The method for automatically mapping a robot according to claim 1, wherein determining a candidate boundary point to which the robot can move from among the at least one candidate boundary point based on the pose information of the robot comprises:
acquiring the pixel resolution of the robot map and the physical size of the robot;
for a candidate boundary point of the at least one candidate boundary point, determining an area including the candidate boundary point based on the acquired pixel resolution of the robot map and the physical size of the robot;
determining whether an obstacle occupation area exists in the area; and
determining that the robot can reach the candidate boundary point in the case where there is no obstacle occupation area in the area.
4. A method of robotic auto mapping as claimed in claim 3, wherein determining the region including the candidate boundary point comprises:
determining the number of pixels covered by the physical size of the robot on the robot map by dividing the physical size of the robot by the pixel resolution of the robot map;
determining a total number of pixels equal to 2 times the number of pixels covered by the physical size of the robot plus one for the candidate boundary point; and
determining, as the area including the candidate boundary point, a square area on the robot map centered on the candidate boundary point and having a side length equal to the length occupied by the total number of pixels.
5. The method of robotic auto mapping of claim 1, wherein determining the optimal boundary point from among candidate boundary points that the robot is able to move to comprises:
planning a navigation path for each candidate boundary point that the robot can move to and reach;
discarding candidate boundary points for which no navigation path can be planned; and
taking, as the optimal boundary point, the candidate boundary point with the shortest navigation path from among the candidate boundary points that the robot can move to and for which navigation path planning is possible.
6. The method of robotic auto mapping of claim 5, wherein determining the optimal boundary point from among the candidate boundary points that the robot can move to further comprises:
eliminating, from the candidate boundary points that the robot can move to and for which navigation path planning is possible, candidate boundary points that are the same as a previously determined optimal boundary point and candidate boundary points for which multiple navigation attempts have failed.
7. The method of robotic auto mapping of claim 1, wherein constructing a navigation map during navigation with the optimal boundary point as a navigation target point comprises:
multiplying the picture width and the picture height of the data of the robot map in a picture pixel coordinate system by a preset image resolution respectively to determine a corresponding picture width and a corresponding picture height in a navigation world coordinate system;
determining, as the X coordinate of the navigation target point in the navigation world coordinate system, the product of the X coordinate of the navigation target point in the picture pixel coordinate system and the preset image resolution plus the X coordinate difference between the origin of the navigation world coordinate system and the origin of the picture pixel coordinate system; and
determining, as the Y coordinate of the navigation target point in the navigation world coordinate system, the product of the Y coordinate of the navigation target point in the picture pixel coordinate system and the preset image resolution plus the Y coordinate difference between the origin of the navigation world coordinate system and the origin of the picture pixel coordinate system.
8. The method for automatically mapping a robot according to claim 1, wherein the pose information of the robot is obtained by combining an odometer sensor and an inertial sensor of the robot.
9. The method of robotic auto-mapping of claim 8, wherein the pose information of the robot comprises heading angle information of the robot, and the method further comprises: filtering out, as noise, heading angle data obtained by the odometer sensor that jump sharply within a short time.
10. The robotic auto mapping method of claim 1, further comprising:
adjusting the pose of the robot based on data obtained by a lidar sensor of the robot so as to avoid becoming stuck in an obstacle.
11. An apparatus for automatically mapping a robot, comprising:
the data acquisition module is configured to acquire data of a robot map and pose information of the robot;
a candidate boundary point determination module configured to determine at least one candidate boundary point based on data of the robot map, wherein the candidate boundary point is a point representing a junction between an unknown zone and a passable zone;
a reachability determination module configured to determine a candidate boundary point to which the robot can move from among the at least one candidate boundary point based on pose information of the robot;
an optimal boundary point determination module configured to determine, as an optimal boundary point, a candidate boundary point having an optimal navigation path from the robot from among the candidate boundary points that the robot can move to and reach; and
and the map construction module is configured to construct a navigation map in the navigation process by taking the optimal boundary point as a navigation target point.
12. The robotic automatic mapping device of claim 11, wherein the data of the robotic map comprises a negative value representing pixels of an unknown zone, a zero value representing pixels of a passable zone, and a positive value representing pixels of an obstacle-occupying zone, wherein the negative value is less than the positive value in absolute value, and the candidate boundary point determination module is configured to:
for the pixel points corresponding to the zero values of the pixels representing the passable area in the data of the robot map, summing the pixel values of two pixel points adjacent to the pixel points in the X direction and the Y direction in the picture pixel coordinate system; and
determining the pixel point as the candidate boundary point in the case where the sum obtained in the X direction or the Y direction is a negative value.
13. The robot automatic mapping device of claim 11, wherein
the data acquisition module is further configured to acquire a pixel resolution of the robot map and a physical size of the robot; and
the reachability determination module is configured to:
for a candidate boundary point of the at least one candidate boundary point, determining an area including the candidate boundary point based on the acquired pixel resolution of the robot map and the physical size of the robot;
determining whether an obstacle occupation area exists in the area; and
determining that the robot can reach the candidate boundary point in the case where there is no obstacle occupation area in the area.
14. The robotic automatic mapping device of claim 13, wherein the reachability determination module is configured to:
determining the number of pixels covered by the physical size of the robot on the robot map by dividing the physical size of the robot by the pixel resolution of the robot map;
determining a total number of pixels equal to 2 times the number of pixels covered by the physical size of the robot plus one for the candidate boundary point; and
determining, as the area including the candidate boundary point, a square area on the robot map centered on the candidate boundary point and having a side length equal to the length occupied by the total number of pixels.
15. The robotic automatic mapping device of claim 11, wherein the optimal boundary point determination module is configured to:
planning a navigation path for each candidate boundary point that the robot can move to and reach;
discarding candidate boundary points for which no navigation path can be planned; and
taking, as the optimal boundary point, the candidate boundary point with the shortest navigation path from among the candidate boundary points that the robot can move to and for which navigation path planning is possible.
16. The robotic automatic mapping device of claim 15, wherein the optimal boundary point determination module is further configured to: eliminating, from the candidate boundary points that the robot can move to and for which navigation path planning is possible, candidate boundary points that are the same as a previously determined optimal boundary point and candidate boundary points for which multiple navigation attempts have failed.
17. The robotic auto mapping device of claim 11, wherein the map building module is configured to:
multiplying the picture width and the picture height of the data of the robot map in a picture pixel coordinate system by a preset image resolution respectively to determine a corresponding picture width and a corresponding picture height in a navigation world coordinate system;
determining, as the X coordinate of the navigation target point in the navigation world coordinate system, the product of the X coordinate of the navigation target point in the picture pixel coordinate system and the preset image resolution plus the X coordinate difference between the origin of the navigation world coordinate system and the origin of the picture pixel coordinate system; and
determining, as the Y coordinate of the navigation target point in the navigation world coordinate system, the product of the Y coordinate of the navigation target point in the picture pixel coordinate system and the preset image resolution plus the Y coordinate difference between the origin of the navigation world coordinate system and the origin of the picture pixel coordinate system.
18. The robot mapping apparatus of claim 11, wherein the pose information of the robot is obtained by combining an odometer sensor and an inertial sensor of the robot.
19. The robot auto-mapping apparatus of claim 18, wherein the pose information of the robot includes heading angle information of the robot, and the apparatus further comprises:
a filtering module configured to filter out, as noise, heading angle data obtained by the odometer sensor that jump sharply within a short time.
20. The robotic auto mapping device of claim 11, further comprising:
an obstacle avoidance module configured to adjust the pose of the robot based on data obtained by a lidar sensor of the robot so as to avoid becoming stuck in an obstacle.
21. An apparatus for automatically mapping a robot, comprising:
a processor; and
a memory storing computer readable instructions that,
wherein the processor is configured to execute the computer readable instructions to perform the method of robotic auto-mapping according to any one of claims 1 to 10.
22. A computer readable storage medium comprising computer readable instructions which, when executed by a processor, cause the processor to perform the method of robotic auto mapping according to any one of claims 1 to 10.
CN202310387292.8A 2023-04-11 2023-04-11 Automatic robot mapping method and automatic robot mapping device Pending CN116539022A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310387292.8A CN116539022A (en) 2023-04-11 2023-04-11 Automatic robot mapping method and automatic robot mapping device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310387292.8A CN116539022A (en) 2023-04-11 2023-04-11 Automatic robot mapping method and automatic robot mapping device

Publications (1)

Publication Number Publication Date
CN116539022A 2023-08-04

Family

ID=87446029

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310387292.8A Pending CN116539022A (en) 2023-04-11 2023-04-11 Automatic robot mapping method and automatic robot mapping device

Country Status (1)

Country Link
CN (1) CN116539022A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination