CN111121753A - Robot joint graph building method and device and computer readable storage medium - Google Patents
- Publication number: CN111121753A
- Application number: CN201911397654.1A
- Authority
- CN
- China
- Prior art keywords: mapping, robot, sub, joint, area
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/005—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
Abstract
The invention discloses a robot joint mapping method, a device, and a computer-readable storage medium. The method comprises the following steps: obtaining a mapping dividing line according to the logical-region features and/or functional features of the mapping region, and dividing the mapping region into a plurality of sub-maps along the dividing line; dilating the dividing line to form seam regions between the sub-maps, while mapping each sub-map to generate sub-map maps; and identifying the localization markers of the seam regions, merging the sub-map maps according to those markers, and generating a full map of the mapping region. The multi-robot joint mapping scheme is more efficient and accurate, optimizes the mapping process, and improves the cooperation of the robots during mapping.
Description
Technical Field
The invention relates to the technical field of robots, and in particular to a robot joint mapping method, a device, and a computer-readable storage medium.
Background
In the field of robot control, SLAM (Simultaneous Localization and Mapping), also known as CML (Concurrent Mapping and Localization), is widely used. The technique lets a mobile robot collect information about its operating environment through various sensors, autonomously build a map of the site, compute its own pose from that map, and thereby carry out the required safe motion.
In this basic technique, when a robot enters an environment for the first time, it has no information about its surroundings, so the environment is treated as unknown. Once the robot has traversed the unknown environment, it can save the environment information recorded at each point, and it treats the environment as known the next time it operates there (provided the environment has not changed significantly).
SLAM therefore supports the robot working in both unknown and known environments; a known environment reduces the uncertainty of the robot's pose estimate, so its motion is more stable.
In practical application scenarios, indoor navigation usually relies on low-cost sensors, whose sensing range and accuracy are relatively limited, and global positioning devices such as GPS cannot be used indoors. Moreover, since indoor navigation is typically deployed in warehouses and on production lines, robots must frequently park at fixed positions. To keep multiple machines running stably and to reduce the workload of environment configuration, all machines must share a uniform understanding of the environment, i.e., every machine must use the same known-environment information.
Thus, in real deployments a robot's user wants the site to become a known environment as quickly as possible. In large-scale scenarios this calls for multiple robots mapping jointly, yet no efficient and accurate multi-robot joint mapping scheme currently exists.
Disclosure of Invention
To overcome the above technical defects of the prior art, the invention provides a robot joint mapping method comprising the following steps:
obtaining a mapping dividing line according to the logical-region features and/or functional features of the mapping region, and dividing the mapping region into a plurality of sub-maps along the dividing line;
dilating the mapping dividing line to form seam regions between the sub-maps, while mapping each sub-map to generate a sub-map map;
and identifying the localization markers of the seam regions, merging the sub-map maps according to the markers, and generating a full map of the mapping region.
Optionally, before obtaining the mapping dividing line according to the logical-region features and/or functional features of the mapping region and dividing the mapping region into a plurality of sub-maps along the dividing line, the method further includes:
calibrating the sensor information of each mapping robot according to a preset mapping measurement standard.
Optionally, the obtaining of the mapping dividing line according to the logical-region features and/or functional features of the mapping region and the division of the mapping region into a plurality of sub-maps specifically includes:
dividing the mapping region into a plurality of sub-maps along the mapping dividing line to obtain the initial contour of each sub-map;
and dilating the contour lines of the initial contours until every pair of contour lines closes, and obtaining an updated mapping dividing line from the closed contour lines.
Optionally, the dilating of the mapping dividing line to form seam regions between the sub-maps specifically includes:
presetting a seam distance related to the running speed of the operating robot and the measuring range of its sensors;
and obtaining the extent of the seam region from the seam distance.
Optionally, after identifying the localization markers of the seam regions, merging the sub-map maps according to the markers, and generating the full map of the mapping region, the method further includes:
determining a first sub-map and a second sub-map that are adjacent among the plurality of sub-maps;
determining the running state of the operating robot, wherein if the operating robot enters the seam region adjacent to the second sub-map from the first sub-map, its current pose and historical poses are obtained;
switching the coordinates of the operating robot to the second sub-map through a preset coordinate transformation matrix combined with the current and historical poses, and simultaneously obtaining an initial pose of the operating robot in the second sub-map;
and updating the initial pose according to relocalization information of the operating robot in the second sub-map to obtain its accurate pose in the second sub-map.
Optionally, the switching of the coordinates of the operating robot to the second sub-map through the coordinate transformation matrix combined with the current and historical poses includes:
if the initial pose of the operating robot in the second sub-map falls in an illegal area of the second sub-map, slowing the operating robot down and waiting for its relocalization information in the second sub-map;
and if no relocalization information is received within a preset time, suspending the current operation and reporting a request for assistance.
Optionally, after determining the first sub-map and the second sub-map in the adjacent relationship among the plurality of sub-maps, the method further includes:
selecting three localization points in the seam region that satisfy preset conditions;
and obtaining the coordinate transformation matrix, covering translation and rotation, from the relative coordinates of the three points in the first sub-map and the second sub-map respectively.
Optionally, the identifying of the localization markers of the seam region and the merging of the sub-map maps according to the markers to generate the full map of the mapping region specifically includes:
identifying whether a localization marker exists in the seam region;
if a localization marker exists in the seam region, merging the sub-map maps according to it to generate the full map of the mapping region;
if no localization marker exists in the seam region, generating an auxiliary marker for the seam region according to its correspondence relationships, and merging the sub-map maps according to the auxiliary marker to generate the full map.
The invention also provides a robot joint mapping device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, the computer program implementing the steps of any of the above robot joint mapping methods when executed by the processor.
The invention also provides a computer-readable storage medium on which a robot joint mapping program is stored, the program implementing the steps of any of the above robot joint mapping methods when executed by a processor.
The beneficial effects are as follows: a mapping dividing line is obtained from the logical-region features and/or functional features of the mapping region, and the region is divided into a plurality of sub-maps along that line; the dividing line is dilated to form seam regions between the sub-maps, while each sub-map is mapped to generate sub-map maps; and the localization markers of the seam regions are identified, the sub-map maps are merged according to them, and a full map of the mapping region is generated. The multi-robot joint mapping scheme is more efficient and accurate, optimizes the mapping process, and improves the cooperation of the robots during mapping.
Drawings
The invention will be further described with reference to the accompanying drawings and examples, in which:
FIG. 1 is a first flowchart of a joint mapping method for a robot according to an embodiment of the present invention;
FIG. 2 is a second flowchart of a joint mapping method for robots according to an embodiment of the present invention;
FIG. 3 is a third flowchart of a joint mapping method for robots according to an embodiment of the present invention;
FIG. 4 is a fourth flowchart of a robot joint mapping method according to an embodiment of the present invention;
FIG. 5 is a fifth flowchart of a robot joint mapping method according to an embodiment of the present invention;
FIG. 6 is a sixth flowchart of a robot joint mapping method according to an embodiment of the present invention;
FIG. 7 is a seventh flowchart of a robot joint mapping method according to an embodiment of the present invention.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only to facilitate the description of the invention and carry no specific meaning themselves; thus "module", "component", and "unit" may be used interchangeably.
Example one
Fig. 1 is a first flowchart of a robot joint mapping method according to an embodiment of the present invention. This embodiment provides a robot joint mapping method comprising the following steps:
S1, obtaining a mapping dividing line according to the logical-region features and/or functional features of the mapping region, and dividing the mapping region into a plurality of sub-maps along the dividing line;
S2, dilating the mapping dividing line to form seam regions between the sub-maps, while mapping each sub-map to generate a sub-map map;
S3, identifying the localization markers of the seam regions, merging the sub-map maps according to the markers, and generating the full map of the mapping region.
In this embodiment, a mapping dividing line is first obtained according to the logical-region features and/or functional features of the mapping region, and the dividing line splits the mapping region into a plurality of sub-maps. An unreasonable division of the mapping region can make the corresponding positions of the divided areas inconsistent across sub-maps, and that inconsistency introduces information differences between the mapping robots and further consistency problems. Specifically, the division must on the one hand preserve stable features in each divided area, and on the other hand keep the area of each division as small as possible. Since a typical warehouse is already divided into logical areas, one option of this embodiment takes a logical area as the minimum partition block, i.e., one or more logical areas form one mapping sub-region. Another option identifies positions with an obvious dividing function in the mapping region, for example the gate of a main aisle, the midpoint of a long aisle, or the point where a side path meets the main aisle, and then refines each divided area, thereby obtaining the sub-regions and the mapping dividing lines between them.
In this embodiment, after the mapping dividing line is obtained, it is dilated to form seam regions between the sub-maps, while each sub-map is mapped to generate a sub-map map. The preceding step identifies divided areas with obvious features (logical or functional partitions) and derives the corresponding sub-maps from them; where those features cannot fully separate the sub-maps (for example, because the features are not distinctive enough to give each sub-map a precise boundary), the dividing lines between sub-maps are found by dilation instead. The dividing lines are then dilated to form the seam regions, and one or more mapping robots map the sub-maps to generate sub-map maps. Optionally, different robots, or the same robot, map the sub-regions either simultaneously or at different times. The mapping region takes the dividing lines of the sub-maps as basic boundaries and extends outward from each line by the seam distance to form the seam region, which guarantees a smooth transition for robots operating across the seam later.
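The dilation of a dividing line into a seam region can be sketched on an occupancy grid. The grid representation, cell units, and the `dilate_line` helper below are assumptions for illustration, not details from the patent:

```python
def dilate_line(line_cells, radius):
    """Expand a set of (x, y) grid cells outward by `radius` cells in
    every direction (Chebyshev distance), yielding the seam region."""
    seam = set()
    for (x, y) in line_cells:
        for dx in range(-radius, radius + 1):
            for dy in range(-radius, radius + 1):
                seam.add((x + dx, y + dy))
    return seam

# A vertical dividing line at x == 5, expanded by 2 cells on each side.
line = {(5, y) for y in range(4)}
seam = dilate_line(line, 2)
```

With a seam distance of 2 cells, every cell within 2 cells of the line (on either side) becomes part of the seam region shared by the two sub-maps.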
In this embodiment, after the sub-map maps are generated, the localization markers of the seam regions are identified, and the sub-map maps are merged according to them to produce the full map of the mapping region. As in the example above, when a mapping robot finishes its mapping task on a sub-map, the sub-map map is generated; once every sub-map in the region has completed its mapping task, the stored map data of the sub-maps are merged into one full map, or one or more sub-map maps are selected and merged according to actual needs before use.
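As a rough illustration of the merging step, the sketch below composites two sub-map occupancy dictionaries once a shared localization marker fixes the offset between their frames. The translation-only alignment and all names are assumptions (the patent's transform also covers rotation):

```python
def merge_submaps(map_a, map_b, marker_a, marker_b):
    """Merge occupancy cells of sub-map b into sub-map a's frame, given
    one shared localization marker's cell coordinates in each map
    (translation-only simplification)."""
    dx = marker_a[0] - marker_b[0]
    dy = marker_a[1] - marker_b[1]
    merged = dict(map_a)
    for (x, y), occ in map_b.items():
        merged[(x + dx, y + dy)] = occ
    return merged

# The shared marker sits at (9, 0) in map a and at (0, 0) in map b.
full = merge_submaps({(0, 0): 1, (9, 0): 1},
                     {(0, 0): 1, (1, 0): 0},
                     marker_a=(9, 0), marker_b=(0, 0))
```

Cells that both sub-maps observe (here the marker cell itself) coincide after the shift, so the merged dictionary is a consistent full map.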
The beneficial effects are as follows: a mapping dividing line is obtained from the logical-region features and/or functional features of the mapping region, and the region is divided into a plurality of sub-maps along that line; the dividing line is dilated to form seam regions between the sub-maps, while each sub-map is mapped to generate sub-map maps; and the localization markers of the seam regions are identified, the sub-map maps are merged according to them, and a full map of the mapping region is generated. The multi-robot joint mapping scheme is more efficient and accurate, optimizes the mapping process, and improves the cooperation of the robots during mapping.
Example two
Based on the above embodiment, to keep each mapping robot's perception error of the mapping-region environment within an acceptable range, this embodiment calibrates the sensor information of the robots according to a preset mapping measurement standard.
Specifically, a calibration scheme for the mapping robots is first determined: a measurement standard is established using the same scene, and each mapping robot then corrects its own data against that standard, so that every robot's perception error of the mapping-region environment stays within the acceptable range.
Then, the sensor suite adopted by each mapping robot is determined. Common indoor sensors include: wheel odometry (odom), an IMU (inertial measurement unit), lidar, and cameras (monocular, binocular, and other forms).
Next, the corresponding sensors are calibrated uniformly for each sensor suite. Two requirements guide the calibration: first, the error between different robots' sensors describing the same physical-world information must be small enough; second, the information expressed by the different sensors of one robot must be convertible into that robot's pose in a unified world coordinate system. Together these bring the sensor calibration of all mapping robots into a uniform state.
Optionally, to further bring the calibration of all mapping robots' sensors to a uniform state, this embodiment describes the sensor mounting with a CAD drawing to obtain an initial coordinate-system transformation matrix, and then optimizes it to obtain a more accurate one. Specifically, the mapping robot may be placed in a rectangular room of roughly 20 square meters for a period of time while the data of each of its sensors are collected. The data of each sensor are then transformed into the world coordinate system to obtain the robot's motion trajectory: odometry is accumulated from the wheel encoders, the IMU trajectory comes from double integration of the accelerometer, and the camera computes relative displacement from the difference between consecutive frames. All these trajectories describe the same motion of the robot, so the data should be consistent; this embodiment therefore builds an optimization function from that constraint, so that the trajectory residual obtained through the optimized matrix transformation is minimized. Finally, the required accurate coordinate transformation matrix is solved mathematically, achieving uniform calibration of the mapping robot's sensors, and the calibrated matrix is used to transform the corresponding data while the robot executes mapping tasks.
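The trajectory-consistency objective described above can be sketched as follows. The 2D rigid-transform parameterization and function names are assumptions, and a real calibration would minimize this residual with a numerical solver:

```python
import math

def apply_rigid(T, pts):
    """Apply a 2D rigid transform T = (tx, ty, theta) to a point list."""
    tx, ty, th = T
    c, s = math.cos(th), math.sin(th)
    return [(c * x - s * y + tx, s * x + c * y + ty) for (x, y) in pts]

def trajectory_residual(T, traj_a, traj_b):
    """Sum of squared distances between trajectory A, transformed by T,
    and trajectory B -- the objective the calibration would minimize."""
    return sum((xa - xb) ** 2 + (ya - yb) ** 2
               for (xa, ya), (xb, yb) in zip(apply_rigid(T, traj_a), traj_b))
```

If the extrinsic transform between two sensors is correct, their trajectories of the same motion coincide and the residual vanishes; a wrong transform leaves a nonzero residual to drive the optimizer.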
Optionally, for a pure-vision sensor suite that includes a depth camera, a specially shaped calibration fixture may be used to calibrate the depth camera's parameters, which then provide a calibration reference for the IMU and the camera;
optionally, for a lidar suite, a calibration space of known size may be used: the lidar's measurement accuracy is calibrated first, and the odometry is then calibrated against it through the mapping robot's motion;
optionally, for a suite containing both lidar and camera, on the basis of the lidar calibration the laser may additionally be used as a reference to calibrate the camera, further improving the calibration accuracy.
The beneficial effect is that the sensor information of the mapping robots is uniformly calibrated against a preset mapping measurement standard, keeping every robot's perception error of the mapping-region environment within an acceptable range and improving the success rate and accuracy of the robots' joint mapping.
EXAMPLE III
Fig. 2 is a second flowchart of a robot joint mapping method according to an embodiment of the present invention. Based on the foregoing embodiment, in this embodiment, obtaining the mapping dividing line according to the logical-region features and/or functional features of the mapping region and dividing the region into a plurality of sub-maps specifically includes:
S11, dividing the mapping region into a plurality of sub-maps along the mapping dividing line to obtain the initial contour of each sub-map;
S12, dilating the contour lines of the initial contours until every pair of contour lines closes, and obtaining an updated mapping dividing line from the closed contour lines.
Optionally, if the localization conditions along the mapping dividing line are found to be poor, an auxiliary localization scheme such as a two-dimensional code is added to keep the information at the junction of the sub-maps consistent.
In this embodiment, the principle for dividing the mapping region is to keep the cuts as short as possible. For example, when two large areas are connected by a long corridor, the cut is placed inside the corridor; for layouts shaped like the Chinese characters 日 or 田 (one or two crossing corridors), the cut areas are mapped first and each divided sub-map is then mapped on that basis (i.e., treating the cut area as a known environment).
In this embodiment, each divided sub-map needs at least one area where the mapping robot can localize easily, which serves as the boot area. When a mapping robot starts a mapping task in a sub-map, it must first stop in the boot area, so that it only proceeds with the subsequent mapping task after localizing successfully. Optionally, depending on the mapping requirements, the boot area may carry a two-dimensional code, a corner with distinctive recognition conditions, or a corner with a pattern that is easy to relocalize against.
In this embodiment, while executing the mapping task, SLAM (Simultaneous Localization and Mapping, also called CML, Concurrent Mapping and Localization) is started, and the mapping robot is pushed, or runs automatically, through the parts of the mapping region that need to be mapped. As the robot moves, it automatically records the feature information of the areas it passes; the next time it passes a historical area, the recorded features can correct its pose, improving the stability and accuracy of the mapping task.
Example four
Fig. 3 is a third flowchart of a robot joint mapping method according to an embodiment of the present invention. Based on the foregoing embodiment, in this embodiment, dilating the mapping dividing line to form the seam region between adjacent sub-maps specifically includes:
S21, presetting a seam distance related to the running speed of the operating robot and the measuring range of its sensors;
S22, obtaining the extent of the seam region from the seam distance.
In a multi-robot joint-mapping scenario, each independent sub-map is generated according to the mapping requirements, and every mapping robot encounters map switching in the course of operation. One clear benefit of switching maps appears when a long corridor fails to optimize: maps built by different robots may then overlap. For example, when a long corridor connects two large areas but interference bends the corridor's mapped shape, the original I-shape can become scissor-like, causing partial areas to overlap. The same problem can occur with a single map, so in the multi-robot case each operating robot must switch between sub-maps in time during normal operation after mapping.
During map switching, different maps may describe the same physical point at a seam with different coordinates, and if one sub-area connects two seams it is hard to normalize the coordinates. Therefore, this embodiment leaves a sufficient seam distance between sub-maps and applies the matrix transformation only to the coordinates of the seam between the two connected sub-maps.
In this embodiment, the extent of the seam region is determined by the seam distance, which is chosen for the specific operating requirements and scene and depends on the designed maximum speed of the operating robot and the maximum range of its sensors. The seam distance is proportional to the maximum speed: since the relocalization frequency in the new sub-map is fixed, a faster robot needs a longer seam so that it can relocalize after crossing into the new sub-map. Likewise, the seam distance is inversely related to the maximum sensor range: the shorter the range, the harder relocalization becomes, so more localization information and a larger localizable area are needed to switch the operating robot successfully to the new sub-map.
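A hedged sketch of such a seam-distance heuristic follows. The concrete formula, the `ref_range` reference value, and the linear scaling are assumptions — the patent only states the proportionalities:

```python
def seam_distance(v_max, reloc_period, sensor_range, ref_range=20.0):
    """Heuristic seam width (assumed formula, not from the patent):
    grows with the distance covered per relocalization cycle at top
    speed, and is scaled up when the sensor range is short."""
    base = v_max * reloc_period                 # worst-case travel per reloc attempt
    scale = max(ref_range / sensor_range, 1.0)  # short-range sensors need more margin
    return base * scale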
EXAMPLE five
Fig. 4 is a fourth flowchart of a robot joint mapping method according to an embodiment of the present invention. Based on the foregoing embodiment, in this embodiment, after identifying the localization markers of the seam regions, merging the sub-map maps according to the markers, and generating the full map of the mapping region, the method further includes:
S41, determining a first sub-map and a second sub-map that are adjacent among the plurality of sub-maps;
S42, determining the running state of the operating robot, wherein if the operating robot enters the seam region adjacent to the second sub-map from the first sub-map, its current pose and historical poses are obtained;
S43, switching the coordinates of the operating robot to the second sub-map through a preset coordinate transformation matrix combined with the current and historical poses, and simultaneously obtaining the robot's initial pose in the second sub-map;
S44, updating the initial pose according to the robot's relocalization information in the second sub-map to obtain its accurate pose in the second sub-map.
For example, when the operating robot runs close to the edge of sub-map a, its current pose and any recorded historical poses are converted into the coordinates of the new sub-map b, yielding an approximate initial pose in b; the robot then waits to trigger one relocalization in sub-map b, which yields its accurate pose there and confirms that the map switch has succeeded.
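The coordinate switch can be sketched as applying a 2D rigid transform T = (tx, ty, θ) to the pose. The function below is an illustrative assumption consistent with the three-unknown [x, y, θ] transform described in this document, not the patent's implementation:

```python
import math

def switch_map(pose_a, T):
    """Map a pose (x, y, heading) from sub-map a into sub-map b using
    the seam transform T = (tx, ty, theta); the result is only an
    initial guess until a relocalization in sub-map b confirms it."""
    tx, ty, th = T
    x, y, h = pose_a
    c, s = math.cos(th), math.sin(th)
    return (c * x - s * y + tx, s * x + c * y + ty, h + th)
```

The same function would be applied to the robot's historical poses so that its recent track is also expressed in sub-map b's frame.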
EXAMPLE six
Fig. 5 is a fifth flowchart of a robot joint mapping method according to an embodiment of the present invention. Based on the foregoing embodiment, in this embodiment, switching the coordinates of the operating robot to the second sub-map through the coordinate transformation matrix combined with the current and historical poses includes:
S431, if the initial pose of the operating robot in the second sub-map falls in an illegal area of the second sub-map, slowing the robot down and waiting for its relocalization information in the second sub-map;
S432, if no relocalization information is received within a preset time, suspending the current operation and reporting a request for assistance.
As in the example above, if the converted coordinates fall in an illegal region of sub-map b during the coordinate transformation, the operating robot reduces its speed accordingly while waiting for word of whether its relocalization succeeded. If the robot's pose has completely left sub-map a and no successful relocalization has yet been received, the robot suspends its current task and reports a request for manual assistance.
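The slow-down / wait / escalate behaviour can be summarized as a small decision function; the state names and policy structure are assumptions mirroring this embodiment:

```python
def seam_switch_action(pose_legal, reloc_received, left_old_submap):
    """Assumed control policy while crossing a seam: keep going if the
    converted pose is legal or a relocalization confirmed it; otherwise
    slow down, and escalate once the robot has fully left the old
    sub-map without a successful relocalization."""
    if pose_legal or reloc_received:
        return "proceed"
    if left_old_submap:
        return "pause_and_request_assistance"
    return "slow_down_and_wait"
```

The escalation branch corresponds to reporting the assistance request once the timeout condition (here modeled as having fully left the old sub-map) is met.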
Example seven
Fig. 6 is a sixth flowchart of a robot joint mapping method according to an embodiment of the present invention. Based on the above embodiments, in the present embodiment:
S01, selecting three positioning points which meet preset conditions in the joint area;
and S02, obtaining the coordinate transformation matrix related to translation and rotation according to the relative coordinates of the three positioning points on the first sub-graph and the second sub-graph respectively.
For example, the coordinate transformation matrix between two subgraphs a and b is obtained from their relative position relationship. Specifically, three positions that satisfy certain positioning conditions are arbitrarily selected at the joint of subgraphs a and b, and their relative coordinates in the two subgraphs, (Pa1, Pb1), (Pa2, Pb2) and (Pa3, Pb3), are determined. From the three resulting equations an approximate coordinate transformation matrix T is obtained; T encodes a translation and a rotation and therefore has three unknowns [x, y, theta]. On this basis, the temporary pose of the running robot after the sub-graph switch is obtained through the transformation T × Pai = Pbi, where i = 1, 2, 3. However, because the error produced by this conversion process can be large, in this embodiment a relocation of the running robot in the switched sub-graph is further triggered, and the converted coordinate information in the new sub-graph is corrected by the relocation-success information.
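Solving T × Pai = Pbi for the three unknowns [x, y, theta] from the three point pairs can be done in closed form. The sketch below uses a standard least-squares rigid-transform fit (the 2D Kabsch method); the patent does not state which solver it uses, so this is one possible realization, not the patent's exact procedure.

```python
import math

def fit_rigid_transform_2d(pts_a, pts_b):
    """Estimate the rotation phi and translation (tx, ty) that map
    points Pai in sub-graph a onto Pbi in sub-graph b, i.e. solve
    T * Pai = Pbi in the least-squares sense (2D Kabsch)."""
    n = len(pts_a)
    # Centroids of both point sets.
    cax = sum(p[0] for p in pts_a) / n
    cay = sum(p[1] for p in pts_a) / n
    cbx = sum(p[0] for p in pts_b) / n
    cby = sum(p[1] for p in pts_b) / n
    # Accumulate cross- and dot-products of the centred coordinates;
    # their ratio gives the optimal rotation angle.
    s_cross = s_dot = 0.0
    for (ax, ay), (bx, by) in zip(pts_a, pts_b):
        ax, ay, bx, by = ax - cax, ay - cay, bx - cbx, by - cby
        s_cross += ax * by - ay * bx
        s_dot += ax * bx + ay * by
    phi = math.atan2(s_cross, s_dot)
    # Translation maps the rotated centroid of a onto the centroid of b.
    tx = cbx - (math.cos(phi) * cax - math.sin(phi) * cay)
    ty = cby - (math.sin(phi) * cax + math.cos(phi) * cay)
    return tx, ty, phi
```

Three non-collinear points overdetermine the three unknowns, which is why the embodiment treats the result as approximate and corrects it with a subsequent relocation.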
Example eight
Fig. 7 is a seventh flowchart of a robot joint mapping method according to an embodiment of the present invention. Based on the foregoing embodiment, in this embodiment, the identifying the positioning identifier of the joint area, and merging the sub-map according to the positioning identifier to generate the full-width map of the mapping area specifically includes:
S31, identifying whether the joint area has a positioning identifier;
S32, if the positioning identifier exists in the joint area, merging the sub-graph map according to the positioning identifier to generate a full-width map of the mapping area;
and S33, if the positioning identifier does not exist in the joint area, generating an auxiliary mark of the joint area according to the corresponding relationship of the joint area, and merging the sub-map according to the auxiliary mark to generate the full-width map.
Optionally, in this embodiment, according to different mapping requirements, a corresponding two-dimensional code is set in the joint area, or an identification pattern that is easy to relocate against is attached in the joint area;
optionally, during normal operation the running robot plans its navigation path according to the task requirements and can therefore obtain in advance the position areas it will pass through. When the planned path crosses the junction of two subgraphs, finally leaving the range of one sub-graph and entering another, a corresponding mark is placed on the crossing path segment to improve the success rate of switching; the mark may be identification information such as a two-dimensional code. Then, when the running robot travels near the recorded path segment, coordinate conversion is triggered continuously until the converted coordinates are verified on the new sub-graph, so that the running robot switches smoothly to the new sub-graph.
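Pre-computing where a planned path leaves one sub-graph and enters another, so that a mark can be placed on the crossing segment, might look like the following sketch; `subgraph_of` is a hypothetical lookup from waypoint to sub-graph identifier, not part of the patent.

```python
def plan_with_crossing_marks(path, subgraph_of):
    """Find the path segments where a planned route crosses between
    sub-graphs, so markers (e.g. two-dimensional codes) can be placed
    there. `path` is a list of waypoints; `subgraph_of` maps a
    waypoint to a sub-graph id. Hypothetical helper for illustration."""
    crossings = []
    for prev, cur in zip(path, path[1:]):
        a, b = subgraph_of(prev), subgraph_of(cur)
        if a != b:
            # This segment leaves sub-graph a and enters sub-graph b;
            # record it so a switching marker can be attached.
            crossings.append((prev, cur, a, b))
    return crossings
```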
Example nine
Based on the above embodiments, the present invention further provides a robot joint mapping apparatus, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor; when executed by the processor, the computer program implements the steps of the robot joint mapping method according to any one of the above embodiments.
It should be noted that the device embodiment and the method embodiment belong to the same concept, and specific implementation processes thereof are detailed in the method embodiment, and technical features in the method embodiment are correspondingly applicable in the device embodiment, which is not described herein again.
Example ten
Based on the above embodiment, the present invention further provides a computer-readable storage medium, where a robot joint mapping program is stored, and when executed by a processor, the robot joint mapping program implements the steps of the robot joint mapping method according to any one of the above embodiments.
It should be noted that the media embodiment and the method embodiment belong to the same concept, and specific implementation processes thereof are detailed in the method embodiment, and technical features in the method embodiment are correspondingly applicable in the media embodiment, which is not described herein again.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.
Claims (10)
1. A robot joint mapping method is characterized by comprising the following steps:
obtaining a mapping dividing line according to the logic area characteristics and/or the functional characteristics of the mapping area, and dividing the mapping area into a plurality of subgraphs through the mapping dividing line;
expanding the mapping dividing line to form a joint area between the subgraphs, and simultaneously constructing the subgraph to generate a subgraph map;
and identifying the positioning identifier of the joint area, merging the sub-graph map according to the positioning identifier, and generating the full-width map of the map building area.
2. A robot joint mapping method according to claim 1, wherein before obtaining mapping dividing lines according to logical region features and/or functional features of a mapping region and dividing the mapping region into a plurality of sub-graphs by the mapping dividing lines, the method further comprises:
and calibrating sensor information of the mapping robot according to a preset mapping measurement standard.
3. The robot joint mapping method according to claim 1, wherein the obtaining of mapping dividing lines according to logical region features and/or functional features of the mapping region, and the dividing of the mapping region into a plurality of sub-graphs by the mapping dividing lines specifically includes:
dividing the mapping region into a plurality of subgraphs through the mapping dividing line to obtain an initial contour of the subgraph;
and performing expansion processing on the contour lines of the initial contour until every two contour lines are closed, and obtaining an updated mapping dividing line from the closed contour lines.
4. The robot joint mapping method according to claim 1, wherein the expanding the mapping dividing line to form a seam region between the subgraphs specifically comprises:
presetting a joint distance related to the running speed of the running robot and the measuring range of the sensor;
and obtaining the area range of the joint area according to the joint distance.
5. The method of claim 1, wherein after identifying the positioning identifiers of the joint areas and merging the sub-map according to the positioning identifiers to generate a full map of the mapping area, the method further comprises:
determining a first sub-graph and a second sub-graph in an adjacent relationship among the plurality of sub-graphs;
determining the running state of the running robot, wherein if the running robot enters a joint region adjacent to the second sub-graph from the first sub-graph, the current pose and the historical pose of the running robot are obtained;
switching the coordinate of the running robot to the second sub-graph through a preset coordinate conversion matrix and by combining the current pose and the historical pose, and meanwhile obtaining an initial pose of the running robot in the second sub-graph;
and updating the initial pose according to the repositioning information of the running robot in the second subgraph to obtain the accurate pose of the running robot in the second subgraph.
6. The joint robot mapping method according to claim 5, wherein the switching the coordinates of the running robot to the second sub-graph through the coordinate transformation matrix in combination with the current pose and the historical pose comprises:
if the initial pose of the running robot in the second subgraph is in the illegal area of the second subgraph, slowing down the running process of the running robot and waiting for the relocation information of the running robot in the second subgraph;
and if the running robot does not receive the relocation information within the preset time, suspending the current running process and reporting an assistance request.
7. A robot joint mapping method according to claim 5 or 6, wherein after determining a first sub-graph and a second sub-graph in an adjacent relationship among the plurality of sub-graphs, the method further comprises:
selecting three positioning points which meet preset conditions in the joint area;
and obtaining the coordinate transformation matrix related to translation and rotation according to the relative coordinates of the three positioning points on the first sub-graph and the second sub-graph respectively.
8. The robot joint mapping method according to claim 1, wherein the identifying the positioning identifier of the joint area, and merging the sub-map according to the positioning identifier to generate a full map of the mapping area, specifically includes:
identifying whether a positioning identifier exists in the joint area;
if the positioning identification exists in the joint area, merging the sub-graph map according to the positioning identification to generate a full-width map of the map building area;
if the positioning identification does not exist in the joint area, generating an auxiliary mark of the joint area according to the corresponding relation of the joint area, and combining the sub-map according to the auxiliary mark to generate the full-width map.
9. A joint robot mapping apparatus, comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the joint robot mapping method according to any one of claims 1 to 8.
10. A computer-readable storage medium, having stored thereon a joint robot mapping program, which when executed by a processor, performs the steps of the joint robot mapping method according to any one of claims 1 to 8.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911397654.1A CN111121753A (en) | 2019-12-30 | 2019-12-30 | Robot joint graph building method and device and computer readable storage medium |
PCT/CN2020/133753 WO2021135813A1 (en) | 2019-12-30 | 2020-12-04 | Robot joint mapping method and device, and computer-readable storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911397654.1A CN111121753A (en) | 2019-12-30 | 2019-12-30 | Robot joint graph building method and device and computer readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111121753A true CN111121753A (en) | 2020-05-08 |
Family
ID=70505470
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911397654.1A Pending CN111121753A (en) | 2019-12-30 | 2019-12-30 | Robot joint graph building method and device and computer readable storage medium |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN111121753A (en) |
WO (1) | WO2021135813A1 (en) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111815718A (en) * | 2020-07-20 | 2020-10-23 | 四川长虹电器股份有限公司 | Method for quickly switching stations of industrial screw robot based on vision |
CN112000103A (en) * | 2020-08-27 | 2020-11-27 | 西安达升科技股份有限公司 | AGV robot positioning, mapping and navigation method and system |
CN112146662A (en) * | 2020-09-29 | 2020-12-29 | 炬星科技(深圳)有限公司 | Method and device for guiding map building and computer readable storage medium |
CN112484729A (en) * | 2020-11-11 | 2021-03-12 | 深圳市优必选科技股份有限公司 | Navigation map switching method and device, terminal equipment and storage medium |
CN112562402A (en) * | 2020-11-12 | 2021-03-26 | 深圳优地科技有限公司 | Position determination method, device, terminal and storage medium |
WO2021135813A1 (en) * | 2019-12-30 | 2021-07-08 | 炬星科技(深圳)有限公司 | Robot joint mapping method and device, and computer-readable storage medium |
CN113387099A (en) * | 2021-06-30 | 2021-09-14 | 深圳市海柔创新科技有限公司 | Map construction method, map construction device, map construction equipment, warehousing system and storage medium |
CN113822995A (en) * | 2021-09-26 | 2021-12-21 | 上海擎朗智能科技有限公司 | Method and device for creating navigation map of mobile equipment and storage medium |
CN116408807A (en) * | 2023-06-06 | 2023-07-11 | 广州东焊智能装备有限公司 | Robot control system based on machine vision and track planning |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103926930A (en) * | 2014-05-07 | 2014-07-16 | 重庆邮电大学 | Multi-robot cooperation map building method based on Hilbert curve detection |
CN105225604A (en) * | 2015-10-30 | 2016-01-06 | 汕头大学 | A kind of construction method of mixing map of Mobile Robotics Navigation |
CN107436148A (en) * | 2016-05-25 | 2017-12-05 | 深圳市朗驰欣创科技股份有限公司 | A kind of robot navigation method and device based on more maps |
CN109725327A (en) * | 2019-03-07 | 2019-05-07 | 山东大学 | A kind of method and system of multimachine building map |
CN110561423A (en) * | 2019-08-16 | 2019-12-13 | 深圳优地科技有限公司 | pose transformation method, robot and storage medium |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8798840B2 (en) * | 2011-09-30 | 2014-08-05 | Irobot Corporation | Adaptive mapping with spatial summaries of sensor data |
CN108335302B (en) * | 2018-01-26 | 2021-10-08 | 上海思岚科技有限公司 | Region segmentation method and device |
CN108827249B (en) * | 2018-06-06 | 2020-10-27 | 歌尔股份有限公司 | Map construction method and device |
CN108981701B (en) * | 2018-06-14 | 2022-05-10 | 广东易凌科技股份有限公司 | Indoor positioning and navigation method based on laser SLAM |
CN109118940A (en) * | 2018-09-14 | 2019-01-01 | 杭州国辰机器人科技有限公司 | A kind of mobile robot composition based on map splicing |
CN111121753A (en) * | 2019-12-30 | 2020-05-08 | 炬星科技(深圳)有限公司 | Robot joint graph building method and device and computer readable storage medium |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021135813A1 (en) * | 2019-12-30 | 2021-07-08 | 炬星科技(深圳)有限公司 | Robot joint mapping method and device, and computer-readable storage medium |
CN111815718A (en) * | 2020-07-20 | 2020-10-23 | 四川长虹电器股份有限公司 | Method for quickly switching stations of industrial screw robot based on vision |
CN112000103A (en) * | 2020-08-27 | 2020-11-27 | 西安达升科技股份有限公司 | AGV robot positioning, mapping and navigation method and system |
CN112146662A (en) * | 2020-09-29 | 2020-12-29 | 炬星科技(深圳)有限公司 | Method and device for guiding map building and computer readable storage medium |
CN112146662B (en) * | 2020-09-29 | 2022-06-10 | 炬星科技(深圳)有限公司 | Method and device for guiding map building and computer readable storage medium |
CN112484729A (en) * | 2020-11-11 | 2021-03-12 | 深圳市优必选科技股份有限公司 | Navigation map switching method and device, terminal equipment and storage medium |
CN112562402A (en) * | 2020-11-12 | 2021-03-26 | 深圳优地科技有限公司 | Position determination method, device, terminal and storage medium |
CN113387099A (en) * | 2021-06-30 | 2021-09-14 | 深圳市海柔创新科技有限公司 | Map construction method, map construction device, map construction equipment, warehousing system and storage medium |
CN113822995A (en) * | 2021-09-26 | 2021-12-21 | 上海擎朗智能科技有限公司 | Method and device for creating navigation map of mobile equipment and storage medium |
CN116408807A (en) * | 2023-06-06 | 2023-07-11 | 广州东焊智能装备有限公司 | Robot control system based on machine vision and track planning |
CN116408807B (en) * | 2023-06-06 | 2023-08-15 | 广州东焊智能装备有限公司 | Robot control system based on machine vision and track planning |
Also Published As
Publication number | Publication date |
---|---|
WO2021135813A1 (en) | 2021-07-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111121753A (en) | Robot joint graph building method and device and computer readable storage medium | |
JP6842519B2 (en) | Data collection method and its system | |
CN109916393B (en) | Multi-grid-value navigation method based on robot pose and application thereof | |
US20220057212A1 (en) | Method for updating a map and mobile robot | |
EP3672762B1 (en) | Self-propelled robot path planning method, self-propelled robot and storage medium | |
CN111123949B (en) | Obstacle avoidance method and device for robot, robot and storage medium | |
WO2020259274A1 (en) | Area identification method, robot, and storage medium | |
CN107837044B (en) | Partitioned cleaning method and device of cleaning robot and robot | |
KR20170088228A (en) | Map building system and its method based on multi-robot localization | |
EP4332714A1 (en) | Robot navigation method, chip and robot | |
CN110986920B (en) | Positioning navigation method, device, equipment and storage medium | |
WO2018076777A1 (en) | Robot positioning method and device, and robot | |
KR101333496B1 (en) | Apparatus and Method for controlling a mobile robot on the basis of past map data | |
CN111240322B (en) | Method for determining working starting point of robot movement limiting frame and motion control method | |
CN113375657A (en) | Electronic map updating method and device and electronic equipment | |
CN113607161B (en) | Robot navigation path width acquisition system, method, robot and storage medium | |
CN113932825B (en) | Robot navigation path width acquisition system, method, robot and storage medium | |
CN114995459A (en) | Robot control method, device, equipment and storage medium | |
CN115981305A (en) | Robot path planning and control method and device and robot | |
KR102481615B1 (en) | Method and system for collecting data | |
KR20200043329A (en) | Method and system for collecting data | |
KR101297608B1 (en) | Method and system for robot coverage of unknown environment | |
CN117589153B (en) | Map updating method and robot | |
CN113031006B (en) | Method, device and equipment for determining positioning information | |
JP2018120482A (en) | Robot and method of controlling the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
CB02 | Change of applicant information | ||
Address after: 518000 Room 401, block D, building 7, Shenzhen International Innovation Valley, Dashi 1st Road, Xili community, Xili street, Nanshan District, Shenzhen, Guangdong Applicant after: JUXING TECHNOLOGY (SHENZHEN) Co.,Ltd. Address before: 518000 building 101, building R3b, Gaoxin industrial village, No.018, Gaoxin South 7th Road, community, high tech Zone, Yuehai street, Nanshan District, Shenzhen City, Guangdong Province Applicant before: JUXING TECHNOLOGY (SHENZHEN) Co.,Ltd. |
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20200508 |