WO2020182146A1 - Robotic system, mapping system, and method for a robot navigation map - Google Patents


Info

Publication number
WO2020182146A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
mapping
navigation map
working
features
Prior art date
Application number
PCT/CN2020/078789
Other languages
English (en)
Chinese (zh)
Inventor
刘哲
王悦翔
尹慧昕
曹抒阳
Original Assignee
锥能机器人(上海)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 锥能机器人(上海)有限公司
Publication of WO2020182146A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G01C21/206 Instruments for performing navigational calculations specially adapted for indoor navigation
    • G01C21/005 Navigation with correlation of navigation data from several sources, e.g. map or contour matching
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions

Definitions

  • the invention relates to a robot, and in particular to a robot navigation system and method.
  • Electromagnetic navigation embeds metal wires in the AGV's travel path and loads a guidance frequency onto the wires; the AGV navigates by detecting this frequency.
  • Magnetic stripe navigation lays magnetic tape on the ground instead of burying wires underground, and achieves navigation through the tape's induction signal.
  • Two-dimensional code navigation lays QR codes at set intervals along the path and calculates and corrects the AGV's pose by comparing the position of each QR code in the camera's view.
  • Laser navigation is based on lidar, which scans the surrounding environment and collects reflected-light information to determine the vehicle's own position in the scene.
  • the inventor of the present invention found that, in the prior art, electromagnetic navigation technology and magnetic stripe navigation technology require extensive modification of the ground.
  • electromagnetic navigation technology even requires pre-embedded magnetic nails, so its industrial application scenarios are narrow.
  • these two technologies allow almost no human-robot interaction, and the cost of avoiding obstacles or changing the preset path is extremely high.
  • these two technologies also cannot achieve intensive operation and multi-machine parallelism in a single scene.
  • although two-dimensional code navigation solves the problem of high ground-laying costs, ground markers are not allowed in many scenarios (such as medical settings).
  • QR codes are prone to damage and dirt, which can leave them unrecognized or misrecognized and entails high labor and maintenance costs.
  • Laser navigation currently relies heavily on reflectors, places certain requirements on the surrounding environment and lighting conditions, and adapts very poorly to dynamic environments. It can only be used in simple indoor scenes and cannot handle complex environments with many goods and many machines. In addition, the cost of laser navigation is extremely high, with no prospect of cost reduction in the short term. Traditional visual navigation suffers from low recognition accuracy, strong dependence on environmental features, and slow operation speed.
  • the purpose of the present invention is to provide a robot system, a robot navigation map mapping system and a mapping method, which can be used in large-area scenes without laying permanent positioning markers.
  • the present invention provides a robot navigation map mapping method.
  • a motion path is preset, a plurality of removable markers are set on the motion path, and the mapping robot is located on the motion path.
  • the mapping method includes the steps:
  • when the mapping robot travels along the motion path, the feature acquisition module of the mapping robot records features along the way and, when the robot moves to a removable marker, obtains the mapping robot's pose calibration information for calibration;
  • the features and corresponding pose information recorded by the feature collection module are processed locally, or are sent to a server for processing, to obtain a navigation map.
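The two mapping steps above can be sketched as a minimal loop (an illustrative Python sketch; the `Pose` class, the simplified odometry, and the map-as-dictionary representation are assumptions for illustration, not part of the disclosure):

```python
from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    y: float
    theta: float  # heading in radians

def run_mapping(path_points, markers, sense_feature):
    """Drive along the preset path, recording a feature at each point.

    `markers` maps a path index to a reference Pose; when the robot
    reaches a removable marker, it replaces its drifted pose estimate
    with the marker's calibrated reference pose.
    """
    estimated = Pose(0.0, 0.0, 0.0)      # dead-reckoned pose (drifts over time)
    record = []                          # (feature, pose) pairs -> map input
    for i, (x, y) in enumerate(path_points):
        estimated = Pose(x, y, estimated.theta)  # odometry update (simplified)
        if i in markers:                 # pose calibration at a removable marker
            estimated = markers[i]
        record.append((sense_feature(i), estimated))
    return record

def build_navigation_map(record):
    """Server-side processing: index recorded features for later lookup."""
    return {feature: pose for feature, pose in record}
```

In this sketch the marker simply overwrites the drifted estimate; a real system would fuse the reference pose with odometry.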
  • the mapping method further includes the step of calibrating the coordinate origin of the feature acquisition module of the mapping robot and the coordinate origin of the motion path.
  • the feature collection module establishes the features collected by the sensor into its own coordinate system.
  • the removable marker is removed.
  • the mapping method further includes, after obtaining the navigation map, causing the mapping robot to continue to record features along the way through the feature acquisition module, and updating the newly recorded features and corresponding pose information to the navigation map, or sending them to the server to update the navigation map.
  • the removable marker includes identifiable reference pose information.
  • the removable marker is a manually identifiable marker, and the manually identifiable marker corresponds to the reference pose information.
  • the feature acquisition module is a camera, and the ground pattern features along the way are captured by the camera of the mapping robot.
  • the feature collection module includes multiple cameras and/or laser sensors, and features along the way are recorded by the multiple cameras and/or laser sensors.
  • the present application further provides a machine-readable medium having instructions stored on the machine-readable medium, and when the instructions are executed on a machine, the machine executes the above-mentioned robot navigation map mapping method.
  • the application further provides a system, which includes a memory for storing instructions executed by one or more processors of the system; and a processor, which is one of the processors of the system, for executing the above-mentioned robot navigation map mapping method.
  • This application further provides a robot navigation map mapping system, the mapping system includes:
  • removable markers, the removable markers being arranged on a movement path;
  • a feature collection module, configured to record features along the path when the mapping robot travels along the motion path, and to obtain the mapping robot's pose calibration information for calibration when the mapping robot reaches the position of a removable marker;
  • a feature processing module, configured to process the features and corresponding pose information recorded by the feature collection module, or to send them to a server for processing, so as to obtain a navigation map.
  • the removable marker includes identifiable reference pose information.
  • the removable marker is a removable marker that is manually identifiable, and the manually identifiable marker corresponds to the reference pose information.
  • the motion path is composed of multiple straight paths.
  • the feature collection module is a plurality of cameras and/or laser sensors arranged on the mapping robot.
  • the feature collection module is a camera provided on the mapping robot, and the camera is configured to record features of ground patterns along the way.
  • the application further provides a robot system, which includes a mapping robot, a working robot, and a robot management system;
  • the mapping robot includes:
  • a feature collection module configured to record features along a path when the mapping robot travels along a movement path
  • a feature processing module configured to send the features and corresponding pose information recorded by the feature acquisition module to the robot management system for processing
  • the robot management system is configured to receive and process features and corresponding pose information recorded by the mapping robot to obtain or update a navigation map;
  • the working robot is configured to obtain the navigation map from the robot management system for positioning.
  • a removable marker is arranged on the movement path, and the feature acquisition module is further configured to obtain the mapping robot's calibration information and perform calibration when the mapping robot reaches the position of the removable marker.
  • the removable marker includes identifiable reference pose information.
  • the removable marker is a manually identifiable marker, and the manually identifiable marker corresponds to the reference pose information.
  • the working robot is configured to compare the recorded features with the features in the navigation map during operation to obtain the current pose information of the working robot.
  • the working robot is configured to issue an instruction to create a map to the robot management system when it is confirmed that the recorded feature cannot match the feature in the navigation map.
  • the robot management system is configured, upon receiving a mapping instruction from the working robot, to instruct the mapping robot to record features along the local movement path near the working robot and update the navigation map.
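The compare-then-request behavior of the working robot described above can be sketched as follows (illustrative Python; the match-ratio threshold and the list standing in for the robot management system's message queue are assumptions):

```python
def localize(observed_features, nav_map, min_match_ratio=0.6):
    """Compare features recorded at runtime against the navigation map.

    Returns the pose of a matched feature, or None when the fraction of
    observed features found in the map falls below `min_match_ratio`
    (the mismatch proportion is configurable, as the description notes).
    """
    hits = [f for f in observed_features if f in nav_map]
    if len(hits) / max(len(observed_features), 1) < min_match_ratio:
        return None                      # environment changed too much
    return nav_map[hits[0]]

def working_robot_step(observed, nav_map, management_queue):
    """One control step: localize, or ask the management system to re-map."""
    pose = localize(observed, nav_map)
    if pose is None:
        management_queue.append("REQUEST_MAPPING")  # issue a mapping instruction
    return pose
```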
  • the feature collection module is configured to record the features of ground patterns along the movement path.
  • the feature collection module includes multiple cameras and/or laser sensors to record features along the way.
  • the robot system includes multiple mapping robots and/or multiple working robots that are cooperatively controlled by the robot management system.
  • the working robot is a handling robot.
  • the present invention also provides a robot system, which includes a working robot and a robot management system;
  • the working robot includes:
  • a conversion module configured to switch the working robot from a working mode to a mapping mode under a first predetermined condition
  • an information collection module, configured to record features along the movement path when the working robot travels along the movement path in the mapping mode;
  • an information processing module, configured to send the features and corresponding pose information recorded by the information collection module to the robot management system for processing in the mapping mode, and to obtain a navigation map from the robot management system for positioning in the working mode;
  • the robot management system is configured to receive and process features and corresponding pose information recorded from the working robot to obtain or update the navigation map.
  • the information collection module is configured to record features along the way in the working mode;
  • the information processing module is configured to compare the features recorded by the information collection module in the working mode with the features in the navigation map to obtain the current pose information of the working robot;
  • the first predetermined condition includes the information processing module confirming, in the working mode, that the features recorded by the information collection module cannot match the features in the navigation map.
  • the working robot records features along the local movement path near the working robot in the mapping mode to update the local navigation map.
  • the conversion module is configured to switch the working robot to the working mode after completing the update of the navigation map.
  • the robot system further includes a mapping robot
  • the mapping robot includes:
  • a feature collection module configured to record features along the path when the mapping robot travels along the movement path
  • a feature processing module configured to send the features and corresponding pose information recorded by the feature acquisition module to the robot management system for processing
  • the robot management system is further configured to receive and process features and corresponding pose information recorded by the mapping robot to obtain or update the navigation map.
  • the robot management system instructs the mapping robot to replace part or all of the working robots to perform the tasks of the mapping mode under a second predetermined condition.
  • the robot system of the present application is a camera-based cluster robot system that relies on visual features for positioning. Compared with existing cluster robot positioning systems, it offers the advantages of requiring no positioning markers, usability in large-area scenes, high robot storage density, high robot running speed, high positioning accuracy, and less manual intervention in the map update process.
  • FIG. 1 is a schematic diagram of the system composition of the robot navigation map mapping system during the first mapping of an embodiment of the present application.
  • FIG. 2 is a schematic diagram of the system composition of the robot navigation map mapping system when updating the map according to an embodiment of the present application.
  • Fig. 3 is a flowchart of a method for building a robot navigation map according to an embodiment of the present application.
  • Fig. 4 is a schematic diagram of the system composition of a robot system according to an embodiment of the present application.
  • the purpose of the navigation map mapping system of the present application is to form a navigation map that enables the robot to navigate in an area.
  • This area can be an outdoor area, or an indoor area where positioning signals such as GPS cannot be received.
  • This system captures the ground with a camera and processes the captured image using ground pattern feature recognition to determine the current position and posture.
  • the ground texture feature refers to any feature on the ground, such as cracks, lines, protrusions, recesses, and objects that may be present on the ground.
  • the image can be a photograph or a frame of a video.
  • the mapping is done using a separate mapping robot.
  • the removable marker includes identifiable reference pose information or corresponds to reference pose information.
  • the reference pose information refers to relative or absolute position information and pose information that can be referred to when the robot performs pose calibration.
  • the mapping robot can proceed along the navigation path on the basis of the navigation map established for the first time, and further capture images of ground texture features, and then process the images to update the navigation map.
  • the navigation of this application uses ground texture recognition technology, feature-point matching through ground texture recognition, multi-robot map data cloud sharing technology, and a robot scheduling system, which can dispatch thousands of robots in the same system according to the position information returned by each robot for intensive collaborative work in the same scene.
  • the navigation map mapping system includes a mapping robot 1.
  • the mapping robot 1 may be one or more.
  • the mapping robot 1 can travel along the set movement path 2, which can be performed semi-automatically (for example, by remote control) or manually. In the case that a rough preliminary navigation map has been obtained, the mapping robot 1 can also automatically travel along the set movement path 2 to obtain a more refined navigation map.
  • the movement path 2 refers to the area where the robot can travel.
  • Figures 1 to 4 exemplarily show a route of the motion path 2, that is, the way the mapping robot travels along the motion path 2.
  • those skilled in the art should understand that other forms of routes can be set according to actual needs, as long as a navigation map can be built.
  • the mapping robot 1 has its own IMU (Inertial Measurement Unit), which enables high-precision straight-line travel within a small local area.
  • the mapping robot 1 is provided with a camera, which is used to photograph the ground pattern features in the motion path.
  • the camera is preferably a high-speed camera, which can provide high-frame-rate, high-resolution images while the robot runs at high speed and enables near-real-time image processing and feedback, so that mapping can be performed even at the mapping robot's higher running speeds.
  • the ground pattern is more resistant to abrasion, which ensures a longer time of operation.
  • the camera of the mapping robot 1 can also capture scene images other than the ground and record scene features, or the mapping robot 1 can record other features along the movement path through other feature collection modules, such as laser sensors, to obtain scene features.
  • the mapping robot 1 may include multiple cameras and/or laser sensors, so as to be able to acquire multiple types of feature information to cooperate with mapping.
  • the mapping robot 1 may be provided with a light supplement device, which can be used to supplement light in a poor light environment, thereby improving the imaging quality.
  • the mapping robot 1 may be provided with a communication module to transmit the captured images and/or recorded features to an external processing device, such as a robot management system 4 that controls one or more mapping robots and working robots. The robot management system 4 can be used as a part of the navigation map mapping system.
  • the robot management system 4 is configured to receive and process the images taken by the camera of the mapping robot, or the feature information extracted from the captured images, and/or the feature information and corresponding pose information recorded by other feature collection modules, to obtain a navigation map.
  • pose information refers to position information and posture information. The corresponding pose information can be obtained by measuring devices carried by the mapping robot itself; since this is well known to those skilled in the art, it is not repeated here. It should be understood that, if only mapping is required, the image and/or feature processing can also be performed by the feature processing module carried by the mapping robot itself, so the navigation map can be established by the mapping robot alone.
  • the navigation map mapping system may include removable markers 3 distributed on the movement path 2. It can be seen from FIG. 1 that the removable markers 3 can be evenly arranged on a path section or non-uniformly arranged on a path section, and can be set according to site conditions or needs. For example, at locations requiring high accuracy (a place where precise turns are required), relatively more removable markers are arranged to improve accuracy.
  • the removable marker 3 includes readable coordinate position information and posture information. Thus, when the mapping robot 1 recognizes a removable marker, it can read the marker's information as reference pose information to correct its own pose.
  • the removable marker may be a QR code or other customized graphic codes.
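The pose correction against a removable marker might look like this in outline (illustrative Python; the JSON payload schema and the measured camera-to-marker offset are assumptions for illustration, not the patent's encoding):

```python
import json
import math

def decode_marker(payload):
    """Parse the reference pose encoded in a marker (e.g. a QR code).
    The JSON schema used here is an assumption for illustration."""
    d = json.loads(payload)
    return d["x"], d["y"], d["theta"]

def calibrate(marker_payload, offset_in_marker_frame):
    """Correct the robot pose: the marker's reference pose plus the
    measured camera-to-marker offset, rotated into world coordinates."""
    mx, my, mtheta = decode_marker(marker_payload)
    dx, dy = offset_in_marker_frame
    wx = mx + dx * math.cos(mtheta) - dy * math.sin(mtheta)
    wy = my + dx * math.sin(mtheta) + dy * math.cos(mtheta)
    return wx, wy, mtheta
```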
  • the removable marker may be a manually identifiable removable marker, with the manually identifiable marker corresponding to the reference pose information. After the mapping robot 1 captures the manually identifiable marker, the pose of the mapping robot 1 can be corrected manually in the background, for example through the robot management system 4 or directly through the display screen on the mapping robot 1, which facilitates manual review of the mapping effect and adjustment of the mapping parameters.
  • the manually identifiable marker may be any form that can be preset to indicate the ground coordinate direction or the scene orientation and correspond to relative position information or absolute position information.
  • multiple types of removable markers can be arranged on the motion path to correct the pose of the mapping robot. The mapping robot 1 then transmits the corrected pose information to a server for processing, such as the aforementioned robot management system 4, or to its own feature processing module.
  • the removable marker can be removed after the first mapping is completed. After the first mapping, the mapping robot continues to travel along the path with the markers removed to complete or update the navigation map, as shown in Figure 2.
  • the setting of removable markers can temporarily enhance the environmental characteristics, improve the accuracy of the first mapping, and remove it after the first mapping is completed, so as to adapt to various scenarios where markers are not allowed. In some embodiments, if markers can be retained in the scene, the removable markers may not be removed after the first mapping is completed.
  • mapping is a continuous task, which can be repeated as needed.
  • the ground texture is not completely unchanged in an industrial environment, and the ground texture will change over time, or due to the coordinated work of heavy machinery and rolling, the built-up ground texture pattern will also change significantly. Similar to the ground texture, other features will also change over time due to scene adjustments.
  • the mapping robot needs to repeat the mapping and upload the new ground features and/or other features to the system server. It therefore needs to patrol the working path regularly to detect whether the ground texture and/or other features in the path have changed, partially or completely, due to time or external forces.
  • when the working robot finds features that cannot be matched in the navigation map, it also sends a request to the system to call the mapping robot to re-evaluate, store, and update ground features and/or other features, thereby guaranteeing the stability and accuracy of the navigation map in a complex environment. It should be understood that, when updating the navigation map, the mapping robot need not follow the original motion path; the mapping robot's motion path can be reset according to the working conditions of the working robots, so as to ensure the stability and accuracy of the navigation map without affecting their normal work.
  • the above-mentioned navigation map update uses SLAM (Simultaneous Localization And Mapping) technology.
  • This technology determines the matching relationship between the image taken and/or the features recorded at the current location and the feature points of the data in the pre-established map library, so as to determine the precise coordinate position of the current location in the calibrated map.
  • the acquired new data can be continuously updated to the original map library to achieve dynamic optimization of the map library data.
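The localize-then-update cycle attributed to SLAM above can be caricatured as follows (illustrative Python; real SLAM jointly estimates pose and map from noisy measurements, which this toy deliberately omits):

```python
def slam_step(observed, map_library):
    """One simplified localize-and-update step in the spirit of SLAM.

    Shows only the two roles the description assigns it: match the
    current observations against the map library to localize, then
    fold newly observed features back into the library.
    """
    # Localization: find observed features already in the library.
    matches = {f: map_library[f] for f in observed if f in map_library}
    pose = next(iter(matches.values()), None)
    # Dynamic update: insert features not seen before, keyed to the
    # last known pose (placeholder coordinates for illustration).
    for f in observed:
        if f not in map_library and pose is not None:
            map_library[f] = pose
    return pose, map_library
```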
  • the above-mentioned navigation map update also uses map data cloud sharing technology. After a robot obtains and updates map data through the above-mentioned SLAM technology, the data is uploaded to the map data management center through the robot's own communication device. The management center optimizes the map data and then shares it with all devices in the current system, ensuring real-time updates of the map data on all devices and improving the stability and effectiveness of the overall map.
  • a motion path is set in the area of the navigation map to be created, and the motion path may be a straight line or a curve.
  • the marker can be a QR code or a manually identifiable marker.
  • place the mapping robot on the motion path, and determine the directions of the X and Y coordinate axes of the area and the origin of the robot's running map.
  • calibrate the robot so that its center is at the coordinate origin and the origin is within the field of view of the main camera. Turn on the camera of the mapping robot and make the mapping robot move along the motion path.
  • the camera records the ground pattern features along the way and/or other features are recorded through other feature acquisition modules, and the pose of the mapping robot is calibrated when it moves to each removable marker.
  • the mapping robot then continues to move along the movement path, using the camera to record ground pattern features along the way and/or other feature acquisition modules to record other features, and the newly captured ground texture features and/or newly recorded features are updated to the navigation map.
  • the update of the navigation map is completed.
  • the image captured by the camera and/or the features recorded by other feature acquisition modules can be uploaded to the remote robot management system and processed in the robot management system to obtain a navigation map.
  • the instruction code can be stored in any type of computer-accessible memory (for example, permanent or modifiable, volatile or non-volatile, solid-state or non-solid-state, fixed or replaceable media, etc.).
  • the memory may be, for example, programmable array logic ("PAL"), random access memory ("RAM"), programmable read-only memory ("PROM"), read-only memory ("ROM"), electrically erasable programmable ROM ("EEPROM"), a magnetic disk, an optical disc, a digital versatile disc ("DVD"), and so on.
  • Fig. 4 shows a schematic diagram of the system composition of the robot system composed of the above-mentioned mapping robot and working robot.
  • the robot system includes a mapping robot 1, a working robot 5 and a robot management system 4.
  • the robot management system 4 cooperatively controls multiple mapping robots 1 and/or multiple working robots 5.
  • the mapping robot 1 and the working robot 5 can work simultaneously in the same working area.
  • the mapping robot will move to the position where the mapping is needed after receiving the mapping instruction.
  • the mapping robot moves along a motion path, and the camera on it can capture the ground texture features in the motion path and/or record the features through other feature acquisition modules.
  • the feature processing module of the mapping robot transmits the captured images and/or recorded features to the robot management system in real time.
  • after the robot management system processes the images and/or features, it updates the original navigation map if necessary.
  • the robot management system 4 communicates with the working robot 5, and transmits the updated navigation map to the working robot in real time, so that the working robot can locate according to the updated navigation map.
  • Working robots are generally used to carry goods in warehouses and other occasions.
  • the working robot 5 stores a navigation map and is provided with a camera or other information collection module.
  • the working robot can compare the image captured by the camera and/or the features recorded by other information collection modules with the images and/or features stored in the navigation map to obtain coordinate position information of the working robot for navigation.
  • the working robot is arranged to compare the image captured by its camera and/or the features recorded by other information collection modules with the nearby location images and/or features stored in the navigation map, obtain the working robot's displacement and rotation angle relative to a feature position with known coordinate information, and then locate the working robot's coordinate position in the navigation map to realize navigation.
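The displacement and rotation angle relative to a feature position with known coordinates can be recovered from matched feature points; a minimal two-point sketch follows (illustrative Python; a practical system would use a least-squares Kabsch/Procrustes fit over many noisy correspondences):

```python
import math

def rigid_transform_2d(src, dst):
    """Estimate the rotation angle and translation mapping `src` points
    onto `dst` points (two matched 2D feature sets, noise-free here)."""
    (x0, y0), (x1, y1) = src
    (u0, v0), (u1, v1) = dst
    # Rotation: difference between the headings of the two segments.
    theta = math.atan2(v1 - v0, u1 - u0) - math.atan2(y1 - y0, x1 - x0)
    # Translation: what is left after rotating the first source point.
    tx = u0 - (x0 * math.cos(theta) - y0 * math.sin(theta))
    ty = v0 - (x0 * math.sin(theta) + y0 * math.cos(theta))
    return theta, tx, ty
```

Applying the recovered transform to the feature's known map coordinates yields the robot's coordinate position in the navigation map.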
  • the basic workflow of the working robot includes:
  • the working robot receives tasks such as cargo handling and moves to the starting origin or a specific coordinate point in any path;
  • if the working robot recognizes that the ground texture and/or other features do not match the stored map, it issues a mapping instruction to the robot management system, detours to avoid the mismatched area if necessary, then waits for the new mapping information, and continues working after receiving it.
  • the feature mismatch here refers to a certain proportion of mismatch, and the proportion can be set as required.
  • the working robot may first process the work in other areas, and wait for the map of the local area to update before performing the work in the local area.
  • the robot management system dispatches a suitable mapping robot to the area according to the mapping robot's battery power, its distance from the area, and so on, and updates the local navigation map of the area in time, thereby ensuring stability and accuracy.
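The dispatch decision based on battery power and distance could be sketched as a simple scoring rule (illustrative Python; the linear battery-minus-distance trade-off is an assumption, since the description only names the factors considered):

```python
def dispatch(mapping_robots, area_position):
    """Pick the mapping robot to send to a mismatched area: prefer high
    battery power and a short distance to the area.

    Each robot is (id, battery_level, (x, y) position).
    """
    def score(robot):
        rid, battery, (x, y) = robot
        ax, ay = area_position
        distance = ((x - ax) ** 2 + (y - ay) ** 2) ** 0.5
        return battery - distance          # simple illustrative trade-off
    return max(mapping_robots, key=score)[0]
```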
  • the mismatched area can be determined based on the previous pose information of the working robot, for example, it extends to the surrounding based on the previous pose information of the working robot.
  • the warehouse may be divided into blocks in advance, and the block in which the working robot is located is confirmed according to the previous pose information of the working robot, and the block and/or adjacent blocks are regarded as non-matching areas.
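The block-based determination of the non-matching area can be sketched as a grid lookup (illustrative Python; the block size and the choice of the 8 neighbouring blocks are assumptions, since the description leaves both open):

```python
def block_of(pose, block_size=5.0):
    """Map a (x, y) pose to a pre-divided warehouse block (grid cell)."""
    x, y = pose
    return int(x // block_size), int(y // block_size)

def mismatch_area(last_pose, block_size=5.0):
    """The block containing the robot's previous pose plus its eight
    neighbours, treated together as the non-matching area to re-map."""
    bx, by = block_of(last_pose, block_size)
    return {(bx + dx, by + dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)}
```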
  • other methods can also be used to determine the unmatched area, as long as the navigation map can be updated at the mismatch identified by the working robot.
  • when more than a predetermined number of working robots issue mapping instructions to the robot management system within a short period of time, this indicates that the environment has changed significantly, and the robot management system instructs the mapping robot to re-map the entire area.
  • the navigation map is thus updated to ensure the effective operation of the entire system.
  • the robot management system may also instruct the mapping robot to perform other operations according to actual conditions to ensure the effectiveness of the navigation map.
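The escalation rule above — many working robots reporting mismatches in a short window implies a significant environment change — can be sketched as a sliding-window counter. The class name, the limit of 5 robots, and the 60-second window are illustrative.

```python
from collections import deque

class RemapTrigger:
    """Count mapping instructions in a sliding time window; when more than
    'limit' distinct working robots report within 'window' seconds,
    escalate from a local map update to a full-area re-map."""
    def __init__(self, limit=5, window=60.0):
        self.limit = limit
        self.window = window
        self.events = deque()  # (timestamp, robot_id), oldest first

    def report(self, t, robot_id):
        """Record one mapping instruction; return the scope to re-map."""
        self.events.append((t, robot_id))
        # Drop reports that have aged out of the window.
        while self.events and t - self.events[0][0] > self.window:
            self.events.popleft()
        robots = {rid for _, rid in self.events}
        return "full_area" if len(robots) > self.limit else "local"
```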
  • the working robot itself has a mapping function.
  • the first mapping can be performed by the mapping robot, the working robot, or both the mapping robot and the working robot.
  • the working robot has a working mode and a mapping mode, and the two modes can be switched mutually.
  • the working robot has a conversion module configured to control the working robot to switch between a working mode and a mapping mode under a first predetermined condition.
  • the working robot can record the features of the local motion path through its information acquisition module and transmit the recorded features to the robot management system through its information processing module; when a mismatch is recognized in the working mode, the robot switches to the mapping mode, re-maps the local path, and the navigation map is updated in time.
  • after the local path has been re-mapped, the working robot switches back to the working mode.
  • under the second predetermined condition, the robot management system instructs the mapping robot to perform some or all of the tasks of the working robot's mapping mode.
  • the robot management system instructs the mapping robot to update the navigation map in place of the working robot, and the working robot maintains the working mode for a predetermined period of time without switching to the mapping mode.
  • the robot management system instructs the mapping robot to take over updating the navigation map from some or all of the working robots, ensuring that a certain number of working robots remain in the working mode.
  • the robot management system instructs the mapping robot to replace the working robot to update the navigation map.
  • the mapping robot can also replace the working robot to perform tasks in the mapping mode in other situations.
  • the first predetermined condition and the second predetermined condition can be set appropriately across the entire robot system to dynamically balance work effectiveness and the validity of the map data.
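One illustrative way to encode this dynamic balance between the two predetermined conditions: prefer an idle mapping robot, allow a working robot to switch modes only while enough workers remain, and otherwise defer the re-map. The policy, function name, and the default of 3 minimum workers are assumptions, not the patent's specification.

```python
def assign_remap(working_in_work_mode, idle_mapping_robots, min_working=3):
    """Decide who handles a detected mismatch:
    - an idle mapping robot, if one exists (second condition: spare workers);
    - otherwise the working robot itself switches to mapping mode, but only
      if enough robots would remain in the working mode;
    - otherwise defer, keeping all robots working for now."""
    if idle_mapping_robots > 0:
        return "mapping_robot"
    if working_in_work_mode - 1 >= min_working:
        return "working_robot_switches"
    return "defer"
```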
  • RMS: Robot Management System
  • this application uses cloud sharing of map data among multiple robots, which can be applied not only to ground pattern recognition but also to other navigation methods such as laser navigation.
  • the tasks of the mapping robot and the working robot in the robot system of the present application are separated, so the map can be updated at shorter time intervals, or under a smaller map-environment-change threshold, without affecting work, thereby reducing manual effort (for example, reducing the frequency of manual map recalibration).
  • a logical unit and/or module may be a physical unit and/or module, may be a part of a physical unit and/or module, or may be implemented as a combination of multiple physical units and/or modules.
  • the physical implementation of these logical units and/or modules is not the most important aspect.
  • the combination of functions realized by these logical units and/or modules is the key to solving the technical problem proposed by the present invention.
  • the foregoing system embodiments of the present invention do not introduce units and/or modules that are not closely related to solving the technical problems proposed by the present invention; this does not indicate that no other units and/or modules exist in the foregoing system embodiments.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention relates to a robotic system, a mapping system, and a method for a robotic navigation map. The robotic system comprises a mapping robot (1) and a working robot (5). The mapping robot (1) and the working robot (5) can operate simultaneously in the same working area and are managed in a coordinated manner by a robot management system (4). During the initial mapping of the navigation map, removable markers (3) are placed on the ground; once the initial mapping is complete, the removable markers (3) are removed. In the mapping system, ground texture features of the working area can be acquired at any time, and whether the navigation map needs to be updated is determined on the basis of the acquired ground texture features.
PCT/CN2020/078789 2019-03-13 2020-03-11 Robotic system, mapping system, and method for a robotic navigation map WO2020182146A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910196698.1 2019-03-13
CN201910196698.1A CN111693046A (zh) 2019-03-13 2019-03-13 Robot system, and robot navigation map construction system and method

Publications (1)

Publication Number Publication Date
WO2020182146A1 true WO2020182146A1 (fr) 2020-09-17

Family

ID=72426123

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/078789 WO2020182146A1 (fr) 2019-03-13 2020-03-11 Robotic system, mapping system, and method for a robotic navigation map

Country Status (2)

Country Link
CN (1) CN111693046A (fr)
WO (1) WO2020182146A1 (fr)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113701767B (zh) * 2020-05-22 2023-11-17 杭州海康机器人股份有限公司 Map update triggering method and system
CN112146662B (zh) * 2020-09-29 2022-06-10 炬星科技(深圳)有限公司 Guided mapping method and device, and computer-readable storage medium
CN112731923B (zh) * 2020-12-17 2023-10-03 武汉万集光电技术有限公司 Cluster-robot cooperative positioning system and method
CN113246136B (zh) * 2021-06-07 2021-11-16 深圳市普渡科技有限公司 Robot, map construction method and device, and storage medium
CN113532421B (zh) * 2021-06-30 2024-04-26 同济人工智能研究院(苏州)有限公司 Dynamic laser SLAM method based on submap updating and reflector optimization
CN114355877B (зh) * 2021-11-25 2023-11-03 烟台杰瑞石油服务集团股份有限公司 Method and device for allocating working areas among multiple robots
CN114873178A (zh) * 2022-05-18 2022-08-09 上海飒智智能科技有限公司 Deployment-free AMR system for a production workshop

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012136555A1 * 2011-04-08 2012-10-11 Siemens Aktiengesellschaft Device for locating and navigating autonomous vehicles, and method for operating said vehicles
CN103777637A (zh) * 2014-02-13 2014-05-07 苏州工业园区艾吉威自动化设备有限公司 Reflector-free laser self-guided AGV and navigation method thereof
CN104679004A (zh) * 2015-02-09 2015-06-03 上海交通大学 Automated guided vehicle combining flexible and fixed paths, and guidance method thereof
CN107703940A (zh) * 2017-09-25 2018-02-16 芜湖智久机器人有限公司 Navigation method based on ceiling QR codes
CN107702722A (zh) * 2017-11-07 2018-02-16 云南昆船智能装备有限公司 Natural navigation and positioning method for laser-guided AGVs
CN108225303A (zh) * 2018-01-18 2018-06-29 水岩智能科技(宁波)有限公司 QR-code positioning tag, and QR-code-based positioning and navigation system and method

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103884330B (zh) * 2012-12-21 2016-08-10 联想(北京)有限公司 Information processing method, mobile electronic device, guiding device, and server
CN104067145B (zh) * 2014-05-26 2016-10-05 中国科学院自动化研究所 Pruning robot system
CN105203094B (zh) * 2015-09-10 2019-03-08 联想(北京)有限公司 Map construction method and device
CN105425807B (zh) * 2016-01-07 2018-07-03 朱明 Indoor robot navigation method and device based on artificial landmarks
US9864377B2 (en) * 2016-04-01 2018-01-09 Locus Robotics Corporation Navigation using planned robot travel paths
CN108919811A (zh) * 2018-07-27 2018-11-30 东北大学 Tag-based indoor mobile robot SLAM method


Also Published As

Publication number Publication date
CN111693046A (zh) 2020-09-22

Similar Documents

Publication Publication Date Title
WO2020182146A1 (fr) Robotic system, mapping system, and method for a robotic navigation map
CN108287544B Intelligent robot route planning and original-path return method and system
CN111958591B Autonomous inspection method and system for a semantic intelligent substation inspection robot
CN109720340B Automatic parking system and method based on visual recognition
CN107907131B Positioning system and method, and applicable robot
US10278333B2 Pruning robot system
CN106813672B Navigation method for a mobile robot, and mobile robot
CN106774310A Robot navigation method
CN107179082B Autonomous exploration method and navigation method based on fusion of topological and metric maps
US11846949B2 Systems and methods for calibration of a pose of a sensor relative to a materials handling vehicle
CN104635735A Novel AGV visual navigation control method
CN103105858A Method for target zooming and master-slave tracking between a fixed camera and a PTZ camera
EP3745085A1 Method and system for multi-device visual navigation in a variable scene
CN111037552A Inspection configuration and implementation method for a wheeled inspection robot in a power distribution room
CN109144068A Electric control method and control device for navigation-switching three-way reach AGV forklifts
CN109459032B Mobile robot positioning method, navigation method, and grid map construction method
CN106647738A Docking path determination method and system for an automated guided vehicle, and automated guided vehicle
WO2022027611A1 Positioning method and map construction method for a mobile robot, and mobile robot
JP2011039968A (ja) Vehicle movable area detection device
CN112204345A Indoor positioning method for a mobile device, mobile device, and control system
CN115014338A Mobile robot positioning system and method based on QR-code vision and laser SLAM
WO2023274177A1 Map construction method and apparatus, device, warehousing system, and storage medium
KR101319525B1 System for providing target location information using a mobile robot
CN112833890A Map construction method, apparatus, device, robot, and storage medium
CN112388626B Robot-assisted navigation method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20769663

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20769663

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 25.01.2022)
