WO2017028653A1 - Method and system for a mobile robot to self-build a map indoors - Google Patents
Method and system for a mobile robot to self-build a map indoors (一种移动机器人室内自建地图的方法和系统)
- Publication number
- WO2017028653A1 (application PCT/CN2016/091033)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- map
- robot
- grid
- value
- information
- Prior art date
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0268—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
- G05D1/0274—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
- G01C21/206—Instruments for performing navigational calculations specially adapted for indoor navigation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/38—Electronic maps specially adapted for navigation; Updating thereof
- G01C21/3804—Creation or updating of map data
- G01C21/3807—Creation or updating of map data characterised by the type of data
- G01C21/383—Indoor data
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/38—Electronic maps specially adapted for navigation; Updating thereof
- G01C21/3804—Creation or updating of map data
- G01C21/3833—Creation or updating of map data characterised by the source of data
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/38—Electronic maps specially adapted for navigation; Updating thereof
- G01C21/3804—Creation or updating of map data
- G01C21/3833—Creation or updating of map data characterised by the source of data
- G01C21/3837—Data obtained from a single source
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/38—Electronic maps specially adapted for navigation; Updating thereof
- G01C21/3863—Structures of map data
- G01C21/387—Organisation of map data, e.g. version management or database structures
- G01C21/3881—Tile-based structures
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C22/00—Measuring distance traversed on the ground by vehicles, persons, animals or other moving solid bodies, e.g. using odometers, using pedometers
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/02—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
- G01S15/06—Systems determining the position data of a target
- G01S15/08—Systems for measuring distance only
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0255—Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultra-sonic singals
Definitions
- the present invention relates to the field of automation technologies, and in particular to a method and system for a mobile robot to self-build a map indoors.
- the development of robotics is a product of the combined progress of science and technology. By purpose, robots can be divided into military robots, industrial robots, service robots, and so on; across these categories there is enormous demand for mobile robots.
- the research scope of mobile robots covers architecture, control mechanisms, information systems, sensing technology, planning strategies, and drive systems, and involves many subject areas including mechanical kinematics, artificial intelligence, intelligent control, pattern recognition, image processing, vision technology, sensor technology, computer networks and communications, and even bioinformatics.
- mobile robots are widely used not only in sectors such as industry, agriculture, medical care, and services, but also in hazardous and dangerous settings such as urban security, defense, and space exploration.
- the research level of mobile robots is an important indicator to measure a country's level of scientific and technological development and comprehensive national strength.
- the "robot revolution” is expected to be an entry point and an important growth point for the "third industrial revolution", which will affect the global manufacturing landscape.
- the International Federation of Robotics (IFR) predicts that the "robot revolution" will create a trillion-dollar market, driving the rapid development of key technologies and markets related to robots, such as new-material functional modules, perception acquisition and recognition, and intelligent control and navigation.
- map construction is the primary problem and the core technology for realizing robot navigation and even higher intelligence; building a map is the prerequisite for localization. Map construction involves the following sub-problems: map representation methods, sensors, the description and processing of uncertain information, and simultaneous localization and mapping.
- maps built by indoor robots are mainly planar maps; common representation methods include grid maps, geometric feature maps, and topological maps. This technical solution uses a grid map.
- a grid map divides the whole working environment into a number of cells of the same size; radar, sonar, ultrasonic, and other detectors are used to obtain obstacle information for each grid, indicating the likelihood that each grid contains an obstacle.
- the information in each grid corresponds directly to an area in the real environment, making it easy to create, maintain, and understand.
- the grid map uses probability values to represent the uncertainty of the grid model and can provide relatively accurate metric information. Owing to these advantages, grid maps are widely used in mobile robots. However, when the map data volume is large, the storage requirements and data-maintenance workload of the grid map increase, which places a heavy burden on real-time processing by the computer.
- the geometric feature map represents the environment by abstract geometric features (such as points, lines, and planes) extracted from the environment-perception data collected by the robot.
- the method is more compact and is convenient for position estimation and target recognition, but extracting geometric information requires additional processing of the detected data, is not applicable to unstructured environments, and requires a large amount of accurate measurement data to obtain an accurate model.
- the topological map represents the indoor environment as a topological graph with nodes and connecting edges, where nodes represent important locations in the environment (corners, doors, elevators, stairs, etc.) and edges represent the connections between nodes, such as corridors.
- the method occupies little storage space, allows fast path planning, does not require precise position information of the robot, and provides a more natural interface for issuing human-computer interaction commands. However, it ignores environmental detail, making it difficult to plan the robot's path precisely; when sensor detection errors are large, such a map is difficult to create and maintain; and if two places in the environment are very similar, it is hard to determine whether they correspond to the same point on the map, which makes correct recognition difficult in large-scale environments.
- the three map representation methods each have advantages and disadvantages; the robot's application scenario must be taken into account, and other technologies used in support, to realize a high-precision, easy-to-maintain map model.
- the invention provides a method and a system for a mobile robot to self-build an indoor map. Combining the characteristics of indoor movement, it obtains path information and obstacle information from detectors, adjusts and matches the grid information to the indoor environment in real time, and guarantees the validity of the map model.
- the grid map is easy to maintain, making it easy to access map data quickly.
- the scheme has wide technical adaptability, can respond quickly to diverse environments, and can quickly create dynamic map models that reflect real environment information, effectively enabling robot positioning and navigation.
- the technical solution of the present invention provides a method for a mobile robot to self-build a map indoors, comprising the following steps:
- forming an initialized map;
- marking a coordinate origin in the initialized map;
- the robot traverses the indoor feasible area and records path information and environment information;
- the robot calculates and marks the CV value of each grid in the map;
- a map is established according to the path information and the CV values.
- forming the initialized map further includes:
- the map is a mesh pattern composed of a series of square grids
- the map marks the location of the indoor environment in a grid form, each grid representing an indoor area of an actual size of 200 mm x 200 mm;
- the user sets the initial map size according to the range of motion of the robot indoors.
- the map stores the information of the grid in a two-dimensional array
- the data type of the two-dimensional array is a custom data structure AStarNode, and the data structure is defined as follows:
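- for reference, the AStarNode structure as given in the description below is:

```c
typedef struct AStarNode
{
    int s_x;
    int s_y;
    int s_g;
    int s_h;
    int s_cv;
    struct AStarNode *s_parent;
    int s_is_in_closetable;
    int s_is_in_opentable;
} AStarNode;

AStarNode map_maze[400][400];
```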
- map_maze is a two-dimensional array of AstarNode data structures
- s_x is the position information of the abscissa (X coordinate);
- s_y is the position information of the ordinate (Y coordinate);
- s_cv represents the grid CV value, i.e. the probability that the grid is occupied by an obstacle; the range is 0–1, where 0 indicates no obstacle and 1 indicates occupied by an obstacle, and a larger value indicates a higher probability that the grid is occupied by an obstacle;
- in the initialized map, every grid has a CV value of 1, indicating that the whole map is initially treated as obstacles.
- marking the coordinate origin in the initialized map further includes:
- the charging pile of the robot is used as the coordinate origin of the map;
- the east-west direction is the X axis and the north-south direction is the Y axis.
- the robot traverses the feasible area of the room, records path information and environment information, and further includes:
- the robot records path information and CV values during indoor movement;
- the robot marks the location information of the feasible area in the map according to the measured value of the odometer
- the robot marks the CV value in the map based on values obtained by ultrasonic sensor fusion.
- the robot calculates the X and Y coordinate values of its position according to the odometer, and the calculation method is:
- X(0), Y(0) is the position of the robot at the initial moment, and D(i) and the corresponding azimuth are the distance travelled and the heading of the robot from moment i-1 to moment i.
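- the formula itself appears only as a figure in the original publication; a plausible reconstruction of the standard dead-reckoning relation, assuming θ(i) denotes the azimuth of the segment from moment i-1 to moment i, is:

$$X(k) = X(0) + \sum_{i=1}^{k} D(i)\cos\theta(i), \qquad Y(k) = Y(0) + \sum_{i=1}^{k} D(i)\sin\theta(i)$$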
- the robot performs fusion of the ultrasonic sensor measurements using two methods:
- Method 1: data fusion of different sensors at the same moment.
- this method estimates the CV value of each grid according to a Bayesian probability algorithm.
- the Bayesian probability algorithm is:
- the detection data of two ultrasonic sensors are fused to obtain a CV value;
- the detection range of each ultrasonic sensor is divided into three classes: Class I is unoccupied, Class II is possibly occupied, and Class III is uncertain;
- the probability that a grid is occupied is defined as P(O), and the probability that it is not occupied as P(E) = 1 - P(O);
- R is the ultrasonic detection range, and r is the actually detected distance;
- Method 2: fusion of ultrasonic measurements of the same grid at different moments.
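- the fusion formulas are given only as figures in the original publication; the following is a minimal C sketch of the two fusion steps under assumed probability assignments (the tolerance band eps and the per-class probabilities are illustrative choices, not the patent's values):

```c
/* Class I / II / III occupancy estimate for a map cell at distance d from the
 * sensor, given the measured echo distance r and the detection range R.
 * The band eps and the probabilities are assumptions for illustration only. */
static double single_reading_prob(double d, double r, double R)
{
    const double eps = 0.1;                 /* tolerance band around the echo   */
    if (d < r - eps)  return 0.1;           /* Class I: unoccupied              */
    if (d <= r + eps) return 0.9;           /* Class II: possibly occupied      */
    if (d <= R)       return 0.5;           /* Class III: uncertain             */
    return 0.5;                             /* beyond the range: no information */
}

/* Method 1: Bayesian fusion of the two sensors' estimates at the same moment. */
static double fuse_two_sensors(double p1, double p2)
{
    return (p1 * p2) / (p1 * p2 + (1.0 - p1) * (1.0 - p2));
}

/* Method 2: Bayesian update of a grid's stored CV value with a later estimate. */
static double fuse_over_time(double cv_old, double cv_new)
{
    return (cv_old * cv_new) /
           (cv_old * cv_new + (1.0 - cv_old) * (1.0 - cv_new));
}
```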
- the CV value of each grid is distinguished by color on the map;
- a grid with a CV value of 0 to 0.2 is a fully walkable area, shown in white;
- a grid with a CV value of 0.2 to 0.8 is an area where obstacles may exist, shown in gray;
- a grid with a CV value of 0.8 to 1 is a completely non-walkable area, shown in black.
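- a tiny illustrative helper for this color coding (threshold values from the text above; the function and names are not from the patent):

```c
/* Map a grid CV value to the display color described above. */
static const char *cv_to_color(double cv)
{
    if (cv <= 0.2) return "white";   /* fully walkable          */
    if (cv <= 0.8) return "gray";    /* an obstacle may exist   */
    return "black";                  /* completely non-walkable */
}
```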
- after the robot has traversed the room, the self-built map is formed, and the position of the door is manually revised in the map, which further includes:
- while walking indoors, the robot uses the side ultrasonic sensor to measure its distance from the wall;
- when the door is open, two jumps in the ultrasonic sensor reading (the first from small to large, the second from large to small) indicate that there is a door at that location;
- the robot marks a door identifier at the midpoint between the two jumps;
- after the map has been created, door markers at positions that are not doors are manually cleared through the software, and the locations of doors are shown in a different color on the software interface.
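- a minimal sketch of this door-detection rule (not the patent's code; the jump threshold is an assumed parameter):

```c
#include <stddef.h>

#define JUMP_THRESHOLD 0.5  /* assumed minimum change, in metres, that counts as a jump */

/* readings[i] is the side-sensor distance to the wall at sample i along the wall.
 * Returns the index of the estimated door centre, or -1 if no door-like pattern. */
static int find_door_index(const double *readings, size_t n)
{
    int rise = -1;
    for (size_t i = 1; i < n; ++i) {
        double delta = readings[i] - readings[i - 1];
        if (rise < 0 && delta > JUMP_THRESHOLD)        /* first jump: small -> large  */
            rise = (int)i;
        else if (rise >= 0 && delta < -JUMP_THRESHOLD) /* second jump: large -> small */
            return (rise + (int)i) / 2;                /* midpoint = door position    */
    }
    return -1;
}
```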
- the technical solution of the present invention further provides a system for a mobile robot to self-build a map indoors, comprising an odometer, an ultrasonic sensor, and a processing unit, wherein
- the odometer is used to calculate the distance and angle at which the robot walks indoors;
- the ultrasonic sensor is used for detecting obstacle information at different distances around the robot, wherein there is one ultrasonic sensor on each side of the robot and no less than one ultrasonic sensor on the front side;
- the processing unit is configured to calculate the path information of the robot's walking and the grid information of the map, and to store the map grid data.
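- a minimal sketch of how these components might be represented in code (type and field names are illustrative assumptions, not from the patent):

```c
typedef struct { double distance_mm; double angle_rad; } OdometerReading;
typedef struct { double range_m; } UltrasonicReading;

/* One snapshot of sensor data consumed by the processing unit, which computes
 * the robot's path and the map's grid (CV) information and stores the grid data. */
typedef struct {
    OdometerReading   odo;        /* distance and angle walked indoors     */
    UltrasonicReading side[2];    /* one ultrasonic sensor on each side    */
    UltrasonicReading front[2];   /* no fewer than one sensor on the front */
} SensorFrame;
```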
- the technical scheme of the invention uses an odometer to measure the moving distance in real time and thereby infer the position of the robot at any time and place, and uses ultrasonic sensors to detect indoor environment information within a certain distance around the robot; the robot combines the inferred position with the detected surrounding indoor environment information to locate the surrounding obstacles in the map.
- in this scheme the grid map corresponds closely to the actual indoor environment, the map is easy to maintain, map data can be accessed conveniently and quickly, and the technique is widely adaptable; it can quickly create a dynamic map model that reflects real environment information, effectively enabling robot positioning and navigation.
- FIG. 1 is a flowchart of the method for a mobile robot to self-build a map indoors in Embodiment 1 of the present invention;
- FIG. 2 is a flowchart of the method for building an initialized grid map in Embodiment 2 of the present invention;
- FIG. 3 is a schematic diagram of the initialized grid map in Embodiment 2 of the present invention;
- FIG. 4 is a flowchart of the method for the mobile robot to record its path indoors in Embodiment 3 of the present invention;
- FIG. 5 is a schematic diagram of the principle by which the mobile robot records its path indoors in Embodiment 3 of the present invention;
- FIG. 6 is a flowchart of the method for the robot to estimate grid CV values in Embodiment 4 of the present invention;
- FIG. 7 is a schematic diagram of the principle of ultrasonic detection in Embodiment 4 of the present invention;
- FIG. 8 is a flowchart of a method for manually modifying position information of a door according to Embodiment 5 of the present invention.
- FIG. 9 is a structural diagram of a system for constructing a self-built map of a mobile robot in the first to fourth embodiments of the present invention.
- Embodiment 1: A method for a mobile robot to self-build a map indoors.
- FIG. 1 is a flowchart of the method for a mobile robot to self-build a map indoors in the first embodiment of the present invention. As shown in FIG. 1, the process includes the following steps:
- Step 101 Form an initialized map.
- the map is a mesh pattern composed of a series of square grids
- the map marks the location of the indoor environment in a grid
- the initial map size is 400 ⁇ 400 grids, and each grid represents an indoor area of actual size 200mm ⁇ 200mm;
- the map stores the information of the grid in a two-dimensional array.
- the data type of the two-dimensional array is a custom data structure AStarNode, and the data structure is defined as follows:
- map_maze is a two-dimensional array of AstarNode data structures
- s_x is the position information of the abscissa (X coordinate);
- s_y is the position information of the ordinate (Y coordinate);
- s_cv represents the grid CV value, i.e. the probability that the grid is occupied by an obstacle; the range is 0–1, where 0 indicates no obstacle and 1 indicates occupied by an obstacle, and a larger value indicates a higher probability that the grid is occupied by an obstacle;
- in the initialized map, every grid has a CV value of 1, indicating that the whole map is initially treated as obstacles.
- Step 102 Mark a coordinate origin in the initialization map.
- the charging pile of the robot is used as the coordinate origin of the map;
- the origin lies on the edge of the map, at the center of that edge;
- the direction parallel to the map edge is the X axis, and the direction perpendicular to the map edge is the Y axis.
- Step 103 The robot traverses the feasible area of the room, and records path information and environment information.
- the data is recorded when the robot starts
- the robot marks the location information of the feasible area in the map according to the measured value of the odometer.
- Step 104 The robot calculates and marks the CV value of each grid in the map.
- the robot marks the CV value in the map based on the measured value of the ultrasonic sensor.
- Step 105 Establish a map according to the path information and the CV value.
- the robot constructs a map according to the path information and CV value
- while walking indoors, the robot uses the side ultrasonic sensor to measure its distance from the wall; when the door is open, two jumps in the ultrasonic sensor reading (the first from small to large, the second from large to small) indicate that there is a door at that position;
- the robot marks a door identifier at the midpoint between the two jumps;
- after the map has been created, erroneous door markers that are not doors are manually cleared through the software, and the locations of doors are shown in a different color on the software interface.
- Embodiment 2: A method of building an initialized grid map.
- FIG. 2 is a flowchart of the method for building an initialized grid map in Embodiment 2 of the present invention. As shown in FIG. 2, the method includes the following steps:
- Step 201 Form an initialized mesh grid map.
- the initialized grid map is a mesh pattern composed of a series of square grids;
- the initialized grid map marks the locations of the indoor environment in grid form;
- the initial map size is 400×400 grids, and each grid represents an indoor area of actual size 200mm×200mm;
- the initial map stores the information of each grid in a two-dimensional array.
- the data type of the two-dimensional array is a custom data structure, AStarNode, which is defined as follows:
- map_maze is a two-dimensional array of AstarNode data structures
- s_x is the position information of the abscissa (X coordinate);
- s_y is the position information of the ordinate (Y coordinate);
- s_cv represents the grid CV value, i.e. the probability that the grid is occupied by an obstacle; the range is 0–1, where 0 indicates no obstacle and 1 indicates occupied by an obstacle, and a larger value indicates a higher probability that the grid is occupied by an obstacle.
- Step 202 Assign a CV value to the grid map.
- in the initialized map, the s_cv value (grid CV value) of every grid is 1, indicating that the whole map is initially treated as obstacles.
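- a minimal sketch of this initialization step (it assumes the AStarNode definition and map_maze array shown in step 201 are in scope; the loop itself is illustrative):

```c
/* Initialize the 400 x 400 map_maze array declared in step 201: every cell
 * stores its own grid coordinates and starts with CV = 1 (an obstacle). */
static void init_map(void)
{
    for (int x = 0; x < 400; ++x) {
        for (int y = 0; y < 400; ++y) {
            map_maze[x][y].s_x = x;
            map_maze[x][y].s_y = y;
            map_maze[x][y].s_cv = 1;            /* initially all obstacles */
            map_maze[x][y].s_parent = 0;
            map_maze[x][y].s_is_in_closetable = 0;
            map_maze[x][y].s_is_in_opentable = 0;
        }
    }
}
```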
- Step 203 Mark a coordinate origin in the initialization map.
- the east-west direction is the X-axis and the north-south direction is the Y-axis.
- Embodiment 3: A method for a mobile robot to record its path indoors.
- FIG. 4 is a flowchart of the method for the mobile robot to record its path indoors in the third embodiment of the present invention. As shown in FIG. 4, the process includes the following steps:
- Step 301 The odometer records the moving distance of the robot.
- Step 302 Calculate an angle change of the robot movement.
- Step 303 Calculate the moving distance of the robot in the X and Y axis directions.
- the movement distances along the X axis and the Y axis are calculated from the recorded distances and angle changes;
- X(0), Y(0) is the position of the robot at the initial moment, and D(i) and the corresponding azimuth are the distance travelled and the heading of the robot from moment i-1 to moment i.
- Step 304 Calculate coordinate values of the position of the robot.
- the X coordinate value is X(k)/200
- the Y coordinate value is Y(k)/200.
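- a minimal sketch of this dead-reckoning and grid-index conversion (the symbol θ(i) for the segment azimuth and the function names are assumptions; the 200 mm grid size is from the text above):

```c
#include <math.h>

typedef struct { double x_mm; double y_mm; int gx; int gy; } RobotPose;

/* d_mm[i] and theta_rad[i] describe segment i (from moment i-1 to moment i). */
static RobotPose dead_reckon(double x0_mm, double y0_mm,
                             const double *d_mm, const double *theta_rad, int k)
{
    RobotPose p = { x0_mm, y0_mm, 0, 0 };
    for (int i = 0; i < k; ++i) {
        p.x_mm += d_mm[i] * cos(theta_rad[i]);  /* X(k) = X(0) + sum D(i)*cos(theta(i)) */
        p.y_mm += d_mm[i] * sin(theta_rad[i]);  /* Y(k) = Y(0) + sum D(i)*sin(theta(i)) */
    }
    p.gx = (int)(p.x_mm / 200.0);               /* step 304: X coordinate = X(k)/200 */
    p.gy = (int)(p.y_mm / 200.0);               /* step 304: Y coordinate = Y(k)/200 */
    return p;
}
```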
- Embodiment 4: A method for the robot to estimate grid CV values.
- FIG. 6 is a flowchart of a method for estimating a grid CV value by a robot according to a fourth embodiment of the present invention. As shown in FIG. 6, the process includes the following steps:
- Step 401 Using two ultrasonic sensors to detect data.
- Two ultrasonic sensors are located directly in front of the robot in a side-by-side position.
- Step 402 Calculate a CV value.
- the robot performs fusion of the ultrasonic sensor measurements using two methods:
- Method 1 Data fusion of different sensors at the same time.
- the CV value of each grid is obtained according to the Bayesian probability estimation.
- the Bayesian probability algorithm is:
- the detection data of two ultrasonic sensors are used for fusion to obtain a CV value
- the detection range of a single ultrasonic sensor is divided into three classes: Class I is unoccupied, Class II is possibly occupied, and Class III is uncertain;
- R is the ultrasonic detection range, and r is the actually detected distance
- Method 2 Fusion of measured values of ultrasonic sensors of the same grid at different times.
- Step 403 Label the CV value in the map.
- a grid with a CV value of 0 to 0.2 is a fully walkable area, shown in white;
- a grid with a CV value of 0.2 to 0.8 is an area where obstacles may exist, shown in gray;
- a grid with a CV value of 0.8 to 1 is a completely non-walkable area, shown in black.
- Embodiment 5: A method of manually revising the position information of a door.
- FIG. 8 is a flowchart of a method for manually marking position information of a door according to Embodiment 5 of the present invention. As shown in FIG. 8, the process includes the following steps:
- Step 501 The robot records two jumps of the ultrasonic sensor reading.
- while walking indoors, the robot uses the side ultrasonic sensor to detect its distance from the wall;
- when the door is open, two jumps in the ultrasonic sensor reading (the first from small to large, the second from large to small) indicate that there is a door at this position.
- Step 502 Calculate the midpoint between the two jumps.
- the middle position of the two jumps is the position of the door.
- Step 503 Manually clear identifiers that are not doors.
- after the map has been created, erroneous door markers that are not doors are manually cleared through the software, and the locations of doors are shown in a different color on the software interface.
- FIG. 9 is a structural diagram of a system for constructing a self-built map of a mobile robot in the first to fourth embodiments of the present invention.
- the system includes: an odometer 601, an ultrasonic sensor 602, and a processing unit 603, wherein
- the odometer is used to calculate the distance and angle at which the robot walks indoors;
- the ultrasonic sensor is used for detecting obstacle information at different distances around the robot, wherein there is one ultrasonic sensor on each side of the robot and no less than one ultrasonic sensor on the front side;
- the processing unit is configured to calculate the path information of the robot's walking and the grid information of the map, and to store the map grid data.
- the technical scheme of the invention uses a grid map to record information about the indoor environment, which suits the limited indoor range of activity of a mobile robot; it obtains path information and obstacle information from the detectors, realizes real-time adjustment and correspondence between the indoor environment and the grid information, and guarantees the validity of the map model.
- the grid map is easy to maintain, making it easy to access map data quickly.
- the scheme has wide technical adaptability, can respond quickly to diverse environments, and can quickly create dynamic map models that reflect real environment information, effectively enabling robot positioning and navigation.
- embodiments of the present invention can be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or a combination of software and hardware. Moreover, the invention can take the form of a computer program product embodied on one or more computer-usable storage media (including but not limited to disk storage and optical storage, etc.) including computer usable program code.
- the computer program instructions can also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to operate in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture comprising an instruction device that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
- these computer program instructions can also be loaded onto a computer or other programmable data processing device, such that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing, whereby the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Abstract
A method and system for a mobile robot to self-build a map indoors. The method includes: forming an initialized map (101); marking a coordinate origin in the initialized map (102); the robot traversing the indoor feasible area and recording path information and environment information (103); the robot calculating and marking the CV value of each grid in the map (104); and establishing a map according to the path information and the CV values (105), where the path information and CV values are obtained by mathematical algorithms. Modelling with a grid map realizes real-time adjustment and correspondence between the indoor environment and the grid information and guarantees the validity of the map model; moreover, the grid map is easy to maintain, allows convenient and fast access to map data, and is widely adaptable, able to respond quickly to diverse environments and to quickly create a dynamic map model that reflects real environment information.
Description
The present invention relates to the field of automation technologies, and in particular to a method and system for a mobile robot to self-build a map indoors.
The development of robotics is a product of the combined progress of science and technology. By purpose, robots can be divided into military robots, industrial robots, service robots, and so on; across these categories there is enormous demand for mobile robots.
The research scope of mobile robots covers architecture, control mechanisms, information systems, sensing technology, planning strategies, and drive systems, and involves many subject areas including mechanical kinematics, artificial intelligence, intelligent control, pattern recognition, image processing, vision technology, sensor technology, computer networks and communications, and even bioinformatics. Mobile robots are widely used not only in sectors such as industry, agriculture, medical care, and services, but also in hazardous and dangerous settings such as urban security, defense, and space exploration. The research level of mobile robots is an important indicator of a country's level of scientific and technological development and comprehensive national strength. The "robot revolution" is expected to be an entry point and an important growth point of the "third industrial revolution" and will affect the global manufacturing landscape. The International Federation of Robotics (IFR) predicts that the "robot revolution" will create a trillion-dollar market, driving the rapid development of key technologies and markets related to robots, such as new-material functional modules, perception acquisition and recognition, and intelligent control and navigation.
In research on autonomous mobile robots, navigation is the prerequisite for robot intelligence. The two key problems in navigation are: (1) describing the environment, i.e. map construction, and (2) locating the robot's position within the map. Of these, map construction is the primary problem and the core technology for realizing robot navigation and even higher intelligence; building a map is the prerequisite for localization. Map construction involves the following sub-problems: map representation methods, sensors, the description and processing of uncertain information, and simultaneous localization and mapping.
Maps built by indoor robots are mainly planar maps; common representation methods include grid maps, geometric feature maps, and topological maps. This technical solution uses a grid map.
A grid map divides the whole working environment into a number of cells of the same size; radar, sonar, ultrasonic, and other detectors are used to obtain obstacle information for each grid, indicating the likelihood that each grid contains an obstacle. The information in each grid corresponds directly to a region of the real environment, so the map is easy to create, maintain, and understand. The grid map uses probability values to represent the uncertainty of the grid model and can provide relatively accurate metric information. Owing to these advantages, grid maps are widely used in mobile robots. However, when the map data volume is large, the storage requirements and data-maintenance workload of the grid map also grow, which places a heavy burden on real-time processing by the computer.
A geometric feature map represents the environment by abstract geometric features (such as points, lines, and planes) extracted from the environment-perception data collected by the robot. The method is more compact and is convenient for position estimation and target recognition, but extracting geometric information requires additional processing of the detected data, is not applicable to unstructured environments, and requires a large amount of accurate measurement data to obtain an accurate model.
A topological map represents the indoor environment as a topological graph with nodes and connecting edges, where nodes represent important locations in the environment (corners, doors, elevators, stairs, etc.) and edges represent the connections between nodes, such as corridors. The method occupies little storage space, allows fast path planning, does not require precise position information of the robot, and provides a more natural interface for issuing human-computer interaction commands. However, the method ignores environmental detail, making it difficult to plan the robot's path precisely. In addition, when sensor detection errors are large, such a map is difficult to create and maintain. Moreover, if two places in the environment are very similar, it is hard to determine whether they correspond to the same point on the map, which makes correct recognition difficult in large-scale environments.
The three map representation methods each have advantages and disadvantages; the robot's application scenario must be taken into account, and other technologies used in support, to realize a high-precision, easy-to-maintain map model.
Summary of the Invention
The present invention provides a method and a system for a mobile robot to self-build a map indoors. Combining the characteristics of indoor movement, it obtains path information and obstacle information from detectors, adjusts and matches the grid information to the indoor environment in real time, and guarantees the validity of the map model. In addition, the grid map is easy to maintain, making map data convenient and fast to access. The scheme also has wide technical adaptability: it can respond quickly to diverse environments and quickly create a dynamic map model that reflects real environment information, effectively enabling robot positioning and navigation.
The technical solution of the present invention provides a method for a mobile robot to self-build a map indoors, comprising the following steps:
forming an initialized map;
marking a coordinate origin in the initialized map;
the robot traverses the indoor feasible area and records path information and environment information;
the robot calculates and marks the CV value of each grid in the map;
a map is established according to the path information and the CV values.
Further, forming the initialized map further includes:
the map is a mesh pattern composed of a series of square grids;
the map marks the locations of the indoor environment in grid form, each grid representing an indoor area of actual size 200mm×200mm;
the user sets the initial map size according to the robot's indoor range of motion.
Further, the map stores the grid information in a two-dimensional array; the data type of the two-dimensional array is a custom data structure AStarNode, defined as follows:
typedef struct AStarNode
{
int s_x;
int s_y;
int s_g;
int s_h;
int s_cv;
struct AStarNode*s_parent;
int s_is_in_closetable;
int s_is_in_opentable;
}AStarNode;
AStarNode map_maze[400][400];
where map_maze is a two-dimensional array of the AStarNode data structure;
s_x is the position information of the abscissa (X coordinate);
s_y is the position information of the ordinate (Y coordinate);
s_cv represents the grid CV value, i.e. the probability that the grid is occupied by an obstacle; the range is 0–1, where 0 indicates no obstacle and 1 indicates occupied by an obstacle, and a larger value indicates a higher probability that the grid is occupied by an obstacle;
in the initialized map, every grid has a CV value of 1, indicating that the whole map is initially treated as obstacles.
Further, marking the coordinate origin in the initialized map further includes:
the charging pile of the robot is used as the coordinate origin of the map;
the east-west direction is the X axis and the north-south direction is the Y axis.
Further, the robot traversing the indoor feasible area and recording path information and environment information further includes:
the robot records path information and CV values while moving indoors;
the robot marks the position information of the feasible area in the map according to the odometer measurements;
the robot marks CV values in the map according to values obtained by ultrasonic sensor fusion.
Further, the robot calculates the X and Y coordinate values of its position according to the odometer; the calculation method is:
Further, the robot performs fusion of ultrasonic sensor measurements using two methods:
Method 1: data fusion of different sensors at the same moment.
In this method the CV value of each grid is estimated according to a Bayesian probability algorithm, which is:
the detection data of two ultrasonic sensors are fused to obtain the CV value;
the detection range of each ultrasonic sensor is divided into three classes: Class I is unoccupied, Class II is possibly occupied, and Class III is uncertain;
the probability that a grid is occupied is defined as P(O), and the probability that it is not occupied as P(E) = 1 - P(O); then
where,
R is the ultrasonic detection range and r is the actually detected distance;
the CV value of each grid obtained after fusing the detection data of the two ultrasonic sensors at the same moment is
Method 2: fusion of ultrasonic measurements of the same grid at different moments.
The CV value of the same grid at different moments is
Further, the CV value of each grid is distinguished by color on the map;
a grid with a CV value of 0 to 0.2 is a fully walkable area, shown in white;
a grid with a CV value of 0.2 to 0.8 is an area where obstacles may exist, shown in gray;
a grid with a CV value of 0.8 to 1 is a completely non-walkable area, shown in black.
Further, after the robot has traversed the room, the self-built map is formed and the position of the door is manually revised in the map, which further includes:
while walking indoors, the robot uses the side ultrasonic sensor to measure its distance from the wall; when the door is open, two jumps in the ultrasonic sensor reading (the first from small to large, the second from large to small) indicate that there is a door at that position;
the robot marks a door identifier at the midpoint between the two jumps;
after the map has been created, door markers at positions that are not doors are manually cleared through the software, and the locations of doors are shown in a different color on the software interface.
The technical solution of the present invention further provides a system for a mobile robot to self-build a map indoors, comprising an odometer, ultrasonic sensors, and a processing unit, wherein
the odometer is used to calculate the distance and angle the robot walks indoors;
the ultrasonic sensors are used to detect obstacle information at different distances around the robot, with one ultrasonic sensor on each side of the robot and no fewer than one ultrasonic sensor on the front;
the processing unit is used to calculate the path information of the robot's walking and the grid information of the map, and to store the map grid data.
The technical solution of the present invention uses an odometer to measure the moving distance in real time and thereby infer the position of the robot at any time and place, and uses ultrasonic sensors to detect indoor environment information within a certain distance around the robot; the robot combines the inferred position with the detected surrounding environment information to locate the surrounding obstacles in the map. In this scheme the grid map corresponds closely to the actual indoor environment, the map is easy to maintain, map data can be accessed conveniently and quickly, and the technique is widely adaptable; it can quickly create a dynamic map model that reflects real environment information, effectively enabling robot positioning and navigation.
Other features and advantages of the invention will be set forth in the following description, and in part will become apparent from the description or be understood by practicing the invention. The objects and other advantages of the invention can be realized and obtained by the structures particularly pointed out in the written description, the claims, and the accompanying drawings.
The technical solution of the present invention is described in further detail below by means of the accompanying drawings and embodiments.
The accompanying drawings are provided for a further understanding of the invention and constitute a part of the specification; together with the embodiments of the invention they serve to explain the invention and do not limit it. In the drawings:
FIG. 1 is a flowchart of the method for a mobile robot to self-build a map indoors in Embodiment 1 of the present invention;
FIG. 2 is a flowchart of the method for building an initialized grid map in Embodiment 2 of the present invention;
FIG. 3 is a schematic diagram of the initialized grid map in Embodiment 2 of the present invention;
FIG. 4 is a flowchart of the method for the mobile robot to record its path indoors in Embodiment 3 of the present invention;
FIG. 5 is a schematic diagram of the principle by which the mobile robot records its path indoors in Embodiment 3 of the present invention;
FIG. 6 is a flowchart of the method for the robot to estimate grid CV values in Embodiment 4 of the present invention;
FIG. 7 is a schematic diagram of the principle of ultrasonic detection in Embodiment 4 of the present invention;
FIG. 8 is a flowchart of the method for manually revising the position information of a door in Embodiment 5 of the present invention;
FIG. 9 is a structural diagram of the system for a mobile robot to self-build a map indoors in Embodiments 1 to 4 of the present invention.
Preferred embodiments of the present invention are described below with reference to the accompanying drawings. It should be understood that the preferred embodiments described here are only intended to illustrate and explain the invention and are not intended to limit it.
Embodiment 1: A method for a mobile robot to self-build a map indoors.
FIG. 1 is a flowchart of the method for a mobile robot to self-build a map indoors in Embodiment 1 of the present invention. As shown in FIG. 1, the process includes the following steps:
Step 101: Form an initialized map.
The map is a mesh pattern composed of a series of square grids;
the map marks the locations of the indoor environment in grid form;
the initial map size is 400×400 grids, and each grid represents an indoor area of actual size 200mm×200mm;
the map stores the grid information in a two-dimensional array; the data type of the two-dimensional array is a custom data structure AStarNode, defined as follows:
typedef struct AStarNode
{
int s_x;
int s_y;
int s_g;
int s_h;
int s_cv;
struct AStarNode*s_parent;
int s_is_in_closetable;
int s_is_in_opentable;
}AStarNode;
AStarNode map_maze[400][400];
where map_maze is a two-dimensional array of the AStarNode data structure;
s_x is the position information of the abscissa (X coordinate);
s_y is the position information of the ordinate (Y coordinate);
s_cv represents the grid CV value, i.e. the probability that the grid is occupied by an obstacle; the range is 0–1, where 0 indicates no obstacle and 1 indicates occupied by an obstacle, and a larger value indicates a higher probability that the grid is occupied by an obstacle;
in the initialized map, every grid has a CV value of 1, indicating that the whole map is initially treated as obstacles.
Step 102: Mark a coordinate origin in the initialized map.
The charging pile of the robot is used as the coordinate origin of the map;
the origin lies on the edge of the map, at the center of that edge;
the direction parallel to the map edge is the X axis, and the direction perpendicular to the map edge is the Y axis.
Step 103: The robot traverses the indoor feasible area and records path information and environment information.
The robot begins recording data when it sets off;
the robot marks the position information of the feasible area in the map according to the odometer measurements.
Step 104: The robot calculates and marks the CV value of each grid in the map.
The robot marks CV values in the map according to the ultrasonic sensor measurements.
Step 105: Establish a map according to the path information and the CV values.
While walking indoors, the robot builds the map from the path information and CV values;
the positions of doors are marked in the map and then manually revised, which further includes:
the robot uses the side ultrasonic sensor to measure its distance from the wall; when the door is open, two jumps in the ultrasonic sensor reading (the first from small to large, the second from large to small) indicate that there is a door at that position;
the robot marks a door identifier at the midpoint between the two jumps;
after the map has been created, erroneous door markers that are not doors are manually cleared through the software, and the locations of doors are shown in a different color on the software interface.
Embodiment 2: A method of building an initialized grid map.
FIG. 2 is a flowchart of the method for building an initialized grid map in Embodiment 2 of the present invention. As shown in FIG. 2, the method includes the following steps:
Step 201: Form an initialized mesh grid map.
The initialized grid map is a mesh pattern composed of a series of square grids;
the initialized grid map marks the locations of the indoor environment in grid form;
the initial map size is 400×400 grids, and each grid represents an indoor area of actual size 200mm×200mm;
the initial map stores the information of each grid in a two-dimensional array; the data type of the two-dimensional array is a custom data structure AStarNode, defined as follows:
typedef struct AStarNode
{
int s_x;
int s_y;
int s_g;
int s_h;
int s_cv;
struct AStarNode*s_parent;
int s_is_in_closetable;
int s_is_in_opentable;
}AStarNode;
AStarNode map_maze[400][400];
where map_maze is a two-dimensional array of the AStarNode data structure;
s_x is the position information of the abscissa (X coordinate);
s_y is the position information of the ordinate (Y coordinate);
s_cv represents the grid CV value, i.e. the probability that the grid is occupied by an obstacle; the range is 0–1, where 0 indicates no obstacle and 1 indicates occupied by an obstacle, and a larger value indicates a higher probability that the grid is occupied by an obstacle.
Step 202: Assign CV values to the grid map.
In the initialized map, the s_cv value (grid CV value) of every grid is 1, indicating that the whole map is initially treated as obstacles.
Step 203: Mark a coordinate origin in the initialized map.
The charging pile of the robot is used as the coordinate origin of the map;
the east-west direction is the X axis and the north-south direction is the Y axis.
Embodiment 3: A method for a mobile robot to record its path indoors.
FIG. 4 is a flowchart of the method for the mobile robot to record its path indoors in Embodiment 3 of the present invention. As shown in FIG. 4, the process includes the following steps:
Step 301: The odometer records the distance moved by the robot.
Step 302: Calculate the change in the robot's heading.
Step 303: Calculate the robot's movement distances in the X and Y axis directions.
Step 304: Calculate the coordinate values of the robot's position.
The X coordinate value is X(k)/200 and the Y coordinate value is Y(k)/200.
Embodiment 4: A method for the robot to estimate grid CV values.
FIG. 6 is a flowchart of the method for the robot to estimate grid CV values in Embodiment 4 of the present invention. As shown in FIG. 6, the process includes the following steps:
Step 401: Use two ultrasonic sensors to collect detection data.
The two ultrasonic sensors are located directly in front of the robot, side by side.
Step 402: Calculate the CV value.
The robot performs fusion of ultrasonic sensor measurements using two methods:
Method 1: data fusion of different sensors at the same moment.
The CV value of each grid is obtained by Bayesian probability estimation; the Bayesian probability algorithm is:
the detection data of two ultrasonic sensors are fused to obtain the CV value;
the detection range of a single ultrasonic sensor is divided into three classes: Class I is unoccupied, Class II is possibly occupied, and Class III is uncertain;
the probability that a grid is occupied is defined as P(O), and the probability that it is not occupied as P(E) = 1 - P(O); then
where,
R is the ultrasonic detection range and r is the actually detected distance;
the CV value of each grid obtained after fusing the detection data of the two ultrasonic sensors at the same moment is
Method 2: fusion of ultrasonic measurements of the same grid at different moments.
The CV value of the same grid at different moments is
Step 403: Label the CV values in the map.
A grid with a CV value of 0 to 0.2 is a fully walkable area, shown in white;
a grid with a CV value of 0.2 to 0.8 is an area where obstacles may exist, shown in gray;
a grid with a CV value of 0.8 to 1 is a completely non-walkable area, shown in black.
Embodiment 5: A method of manually revising the position information of a door.
FIG. 8 is a flowchart of the method for manually marking the position information of a door in Embodiment 5 of the present invention. As shown in FIG. 8, the process includes the following steps:
Step 501: The robot records two jumps in the ultrasonic sensor reading.
While walking indoors, the robot uses the side ultrasonic sensor to detect its distance from the wall; when the door is open, two jumps in the ultrasonic sensor reading (the first from small to large, the second from large to small) indicate that there is a door at this position.
Step 502: Calculate the midpoint between the two jumps.
The midpoint between the two jumps is the position of the door.
Step 503: Manually clear identifiers that are not doors.
After the map has been created, erroneous door markers that are not doors are manually cleared through the software, and the locations of doors are shown in a different color on the software interface.
FIG. 9 is a structural diagram of the system for a mobile robot to self-build a map indoors in Embodiments 1 to 4 of the present invention. The system includes an odometer 601, an ultrasonic sensor 602, and a processing unit 603, wherein
the odometer is used to calculate the distance and angle the robot walks indoors;
the ultrasonic sensors are used to detect obstacle information at different distances around the robot, with one ultrasonic sensor on each side of the robot and no fewer than one ultrasonic sensor on the front;
the processing unit is used to calculate the path information of the robot's walking and the grid information of the map, and to store the map grid data.
The technical solution of the present invention uses a grid map to record information about the indoor environment, which suits the limited indoor range of activity of a mobile robot; it obtains path information and obstacle information from the detectors, realizes real-time adjustment and correspondence between the indoor environment and the grid information, and guarantees the validity of the map model. In addition, the grid map is easy to maintain, making map data convenient and fast to access. The scheme also has wide technical adaptability: it can respond quickly to diverse environments and quickly create a dynamic map model that reflects real environment information, effectively enabling robot positioning and navigation.
Those skilled in the art will appreciate that embodiments of the present invention may be provided as a method, a system, or a computer program product. Accordingly, the invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the invention may take the form of a computer program product embodied on one or more computer-usable storage media (including but not limited to disk storage and optical storage) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of methods, devices (systems), and computer program products according to embodiments of the invention. It should be understood that each flow and/or block of the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or another programmable data processing device to operate in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture comprising an instruction device that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or another programmable data processing device, such that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing, whereby the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Obviously, those skilled in the art can make various changes and modifications to the present invention without departing from its spirit and scope. If such modifications and variations fall within the scope of the claims of the invention and their equivalents, the invention is intended to include them as well.
Claims (10)
- A method for a mobile robot to self-build a map indoors, characterized by comprising the following steps: forming an initialized map; marking a coordinate origin in the initialized map; the robot traversing the indoor feasible area and recording path information and environment information; the robot calculating and marking the CV value of each grid in the map; and establishing a map according to the path information and the CV values.
- The method according to claim 1, characterized in that forming the initialized map further includes: the map is a mesh pattern composed of a series of square grids; the map marks the locations of the indoor environment in grid form, each grid representing an indoor area of actual size 200mm×200mm; and the user sets the initial map size according to the robot's indoor range of motion.
- The method according to claim 1 or 2, characterized in that the map stores the grid information in a two-dimensional array; the data type of the two-dimensional array is a custom data structure AStarNode, defined as follows: typedef struct AStarNode{int s_x;int s_y;int s_g;int s_h;int s_cv;struct AStarNode *s_parent;int s_is_in_closetable;int s_is_in_opentable;}AStarNode; AStarNode map_maze[400][400]; where map_maze is a two-dimensional array of the AStarNode data structure; s_x is the position information of the abscissa (X coordinate); s_y is the position information of the ordinate (Y coordinate); s_cv represents the grid CV value, i.e. the probability that the grid is occupied by an obstacle, with a range of 0–1, where 0 indicates no obstacle and 1 indicates occupied by an obstacle, and a larger value indicates a higher probability that the grid is occupied by an obstacle; in the initialized map, every grid has a CV value of 1, indicating that the whole map is initially treated as obstacles.
- The method according to claim 1, characterized in that marking the coordinate origin in the initialized map further includes: using the charging pile of the robot as the coordinate origin of the map; the east-west direction is the X axis and the north-south direction is the Y axis.
- The method according to claim 1, characterized in that the robot traversing the indoor feasible area and recording path information and environment information further includes: the robot records path information and CV values while moving indoors; the robot marks the position information of the feasible area in the map according to the odometer measurements; and the robot marks CV values in the map according to values obtained by ultrasonic sensor fusion.
- The method according to claim 1 or 5, characterized in that the robot performs fusion of ultrasonic sensor measurements using two methods. Method 1: data fusion of different sensors at the same moment, in which the CV value of each grid is estimated according to a Bayesian probability algorithm: the detection data of two ultrasonic sensors are fused to obtain the CV value; the detection range of each ultrasonic sensor is divided into three classes, where Class I is unoccupied, Class II is possibly occupied, and Class III is uncertain; the probability that a grid is occupied is defined as P(O) and the probability that it is not occupied as P(E) = 1 - P(O), where R is the ultrasonic detection range and r is the actually detected distance; the CV value of each grid is obtained by fusing the detection data of the two ultrasonic sensors at the same moment. Method 2: fusion of ultrasonic measurements of the same grid at different moments, giving the CV value of the same grid at different moments.
- The method according to claim 1 or 7, characterized by further comprising: distinguishing the CV value of each grid by color on the map; a grid with a CV value of 0 to 0.2 is a fully walkable area, shown in white; a grid with a CV value of 0.2 to 0.8 is an area where obstacles may exist, shown in gray; a grid with a CV value of 0.8 to 1 is a completely non-walkable area, shown in black.
- The method according to claim 1 or 7, characterized in that after the robot has traversed the room, a self-built map is formed and the position of the door is manually revised in the map, further comprising: while walking indoors, the robot uses the side ultrasonic sensor to measure its distance from the wall, and when the door is open, two jumps in the ultrasonic sensor reading (the first from small to large, the second from large to small) indicate that there is a door at that position; the robot marks a door identifier at the midpoint between the two jumps; after the map has been created, door markers at positions that are not doors are manually cleared through the software, and the locations of doors are shown in a different color on the software interface.
- A system for a mobile robot to self-build a map indoors, characterized by comprising an odometer, ultrasonic sensors, and a processing unit, wherein the odometer is used to calculate the distance and angle the robot walks indoors; the ultrasonic sensors are used to detect obstacle information at different distances around the robot, with one ultrasonic sensor on each side of the robot and no fewer than one ultrasonic sensor on the front; and the processing unit is used to calculate the path information of the robot's walking and the grid information of the map, and to store the map grid data.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/572,312 US20180172451A1 (en) | 2015-08-14 | 2016-07-22 | Method and system for mobile robot to self-establish map indoors |
EP16836517.9A EP3336489A4 (en) | 2015-08-14 | 2016-07-22 | METHOD AND SYSTEM FOR AUTOMATICALLY ESTABLISHING INTERNAL MAPS OF CARDS BY MOBILE ROBOT |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510502547.6 | 2015-08-14 | ||
CN201510502547.6A CN105043396B (zh) | 2015-08-14 | 2015-08-14 | 一种移动机器人室内自建地图的方法和系统 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017028653A1 true WO2017028653A1 (zh) | 2017-02-23 |
Family
ID=54450130
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2016/091033 WO2017028653A1 (zh) | 2015-08-14 | 2016-07-22 | 一种移动机器人室内自建地图的方法和系统 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20180172451A1 (zh) |
EP (1) | EP3336489A4 (zh) |
CN (1) | CN105043396B (zh) |
WO (1) | WO2017028653A1 (zh) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107024934A (zh) * | 2017-04-21 | 2017-08-08 | 山东大学 | 一种基于云平台的医院服务机器人及方法 |
CN109916408A (zh) * | 2019-02-28 | 2019-06-21 | 深圳市鑫益嘉科技股份有限公司 | 机器人室内定位和导航方法、装置、设备及存储介质 |
CN110836668A (zh) * | 2018-08-16 | 2020-02-25 | 科沃斯商用机器人有限公司 | 定位导航方法、装置、机器人及存储介质 |
CN111307168A (zh) * | 2020-03-19 | 2020-06-19 | 苏州艾吉威机器人有限公司 | Agv建图方法和定位方法及系统 |
CN112927322A (zh) * | 2021-01-20 | 2021-06-08 | 上海高仙自动化科技发展有限公司 | 一种定位初始化方法、装置和机器人 |
CN113203419A (zh) * | 2021-04-25 | 2021-08-03 | 重庆大学 | 基于神经网络的室内巡检机器人校正定位方法 |
CN113434788A (zh) * | 2021-07-07 | 2021-09-24 | 北京经纬恒润科技股份有限公司 | 建图方法、装置、电子设备及车辆 |
CN117260744A (zh) * | 2023-11-21 | 2023-12-22 | 张家港保税区长江国际港务有限公司 | 一种基于人工智能的机械手路线规划方法 |
Families Citing this family (70)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105043396B (zh) * | 2015-08-14 | 2018-02-02 | 北京进化者机器人科技有限公司 | 一种移动机器人室内自建地图的方法和系统 |
CN105425803B (zh) * | 2015-12-16 | 2020-05-19 | 纳恩博(北京)科技有限公司 | 自主避障方法、装置和系统 |
CN105652874B (zh) * | 2016-03-21 | 2019-04-12 | 北京联合大学 | 一种基于广义波前算法的移动机器人实时避障方法 |
CN107305376A (zh) * | 2016-04-19 | 2017-10-31 | 上海慧流云计算科技有限公司 | 一种室内地图自动绘制机器人及绘制方法 |
CN107305377A (zh) * | 2016-04-19 | 2017-10-31 | 上海慧流云计算科技有限公司 | 一种室内地图自动绘制机器人及绘制方法 |
CN107401803A (zh) * | 2016-05-19 | 2017-11-28 | 科沃斯机器人股份有限公司 | 一种组合机器人的控制方法 |
CN105974928B (zh) * | 2016-07-29 | 2018-12-07 | 哈尔滨工大服务机器人有限公司 | 一种机器人导航路径规划方法 |
CN106484959B (zh) * | 2016-09-19 | 2019-12-31 | 上海斐讯数据通信技术有限公司 | 一种户型图绘制方法及绘制设备 |
CN108225343A (zh) * | 2016-12-22 | 2018-06-29 | 沈阳美行科技有限公司 | 一种地图信息系统、生成和使用方法及其应用 |
CN108225344A (zh) * | 2016-12-22 | 2018-06-29 | 沈阳美行科技有限公司 | 一种地图系统、生成和使用方法及其应用 |
CN108225342A (zh) * | 2016-12-22 | 2018-06-29 | 沈阳美行科技有限公司 | 一种地图数据系统、生成和使用方法及其应用 |
CN106919174A (zh) * | 2017-04-10 | 2017-07-04 | 江苏东方金钰智能机器人有限公司 | 一种智能引导机器人的引导方法 |
JP6828579B2 (ja) * | 2017-04-27 | 2021-02-10 | トヨタ自動車株式会社 | 環境整備ロボットおよびその制御プログラム |
CN108957463B (zh) * | 2017-06-30 | 2021-01-22 | 北京猎户星空科技有限公司 | 超声波的测量方法和装置 |
WO2019019147A1 (en) * | 2017-07-28 | 2019-01-31 | Qualcomm Incorporated | SELF-EXPLORATION CONTROL OF A ROBOTIC VEHICLE |
CN107702715B (zh) * | 2017-08-23 | 2019-09-20 | 昆山联骥机器人有限公司 | 一种室内服务机器人自主导航用数字地图建立方法 |
CN107928565A (zh) * | 2017-11-17 | 2018-04-20 | 北京奇虎科技有限公司 | 清洁机器人的清洁方法、装置及机器人 |
CN107966702B (zh) * | 2017-11-21 | 2019-12-13 | 北京进化者机器人科技有限公司 | 环境地图的构建方法及装置 |
CN109959935B (zh) * | 2017-12-14 | 2020-10-23 | 北京欣奕华科技有限公司 | 一种地图建立方法、地图建立装置及机器人 |
CN109974719A (zh) * | 2017-12-28 | 2019-07-05 | 周秦娜 | 一种基于云计算的移动机器人环境感知的控制方法及装置 |
CN110069058A (zh) * | 2018-01-24 | 2019-07-30 | 南京机器人研究院有限公司 | 一种机器人室内导航控制方法 |
CN108663041B (zh) * | 2018-02-09 | 2020-04-24 | 意诺科技有限公司 | 一种绘制导航地图的方法及装置 |
CN108803602B (zh) * | 2018-06-01 | 2021-07-13 | 浙江亚特电器有限公司 | 障碍物自学习方法及新障碍物自学习方法 |
CN109002043B (zh) * | 2018-08-24 | 2021-06-15 | 湖南超能机器人技术有限公司 | 应用于机器人的红外对准信号数据处理方法 |
CN109298386B (zh) * | 2018-10-17 | 2020-10-23 | 中国航天系统科学与工程研究院 | 一种基于多智能体协同的三维未知区域快速探测方法 |
CN109376212B (zh) * | 2018-11-22 | 2020-12-15 | 上海木木聚枞机器人科技有限公司 | 一种计算行人出现概率的地图的生成方法及系统 |
CN109556598B (zh) * | 2018-11-23 | 2021-01-19 | 西安交通大学 | 一种基于超声波传感器阵列的自主建图与导航定位方法 |
CN111380532B (zh) * | 2018-12-29 | 2022-06-28 | 深圳市优必选科技有限公司 | 路径规划方法、装置、终端及计算机存储介质 |
US10694053B1 (en) | 2019-01-22 | 2020-06-23 | Xerox Corporation | Wireless location tracking tag for monitoring real time location-tracking apparatus for an electronic device |
CN111578938B (zh) * | 2019-02-19 | 2022-08-02 | 珠海格力电器股份有限公司 | 目标物的定位方法及装置 |
CN109934918B (zh) * | 2019-03-08 | 2023-03-28 | 北京精密机电控制设备研究所 | 一种基于视触觉融合机制的多机器人协同地图重建方法 |
CN111656137A (zh) * | 2019-04-18 | 2020-09-11 | 深圳市大疆创新科技有限公司 | 可移动平台的导航方法、设备、计算机可读存储介质 |
CN112180910B (zh) * | 2019-06-18 | 2024-07-19 | 北京京东乾石科技有限公司 | 一种移动机器人障碍物感知方法和装置 |
CN112212863B (zh) * | 2019-07-09 | 2024-08-09 | 苏州科瓴精密机械科技有限公司 | 栅格地图的创建方法及创建系统 |
WO2021003958A1 (zh) * | 2019-07-09 | 2021-01-14 | 苏州科瓴精密机械科技有限公司 | 栅格地图的创建方法及创建系统 |
CN110487279B (zh) * | 2019-08-27 | 2022-12-13 | 东南大学 | 一种基于改进a*算法的路径规划方法 |
CN110824489A (zh) * | 2019-11-06 | 2020-02-21 | 博信矿山科技(徐州)股份有限公司 | 一种提高室内机器人位置精度的定位方法 |
CN110928972A (zh) * | 2019-11-22 | 2020-03-27 | 珠海格力电器股份有限公司 | 一种语义地图构建方法、系统、装置、存储介质及机器人 |
CN111080786B (zh) * | 2019-12-19 | 2024-05-28 | 盈嘉互联(北京)科技有限公司 | 基于bim的室内地图模型构建方法及装置 |
CN111061273B (zh) * | 2019-12-26 | 2023-06-06 | 航天时代(青岛)海洋装备科技发展有限公司 | 一种无人艇用自主避障融合方法和系统 |
US11244470B2 (en) | 2020-03-05 | 2022-02-08 | Xerox Corporation | Methods and systems for sensing obstacles in an indoor environment |
US11026048B1 (en) * | 2020-03-05 | 2021-06-01 | Xerox Corporation | Indoor positioning system for a mobile electronic device |
CN111272183A (zh) * | 2020-03-16 | 2020-06-12 | 达闼科技成都有限公司 | 一种地图创建方法、装置、电子设备及存储介质 |
CN113449054B (zh) * | 2020-03-27 | 2023-08-04 | 杭州海康机器人股份有限公司 | 一种地图切换的方法和移动机器人 |
CN113465614B (zh) * | 2020-03-31 | 2023-04-18 | 北京三快在线科技有限公司 | 无人机及其导航地图的生成方法和装置 |
CN111486847B (zh) * | 2020-04-29 | 2021-10-08 | 华中科技大学 | 一种无人机导航方法及系统 |
CN111753649B (zh) * | 2020-05-13 | 2024-05-14 | 上海欧菲智能车联科技有限公司 | 车位检测方法、装置、计算机设备和存储介质 |
CN111631639B (zh) * | 2020-05-26 | 2021-07-06 | 珠海市一微半导体有限公司 | 全局栅格地图的地图遍历块建立方法、芯片及移动机器人 |
CN112053415B (zh) * | 2020-07-17 | 2023-08-01 | 科沃斯机器人股份有限公司 | 一种地图构建方法和自行走设备 |
CN111881245B (zh) * | 2020-08-04 | 2023-08-08 | 深圳安途智行科技有限公司 | 能见度动态地图的产生方法、装置、设备及存储介质 |
US11356800B2 (en) | 2020-08-27 | 2022-06-07 | Xerox Corporation | Method of estimating indoor location of a device |
CN114527736B (zh) * | 2020-10-30 | 2023-10-13 | 速感科技(北京)有限公司 | 困境规避方法、自主移动设备和存储介质 |
CN112731321B (zh) * | 2020-11-27 | 2024-06-04 | 北京理工大学 | 基于mimo认知雷达的移动机器人避障及地图绘制方法 |
CN114720978A (zh) * | 2021-01-06 | 2022-07-08 | 扬智科技股份有限公司 | 用于同时定位和地图构建的方法和移动平台 |
CN112782706B (zh) * | 2021-01-11 | 2022-05-10 | 山东新一代信息产业技术研究院有限公司 | 机器人超声波传感器障碍物检测方法及系统 |
CN113311827B (zh) * | 2021-05-08 | 2022-07-12 | 东南大学 | 一种提高存储效率的机器人室内地图及其生成方法 |
CN113532418A (zh) * | 2021-06-11 | 2021-10-22 | 上海追势科技有限公司 | 一种停车场地图单车采集方法 |
CN113607154B (zh) * | 2021-06-29 | 2024-05-24 | 广州大学 | 一种室内机器人二维自主定位方法、系统、设备及介质 |
CN113670296B (zh) * | 2021-08-18 | 2023-11-24 | 北京经纬恒润科技股份有限公司 | 基于超声波的环境地图生成方法及装置 |
CN115731360B (zh) * | 2021-08-31 | 2024-10-15 | 中科南京软件技术研究院 | 面向人机交互的栅格地图后处理表示方法 |
CN113848961B (zh) * | 2021-10-13 | 2023-10-24 | 中国人民解放军国防科技大学 | 基于声纳探测概率的水下航行器安全隐蔽路径规划方法及系统 |
CN113984057A (zh) * | 2021-10-19 | 2022-01-28 | 山东中瑞电气有限公司 | 基于多数据分析的移动机器人定位方法 |
CN114166227B (zh) * | 2021-12-06 | 2024-07-02 | 神思电子技术股份有限公司 | 一种室内导航地图的绘制方法及设备 |
CN114442627B (zh) * | 2022-01-24 | 2023-10-13 | 电子科技大学 | 一种面向智能家居移动设备的动态桌面寻路系统及方法 |
CN114859891B (zh) * | 2022-04-02 | 2024-06-14 | 中国人民解放军国防科技大学 | 多机器人持续监控方法和非临时性计算机可读存储介质 |
CN115049688B (zh) * | 2022-08-16 | 2022-11-18 | 之江实验室 | 基于强化学习思想的栅格地图区域划分方法及装置 |
CN115096293B (zh) * | 2022-08-24 | 2022-11-04 | 北京极光愿景科技有限公司 | 多机器人协作的探测地图构建方法、装置及扫雷机器人 |
CN115268470B (zh) * | 2022-09-27 | 2023-08-18 | 深圳市云鼠科技开发有限公司 | 清洁机器人的障碍物位置标记方法、装置以及介质 |
CN115407344B (zh) * | 2022-11-01 | 2023-01-17 | 小米汽车科技有限公司 | 栅格地图创建方法、装置、车辆及可读存储介质 |
CN118533182B (zh) * | 2024-07-25 | 2024-09-17 | 山东鸿泽自动化技术有限公司 | 一种搬运机器人视觉智能导航方法及系统 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101033971A (zh) * | 2007-02-09 | 2007-09-12 | 中国科学院合肥物质科学研究院 | 一种移动机器人地图创建系统及地图创建方法 |
CN101619985A (zh) * | 2009-08-06 | 2010-01-06 | 上海交通大学 | 基于可变形拓扑地图的服务机器人自主导航方法 |
US20140005933A1 (en) * | 2011-09-30 | 2014-01-02 | Evolution Robotics, Inc. | Adaptive Mapping with Spatial Summaries of Sensor Data |
CN204374771U (zh) * | 2015-01-14 | 2015-06-03 | 上海物景智能科技有限公司 | 实现清扫机器人地图边界建模的装置以及清扫机器人 |
CN104731101A (zh) * | 2015-04-10 | 2015-06-24 | 河海大学常州校区 | 清洁机器人室内场景地图建模方法及机器人 |
CN104808671A (zh) * | 2015-05-19 | 2015-07-29 | 东南大学 | 一种家居环境下的机器人路径规划方法 |
CN105043396A (zh) * | 2015-08-14 | 2015-11-11 | 北京进化者机器人科技有限公司 | 一种移动机器人室内自建地图的方法和系统 |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5150452A (en) * | 1989-07-28 | 1992-09-22 | Megamation Incorporated | Method and apparatus for anti-collision and collision protection for multiple robot system |
ES2377638T3 (es) * | 2001-01-24 | 2012-03-29 | Telenav, Inc. | Sistema de navegación en tiempo real para entorno móvil |
US7489812B2 (en) * | 2002-06-07 | 2009-02-10 | Dynamic Digital Depth Research Pty Ltd. | Conversion and encoding techniques |
US7584020B2 (en) * | 2006-07-05 | 2009-09-01 | Battelle Energy Alliance, Llc | Occupancy change detection system and method |
KR100883520B1 (ko) * | 2007-07-23 | 2009-02-13 | 한국전자통신연구원 | 실내 환경지도 작성 시스템 및 방법 |
KR101409987B1 (ko) * | 2007-12-11 | 2014-06-23 | 삼성전자주식회사 | 이동 로봇의 자세 보정 방법 및 장치 |
CN102138769B (zh) * | 2010-01-28 | 2014-12-24 | 深圳先进技术研究院 | 清洁机器人及其清扫方法 |
-
2015
- 2015-08-14 CN CN201510502547.6A patent/CN105043396B/zh active Active
-
2016
- 2016-07-22 WO PCT/CN2016/091033 patent/WO2017028653A1/zh active Application Filing
- 2016-07-22 US US15/572,312 patent/US20180172451A1/en not_active Abandoned
- 2016-07-22 EP EP16836517.9A patent/EP3336489A4/en not_active Withdrawn
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101033971A (zh) * | 2007-02-09 | 2007-09-12 | 中国科学院合肥物质科学研究院 | 一种移动机器人地图创建系统及地图创建方法 |
CN101619985A (zh) * | 2009-08-06 | 2010-01-06 | 上海交通大学 | 基于可变形拓扑地图的服务机器人自主导航方法 |
US20140005933A1 (en) * | 2011-09-30 | 2014-01-02 | Evolution Robotics, Inc. | Adaptive Mapping with Spatial Summaries of Sensor Data |
CN204374771U (zh) * | 2015-01-14 | 2015-06-03 | 上海物景智能科技有限公司 | 实现清扫机器人地图边界建模的装置以及清扫机器人 |
CN104731101A (zh) * | 2015-04-10 | 2015-06-24 | 河海大学常州校区 | 清洁机器人室内场景地图建模方法及机器人 |
CN104808671A (zh) * | 2015-05-19 | 2015-07-29 | 东南大学 | 一种家居环境下的机器人路径规划方法 |
CN105043396A (zh) * | 2015-08-14 | 2015-11-11 | 北京进化者机器人科技有限公司 | 一种移动机器人室内自建地图的方法和系统 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3336489A4 * |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107024934A (zh) * | 2017-04-21 | 2017-08-08 | 山东大学 | 一种基于云平台的医院服务机器人及方法 |
CN107024934B (zh) * | 2017-04-21 | 2023-06-02 | 山东大学 | 一种基于云平台的医院服务机器人及方法 |
CN110836668A (zh) * | 2018-08-16 | 2020-02-25 | 科沃斯商用机器人有限公司 | 定位导航方法、装置、机器人及存储介质 |
CN109916408A (zh) * | 2019-02-28 | 2019-06-21 | 深圳市鑫益嘉科技股份有限公司 | 机器人室内定位和导航方法、装置、设备及存储介质 |
CN111307168A (zh) * | 2020-03-19 | 2020-06-19 | 苏州艾吉威机器人有限公司 | Agv建图方法和定位方法及系统 |
CN112927322B (zh) * | 2021-01-20 | 2024-01-23 | 上海高仙自动化科技发展有限公司 | 一种定位初始化方法、装置和机器人 |
CN112927322A (zh) * | 2021-01-20 | 2021-06-08 | 上海高仙自动化科技发展有限公司 | 一种定位初始化方法、装置和机器人 |
CN113203419A (zh) * | 2021-04-25 | 2021-08-03 | 重庆大学 | 基于神经网络的室内巡检机器人校正定位方法 |
CN113203419B (zh) * | 2021-04-25 | 2023-11-10 | 重庆大学 | 基于神经网络的室内巡检机器人校正定位方法 |
CN113434788A (zh) * | 2021-07-07 | 2021-09-24 | 北京经纬恒润科技股份有限公司 | 建图方法、装置、电子设备及车辆 |
CN113434788B (zh) * | 2021-07-07 | 2024-05-07 | 北京经纬恒润科技股份有限公司 | 建图方法、装置、电子设备及车辆 |
CN117260744A (zh) * | 2023-11-21 | 2023-12-22 | 张家港保税区长江国际港务有限公司 | 一种基于人工智能的机械手路线规划方法 |
CN117260744B (zh) * | 2023-11-21 | 2024-02-02 | 张家港保税区长江国际港务有限公司 | 一种基于人工智能的机械手路线规划方法 |
Also Published As
Publication number | Publication date |
---|---|
EP3336489A4 (en) | 2019-04-10 |
CN105043396B (zh) | 2018-02-02 |
EP3336489A1 (en) | 2018-06-20 |
US20180172451A1 (en) | 2018-06-21 |
CN105043396A (zh) | 2015-11-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017028653A1 (zh) | 一种移动机器人室内自建地图的方法和系统 | |
Nieto et al. | Recursive scan-matching SLAM | |
Triebel et al. | Multi-level surface maps for outdoor terrain mapping and loop closing | |
WO2017041730A1 (zh) | 一种移动机器人避障导航的方法和系统 | |
Kümmerle et al. | Large scale graph-based SLAM using aerial images as prior information | |
Hähnel et al. | Mobile robot mapping in populated environments | |
CN107544501A (zh) | 一种智能机器人智慧行走控制系统及其方法 | |
CN111459166A (zh) | 一种灾后救援环境下含受困人员位置信息的情景地图构建方法 | |
Xiao et al. | 3D point cloud registration based on planar surfaces | |
CN112904358B (zh) | 基于几何信息的激光定位方法 | |
CN114442621A (zh) | 一种基于四足机器人的自主探索和建图系统 | |
Wulf et al. | Ground truth evaluation of large urban 6D SLAM | |
Skrzypczyński | Mobile robot localization: Where we are and what are the challenges? | |
CN112652001A (zh) | 基于扩展卡尔曼滤波的水下机器人多传感器融合定位系统 | |
TW202238449A (zh) | 室內定位系統及室內定位方法 | |
EP3088983B1 (en) | Moving object controller and program | |
Nandkumar et al. | Simulation of Indoor Localization and Navigation of Turtlebot 3 using Real Time Object Detection | |
Gartshore et al. | Incremental map building using an occupancy grid for an autonomous monocular robot | |
Peng et al. | Autonomous UAV-Based Structural Damage Exploration Platform for Post-Disaster Reconnaissance | |
Nuchter et al. | Extracting drivable surfaces in outdoor 6d slam | |
Jiménez Serrata et al. | An intelligible implementation of FastSLAM2. 0 on a low-power embedded architecture | |
Wang et al. | Towards an obstacle detection system for robot obstacle negotiation | |
Baligh Jahromi et al. | Layout slam with model based loop closure for 3d indoor corridor reconstruction | |
Wang et al. | Agv navigation based on apriltags2 auxiliary positioning | |
Uno et al. | Deep Inertial Underwater Odometry System. |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16836517 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15572312 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2016836517 Country of ref document: EP |