CN106681330A - Robot navigation method and device based on multi-sensor data fusion - Google Patents

Robot navigation method and device based on multi-sensor data fusion

Info

Publication number
CN106681330A
CN106681330A (application CN201710061225.1A)
Authority
CN
China
Prior art keywords
data
robot
sensor
map
laser radar
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710061225.1A
Other languages
Chinese (zh)
Inventor
李建欣
王皓悦
张扬扬
张日崇
怀进鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CN201710061225.1A priority Critical patent/CN106681330A/en
Publication of CN106681330A publication Critical patent/CN106681330A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0242 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using non-visible light signals, e.g. IR or UV signals
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 - Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0255 - Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultrasonic signals
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0257 - Control of position or course in two dimensions specially adapted to land vehicles using a radar

Abstract

The invention provides a robot navigation method and device based on multi-sensor data fusion. The method comprises the following steps: building a map of the overall environment from data collected by a laser radar sensor and encoder data; acquiring in real time the current position of the robot in the overall environment map from data collected by the laser radar sensor, an accelerometer sensor, a gyroscope sensor and a magnetometer sensor, the overall environment map, and the encoder data; acquiring in real time a planned route of the robot from the current position to a target position from the overall environment map and the robot's current position; and controlling the robot to avoid obstacles during movement using data collected by the laser radar sensor, a depth camera, an ultrasonic sensor and an infrared sensor, and the encoder data. On the basis of using the sensors rationally to achieve robot navigation, the method applies flexibly to a variety of scenes, takes cost into account, and achieves good autonomous navigation performance.

Description

Robot navigation method and device based on multi-sensor data fusion
Technical field
Embodiments of the present invention relate to the field of artificial intelligence, and in particular to a robot navigation method and device based on multi-sensor data fusion.
Background art
Autonomous robot navigation is a key technology in the field of intelligent robotics. Through autonomous navigation, a robot can move intelligently within its environment and thereby complete tasks such as guiding, carrying, and interaction. Autonomous navigation is therefore the foundation on which robots become intelligent; a robot incapable of autonomous navigation cannot be called an intelligent robot.
Existing autonomous navigation technologies fall roughly into two categories: active and passive. With active autonomous navigation, the robot relies on external equipment, for example base stations deployed in the environment or GPS devices, and is therefore inflexible. Moreover, owing to the limitations of GPS itself, it cannot be used for positioning indoors, and GPS accuracy often fails to meet the requirements of autonomous robot locomotion. With passive autonomous navigation, the robot does not rely on external equipment and achieves navigation using only its on-board sensors. Such technologies are flexible, applicable both indoors and outdoors, and require no specialized personnel or equipment deployment.
However, existing passive autonomous navigation technologies usually employ a single sensor and thus cannot provide sufficiently comprehensive data; even when multiple sensors are used, the data they collect are typically fused in an unprincipled way, resulting in poor navigation performance.
Summary of the invention
Embodiments of the present invention provide a robot navigation method and device based on multi-sensor data fusion, to solve the technical problem that existing autonomous robot navigation performs poorly.
An embodiment of the present invention provides a robot navigation method based on multi-sensor data fusion, comprising: building an overall environment map from data collected by a laser radar sensor and encoder data, using simultaneous localization and mapping; obtaining, in real time and through a localization algorithm, the current position of the robot in the overall environment map, from data collected by the laser radar sensor, an accelerometer sensor, a gyroscope sensor and a magnetometer sensor, the overall environment map, and the encoder data; obtaining, in real time and through a path planning algorithm, a planned route of the robot from the current position to a target position, from the overall environment map and the robot's current position; and, according to the current position and the planned route, controlling the robot to move while avoiding obstacles through a local path planning algorithm, using data collected by the laser radar sensor, a depth camera, an ultrasonic sensor and an infrared sensor, and the encoder data.
An embodiment of the present invention provides a robot navigation device based on multi-sensor data fusion, comprising: a map building module, for building an overall environment map from data collected by a laser radar sensor and encoder data, using simultaneous localization and mapping; a real-time localization module, for obtaining in real time, through a localization algorithm, the current position of the robot in the overall environment map, from data collected by the laser radar sensor, an accelerometer sensor, a gyroscope sensor and a magnetometer sensor, the overall environment map, and the encoder data; a route planning module, for obtaining in real time, through a path planning algorithm, a planned route of the robot from the current position to a target position, from the overall environment map and the robot's current position; and a control module, for controlling the robot to move while avoiding obstacles through a local path planning algorithm, according to the current position and the planned route, using data collected by the laser radar sensor, a depth camera, an ultrasonic sensor and an infrared sensor, and the encoder data.
With the robot navigation method and device based on multi-sensor data fusion provided by the present invention, an overall environment map is built from the data collected by the laser radar sensor and the encoder data; the robot is localized in real time from the encoder data and the data collected by the laser radar, accelerometer, gyroscope and magnetometer sensors; a route is planned from the robot's current position; and the robot is controlled to move while avoiding obstacles by combining the data collected by the laser radar sensor, depth camera, ultrasonic sensor and infrared sensor, thereby achieving autonomous robot navigation. Based on the characteristics of the different sensors, this scheme fuses different sensor data for the corresponding processing steps; on the basis of using the sensors rationally to achieve robot navigation, it applies flexibly to a variety of scenes, takes cost into account, and achieves good autonomous navigation performance.
Description of the drawings
To explain the embodiments of the present invention or the technical solutions in the prior art more clearly, the accompanying drawings needed in the description of the embodiments or the prior art are briefly introduced below.
Fig. 1 is a schematic flow chart of the robot navigation method based on multi-sensor data fusion provided by embodiment one of the present invention;
Fig. 2 is a schematic flow chart of the robot navigation method based on multi-sensor data fusion provided by embodiment two of the present invention;
Fig. 3A is a schematic flow chart of one robot navigation method based on multi-sensor data fusion provided by embodiment three of the present invention;
Fig. 3B is a schematic flow chart of another robot navigation method based on multi-sensor data fusion provided by embodiment three of the present invention;
Fig. 4 is a schematic flow chart of the robot navigation method based on multi-sensor data fusion provided by embodiment four of the present invention;
Fig. 5 is a schematic structural diagram of the robot navigation device based on multi-sensor data fusion provided by embodiment five of the present invention;
Fig. 6 is a schematic structural diagram of the robot navigation device based on multi-sensor data fusion provided by embodiment six of the present invention;
Fig. 7A is a schematic structural diagram of one robot navigation device based on multi-sensor data fusion provided by embodiment seven of the present invention;
Fig. 7B is a schematic structural diagram of another robot navigation device based on multi-sensor data fusion provided by embodiment seven of the present invention;
Fig. 8 is a schematic structural diagram of the robot navigation device based on multi-sensor data fusion provided by embodiment eight of the present invention.
Detailed description of embodiments
To make the purposes, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below in conjunction with the accompanying drawings. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the scope of protection of the present invention.
For clarity, the specific terms used in the present invention are first defined.
Laser radar: provides the distance from the robot to surrounding obstacles, usually as a two-dimensional slice of the surroundings. It can be used for map building, localization, real-time obstacle avoidance, and so on. High precision, high stability, low noise, high cost. Long visual range, small close-range blind zone.
Depth camera: provides the distance from the robot to surrounding obstacles, usually as a three-dimensional point cloud of the surroundings. It can be used for map building, localization, real-time obstacle avoidance, and so on. Lower precision, low stability, high noise, lower cost. Short visual range, large close-range blind zone.
Ultrasonic sensor: provides the distance from the robot to surrounding obstacles, usually as a one-dimensional single point. It can be used for real-time obstacle avoidance. Low data precision, low stability, high noise, very low cost. Relatively long visual range, large close-range blind zone.
Infrared sensor: provides the distance from the robot to surrounding obstacles, usually as a one-dimensional single point. It can be used for real-time obstacle avoidance. Relatively high data precision, relatively high stability, relatively high noise, very low cost. Short visual range, no close-range blind zone.
Accelerometer: provides the robot's instantaneous linear acceleration. It can be used for localization and real-time obstacle avoidance.
Gyroscope: provides the robot's instantaneous angular velocity. It can be used for localization and real-time obstacle avoidance.
Magnetometer: provides an absolute estimate of the robot's heading. It can be used for localization and real-time obstacle avoidance.
Encoder: provides estimates of the robot's travelled distance, speed, and so on. It is used for localization and real-time obstacle avoidance.
Fig. 1 is a schematic flow chart of the robot navigation method based on multi-sensor data fusion provided by embodiment one of the present invention. As shown in Fig. 1, this embodiment is described as applied in a robot navigation device; the robot navigation device may be integrated in an autonomous robot navigation system. The method includes:
101. Build an overall environment map from the data collected by the laser radar sensor and the encoder data, using simultaneous localization and mapping.
Specifically, simultaneous localization and mapping (SLAM) methods include but are not limited to scan matching, graph optimization, and the like.
Taking an actual scene as an example: when the robot is placed in a new environment, a map of the current environment needs to be drawn using a simultaneous localization and mapping method. Specifically, the robot is controlled to move within the environment while the laser radar sensor continuously collects data, and a SLAM algorithm computes and draws the corresponding overall environment map in real time. The overall environment map is a two-dimensional grid map, used subsequently mainly for global path planning and autonomous localization of the robot. In the SLAM algorithm, the data collected by the laser radar sensor are used for data matching, while the encoder data provide an initial estimate for each iteration, accelerating the map building process.
The overall environment map built above can be reused, so map building usually needs to be performed only once. Only when the environment changes significantly, or the robot is placed in a new environment, does the overall environment map need to be rebuilt.
Since global path planning serves only as a general directional reference for the robot's actual walking, only a rough path needs to be determined. Accordingly, global path planning requires only that the overall environment map contain the main information of the environment. Taking an indoor environment as an example, this may include walls, indoor furniture, outdoor utility poles, and so on, and the two-dimensional environment slice data provided by the laser radar sensor suffice to provide this information. In addition, when the robot performs autonomous localization, real-time sensor data need to be matched against the map; this matching is based mainly on large line features, for example walls and large obstacles, and the laser radar's two-dimensional slice information also suffices to provide them. Although autonomous localization demands high precision, the data collected by the laser radar sensor are typically precise and low-noise, so a map built from laser radar data improves the localization result. It can be seen that building the overall environment map from the data collected by the laser radar sensor supports autonomous localization on top of map construction, saves map building time, reduces the amount of data to process, and improves navigation efficiency. In practice, the laser radar sensor may be mounted on the robot's chassis.
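The patent leaves the SLAM internals to known methods such as scan matching and graph optimization. As an illustrative sketch only, and not the patented algorithm, the fragment below shows the core bookkeeping of two-dimensional grid map building: an encoder-derived pose estimate places each laser radar return into a grid cell. The nearest-cell convention, resolution, and all names are assumptions made for the example.

```python
import math

def update_grid(grid, size, res, pose, scan):
    """Mark obstacle cells of a 2D grid map from one lidar scan.

    grid  : dict mapping (ix, iy) cell indices to hit counts
    pose  : (x, y, theta) pose estimate, e.g. the encoder-based initial guess
    scan  : list of (bearing, range) returns from the laser radar
    """
    x, y, th = pose
    for bearing, rng in scan:
        if rng <= 0:
            continue  # skip invalid returns
        ox = x + rng * math.cos(th + bearing)  # obstacle point in world frame
        oy = y + rng * math.sin(th + bearing)
        ix, iy = int(round(ox / res)), int(round(oy / res))  # nearest cell
        if 0 <= ix < size and 0 <= iy < size:
            grid[(ix, iy)] = grid.get((ix, iy), 0) + 1
    return grid

# A robot at the origin sees a wall about 2 m ahead with two beams.
grid = update_grid({}, size=100, res=0.1, pose=(0.0, 0.0, 0.0),
                   scan=[(0.0, 2.0), (0.1, 2.03)])
```

In a full SLAM loop the pose passed in would itself be refined by matching the scan against the map so far; here it is taken as given to keep the sketch short.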
102. Obtain, in real time and through a localization algorithm, the current position of the robot in the overall environment map, from the data collected by the laser radar sensor, the accelerometer sensor, the gyroscope sensor and the magnetometer sensor, the overall environment map, and the encoder data.
Taking an actual scene as an example: on the basis of the overall environment map built in the preceding step, real-time localization is performed using the data collected by the laser radar sensor, accelerometer sensor, gyroscope sensor and magnetometer sensor, yielding the robot's current position, i.e. the robot's current position information within the overall environment map.
103. Obtain, in real time and through a path planning algorithm, the planned route of the robot from the current position to the target position, from the overall environment map and the robot's current position.
Path planning algorithms include but are not limited to the A* algorithm, Dijkstra's algorithm, and the like.
Taking an actual scene as an example: once the robot is localized, i.e. its current position is obtained, a route from the current position to the target position can be planned on the overall environment map built in the preceding step, producing the planned route.
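The text names A* and Dijkstra's algorithm as candidate global planners but gives no implementation. As a minimal generic illustration, not code from the patent, here is A* over a two-dimensional grid map of the kind built in step 101 (cells marked 1 are occupied):

```python
import heapq

def a_star(grid, start, goal):
    """A* on a 2D grid: grid[y][x] == 1 means blocked. Returns a cell path."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    open_set = [(h(start), 0, start, None)]  # (f, g, cell, parent)
    came, cost = {}, {start: 0}
    while open_set:
        _, g, cur, parent = heapq.heappop(open_set)
        if cur in came:
            continue  # already expanded with a better cost
        came[cur] = parent
        if cur == goal:
            path = []
            while cur is not None:
                path.append(cur)
                cur = came[cur]
            return path[::-1]
        x, y = cur
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= ny < len(grid) and 0 <= nx < len(grid[0]) and grid[ny][nx] == 0:
                ng = g + 1
                if ng < cost.get((nx, ny), float("inf")):
                    cost[(nx, ny)] = ng
                    heapq.heappush(open_set, (ng + h((nx, ny)), ng, (nx, ny), cur))
    return None  # no route exists

# A wall across row 1 forces a detour through the gap at x = 2.
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
route = a_star(grid, (0, 0), (0, 2))
```

Because the route is only a general directional reference (see above), a coarse grid and a simple admissible heuristic are usually sufficient at this stage.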
104. According to the current position and the planned route, control the robot to move while avoiding obstacles through a local path planning algorithm, using the data collected by the laser radar sensor, the depth camera, the ultrasonic sensor and the infrared sensor, and the encoder data.
Local planning algorithms include but are not limited to dynamic window simulation, the D* algorithm, and the like.
Taking an actual scene as an example: based on the planned route and the robot's current position obtained in real time, the data collected by the laser radar sensor, depth camera, ultrasonic sensor and infrared sensor are fed to a local path planning algorithm for decision making, and the robot is controlled to walk in real time while avoiding the static or dynamic obstacles around it.
Specifically, since local path planning must ensure the safety and smoothness of the robot's walking, the data collected by the laser radar, depth camera, ultrasonic sensor and infrared sensor, together with the encoder data, need to be fused.
In practice, since the laser radar sensor cannot detect obstacles above or below its own laser plane, infrared and ultrasonic sensors can be mounted below the laser radar sensor to detect obstacles near the ground. Because ultrasound has a close-range blind zone but a long visual range, while infrared has a short visual range but no close-range blind zone, using the two together achieves a long visual range with no blind zone. In addition, the depth camera provides a three-dimensional point cloud of the nearby environment, from which all obstructing objects in front of the robot can be obtained directly.
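To make the ultrasonic/infrared complementarity concrete, the sketch below fuses one infrared and one ultrasonic reading that point in the same direction, so that the pair covers both near and far obstacles. The range limits are invented for the example, not taken from the patent:

```python
# Valid measurement windows (illustrative values, not from the patent):
IR_MAX = 0.8                 # infrared: short range, no close-range blind zone
US_MIN, US_MAX = 0.25, 4.0   # ultrasonic: long range, blind below ~25 cm

def fused_range(ir, us):
    """Combine one infrared and one ultrasonic reading (metres) taken
    along the same bearing; readings outside a sensor's valid window
    are discarded, and the nearest surviving obstacle wins."""
    candidates = []
    if ir is not None and ir <= IR_MAX:
        candidates.append(ir)
    if us is not None and US_MIN <= us <= US_MAX:
        candidates.append(us)
    return min(candidates) if candidates else None

near = fused_range(ir=0.12, us=None)  # inside the ultrasonic blind zone
far = fused_range(ir=None, us=2.5)    # beyond infrared reach
```

Taking the minimum is a conservative choice for obstacle avoidance: the controller reacts to whichever sensor reports the closer obstacle.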
With the robot navigation method based on multi-sensor data fusion provided by this embodiment, an overall environment map is built from the data collected by the laser radar sensor and the encoder data; the robot is localized in real time from the encoder data and the data collected by the laser radar, accelerometer, gyroscope and magnetometer sensors; a route is planned from the robot's current position; and the robot is controlled to move while avoiding obstacles by combining the data collected by the laser radar sensor, depth camera, ultrasonic sensor and infrared sensor, thereby achieving autonomous robot navigation. Based on the characteristics of the different sensors, this scheme fuses different sensor data for the corresponding processing steps; on the basis of using the sensors rationally to achieve robot navigation, it applies flexibly to a variety of scenes, takes cost into account, and achieves good autonomous navigation performance.
Fig. 2 is a schematic flow chart of the robot navigation method based on multi-sensor data fusion provided by embodiment two of the present invention. As shown in Fig. 2, this embodiment is again described as applied in a robot navigation device. On the basis of embodiment one, step 102 includes:
201. Initialize the robot's attitude from the data collected by the magnetometer sensor;
202. In real time, compute the estimated change in the robot's position and attitude from the previous moment to the current moment, from the data collected by the accelerometer sensor and the gyroscope sensor at the current and previous moments and the encoder data;
203. Determine the map points in the overall environment map corresponding to the position and attitude change estimates, and obtain the map data of those map points;
204. Match the data currently collected by the laser radar sensor against the map data of each map point to obtain the robot's current position, the current position being the position of the best-matching map point.
Taking an actual scene as an example: at initialization, the data collected by the magnetometer sensor are read first to obtain an initial estimate of the robot's absolute heading, and the heading is initialized with this estimate. While the robot moves, the encoder data and the data collected by the accelerometer and gyroscope sensors are read continuously, and the change in position and attitude from the previous moment to the current moment is estimated. Specifically, integrating the encoder data yields a position and attitude change estimate, integrating the data collected by the accelerometer sensor yields a position change estimate, and integrating the data collected by the gyroscope sensor yields an attitude change estimate; a weighted average of the three then gives the position and attitude change from the previous moment to the current moment. Afterwards, according to the position and attitude change estimate, the map data of the corresponding map points are matched against the data currently collected by the laser radar sensor, and the best-matching map point is the robot's current position.
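The patent does not give the weights for the average described above. As an illustration under stated assumptions, the sketch below combines the motion estimates per source: position change from the encoder and the integrated accelerometer, heading change from the encoder and the integrated gyroscope. The weight values and the two-way split are assumptions for the example, not values from the patent:

```python
def fuse_pose_delta(enc, acc_xy, gyro_dth, w_enc=0.6):
    """Weighted average of per-sensor motion estimates over one step.

    enc     : (dx, dy, dtheta) integrated from the encoder
    acc_xy  : (dx, dy) integrated from the accelerometer
    gyro_dth: dtheta integrated from the gyroscope
    w_enc   : illustrative encoder weight (not a value from the patent)
    """
    w_other = 1.0 - w_enc
    dx = w_enc * enc[0] + w_other * acc_xy[0]
    dy = w_enc * enc[1] + w_other * acc_xy[1]
    dth = w_enc * enc[2] + w_other * gyro_dth
    return (dx, dy, dth)

# Encoder says 10 cm forward with a 0.05 rad turn; the accelerometer and
# gyroscope integrals disagree slightly.
delta = fuse_pose_delta(enc=(0.10, 0.00, 0.05), acc_xy=(0.12, 0.01), gyro_dth=0.06)
```

In practice the weights would reflect each sensor's noise characteristics, with the encoder usually trusted most on smooth floors and least during wheel slip.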
Specifically, autonomous localization can be performed with Monte Carlo particle filtering. Accordingly, the particles' attitudes are initialized from the data collected by the magnetometer sensor; while the robot moves, the change in each particle's position and attitude from the previous moment to the current moment is estimated in real time from the data collected by the accelerometer and gyroscope sensors at the current and previous moments and the encoder data; a number of map points are determined from the position and attitude change estimates; and the map data of these map points are matched against the data currently collected by the laser radar sensor, the best-matching map point being the robot's current position.
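The patent does not spell out how a particle's match score is computed. As a toy sketch with illustrative assumptions throughout (cell convention, resolution, scoring rule), the fragment below shows the matching step of Monte Carlo localization: each particle's hypothesized pose projects the current lidar scan onto the map, and the particle whose projected returns land on the most occupied cells wins:

```python
import math

def particle_score(particle, scan, occupied, res=0.1):
    """Count how many lidar returns land on occupied map cells when the
    current scan is projected from a particle's hypothesized pose."""
    x, y, th = particle
    hits = 0
    for bearing, rng in scan:
        px = x + rng * math.cos(th + bearing)  # projected obstacle point
        py = y + rng * math.sin(th + bearing)
        if (int(round(px / res)), int(round(py / res))) in occupied:
            hits += 1
    return hits

occupied = {(20, 0), (20, 1), (20, 2)}  # map cells of a wall at x = 2 m
# A scan that sees that wall straight on from the origin.
scan = [(b * 0.05, 2.0 / math.cos(b * 0.05)) for b in range(3)]
particles = [(0.0, 0.0, 0.0), (0.5, 0.5, 0.3)]
best = max(particles, key=lambda p: particle_score(p, scan, occupied))
```

A full filter would convert such scores into normalized weights, resample, and apply the fused motion delta between updates; only the scoring is shown here.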
In the matching above, the laser radar sensor offers high precision, low noise, and a long observation range, so it can capture distant line features, and line features are the main basis for matching. Matching only against the data collected by the laser radar sensor therefore effectively saves matching time and improves localization efficiency, enabling faster navigation.
Moreover, jointly using the encoder data, the accelerometer sensor and the gyroscope sensor to compute each particle's position and attitude change estimate realizes multi-source fusion and improves localization accuracy, making robot navigation more accurate and reliable.
With the robot navigation method based on multi-sensor data fusion provided by this embodiment, autonomous localization jointly uses the encoder data, the accelerometer sensor and the gyroscope sensor to compute the current position and attitude change estimate, while matching is based only on the data collected by the laser radar sensor. This not only effectively saves matching time but also improves localization accuracy, making robot navigation faster, more accurate and more reliable.
Fig. 3A is a schematic flow chart of one robot navigation method based on multi-sensor data fusion provided by embodiment three of the present invention. As shown in Fig. 3A, this embodiment is again described as applied in a robot navigation device. On the basis of embodiment one or embodiment two, step 104 includes:
301. In real time, judge from the ultrasonic sensor data or the data collected by the infrared sensor whether the robot is in a blocked-movement state;
302. If the robot is not in a blocked-movement state, build a local environment map from the data currently collected by the depth camera and the laser radar sensor, and control the robot to move while avoiding obstacles through a local path planning algorithm, according to the local environment map, the robot's current position, and the planned route.
Taking an actual scene as an example: the local planning strategy divides into two situations, planning under normal conditions and handling of special states. For the first situation, whether the robot is in a blocked-movement state is judged in real time from the ultrasonic sensor data or the data collected by the infrared sensor. If it is not, i.e. the robot is in its normal walking process, the usual planning strategy is used: a local environment map is built from the data currently collected by the depth camera and the laser radar sensor, and, combined with the robot's current position and planned route obtained above, the local path planning algorithm controls the robot to move while avoiding obstacles. Specifically, a dynamic window simulation algorithm can be used to control the robot's movement.
In this embodiment, the dynamic window simulation algorithm can build, in real time from the sensor data, a small-range local environment map around the robot, then run simulations over that map to make decisions. Specifically, the local environment map is built from two-dimensional planar data and three-dimensional point cloud data, i.e. the data collected by the laser radar sensor and the depth camera. These two sensors are used because the local environment map requires real-time updates: if one-dimensional single-point data such as those collected by the ultrasonic and infrared sensors were written into the local environment map, obstacles could never be cleared. That is, at one moment the ultrasonic or infrared sensor observes an obstacle, but at the next moment it is no longer observing the position of that obstacle point, so whether the obstacle from the previous moment still exists cannot be determined; obstacles would gradually accumulate in the local environment map, leaving the robot with no viable decision and ultimately causing navigation to fail. Two-dimensional planar data and three-dimensional point cloud data do not have this problem, because the laser radar sensor and the depth camera observe not a single point but a whole region, so over most of the region the map stays current and the reliability of navigation is preserved.
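As an illustration of the dynamic window idea, not the patent's implementation, the sketch below samples a handful of velocity commands, forward-simulates each arc against obstacle points taken from the local environment map, and keeps the admissible command whose arc ends nearest the goal. All sampled velocities, the clearance, and the time horizon are invented for the example:

```python
import math

def simulate(v, w, steps=10, dt=0.1):
    """Forward-simulate a constant (v, w) command from the robot's pose."""
    x = y = th = 0.0
    traj = []
    for _ in range(steps):
        x += v * math.cos(th) * dt
        y += v * math.sin(th) * dt
        th += w * dt
        traj.append((x, y))
    return traj

def pick_command(obstacles, goal, clearance=0.1):
    """Dynamic-window-style choice: drop commands whose simulated arc
    passes too close to an obstacle point from the local environment
    map, then keep the command whose arc ends nearest the goal."""
    best_d, best_cmd = float("inf"), (0.0, 0.0)
    for v in (0.2, 0.4):                    # sampled linear velocities
        for w in (-1.5, 0.0, 1.5):          # sampled angular velocities
            traj = simulate(v, w)
            if any(math.hypot(px - ox, py - oy) < clearance
                   for px, py in traj for ox, oy in obstacles):
                continue  # this arc collides
            d = math.hypot(traj[-1][0] - goal[0], traj[-1][1] - goal[1])
            if d < best_d:
                best_d, best_cmd = d, (v, w)
    return best_cmd

# An obstacle slightly left of straight ahead forces a turning command.
cmd = pick_command(obstacles=[(0.35, 0.05)], goal=(1.0, 1.0))
```

A production planner would also score heading alignment, clearance margin, and velocity, and would sample the window around the current velocity rather than a fixed set.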
For the other situation, when the real-time judgment from the ultrasonic sensor data or the data collected by the infrared sensor is that the robot is in a blocked-movement state, the robot has run into special circumstances, for example being too close to an obstacle, coinciding with an obstacle owing to localization error, or being surrounded by obstacles; special-state handling is then entered. Specifically, the robot can be controlled to move clear of obstacles according to the ultrasonic sensor data or the data currently collected by the infrared sensor and a preset direction selection strategy. Accordingly, Fig. 3B is a schematic flow chart of another robot navigation method based on multi-sensor data fusion provided by embodiment three of the present invention. As shown in Fig. 3B, this embodiment is again described as applied in a robot navigation device based on multi-sensor data fusion. On the basis of the embodiment shown in Fig. 3A, after step 301 the method can further include:
303. If the robot is in a blocked-movement state, control the robot to move clear of obstacles according to the ultrasonic sensor data or the data currently collected by the infrared sensor and a preset direction selection strategy.
The ultrasonic and infrared sensors handle the special situations: for example, when the ultrasonic or infrared sensor observes that the robot is too close to an obstacle, that the robot is surrounded, or that the robot believes it coincides with an obstacle owing to localization error, special-state handling is performed. Decisions are then no longer made from the local environment map; instead the robot flees directly in a direction away from the obstacles. The specific direction selection strategy includes but is not limited to the opposite direction of the closest point, artificial potential field synthesis, and the like.
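Both named escape strategies can be sketched in a few lines. This is an illustrative reading of the two strategies, with the sensor-ring data layout and the inverse-distance repulsion assumed for the example rather than taken from the patent:

```python
import math

def closest_point_escape(readings):
    """'Closest-point opposite direction': retreat directly away from
    the nearest reading of the ultrasonic/infrared ring.
    readings: list of (bearing, distance) pairs in the robot frame."""
    bearing, _ = min(readings, key=lambda r: r[1])
    return (bearing + math.pi) % (2 * math.pi)

def potential_field_escape(readings):
    """'Artificial potential field synthesis': sum inverse-distance
    repulsive vectors from every reading and flee along the resultant,
    so several nearby obstacles all push on the chosen heading."""
    fx = sum(-math.cos(b) / d for b, d in readings)
    fy = sum(-math.sin(b) / d for b, d in readings)
    return math.atan2(fy, fx)

# Ring readings as (bearing, distance); the nearest obstacle is dead ahead.
ring = [(0.0, 0.08), (math.pi / 2, 0.60), (-math.pi / 2, 0.45)]
heading = closest_point_escape(ring)  # retreat straight backwards
```

The potential-field variant is smoother when several obstacles crowd the robot, since it blends all readings instead of reacting to one point.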
In the robot navigation method based on multi-sensor data fusion provided by this embodiment, within the movement control strategy for the robot, the current state is determined based on the data collected by the ultrasonic sensor or the infrared sensor. If the state is normal, a local environment map is built according to the data currently collected by the depth camera and the laser radar sensor, and based on the local environment map, combined with the robot's current location and planned route, the robot is controlled to move around obstacles; otherwise, the robot escapes using a preset direction selection strategy, thereby ensuring the reliability of robot navigation.
Further, Fig. 4 is a flow diagram of a robot navigation method based on multi-sensor data fusion provided by Embodiment 4 of the present invention. As shown in Fig. 4, this embodiment is still described as applied to a robot navigation device. On the basis of Embodiment 3, building a local environment map according to the data currently collected by the depth camera and the laser radar sensor in step 302 includes:
401. Denoise the data currently collected by the depth camera, and project the denoised data onto a two-dimensional plane to obtain two-dimensional projection data;
402. If a position corresponding to the two-dimensional projection data has no data collected by the laser radar sensor, fill the local environment map data of that position with the corresponding two-dimensional projection data;
403. If a position corresponding to the two-dimensional projection data has data collected by the laser radar sensor, compute a weighted average of the two-dimensional projection data and the data collected by the laser radar sensor to obtain the local environment map data of that position.
Taking an actual scene as an example: the local planning strategy is divided into two situations overall. One situation is the planning strategy in the normal state; the other is special-state processing. Specifically, for the first situation, whether the robot is in a blocked state during movement is determined in real time according to the ultrasonic sensor data or the data collected by the infrared sensor. If the robot is not blocked, i.e., the robot is walking normally, the conventional planning strategy is used: according to the local environment map built from the data currently collected by the depth camera and the laser radar sensor, combined with the robot's current location obtained above and the planned route, the robot is controlled by a local path planning algorithm to move around obstacles.
In this embodiment, controlling the robot's movement with the dynamic window approach is taken as an example. The dynamic window approach is based on a two-dimensional grid map; therefore, the three-dimensional point cloud data is first denoised and then projected onto the two-dimensional plane to obtain two-dimensional projection data. The three-dimensional point cloud is projected column by column: if any point in a column indicates an obstacle, the projection result of that column is an obstacle; otherwise it is projected as free space. The two-dimensional projection data is fused with the two-dimensional plane data collected by the laser radar sensor. The specific strategy is to update the two-dimensional data collected by the laser radar sensor with the two-dimensional projection data: positions not directly observed by the laser radar sensor are filled with the obtained two-dimensional projection data, and for positions observed by the laser radar sensor, a weighted average of the two-dimensional projection data and the data collected by the laser radar sensor is computed. The method is applicable to multiple depth cameras and multiple laser radar sensors.
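By way of illustration only, the fill-or-weighted-average rule just described could be sketched as follows; the grid representation, the `UNKNOWN` marker, and the particular lidar weight are assumptions (the patent specifies a weighted average but not the weights):

```python
UNKNOWN = None  # grid cells the lidar never observed

def fuse_local_map(lidar_grid, projection_grid, w_lidar=0.7):
    """Fuse a lidar occupancy grid with a depth-camera projection grid.

    Both grids are 2D lists of occupancy probabilities in [0, 1];
    lidar cells the sensor did not observe hold UNKNOWN. Unobserved
    cells are filled from the projection; observed cells take a
    weighted average, with the lidar weighted higher because its
    precision is higher (the weight value itself is an assumption).
    """
    fused = []
    for lrow, prow in zip(lidar_grid, projection_grid):
        row = []
        for l, p in zip(lrow, prow):
            if l is UNKNOWN:
                row.append(p)  # fill: lidar never saw this cell
            else:
                row.append(w_lidar * l + (1.0 - w_lidar) * p)
        fused.append(row)
    return fused
```

With several depth cameras or lidars, the same rule applies cell by cell to each additional grid in turn.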
In the robot navigation method based on multi-sensor data fusion provided by this embodiment, the three-dimensional point cloud data collected by the depth camera is denoised and projected, the resulting two-dimensional projection data is fused with the two-dimensional plane data collected by the laser radar sensor, a local environment map is built, and movement control of the robot is realized. By fusing the two-dimensional plane data with the three-dimensional point cloud data, this embodiment can achieve a better local planning effect and improve navigation accuracy.
Those of ordinary skill in the art will appreciate that all or part of the steps of the above method embodiments may be completed by hardware related to program instructions. The aforementioned program may be stored in a readable storage medium. When executed, the program performs the steps of the above method embodiments; the aforementioned storage medium includes various media that can store program code, such as ROM, RAM, a magnetic disk, or an optical disc.
Fig. 5 is a structural diagram of a robot navigation device based on multi-sensor data fusion provided by Embodiment 5 of the present invention. As shown in Fig. 5, the robot navigation device may be integrated in an autonomous robot navigation system, and the device includes:
a map construction module 51, configured to build a global environment map according to the data collected by the laser radar sensor and the encoder data, using simultaneous localization and mapping technology;
a real-time localization module 52, configured to obtain, in real time, the robot's current location in the global environment map by a vision localization algorithm, according to the data collected by the laser radar sensor, the accelerometer sensor, the gyroscope sensor, and the magnetometer sensor, the global environment map, and the encoder data;
a route planning module 53, configured to obtain, in real time, the robot's planned route from the current location to a target location by a path planning algorithm, according to the global environment map and the robot's current location;
a control module 54, configured to control, by a local path planning algorithm, the robot to move around obstacles according to the current location and the planned route, using the data collected by the laser radar sensor, the depth camera, the ultrasonic sensor, and the infrared sensor, and the encoder data.
Specifically, the simultaneous localization and mapping methods adopted by the map construction module 51 include, but are not limited to, methods such as scan matching and graph optimization. Taking an actual scene as an example: when the robot is placed in a new environment, a map of the current environment needs to be drawn using a simultaneous localization and mapping method. Specifically, the robot is controlled to move in this environment while the laser radar sensor continuously collects data, and the map construction module 51 uses a SLAM algorithm to calculate and draw the corresponding global environment map in real time. Based on the global environment map built by the map construction module 51, the real-time localization module 52 performs real-time localization using the data collected by the laser radar sensor, the accelerometer sensor, the gyroscope sensor, and the magnetometer sensor to obtain the robot's current location. After the real-time localization module 52 obtains the robot's current location, the route planning module 53 performs route planning from the current location to the target location according to that current location and the global environment map built by the map construction module 51, obtaining a planned route. Based on the planned route and the robot's current location obtained in real time, the control module 54 makes decisions through a local path planning algorithm using the data collected by the laser radar sensor, the depth camera, the ultrasonic sensor, and the infrared sensor, so that the robot can be controlled to move and walk while avoiding surrounding static or dynamic obstacles in real time.
The global environment map is a two-dimensional grid map. The data collected by the laser radar sensor is used for the data matching of the SLAM algorithm, and the encoder data is used to provide an initial estimate for each iteration of the SLAM algorithm, thereby accelerating the map construction process. Moreover, the data collected by the laser radar sensor usually has high precision and little noise, so a map built from laser radar data can improve the effect of autonomous localization. Building the global environment map using the data collected by the laser radar sensor can save the time required for map construction, reduce the amount of data processing while constructing the environment map and realizing autonomous localization, and improve navigation efficiency. In practical applications, the laser radar sensor may be mounted on the chassis of the robot.
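By way of illustration only (the patent does not specify the grid update rule), a two-dimensional grid map of the kind described could be populated from one lidar scan by marking each beam's endpoint as occupied and the cells the beam traverses as free; the resolution, the dictionary representation, and the ray-stepping scheme are assumptions:

```python
import math

def update_grid(grid, pose, scan, resolution=0.1):
    """Mark lidar beam endpoints as occupied and cells along each
    beam as free in a 2D occupancy grid.

    grid: dict mapping (ix, iy) -> 'occupied' | 'free'
    pose: (x, y, theta) of the robot in the map frame
    scan: list of (bearing, range) lidar readings
    """
    x, y, th = pose
    for bearing, rng in scan:
        a = th + bearing
        # Walk along the beam in resolution-sized steps, marking free
        # (setdefault never downgrades a cell already seen occupied).
        for i in range(int(rng / resolution)):
            d = i * resolution
            cell = (round((x + d * math.cos(a)) / resolution),
                    round((y + d * math.sin(a)) / resolution))
            grid.setdefault(cell, 'free')
        hit = (round((x + rng * math.cos(a)) / resolution),
               round((y + rng * math.sin(a)) / resolution))
        grid[hit] = 'occupied'
    return grid
```

A SLAM pipeline would apply such an update after each scan is registered to the map, using the encoder-based initial estimate to seed the registration as described above.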
The global environment map established above can be reused, so the map construction module 51 usually needs to perform the map construction process only once. Only when the environment has changed significantly, or the robot is placed in a new environment, does the map construction module 51 need to rebuild the global environment map.
The path planning algorithms include, but are not limited to, the A* algorithm, Dijkstra's algorithm, and the like; the local planning algorithms include, but are not limited to, the dynamic window approach, the D* algorithm, and the like.
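As a minimal sketch of the first global planner named above (A* on the two-dimensional grid map), where the Manhattan heuristic and the 4-connected neighborhood are illustrative assumptions:

```python
import heapq

def a_star(grid, start, goal):
    """A* on a 4-connected grid. grid[r][c] == 0 means free, 1 means
    obstacle. Returns the path as a list of (row, col), or None."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan
    open_set = [(h(start), 0, start, [start])]
    best_g = {start: 0}
    while open_set:
        f, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nb = (node[0] + dr, node[1] + dc)
            if (0 <= nb[0] < rows and 0 <= nb[1] < cols
                    and grid[nb[0]][nb[1]] == 0
                    and g + 1 < best_g.get(nb, float('inf'))):
                best_g[nb] = g + 1
                heapq.heappush(open_set,
                               (g + 1 + h(nb), g + 1, nb, path + [nb]))
    return None
```

Dijkstra's algorithm is the special case with a zero heuristic; the route planning module would run such a search from the current location to the target location on the global map.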
Specifically, since local path planning must ensure the safety and smoothness of the robot's walking, the data collected by the laser radar, the depth camera, the ultrasonic sensor, and the infrared sensor, together with the encoder data, need to be fused.
The robot navigation device provided by this embodiment can execute the technical solution of the method embodiment of Embodiment 1; its implementation principle and technical effect are similar and are not repeated here.
The robot navigation device based on multi-sensor data fusion provided by this embodiment builds a global environment map using the data collected by the laser radar sensor and the encoder data, localizes the robot in real time according to the encoder data and the data collected by the laser radar sensor, the accelerometer sensor, the gyroscope sensor, and the magnetometer sensor, plans a route according to the robot's current location, and controls the robot to move around obstacles by combining the data collected by the laser radar sensor, the depth camera, the ultrasonic sensor, and the infrared sensor, thereby realizing autonomous robot navigation. Based on the characteristics of the different sensors, this solution fuses different sensor data for the corresponding processing. On the basis of realizing robot navigation with a reasonable use of sensors, it is flexibly applicable to multiple scenes, takes cost into account, and can achieve a good autonomous navigation effect.
Fig. 6 is a structural diagram of a robot navigation device based on multi-sensor data fusion provided by Embodiment 6 of the present invention. As shown in Fig. 6, on the basis of Embodiment 5, the real-time localization module 52 includes:
an initialization unit 521, configured to initialize the attitude of the robot according to the data collected by the magnetometer sensor;
an estimation unit 522, configured to calculate, in real time, a position and attitude change estimate of the robot at the current time relative to the previous time, according to the data collected by the accelerometer sensor and the gyroscope sensor at the current time and the previous time, and the encoder data;
an acquisition unit 523, configured to determine map points in the global environment map corresponding to the position and attitude change estimate, and obtain map data of the map points;
a matching unit 524, configured to obtain the robot's current location by matching the data currently collected by the laser radar sensor against the map data of the map points respectively, the robot's current location being the position of the map point with the highest matching degree.
Taking an actual scene as an example: during initialization, the initialization unit 521 reads the data collected by the magnetometer sensor to obtain an initial estimate of the robot's absolute direction, and initializes the direction using this initial estimate. During the robot's movement, the estimation unit 522 continuously reads the encoder data and the data collected by the accelerometer sensor and the gyroscope sensor, and calculates the position and attitude change estimate of the robot at the current time relative to the previous time. Integrating the encoder data yields a position and attitude change estimate, integrating the data collected by the accelerometer sensor yields a position change estimate, and integrating the data collected by the gyroscope sensor yields an attitude change estimate; a weighted average of these three then gives the position and attitude change estimate of the current time relative to the previous time. Afterwards, the acquisition unit 523 determines the corresponding map points according to the position and attitude change estimate, and the matching unit 524 matches the map data of the corresponding map points against the data currently collected by the laser radar sensor; the map point with the highest matching degree is the robot's current location.
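The weighted average of the three motion estimates described above could be sketched as follows; the particular weights, and the convention that translation averages the encoder and accelerometer estimates while heading averages the encoder and gyroscope estimates, are illustrative assumptions rather than the patent's prescription:

```python
def fuse_motion_estimates(enc_delta, acc_delta, gyro_dtheta,
                          weights=(0.5, 0.25, 0.25)):
    """Weighted average of three per-step motion estimates.

    enc_delta:   (dx, dy, dtheta) from integrating the wheel encoders
    acc_delta:   (dx, dy) from double-integrating the accelerometer
    gyro_dtheta: dtheta from integrating the gyroscope
    weights:     (encoder, accelerometer, gyroscope) weights; each
                 component is normalized over the sources that supply it.
    """
    w_enc, w_acc, w_gyr = weights
    dx = (w_enc * enc_delta[0] + w_acc * acc_delta[0]) / (w_enc + w_acc)
    dy = (w_enc * enc_delta[1] + w_acc * acc_delta[1]) / (w_enc + w_acc)
    dth = (w_enc * enc_delta[2] + w_gyr * gyro_dtheta) / (w_enc + w_gyr)
    return dx, dy, dth
```

The fused (dx, dy, dtheta) is what the acquisition unit would use to pick candidate map points for scan matching.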
Specifically, autonomous robot localization may be performed using the Monte Carlo particle filtering method. Accordingly, the initialization unit 521 initializes the attitude of the particles according to the data collected by the magnetometer sensor. During the robot's movement, the estimation unit 522 calculates, in real time, the particle position and attitude change estimate of the current time relative to the previous time according to the data collected by the accelerometer sensor and the gyroscope sensor at the current time and the previous time, and the encoder data; the acquisition unit 523 determines several map points according to the position and attitude change estimate; and the matching unit 524 matches the map data of these map points against the data currently collected by the laser radar sensor. The map point with the highest matching degree is the robot's current location.
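A minimal sketch of one Monte Carlo particle filtering step of the kind described follows; the noise magnitudes and the scan-match scoring interface are assumptions:

```python
import math
import random

def mcl_step(particles, motion, scan_match_score):
    """One Monte Carlo localization step: move every particle by the
    fused motion estimate (with noise), weight it by how well the
    lidar scan matches the map at that pose, then resample.

    particles: list of (x, y, theta)
    motion: (dx, dy, dtheta) in the robot frame
    scan_match_score: pose -> non-negative matching degree
    """
    dx, dy, dth = motion
    moved = []
    for x, y, th in particles:
        # Rotate the body-frame motion into the map frame and jitter.
        nx = x + dx * math.cos(th) - dy * math.sin(th) + random.gauss(0, 0.01)
        ny = y + dx * math.sin(th) + dy * math.cos(th) + random.gauss(0, 0.01)
        moved.append((nx, ny, th + dth + random.gauss(0, 0.005)))
    weights = [scan_match_score(p) for p in moved]
    total = sum(weights) or 1.0
    # Resample proportionally to the matching degree.
    return random.choices(moved, weights=[w / total for w in weights],
                          k=len(moved))
```

After a few such steps the particle cloud concentrates around the pose whose predicted scan best matches the map — the "highest matching degree" location in the text.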
In the above matching process, the laser radar sensor has high precision, low noise, and a long observation distance, and can obtain line features at a distance; line features are the main basis for matching. Therefore, matching based only on the data collected by the laser radar sensor can effectively save the time required for matching and improve the efficiency of robot localization, thereby realizing navigation faster.
Moreover, calculating the current position and attitude change estimate by jointly using the encoder data, the accelerometer sensor, and the gyroscope sensor realizes multi-source fusion and can improve the accuracy of robot localization, thereby realizing robot navigation more accurately and reliably.
The robot navigation device provided by this embodiment can execute the technical solution of the method embodiment of Embodiment 2; its implementation principle and technical effect are similar and are not repeated here.
When the robot navigation device based on multi-sensor data fusion provided by this embodiment realizes autonomous robot localization, the current position and attitude change estimate is calculated by jointly using the encoder data, the accelerometer sensor, and the gyroscope sensor, and matching is performed based only on the data collected by the laser radar sensor. This not only effectively saves the time required for matching but also improves the accuracy of robot localization, thereby realizing robot navigation more quickly, accurately, and reliably.
Fig. 7A is a structural diagram of a robot navigation device based on multi-sensor data fusion provided by Embodiment 7 of the present invention. As shown in Fig. 7A, on the basis of Embodiment 5 or Embodiment 6, the control module 54 includes:
a detection unit 541, configured to judge, in real time, whether the robot is in a blocked state during movement according to the ultrasonic sensor data or the data collected by the infrared sensor;
a first control unit 542, configured to, if the detection unit 541 detects that the robot is not in a blocked state during movement, build a local environment map according to the data currently collected by the depth camera and the laser radar sensor, and control the robot to move around obstacles by a local path planning algorithm according to the local environment map, the robot's current location, and the planned route.
Taking an actual scene as an example: the local planning strategy is divided into two situations overall. One situation is the planning strategy in the normal state; the other is special-state processing. Specifically, for the first situation, the detection unit 541 judges in real time whether the robot is in a blocked state during movement according to the ultrasonic sensor data or the data collected by the infrared sensor. If the robot is not blocked, i.e., the robot is walking normally, the first control unit 542 plans using the conventional planning strategy: according to the local environment map built from the data currently collected by the depth camera and the laser radar sensor, combined with the robot's current location obtained above and the planned route, the robot is controlled by a local path planning algorithm to move around obstacles. Specifically, robot movement control may be performed using the dynamic window approach.
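By way of illustration only, a much-simplified dynamic-window step could sample velocity commands, simulate each a few steps forward against the fused local map, discard commands that collide, and pick the one ending nearest the goal; the sampling counts, horizon, and cost function are assumptions:

```python
import math

def dwa_choose_velocity(pose, goal, occupied, v_range, w_range,
                        dt=0.5, samples=7):
    """Pick the (v, w) command whose short forward simulation ends
    closest to the goal without entering an occupied cell.

    pose: (x, y, theta); goal: (x, y)
    occupied: (x, y) -> bool, True if that point hits an obstacle
    v_range, w_range: (min, max) linear / angular velocity windows
    """
    best, best_cost = None, float('inf')
    for i in range(samples):
        v = v_range[0] + (v_range[1] - v_range[0]) * i / (samples - 1)
        for j in range(samples):
            w = w_range[0] + (w_range[1] - w_range[0]) * j / (samples - 1)
            x, y, th = pose
            safe = True
            for _ in range(5):  # simulate 5 steps ahead
                th += w * dt
                x += v * math.cos(th) * dt
                y += v * math.sin(th) * dt
                if occupied((x, y)):
                    safe = False
                    break
            if safe:
                cost = math.hypot(goal[0] - x, goal[1] - y)
                if cost < best_cost:
                    best, best_cost = (v, w), cost
    return best
```

A full dynamic window approach would also bound v_range and w_range by the robot's acceleration limits and add velocity and clearance terms to the cost; this sketch keeps only the sample-simulate-score core.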
For the other situation, i.e., when the detection unit 541 judges in real time from the ultrasonic sensor data or the data collected by the infrared sensor that the robot is in a blocked state during movement, the robot has encountered a special situation, for example: being too close to an obstacle, overlapping with an obstacle due to positioning error, or being surrounded by obstacles. In that case special-state processing is entered; specifically, the robot may be controlled to move around obstacles according to the data currently collected by the ultrasonic sensor or the infrared sensor and a preset direction selection strategy. Accordingly, Fig. 7B is a structural diagram of another robot navigation device based on multi-sensor data fusion provided by Embodiment 7 of the present invention. As shown in Fig. 7B, on the basis of the implementation shown in Fig. 7A, the control module 54 further includes:
a second control unit 543, configured to, if the detection unit 541 detects that the robot is in a blocked state during movement, control the robot to move around obstacles according to the data currently collected by the ultrasonic sensor or the infrared sensor and a preset direction selection strategy.
The ultrasonic sensor and the infrared sensor are used for handling special problems. For example, when the detection unit 541 observes through the ultrasonic sensor or the infrared sensor that the robot is too close to an obstacle, that the robot is surrounded, or that, due to positioning error, the robot believes it coincides with an obstacle, the second control unit 543 performs special-state processing. In this case the decision is no longer made according to the local environment map; instead, the robot directly escapes in a direction away from the obstacle. The specific direction selection strategy may be chosen using methods including, but not limited to, the opposite direction of the closest point, artificial potential field synthesis, and the like.
The robot navigation device provided by this embodiment can execute the technical solution of the method embodiment of Embodiment 3; its implementation principle and technical effect are similar and are not repeated here.
In the robot navigation device based on multi-sensor data fusion provided by this embodiment, within the movement control strategy for the robot, the current state is determined based on the data collected by the ultrasonic sensor or the infrared sensor. If the state is normal, a local environment map is built according to the data currently collected by the depth camera and the laser radar sensor, and based on the local environment map, combined with the robot's current location and planned route, the robot is controlled to move around obstacles; otherwise, the robot escapes using a preset direction selection strategy, thereby ensuring the reliability of robot navigation.
Fig. 8 is a structural diagram of a robot navigation device based on multi-sensor data fusion provided by Embodiment 8 of the present invention. As shown in Fig. 8, on the basis of Embodiment 7, the first control unit 542 includes:
a projection subunit 81, configured to denoise the data currently collected by the depth camera, and project the denoised data onto a two-dimensional plane to obtain two-dimensional projection data;
a processing subunit 82, configured to, if a position corresponding to the two-dimensional projection data has no data collected by the laser radar sensor, fill the local environment map data of that position with the corresponding two-dimensional projection data;
the processing subunit 82 being further configured to, if a position corresponding to the two-dimensional projection data has data collected by the laser radar sensor, compute a weighted average of the two-dimensional projection data and the data collected by the laser radar sensor to obtain the local environment map data of that position.
In this embodiment, controlling the robot's movement with the dynamic window approach is taken as an example. The dynamic window approach is based on a two-dimensional grid map; therefore, the projection subunit 81 first denoises the three-dimensional point cloud data and then projects it onto the two-dimensional plane to obtain two-dimensional projection data. The three-dimensional point cloud is projected column by column: if any point in a column indicates an obstacle, the projection result of that column is an obstacle; otherwise it is projected as free space. The processing subunit 82 fuses the two-dimensional projection data with the two-dimensional plane data collected by the laser radar sensor. The specific strategy is to update the two-dimensional data collected by the laser radar sensor with the two-dimensional projection data: positions not directly observed by the laser radar sensor are filled with the obtained two-dimensional projection data, and for positions observed by the laser radar sensor, a weighted average of the two-dimensional projection data and the data collected by the laser radar sensor is computed. The method is applicable to multiple depth cameras and multiple laser radar sensors.
The robot navigation device provided by this embodiment can execute the technical solution of the method embodiment of Embodiment 4; its implementation principle and technical effect are similar and are not repeated here.
In the robot navigation device based on multi-sensor data fusion provided by this embodiment, the three-dimensional point cloud data collected by the depth camera is denoised and projected, the resulting two-dimensional projection data is fused with the two-dimensional plane data collected by the laser radar sensor, a local environment map is built, and movement control of the robot is realized. By fusing the two-dimensional plane data with the three-dimensional point cloud data, this embodiment can achieve a better local planning effect and improve navigation accuracy.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solution of the present invention, not to limit it. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be replaced with equivalents, and that such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (10)

1. A robot navigation method based on multi-sensor data fusion, characterized by comprising:
building a global environment map according to data collected by a laser radar sensor and encoder data, using simultaneous localization and mapping technology;
obtaining, in real time, a current location of the robot in the global environment map by a vision localization algorithm, according to data collected by the laser radar sensor, an accelerometer sensor, a gyroscope sensor, and a magnetometer sensor, the global environment map, and the encoder data;
obtaining, in real time, a planned route of the robot from the current location to a target location by a path planning algorithm, according to the global environment map and the current location of the robot;
controlling, by a local path planning algorithm, the robot to move around obstacles according to the current location and the planned route, using data collected by the laser radar sensor, a depth camera, an ultrasonic sensor, and an infrared sensor, and the encoder data.
2. The method according to claim 1, characterized in that the obtaining, in real time, the current location of the robot in the global environment map by the vision localization algorithm according to the data collected by the laser radar sensor, the accelerometer sensor, the gyroscope sensor, and the magnetometer sensor, the global environment map, and the encoder data comprises:
initializing an attitude of the robot according to the data collected by the magnetometer sensor;
calculating, in real time, a position and attitude change estimate of the robot at a current time relative to a previous time, according to the data collected by the accelerometer sensor and the gyroscope sensor at the current time and the previous time, and the encoder data;
determining map points in the global environment map corresponding to the position and attitude change estimate, and obtaining map data of the map points;
obtaining the current location of the robot by matching the data currently collected by the laser radar sensor against the map data of the map points respectively, the current location of the robot being the position of the map point with the highest matching degree.
3. The method according to claim 1, characterized in that the controlling, by the local path planning algorithm, the robot to move around obstacles according to the current location and the planned route, using the data collected by the laser radar sensor, the depth camera, the ultrasonic sensor, and the infrared sensor, and the encoder data comprises:
judging, in real time, whether the robot is in a blocked state during movement according to the ultrasonic sensor data or the data collected by the infrared sensor;
if the robot is not in a blocked state during movement, building a local environment map according to the data currently collected by the depth camera and the laser radar sensor, and controlling, by the local path planning algorithm, the robot to move around obstacles according to the local environment map, the current location of the robot, and the planned route.
4. The method according to claim 3, characterized in that, after the judging, in real time, whether the robot is in a blocked state during movement according to the ultrasonic sensor data or the data collected by the infrared sensor, the method further comprises:
if the robot is in a blocked state during movement, controlling the robot to move around obstacles according to the data currently collected by the ultrasonic sensor or the infrared sensor and a preset direction selection strategy.
5. The method according to claim 3 or 4, characterized in that the building the local environment map according to the data currently collected by the depth camera and the laser radar sensor comprises:
denoising the data currently collected by the depth camera, and projecting the denoised data onto a two-dimensional plane to obtain two-dimensional projection data;
if a position corresponding to the two-dimensional projection data has no data collected by the laser radar sensor, filling local environment map data of that position with the corresponding two-dimensional projection data;
if a position corresponding to the two-dimensional projection data has data collected by the laser radar sensor, computing a weighted average of the two-dimensional projection data and the data collected by the laser radar sensor to obtain the local environment map data of that position.
6. A robot navigation device based on multi-sensor data fusion, characterized by comprising:
a map construction module, configured to build a global environment map according to data collected by a laser radar sensor and encoder data, using simultaneous localization and mapping technology;
a real-time localization module, configured to obtain, in real time, a current location of the robot in the global environment map by a vision localization algorithm, according to data collected by the laser radar sensor, an accelerometer sensor, a gyroscope sensor, and a magnetometer sensor, the global environment map, and the encoder data;
a route planning module, configured to obtain, in real time, a planned route of the robot from the current location to a target location by a path planning algorithm, according to the global environment map and the current location of the robot;
a control module, configured to control, by a local path planning algorithm, the robot to move around obstacles according to the current location and the planned route, using data collected by the laser radar sensor, a depth camera, an ultrasonic sensor, and an infrared sensor, and the encoder data.
7. The device according to claim 6, characterized in that the real-time localization module comprises:
an initialization unit, configured to initialize an attitude of the robot according to the data collected by the magnetometer sensor;
an estimation unit, configured to calculate, in real time, a position and attitude change estimate of the robot at a current time relative to a previous time, according to the data collected by the accelerometer sensor and the gyroscope sensor at the current time and the previous time, and the encoder data;
an acquisition unit, configured to determine map points in the global environment map corresponding to the position and attitude change estimate, and obtain map data of the map points;
a matching unit, configured to obtain the current location of the robot by matching the data currently collected by the laser radar sensor against the map data of the map points respectively, the current location of the robot being the position of the map point with the highest matching degree.
8. The device according to claim 6, characterized in that the control module comprises:
a detection unit, configured to judge, in real time, whether the robot is in a blocked state during movement according to the ultrasonic sensor data or the data collected by the infrared sensor;
a first control unit, configured to, if the detection unit detects that the robot is not in a blocked state during movement, build a local environment map according to the data currently collected by the depth camera and the laser radar sensor, and control, by the local path planning algorithm, the robot to move around obstacles according to the local environment map, the current location of the robot, and the planned route.
9. The device according to claim 8, wherein the control module further comprises:
A second control unit, configured to: if the detection unit detects that the robot is in a movement-blocked state, control the robot to move while avoiding obstacles according to the data currently collected by the ultrasonic sensor or the infrared sensor and a preset direction selection strategy.
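The "preset direction selection strategy" of claim 9 is not specified. One plausible reading is a fixed preference order over the directions covered by the ultrasonic/infrared sensors, taking the first that is sufficiently open; the direction names, the 0.5 m threshold, and the fallback rule below are assumptions:

```python
def escape_direction(ranges, preference=("left", "right", "back")):
    """Assumed preset strategy: try directions in a fixed preference
    order and take the first whose measured range exceeds a safety
    threshold; otherwise fall back to the most open direction.

    ranges: dict mapping direction name -> measured free range (metres),
    as reported by the ultrasonic or infrared sensors.
    """
    SAFE = 0.5  # assumed safety threshold, metres
    for d in preference:
        if ranges.get(d, 0.0) > SAFE:
            return d
    return max(ranges, key=ranges.get)
```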
10. The device according to claim 8 or 9, wherein the first control unit comprises:
A projection subunit, configured to denoise the data currently collected by the depth camera, and to project the denoised data onto a two-dimensional plane to obtain two-dimensional projection data;
A processing subunit, configured to: if a position corresponding to the two-dimensional projection data has no data collected by the laser radar sensor, fill the local environment map data of that position with the corresponding two-dimensional projection data;
The processing subunit being further configured to: if a position corresponding to the two-dimensional projection data has data collected by the laser radar sensor, compute a weighted average of the two-dimensional projection data and the data collected by the laser radar sensor, so as to obtain the local environment map data of that position.
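The per-cell fusion rule of claim 10 can be sketched directly: depth-camera projections fill cells the lidar did not observe, and cells observed by both sensors get a weighted average. The dict-of-grid-cells representation and the 0.7 lidar weight are illustrative assumptions (the patent does not fix the weights):

```python
def fuse_cell(depth_val, lidar_val, w_lidar=0.7):
    """Claim-10 rule for one grid cell: use the depth projection where the
    lidar has no reading, otherwise a weighted average of the two."""
    if lidar_val is None:
        return depth_val
    return w_lidar * lidar_val + (1.0 - w_lidar) * depth_val

def build_local_map(depth_cells, lidar_cells):
    """depth_cells / lidar_cells: dicts mapping a 2-D grid cell (ix, iy)
    to a measured range value for that cell."""
    grid = {}
    for cell, dval in depth_cells.items():
        grid[cell] = fuse_cell(dval, lidar_cells.get(cell))
    # cells seen only by the lidar are carried over unchanged
    for cell, lval in lidar_cells.items():
        grid.setdefault(cell, lval)
    return grid
```

Weighting the lidar more heavily reflects its typically lower range noise; the depth camera mainly contributes obstacles outside the lidar's scan plane.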
CN201710061225.1A 2017-01-25 2017-01-25 Robot navigation method and device based on multi-sensor data fusion Pending CN106681330A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710061225.1A CN106681330A (en) 2017-01-25 2017-01-25 Robot navigation method and device based on multi-sensor data fusion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710061225.1A CN106681330A (en) 2017-01-25 2017-01-25 Robot navigation method and device based on multi-sensor data fusion

Publications (1)

Publication Number Publication Date
CN106681330A true CN106681330A (en) 2017-05-17

Family

ID=58859252

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710061225.1A Pending CN106681330A (en) 2017-01-25 2017-01-25 Robot navigation method and device based on multi-sensor data fusion

Country Status (1)

Country Link
CN (1) CN106681330A (en)

Cited By (85)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107092264A (en) * 2017-06-21 2017-08-25 北京理工大学 Towards the service robot autonomous navigation and automatic recharging method of bank's hall environment
CN107167148A (en) * 2017-05-24 2017-09-15 安科机器人有限公司 Synchronous superposition method and apparatus
CN107193283A (en) * 2017-07-27 2017-09-22 青岛诺动机器人有限公司 The mobile robot and its operating method of a kind of independent navigation
CN107193282A (en) * 2017-06-16 2017-09-22 北京军立方机器人科技有限公司 A kind of intelligent security guard robot and intelligent safety and defence system
CN107291082A (en) * 2017-07-18 2017-10-24 佛山科学技术学院 A kind of control system and its control method of earthworm bio-robot
CN107284544A (en) * 2017-07-30 2017-10-24 福州大学 A kind of multi-functional General Mobile robot chassis and its application process
CN107301654A (en) * 2017-06-12 2017-10-27 西北工业大学 A kind of positioning immediately of the high accuracy of multisensor is with building drawing method
CN107357297A (en) * 2017-08-21 2017-11-17 深圳市镭神智能系统有限公司 A kind of sweeping robot navigation system and its air navigation aid
CN107422735A (en) * 2017-07-29 2017-12-01 深圳力子机器人有限公司 A kind of trackless navigation AGV laser and visual signature hybrid navigation method
CN107462892A (en) * 2017-07-28 2017-12-12 深圳普思英察科技有限公司 Mobile robot synchronous superposition method based on more sonacs
CN107655473A (en) * 2017-09-20 2018-02-02 南京航空航天大学 Spacecraft based on SLAM technologies is with respect to autonomous navigation system
CN107703935A (en) * 2017-09-12 2018-02-16 安徽胜佳和电子科技有限公司 Multiple data weighting fusions carry out method, storage device and the mobile terminal of avoidance
CN108168560A (en) * 2017-12-27 2018-06-15 沈阳智远弘业机器人有限公司 A kind of complex navigation control method for omnidirectional AGV
CN108196548A (en) * 2018-01-08 2018-06-22 四川文理学院 A kind of robot controller based on Arduino language and path following algorithm
CN108241463A (en) * 2018-01-02 2018-07-03 京东方科技集团股份有限公司 The control method and its control device of projection arrangement, projection arrangement
CN108326845A (en) * 2017-12-11 2018-07-27 浙江捷尚人工智能研究发展有限公司 Robot localization method, apparatus and system based on binocular camera and laser radar
CN108375373A (en) * 2018-01-30 2018-08-07 深圳市同川科技有限公司 Robot and its air navigation aid, navigation device
CN108422419A (en) * 2018-02-09 2018-08-21 上海芯智能科技有限公司 A kind of intelligent robot and its control method and system
CN108500992A (en) * 2018-04-09 2018-09-07 中山火炬高新企业孵化器有限公司 A kind of multi-functional mobile security robot
CN108731664A (en) * 2018-05-18 2018-11-02 深圳清创新科技有限公司 Robotary method of estimation, device, computer equipment and storage medium
CN108776474A (en) * 2018-05-24 2018-11-09 中山赛伯坦智能科技有限公司 Robot embedded computing terminal integrating high-precision navigation positioning and deep learning
CN108839060A (en) * 2018-05-31 2018-11-20 芜湖星途机器人科技有限公司 Auto-navigation robot
CN108839061A (en) * 2018-05-31 2018-11-20 芜湖星途机器人科技有限公司 Auto-navigation robot
CN108858226A (en) * 2018-07-20 2018-11-23 佛山科学技术学院 A kind of tableware intelligence recycling machine people of Multi-sensor Fusion SLAM technology
CN108958250A (en) * 2018-07-13 2018-12-07 华南理工大学 Multisensor mobile platform and navigation and barrier-avoiding method based on known map
CN109059927A (en) * 2018-08-21 2018-12-21 南京邮电大学 The mobile robot slam of multisensor builds drawing method and system under complex environment
CN109062201A (en) * 2018-07-23 2018-12-21 南京理工大学 Intelligent navigation micro-system and its control method based on ROS
CN109084732A (en) * 2018-06-29 2018-12-25 北京旷视科技有限公司 Positioning and air navigation aid, device and processing equipment
CN109141364A (en) * 2018-08-01 2019-01-04 北京进化者机器人科技有限公司 Obstacle detection method, system and robot
CN109164804A (en) * 2018-08-27 2019-01-08 苏州边际智能科技有限公司 One kind being based on robot control system combined of multi-sensor information
CN109298629A (en) * 2017-07-24 2019-02-01 来福机器人 For providing the fault-tolerant of robust tracking to realize from non-autonomous position of advocating peace
CN109318890A (en) * 2018-06-29 2019-02-12 北京理工大学 A kind of unmanned vehicle dynamic obstacle avoidance method based on dynamic window and barrier potential energy field
CN109357696A (en) * 2018-09-28 2019-02-19 西南电子技术研究所(中国电子科技集团公司第十研究所) Multiple Source Sensor information merges closed loop test framework
CN109375629A (en) * 2018-12-05 2019-02-22 苏州博众机器人有限公司 A kind of cruiser and its barrier-avoiding method that navigates
CN109483507A (en) * 2018-12-04 2019-03-19 北京壹氢科技有限公司 A kind of indoor vision positioning method of multiple wheeled robot walkings
CN109506661A (en) * 2019-01-11 2019-03-22 轻客小觅智能科技(北京)有限公司 A kind of localization method of robot, device, robot and storage medium
CN109582032A (en) * 2018-10-11 2019-04-05 天津大学 Quick Real Time Obstacle Avoiding routing resource of the multi-rotor unmanned aerial vehicle under complex environment
WO2019080156A1 (en) * 2017-10-24 2019-05-02 深圳市沃特沃德股份有限公司 Robot moving method and device and robot
CN109874655A (en) * 2019-04-24 2019-06-14 贺梓庭 A kind of automatic control system of potting flower sprinkler
CN109960254A (en) * 2017-12-25 2019-07-02 深圳市优必选科技有限公司 Robot and its paths planning method
CN109959935A (en) * 2017-12-14 2019-07-02 北京欣奕华科技有限公司 A kind of map method for building up, map establish device and robot
CN109978272A (en) * 2019-03-30 2019-07-05 华南理工大学 A kind of path planning system and method based on multiple omni-directional mobile robots
CN110220517A (en) * 2019-07-08 2019-09-10 紫光云技术有限公司 A kind of Indoor Robot robust slam method of the combining environmental meaning of one's words
CN110320913A (en) * 2019-07-10 2019-10-11 苏州欧博智慧机器人有限公司 The unmanned control device and method of low speed
CN110377029A (en) * 2019-06-27 2019-10-25 北京汽车集团有限公司 The control method and device of Vehicular automatic driving
CN110427002A (en) * 2019-07-29 2019-11-08 南京市晨枭软件技术有限公司 A kind of automatic inspection system and method for intelligence manufacture
CN110553652A (en) * 2019-10-12 2019-12-10 上海高仙自动化科技发展有限公司 robot multi-sensor fusion positioning method and application thereof
CN110658816A (en) * 2019-09-27 2020-01-07 东南大学 Mobile robot navigation and control method based on intelligent assembly
CN110673603A (en) * 2019-10-31 2020-01-10 郑州轻工业学院 Fire scene autonomous navigation reconnaissance robot
CN110750097A (en) * 2019-10-17 2020-02-04 上海飒智智能科技有限公司 Indoor robot navigation system and map building, positioning and moving method
CN110782506A (en) * 2019-11-21 2020-02-11 大连理工大学 Method for constructing grid map by fusing infrared camera and depth camera
CN110823211A (en) * 2019-10-29 2020-02-21 珠海市一微半导体有限公司 Multi-sensor map construction method, device and chip based on visual SLAM
CN110873562A (en) * 2018-08-29 2020-03-10 香港商女娲创造股份有限公司 Robot navigation system
CN110882110A (en) * 2019-12-02 2020-03-17 深圳职业技术学院 Small-area intelligent traveling wheelchair
CN110916562A (en) * 2018-09-18 2020-03-27 科沃斯机器人股份有限公司 Autonomous mobile device, control method, and storage medium
CN110967028A (en) * 2019-11-26 2020-04-07 深圳优地科技有限公司 Navigation map construction method and device, robot and storage medium
CN111006655A (en) * 2019-10-21 2020-04-14 南京理工大学 Multi-scene autonomous navigation positioning method for airport inspection robot
CN111123925A (en) * 2019-12-19 2020-05-08 天津联汇智造科技有限公司 Mobile robot navigation system and method
CN111168669A (en) * 2019-12-26 2020-05-19 上海高仙自动化科技发展有限公司 Robot control method, robot, and readable storage medium
CN111275750A (en) * 2020-01-19 2020-06-12 武汉大学 Indoor space panoramic image generation method based on multi-sensor fusion
CN111338361A (en) * 2020-05-22 2020-06-26 浙江远传信息技术股份有限公司 Obstacle avoidance method, device, equipment and medium for low-speed unmanned vehicle
CN111432171A (en) * 2020-03-19 2020-07-17 上海品览数据科技有限公司 Vision-based commodity checking device and system and checking method thereof
CN111609852A (en) * 2019-02-25 2020-09-01 北京奇虎科技有限公司 Semantic map construction method, sweeping robot and electronic equipment
CN111619556A (en) * 2020-05-22 2020-09-04 奇瑞汽车股份有限公司 Obstacle avoidance control method and device for automobile and storage medium
CN111665470A (en) * 2019-03-07 2020-09-15 阿里巴巴集团控股有限公司 Positioning method and device and robot
CN111949027A (en) * 2020-08-10 2020-11-17 珠海一维弦机器人有限公司 Self-adaptive robot navigation method and device
RU2740229C1 (en) * 2020-03-19 2021-01-12 Автономная некоммерческая образовательная организация высшего образования "Сколковский институт науки и технологий" Method of localizing and constructing navigation maps of mobile service robot
CN112454348A (en) * 2019-09-06 2021-03-09 李臣学 Intelligent robot
WO2021051754A1 (en) * 2019-09-17 2021-03-25 五邑大学 Intelligent medical supply replenishing robot based on internet of things and slam technology
CN112578778A (en) * 2019-09-27 2021-03-30 上海尊颐智能科技有限公司 Navigation control method and navigation system of function auxiliary device
US10990104B2 (en) 2019-01-10 2021-04-27 General Electric Company Systems and methods including motorized apparatus for calibrating sensors
CN112947433A (en) * 2021-02-03 2021-06-11 中国农业大学 Orchard mobile robot and autonomous navigation method thereof
CN112985505A (en) * 2021-03-02 2021-06-18 清华大学 Indoor environment space-time distribution field generation method combining mobile perception and fixed perception
CN113189987A (en) * 2021-04-19 2021-07-30 西安交通大学 Complex terrain path planning method and system based on multi-sensor information fusion
CN113226143A (en) * 2018-12-18 2021-08-06 特里纳米克斯股份有限公司 Autonomous household appliance
CN113282088A (en) * 2021-05-21 2021-08-20 潍柴动力股份有限公司 Unmanned driving method, device and equipment of engineering vehicle, storage medium and engineering vehicle
CN113485381A (en) * 2021-08-24 2021-10-08 山东新一代信息产业技术研究院有限公司 Robot moving system and method based on multiple sensors
CN113793351A (en) * 2021-09-30 2021-12-14 中国人民解放军国防科技大学 Laser filling method and device of multi-layer contour pattern based on contour line
CN114222366A (en) * 2021-08-06 2022-03-22 深圳技术大学 Indoor positioning method and device based on single base station
CN114371716A (en) * 2022-01-20 2022-04-19 红骐科技(杭州)有限公司 Automatic driving inspection method for fire-fighting robot
CN114384920A (en) * 2022-03-23 2022-04-22 安徽大学 Dynamic obstacle avoidance method based on real-time construction of local grid map
CN115283360A (en) * 2022-10-08 2022-11-04 天津盛安机械设备有限公司 Automatic visual point cloud path planning system and method based on intelligent subway purging
CN115384657A (en) * 2022-09-16 2022-11-25 中国民航大学 Intelligent robot based on laser positioning
WO2023036083A1 (en) * 2021-09-08 2023-03-16 汤恩智能科技(上海)有限公司 Sensor data processing method and system, and readable storage medium
CN116225029A (en) * 2023-05-05 2023-06-06 北华航天工业学院 Robot path planning method

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050182518A1 (en) * 2004-02-13 2005-08-18 Evolution Robotics, Inc. Robust sensor fusion for mapping and localization in a simultaneous localization and mapping (SLAM) system
CN103777629A (en) * 2013-09-05 2014-05-07 武汉汉迪机器人科技有限公司 Self-guide carrying platform and navigation control method for carrying platform
CN103914068A (en) * 2013-01-07 2014-07-09 中国人民解放军第二炮兵工程大学 Service robot autonomous navigation method based on raster maps
CN105492985A (en) * 2014-09-05 2016-04-13 深圳市大疆创新科技有限公司 Multi-sensor environment map building
CN105783913A (en) * 2016-03-08 2016-07-20 中山大学 SLAM device integrating multiple vehicle-mounted sensors and control method of device
CN106325275A (en) * 2016-09-14 2017-01-11 广州今甲智能科技有限公司 Robot navigation system, robot navigation method and robot navigation device


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WANG Fang et al.: "Application of Inertial and Sensor Technology in the Navigation Control of Intelligent Robots" (惯性与传感器技术在智能机器人导航控制中的应用), in Proceedings of the Symposium on Development Trends and Directions of Inertial Technology (《惯性技术发展动态发展方向研讨会文集》) *

Cited By (114)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107167148A (en) * 2017-05-24 2017-09-15 安科机器人有限公司 Synchronous superposition method and apparatus
CN107301654A (en) * 2017-06-12 2017-10-27 西北工业大学 A kind of positioning immediately of the high accuracy of multisensor is with building drawing method
CN107193282A (en) * 2017-06-16 2017-09-22 北京军立方机器人科技有限公司 A kind of intelligent security guard robot and intelligent safety and defence system
CN107193282B (en) * 2017-06-16 2020-07-14 哈工大机器人集团北京军立方科技有限公司 Intelligent security robot and intelligent security system
CN107092264A (en) * 2017-06-21 2017-08-25 北京理工大学 Towards the service robot autonomous navigation and automatic recharging method of bank's hall environment
CN107291082A (en) * 2017-07-18 2017-10-24 佛山科学技术学院 A kind of control system and its control method of earthworm bio-robot
CN109298629A (en) * 2017-07-24 2019-02-01 来福机器人 For providing the fault-tolerant of robust tracking to realize from non-autonomous position of advocating peace
CN109298629B (en) * 2017-07-24 2023-08-15 南京市远弗科技有限公司 System and method for guiding mobile platform in non-mapped region
CN107193283A (en) * 2017-07-27 2017-09-22 青岛诺动机器人有限公司 The mobile robot and its operating method of a kind of independent navigation
CN107462892A (en) * 2017-07-28 2017-12-12 深圳普思英察科技有限公司 Mobile robot synchronous superposition method based on more sonacs
CN107462892B (en) * 2017-07-28 2021-11-30 深圳市远弗科技有限公司 Mobile robot synchronous positioning and map construction method based on multiple ultrasonic sensors
CN107422735A (en) * 2017-07-29 2017-12-01 深圳力子机器人有限公司 A kind of trackless navigation AGV laser and visual signature hybrid navigation method
CN107284544A (en) * 2017-07-30 2017-10-24 福州大学 A kind of multi-functional General Mobile robot chassis and its application process
CN107357297A (en) * 2017-08-21 2017-11-17 深圳市镭神智能系统有限公司 A kind of sweeping robot navigation system and its air navigation aid
CN107703935A (en) * 2017-09-12 2018-02-16 安徽胜佳和电子科技有限公司 Multiple data weighting fusions carry out method, storage device and the mobile terminal of avoidance
CN107655473A (en) * 2017-09-20 2018-02-02 南京航空航天大学 Spacecraft based on SLAM technologies is with respect to autonomous navigation system
CN107655473B (en) * 2017-09-20 2020-07-28 南京航空航天大学 Relative autonomous navigation system of spacecraft based on S L AM technology
WO2019080156A1 (en) * 2017-10-24 2019-05-02 深圳市沃特沃德股份有限公司 Robot moving method and device and robot
CN108326845A (en) * 2017-12-11 2018-07-27 浙江捷尚人工智能研究发展有限公司 Robot localization method, apparatus and system based on binocular camera and laser radar
CN108326845B (en) * 2017-12-11 2020-06-26 浙江捷尚人工智能研究发展有限公司 Robot positioning method, device and system based on binocular camera and laser radar
CN109959935A (en) * 2017-12-14 2019-07-02 北京欣奕华科技有限公司 A kind of map method for building up, map establish device and robot
CN109959935B (en) * 2017-12-14 2020-10-23 北京欣奕华科技有限公司 Map establishing method, map establishing device and robot
CN109960254A (en) * 2017-12-25 2019-07-02 深圳市优必选科技有限公司 Robot and its paths planning method
CN109960254B (en) * 2017-12-25 2022-09-23 深圳市优必选科技有限公司 Robot and path planning method thereof
CN108168560A (en) * 2017-12-27 2018-06-15 沈阳智远弘业机器人有限公司 A kind of complex navigation control method for omnidirectional AGV
CN108168560B (en) * 2017-12-27 2021-06-08 沈阳智远弘业机器人有限公司 Composite navigation control method for omnidirectional AGV
CN108241463B (en) * 2018-01-02 2021-02-12 京东方科技集团股份有限公司 Control method and control device of projection device and projection device
CN108241463A (en) * 2018-01-02 2018-07-03 京东方科技集团股份有限公司 The control method and its control device of projection arrangement, projection arrangement
CN108196548A (en) * 2018-01-08 2018-06-22 四川文理学院 A kind of robot controller based on Arduino language and path following algorithm
CN108375373A (en) * 2018-01-30 2018-08-07 深圳市同川科技有限公司 Robot and its air navigation aid, navigation device
CN108422419A (en) * 2018-02-09 2018-08-21 上海芯智能科技有限公司 A kind of intelligent robot and its control method and system
CN108500992A (en) * 2018-04-09 2018-09-07 中山火炬高新企业孵化器有限公司 A kind of multi-functional mobile security robot
CN108731664B (en) * 2018-05-18 2020-08-11 深圳一清创新科技有限公司 Robot state estimation method, device, computer equipment and storage medium
CN108731664A (en) * 2018-05-18 2018-11-02 深圳清创新科技有限公司 Robotary method of estimation, device, computer equipment and storage medium
CN108776474B (en) * 2018-05-24 2022-03-15 中山赛伯坦智能科技有限公司 Robot embedded computing terminal integrating high-precision navigation positioning and deep learning
CN108776474A (en) * 2018-05-24 2018-11-09 中山赛伯坦智能科技有限公司 Robot embedded computing terminal integrating high-precision navigation positioning and deep learning
CN108839060A (en) * 2018-05-31 2018-11-20 芜湖星途机器人科技有限公司 Auto-navigation robot
CN108839061A (en) * 2018-05-31 2018-11-20 芜湖星途机器人科技有限公司 Auto-navigation robot
CN109318890A (en) * 2018-06-29 2019-02-12 北京理工大学 A kind of unmanned vehicle dynamic obstacle avoidance method based on dynamic window and barrier potential energy field
CN109084732A (en) * 2018-06-29 2018-12-25 北京旷视科技有限公司 Positioning and air navigation aid, device and processing equipment
CN108958250A (en) * 2018-07-13 2018-12-07 华南理工大学 Multisensor mobile platform and navigation and barrier-avoiding method based on known map
CN108858226A (en) * 2018-07-20 2018-11-23 佛山科学技术学院 A kind of tableware intelligence recycling machine people of Multi-sensor Fusion SLAM technology
CN109062201B (en) * 2018-07-23 2021-09-03 南京理工大学 ROS-based intelligent navigation microsystem and control method thereof
CN109062201A (en) * 2018-07-23 2018-12-21 南京理工大学 Intelligent navigation micro-system and its control method based on ROS
CN109141364A (en) * 2018-08-01 2019-01-04 北京进化者机器人科技有限公司 Obstacle detection method, system and robot
CN109141364B (en) * 2018-08-01 2020-11-03 北京进化者机器人科技有限公司 Obstacle detection method and system and robot
CN109059927A (en) * 2018-08-21 2018-12-21 南京邮电大学 The mobile robot slam of multisensor builds drawing method and system under complex environment
CN109164804A (en) * 2018-08-27 2019-01-08 苏州边际智能科技有限公司 One kind being based on robot control system combined of multi-sensor information
CN110873562A (en) * 2018-08-29 2020-03-10 香港商女娲创造股份有限公司 Robot navigation system
CN110916562A (en) * 2018-09-18 2020-03-27 科沃斯机器人股份有限公司 Autonomous mobile device, control method, and storage medium
CN109357696B (en) * 2018-09-28 2020-10-23 西南电子技术研究所(中国电子科技集团公司第十研究所) Multi-source sensor information fusion closed-loop testing framework
CN109357696A (en) * 2018-09-28 2019-02-19 西南电子技术研究所(中国电子科技集团公司第十研究所) Multiple Source Sensor information merges closed loop test framework
CN109582032B (en) * 2018-10-11 2021-10-12 天津大学 Multi-rotor unmanned aerial vehicle rapid real-time obstacle avoidance path selection method in complex environment
CN109582032A (en) * 2018-10-11 2019-04-05 天津大学 Quick Real Time Obstacle Avoiding routing resource of the multi-rotor unmanned aerial vehicle under complex environment
CN109483507B (en) * 2018-12-04 2021-06-29 北京壹氢科技有限公司 Indoor visual positioning method for walking of multiple wheeled robots
CN109483507A (en) * 2018-12-04 2019-03-19 北京壹氢科技有限公司 A kind of indoor vision positioning method of multiple wheeled robot walkings
CN109375629A (en) * 2018-12-05 2019-02-22 苏州博众机器人有限公司 A kind of cruiser and its barrier-avoiding method that navigates
CN113226143A (en) * 2018-12-18 2021-08-06 特里纳米克斯股份有限公司 Autonomous household appliance
US10990104B2 (en) 2019-01-10 2021-04-27 General Electric Company Systems and methods including motorized apparatus for calibrating sensors
CN109506661A (en) * 2019-01-11 2019-03-22 轻客小觅智能科技(北京)有限公司 A kind of localization method of robot, device, robot and storage medium
CN111609852A (en) * 2019-02-25 2020-09-01 北京奇虎科技有限公司 Semantic map construction method, sweeping robot and electronic equipment
CN111665470A (en) * 2019-03-07 2020-09-15 阿里巴巴集团控股有限公司 Positioning method and device and robot
CN109978272B (en) * 2019-03-30 2022-11-18 华南理工大学 Path planning system and method based on multiple omnidirectional mobile robots
CN109978272A (en) * 2019-03-30 2019-07-05 华南理工大学 A kind of path planning system and method based on multiple omni-directional mobile robots
CN109874655A (en) * 2019-04-24 2019-06-14 贺梓庭 A kind of automatic control system of potting flower sprinkler
CN110377029A (en) * 2019-06-27 2019-10-25 北京汽车集团有限公司 The control method and device of Vehicular automatic driving
CN110220517A (en) * 2019-07-08 2019-09-10 紫光云技术有限公司 A kind of Indoor Robot robust slam method of the combining environmental meaning of one's words
CN110320913A (en) * 2019-07-10 2019-10-11 苏州欧博智慧机器人有限公司 The unmanned control device and method of low speed
CN110427002A (en) * 2019-07-29 2019-11-08 南京市晨枭软件技术有限公司 A kind of automatic inspection system and method for intelligence manufacture
CN112454348A (en) * 2019-09-06 2021-03-09 李臣学 Intelligent robot
WO2021051754A1 (en) * 2019-09-17 2021-03-25 五邑大学 Intelligent medical supply replenishing robot based on internet of things and slam technology
CN110658816A (en) * 2019-09-27 2020-01-07 东南大学 Mobile robot navigation and control method based on intelligent assembly
CN110658816B (en) * 2019-09-27 2022-10-25 东南大学 Mobile robot navigation and control method based on intelligent component
CN112578778A (en) * 2019-09-27 2021-03-30 上海尊颐智能科技有限公司 Navigation control method and navigation system of function auxiliary device
CN110553652B (en) * 2019-10-12 2022-06-24 上海高仙自动化科技发展有限公司 Robot multi-sensor fusion positioning method and application thereof
CN110553652A (en) * 2019-10-12 2019-12-10 上海高仙自动化科技发展有限公司 robot multi-sensor fusion positioning method and application thereof
CN110750097A (en) * 2019-10-17 2020-02-04 上海飒智智能科技有限公司 Indoor robot navigation system and map building, positioning and moving method
CN111006655A (en) * 2019-10-21 2020-04-14 南京理工大学 Multi-scene autonomous navigation positioning method for airport inspection robot
CN110823211A (en) * 2019-10-29 2020-02-21 珠海市一微半导体有限公司 Multi-sensor map construction method, device and chip based on visual SLAM
CN110673603B (en) * 2019-10-31 2023-10-24 郑州轻工业大学 Fire scene autonomous navigation reconnaissance robot
CN110673603A (en) * 2019-10-31 2020-01-10 郑州轻工业学院 Fire scene autonomous navigation reconnaissance robot
CN110782506A (en) * 2019-11-21 2020-02-11 大连理工大学 Method for constructing grid map by fusing infrared camera and depth camera
CN110782506B (en) * 2019-11-21 2021-04-20 大连理工大学 Method for constructing grid map by fusing infrared camera and depth camera
CN110967028B (en) * 2019-11-26 2022-04-12 深圳优地科技有限公司 Navigation map construction method and device, robot and storage medium
CN110967028A (en) * 2019-11-26 2020-04-07 深圳优地科技有限公司 Navigation map construction method and device, robot and storage medium
CN110882110A (en) * 2019-12-02 2020-03-17 深圳职业技术学院 Small-area intelligent traveling wheelchair
CN111123925A (en) * 2019-12-19 2020-05-08 天津联汇智造科技有限公司 Mobile robot navigation system and method
CN111168669B (en) * 2019-12-26 2021-12-03 上海高仙自动化科技发展有限公司 Robot control method, robot, and readable storage medium
CN111168669A (en) * 2019-12-26 2020-05-19 上海高仙自动化科技发展有限公司 Robot control method, robot, and readable storage medium
CN111275750A (en) * 2020-01-19 2020-06-12 武汉大学 Indoor space panoramic image generation method based on multi-sensor fusion
CN111432171B (en) * 2020-03-19 2021-12-07 上海品览数据科技有限公司 Vision-based commodity checking device and system and checking method thereof
RU2740229C1 (en) * 2020-03-19 2021-01-12 Автономная некоммерческая образовательная организация высшего образования "Сколковский институт науки и технологий" Method of localizing and constructing navigation maps of mobile service robot
CN111432171A (en) * 2020-03-19 2020-07-17 上海品览数据科技有限公司 Vision-based commodity checking device and system and checking method thereof
CN111619556A (en) * 2020-05-22 2020-09-04 奇瑞汽车股份有限公司 Obstacle avoidance control method and device for automobile and storage medium
CN111619556B (en) * 2020-05-22 2022-05-03 奇瑞汽车股份有限公司 Obstacle avoidance control method and device for automobile and storage medium
CN111338361A (en) * 2020-05-22 2020-06-26 浙江远传信息技术股份有限公司 Obstacle avoidance method, device, equipment and medium for low-speed unmanned vehicle
CN111949027A (en) * 2020-08-10 2020-11-17 珠海一维弦机器人有限公司 Self-adaptive robot navigation method and device
CN112947433A (en) * 2021-02-03 2021-06-11 中国农业大学 Orchard mobile robot and autonomous navigation method thereof
CN112947433B (en) * 2021-02-03 2023-05-02 中国农业大学 Orchard mobile robot and autonomous navigation method thereof
CN112985505A (en) * 2021-03-02 2021-06-18 清华大学 Indoor environment space-time distribution field generation method combining mobile perception and fixed perception
CN113189987A (en) * 2021-04-19 2021-07-30 西安交通大学 Complex terrain path planning method and system based on multi-sensor information fusion
CN113282088A (en) * 2021-05-21 2021-08-20 潍柴动力股份有限公司 Unmanned driving method, device and equipment of engineering vehicle, storage medium and engineering vehicle
CN114222366A (en) * 2021-08-06 2022-03-22 深圳技术大学 Indoor positioning method and device based on single base station
CN113485381A (en) * 2021-08-24 2021-10-08 山东新一代信息产业技术研究院有限公司 Robot moving system and method based on multiple sensors
WO2023036083A1 (en) * 2021-09-08 2023-03-16 汤恩智能科技(上海)有限公司 Sensor data processing method and system, and readable storage medium
CN113793351A (en) * 2021-09-30 2021-12-14 中国人民解放军国防科技大学 Laser filling method and device of multi-layer contour pattern based on contour line
CN113793351B (en) * 2021-09-30 2023-06-02 中国人民解放军国防科技大学 Laser filling method and device for multilayer outline pattern based on contour lines
CN114371716A (en) * 2022-01-20 2022-04-19 红骐科技(杭州)有限公司 Automatic driving inspection method for fire-fighting robot
CN114384920A (en) * 2022-03-23 2022-04-22 安徽大学 Dynamic obstacle avoidance method based on real-time construction of local grid map
US11720110B2 (en) 2022-03-23 2023-08-08 Anhui University Dynamic obstacle avoidance method based on real-time local grid map construction
CN115384657A (en) * 2022-09-16 2022-11-25 中国民航大学 Intelligent robot based on laser positioning
CN115283360B (en) * 2022-10-08 2022-12-27 天津盛安机械设备有限公司 Automatic visual point cloud path planning system and method based on intelligent subway purging
CN115283360A (en) * 2022-10-08 2022-11-04 天津盛安机械设备有限公司 Automatic visual point cloud path planning system and method based on intelligent subway purging
CN116225029A (en) * 2023-05-05 2023-06-06 北华航天工业学院 Robot path planning method

Similar Documents

Publication Publication Date Title
CN106681330A (en) Robot navigation method and device based on multi-sensor data fusion
CN109916393B (en) Multi-grid-value navigation method based on robot pose and application thereof
CN104914865B (en) Intelligent Mobile Robot Position Fixing Navigation System and method
CN107655473B (en) Relative autonomous navigation system of spacecraft based on S L AM technology
CN105043396B (en) The method and system of self-built map in a kind of mobile robot room
CN103869814B (en) Terminal positioning and navigation method and mobile terminal
EP3086196B1 (en) Method and control system for surveying and mapping a terrain while operating a bulldozer
Kriegman et al. A mobile robot: Sensing, planning and locomotion
CN102042835B (en) Autonomous underwater vehicle combined navigation system
CN105955273A (en) Indoor robot navigation system and method
KR20180079428A (en) Apparatus and method for automatic localization
CN107966989A (en) A kind of robot autonomous navigation system
CN106168805A (en) The method of robot autonomous walking based on cloud computing
CN102368158B (en) Navigation positioning method of orchard machine
CN110275538A (en) Intelligent cruise vehicle navigation methods and systems
CN107544501A (en) A kind of intelligent robot wisdom traveling control system and its method
CN104236548A (en) Indoor autonomous navigation method for micro unmanned aerial vehicle
CN109541535A (en) A method of AGV indoor positioning and navigation based on UWB and vision SLAM
CN108345005A (en) The real-time continuous autonomous positioning orientation system and navigation locating method of tunnelling machine
CN103472823A (en) Raster map creating method for intelligent robot
CN115256414B (en) Mining drilling robot and coupling operation method thereof with geological and roadway model
CN112518739A (en) Intelligent self-navigation method for reconnaissance of tracked chassis robot
CN111949032A (en) 3D obstacle avoidance navigation system and method based on reinforcement learning
CN104406589B (en) Flight method of aircraft passing through radar area
CN113096190B (en) Omnidirectional mobile robot navigation method based on visual mapping

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20170517