CN108021132A - Path planning method - Google Patents

Path planning method

Info

Publication number
CN108021132A
Authority
CN
China
Prior art keywords
information
robot
collected
path planning
planning method
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201711222922.7A
Other languages
Chinese (zh)
Inventor
范传奇 (Fan Chuanqi)
梅志 (Mei Zhi)
易昊 (Yi Hao)
徐健华 (Xu Jianhua)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhu Xingtu Robot Technology Co Ltd
Original Assignee
Wuhu Xingtu Robot Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhu Xingtu Robot Technology Co Ltd filed Critical Wuhu Xingtu Robot Technology Co Ltd
Priority to CN201711222922.7A priority Critical patent/CN108021132A/en
Publication of CN108021132A publication Critical patent/CN108021132A/en
Pending legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0221Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0242Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using non-visible light signals, e.g. IR or UV signals
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0276Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle

Abstract

The present invention relates to the field of path planning and discloses a path planning method. The path planning method includes: Step 1, collecting the surrounding information of a robot in real time with multiple sensors; Step 2, planning a path according to the collected surrounding information, the robot's current position information, and the robot's destination information; Step 3, driving the robot along the planned path. The path planning method overcomes the problems of prior-art mobile-robot path planning, which places high demands on the environment and also requires auxiliary equipment, and realizes path planning without auxiliary equipment.

Description

Path planning method
Technical field
The present invention relates to the field of path planning, and in particular to a path planning method.
Background technology
Path planning means finding a collision-free path from a start state to a goal state in an environment containing obstacles, according to some evaluation criterion. The algorithm described here employs a knowledge-based genetic algorithm, which incorporates the ideas of natural selection and evolution and therefore has very strong robustness.
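The specification does not spell out the knowledge-based genetic algorithm it refers to, so the following is only an illustrative sketch of evolving a collision-free path: a candidate path is encoded as the y-coordinate at each integer x step, and fitness penalizes both path length and intersection with one circular obstacle. The encoding, the obstacle, the penalty weight, and all GA parameters are our assumptions, not taken from the patent.

```python
import math
import random

N = 10                       # number of intermediate x steps (illustrative)
OBST = (5.0, 0.0, 1.5)       # obstacle: centre x, centre y, radius (illustrative)

def fitness(ys):
    """Path length plus a large penalty for every waypoint inside the obstacle."""
    pts = [(0, 0.0)] + [(i + 1, y) for i, y in enumerate(ys)] + [(N + 1, 0.0)]
    length = sum(math.dist(pts[i], pts[i + 1]) for i in range(len(pts) - 1))
    penalty = sum(100.0 for p in pts if math.dist(p, OBST[:2]) < OBST[2])
    return length + penalty

def evolve(pop_size=40, gens=200, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(-3, 3) for _ in range(N)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        parents = pop[:pop_size // 2]          # selection: keep the best half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, N)          # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(N)] += rng.gauss(0, 0.5)   # point mutation
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()   # best found y-profile; its fitness is a collision-free length
```

Selection here simply keeps the best half of the population each generation (elitism), with one-point crossover and Gaussian point mutation producing the rest.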
There are many existing path planning approaches for mobile robots, such as inertial planning, magnetic planning, visual planning, and satellite planning. These approaches suit a variety of environments, both indoor and outdoor, but they place high demands on the environment and also require auxiliary equipment. To overcome these problems, there is an urgent need to design a path planning method that requires no auxiliary equipment.
Summary of the invention
The object of the present invention is to provide a path planning method that realizes path planning without auxiliary equipment.
To achieve this object, the present invention provides a path planning method, which includes:
Step 1, collecting the surrounding information of a robot in real time with multiple sensors;
Step 2, planning a path according to the collected surrounding information, the robot's current position information, and the robot's destination information;
Step 3, driving the robot along the planned path.
Preferably, in Step 1, the method of collecting the surrounding information of the robot in real time with multiple sensors includes:
collecting the surrounding information of the robot with a pan-tilt camera, a sonar sensor, an infrared sensor, and a laser sensor.
Preferably, in Step 2, the image information collected by the pan-tilt camera is pre-processed, where the method of pre-processing includes:
converting the collected image to grayscale and smoothing the image.
Preferably, in Step 2, the information collected by the pan-tilt camera, the sonar sensor, the infrared sensor, and the laser sensor is fused to obtain the surrounding information.
Preferably, in Step 2, the method of planning a path according to the collected surrounding information, the robot's current position information, and the robot's destination information includes:
sensing the robot's current position information and surrounding information in real time.
Preferably, edge detection and image segmentation are performed in turn on the pre-processed image information to obtain the surrounding information of the robot.
Preferably, the method of fusing the collected information includes either of the following: the weighted-average method and the Kalman filtering method.
Through the above technical solution, the present invention uses sensors with relatively low uncertainty and reduces the complexity of map building. When planning in an unknown environment, the range sensors provide a description of the obstacles, which is combined with the rich, redundant environmental information supplied by the vision sensor, so that the robot perceives its environment more accurately and efficiently. The robot's internal odometry effectively provides the robot's pose information in real time; by correcting the robot's position deviation and refining the acquired position information, the positioning accuracy is improved.
Other features and advantages of the present invention will be described in detail in the detailed description that follows.
Brief description of the drawings
The accompanying drawing provides a further understanding of the present invention and constitutes a part of the specification. Together with the following detailed description, it serves to explain the present invention but is not to be construed as limiting the present invention. In the drawing:
Fig. 1 is a flow chart illustrating the path planning method of the present invention.
Detailed description
Embodiments of the present invention are described in detail below with reference to the accompanying drawing. It should be understood that the embodiments described here merely illustrate and explain the present invention and are not intended to limit the present invention.
The present invention provides a path planning method, which includes:
Step 1, collecting the surrounding information of a robot in real time with multiple sensors;
Step 2, planning a path according to the collected surrounding information, the robot's current position information, and the robot's destination information;
Step 3, driving the robot along the planned path.
Through the above technical solution, the present invention uses sensors with relatively low uncertainty and reduces the complexity of map building. When planning in an unknown environment, the range sensors provide a description of the obstacles, which is combined with the rich, redundant environmental information supplied by the vision sensor, so that the robot perceives its environment more accurately and efficiently. The robot's internal odometry effectively provides the robot's pose information in real time; by correcting the robot's position deviation and refining the acquired position information, the positioning accuracy is improved. In addition, the present invention combines two pieces of information, the robot's current position information and the robot's destination information, collects the robot's surrounding information in real time, and plans the path in real time, which greatly benefits the robot's planned travel.
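The internal odometry mentioned above is not specified further in the patent; a common dead-reckoning sketch integrates an assumed forward speed v and yaw rate w over a timestep dt to track the pose (x, y, theta). The function name and all constants below are illustrative:

```python
import math

def odometry_step(pose, v, w, dt):
    """One Euler integration step of a unicycle model: advance along the
    current heading, then rotate by the yaw rate."""
    x, y, theta = pose
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += w * dt
    return (x, y, theta)

# Drive 10 s at 1 m/s while turning at 0.1 rad/s, in 0.1 s steps:
# the robot traces (approximately) an arc of a 10 m radius circle.
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = odometry_step(pose, 1.0, 0.1, 0.1)
```

In practice the accumulated drift of such dead reckoning is what the position-deviation correction described above is meant to remove.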
It should also be emphasized that the present application adopts a new approach: whenever the robot cannot travel along the planned path, the nearest alternative route is planned instead, and so on, so that arrival along the shortest feasible path is finally achieved.
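The re-planning behavior described above (falling back to the nearest feasible route whenever the planned path cannot be traveled) can be illustrated with a breadth-first search over an occupancy grid: BFS returns a path with the fewest grid steps, and re-running it from the robot's current cell after marking newly detected obstacles yields the replacement route. The grid representation and the function below are our own construction, not the patent's:

```python
from collections import deque

def bfs_path(grid, start, goal):
    """Shortest path in grid steps on a 4-connected occupancy grid
    (0 = free, 1 = obstacle); returns None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    q = deque([start])
    while q:
        cell = q.popleft()
        if cell == goal:                      # reconstruct path back to start
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
               and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                q.append((nr, nc))
    return None

# A wall blocks the direct route, so the search detours around it.
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = bfs_path(grid, (0, 0), (2, 0))
```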
In one embodiment of the present invention, in Step 1, the method of collecting the surrounding information of the robot in real time with multiple sensors may include:
collecting the surrounding information of the robot with a pan-tilt camera, a sonar sensor, an infrared sensor, and a laser sensor.
Using multiple sensors (internal and external sensors) makes a full understanding of the environmental information achievable and helps the robot make correct decisions. Otherwise, because of undesirable characteristics of individual sensors (such as detection "blind zones"), improper information processing, poor matching between the selected sensors, or poor multi-sensor fusion, it is difficult to obtain a map model that accurately reflects the true environment.
In one embodiment of the present invention, in Step 2, the image information collected by the pan-tilt camera is pre-processed, where the method of pre-processing may include:
converting the collected image to grayscale and smoothing the image, so as to realize the pre-processing of the image information.
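As a hedged illustration of this pre-processing step (the patent names grayscale conversion and smoothing but fixes neither the weights nor the kernel), the sketch below applies luminance-weighted grayscale conversion followed by a 3x3 mean (box) filter, using plain nested lists in place of an image:

```python
def to_gray(rgb):
    """Convert an H x W list of (r, g, b) tuples to ITU-R BT.601 luminance."""
    return [[0.299 * r + 0.587 * g + 0.114 * b for (r, g, b) in row]
            for row in rgb]

def box_smooth(gray):
    """3x3 mean filter; border pixels are handled by clamping indices."""
    h, w = len(gray), len(gray[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            vals = [gray[min(max(i + di, 0), h - 1)]
                        [min(max(j + dj, 0), w - 1)]
                    for di in (-1, 0, 1) for dj in (-1, 0, 1)]
            out[i][j] = sum(vals) / 9.0
    return out

img = [[(255, 0, 0), (0, 255, 0)],
       [(0, 0, 255), (255, 255, 255)]]
smooth = box_smooth(to_gray(img))
```

A real implementation would use an image library, but the two stages — weighted channel sum, then neighbourhood averaging — are the same.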
In this embodiment, in Step 2, the information collected by the pan-tilt camera, the sonar sensor, the infrared sensor, and the laser sensor is fused to obtain the surrounding information.
In one embodiment of the present invention, in Step 2, the method of planning a path according to the collected surrounding information, the robot's current position information, and the robot's destination information may include:
sensing the robot's current position information and surrounding information in real time.
In one embodiment of the present invention, edge detection and image segmentation are performed in turn on the pre-processed image information to obtain the surrounding information of the robot.
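One conventional realization of the edge detection and image segmentation named here, though not necessarily the one the patent intends, is a Sobel gradient magnitude followed by fixed-threshold binarization into an edge/background mask:

```python
SX = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # Sobel horizontal kernel
SY = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # Sobel vertical kernel

def sobel_edges(gray, thresh=128):
    """Mark interior pixels whose gradient magnitude exceeds thresh."""
    h, w = len(gray), len(gray[0])
    mask = [[0] * w for _ in range(h)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            gx = sum(SX[a][b] * gray[i + a - 1][j + b - 1]
                     for a in range(3) for b in range(3))
            gy = sum(SY[a][b] * gray[i + a - 1][j + b - 1]
                     for a in range(3) for b in range(3))
            mask[i][j] = 1 if (gx * gx + gy * gy) ** 0.5 > thresh else 0
    return mask

# A vertical step edge between columns 1 and 2 is detected in the interior.
gray = [[0, 0, 255, 255]] * 4
edges = sobel_edges(gray)
```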
In one embodiment of the present invention, the method of fusing the collected information includes either of the following: the weighted-average method and the Kalman filtering method.
Here, the Kalman filtering fusion algorithm can analyze the image information and realize self-localization of the mobile robot. Further path planning uses a fuzzy logic algorithm, which has great advantages in expressing and combining uncertainty, to produce more reliable and more accurate information and to make reliable decisions based on that information, thereby obtaining an appropriate path, enhancing the complementarity of the information, and improving the robustness, flexibility, and fault tolerance of the system.
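The Kalman filtering fusion named above can be illustrated in its simplest one-dimensional form: a constant-position model updated with noisy range measurements. The process-noise and measurement-noise constants q and r below are illustrative assumptions, not values from the patent:

```python
def kalman_1d(measurements, q=1e-3, r=0.5, x0=0.0, p0=100.0):
    """Scalar Kalman filter: state x (estimated distance), variance p."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                    # predict: variance grows by process noise
        k = p / (p + r)              # Kalman gain
        x = x + k * (z - x)          # update with the measurement residual
        p = (1 - k) * p
        estimates.append(x)
    return estimates

# Noisy readings of a true distance of 5.0 m settle toward 5.0.
zs = [5.3, 4.8, 5.1, 4.9, 5.2, 5.0, 4.95, 5.05]
est = kalman_1d(zs)
```

With a large initial variance p0, the first update jumps almost to the first measurement; afterwards the gain shrinks and the estimate settles near the true distance.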
The preferred embodiments of the present invention have been described above in detail with reference to the accompanying drawing; however, the present invention is not limited to the specific details of the above embodiments. Within the scope of the technical concept of the present invention, many simple variations of the technical solution of the present invention are possible, and these simple variations all fall within the protection scope of the present invention.
It should further be noted that the specific technical features described in the above embodiments may, where not contradictory, be combined in any suitable manner; to avoid unnecessary repetition, the various possible combinations are not separately described in the present invention.
In addition, the various embodiments of the present invention may also be combined arbitrarily, and as long as such combinations do not depart from the idea of the present invention, they should likewise be regarded as content disclosed by the present invention.

Claims (7)

1. A path planning method, characterized in that the path planning method includes:
Step 1, collecting the surrounding information of a robot in real time with multiple sensors;
Step 2, planning a path according to the collected surrounding information, the robot's current position information, and the robot's destination information;
Step 3, driving the robot along the planned path.
2. The path planning method according to claim 1, characterized in that, in Step 1, the method of collecting the surrounding information of the robot in real time with multiple sensors includes:
collecting the surrounding information of the robot with a pan-tilt camera, a sonar sensor, an infrared sensor, and a laser sensor.
3. The path planning method according to claim 1, characterized in that, in Step 2, the image information collected by the pan-tilt camera is pre-processed, where the method of pre-processing includes:
converting the collected image to grayscale and smoothing the image.
4. The path planning method according to claim 2, characterized in that, in Step 2, the information collected by the pan-tilt camera, the sonar sensor, the infrared sensor, and the laser sensor is fused to obtain the surrounding information.
5. The path planning method according to claim 1, characterized in that, in Step 2, the method of planning a path according to the collected surrounding information, the robot's current position information, and the robot's destination information includes:
sensing the robot's current position information and surrounding information in real time.
6. The path planning method according to claim 4, characterized in that edge detection and image segmentation are performed in turn on the pre-processed image information to obtain the surrounding information of the robot.
7. The path planning method according to claim 1, characterized in that the method of fusing the collected information includes either of the following: the weighted-average method and the Kalman filtering method.
CN201711222922.7A 2017-11-29 2017-11-29 Path planning method Pending CN108021132A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711222922.7A CN108021132A (en) 2017-11-29 2017-11-29 Path planning method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711222922.7A CN108021132A (en) 2017-11-29 2017-11-29 Path planning method

Publications (1)

Publication Number Publication Date
CN108021132A (en) 2018-05-11

Family

ID=62077304

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711222922.7A Pending CN108021132A (en) 2017-11-29 2017-11-29 Path planning method

Country Status (1)

Country Link
CN (1) CN108021132A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110515381A (en) * 2019-08-22 2019-11-29 浙江迈睿机器人有限公司 Multi-sensor Fusion algorithm for positioning robot

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101975951A (en) * 2010-06-09 2011-02-16 北京理工大学 Field environment barrier detection method fusing distance and image information
JP5187758B2 (en) * 2008-12-26 2013-04-24 株式会社Ihiエアロスペース Unmanned mobile system
CN103400392A (en) * 2013-08-19 2013-11-20 山东鲁能智能技术有限公司 Binocular vision navigation system and method based on inspection robot in transformer substation
CN104036279A (en) * 2014-06-12 2014-09-10 北京联合大学 Intelligent vehicle running control method and system
CN104166400A (en) * 2014-07-11 2014-11-26 杭州精久科技有限公司 Multi-sensor fusion-based visual navigation AGV system
CN104267721A (en) * 2014-08-29 2015-01-07 陈业军 Unmanned driving system of intelligent automobile
CN106444780A (en) * 2016-11-10 2017-02-22 速感科技(北京)有限公司 Robot autonomous navigation method and system based on vision positioning algorithm
CN106652515A (en) * 2015-11-03 2017-05-10 中国电信股份有限公司 Vehicle automatic control method, vehicle automatic control device, and vehicle automatic control system
CN107161141A (en) * 2017-03-08 2017-09-15 深圳市速腾聚创科技有限公司 Pilotless automobile system and automobile


Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
张茂于 (Zhang Maoyu): "Industry Patent Analysis Report, Vol. 58: Autonomous Driving" (《产业专利分析报告 第58册 自动驾驶》), 30 June 2017 *
杨帆 (Yang Fan): "Digital Image Processing and Analysis" (《数字图像处理与分析》), 31 May 2015 *
段中兴 (Duan Zhongxing): "Sensing Technology for the Internet of Things" (《物联网传感技术》), 30 April 2014 *
江晶 (Jiang Jing): "Target Tracking Technology with Motion Sensors" (《运动传感器目标跟踪技术》), 30 April 2017 *
贺瑜飞 (He Yufei): "Application of wavelet analysis and edge detection in fast license-plate location", Science and Technology Innovation Herald (《科技创新导报》) *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20180511