CN112965481A - Orchard operation robot unmanned driving method based on point cloud map - Google Patents

Orchard operation robot unmanned driving method based on point cloud map

Info

Publication number
CN112965481A
CN112965481A (application CN202110136495.0A)
Authority
CN
China
Prior art keywords
robot
point cloud
orchard
cloud map
gps
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110136495.0A
Other languages
Chinese (zh)
Inventor
付克昌
蔡明希
蒋涛
胡泽
邹艳玲
张林帅
王月红
罗辉
胥祥
崔亚男
段翠萍
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu University of Information Technology
Original Assignee
Chengdu University of Information Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu University of Information Technology filed Critical Chengdu University of Information Technology
Priority to CN202110136495.0A priority Critical patent/CN112965481A/en
Publication of CN112965481A publication Critical patent/CN112965481A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0257Control of position or course in two dimensions specially adapted to land vehicles using a radar

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses an unmanned driving method for an orchard operation robot based on a point cloud map, comprising the following steps: step one, positioning the tree crowns and the road signs around the orchard area with a differential GPS, and marking the absolute positions of the trees and road signs; step two, scanning key characteristic trunks with a laser radar and recording the corresponding characteristic signals to obtain a point cloud map; step three, marking the real-time position of the operation robot on the point cloud map; and step four, continuously scanning trunk information with the laser radar and matching it to the GPS signal points on the point cloud map, so as to calculate the traveling pose of the robot and correct its traveling direction. By using a point cloud map with position labels added in the orchard, assisted by inertial navigation, the method enables the robot to travel accurately, turn around, and find the next row even where no GPS signal is available between rows, meeting the requirement of continuous operation of the robot in the orchard.

Description

Orchard operation robot unmanned driving method based on point cloud map
Technical Field
The invention relates to the technical field of navigation. More specifically, the invention relates to an orchard operation robot unmanned driving method based on a point cloud map.
Background
Agricultural robots are now widely applied and increasingly replace manual labor in a broad range of field activities. Orchard production is a complex task that demands large amounts of manpower and material resources; at the same time, imprecise orchard management leads to large amounts of wasted investment and ecological pollution, driving up the price of fruit. Developing an intelligent, precise robot suited to orchard operation is therefore imperative. For a robot to work autonomously in an orchard in place of manual labor, constructing an environment map is the primary prerequisite for localization and autonomous navigation; the constructed map helps the robot perform tasks such as path planning, anticipating fruit-tree positions, and turning around at the headland.
In an orchard, the robot works autonomously in the passageway between two rows of fruit trees, operating on each tree along the passage; when one row is finished, it turns into the next passage and continues until the whole orchard is covered. During operation the robot is generally under the tree canopy, so relying on GPS alone leads to satellite-signal occlusion by the crowns, multipath effects, radio-frequency interference, and similar problems, and ultimately to large position errors or complete loss of positioning. The spacing between orchard rows is mostly only about 3 m, making accurate inter-row positioning and navigation with GPS difficult; electromagnetic navigation can achieve good positioning and navigation, but its cost is high.
Disclosure of Invention
An object of the present invention is to solve at least the above problems and/or disadvantages and to provide at least the advantages described hereinafter.
To achieve these objects and other advantages in accordance with the present invention, there is provided an orchard operation robot unmanned driving method based on a point cloud map, comprising:
step one, respectively positioning tree crowns and road signs around an orchard area through a differential GPS, and marking absolute positions of trees and road signs;
step two, scanning key characteristic trunks with a laser radar and recording the corresponding characteristic signals to obtain a corresponding point cloud map;
step three, acquiring the current GPS signal before the operation robot enters the orchard, so as to mark its real-time position on the point cloud map;
step four, continuously scanning trunk information with the laser radar while the operation robot travels in the orchard and matching it to the GPS signal points on the point cloud map, calculating the traveling pose of the robot in real time during autonomous traveling, and correcting its traveling direction accordingly;
and step five, when the operation robot reaches the end of each orchard row, selecting a turn-around mode based on the point cloud map, until all rows of the orchard have been traversed.
Preferably, in the first step, the obtaining manner of the absolute position is configured to include:
s1, collecting data, arranging an elevated frame which carries a GPS antenna and is higher than the tree crowns on the mobile robot, carrying a laser radar in the middle of the elevated frame to carry out global scanning on the closed and structured orchard, enabling each tree trunk to be scanned, and configuring the stay time of the GPS antenna on the top end of each tree crown during scanning to be more than ten seconds;
and S2, data processing: extracting, by a processor on the mobile robot, the GPS points recorded during the dwell time at each trunk, and removing offset points with a particle filter algorithm to obtain the GPS signal points corresponding to the absolute positions of the trunks.
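The dwell-and-filter processing of S2 can be sketched as follows. This is a simplified stand-in: a median-based outlier test replaces the particle filter named above, and the threshold max_dev_m is an illustrative assumption.

```python
import statistics

def trunk_position(gps_fixes, max_dev_m=0.5):
    """Estimate a trunk's absolute position from the GPS fixes collected
    while the antenna dwells above its crown (step S2).

    A median-based outlier rejection stands in here for the particle
    filter named in the text: fixes farther than max_dev_m from the
    per-axis median are treated as offset points and dropped, and the
    surviving fixes are averaged.
    """
    xs = [p[0] for p in gps_fixes]
    ys = [p[1] for p in gps_fixes]
    mx, my = statistics.median(xs), statistics.median(ys)
    kept = [(x, y) for x, y in gps_fixes
            if abs(x - mx) <= max_dev_m and abs(y - my) <= max_dev_m]
    if not kept:  # every fix rejected: fall back to the median point
        return (mx, my)
    return (sum(x for x, _ in kept) / len(kept),
            sum(y for _, y in kept) / len(kept))
```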
Preferably, in the second step, the manner of acquiring the point cloud map is configured to include:
s3, calculating the middle point between two adjacent tree coordinates by the processor, and connecting the middle points to obtain a passable path;
s4, playing point cloud information bag obtained by scanning trunk and ground features through a laser radar under ros, loading the point cloud information bag by using a Lio-Sam algorithm, storing the point cloud information bag frame by frame as a pcd file, and taking the last file as a map;
and S5, adding the GPS signal points to the map by multi-marker point cloud labeling with selected sparse control points, to obtain the corresponding point cloud map.
Preferably, in step four, the autonomous traveling mode is configured such that, after the robot enters the inter-row space and the GPS signal disappears, the laser radar scans the trunks to obtain the corresponding GPS signal point labels, measures the distances to the trunks on the left and right sides to calculate the position of the center line, and compares the passable nodes so that the robot travels autonomously along the center line.
Preferably, in step four, calculating the robot traveling pose in real time and correcting the traveling direction of the robot are configured to include:
s6, scanning a trunk by using a laser radar, and performing feature matching on the scanned point cloud information and a point cloud map to acquire GPS information of the current position;
and S7, fusing the GPS information with IMU data through an EKF to obtain accurate positioning data, wherein the occlusion disappears when the robot drives out of the rows, and once the differential GPS signal recovers, the position information at that moment is corrected with the positioning data to ensure the robot pose is correct before the next turn.
Preferably, the manner of selecting the turn-around mode is configured to include:
s8, when the robot moves to the end of a row in an orchard, turning information is obtained by scanning information of two trees at the head of the row;
and S9, when the radar mounting center line of the robot is parallel to the trunks at the row head, detecting the distance between the robot and the enclosing wall, and, according to the scanned distance to the boundary line, selecting one of the semicircular-arc U-turn method, the interlaced U-turn method, and the 3/4-arc U-turn method to enter a new row and continue driving, until all rows have been traversed.
The invention at least comprises the following beneficial effects: even where the robot cannot rely on GPS positioning between the orchard rows, a point cloud map with position labels added in the orchard, assisted by inertial navigation, enables the robot to travel accurately, turn around, and find the next row, meeting the requirement of continuous operation in the orchard and ensuring accurate positioning at controllable cost when the GPS signal is lost.
Secondly, whereas conventional planning cannot reliably control the robot's turn-around, the invention lets the robot select different turning strategies according to the distance between its current position and the boundary line at the row head, greatly improving the turn-around success rate and the navigation efficiency.
Additional advantages, objects, and features of the invention will be set forth in part in the description which follows and in part will become apparent to those having ordinary skill in the art upon examination of the following or may be learned from practice of the invention.
Drawings
FIG. 1 is a technical roadmap for an embodiment of the invention;
FIG. 2 is a diagram of a mechanical layout modeling in the navigation system of the present invention;
FIG. 3 is a schematic diagram of laser radar and inertial information fusion according to the present invention;
FIG. 4 is a schematic view of the semicircular-arc U-turn method in another embodiment of the present invention;
FIG. 5 is a schematic diagram of the interlaced U-turn method in another embodiment of the present invention;
FIG. 6 is a schematic diagram of the 3/4-arc U-turn method in another embodiment of the present invention.
Detailed Description
The present invention is further described in detail below with reference to the attached drawings so that those skilled in the art can implement the invention by referring to the description text.
It will be understood that terms such as "having," "including," and "comprising," as used herein, do not preclude the presence or addition of one or more other elements or groups thereof.
The invention addresses the problems that, when a robot travels in a densely planted orchard, GPS cannot position it accurately and inter-row turning cannot be performed precisely, by providing a method based on a point cloud map assisted by inertial navigation. The invention enables the robot to complete its operation autonomously in the semi-structured orchard environment (fruit trees in clear rows and columns). Specifically, the point-cloud-map-based orchard unmanned driving method comprises the following steps:
step 1: point cloud map based on trunk and landmark
The tree crowns around the orchard area are positioned with a differential GPS, then the road signs are positioned, marking the absolute positions of the trees and road signs. Key characteristic trunks are scanned and recorded with a 16-line laser radar to produce a high-precision topological map. The specific operations are as follows. (1) Data acquisition: a light elevated frame, taller than the crowns, is welded onto the mobile robot. A GPS antenna is mounted on top of the frame and a 16-line laser radar at its middle to globally scan the closed, structured orchard, entering row by row along a serpentine path from the start point along the enclosing wall until all passable areas are covered. A uniform speed is maintained during traveling so that every trunk is fully scanned, and the GPS antenna dwells above the top of each tree crown for more than ten seconds. (2) Data processing: for each trunk, the GPS points recorded from the moment the robot stops until it moves again are extracted, and offset points are removed with a particle filter algorithm to obtain the actual trunk position. This loops until all trunks have been computed. The midpoint between the coordinates of each pair of adjacent trees is calculated, and the midpoints are connected to obtain the passable path. The point cloud bag recorded by scanning the trunks and ground features with the laser radar is replayed, processed with the LIO-SAM algorithm, and saved frame by frame as .pcd files; the final file is the required point cloud map.
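The midpoint computation in the data-processing step can be sketched as below; the pairing of facing trunks by survey order is an assumption of this sketch.

```python
def passable_path(left_row, right_row):
    """Connect midpoints between two adjacent tree rows (step 1, data
    processing): each waypoint is the midpoint of a facing trunk pair,
    and the ordered midpoints form the passable path down the alley.

    Assumes both rows were surveyed in the same driving order, so the
    i-th trunk on the left faces the i-th trunk on the right.
    """
    return [((xl + xr) / 2.0, (yl + yr) / 2.0)
            for (xl, yl), (xr, yr) in zip(left_row, right_row)]
```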
The processed GPS signal points are added to the map by multi-marker point cloud labeling with selected sparse control points, so that, through the relative positioning provided by the laser radar, the robot can be positioned absolutely and obtain GPS coordinate points even when no GPS signal is available.
Step 2: autonomous traveling between rows of robot
Before the robot enters a row, it acquires a GPS signal and marks its position on the established point cloud map. After it enters the row the GPS signal disappears; the laser radar then scans the trunks to obtain the corresponding GPS point labels, the position of the center line is calculated from the distances to the trunks on the left and right sides, and the passable nodes are compared so that the robot drives autonomously along the center line.
After the left and right side lines are obtained from the point cloud map, their line equations are derived. The left line is y = k1·x + b1 and the right line is y = k2·x + b2. Taking the end point of the current passable road as the coordinate point (x0, y0), its distance to each side line is calculated as in formula (1):

    d1 = |k1·x0 − y0 + b1| / sqrt(k1² + 1),    d2 = |k2·x0 − y0 + b2| / sqrt(k2² + 1)    (1)

The robot is then controlled to drive along the center line of the passable road. The magnitudes of d1 and d2 are compared in real time and steering control is applied by a preset rule, forcing the robot to keep to the center line of the passable road at all times. The steering control rule, formula (2), with ε a small tolerance, is:

    steering = { turn left, if d1 − d2 > ε;  turn right, if d2 − d1 > ε;  straight, if |d1 − d2| ≤ ε }    (2)
At this point the robot can travel autonomously between the orchard rows until it reaches the exit of the inter-row passage.
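A minimal sketch of the center-line keeping logic follows. The distance computation is the standard point-to-line form of formula (1); since the original formula images are not reproduced in this text, the dead-band comparison shown for formula (2) is an assumed plausible form, with eps an illustrative tolerance.

```python
import math

def side_distance(k, b, x0, y0):
    """Distance from the point (x0, y0) to the side line y = k*x + b,
    as in formula (1)."""
    return abs(k * x0 - y0 + b) / math.sqrt(k * k + 1.0)

def steer(k1, b1, k2, b2, x0, y0, eps=0.05):
    """Center-line keeping rule in the spirit of formula (2): compare the
    distances d1, d2 to the left and right side lines and steer toward
    whichever side is farther, with a small dead band eps to avoid
    oscillating around the center line."""
    d1 = side_distance(k1, b1, x0, y0)  # distance to the left line
    d2 = side_distance(k2, b2, x0, y0)  # distance to the right line
    if d1 - d2 > eps:
        return "left"    # offset toward the right row: steer left
    if d2 - d1 > eps:
        return "right"   # offset toward the left row: steer right
    return "straight"
```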
Step 3: calculating the pose of the robot
The inertial navigation system is modeled (as shown in FIG. 2), its navigation parameters are calculated in the navigation coordinate system, and the inertial navigation system is used for preliminary positioning.
Since the absolute position of each point of the point cloud map is known, the carrier coordinate system can be transformed into the default east-north-up coordinate system.
The estimated position is obtained once the real-time speed is known. Although the distance estimated from speed is accurate over short periods, errors keep accumulating because of the system's zero-offset error, and the uneven, bumpy orchard road surface introduces further error, so data fusion with the radar is necessary.
The laser radar scans a trunk, and the scanned point cloud is feature-matched against the point cloud map we built to acquire the current position. Because position-information labels were added in step 1, the absolute position is acquired, i.e., accurate GPS information is obtained directly without considering the influence of the tree crowns. The GPS information is fused with IMU data through an EKF (extended Kalman filter) to obtain accurate positioning data. When the robot drives out of the inter-row space, the occlusion disappears and the differential GPS signal recovers; the position information is then corrected directly with that signal to ensure the robot pose is correct before the next turn. The specific flow is shown in FIG. 3. In this step, the pose of the robot in the orchard can be calculated with a suitable algorithm and external assistance, and a steering control signal is generated in time from the current pose information to handle different operation modes. In particular, the prior art generally fuses only IMU and GPS data and does not involve the lidar. In this scheme, the GPS signals added to the laser point cloud map replace GPS between the rows and are fused with the IMU data to acquire pose information; outside the rows the signal recovers, and position calibration is performed with the differential GPS.
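The EKF fusion described above can be illustrated with a minimal one-dimensional Kalman filter; the noise values are illustrative assumptions, and a full 2-D EKF over the pose follows the same predict/update pattern per axis.

```python
class ScalarKalman:
    """Minimal 1-D Kalman filter illustrating the fusion of step 3:
    the IMU dead-reckoned displacement drives the prediction, and the
    GPS fix recovered from the labeled point cloud map drives the
    update. The noise values q and r are illustrative, not from the
    source text."""

    def __init__(self, x0, p0=1.0, q=0.05, r=0.25):
        self.x, self.p = x0, p0  # state estimate and its variance
        self.q, self.r = q, r    # process (IMU) and measurement (map-GPS) noise

    def predict(self, imu_dx):
        self.x += imu_dx         # dead-reckon with the IMU displacement
        self.p += self.q         # uncertainty grows while only predicting

    def update(self, gps_x):
        k = self.p / (self.p + self.r)  # Kalman gain
        self.x += k * (gps_x - self.x)  # pull the estimate toward the map fix
        self.p *= (1.0 - k)             # uncertainty shrinks after the update
```

Between the rows, `update` would be fed the position matched from the labeled point cloud map; once the robot leaves the rows, the recovered differential GPS fix plays the same role.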
In actual operation, the GPS signal is weak because of crown occlusion, and the laser radar serves as the main positioning mode, so the more accurate NDT algorithm is selected for registration. NDT is an algorithm based on a rasterized map. Its main idea, in a 2D scene, is to divide the reference-frame point cloud map into small blocks (cells); the point cloud obtained by the target frame at the next moment is transformed into the reference coordinates, and the probability of each point falling into the corresponding reference-frame cell is computed. Detections of the same environment point at different moments, transformed into the same coordinates, fall into the same cell. A score is obtained from this relation and optimized to yield R and t. The estimates from the radar and the inertial navigation are fused by extended Kalman filtering, so an accurate position can be output; after the robot leaves the inter-row space, the information is processed again by extended Kalman filtering to obtain accurate position information. This solves the technical problems that, in a densely planted orchard, planning a turning path at the headland with the robot position measured by GPS suffers from GPS signal loss and large positioning errors, making the turn highly unstable, and that subsequent GPS positioning is costly. The prior art builds a topological map, but the error of inertial navigation is large, and machine vision only forcibly updates the position after detecting a topological point, with poor precision; it therefore cannot make the robot enter the next row accurately or fully achieve autonomous driving in the orchard environment.
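The NDT idea described above (a rasterized reference map with a Gaussian per cell, scored against a candidate transform) can be sketched as follows; the grid size, the covariance regularization, and the omission of the optimizer over R and t are simplifications of this sketch.

```python
import numpy as np

def ndt_cells(ref_points, cell=1.0):
    """Build the NDT grid: group reference-frame points into square cells
    and keep a Gaussian (mean, inverse covariance) per cell."""
    buckets = {}
    for p in ref_points:
        key = (int(np.floor(p[0] / cell)), int(np.floor(p[1] / cell)))
        buckets.setdefault(key, []).append(p)
    cells = {}
    for key, pts in buckets.items():
        a = np.asarray(pts)
        if len(a) >= 3:  # need a few points for a meaningful covariance
            cov = np.cov(a.T) + 1e-3 * np.eye(2)  # small regularization (assumption)
            cells[key] = (a.mean(axis=0), np.linalg.inv(cov))
    return cells

def ndt_score(cells, target_points, R, t, cell=1.0):
    """Score a candidate pose (R, t): transform the target frame into the
    reference coordinates and sum each point's Gaussian likelihood in the
    cell it falls into (higher = better alignment). A full NDT would
    optimize R and t against this score; the optimizer is omitted here."""
    score = 0.0
    for p in target_points:
        q = R @ np.asarray(p) + t
        key = (int(np.floor(q[0] / cell)), int(np.floor(q[1] / cell)))
        if key in cells:
            mu, icov = cells[key]
            d = q - mu
            score += float(np.exp(-0.5 * d @ icov @ d))
    return score
```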
Step 4: selecting a turn-around mode based on the point cloud map
When the robot reaches the end of an orchard row, it obtains turn-around information by scanning the specific information of the two trees at the head of the row. When the radar mounting center line of the robot is parallel to the trunks at the row head, the distance between the robot and the enclosing wall is detected, and a suitable turn-around mode is selected so that the robot enters a new orchard row and continues driving. The turn-around mode is chosen according to the scanned distance to the boundary line.
Under normal conditions, when the space reserved ahead is sufficient, the conventional semicircular-arc U-turn method shown in FIG. 4 is selected.
When the fruit-tree row width is smaller than the robot's turning radius and the width at the row-head boundary line is also smaller than the turning radius, the conventional semicircular-arc turning strategy cannot fit the narrow space, so a new row can only be entered with the interlaced U-turn method shown in FIG. 5.
When neither of the above methods applies, but the space on the opposite side of the turn is large and the width at the row-head boundary line meets the requirement, the 3/4-arc U-turn method shown in FIG. 6 can be adopted.
These three methods cover the turning scenarios between most fruit-tree rows.
The following table gives the applicable ranges (constraints) of the three U-turn modes, where r is the minimum turning radius of the robot, d is the distance between the row head and the enclosing wall, and L is the row spacing.

Turn-around mode           Applicable range
Semicircular-arc U-turn    d > r, L > 2r
Interlaced U-turn          d < r
3/4-arc U-turn             d > 2r, L < r
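The table's constraints translate directly into a selection routine; the order of the checks below is an assumption, since the text lists the ranges without stating a precedence.

```python
def choose_turn_mode(d, L, r):
    """Select a U-turn mode from the tabulated constraints, where r is the
    robot's minimum turning radius, d the distance from the row head to
    the enclosing wall, and L the row spacing. The check order is an
    assumption of this sketch."""
    if d > r and L > 2 * r:
        return "semicircular-arc"
    if d < r:
        return "interlaced"
    if d > 2 * r and L < r:
        return "3/4-arc"
    return None  # no listed mode applies
```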
The orchard fusion positioning method implemented here combines the point cloud map with GPS coordinate-point labels and the inertial navigation system to obtain the robot's pose information accurately and quickly. The coordinate points determine whether the robot should go forward or turn, and the heading angle determines whether a U-turn has succeeded. In actual operation, if the map uses the east-north-up coordinate system and the robot enters the inter-row space from south to north, its heading angle should be due north; after it completes the U-turn and drives along the next row, its heading angle should be due south. When the heading angle becomes due south, the laser radar scan is used to detect the distances to the left and right rows, which are compared against the offline GPS coordinate information to judge whether the robot pose is correct, and hence whether the U-turn succeeded.
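The heading-angle check for judging a successful U-turn can be sketched as below; the 10-degree tolerance is an illustrative assumption.

```python
import math

def turn_succeeded(heading_rad, expected_rad, tol_rad=math.radians(10)):
    """Judge a U-turn by the heading angle, as described above: entering
    the alley northbound the heading should be due north, and after the
    turn, due south. The angular difference is wrapped to (-pi, pi] so
    headings near the +/- pi boundary compare correctly; the tolerance
    is an illustrative assumption."""
    diff = (heading_rad - expected_rad + math.pi) % (2 * math.pi) - math.pi
    return abs(diff) <= tol_rad
```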
The above scheme is merely illustrative of a preferred example, and is not limiting. When the invention is implemented, appropriate replacement and/or modification can be carried out according to the requirements of users.
The number of apparatuses and the scale of the process described herein are intended to simplify the description of the present invention. Applications, modifications and variations of the present invention will be apparent to those skilled in the art.
While embodiments of the invention have been disclosed above, it is not intended to be limited to the uses set forth in the specification and examples. It can be applied to all kinds of fields suitable for the present invention. Additional modifications will readily occur to those skilled in the art. It is therefore intended that the invention not be limited to the exact details and illustrations described and illustrated herein, but fall within the scope of the appended claims and equivalents thereof.

Claims (6)

1. An orchard operation robot unmanned driving method based on a point cloud map, characterized by comprising the following steps:
step one, positioning the tree crowns and road signs around the orchard area with a differential GPS, and marking the absolute positions of the trees and road signs;
step two, scanning key characteristic trunks with a laser radar and recording the corresponding characteristic signals to obtain a corresponding point cloud map;
step three, acquiring the current GPS signal before the operation robot enters the orchard, so as to mark its real-time position on the point cloud map;
step four, continuously scanning trunk information with the laser radar while the operation robot travels in the orchard and matching it to the GPS signal points on the point cloud map, calculating the traveling pose of the robot in real time during autonomous traveling, and correcting its traveling direction accordingly;
and step five, when the operation robot reaches the end of each orchard row, selecting a turn-around mode based on the point cloud map, until all rows of the orchard have been traversed.
2. The orchard operation robot unmanned driving method based on a point cloud map according to claim 1, characterized in that, in step one, the absolute position is obtained by:
S1, data collection: arranging on the mobile robot an elevated frame that carries a GPS antenna and is taller than the tree crowns, with a laser radar mounted at the middle of the frame to globally scan the closed, structured orchard so that every trunk is scanned, the dwell time of the GPS antenna at the top of each tree crown during scanning being configured to exceed ten seconds;
and S2, data processing: extracting, by a processor on the mobile robot, the GPS points recorded during the dwell time at each trunk, and removing offset points with a particle filter algorithm to obtain the GPS signal points corresponding to the absolute positions of the trunks.
3. The orchard operation robot unmanned driving method based on a point cloud map according to claim 2, characterized in that, in step two, the point cloud map is acquired by:
S3, calculating, by the processor, the midpoint between the coordinates of each pair of adjacent trees, and connecting the midpoints to obtain a passable path;
S4, replaying under ROS the point cloud bag recorded by scanning the trunks and ground features with the laser radar, processing it with the LIO-SAM algorithm, saving it frame by frame as .pcd files, and taking the last file as the map;
and S5, adding the GPS signal points to the map by multi-marker point cloud labeling with selected sparse control points, to obtain the corresponding point cloud map.
4. The orchard operation robot unmanned driving method based on a point cloud map according to claim 1, characterized in that, in step four, the autonomous traveling mode is configured such that, after the robot enters the inter-row space and the GPS signal disappears, the laser radar scans the trunks to obtain the corresponding GPS signal point labels, measures the distances to the trunks on the left and right sides to calculate the position of the center line, and compares the passable nodes so that the robot travels autonomously along the center line.
5. The orchard operation robot unmanned driving method based on a point cloud map according to claim 1, characterized in that, in step four, calculating the robot traveling pose in real time and correcting the traveling direction of the robot are configured to include:
S6, scanning a trunk with the laser radar and feature-matching the scanned point cloud against the point cloud map to acquire the GPS information of the current position;
and S7, fusing the GPS information with IMU data through an EKF to obtain accurate positioning data, wherein the occlusion disappears when the robot drives out of the rows, and once the differential GPS signal recovers, the position information at that moment is corrected with the positioning data to ensure the robot pose is correct before the next turn.
6. The orchard operation robot unmanned driving method based on a point cloud map according to claim 1, characterized in that the turn-around mode is selected by:
S8, when the robot reaches the end of a row in the orchard, obtaining turn-around information by scanning the two trees at the head of the row;
and S9, when the radar mounting center line of the robot is parallel to the trunks at the row head, detecting the distance between the robot and the enclosing wall, and, according to the scanned distance to the boundary line, selecting one of the semicircular-arc U-turn method, the interlaced U-turn method, and the 3/4-arc U-turn method to enter a new row and continue driving, until all rows have been traversed.
CN202110136495.0A 2021-02-01 2021-02-01 Orchard operation robot unmanned driving method based on point cloud map Pending CN112965481A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110136495.0A CN112965481A (en) 2021-02-01 2021-02-01 Orchard operation robot unmanned driving method based on point cloud map

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110136495.0A CN112965481A (en) 2021-02-01 2021-02-01 Orchard operation robot unmanned driving method based on point cloud map

Publications (1)

Publication Number Publication Date
CN112965481A true CN112965481A (en) 2021-06-15

Family

ID=76273564

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110136495.0A Pending CN112965481A (en) 2021-02-01 2021-02-01 Orchard operation robot unmanned driving method based on point cloud map

Country Status (1)

Country Link
CN (1) CN112965481A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107450561A (en) * 2017-09-18 2017-12-08 河南科技学院 The autonomous path planning of mobile robot and obstacle avoidance system and its application method
CN108303097A (en) * 2018-03-28 2018-07-20 西北农林科技大学 A kind of closing orchard navigation system positioned based on 2D laser radars and the Big Dipper
CN108519615A (en) * 2018-04-19 2018-09-11 河南科技学院 Mobile robot autonomous navigation method based on integrated navigation and Feature Points Matching

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
李延华: "自主移动果园作业机器人地头转向与定位研究", 《中国优秀硕士学位论文全文数据库 农业科技辑》 *
纪嘉文等: "一种基于多传感融合的室内建图和定位算法", 《成都信息工程大学学报》 *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113534184A (en) * 2021-07-13 2021-10-22 华南农业大学 Laser sensing agricultural robot space positioning method
CN113534184B (en) * 2021-07-13 2023-08-29 华南农业大学 Laser-perceived agricultural robot space positioning method
CN113608551A (en) * 2021-08-11 2021-11-05 成都信息工程大学 Unmanned agricultural machinery group cooperation system and application method thereof
CN113778081A (en) * 2021-08-19 2021-12-10 中国农业科学院农业资源与农业区划研究所 Orchard path identification method and robot based on laser radar and vision
CN113848877A (en) * 2021-08-19 2021-12-28 中国农业科学院农业资源与农业区划研究所 Robot unmanned driving method based on topological map and robot
CN114355912A (en) * 2021-12-27 2022-04-15 山东新一代信息产业技术研究院有限公司 Edge walking decision method for park automatic driving logistics trolley
CN114485667A (en) * 2022-01-13 2022-05-13 中国农业大学 Light and intelligent orchard ground navigation method
CN114485667B (en) * 2022-01-13 2024-05-24 中国农业大学 Light intelligent orchard ground navigation method
CN115443792A (en) * 2022-08-12 2022-12-09 深圳拓邦股份有限公司 Method and system for establishing image of mower and readable storage medium

Similar Documents

Publication Publication Date Title
CN112965481A (en) Orchard operation robot unmanned driving method based on point cloud map
Li et al. Openstreetmap-based autonomous navigation for the four wheel-legged robot via 3d-lidar and ccd camera
Kanagasingham et al. Integrating machine vision-based row guidance with GPS and compass-based routing to achieve autonomous navigation for a rice field weeding robot
CN106774313B (en) A kind of outdoor automatic obstacle-avoiding AGV air navigation aid based on multisensor
CN109099901B (en) Full-automatic road roller positioning method based on multi-source data fusion
CN113781582B (en) Synchronous positioning and map creation method based on laser radar and inertial navigation combined calibration
CN109634276B (en) Agricultural vehicle unmanned control method and system and agricultural vehicle
Wang et al. Map-based localization method for autonomous vehicles using 3D-LIDAR
CN108088439B (en) AGV composite navigation system and method integrating electronic map, two-dimensional code and color band
CN102368158B (en) Navigation positioning method of orchard machine
CN107085938B (en) The fault-tolerant planing method of intelligent driving local path followed based on lane line and GPS
CN108645420B (en) Method for creating multipath map of automatic driving vehicle based on differential navigation
CN105667518A (en) Lane detection method and device
CN108052107A (en) A kind of AGV indoor and outdoor complex navigation system and methods for merging magnetic stripe, magnetic nail and inertial navigation
CN109753075B (en) Agriculture and forestry park robot navigation method based on vision
KR101318560B1 (en) Vision based guideline interpretation method for stable driving control of guideline tracing AGVs
CN110806585B (en) Robot positioning method and system based on trunk clustering tracking
CN116839570B (en) Crop interline operation navigation method based on sensor fusion target detection
Asghar et al. Vehicle localization based on visual lane marking and topological map matching
CN114563795B (en) Positioning tracking method and system based on laser odometer and label fusion algorithm
CN114114367A (en) AGV outdoor positioning switching method, computer device and program product
Morimoto et al. Vision-based navigation system for autonomous transportation vehicle
CN108196545A (en) Using the AGV magnetic navigation control methods of Auto Disturbances Rejection Control Technique
CN116560362A (en) Automatic navigation path planning tracking method and system
CN111208820A (en) Particle unmanned vehicle group under artificial intelligence big data, control method and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination