CN109753075A - Agriculture and forestry park robot navigation method based on vision - Google Patents

Agriculture and forestry park robot navigation method based on vision

Info

Publication number
CN109753075A
CN109753075A (application CN201910084147.6A; granted as CN109753075B)
Authority
CN
China
Prior art keywords
robot
orchard
label
image
row
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910084147.6A
Other languages
Chinese (zh)
Other versions
CN109753075B (en)
Inventor
史云 (Shi Yun)
李会宾 (Li Huibin)
吴文斌 (Wu Wenbin)
杨鹏 (Yang Peng)
唐华俊 (Tang Huajun)
Current Assignee
Institute of Agricultural Resources and Regional Planning of CAAS
Original Assignee
Institute of Agricultural Resources and Regional Planning of CAAS
Priority date
Filing date
Publication date
Application filed by Institute of Agricultural Resources and Regional Planning of CAAS
Priority to CN201910084147.6A
Publication of CN109753075A
Application granted
Publication of CN109753075B
Legal status: Active
Anticipated expiration

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present invention proposes an agriculture and forestry park robot navigation method based on vision, comprising: S1, placing an auxiliary label at the entrance of each orchard row of the park; S2, moving the robot to a position in front of the auxiliary label, adjusting its distance and angle relative to the label, and then completing the turn. The method targets the situation in which a robot in an orchard cannot rely on GPS positioning: without GPS, the robot uses orchard-row detection together with guidance labels to accurately leave a row, turn, and find the next row, meeting the need for continuous operation in the orchard. In the orchard environment, GPS positioning often suffers signal loss and poor accuracy, and its subsequent operating cost is high. The present invention achieves correct robot navigation without GPS and reduces economic investment.

Description

Agriculture and forestry park robot navigation method based on vision
Technical field
The present invention relates to navigation technology, and more particularly to a vision-based navigation method for agriculture and forestry park robots.
Background technique
Multifunctional agricultural robots are now widely deployed, increasingly replacing manual labor for a variety of field tasks. Orchard production involves a broad range of operations, such as flower and fruit thinning, bagging, pruning, mulching, irrigation, fertilization, pesticide spraying, disease and pest control, and staged harvesting, all of which demand substantial manpower and material resources; at the same time, imprecise orchard management produces wasted investment and ecological pollution and drives up fruit prices. Developing intelligent precision robots suited to orchard operations is therefore imperative. For a robot to replace manual labor and work autonomously in an orchard, the first problem to solve is autonomous navigation: only with complete navigation capability can the robot reach every corner of the orchard and reliably repeat each operation.
In an orchard, the robot works autonomously in the aisles between two rows of fruit trees: it operates on each tree along an aisle, and when one row is finished it turns into the next aisle, until the whole orchard has been covered. During work the robot is generally below the tree canopy. If it relies on GPS alone, the satellite signal may be blocked by the canopy, and multipath effects and radio-frequency interference lead to large positioning errors or a complete loss of positioning. The spacing between orchard rows is mostly only about 2 m, so GPS cannot deliver accurate in-row navigation; electromagnetic guidance can achieve good accuracy, but at high cost.
Chinese patent application CN102368158A proposes a navigation and positioning method for orchard machinery. The method first fuses laser-scanning information with visual-sensor information to build a three-dimensional model of the environment ahead of the machine and extract effective navigation information. Then, as the machine approaches the end of a fruit-tree row, a Kalman filter fuses measurements from the machine's on-board sensors to obtain the navigation state. Finally, at the headland, an optimal-control method plans the headland turning radius, GPS measurements provide the machine's position, and comparing the theoretical path coordinates with the actual position coordinates yields the navigation and positioning information.
Chinese patent application CN205843680U proposes a visual navigation method for an orchard robot. A CCD camera captures images of the orchard path, a navigation reference line is extracted from each image, and the reference-line parameters are converted into a heading angle in the real environment and sent to a microcontroller module. While the robot travels, a QR-code label is installed every 5 m along the side of the orchard row to localize the robot within the row and indicate the direction of travel. The QR-code information is sent to the microcontroller, which uses a PID controller to vary the PWM duty cycle driving the motors, achieving autonomous travel of the robot in the orchard environment.
Neither of the above schemes adequately addresses the practical problem of how the robot, when turning between orchard rows, leaves the current working row and accurately enters the next one. CN102368158A plans the turning path at the headland and measures the robot's position with GPS, but in a densely planted orchard there is a significant risk of GPS signal loss and large positioning error, making the turn highly unstable; moreover, the ongoing cost of GPS positioning is high. CN205843680U uses QR codes to prompt the vehicle's direction, but does not specifically study how the robot can accurately enter the next row; since turning between rows is a key step of robot operation, that scheme cannot fully realize autonomous driving of the robot in the orchard environment.
Summary of the invention
Aiming at the orchard driving scenario, in which GPS cannot position precisely and the robot cannot turn accurately between rows, the present invention provides a method combining visual navigation with auxiliary-label guidance. The invention ensures that the robot can complete its tasks autonomously in the semi-structured orchard environment (fruit trees distributed in clear rows and columns).
Specifically, the present invention proposes a vision-based navigation method for agriculture and forestry park robots, comprising: S1, placing an auxiliary label at the entrance of each orchard row of the park; S2, moving the robot to a position in front of the auxiliary label, adjusting its distance and angle relative to the label, and then completing the turn.
The present invention adopts visual navigation, which offers a large detection range, and the orchard environment is conveniently semi-structured. With suitable algorithms and external aids, the relative position and attitude of the robot and the orchard row can be computed, and the robot's steering control signals generated in time. Vision is also economical, matching the needs of agricultural development.
The beneficial effects of the present invention include:
1. For the case where a robot in an orchard cannot rely on GPS positioning, the present invention allows the robot, without GPS, to use orchard-row detection and guidance labels to accurately leave a row, turn, and find the next row, meeting the need for continuous operation in the orchard. In the orchard environment, GPS positioning often suffers signal loss, poor accuracy, and high subsequent operating cost; the present invention achieves correct robot navigation without GPS and reduces economic investment.
2. Whereas navigation generally must rely on a map, the present invention enables the robot to travel accurately between rows without map guidance, simplifying the navigation algorithm in the orchard scenario and improving the robot's navigation efficiency in the orchard. Compared with self-driving vehicles, complex operations such as autonomous path planning and autonomous turning are avoided.
3. The navigation strategy of the present invention removes the constraint that navigation must rely on an existing map: guided without a map, the robot can still keep to its row of travel and accurately leave one row and enter the next.
4. The present invention uses a binocular camera, label guidance, and a precise turning strategy to achieve accurate row changing by the robot, reducing hardware complexity, so that working efficiency is improved while cost is reduced.
Detailed description of the invention
Fig. 1 is a flowchart of one embodiment of the method of the invention.
Fig. 2 is a perspective view of the auxiliary label of the invention.
Fig. 3 is a front view of the auxiliary label of the invention.
Fig. 4 is a left side view of the auxiliary label of the invention.
Fig. 5 is a top view of the auxiliary label of the invention.
Fig. 6 is a flowchart of another embodiment of the method of the invention.
Specific embodiment
Embodiments of the present invention are described below with reference to the accompanying drawings, in which identical components are denoted by the same reference characters. In the absence of conflict, the technical features of the embodiments below may be combined with each other. The invention is described below taking an orchard as an example, but it applies equally to forest plantations.
Fig. 1 shows one embodiment of the method, comprising: S1, placing an auxiliary label at the entrance of each orchard row of the park; S2, moving the robot to a position in front of the auxiliary label, adjusting its distance and angle relative to the label, and then completing the turn.
In one embodiment, the auxiliary label has a specially designed appearance, as shown in Figs. 2-5. The label carries a prominently structured figure at its center, here a square, so that the label center is easy to determine. At the center position, a ball with a clear circular edge is mounted vertically as a positioning aid; under normal circumstances the ball sits in front of the central figure without occluding the figure's boundary. The benefit of this design is that the ball and the central figure together serve as an alignment reference for the robot: the ball center and the center point of the figure determine a straight line, and only when the robot is collinear with these two points is its heading correct, so the two points can be used to correct the robot's direction. The resulting label design is shown in Figs. 2-5.
Step S2 comprises: S2-1, scanning the auxiliary label and computing the corners of the auxiliary label image and of its central figure, where a central figure is provided at the center of the auxiliary label and a positioning aid is arranged in front of the central figure.
Many corner-detection algorithms are available, including the Moravec, Harris, SUSAN and FAST corner detectors. The detected corners of the central figure are then used to determine its center. Suppose N corners of the central figure are found, denoted (x_c1, y_c1), (x_c2, y_c2), (x_c3, y_c3), ..., (x_cN, y_cN).
S2-2, computing the center point of the central figure.
Let (x_sc, y_sc) be the center point of the figure; it is computed as the mean of the detected corners, formula (1):
x_sc = (x_c1 + x_c2 + ... + x_cN) / N,  y_sc = (y_c1 + y_c2 + ... + y_cN) / N    (1)
S2-3, computing the area of the central figure.
After all corners are computed, connecting them in order forms a unique closed figure, which is the central-figure region. Taking a rectangular region as an example, with the four corners arranged clockwise as (x_c1, y_c1), (x_c2, y_c2), (x_c3, y_c3) and (x_c4, y_c4), the area of the central figure is given by formula (2):
S = |x_c1 − x_c2| * |y_c1 − y_c4|    (2)
S2-4, computing the center coordinates of the positioning aid (the ball).
Taking a ball as the positioning aid, let (x_cc, y_cc) be the center of its circular image. The ball has a clear edge feature and projects as a circle in the image, so fitting that circle yields the center. Circle-fitting methods include Hough-transform circle detection and curve-fitting-based detection, either of which can be used to find (x_cc, y_cc).
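The text leaves the choice of circle detector open (Hough transform or curve fitting). As one hedged illustration, a least-squares (Kåsa) algebraic circle fit over extracted edge points recovers the ball center (x_cc, y_cc); the function below is a self-contained sketch under that assumption, not the method claimed by the patent:

```python
def fit_circle(points):
    """Fit x^2 + y^2 + D*x + E*y + F = 0 to edge points by least squares
    (Kåsa fit) and return (center_x, center_y, radius)."""
    # Accumulate the 3x3 normal equations A^T A p = A^T b, where each row
    # of A is [x, y, 1] and b = -(x^2 + y^2).
    a = [[0.0] * 3 for _ in range(3)]
    rhs = [0.0] * 3
    for x, y in points:
        row = (x, y, 1.0)
        b = -(x * x + y * y)
        for i in range(3):
            for j in range(3):
                a[i][j] += row[i] * row[j]
            rhs[i] += row[i] * b
    # Gaussian elimination with partial pivoting.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, 3):
            f = a[r][col] / a[col][col]
            for j in range(col, 3):
                a[r][j] -= f * a[col][j]
            rhs[r] -= f * rhs[col]
    p = [0.0] * 3
    for r in (2, 1, 0):
        p[r] = (rhs[r] - sum(a[r][j] * p[j] for j in range(r + 1, 3))) / a[r][r]
    d, e, f = p
    cx, cy = -d / 2.0, -e / 2.0
    radius = (cx * cx + cy * cy - f) ** 0.5
    return cx, cy, radius
```

With clean edge points the fit is exact; with noisy points it gives the least-squares circle, which is why it is a common stand-in for Hough circle detection when the ball is already segmented.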
S2-5, adjusting the robot's deviation into a set threshold range according to the difference between the coordinates of the positioning aid and the coordinates of the central figure.
When the robot directly faces the auxiliary label, the difference between the ball-center coordinates and the central-figure coordinates is small. If the robot sees the ball center to the left of the central-figure center, the robot is to the right of the auxiliary label and must adjust to the left; in the opposite case it adjusts to the right, until the distance between the two coordinates falls within the set threshold, at which point the robot is judged to face the label. Let Δd be the distance between the two coordinates and Δx their offset in the x direction; the relationship between the two coordinates is given by formula (3):
Δx = x_cc − x_sc,  Δd = √((x_cc − x_sc)² + (y_cc − y_sc)²)    (3)
Let D0 be the critical distance threshold between the two coordinates; the robot's left/right drift is judged by formula (4):
turn left if Δx < −D0;  turn right if Δx > D0;  facing the label if |Δx| ≤ D0    (4)
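A sketch of the left/right correction decision described above, with the sign convention taken from the text (ball left of figure center → robot is right of the label → turn left) and the threshold name `d0` standing in for D0:

```python
def heading_correction(ball_center, figure_center, d0):
    """Reconstructed decision rule of formulas (3)-(4): compare the
    x-offset between the ball center (x_cc, y_cc) and the central-figure
    center (x_sc, y_sc) against the critical threshold d0."""
    dx = ball_center[0] - figure_center[0]  # Δx = x_cc - x_sc
    if dx < -d0:
        return "turn_left"    # ball appears left: robot is right of label
    if dx > d0:
        return "turn_right"   # ball appears right: robot is left of label
    return "facing_label"     # |Δx| within threshold: robot faces the label
```

The threshold value and the returned action names are illustrative assumptions; the patent only fixes the comparison itself.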
S2-6, determining and adjusting the robot's distance from the auxiliary label.
After the robot faces the auxiliary label, the distance between the robot and the label must be judged so that the robot can prepare to turn into the next row. The distance can be measured with a laser radar, a binocular camera, or area comparison. The embodiment uses area comparison: the fraction of the entire auxiliary-label image occupied by the central figure determines the distance between the robot and the label, the apparent area of the central figure being inversely related to that distance. From this property, the correspondence between the figure's area and the distance can be calibrated for different ranges. When the robot comes within the distance threshold of the label, it halts. Let s_all be the area of the image, k the ratio of the central-figure area to the image area, and k_ref the threshold value of that ratio; the stop condition is given by formula (5):
k = s_figure / s_all,  stop when k ≥ k_ref    (5)
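The area-ratio stop rule of formula (5) reduces to a one-line check; `k_ref` is assumed to come from the calibration of figure area against distance mentioned above:

```python
def should_stop(figure_area, image_area, k_ref):
    """Formula (5): k = s_figure / s_all grows as the robot approaches the
    label; halt once k reaches the calibrated threshold k_ref."""
    k = figure_area / image_area
    return k >= k_ref
```

For example, with a calibrated `k_ref` of 0.25, a central figure covering 30% of the frame means the robot is inside the stopping distance.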
After stopping, the robot rotates about 90° toward the side of the orchard row to be worked.
Through step S2 the robot completes the operation of leaving the orchard row. After leaving the row, it must find the next row of the orchard and continue working.
In the method for the invention, the inlet of new a line is searched out come guided robot by setting auxiliary label.? The farthest side of entrance row distance robot places identical label.Utilize method identical with step S2, guided robot face The label, then adjusts the distance between robot and label, which can control robot and just run to new orchard row The center of inlet.Then trolley is turned 90 ° to orchard row, allows the robot to the entrance for being directed at new orchard row Place, thus the traveling that enters a new line.After robot searches out the inlet of new orchard row, robot straight trip enters new orchard row, starts It is travelled in new orchard row.
Fig. 6 shows a flowchart of another embodiment of the method, comprising: T1, capturing an image of the orchard row around the robot and obtaining the robot's drivable path through a classification-and-fitting image algorithm (the drivable-path computation can be completed off-board and stored in the robot, or performed on-board); T2, the robot computing the center line of the orchard row from the drivable path and traveling along the center line; T3, when the robot travels to the end of the orchard row (which may be regarded as either its exit or its entrance), driving out of the current row under the guidance of the auxiliary label placed at the entrance, the end of the orchard row being provided with an auxiliary label; T4, the robot finding the entrance of the adjacent orchard row via the auxiliary label (the labels at each row entrance can likewise lead it to other rows); T5, the robot entering the new orchard row and continuing to travel.
In one embodiment, step T1 includes:
T1-1, acquiring an image within the orchard row with a camera mounted on the robot, and applying feature-space clustering to the acquired image. For example, the mean-shift algorithm can be used: 1) estimate the feature space of the image with a non-parametric kernel density function; 2) cluster each pixel by the modes of the density function.
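Mean-shift is only named in the text; as an illustrative stand-in, a toy one-dimensional mean-shift over pixel intensities with a flat kernel shows the cluster-by-modes idea (a real implementation would run over the full color/position feature space):

```python
def mean_shift_1d(values, bandwidth, iters=30):
    """Toy mean-shift on scalar intensities: each value is repeatedly moved
    to the mean of all original values within `bandwidth` (a flat kernel
    standing in for the kernel-density estimate described in the text).
    Values whose modes coincide share a cluster."""
    modes = []
    for v in values:
        m = float(v)
        for _ in range(iters):
            window = [u for u in values if abs(u - m) <= bandwidth]
            m = sum(window) / len(window)
        modes.append(m)
    # Assign cluster labels: modes within one bandwidth of each other
    # belong to the same cluster.
    labels, centers = [], []
    for m in modes:
        for i, c in enumerate(centers):
            if abs(c - m) <= bandwidth:
                labels.append(i)
                break
        else:
            centers.append(m)
            labels.append(len(centers) - 1)
    return labels, centers
```

On intensities such as `[10, 12, 11, 200, 205, 198]` the bright path pixels and dark canopy pixels collapse onto two separate modes, which is the separation step T1-2 builds on.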
T1-2, classifying the clustering result with a graph-partitioning algorithm, comprising: 1) defining the graph region to be partitioned; 2) dividing the image into several large regions by a similarity-computation function, which can be uniformly grouped into a road region and non-road regions.
T1-3, filtering, optimizing and extracting the road region, comprising: 1) converting the image partitioned into road and non-road regions into a grayscale image; 2) binarizing the image by a segmentation threshold, for example setting the gray value of the road region to 0 and of the non-road regions to 255.
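The grayscale conversion and threshold binarization of T1-3 can be sketched directly; the luminance weights and the "road below threshold" direction are assumptions, since the text only fixes the 0/255 convention:

```python
def to_gray(rgb_image):
    """Convert an RGB image (rows of (r, g, b) tuples) to grayscale with
    the usual luminance weights (assumed; the text does not specify them)."""
    return [[0.299 * r + 0.587 * g + 0.114 * b for r, g, b in row]
            for row in rgb_image]

def binarize(gray, threshold):
    """Binarize by a segmentation threshold, following the convention in
    the text: road pixels (here, those below the threshold) become 0,
    non-road pixels 255."""
    return [[0 if px < threshold else 255 for px in row] for row in gray]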
T1-4, extracting the road-region boundary lines, preferably with the Hough transform, whose advantage is that an accurate result can be computed even when the boundary points in the image do not lie exactly on the same line.
Step T1 yields the two boundary lines of the road region.
In one embodiment, step T2 includes:
T2-1, after obtaining the two boundary lines of the road region, computing the robot's distance to each of them.
In the embodiment this is realized as follows: take any point on the central column of the image and compute its distances d1 and d2 to the two boundary lines. Let the left boundary line be A1x + B1y + C1 = 0 and the right boundary line A2x + B2y + C2 = 0, and let the chosen point be (x0, y0); the distances are given by formula (6):
d1 = |A1x0 + B1y0 + C1| / √(A1² + B1²),  d2 = |A2x0 + B2y0 + C2| / √(A2² + B2²)    (6)
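Formula (6) is the standard point-to-line distance for a line given as a*x + b*y + c = 0, matching the boundary-line equations defined above:

```python
def point_line_distance(x0, y0, a, b, c):
    """Formula (6): distance from the point (x0, y0) to the line
    a*x + b*y + c = 0."""
    return abs(a * x0 + b * y0 + c) / (a * a + b * b) ** 0.5
```

Calling it once per boundary line with the same image-column point gives the d1 and d2 used by the steering rule of T2-2.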
T2-2, controlling the vehicle so that it travels along the road center line.
Following the principle that the larger a distance value is, the more the vehicle must steer toward that boundary line, the vehicle is controlled to travel along the center of the road. The steering decision is given by formula (7):
steer toward the left boundary if d1 − d2 exceeds a set threshold; steer toward the right boundary if d2 − d1 exceeds it; otherwise keep straight    (7)
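A hedged reconstruction of the steering decision of formula (7): steer toward the boundary that is farther away (the text's "larger distance" principle), with a dead band added here as an assumption to avoid oscillation around the center line:

```python
def steer(d1, d2, dead_band):
    """Reconstructed formula (7): d1 is the distance to the left boundary,
    d2 to the right. Steer toward the farther boundary so the robot
    re-centers between the rows."""
    if d1 - d2 > dead_band:
        return "steer_left"   # left boundary is farther: drifted right
    if d2 - d1 > dead_band:
        return "steer_right"  # right boundary is farther: drifted left
    return "straight"
```

The dead-band value and action names are illustrative; in a real controller the difference d1 − d2 would typically feed a proportional steering command rather than a three-way switch.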
Through step T2 the robot travels autonomously between the orchard rows until it reaches the exit of the row.
Between the rows the robot travels along the center of the current row; when it reaches the end of the row it must drive out of the current row in order to enter the next and continue working. Here an auxiliary label is placed a certain distance straight ahead of the center of the two boundary tree lines to guide the robot out, i.e., the label guides the robot to drive out a certain distance in a specific direction, leaving space for it to turn around and prepare to enter the next row.
The embodiments described above are merely preferred specific embodiments of the invention; ordinary variations and substitutions made by those skilled in the art within the scope of the technical solution of the invention should all be included within the scope of protection of the invention.

Claims (10)

1. An agriculture and forestry park robot navigation method based on vision, characterized by comprising:
S1, placing an auxiliary label at the entrance of each orchard row of the park;
S2, moving the robot to a position in front of the auxiliary label, adjusting its distance and angle relative to the auxiliary label, and then completing the turn.
2. The navigation method according to claim 1, characterized in that:
the auxiliary label is provided with a central figure at its center, and a positioning aid is arranged in front of the label.
3. The navigation method according to claim 2, characterized in that:
the auxiliary label is rectangular, the central figure is a square, and the positioning aid is a ball.
4. The navigation method according to claim 2 or 3, characterized in that S2 comprises:
S2-1, scanning the auxiliary label and computing the corners of the auxiliary label image and of the central figure;
S2-2, computing the center point of the central figure;
S2-3, computing the area of the central figure;
S2-4, computing the center coordinates of the positioning aid;
S2-5, adjusting the robot's deviation into a set threshold range according to the difference between the coordinates of the positioning aid and the coordinates of the central figure;
S2-6, determining and adjusting the robot's distance from the auxiliary label.
5. The navigation method according to claim 4, characterized in that:
in S2-6, the distance between the robot and the auxiliary label is determined from the fraction of the whole image occupied by the central figure of the auxiliary label.
6. The navigation method according to claim 1, characterized by further comprising:
T1, capturing an image of the orchard row around the robot and obtaining the robot's drivable path through a classification-and-fitting image algorithm;
T2, the robot computing the center line of the orchard row from the drivable path and traveling along the center line;
T3, when the robot travels to the entrance of the orchard row, turning according to S1-S2;
T4, the robot finding the entrance of the adjacent orchard row via the auxiliary label;
T5, the robot entering the new orchard row and continuing to travel.
7. The navigation method according to claim 6, characterized in that step T1 further comprises:
T1-1, acquiring an image within the orchard row with a camera mounted on the robot, and applying feature-space clustering to the acquired image;
T1-2, classifying the clustering result with a graph-partitioning algorithm;
T1-3, filtering, optimizing and extracting the road region;
T1-4, extracting the road-region boundary lines.
8. The navigation method according to claim 7, characterized in that step T2 further comprises:
T2-1, after obtaining the two boundary lines of the road region, computing the robot's distance to each of them;
T2-2, controlling the vehicle according to the distances so that it travels along the road center line.
9. The navigation method according to claim 8, characterized in that:
in T2-1, the distance to the two boundary lines is computed for any point on the central column of the image captured by the robot.
10. The navigation method according to claim 7, characterized in that:
in T1-1, the clustering is performed with the mean-shift algorithm, comprising: 1) estimating the feature space of the image with a non-parametric kernel density function; 2) clustering each pixel by the modes of the density function;
T1-2 comprises: 1) defining the graph region to be partitioned; 2) dividing the image into several large regions by a similarity-computation function, grouped into a road region and non-road regions;
T1-3 comprises: 1) converting the image partitioned into road and non-road regions into a grayscale image; 2) binarizing the image by a segmentation threshold.
CN201910084147.6A 2019-01-29 2019-01-29 Agriculture and forestry park robot navigation method based on vision Active CN109753075B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910084147.6A CN109753075B (en) 2019-01-29 2019-01-29 Agriculture and forestry park robot navigation method based on vision


Publications (2)

Publication Number Publication Date
CN109753075A true CN109753075A (en) 2019-05-14
CN109753075B CN109753075B (en) 2022-02-08

Family

ID=66406470

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910084147.6A Active CN109753075B (en) 2019-01-29 2019-01-29 Agriculture and forestry park robot navigation method based on vision

Country Status (1)

Country Link
CN (1) CN109753075B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110262497A (en) * 2019-06-27 2019-09-20 北京物资学院 A kind of semi-structure environment robot navigation control method and device
CN110516563A (en) * 2019-08-06 2019-11-29 西安电子科技大学 Agriculture transplanter intelligence method for path navigation based on DSP
CN113807118A (en) * 2020-05-29 2021-12-17 苏州科瓴精密机械科技有限公司 Robot edgewise working method and system, robot and readable storage medium
CN116048104A (en) * 2023-04-03 2023-05-02 华南农业大学 Orchard operation robot path planning method, system and electronic equipment

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101452292A (en) * 2008-12-29 2009-06-10 天津理工大学 Fish glasses head omnidirectional vision aiming method based on sequence dual-color dot matrix type navigation mark
CN101451849A (en) * 2008-12-26 2009-06-10 天津理工大学 Multifunction marking for vision navigation of mobile object and synthesis navigation method
CN101660908A (en) * 2009-09-11 2010-03-03 天津理工大学 Visual locating and navigating method based on single signpost
CN104181926A (en) * 2014-09-17 2014-12-03 上海畔慧信息技术有限公司 Navigation control method of robot
CN105116885A (en) * 2015-07-16 2015-12-02 江苏大学 Automatic bait casting workboat vision navigation method based on artificial identification
US20160246302A1 (en) * 2014-09-03 2016-08-25 Sharp Laboratories Of America, Inc. Methods and systems for mobile-agent navigation
CN106017477A (en) * 2016-07-07 2016-10-12 西北农林科技大学 Visual navigation system of orchard robot
CN106370185A (en) * 2016-08-31 2017-02-01 北京翰宁智能科技有限责任公司 Mobile robot positioning method and system based on ground datum identifiers
CN106681321A (en) * 2016-12-16 2017-05-17 盐城工学院 RFID-based online scheduling control system of automatic guided vehicle
CN107766859A (en) * 2017-10-31 2018-03-06 广东美的智能机器人有限公司 Method for positioning mobile robot, device and mobile robot

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
M. Devy, "A visual landmark framework for mobile robot navigation", Image and Vision Computing *
张利 (Zhang Li), "Implementation and application research progress of vision sensing systems for agricultural robots", Transactions of the Chinese Society of Agricultural Engineering (农业工程学报) *
张善文 (Zhang Shanwen), Image Processing Based on MATLAB and Genetic Algorithms (基于MATLAB和遗传算法的图像处理), 31 October 2015 *
杜建超 (Du Jianchao), Principles and Applications of Computer Graphics (计算机图形学原理及应用), 30 June 2014 *
柳平增 (Liu Pingzeng), "A review of research on outdoor agricultural robot navigation", Agriculture Network Information (农业网络信息) *
焦李成 (Jiao Licheng), Radar Image Interpretation Techniques (雷达图像解译技术), 31 December 2017 *


Also Published As

Publication number Publication date
CN109753075B (en) 2022-02-08


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant