CN103891697B - Variable-rate spraying method for an indoor autonomous spraying robot - Google Patents


Info

Publication number
CN103891697B
CN103891697B (application CN201410119127.5A)
Authority
CN
China
Prior art keywords
support
pesticide
crop
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201410119127.5A
Other languages
Chinese (zh)
Other versions
CN103891697A (en)
Inventor
刘阳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nantong Vocational College
Original Assignee
Nantong Vocational College
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nantong Vocational College filed Critical Nantong Vocational College
Priority to CN201410119127.5A priority Critical patent/CN103891697B/en
Publication of CN103891697A publication Critical patent/CN103891697A/en
Application granted granted Critical
Publication of CN103891697B publication Critical patent/CN103891697B/en


Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Guiding Agricultural Machines (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a working method for an indoor autonomous spraying robot. The robot comprises a wheeled robot body whose upper part carries a pesticide tank and a control assembly. A movable bracket is fixed to the side of the tank; it consists of a mutually perpendicular transverse support and longitudinal support. A spray boom is fixed to the far end of the transverse support, with a nozzle at its tip and a hose connecting its rear end to the tank. A crop-identification camera and an ultrasonic sensor are mounted on the transverse support, and a navigation camera and an ultrasonic sensor are mounted at the front of the robot body. The variable-rate spraying method effectively reduces the system's image-processing load, achieves accurate, precisely dosed spraying of targets while substantially raising the robot's real working efficiency, and actively promotes the application of spraying robots in greenhouses.

Description

Variable-rate spraying method for an indoor autonomous spraying robot
Technical field
The present invention relates to the field of agricultural plant-protection machinery, and in particular to a variable-rate method by which an indoor autonomous spraying robot sprays indoor row crops.
Background technology
Plant diseases seriously threaten agricultural production, and chemical control by mechanical pesticide application is the main means of disease control. At present the effective utilization rate of pesticides in China is only about 30% (60%–70% in developed countries), which both wastes pesticide and pollutes the environment. Since the 1990s, "precision agriculture" has received great attention worldwide. Precision spraying — applying pesticide "quantitatively and at fixed points" onto targets according to actual need, pursuing the minimum dosage and maximum efficiency of the liquid — is the development direction of pesticide-application technology. Compared with the complex conditions of open fields, the friendly structured environment typified by the greenhouse creates favorable conditions for the early adoption of spraying robots and their precision-spraying technology.
A review of the existing literature shows that related work at home and abroad currently focuses on crop-row detection and on crop/weed recognition, mostly in open-field environments; domestic studies of precise pesticide application by mobile robots inside facilities are rarely reported.
The invention with application number CN200910211739.6, entitled "Intelligent variable-rate sprayer control based on prescription maps", relates to an intelligent variable-rate sprayer that sprays on demand according to disease, pest and crop-growth information obtained via GPS/GIS, comprising an intelligent control system and a spraying system. Its control system works from a prescription map rather than from real-time machine-vision discrimination, lacks visual navigation, and its spraying mode is therefore insufficiently flexible and accurate.
The invention with application number CN201310272642, entitled "Greenhouse automatic spraying robot", relates to a rail-borne greenhouse spraying robot comprising rail-travel components, a liquid-spraying unit fixed on them, and the associated controller. Its main feature is the track laid on the ground, but the absence of autonomous navigation limits the flexibility of its operation.
Summary of the invention
Object of the invention: the present invention overcomes the low efficiency of prior-art precision spraying operations and provides an efficient variable-rate spraying method for short row crops and similarly row-arranged potted landscape plants.
Technical scheme: a variable-rate spraying method for an indoor autonomous spraying robot. The robot comprises a wheeled robot body whose upper part carries a pesticide tank and a control assembly; a movable bracket is fixed to the side of the tank and consists of a mutually perpendicular transverse support and longitudinal support; a spray boom is fixed to the far end of the transverse support, with a nozzle at its tip and a hose connecting its rear end to the tank; a crop-identification camera and an ultrasonic sensor are fixed on the transverse support; a navigation camera and an ultrasonic sensor are mounted at the front of the robot body. The spraying method is a continuous working process based on monocular vision, comprising the following rapid consecutive steps executed in a loop:
A. Read the ranging information from the ultrasonic sensor at the front of the robot body to ensure the direction of travel is unobstructed; otherwise keep ranging and wait until the obstacle disappears;
B. Read the current frame taken by the crop-identification camera, rapidly segment the target crop out of the vertical close-range image by a color-index method, and binarize the image;
C. Judge whether the target crop row in the current frame has ended; if so, terminate the program directly; otherwise proceed to the next step;
D. On the basis of the binary image from step B, fit the crop-row center line and immediately extract two parameters: the yaw angle and the lateral offset;
E. Using the yaw angle and lateral offset extracted in step D, control the navigation motion of the robot body by adaptive fuzzy control so that the optical center of the crop-identification camera vertically tracks the crop-row center line;
F. Each time the robot body travels a set distance, compute the proportion of target pixels in the current navigation frame; from the imaging distance and this proportion, combined with manually entered seasonal (solar-term) information, a pre-trained fuzzy neural network decides and controls the spray flow in real time, achieving precisely dosed target spraying. Then return to step A.
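The looped steps A–F can be summarized in a short control-loop sketch. This is an illustrative skeleton only: every sensor, vision and actuator function below is a hypothetical stub standing in for the hardware and image-processing stages the patent describes.

```python
# Illustrative control-loop skeleton for steps A-F; all functions are
# hypothetical stubs, not the patent's implementation.
def ultrasonic_clear():
    """Step A stub: is the direction of travel unobstructed?"""
    return True

def capture_and_segment():
    """Step B stub: return a binarized crop image."""
    return [[1, 0], [1, 1]]

def row_ended(image, frames_left):
    """Step C stub: has the crop row ended?"""
    return frames_left == 0

def fit_centerline(image):
    """Step D stub: yaw angle and lateral offset of the row center line."""
    return 0.05, 0.02

def steer(yaw, offset):
    """Step E stub: adaptive fuzzy steering command."""

def spray(image):
    """Step F stub: fuzzy-neural-network spray-rate decision."""

def spray_run(max_frames=3):
    """Loop steps A-F until the end of the crop row is detected."""
    frames = max_frames
    while True:
        while not ultrasonic_clear():   # A: wait until obstacle disappears
            pass
        img = capture_and_segment()     # B: segment and binarize
        if row_ended(img, frames):      # C: terminate at end of row
            return "done"
        yaw, off = fit_centerline(img)  # D: extract navigation parameters
        steer(yaw, off)                 # E: track the row center line
        spray(img)                      # F: variable-rate spraying
        frames -= 1                     # then return to A

status = spray_run()
```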
The transverse support comprises mutually nested first and second supports that can extend and retract relative to each other.
The longitudinal support comprises mutually nested third and fourth supports that can extend and retract relative to each other.
Beneficial effects: the disclosed indoor autonomous spraying robot and its variable-rate spraying method fuse what are usually two separate monocular-vision processes — navigation-information extraction and crop detection/recognition — into a single image-processing flow, effectively reducing the image-processing load and markedly speeding up the practical row-spraying operation. While spraying along a crop row, the optical center of the crop-identification camera vertically tracks the row center line, so target detection and recognition are performed in the same scene frame as the efficient indoor navigation. This significantly reduces the system's image-processing load, achieves accurate, precisely dosed target spraying while substantially raising the robot's real working efficiency, and actively promotes the application of spraying robots in greenhouses.
Brief description of the drawings
Fig. 1 is a schematic structural view of the indoor autonomous spraying robot of the present invention;
Fig. 2 is a schematic view of the movable bracket of the present invention;
Fig. 3 is an overall flow chart of the working method of the present invention;
Fig. 4 is a flow chart of the rapid segmentation of the target crop;
Fig. 5 is a flow chart of the extraction of the crop-row center line;
Fig. 6 is a flow chart of the motion control of the wheeled robot body;
Fig. 7 is a flow chart of the precision-spraying decision.
Embodiment
As shown in Figs. 1 and 2, the disclosed indoor autonomous spraying robot comprises a wheeled robot body 1 whose upper part carries a pesticide tank 2 and a control assembly 3. A movable bracket 4, adjustable in vertical height and in lateral extension length, is fixed to the side of the tank 2. The bracket 4 comprises a mutually perpendicular transverse support 41 and longitudinal support 42; the transverse support 41 has mutually nested, extendable and retractable first support 411 and second support 412, and the longitudinal support 42 has mutually nested, extendable and retractable third support 421 and fourth support 422. A spray boom 5 is fastened to the first support 411, on which a crop-identification camera 6 and an ultrasonic sensor 7 are also fixed. A nozzle 8 is mounted at the tip of the spray boom 5, whose rear end is connected to the tank 2 by a hose 9. A navigation camera 11 and an ultrasonic sensor 12 are mounted at the front of the robot body 1.
The transverse support 41 and longitudinal support 42 of the movable bracket 4 can be manually adjusted in extension length according to plant height, so that the liquid spraying suits crops of different heights and row widths.
The indoor autonomous spraying robot of the present invention travels along the crop-row direction while spraying the crops to its side precisely. The crop-identification camera 6, mounted vertically downward on the transverse support 41, performs navigation-information extraction and crop detection/recognition simultaneously: while its optical center vertically tracks the crop-row center line to extract the navigation reference, it also detects the target crop. The two usually separate monocular-vision processes are thus merged into one, simplifying image computation and improving the practical working efficiency of the spraying equipment. Meanwhile, to avoid obstacles in the direction of travel, the ranging information of the ultrasonic sensor 12 at the front of the robot body 1 is collected.
As shown in Fig. 3, the variable-rate spraying method implemented by the above indoor autonomous spraying robot comprises the following rapid consecutive steps, executed in a loop based on monocular vision:
A. Read the ranging information from the ultrasonic sensor 12 at the front of the robot body 1 to ensure the direction of travel is unobstructed; otherwise keep ranging and wait until the obstacle disappears.
B. Read the current frame taken by the crop-identification camera 6 of the robot body 1, rapidly segment the target crop out of the vertical close-range image by a color-index method, and binarize the image.
C. Judge whether the target crop row in the current frame has ended; if so, terminate the program directly; otherwise proceed to the next step.
D. On the basis of the binary image from step B, fit the crop-row center line and immediately extract two parameters: the yaw angle and the lateral offset.
E. Using the yaw angle and lateral offset extracted in step D, control the navigation motion of the robot body 1 by adaptive fuzzy control so that the optical center of the crop-identification camera 6 vertically tracks the crop-row center line.
F. Each time the robot body 1 travels a set distance, compute the proportion of target pixels in the current navigation frame; from the imaging distance and this proportion, combined with manually entered seasonal (solar-term) information, a pre-trained fuzzy neural network decides and controls the spray flow in real time, achieving precisely dosed target spraying. Then return to step A.
The key steps above are described in detail below:
As shown in Fig. 4, the rapid segmentation of the target crop proceeds as follows: first the crop image is captured vertically by the crop-identification camera; the image is then converted to grayscale using a color-feature index, binarized, and finally median-filtered and processed with morphological operations.
Image segmentation extracts the target from the background using machine vision and is the primary step toward precise pesticide application. Among the many segmentation approaches, separating the plant from the background by color features is the most common in plant monitoring. The captured image is usually transformed from RGB into another color space such as HSI or Lab to weaken the influence of ambient illumination on pixel values, but color-space conversion is computationally expensive and unsuitable when real-time requirements are high.
For the most common case of green crops, weighing segmentation time against segmentation quality, the present invention uses the excess-green color index 2G−R−B of the RGB color system for grayscale conversion, and Otsu adaptive thresholding for binarization, which separates green crops from the background well. In general, the excess green of plant parts exceeds that of non-plant parts such as soil — i.e. the G−B and G−R values of green plants are somewhat higher than those of non-green material — so R, G and B are first compared and the gray value is then computed as follows:
f(i, j) = 0, if R > G or B > G
f(i, j) = 2G − R − B, otherwise
After grayscale conversion, and taking illumination variation into account, Otsu adaptive thresholding is applied for binarization, which separates green crops from the background well. Since the thresholded image contains noise and holes, a median filter is applied to the segmented image and morphological operations fill the holes in the background, yielding a fairly complete crop-row target.
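The 2G−R−B grayscale rule and Otsu binarization can be sketched in plain Python. The toy four-pixel image is invented for illustration; a real implementation would operate on full camera frames, and the clamping to 0–255 is an added assumption since 2G−R−B can exceed one byte.

```python
def excess_green(r, g, b):
    # Patent's grayscale rule: 0 if R > G or B > G, else 2G - R - B.
    if r > g or b > g:
        return 0
    return 2 * g - r - b

def otsu_threshold(gray):
    # Otsu adaptive threshold: maximize between-class variance over a
    # 256-bin histogram of gray values.
    hist = [0] * 256
    for v in gray:
        hist[v] += 1
    total = len(gray)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w_b = sum_b = 0
    for t in range(256):
        w_b += hist[t]                  # background weight
        if w_b == 0:
            continue
        w_f = total - w_b               # foreground weight
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b = sum_b / w_b               # background mean
        m_f = (total_sum - sum_b) / w_f  # foreground mean
        var_between = w_b * w_f * (m_b - m_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Toy image: two green (plant) pixels, two soil-like pixels.
pixels = [(30, 120, 40), (40, 150, 50), (120, 100, 90), (130, 90, 80)]
gray = [min(255, excess_green(*p)) for p in pixels]   # clamp to one byte
t = otsu_threshold(gray)
binary = [1 if v > t else 0 for v in gray]
```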
As shown in Fig. 5, the crop-row center line is extracted by first scanning the binary image and averaging the coordinates of the upper- and lower-edge points of the target pixels in order, then fitting the crop-row center line by least squares as the navigation reference line and computing the yaw angle and lateral offset.
On the binary image, to speed up scanning, the upper- and lower-edge coordinates of the target pixel region are scanned from left to right only every fixed number of columns and averaged, yielding discrete center-point coordinates. Computing the discrete midpoint of every column would take longer, so sampling at column intervals reduces the workload: in principle, the larger the interval, the fewer columns need to be processed, the shorter the computation time, and the better the real-time performance of the vision system — but too large an interval inevitably introduces calculation error. Therefore, for each crop type an optimal column interval can be chosen that both shortens computation and avoids calculation error, optimizing execution efficiency.
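A minimal sketch of the column-skipping midpoint extraction and least-squares center-line fit, under the assumption of a small nested-list binary image; the `step` parameter plays the role of the optimal column interval discussed above.

```python
import math

def row_midpoints(binary, step=2):
    # Scan every `step`-th column; average the top and bottom edge rows of
    # the target pixels in that column (the column-skipping idea).
    height, width = len(binary), len(binary[0])
    pts = []
    for x in range(0, width, step):
        ys = [y for y in range(height) if binary[y][x]]
        if ys:
            pts.append((x, (min(ys) + max(ys)) / 2))
    return pts

def least_squares_line(pts):
    # Ordinary least-squares fit of y = m*x + c through the midpoints.
    n = len(pts)
    sx = sum(x for x, _ in pts)
    sy = sum(y for _, y in pts)
    sxx = sum(x * x for x, _ in pts)
    sxy = sum(x * y for x, y in pts)
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    c = (sy - m * sx) / n
    return m, c

# Toy binary frame whose crop row lies exactly along image row y = 2.
frame = [[0] * 6, [0] * 6, [1] * 6, [0] * 6]
mids = row_midpoints(frame, step=2)
m, c = least_squares_line(mids)
yaw_deg = math.degrees(math.atan(m))   # yaw angle of the fitted center line
```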
Agricultural-machinery vision navigation mostly uses the Hough transform for line detection, which is little affected by noise and missing crops and is robust, but has poor real-time performance. Since close-range imaging in this system contains few impurity points, the more efficient least-squares method is chosen to quickly fit the crop-row center line through the extracted discrete navigation points, rapidly obtaining the yaw angle θ between the optical center of the crop camera (the origin of the image physical coordinate system) and the crop-row surface center line, together with the lateral offset λ. Ignoring lens distortion, the yaw angle in the image coordinate system of a vertical image can be taken as equal to that in the world coordinate system, but the lateral offset λ varies with imaging height and must be obtained through the camera-calibration relation. Because the crop-identification camera is mounted vertically, calibration is greatly simplified: with the shooting height fixed, a calibration chart is placed on level ground and the conversion between pixel distance and actual distance is measured. In actual measurement, using a right-triangle model and the imaging height fused from the bracket ultrasonic sensor, the actual lateral offset λ is obtained quickly.
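The pixel-to-ground conversion of the lateral offset can be illustrated under a simple pinhole-camera assumption, where millimetres per pixel scale linearly with the imaging height measured by the bracket ultrasonic sensor; the calibration pair used here is invented for illustration.

```python
def real_offset(pixel_offset, height, ref_height, ref_mm_per_px):
    # Pinhole assumption: ground distance per pixel grows linearly with
    # the imaging height reported by the bracket ultrasonic sensor.
    return pixel_offset * ref_mm_per_px * (height / ref_height)

# Hypothetical calibration: at 0.5 m height, 1 pixel spans 2 mm on the ground.
lam_mm = real_offset(pixel_offset=40, height=0.75,
                     ref_height=0.5, ref_mm_per_px=2.0)
```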
As shown in Fig. 6, the motion-control flow of the wheeled robot body is: obtain the current yaw angle and lateral offset; convert them to the fuzzy domain and map them to the corresponding fuzzy subsets and membership degrees; obtain the fuzzy subset and membership of the control quantity through adaptive fuzzy inference; defuzzify by the centroid method; and drive the wheeled robot body accordingly.
Current wheeled-mobile-robot motion control divides into vehicle-model methods, represented by linear-model and optimal control, and intelligent methods, represented by fuzzy control and neural-network control. Because the wheeled mechanism itself is strongly nonlinear and uncertain, and because ground conditions, tires and linkages are far from ideal and complex, building an accurate and reasonable model is difficult, so vehicle-model control is avoided. Fuzzy control, by contrast, needs no precise mathematical model and is robust; but plain fuzzy control is not fine-grained and has poor steady-state accuracy. Therefore, to improve control precision and dynamic performance, fuzzy control with an adaptive scale factor is adopted: compared with plain fuzzy control, its control law self-adjusts according to the actual deviation.
In the robot's path tracking, the main actions are straight travel and differential-speed turning. The forward speed is fixed during operation; from the yaw angle and lateral offset extracted in the previous stage, the speed difference between the two sides of the body is regulated by computing the pulse value to set for the left/right motor in each control cycle, driving the robot straight ahead or into a differential turn so that the camera's optical center vertically tracks the crop-row center line.
Concretely, the two inputs of the two-dimensional fuzzy control are the yaw angle θ and the lateral offset λ, and the output is the expected speed difference u between the two sides of the two-wheel model. The fuzzy control law can be expressed as:
u = −[αλ + (1 − α)θ],  α ∈ [0, 1]
Here α is the rule-selection factor, also called the weighting factor. Adjusting α changes the relative weighting of the yaw angle θ and the lateral offset λ, tuning the control law: when the lateral offset λ is large, λ is weighted more heavily so that the lateral position error is eliminated quickly; when λ is small, θ is weighted more heavily to improve system stability. The weighting factor can be determined quickly by linear interpolation over specific experimental data. In each control cycle, the system computes from the following formulas the different pulse counts to send to the left and right motors, controlling straight travel and differential turning according to the actual state.
v(K) = [v_L(K) + v_R(K)] / 2
v_L(K+1) = v(K) + u(K)/2
v_R(K+1) = v(K) − u(K)/2
where v_L, v_R and v denote the left wheel speed, the right wheel speed and the speed of the robot's center of mass, respectively.
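The weighted control law and the wheel-speed update equations above can be checked with a few lines of Python; the numeric inputs are illustrative only.

```python
def expected_speed_diff(theta, lam, alpha):
    # Weighted fuzzy rule from the text: u = -[alpha*lambda + (1-alpha)*theta].
    assert 0.0 <= alpha <= 1.0
    return -(alpha * lam + (1 - alpha) * theta)

def wheel_update(v_l, v_r, u):
    # Keep the centroid speed v(K) and split the expected speed
    # difference u(K) evenly between the two wheels.
    v = (v_l + v_r) / 2
    return v + u / 2, v - u / 2

# Large lateral offset: weight lambda heavily (alpha near 1) to
# eliminate the lateral position error quickly.
u = expected_speed_diff(theta=0.1, lam=0.5, alpha=0.8)
v_left, v_right = wheel_update(1.0, 1.0, u)
```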
As shown in Fig. 7, the spraying decision and control flow is: after the robot has traveled a set distance, compute the imaging-area proportion of the target pixels in the current image (the actual crop area cannot be obtained directly because the imaging distance varies); then extract and fuse the bracket ultrasonic data into the imaging distance; import the manually entered seasonal (solar-term) factor; and fuse these three inputs so that the fuzzy neural network decides the spray amount, controlling the spray flow in real time. Once the target spraying is complete, the spraying robot continues forward and enters the next cycle according to the flow of Fig. 3.
The imaging distance comes from continuous measurement by the ultrasonic sensor mounted on the bracket next to the camera. To obtain it correctly and reduce measurement error, a segment of data up to and including the imaging moment is extracted and combined by simple weighted fusion into the final imaging distance.
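The area-proportion computation of step F, with a distance correction reflecting that pixel area shrinks with the square of the imaging distance, might be sketched as follows; the reference calibration constants are assumptions, not values from the patent.

```python
def target_area_ratio(binary):
    # Step F: fraction of target (value-1) pixels in the frame.
    total = sum(len(row) for row in binary)
    return sum(sum(row) for row in binary) / total

def distance_corrected_area(ratio, distance, ref_distance, ref_area):
    # The same canopy covers fewer pixels when imaged from farther away,
    # so scale by the squared imaging distance; ref_distance and ref_area
    # are hypothetical calibration constants.
    return ratio * (distance / ref_distance) ** 2 * ref_area

frame = [[1, 1, 0, 0],
         [1, 0, 0, 0]]
ratio = target_area_ratio(frame)                     # 3 of 8 pixels
area = distance_corrected_area(ratio, distance=1.2,
                               ref_distance=0.6, ref_area=0.5)
```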
The present invention can also be implemented in other specific forms within its spirit and scope. For example, the above embodiment targets green crops, but the invention is not limited to them: spraying crop rows of other characteristic colors works the same way once a suitable color-feature index is chosen for grayscale conversion. Likewise, steps such as the extraction of the crop-row center line and the motion control of the mobile platform can be replaced by various alternative methods without changing the essential efficiency of the monocular-vision continuous working process stated by the invention. The embodiment should be regarded as illustrative, and the invention may vary within the scope of the claims and their full range of equivalents.

Claims (3)

1. A variable-rate spraying method for an indoor autonomous spraying robot, characterized in that: the robot comprises a wheeled robot body whose upper part carries a pesticide tank and a control assembly; a movable bracket is fixed to the side of the tank and comprises a mutually perpendicular transverse support and longitudinal support; a spray boom is fixed to the far end of the transverse support, with a nozzle at its tip and a hose connecting its rear end to the tank; a crop-identification camera and an ultrasonic sensor are fixed on the transverse support; a navigation camera and an ultrasonic sensor are mounted at the front of the robot body; the variable-rate spraying method of the robot is a continuous working process based on monocular vision, comprising the following rapid consecutive steps executed in a loop:
A. Read the ranging information from the ultrasonic sensor at the front of the robot body to ensure the direction of travel is unobstructed; otherwise keep ranging and wait until the obstacle disappears;
B. Read the current frame taken by the crop-identification camera of the robot body, rapidly segment the target crop out of the vertical close-range image by a color-index method, and binarize the image;
C. Judge whether the target crop row in the current frame has ended; if so, terminate the program directly; otherwise proceed to the next step;
D. On the basis of the binary image from step B, fit the crop-row center line and immediately extract two parameters: the yaw angle and the lateral offset;
E. Using the yaw angle and lateral offset extracted in step D, control the navigation motion of the robot body by adaptive fuzzy control so that the optical center of the crop-identification camera vertically tracks the crop-row center line;
F. Each time the robot body travels a set distance, compute the proportion of target pixels in the current navigation frame; from the imaging distance and this proportion, combined with manually entered seasonal (solar-term) information, a pre-trained fuzzy neural network decides and controls the spray flow in real time, achieving precisely dosed target spraying; then return to step A.
2. The variable-rate spraying method for an indoor autonomous spraying robot according to claim 1, characterized in that: the transverse support comprises mutually nested first and second supports that can extend and retract relative to each other.
3. The variable-rate spraying method for an indoor autonomous spraying robot according to claim 1, characterized in that: the longitudinal support comprises mutually nested third and fourth supports that can extend and retract relative to each other.
CN201410119127.5A 2014-03-28 2014-03-28 Variable-rate spraying method for an indoor autonomous spraying robot Expired - Fee Related CN103891697B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410119127.5A CN103891697B (en) 2014-03-28 2014-03-28 Variable-rate spraying method for an indoor autonomous spraying robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410119127.5A CN103891697B (en) 2014-03-28 2014-03-28 Variable-rate spraying method for an indoor autonomous spraying robot

Publications (2)

Publication Number Publication Date
CN103891697A CN103891697A (en) 2014-07-02
CN103891697B true CN103891697B (en) 2015-08-12

Family

ID=50983531

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410119127.5A Expired - Fee Related CN103891697B (en) 2014-03-28 2014-03-28 Variable-rate spraying method for an indoor autonomous spraying robot

Country Status (1)

Country Link
CN (1) CN103891697B (en)

Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104082268B (en) * 2014-07-08 2015-05-13 西北农林科技大学 Self-adaptive orchard sprayer
CN104186451B (en) * 2014-08-19 2017-06-20 西北农林科技大学 A kind of deinsectization weeding spray robot based on machine vision
CN104396927B (en) * 2014-12-05 2016-09-14 重庆利贞元农林科技有限公司 Fruit tree orientation spray method
CN104488843B (en) * 2014-12-17 2017-06-06 江苏大学 The wind spraying aid type variable rate spray spray boom automatic tracking system and control method of a kind of use laser sensor
CN104574405B (en) * 2015-01-15 2018-11-13 北京天航华创科技股份有限公司 A kind of coloured image threshold segmentation method based on Lab space
CN104615059B (en) * 2015-02-09 2017-03-22 聊城大学 Control system of spraying robot for hyphantria cunea larva net curtain
CN104742734B (en) * 2015-03-30 2017-03-15 钟长瑞 A kind of multifunctional assembled farm-oriented motor-driven platform truck of four-wheel drive
CN105528004B (en) * 2015-12-02 2018-02-02 安徽农业大学 A kind of brainpower insufflation machine and brainpower insufflation method
CN105638389B (en) * 2016-02-18 2019-04-16 中国农业科学院农田灌溉研究所 A kind of Multi-functional self-actuated walks pergola on the city road irrigator
CN106774420B (en) * 2017-01-23 2019-11-05 东莞理工学院 A kind of automation agriculture pollination method based on micro-robot
CN106908062B (en) * 2017-04-21 2019-08-13 浙江大学 A kind of self-propelled chlorophyll fluorescence Image Acquisition robot and its acquisition method
CN107593629B (en) * 2017-10-31 2024-04-26 四川省农业机械科学研究院 Intelligent robot and intelligent silkworm house
CN109042595B (en) * 2018-08-13 2021-10-12 江苏大学 Telescopic spraying mechanism carried on rail moving device of greenhouse sprayer
CN110328666A (en) * 2019-07-16 2019-10-15 汕头大学 Identifying system and material mechanism for picking
CN110710519A (en) * 2019-10-08 2020-01-21 桂林理工大学 Intelligent pesticide spraying robot
CN110909668B (en) * 2019-11-20 2021-02-19 广州极飞科技有限公司 Target detection method and device, computer readable storage medium and electronic equipment
CN111436414B (en) * 2020-04-01 2021-11-23 江苏大学 Greenhouse strawberry canopy inner circumference wind-conveying pesticide applying robot and implementation method thereof
CN112189645B (en) * 2020-10-27 2023-12-15 江苏大学 Double-online medicine mixing sprayer suitable for intercropping and working method
CN112690265B (en) * 2020-12-24 2022-09-06 吉林农业大学 Timed pesticide spraying device and method for pest control
CN113100207B (en) * 2021-04-14 2022-11-22 郑州轻工业大学 Accurate formula pesticide applying robot system based on wheat disease information and pesticide applying method
CN113973607B (en) * 2021-09-14 2023-09-08 山东省农业科学院作物研究所 Self-propelled maize leaf monitoring and marking device
CN113925036A (en) * 2021-09-19 2022-01-14 南京农业大学 Accurate automatic pesticide applying device based on machine vision
CN117397661A (en) * 2023-10-27 2024-01-16 佛山市天下谷科技有限公司 Medicine spraying control method of medicine spraying robot and medicine spraying robot

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4132637C2 (en) * 1991-10-01 1995-04-27 Walter Prof Dr Kuehbauch Process for controlled weed control
US5348226A (en) * 1992-11-12 1994-09-20 Rhs Fertilizing/Spraying Systems Spray boom system with automatic boom end height control
CN201541615U (en) * 2009-10-16 2010-08-11 西北农林科技大学 Variable-amplitude spraying machine
CN101807252B (en) * 2010-03-24 2012-03-28 中国农业大学 Crop row center line extraction method and system
CN102428904B (en) * 2011-09-20 2013-11-20 上海交通大学 Automatic targeting and variable atomizing flow control system for weeding robot
CN102914967B (en) * 2012-09-21 2015-01-28 浙江工业大学 Autonomous navigation and man-machine coordination picking operating system of picking robot
CN103196441A (en) * 2013-03-19 2013-07-10 江苏大学 Integrated navigation method and system for a spraying machine
CN103530643A (en) * 2013-10-11 2014-01-22 中国科学院合肥物质科学研究院 Pesticide positioned spraying method and system on basis of crop interline automatic identification technology
CN203762122U (en) * 2014-03-28 2014-08-13 南通职业大学 Indoor autonomous mobile pesticide spraying robot

Also Published As

Publication number Publication date
CN103891697A (en) 2014-07-02

Similar Documents

Publication Publication Date Title
CN103891697B (en) Variable-rate spraying method for an indoor autonomous spraying robot
Meng et al. Development of agricultural implement system based on machine vision and fuzzy control
CN109885063A (en) A kind of application robot farmland paths planning method merging vision and laser sensor
CN107798330A (en) A kind of weld image characteristics information extraction method
CN106909148A (en) Based on the unmanned air navigation aid of agricultural machinery that farm environment is perceived
CN104751151B (en) A kind of identification of multilane in real time and tracking
CN110243372A (en) Intelligent agricultural machinery navigation system and method based on machine vision
CN105486228B (en) A kind of trees target volume method for real-time measurement based on two dimensional laser scanning instrument
CN106584451A (en) Visual navigation based transformer substation automatic composition robot and method
CN102914967A (en) Autonomous navigation and man-machine coordination picking operating system of picking robot
CN102172233A (en) Method for carrying out real-time identification and targeted spraying on cotton field weeds
CN113057154A (en) Greenhouse liquid medicine spraying robot
WO1996017279A1 (en) Vehicle guidance system
CN203762122U (en) Indoor autonomous mobile pesticide spraying robot
CN113065562A (en) Crop ridge row extraction and leading route selection method based on semantic segmentation network
CN108196538A (en) Field robots autonomous navigation system and method based on three-dimensional point cloud model
CN104268551A (en) Steering angle control method based on visual feature points
CN207139809U (en) A kind of agriculture inspecting robot with navigation barrier avoiding function
Wang et al. The identification of straight-curved rice seedling rows for automatic row avoidance and weeding system
AU691051B2 (en) Vehicle guidance system
Chen et al. Development and performance test of a height-adaptive pesticide spraying system
CN107578447A (en) A kind of crop ridge location determining method and system based on unmanned plane image
CN115451965B (en) Relative heading information detection method for transplanting system of transplanting machine based on binocular vision
Peng et al. A combined visual navigation method for greenhouse spray robot
CN115280960B (en) Combined harvester steering control method based on field vision SLAM

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20150812

Termination date: 20190328
