CN111553258B - Tea garden identification weeding method by utilizing convolutional neural network - Google Patents

Tea garden identification weeding method using a convolutional neural network

Info

Publication number
CN111553258B
CN111553258B (application CN202010337143.7A)
Authority
CN
China
Prior art keywords
picture, small, weeding, turning, tea
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010337143.7A
Other languages
Chinese (zh)
Other versions
CN111553258A (en)
Inventor
王根
江晓明
王鑫瑞
张金梅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangsu University
Original Assignee
Jiangsu University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangsu University filed Critical Jiangsu University
Priority to CN202010337143.7A priority Critical patent/CN111553258B/en
Publication of CN111553258A publication Critical patent/CN111553258A/en
Application granted granted Critical
Publication of CN111553258B publication Critical patent/CN111553258B/en
Legal status: Active

Classifications

    • G06V 20/10: Scenes; scene-specific elements; terrestrial scenes
    • G06F 18/214: Pattern recognition; generating training patterns; bootstrap methods, e.g. bagging or boosting
    • G06N 3/045: Neural network architectures; combinations of networks
    • G06T 7/11: Image analysis; region-based segmentation
    • G06T 7/136: Segmentation; edge detection involving thresholding
    • G06T 2207/30188: Image analysis indexing scheme; vegetation; agriculture


Abstract

The invention relates to a tea garden identification weeding method using a convolutional neural network, and belongs to the field of pattern recognition. A robot first collects fixed-angle images in real time while moving between the tea rows; each image is divided by horizontal and vertical grid lines, and the resulting small pictures are classified by a convolutional neural network. Because the tea garden environment is complex, pictures classified as "mixed" undergo a second round of convolutional neural network identification. From the final results, the positions of tea, grass and other objects are obtained, and weeding is carried out automatically at those positions. The invention reduces manpower consumption and improves the weeding efficiency and weeding quality of the tea garden.

Description

Tea garden identification weeding method by utilizing convolutional neural network
Technical Field
The invention belongs to the field of pattern recognition, and in particular relates to a tea garden identification weeding method using a convolutional neural network.
Background
Tea gardens have long suffered from weeds, which compete with the tea plants for fertilizer and moisture and thereby affect the growth and quality of the tea. Weeding has therefore become an important part of tea garden work. A traditional weeder has two wheels and a high-speed motor; the operator pulls the machine by hand so that the rapidly rotating cutter head at its front removes the weeds. Although this is an improvement over purely manual weed removal, the operator must still walk the field holding a large machine, which increases the labor intensity for tea farmers. Moreover, because of its high horsepower, the weeder easily raises clouds of dust during operation, harming the operator's respiratory health; and because the machine and its cutter head are bulky, it can only remove weeds far from the tea trees, since removing weeds close to the trees risks damaging them.
Disclosure of Invention
To address these problems, the invention provides a tea garden identification weeding method using a convolutional neural network, so as to reduce manpower consumption and improve weeding efficiency and weeding quality.
To achieve this purpose, the specific technical scheme of the invention is as follows: a tea garden identification weeding method using a convolutional neural network comprises the following steps:
1) Image acquisition: drive the intelligent weeding robot along a pre-planned path and take pictures;
2) Image processing: divide each acquired image into N pictures of the same size, judge the type of each picture manually, and add label information to each picture according to its type; the picture types are: "tea", "grass", "other" and "mixed";
3) Picture learning: train the neural network in a supervised manner; the neural network is a GoogLeNet whose first-layer and last-layer convolution kernels are 1×1;
4) Real-time weeding.
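Step 2) divides each acquired image into N same-size pictures along a grid. The patent does not specify the grid shape or how remainder pixels are handled, so both are assumptions in the following minimal numpy sketch:

```python
import numpy as np

def split_into_tiles(image, rows, cols):
    """Split an H x W image into rows*cols equal-size tiles.

    Illustrative sketch of step 2); the grid shape and the cropping
    of any remainder pixels are assumptions, not taken from the patent.
    """
    h, w = image.shape[0] // rows, image.shape[1] // cols
    # Crop any remainder so every tile has exactly the same size.
    image = image[: h * rows, : w * cols]
    return [image[r * h:(r + 1) * h, c * w:(c + 1) * w]
            for r in range(rows) for c in range(cols)]

# Example: a dummy 8x8 "image" split into a 2x2 grid of 4 tiles.
tiles = split_into_tiles(np.arange(64).reshape(8, 8), 2, 2)
```

Each tile would then be labeled manually as "tea", "grass", "other" or "mixed" to form the training set.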
Further, in step 1), the intelligent weeding robot comprises a mechanical arm, a small mechanical weeding gear, a chain-type tire trolley, a camera, a control module and a power module. The mechanical arm and the weeding gear perform the weeding; the trolley provides locomotion; the camera captures real-time images of tea and grass in the field, is mounted at 2/3 of the tea-tree height, and faces the tea trees; the control module performs identification and sends weeding commands; the power module drives the trolley and the control module.
Further, in step 1), the pre-planned path is: walk back and forth between two rows of tea trees while keeping the spacing to the trees within a fixed range.
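Step 3) above specifies a GoogLeNet whose first-layer and last-layer convolution kernels are 1×1. A 1×1 convolution is simply a per-pixel linear map across channels: it changes the channel count without touching the spatial resolution. A minimal numpy illustration (the array shapes are assumptions for the example, not taken from the patent):

```python
import numpy as np

def conv1x1(x, w):
    """Apply a 1x1 convolution.

    x: (H, W, C_in) feature map; w: (C_in, C_out) kernel weights.
    The same linear combination of channels is applied independently
    at every pixel, which is all a 1x1 convolution does.
    """
    return x @ w

x = np.ones((4, 4, 3))       # toy 4x4 feature map with 3 channels
w = np.full((3, 2), 0.5)     # 1x1 kernel mapping 3 -> 2 channels
y = conv1x1(x, w)            # shape (4, 4, 2); spatial size unchanged
```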
Further, the real-time weeding in step 4) comprises the following steps:
4.1) Start the intelligent weeding robot and turn on the camera;
4.2) Move the robot a fixed distance and capture an image;
4.3) Divide the captured image into N small pictures n of the same size;
4.4) Take out a small picture n and input it into the neural network for recognition; classification regression yields judgment percentages for "tea", "grass", "other" and "mixed";
4.5) Take the highest of the judgment percentages as the network's prediction and use it as the recognition result, i.e. the type of the small picture n;
4.6) If the type of the small picture n is "grass", start the small mechanical weeding gear to weed, then go to step 4.7); if the type is "mixed", go directly to step 4.7);
4.7) Judge whether all small pictures n have been processed; if so, go to step 4.8), otherwise go to step 4.4);
4.8) Judge whether any "mixed" small picture exists; if so, go to step 4.9), otherwise go to step 4.2);
4.9) Take out one "mixed" small picture in turn and denote it H;
4.10) Expand the "mixed" small picture H horizontally and vertically to construct a mixed rectangular region R;
4.11) Divide the mixed rectangular region R into Q (Q < N) small pictures q of the same size;
4.12) Judge whether the small pictures q are larger than the network's input-picture size threshold; if so, go to step 4.13), otherwise go to step 4.18);
4.13) Take out a small picture q in turn and input it into the neural network for recognition; classification regression yields judgment percentages for "tea", "grass" and "other";
4.14) Take the highest of the judgment percentages as the network's prediction and use it as the recognition result, i.e. the type of the small picture q;
4.15) If the type of the small picture q is "tea" or "other", go to step 4.16); if it is "grass", start the small mechanical weeding gear to weed, then go to step 4.16);
4.16) Judge whether all small pictures q have been processed; if so, go to step 4.17), otherwise go to step 4.13);
4.17) Judge whether all "mixed" small pictures have been processed; if so, go to step 4.2), otherwise go to step 4.9);
4.18) Estimate the classification result of the small picture by averaging over the "mixed" small picture H and its surrounding neighbour pictures, then go to step 4.19);
4.19) If the "mixed" small picture H is judged to be "tea", go to step 4.17); if it is judged to be "grass", start the small mechanical weeding gear to weed, then go to step 4.17).
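The first-pass loop above (steps 4.4) to 4.8)) can be sketched as follows. `classify` is a stand-in stub for the CNN's classification regression, and the judgment percentages it returns are invented for illustration; a real implementation would run the trained GoogLeNet forward pass:

```python
# Hedged sketch of the first-pass loop: classify each tile, weed on
# "grass", and queue "mixed" tiles for the second pass (steps 4.9 ff.).
CLASSES = ("tea", "grass", "other", "mixed")

def classify(tile):
    # Placeholder for the neural network forward pass (an assumption);
    # returns judgment percentages for the four classes.
    return {"tea": 0.1, "grass": 0.7, "other": 0.1, "mixed": 0.1}

def first_pass(tiles, weed):
    mixed = []
    for i, tile in enumerate(tiles):
        scores = classify(tile)
        label = max(scores, key=scores.get)  # step 4.5: highest percentage wins
        if label == "grass":
            weed(i)                          # step 4.6: start the weeding gear
        elif label == "mixed":
            mixed.append(i)                  # defer to the second pass
    return mixed

weeded = []
mixed = first_pass(["tile_a", "tile_b"], weeded.append)
```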
Further, in step 4.10), constructing the mixed rectangular region R comprises the steps of:
4.10.1) Expand horizontally from H until a small picture with a different label or the image boundary is reached, obtaining a horizontal region A of small pictures;
4.10.2) Expand vertically from H until a small picture with a different label or the image boundary is reached, obtaining a vertical region B of small pictures;
4.10.3) Construct the mixed rectangular region R with the horizontal region A and the vertical region B as its sides.
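Steps 4.10.1) to 4.10.3) can be sketched on a grid of tile labels: expand from a "mixed" tile left/right and up/down while the neighbours are also "mixed", then take the bounding rectangle of the two runs. The grid representation (a list of rows of label strings) is an assumption for illustration:

```python
def mixed_rectangle(labels, r, c):
    """Return ((r0, c0), (r1, c1)) bounding the region R around tile (r, c)."""
    rows, cols = len(labels), len(labels[0])
    c0 = c1 = c
    while c0 > 0 and labels[r][c0 - 1] == "mixed":         # 4.10.1: expand left
        c0 -= 1
    while c1 < cols - 1 and labels[r][c1 + 1] == "mixed":  # 4.10.1: expand right
        c1 += 1
    r0 = r1 = r
    while r0 > 0 and labels[r0 - 1][c] == "mixed":         # 4.10.2: expand up
        r0 -= 1
    while r1 < rows - 1 and labels[r1 + 1][c] == "mixed":  # 4.10.2: expand down
        r1 += 1
    return (r0, c0), (r1, c1)                              # 4.10.3: region R

grid = [["tea",   "mixed", "tea"],
        ["mixed", "mixed", "mixed"],
        ["tea",   "mixed", "tea"]]
region = mixed_rectangle(grid, 1, 1)  # expands to the full 3x3 cross extent
```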
The invention provides a tea garden weeding method based on a convolutional neural network, combining the neural network with tea garden weeding: images of the tea trees captured by a robot walking through the garden are analysed and identified, and instructions are sent to the mechanical arm according to the recognition results to clear the identified target grass. This greatly increases the efficiency and quality of tea garden work and reduces manpower consumption, freeing labor for places where it is more needed and thereby improving the quality of the tea leaves.
Drawings
FIG. 1 is a flow chart of a tea garden identification weeding method.
Detailed Description
The invention will be further described with reference to the drawing and a specific embodiment. Note that only one optimized technical solution is described in detail; the scope of the invention is not limited to it.
The example is a preferred embodiment of the present invention, but the invention is not limited to it: any obvious modification, substitution or variation that can be made by one skilled in the art without departing from the spirit of the invention falls within its scope.
A tea garden identification weeding method using a convolutional neural network comprises the following steps:
1) Image acquisition: drive the intelligent weeding robot along a pre-planned path and take pictures; wherein:
the intelligent weeding robot comprises a mechanical arm, a small mechanical weeding gear, a chain-type tire trolley, a camera, a control module and a power module. The mechanical arm and the weeding gear perform the weeding; the trolley provides locomotion; the camera captures real-time images of tea and grass in the field, is mounted at 2/3 of the tea-tree height, and faces the tea trees; the control module performs identification and sends weeding commands; the power module drives the trolley and the control module.
The pre-planned path is: walk back and forth between two rows of tea trees while keeping the spacing to the trees within a fixed range.
2) Image processing: divide each acquired image into N pictures of the same size, judge the type of each picture manually, and add label information to each picture according to its type; the picture types are: "tea", "grass", "other" and "mixed";
3) Picture learning: train the neural network in a supervised manner; the neural network is a GoogLeNet whose first-layer and last-layer convolution kernels are 1×1;
4) Real-time weeding. As shown in FIG. 1, this comprises the following steps:
4.1) Start the intelligent weeding robot and turn on the camera;
4.2) Move the robot a fixed distance and capture an image;
4.3) Divide the captured image into N small pictures n of the same size;
4.4) Take out a small picture n and input it into the neural network for recognition; classification regression yields judgment percentages for "tea", "grass", "other" and "mixed";
4.5) Take the highest of the judgment percentages as the network's prediction and use it as the recognition result, i.e. the type of the small picture n;
4.6) If the type of the small picture n is "grass", start the small mechanical weeding gear to weed, then go to step 4.7); if the type is "mixed", go directly to step 4.7);
4.7) Judge whether all small pictures n have been processed; if so, go to step 4.8), otherwise go to step 4.4);
4.8) Judge whether any "mixed" small picture exists; if so, go to step 4.9), otherwise go to step 4.2);
4.9) Take out one "mixed" small picture in turn and denote it H;
4.10) Expand the "mixed" small picture H horizontally and vertically to construct a mixed rectangular region R; the construction comprises:
4.10.1) Expand horizontally from H until a small picture with a different label or the image boundary is reached, obtaining a horizontal region A of small pictures;
4.10.2) Expand vertically from H until a small picture with a different label or the image boundary is reached, obtaining a vertical region B of small pictures;
4.10.3) Construct the mixed rectangular region R with the horizontal region A and the vertical region B as its sides;
4.11) Divide the mixed rectangular region R into Q (Q < N) small pictures q of the same size;
4.12) Judge whether the small pictures q are larger than the network's input-picture size threshold; if so, go to step 4.13), otherwise go to step 4.18);
4.13) Take out a small picture q in turn and input it into the neural network for recognition; classification regression yields judgment percentages for "tea", "grass" and "other";
4.14) Take the highest of the judgment percentages as the network's prediction and use it as the recognition result, i.e. the type of the small picture q;
4.15) If the type of the small picture q is "tea" or "other", go to step 4.16); if it is "grass", start the small mechanical weeding gear to weed, then go to step 4.16);
4.16) Judge whether all small pictures q have been processed; if so, go to step 4.17), otherwise go to step 4.13);
4.17) Judge whether all "mixed" small pictures have been processed; if so, go to step 4.2), otherwise go to step 4.9);
4.18) Estimate the classification result of the small picture by averaging over the "mixed" small picture H and its surrounding neighbour pictures, then go to step 4.19);
4.19) If the "mixed" small picture H is judged to be "tea", go to step 4.17); if it is judged to be "grass", start the small mechanical weeding gear to weed, then go to step 4.17).
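Step 4.18) estimates a too-small "mixed" tile by averaging over the tile and its neighbours. The patent does not state the exact averaging rule, so equal weighting of the judgment percentages is an assumption in this sketch:

```python
def estimate_by_neighbours(score_dicts):
    """Average the judgment percentages of a tile and its neighbours,
    then pick the class with the highest average (step 4.18)."""
    classes = score_dicts[0].keys()
    avg = {k: sum(d[k] for d in score_dicts) / len(score_dicts)
           for k in classes}
    return max(avg, key=avg.get)

# Invented percentages for illustration only.
label = estimate_by_neighbours([
    {"tea": 0.5, "grass": 0.3, "other": 0.2},  # the "mixed" tile H itself
    {"tea": 0.1, "grass": 0.8, "other": 0.1},  # neighbour above
    {"tea": 0.2, "grass": 0.6, "other": 0.2},  # neighbour below
])
```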
Compared with traditional manual weeding, the invention greatly increases the weeding efficiency of the tea garden, improves the working quality and reduces manpower consumption.

Claims (1)

1. A tea garden identification weeding method using a convolutional neural network, characterized by comprising the following steps:
1) Image acquisition: drive the intelligent weeding robot along a pre-planned path and take pictures;
2) Image processing: divide each acquired image into N pictures of the same size, judge the type of each picture manually, and add label information to each picture according to its type; the picture types are: "tea", "grass", "other" and "mixed";
3) Picture learning: train the neural network in a supervised manner; the neural network is a GoogLeNet whose first-layer and last-layer convolution kernels are 1×1;
4) Real-time weeding;
wherein the intelligent weeding robot in step 1) comprises a mechanical arm, a small mechanical weeding gear, a chain-type tire trolley, a camera, a control module and a power module; the mechanical arm and the weeding gear perform the weeding; the trolley provides locomotion; the camera captures real-time images of tea and grass in the field, is mounted at 2/3 of the tea-tree height, and faces the tea trees; the control module performs identification and sends weeding commands; the power module drives the trolley and the control module;
the path pre-planned in step 1) is: walk back and forth between two rows of tea trees while keeping the spacing to the trees within a fixed range;
the real-time weeding in step 4) comprises the following steps:
4.1) Start the intelligent weeding robot and turn on the camera;
4.2) Move the robot a fixed distance and capture an image;
4.3) Divide the captured image into N small pictures n of the same size;
4.4) Take out a small picture n and input it into the neural network for recognition; classification regression yields judgment percentages for "tea", "grass", "other" and "mixed";
4.5) Take the highest of the judgment percentages as the network's prediction and use it as the recognition result, i.e. the type of the small picture n;
4.6) If the type of the small picture n is "grass", start the small mechanical weeding gear to weed, then go to step 4.7); if the type is "mixed", go directly to step 4.7);
4.7) Judge whether all small pictures n have been processed; if so, go to step 4.8), otherwise go to step 4.4);
4.8) Judge whether any "mixed" small picture exists; if so, go to step 4.9), otherwise go to step 4.2);
4.9) Take out one "mixed" small picture in turn and denote it H;
4.10) Expand the "mixed" small picture H horizontally and vertically to construct a mixed rectangular region R;
4.11) Divide the mixed rectangular region R into Q (Q < N) small pictures q of the same size;
4.12) Judge whether the small pictures q are larger than the network's input-picture size threshold; if so, go to step 4.13), otherwise go to step 4.18);
4.13) Take out a small picture q in turn and input it into the neural network for recognition; classification regression yields judgment percentages for "tea", "grass" and "other";
4.14) Take the highest of the judgment percentages as the network's prediction and use it as the recognition result, i.e. the type of the small picture q;
4.15) If the type of the small picture q is "tea" or "other", go to step 4.16); if it is "grass", start the small mechanical weeding gear to weed, then go to step 4.16);
4.16) Judge whether all small pictures q have been processed; if so, go to step 4.17), otherwise go to step 4.13);
4.17) Judge whether all "mixed" small pictures have been processed; if so, go to step 4.2), otherwise go to step 4.9);
4.18) Estimate the classification result of the small picture by averaging over the "mixed" small picture H and its surrounding neighbour pictures, then go to step 4.19);
4.19) If the "mixed" small picture H is judged to be "tea", go to step 4.17); if it is judged to be "grass", start the small mechanical weeding gear to weed, then go to step 4.17);
wherein in step 4.10) constructing the mixed rectangular region R comprises the steps of:
4.10.1) Expand horizontally from H until a small picture with a different label or the image boundary is reached, obtaining a horizontal region A of small pictures;
4.10.2) Expand vertically from H until a small picture with a different label or the image boundary is reached, obtaining a vertical region B of small pictures;
4.10.3) Construct the mixed rectangular region R with the horizontal region A and the vertical region B as its sides.
CN202010337143.7A, filed 2020-04-26, priority date 2020-04-26: Tea garden identification weeding method using a convolutional neural network (granted as CN111553258B, Active)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010337143.7A CN111553258B (en) 2020-04-26 2020-04-26 Tea garden identification weeding method by utilizing convolutional neural network


Publications (2)

Publication Number Publication Date
CN111553258A CN111553258A (en) 2020-08-18
CN111553258B (en) 2023-06-13

Family

ID=72000662

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010337143.7A Active CN111553258B (en) 2020-04-26 2020-04-26 Tea garden identification weeding method by utilizing convolutional neural network

Country Status (1)

Country Link
CN (1) CN111553258B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113367013A (en) * 2021-04-26 2021-09-10 南充市农业科学院 Green grass control method for citrus orchard

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109522797A (en) * 2018-10-16 2019-03-26 华南农业大学 Rice seedling and Weeds at seedling recognition methods and system based on convolutional neural networks
CN109784205B (en) * 2018-12-25 2021-02-23 国网河北省电力有限公司电力科学研究院 Intelligent weed identification method based on multispectral inspection image
CN110135341B (en) * 2019-05-15 2021-05-18 河北科技大学 Weed identification method and device and terminal equipment

Also Published As

Publication number Publication date
CN111553258A (en) 2020-08-18

Similar Documents

Publication Publication Date Title
Li et al. Detection of fruit-bearing branches and localization of litchi clusters for vision-based harvesting robots
CN102239756B (en) Intelligent farming robot in greenhouse
CN103081597B (en) Swinging type intelligent inter-seedling hoeing machine and tool unit
Van Henten et al. Robotics in protected cultivation
WO2023050783A1 (en) Weeding robot and method and apparatus for planning weeding path thereof, and medium
US20230026679A1 (en) Mobile sensing system for crop monitoring
CN111553258B (en) Tea garden identification weeding method by utilizing convolutional neural network
CN108450436B (en) Facility greenhouse deinsectization robot
Vikram Agricultural Robot–A pesticide spraying device
Ozdemir et al. Precision Viticulture tools to production of high quality grapes.
CN109964675A (en) Vine beta pruning robot device
US20220101557A1 (en) Calibration of autonomous farming vehicle image acquisition system
Visentin et al. A mixed-autonomous robotic platform for intra-row and inter-row weed removal for precision agriculture
CN114779692A (en) Linear sliding table type weeding robot and control method thereof
CN202111998U (en) Intelligent tillage robot in greenhouse
Chengliang et al. Current status and development trends of agricultural robots
Braun et al. Improving pesticide spray application in vineyards by automated analysis of the foliage distribution pattern in the leaf wall
CN116686545A (en) Litchi picking robot shade removing method based on machine vision control
CN114451082B (en) In-line real-time mechanical weeding equipment and weeding method
Kushwaha Robotic and mechatronic application in agriculture
Vedula et al. Computer vision assisted autonomous intra-row weeder
Davidson et al. Recent work on robotic pruning of upright fruiting offshoot cherry systems
CN113361377A (en) Plant growth control model construction method, electronic device and storage medium
Kshetri et al. Automated System for Continuous Green Shoot Thinning in Vineyards
Valero et al. Single Plant Fertilization using a Robotic Platform in an Organic Cropping Environment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant