WO2019083336A1 - Method and device for crop and weed classification using neural network learning - Google Patents

Method and device for crop and weed classification using neural network learning

Info

Publication number
WO2019083336A1
WO2019083336A1 (PCT/KR2018/012883)
Authority
WO
WIPO (PCT)
Prior art keywords
neural network
crop
image
CED
crops
Prior art date
Application number
PCT/KR2018/012883
Other languages
French (fr)
Korean (ko)
Inventor
김형석
박동선
아디카리샴
양희찬
김용진
양창주
Original Assignee
전북대학교산학협력단 (Industry-Academic Cooperation Foundation of Chonbuk National University)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 전북대학교산학협력단 (Industry-Academic Cooperation Foundation of Chonbuk National University)
Priority to JP2020512648A priority Critical patent/JP6771800B2/en
Priority claimed from KR1020180129482A external-priority patent/KR102188521B1/en
Publication of WO2019083336A1 publication Critical patent/WO2019083336A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features

Definitions

  • The present invention belongs to the field of neural-network image recognition: a CED neural network first recognizes the rows of a crop so that all plants outside the rows are regarded as weeds and removed in bulk, and the small number of weeds remaining on and around the rows are then precisely identified, through the learning of another CED neural network, so that they can be weeded out.
  • Zhang et al. (1995) analyzed and presented criteria for distinguishing weeds found in wheat fields in terms of three aspects: color, shape, and texture.
  • Woebbecke et al. (1995a) performed color analysis for separating weeds from the background in images. In particular, modified hue, the excess-green index 2g-r-b, and the green chromatic coordinate distinguished weeds well from the surrounding environment.
  • Tian et al. (1997) developed and tested a machine-vision system that can locate young tomatoes and weeds in the open field. In Korea, Cho et al. (1999) conducted a study showing the possibility of detecting weeds in the field by extracting geometric features of purslane, crabgrass, and lamb's quarters.
  • However, weed-recognition methods using such plant color, shape, and texture features are rule-based, so they cannot adapt to varied environments and shapes.
  • The present invention develops, using artificial-intelligence technology, a technique for extracting crop rows from a crop image, a technique for identifying crops and weeds, and a technique for designating the locations to be weeded, and has the network construct these techniques by itself through learning.
  • The present invention extracts the rows of crops using Convolutional Encoder-Decoder (CED) neural-network technology so that the weeding machine can treat the plants between the rows as weeds and remove them;
  • CED: Convolutional Encoder-Decoder
  • for the small number of weeds present on the crop rows, an additional CED neural network is used to identify the weeds among the crops and remove them.
  • The system consists of a CED neural network used for crop-row extraction and a CED neural network used for crop-weed identification.
  • Both CED neural networks are composed of several stages between the input end and the output end.
  • The stages become narrower toward the middle, giving a constricted structure, and a convolutional operation is performed at each stage.
  • As shown in FIG. 1, the CED neural network has various structures modified from the basic structure; although they differ in performance and characteristics, a CED neural network of any modified structure can be used for the crop-row extraction and crop-weed discrimination that are the objects of this invention.
  • For the CED neural network that extracts crop rows, a large set of training data is prepared in which a crop image serves as the input image of the CED neural network and a line image, in which the positions of the crop rows in the input image are drawn graphically, serves as the learning-target image;
  • the CED neural network repeatedly learns this training data set and thereby acquires, by itself, the technique of extracting crop rows from a crop image.
  • For the crop-weed identification CED neural network, an image captured closely enough that the shapes of individual crops and weeds can be distinguished serves as the input image, and an image in which symbols of different shapes or colors are marked at the positions of the crops or weeds serves as the learning-target image; a large set of such training data is prepared;
  • by repeatedly learning this data set, the CED neural network acquires, by itself through neural-network learning, the technique of identifying crops and weeds in a paddy or field image and designating their positions.
  • To achieve the goals of crop-row extraction and crop/weed identification, the present invention does not rely on algorithms based on conventional image-processing techniques; instead, the image desired as the output is drawn directly as a graphic and the network is trained on it. This is a new development method in which the CED neural network builds, through learning, the technique required to obtain the desired image-recognition result. Using this technique to recognize crop rows and to identify crops and weeds also enables a weeding machine to weed precisely and automatically.
  • FIG. 1 shows convolutional encoder-decoder neural network structures usable in the invention
  • FIG. 2 shows the structure of the crop-row and crop-weed identification and position-detection system for weeding
  • FIG. 3 illustrates part of the training data of the CED neural network for crop-row extraction
  • FIG. 4 illustrates test results of the trained CED neural network for crop-row extraction
  • FIG. 5 illustrates part of the training data of the CED neural network for distinguishing rice seedlings from barnyard grass
  • FIG. 6 illustrates test results of the trained CED neural network for crop-weed identification.
  • In accuracy, the CED neural networks using the U-Net of FIG. 1(c) and the DenseNet of FIG. 1(d) are superior to the CED neural networks shown in FIGS. 1(a) and 1(b),
  • and the DenseNet-based CED neural network is somewhat more accurate than the U-Net-based one.
  • However, the DenseNet-based network requires far more parameters than the U-Net-based network, and selecting appropriate hyper-parameters for it is also difficult.
  • The DenseNet-based CED neural network also executes more slowly than the U-Net-based one.
  • Therefore, the best embodiment, considering accuracy, implementability, and economy together, is one that includes a CED neural network using a U-Net.
  • The Convolutional Encoder-Decoder (CED) neural network used in the present invention has a plurality of stages between the input end and the output end, and the stages first shrink and then grow in size.
  • The progressively shrinking first half is called the encoder part, and the progressively growing second half is called the decoder part.
  • FIGS. 1(a) and 1(b) show the CED neural network structures used here for crop-row recognition and for detecting weeds on the crop rows, respectively; the two structures may also be interchanged.
  • A modified CED neural network called a U-Net or skip structure, in which the output of each encoder layer is skipped directly to the input of the corresponding decoder layer as in FIG. 1(c), may also be used,
  • as may the modified CED structure called DenseNet, which has connections across each layer of the U-Net structure as in FIG. 1(d). A neural network otherwise modified from the CED neural network may also be used.
  • The present invention can be implemented, and its object achieved, with a different number of layers and a different number of filters for each structure.
  • The present invention comprises a crop-row recognition step that recognizes the rows of crops using Convolutional Encoder-Decoder (CED) neural-network technology, and a crop-weed identification step that, for the small number of weeds present on the crop rows, distinguishes the weeds from the crops using an additional CED neural network.
  • The hardware for this comprises two neural networks: the CED neural network 230 used in the crop-row recognition step and the CED neural network 260 used in the crop-weed identification step.
  • For each CED neural network, the task to be performed is exemplified by presenting, in the training databases (220 and 250), the input image together with the output data to be produced, and the network installs the technique for performing the task through learning.
  • A large number of crop images are first acquired; for each crop image, graphic images 210 marking the crop-row positions and symbol images 240 marking the crop and weed positions are created, forming a crop image-crop row image database 220 and a crop image-crop/weed species and position image database 250, which are stored on a computer hard disk.
  • The crop image-crop row image database 220 and the crop image-crop/weed species and position image database 250 are learned by the crop-row recognition CED neural network 230 and by the crop-weed identification and position-recognition CED neural network 260, respectively.
  • The trained neural network takes the form of the connection-parameter values for the pre-designed network structure. The learning processes of the crop-row recognition CED neural network 230 and of the crop-weed identification and position-recognition CED neural network 260 are described in detail below.
  • FIG. 3 illustrates the case of rice as the crop.
  • The CED neural network is assigned the task of extracting and presenting, for crop images like those on the left of FIG. 3, row images like those on the right.
  • In the learning-target images, graphic lines like those in the right images of FIG. 3 are drawn along the positions corresponding to the seedling rows in the left images. That is, when a right image is superimposed on its left image, each line of the right image lies at the center of a crop row in the left image.
  • The neural network is then trained. Using the backpropagation learning method commonly used for neural-network training, learning is repeated until the error falls below a predetermined threshold.
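As a hedged illustration of this stopping rule, the sketch below trains a toy stand-in for the CED network (a single linear layer in place of the convolutional stages) by gradient descent until the mean-squared error drops under a threshold. The data, sizes, learning rate, and threshold are all illustrative assumptions, not values from the patent.

```python
import numpy as np

# Toy stand-in for the CED network: a single linear layer trained by
# gradient descent (backpropagation) until the mean-squared error falls
# below a preset threshold, as described for the row-extraction network.
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 8))          # 64 training "images", 8 features each
W_true = rng.normal(size=(8, 2))
Y = X @ W_true                        # target "row images", 2 outputs each

W = np.zeros((8, 2))                  # connection parameters to be learned
threshold, lr = 1e-4, 0.05
for epoch in range(10_000):
    err = X @ W - Y
    mse = float(np.mean(err ** 2))
    if mse < threshold:               # stop once error is below the threshold
        break
    W -= lr * (X.T @ err) / len(X)    # gradient step (backpropagation)

print(epoch, mse)
```

The same loop structure applies unchanged when the linear layer is replaced by a full convolutional encoder-decoder; only the forward pass and gradient computation grow.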
  • FIG. 4 illustrates test results of the CED neural network trained for the crop-row recognition described above.
  • When input images such as those in the leftmost column are applied to the CED neural network, output images such as those shown can be obtained.
  • To check whether the lines in the output images correctly specify the crop-row positions of the input images, the rightmost images in FIG. 4 show the CED neural network outputs superimposed on the input images.
  • The positions of the output lines precisely specify the positions of the crop rows in the input images.
  • A neural network with a similar Convolutional Encoder-Decoder (CED) structure is used to discriminate between crops and weeds.
  • For the training data, learning-target images are created in which one kind of symbol is marked at the base of each crop and a different kind of symbol at the base of each weed, and the neural network learns these images. Marking the base makes effective weeding possible, since the weeding machine can cut the plant at that point.
  • The left images in FIG. 5 are the rice and barnyard-grass images used as the neural-network inputs, and the right target images are examples in which gray circular symbols are marked at the bases of the rice plants and black circular symbols at the bases of the barnyard grass in the left images.
  • The training data of the CED neural network for distinguishing rice seedlings from barnyard grass are prepared by capturing a large number of images like those on the left as input images and creating, for each input image, a learning-target image like the one on the right.
  • In these examples the learning-target image marks gray and black circular symbols at the base positions on a blank image, but the symbols may instead be superimposed on the input image.
  • the color and shape of the symbol to be displayed can be selected in various ways.
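The target-image construction just described can be sketched as follows, under illustrative assumptions: the image size, symbol radius, and the gray levels 128 (rice) and 0 (weed) on a white background are choices made here, not values given in the patent.

```python
import numpy as np

# Build one learning-target image: a blank image on which a gray filled
# circle marks the base of each rice seedling and a black circle marks
# the base of each weed. Positions and radii are illustrative.
def draw_symbols(shape, rice_bases, weed_bases, radius=4):
    h, w = shape
    target = np.full((h, w), 255, dtype=np.uint8)   # white background
    yy, xx = np.mgrid[0:h, 0:w]
    for (cy, cx), value in [(p, 128) for p in rice_bases] + \
                           [(p, 0) for p in weed_bases]:
        mask = (yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2
        target[mask] = value                        # 128 = gray, 0 = black
    return target

target = draw_symbols((64, 64), rice_bases=[(20, 20)], weed_bases=[(40, 50)])
print(target[20, 20], target[40, 50])  # → 128 0
```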
  • FIG. 6 illustrates some test results of the CED neural network for identifying rice seedlings and barnyard grass.
  • The leftmost images in FIG. 6 are input test images that were not included in the training data.
  • The middle images in FIG. 6 are the neural-network output images, and the right images superimpose the input and output images.
  • The gray circular symbols indicate the positions of the rice plants,
  • and the black circular symbols indicate the positions of the barnyard grass.
  • The recognition-result images are sent to a weeding robot so that it can weed at the actual positions corresponding to the rice seedlings and the barnyard grass.
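A hedged sketch of one way such a result image could be converted into positions for the weeding robot: take the centroid of the pixels of each symbol color. For simplicity this assumes one symbol per class in the image; a real system would first separate multiple symbols of the same class, e.g. by connected-component labeling.

```python
import numpy as np

# Recover plant positions from an output image in which gray (≈128)
# pixels mark rice and black (≈0) pixels mark barnyard grass, by taking
# the centroid of each symbol's pixels. Values are illustrative.
def symbol_centroid(img, value, tol=10):
    ys, xs = np.where(np.abs(img.astype(int) - value) <= tol)
    if len(ys) == 0:
        return None
    return (float(ys.mean()), float(xs.mean()))

out = np.full((64, 64), 255, dtype=np.uint8)
out[18:23, 18:23] = 128        # gray block standing in for a rice symbol
out[40:45, 50:55] = 0          # black block standing in for a weed symbol
print(symbol_centroid(out, 128), symbol_centroid(out, 0))
# → (20.0, 20.0) (42.0, 52.0)
```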
  • An image is acquired in real time through a camera 280, and the acquired image must be applied simultaneously to the crop-row recognition CED neural network 230 and the crop-weed identification CED neural network 260 and processed at high speed.
  • The CPU 100 is used to control the flow of information and to configure the neural networks in software.
  • The CPU 100 post-processes the outputs of the neural networks, analyzes the results, and generates the control signals required for autonomous travel of the externally connected weeding machines. Particularly high-speed signal processing is required to construct the CED neural networks and perform their operations,
  • so the GPU 110 is used as an auxiliary device.
  • Since the parameters of the trained CED neural networks already contain the information necessary for crop-row recognition and crop-weed identification, extracted from the crop image-crop row image database 220 and the crop image-crop/weed species and position image database 250,
  • the databases can be removed, leaving the camera 280, the crop-row recognition CED neural network 230, and the weed identification and position-recognition CED neural network 260;
  • the weeding-machine control system 300 can thus be configured simply with these together with the CPU 100 and the GPU 110.
  • The weeding-machine control system configured as described above first acquires an image in real time through the camera 280 and applies the acquired image simultaneously to the crop-row recognition CED neural network 230 and the weed identification and position-recognition CED neural network 260.
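The dataflow of this control loop can be sketched as below. The two "networks" are placeholder functions standing in for the trained CED networks 230 and 260, and all names and values are illustrative assumptions.

```python
import numpy as np

def row_net(frame):                  # stand-in for CED network 230
    return frame.mean(axis=0) > 0.5  # per-column "crop row present" mask

def weed_net(frame):                 # stand-in for CED network 260
    return [(4, 7)]                  # (y, x) positions of weeds on the rows

def process_frame(frame):
    rows = row_net(frame)
    weeds = weed_net(frame)
    # Everything off the crop rows, plus identified weeds on the rows,
    # becomes a weeding target passed on to the weeding machine (290).
    off_row_cols = [x for x, on in enumerate(rows) if not on]
    return off_row_cols, weeds

frame = np.zeros((8, 8)); frame[:, 2] = 1.0   # one crop "row" at column 2
cols, weeds = process_frame(frame)
print(cols, weeds)
```

In the real system both networks would run on the GPU 110 and the CPU 100 would turn the combined result into control signals.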
  • Here too, the CPU 100 is used to control the flow of information and to configure the neural networks in software.
  • The CPU 100 post-processes the outputs of the neural networks, analyzes the results, and generates the control signals required for the autonomous weeding operation of the externally connected weeding machine 290.
  • High-speed signal processing is required to construct the CED neural networks and perform their operations,
  • so the GPU 110 is used as an auxiliary device.
  • The present invention aims at automating and mechanizing weeding, which accounts for most of the farmer's labor in crop cultivation, thereby improving cultivation efficiency and reducing production costs such as the cost of hiring workers for weeding; crop cultivation that requires weeding is therefore very likely to find industrial application.


Abstract

The present invention relates to a technology for crop and weed classification, comprising the steps of: recognizing the rows of crops by using a convolutional encoder-decoder (CED) neural network technique, so that all plants between the rows can be classified as weeds; and, for the small number of weeds located in the rows of the crops, distinguishing the weeds from the crops. A CED neural network for extracting crop rows receives an image of crops as its input image and learns a learning-target image in which graphic lines are marked at the locations corresponding to the crop rows, so as to develop by itself the skill of extracting crop rows from a crop image; and a CED neural network for crop-weed classification receives, as its input image, a close-up image of crops and learns a learning-target image in which symbols of different kinds (colors/shapes) are marked at the locations of the crops or weeds according to their species, so as to develop by itself the skill of classifying crops and weeds. The information on the locations of the crop rows and of the weeds on the rows acquired through this technology can be transmitted to a weeding machine, which can then move its weeding tool to the corresponding locations and perform weeding.

Description

Method and apparatus for identifying crops and weeds by neural network learning
The present invention belongs to the field of neural-network image recognition and relates to a weed-identification method in which a CED neural network first recognizes the rows of a crop so that all plants other than those in the rows are regarded as weeds and removed in bulk, and the small number of weeds remaining on and around the rows are then precisely identified, through the learning of another CED neural network, so that they can be weeded out.
One of the farm tasks requiring the most labor in environment-friendly crop cultivation is weeding. If weeding is not done effectively, crop growth suffers fatal damage, so farmers must devote themselves to weed removal throughout the entire growth period. The automation of weeding has therefore long been a dream of farmers, and research toward it has been attempted for a long time.
Zhang et al. (1995) analyzed and presented criteria for distinguishing weeds found in wheat fields in terms of three aspects: color, shape, and texture. Woebbecke et al. (1995a) performed color analysis for separating weeds from the background in images; in particular, modified hue, the excess-green index 2g-r-b, and the green chromatic coordinate distinguished weeds well from the surrounding environment. Tian et al. (1997) developed and tested a machine-vision system that can locate young tomatoes and weeds in the open field. In Korea, Cho et al. (1999) conducted a study showing the possibility of detecting weeds in the field by extracting geometric features of purslane, crabgrass, and lamb's quarters. However, weed-recognition methods using such plant color, shape, and texture features are rule-based, so they cannot adapt to varied environments and shapes and have fallen short of practical use.
A robust weed-recognition technique applicable in varied environments was needed. With the recent rapid progress of deep-learning neural-network technology, there have been attempts to recognize weeds with it. Cicco et al. (Dec 2016, CVPR) trained data sets generated with a graphic tool on SegNet (CoRR 2015), one of the deep-learning neural networks, to distinguish weeds from crops. Potena et al. (Feb. 2017) also used deep-learning neural-network technology for crop-weed classification, greatly improving accuracy over existing algorithmic methods. Because these methods aim to extract the shape of each whole weed, they can be applied to sparsely distributed weeds, but in an environment where many grasses and crops are mixed the plants overlap and lose their distinguishability, making practical use difficult. As a different approach, a method of recognizing the positions of nearby crops with a mechanical contact sensor (Application No. 10-2013-0115057) has also been proposed, but it is hard to apply to young seedlings. To solve this problem, a method using a laser sensor and feelers applicable to young seedlings (Application No. 1020090113990, Nov. 24, 2009) was also developed, but it has the drawbacks that the position of a young seedling swaying in the wind is inaccurate and that it is effective only at very close range.
References
Zhang, N. and C. Chaisattapagon. 1995. Effective criteria for weed identification in wheat fields using machine vision. Transactions of the ASAE 38(3): 965-974.
Woebbecke, D. M., G. E. Meyer, K. Von Bargen and D. A. Mortensen. 1995a. Shape features for identifying young weeds using image analysis. Transactions of the ASAE 38(1): 271-281.
Tian, L., D. C. Slaughter and R. F. Norris. 1997. Outdoor field vision identification of tomato seedlings for automated weed control. Transactions of the ASAE 40(6): 1761-1768.
Cho, S. I., D. S. Lee and Y. M. Bae. 1999. Weed identification using machine vision. Journal of the Korean Society for Agricultural Machinery 24(1): 59-66.
Maurilio Di Cicco, Ciro Potena, Giorgio Grisetti and Alberto Pretto. 2016. Automatic Model Based Dataset Generation for Fast and Accurate Crop and Weeds Detection. arXiv:1612.03019v1 [cs.CV], 9 Dec 2016.
Ciro Potena, Daniele Nardi, and Alberto Pretto. 2017. Fast and Accurate Crop and Weed Identification with Summarized Train Sets for Precision Agriculture. Intelligent Autonomous Systems, vol. 14, 105-121, Feb. 2017.
In order to solve the above problems, the present invention develops, using artificial-intelligence technology, a technique for extracting crop rows from a crop image and a technique for identifying crops and weeds, and has the network construct by itself, through learning, a technique that also designates the positions to be weeded.
To achieve this object, the present invention extracts the rows of crops using Convolutional Encoder-Decoder (CED) neural-network technology so that the weeding machine treats the plants between the rows as weeds and removes them; for the small number of weeds present on the crop rows, an additional CED neural network is used to identify and remove the weeds from among the crops.
The system for this consists of a CED neural network used for crop-row extraction and a CED neural network used for crop-weed identification. Both CED neural networks are composed of several stages between the input end and the output end, becoming narrower toward the middle stage, with a convolutional operation performed at each stage. As shown in FIG. 1, the CED neural network has various structures modified from the basic structure; although they differ in performance and characteristics, a CED neural network of any modified structure can be used for the crop-row extraction and crop-weed discrimination that are the objects of this invention.
For the CED neural network that extracts crop rows, a large set of training data is prepared in which a crop image serves as the input image of the CED neural network and a line image, in which the positions of the crop rows in the input image are drawn graphically, serves as the learning-target image.
The CED neural network repeatedly learns this training data set and thereby acquires, by itself, the technique of extracting crop rows from a crop image.
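A minimal sketch of preparing one such learning-target line image, assuming straight vertical rows for simplicity; the function name, image size, line width, and row columns are illustrative, not from the patent.

```python
import numpy as np

# Build a learning-target image for the row-extraction network: a blank
# image with a graphic line drawn at each crop-row position.
def make_row_target(shape, row_columns, width=1):
    h, w = shape
    target = np.zeros((h, w), dtype=np.uint8)
    for c in row_columns:
        target[:, max(0, c - width):c + width + 1] = 255   # vertical line
    return target

target = make_row_target((100, 120), row_columns=[30, 60, 90])
print(np.where(target.any(axis=0))[0].tolist())
# → [29, 30, 31, 59, 60, 61, 89, 90, 91]
```

Pairing each crop image with such a target image yields the (input, learning-target) data set that the network learns repeatedly.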
For the crop-weed identification CED neural network, an image captured closely enough that the shapes of individual crops and weeds can be distinguished serves as the input image, and an image in which symbols of different shapes or colors are marked at the positions of the crops or weeds serves as the learning-target image; a large set of such training data is prepared.
By repeatedly learning this data set, the CED neural network acquires, by itself through neural-network learning, the technique of identifying crops and weeds in a paddy or field image and designating their positions.
To achieve the goals of crop-row extraction and crop/weed identification, the present invention does not rely on algorithms based on conventional image-processing techniques; instead, the image desired as the output is drawn directly as a graphic and the network is trained on it. This is a new development method in which the CED neural network builds, through learning, the technique required to obtain the desired image-recognition result. Using this technique to recognize crop rows and to identify crops and weeds also enables a weeding machine to weed precisely and automatically.
FIG. 1 shows convolutional encoder-decoder neural network structures usable in the invention:
(a) the convolutional encoder-decoder structure used for row recognition of the crop (rice);
(b) the convolutional encoder-decoder structure used for identification and position recognition of the crop (rice) and the weed (barnyard grass);
(c) the U-Net (skip) CED neural network and (d) the dense CED neural network and DenseNet blocks, modified CED neural network structures usable to realize the invention.
FIG. 2 shows the structure of the crop-row and crop-weed identification and position-detection system for weeding.
FIG. 3 illustrates part of the training data of the CED neural network for crop-row extraction.
FIG. 4 illustrates test results of the trained CED neural network for crop-row extraction.
FIG. 5 illustrates part of the training data of the CED neural network for distinguishing rice seedlings from barnyard grass. FIG. 6 illustrates test results of the trained CED neural network for identifying rice seedlings and barnyard grass.
Explanation of reference numerals
100: CPU
110: GPU
200: crop-weed identification system
210: creation of graphic images marking crop-row positions
220: crop image-crop row image database
230: CED neural network for crop-row recognition
240: creation of symbol images marking crop-weed positions
250: crop image-crop species and position image database
260: CED neural network for identification and position recognition of crops and weeds
280: camera
290: weeding machine
Among the embodiments described below, the U-Net-based CED neural network of FIG. 1(c) and the DenseNet-based CED neural network of FIG. 1(d) are more accurate than the CED neural networks of FIGS. 1(a) and 1(b), and the DenseNet-based CED network is somewhat more accurate than the U-Net-based one. However, the DenseNet-based CED network requires far more parameters than the U-Net-based network, appropriate hyper-parameters are harder to select, and it also runs more slowly than the U-Net-based network.
Therefore, the best mode, considering accuracy, ease of implementation, and cost together, is the embodiment that includes the U-Net-based CED neural network.
The convolutional encoder-decoder (CED) neural network used in the present invention is composed, as shown in FIG. 1, of several stages between input and output whose feature maps first shrink and then grow again. The shrinking first half is called the encoder part and the growing second half the decoder part. FIGS. 1(a) and 1(b) show the CED structures used in the embodiments for crop-row recognition and for weed detection on the crop rows, respectively; the two structures may also be interchanged. A modified CED network may also be used in which the outputs of each encoder layer skip directly to, and are summed with, the inputs of the corresponding decoder layer, as in FIG. 1(c) — the structure called U-Net or skip structure — and a further modified CED structure called DenseNet, which adds connections skipping across each layer of the U-Net structure as in FIG. 1(d), is usable as well. Other neural networks modified on the basis of the CED network may likewise be used.
In each case, the present invention can be implemented by adopting a different number of layers and a different number of filters for each structure, and the object of the invention can still be achieved.
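The encoder-decoder symmetry and the skip (U-Net) summation described above can be illustrated at the feature-map level. The following Python fragment is only a sketch — the function names and the use of average pooling and nearest-neighbour upsampling in place of learned convolutions are assumptions for illustration, not taken from the patent; it shows how the encoder halves the resolution, how the decoder restores it, and how a skip connection adds each encoder output back at the matching resolution.

```python
import numpy as np

def avg_pool2(x):
    """2x2 average pooling: the resolution-shrinking step of the encoder half."""
    h, w = x.shape
    return x.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def upsample2(x):
    """2x nearest-neighbour upsampling: the resolution-growing decoder half."""
    return x.repeat(2, axis=0).repeat(2, axis=1)

def ced_forward(img, depth=3, skip=True):
    """Shape-level sketch of one CED pass. skip=True mimics the U-Net
    variant of FIG. 1(c): each encoder output is summed into the decoder
    feature map of the same resolution."""
    feats = []
    x = img
    for _ in range(depth):          # encoder: feature maps get smaller
        feats.append(x)
        x = avg_pool2(x)
    for f in reversed(feats):       # decoder: feature maps grow back
        x = upsample2(x)
        if skip:
            x = x + f               # skip (U-Net) connection
    return x

out = ced_forward(np.ones((32, 32)))
print(out.shape)  # the output keeps the input resolution, as required
                  # for pixel-wise row-line and weed-symbol maps
```

The point of the sketch is that, skip or no skip, the decoder returns the output to the input resolution, which is what allows the network output to be overlaid directly on the input image as in FIGS. 4 and 6.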
The present invention consists of a crop-row recognition step, in which the above convolutional encoder-decoder (CED) neural network technique is used to recognize the rows of crops, and a crop-weed identification step, in which an additional CED neural network distinguishes from the crop the few weeds standing on the crop rows. The hardware for this comprises two neural networks, as shown in FIG. 2: the CED neural network (230) used in the crop-row recognition step and the CED neural network (260) used in the crop-weed identification step.
Each CED neural network is given its task by example: input images are applied together with the outputs to be obtained, organized into training databases (220 and 250), and through learning the network acquires the skill needed to perform that task.
To this end, a large number of crop images is first acquired, and for each crop image a graphic display image of the crop-row positions (210) and a symbol display image of the crop/weed positions (240) are produced in quantity, forming the crop image–crop-row image database (220) and the crop image–crop/weed type and position image database (250), which are stored on a computer hard disk. These databases are then learned by the CED neural network for crop-row recognition (230) and the CED neural network for crop/weed identification and localization (260), respectively. A trained network takes the form of the set of connection-parameter values for the pre-designed network structure. The training processes of the CED neural network for crop-row recognition (230) and the CED neural network for crop/weed identification and localization (260) are described in detail below.
Training for crop-row recognition
To develop the crop-row recognition skill in the CED neural network, example training data are prepared on a large scale and in varied forms as images. FIG. 3 illustrates the case of rice seedlings: when input images like those on the left are applied, the CED neural network is tasked with extracting and presenting crop-row images like those on the right. More specifically, the target images are produced by drawing graphic lines, as in the right-hand images of FIG. 3, along the positions corresponding to each row of seedlings in the left-hand images; when a right-hand image is overlaid on its left-hand image, each line of the right-hand image lies along the centre of a crop row in the left-hand image. Since a weeder weeding along the crop rows works its way along the rows ahead of it, the target images emphasize the lines running toward the front, and the rows heading off to the left and right sides may be omitted. Also, as in FIG. 3(d), extension lines are drawn even across positions where seedlings are missing, so that the neural network learns the human skill of drawing such extensions.
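A target image of the kind just described — graphic lines drawn on a blank image along the row centres — can be sketched with plain numpy. The coordinates and image size below are hypothetical examples, not values from the patent; they only illustrate producing the kind of line-marked target shown on the right of FIG. 3.

```python
import numpy as np

def draw_row_line(target, x_top, x_bottom):
    """Draw a one-pixel graphic line from (row 0, x_top) down to the
    last image row at x_bottom — one crop row as seen from the weeder."""
    h, _ = target.shape
    for y in range(h):
        x = int(round(x_top + (x_bottom - x_top) * y / (h - 1)))
        target[y, x] = 255
    return target

# Hypothetical example: two front-facing rows converging toward the
# horizon, drawn on a blank target image as in FIG. 3.
target = np.zeros((64, 64), dtype=np.uint8)
draw_row_line(target, 20, 8)
draw_row_line(target, 44, 56)
print(int((target > 0).sum()))  # each line marks one pixel per image row
```

In practice a drawing library would be used; the sketch only makes concrete what "graphic lines at the row centres" means as pixel data.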
Once the training data are prepared, a much larger training set is generated from the prepared images by augmentation techniques — rotation at various angles, enlargement with rotation, reduction with rotation, horizontal and vertical shifts, and so on — so that more varied shapes are learned, and the neural network is then trained. The training uses the backpropagation method commonly employed for neural network learning and is repeated until the error falls below a predetermined threshold.
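The key point of the augmentation step is that the same geometric transform must be applied to an input image and to its target image so the drawn lines or symbols stay aligned. The sketch below illustrates this pairing with only translation and mirroring (rotation and scaling would be added the same way); the function name and values are illustrative assumptions, not from the patent.

```python
import numpy as np

def augment_pair(img, target, shift, flip):
    """Apply one identical geometric transform to an input image and
    its target, so that the target lines/symbols remain aligned with
    the image content after augmentation."""
    def tf(a):
        a = np.roll(a, shift, axis=1)     # horizontal translation
        return a[:, ::-1] if flip else a  # optional mirroring
    return tf(img), tf(target)

img = np.arange(16.0).reshape(4, 4)
tgt = np.zeros((4, 4))
tgt[:, 1] = 1.0                       # target line in column 1
aug_img, aug_tgt = augment_pair(img, tgt, shift=1, flip=False)
print(int(aug_tgt.argmax(axis=1)[0]))  # the target line moved with the image
```

Training itself would then iterate backpropagation over such pairs until the loss drops below the chosen threshold.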
FIG. 4 illustrates test results of the CED neural network trained for crop-row recognition. When input images like those in the leftmost column are applied to the network, outputs like those in the middle column are obtained. To check whether the lines in these output images correctly indicate the crop-row positions of the input images, the network outputs are overlaid on the inputs in the rightmost column of FIG. 4. As the figure shows, the positions of the output lines of the CED neural network accurately indicate the positions of the crop rows in the input images.
Training for identification of individual crops and weeds
In the present invention, discrimination between crops and weeds also uses a neural network with a structure similar to the convolutional encoder-decoder (CED) above. The training data are produced by marking the base of each crop plant in the target image with one kind of symbol and the base of each weed with another kind of symbol, and the neural network learns from these images. Since the symbols mark the plant bases, effective weeding becomes possible by having the weeding machinery crush those marked spots.
The left-hand images of FIG. 5 are images of rice and barnyard grass used as network inputs; the right-hand images are the corresponding target images, in which the base of each rice plant in the left image is marked with a grey circular symbol and the base of each barnyard-grass plant with a black circular symbol. The training data for the CED neural network that distinguishes rice seedlings from barnyard grass are prepared by photographing a large number of images like those on the left as input images and producing, for each input image, a target image like the one on the right. In the examples shown, the target images were produced by stamping grey and black circular symbols at the base positions on a blank image, but the symbols may instead be overlaid on the input image, and the colour and shape of the symbols may be chosen in various ways. A training data set constructed in this way is learned by the backpropagation method until the error falls below a predetermined threshold. FIG. 6 illustrates part of the test results of the CED neural network for crop/weed identification.

The leftmost images of FIG. 6 are input test images not included in the training data, the middle images are the network outputs, and the right-hand images overlay the inputs and outputs. In the output and overlaid images, a grey circular symbol indicates the position of a rice plant and a black circular symbol the position of a barnyard-grass plant. As the result images confirm, the positions of the seedlings and the weeds are recognized and marked accurately. Rice plants whose bases are hidden by overlapping plants are not identified, because the network was trained to discriminate on the shape of the base; however, most barnyard-grass plants whose size and position allow their shape to be confirmed clearly are identified and detected.
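Producing the symbol-marked target images of FIG. 5 amounts to stamping filled circles of different classes at the plant-base coordinates. The sketch below is illustrative only — the label values, coordinates, and radius are assumptions, not values from the patent.

```python
import numpy as np

# Hypothetical label values: 1 = rice-seedling base, 2 = barnyard-grass base.
def mark_base(target, cy, cx, radius, label):
    """Stamp a filled circular symbol at a plant-base position on the
    target image, as in the FIG. 5 training targets."""
    yy, xx = np.ogrid[:target.shape[0], :target.shape[1]]
    target[(yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2] = label
    return target

target = np.zeros((64, 64), dtype=np.uint8)
mark_base(target, 20, 20, 3, 1)   # a rice seedling base
mark_base(target, 40, 45, 3, 2)   # a barnyard-grass base
print([int(v) for v in np.unique(target)])
```

At inference time the same encoding works in reverse: the class of each circular blob in the network output tells the weeder whether the spot beneath it is crop or weed.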
The result images recognized in this way are sent to the weeding robot, which locates the actual positions corresponding to the seedlings and the barnyard grass and weeds accordingly.
Role as the brain of the weeding system
To use the present invention as the brain of a weeding system, images must be acquired in real time through a camera (280) and applied simultaneously to the CED neural network for crop-row recognition (230) and the CED neural network for crop/weed identification and localization (260) for high-speed processing. A CPU (100) is used to control this flow of information and to construct and run the neural networks in software. The CPU (100) also post-processes and analyses the outputs of the neural networks and uses the results to generate the appropriate control signals needed for autonomous driving of the externally connected weeder. During this process, constructing the CED neural networks and executing their operations requires particularly high-speed signal processing, for which a GPU (110) is used as an auxiliary device.
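The control cycle just described — one camera frame fed to both networks, whose outputs are post-processed into a weeder command — can be sketched as follows. The two networks are replaced here by stub functions, and all names and the command format are illustrative assumptions; the real networks of the patent would run on the GPU.

```python
import numpy as np

def row_net(frame):      # stand-in for the crop-row CED network (230)
    return np.zeros_like(frame)

def weed_net(frame):     # stand-in for the crop/weed CED network (260)
    return np.zeros_like(frame)

def control_step(frame):
    """One control cycle: apply the camera frame to both networks,
    then post-process their outputs into a steering decision and a
    count of weed targets for the weeder."""
    row_map = row_net(frame)           # where the crop rows run
    weed_map = weed_net(frame)         # which plants on the rows are weeds
    steer = "follow_row" if row_map is not None else "stop"
    targets = int((weed_map > 0).sum())  # marked weed positions to crush
    return steer, targets

frame = np.random.rand(64, 64)
print(control_step(frame))
```

This is only the information-flow skeleton; the actual post-processing (line fitting on the row map, symbol-blob localization on the weed map) is what the CPU (100) performs in the patent's system.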
Configuration of the weeder control system
Because the parameters of the trained CED neural networks already contain the information needed for crop-row recognition and crop-weed identification extracted from the crop image–crop-row image database (220) and the crop image–crop type and position image database (250), those databases can be removed to miniaturize the apparatus, and a weeder control system (300) can be configured simply from the camera (280), the CED neural network for crop-row recognition (230), the CED neural network for weed identification and localization (260), the CPU (100), and the GPU (110).
For the weeder control system configured above to send appropriate weeding commands to the weeder (290), images are first acquired in real time through the camera (280) and applied simultaneously to the CED neural network for crop-row recognition (230) and the CED neural network for crop/weed identification and localization (260) for high-speed processing. The CPU (100) is used to control this flow of information and to construct and run the neural networks in software. The CPU (100) also post-processes and analyses the network outputs and uses the results to generate the appropriate control signals needed for the autonomous weeding operation of the externally connected weeder (290). During this process, constructing the CED neural network and executing its operations requires particularly high-speed signal processing, for which the GPU (110) is used as an auxiliary device.
In agriculture, removing the weeds that rob the soil of nutrients and the crop of solar energy and so hinder its growth is indispensable. In particular, the recent international trend toward environment-friendly cultivation avoids chemical products such as pesticides, so weeding demands even more labour. The present invention therefore aims, through the automation and mechanization of weeding — which accounts for most of a farmer's labour in crop cultivation — to make cultivation more efficient and to reduce production costs such as the cost of hired weeding labour, so its industrial applicability to any crop cultivation requiring weeding is very great.

Claims (13)

  1. A method for identifying crops and weeds by neural network learning, comprising: a step of recognizing, by neural network learning, the rows of crops for weeding in paddy or field cultivation, so that all plants outside those rows are regarded as weeds and can be removed collectively; and a step of separating and identifying the crops from the weeds remaining on and around those rows through the learning of another neural network.
  2. The method of claim 1, wherein the neural network used in the crop-row recognition step and in the crop-weed identification step is a CED neural network composed of several convolution layers, in which a progressively shrinking encoder part is combined with a progressively growing decoder part.
  3. The method of claim 2, wherein the CED neural network may also be a modified structure, called a U-Net structure, in which the outputs of each layer of the encoder part are summed into the inputs of the corresponding layer of the decoder part.
  4. The method of claim 3, wherein the CED neural network may also be a modified structure, called DenseNet, which takes the U-Net CED neural network as its basic structure and additionally has connections skipping across each layer.
  5. The method of claim 1, wherein the crop-row recognition method repeatedly trains the CED neural network so that, when a crop image is applied to the input of the CED neural network, the positions corresponding to the crop rows in that image are drawn and displayed as graphic lines at the output of the network, and trains it in the same way on many other crop-row images, so that when an arbitrary test crop image is applied, the positions corresponding to its crop rows are drawn and displayed as graphic lines.
  6. The method of claim 1, wherein, when the crop is rice and the weed is barnyard grass, the two are distinguished using the visual difference in the density of the plant base: rice seedlings are raised by gathering and germinating several rice seeds, so the base of a seedling bunch is dense with several plants, whereas barnyard grass germinates from individually scattered seeds, so the density at the base of its clump is low.
  7. The method of claim 6, wherein the target-image training data for distinguishing rice from barnyard grass by the visual difference in base density are produced by marking the positions corresponding to the bases of the rice and of the barnyard grass in the input image with symbols of different colours or shapes, thereby producing target images for rice/barnyard-grass identification.
  8. The method of claim 5, wherein the database needed to train the crop-row recognition neural network is provided by taking, as input images, crop-row images photographed in the direction the weeder is to follow; taking, as target images, images in which graphic lines are drawn at the positions corresponding to each crop row in the input image; and preparing a large number of training data sets, each consisting of one such input-image/target-image pair.
  9. The method of claim 6, wherein the training database needed for the crop-weed identification neural network is produced by photographing, as the network input images, images containing both crops and weeds on the crop rows, taken closely enough that the shapes of the crops and weeds are clearly distinguishable; producing, as the target image for each input image, an image using symbols of different shapes or colours at the positions corresponding to each crop and weed so that they can be distinguished; and providing a large number of training data sets, each consisting of one such input-image/target-image pair.
  10. The method of claim 9, wherein the target images in the training database needed for the crop-weed identification neural network are produced graphically, on an initially blank image or on a copy of the input image, with lines, figures, or symbols in various colours so that each object can be distinguished.
  11. The method of claim 9, wherein, when producing the target images in the training database needed for the crop-weed identification neural network, the position and size of each symbol are determined with reference to the input image, and the centre position and size of each symbol are chosen and marked on the target image so that the symbol region contains as many features as possible that strongly distinguish the objects from one another.
  12. An apparatus for identifying crops and weeds by neural network learning, comprising: a crop image–crop-row image database (220) composed of graphic display images of crop-row positions produced by the method of claim 5 or claim 8; a CED neural network for crop-row recognition (230) that learns from the image database (220) by the method of any one of claims 1 to 4; a crop image–crop type and position image database (250) composed of symbol display images of crop/weed types and positions produced by the method of any one of claims 6, 7, and 9 to 11; a CED neural network for crop/weed type and position recognition (260) that learns from the image database (250) by the method of any one of claims 1 to 4; a camera (280) that photographs crop-weed images in real time and applies them to the inputs of the CED neural networks (230, 260); a CPU (100) that constructs and runs the CED neural networks in software, analyses their results, uses them to generate the appropriate control signals needed for the autonomous weeding operation of an externally connected weeder, and provides these signals to the weeder; and a GPU (110) that assists the CPU and runs the CED neural networks at high speed.
  13. The apparatus of claim 12, wherein, because the parameters of the trained CED neural networks already contain the information needed for crop-row recognition and crop-weed identification extracted from the crop image–crop-row image database (220) and the crop image–crop type and position image database (250), those databases are removed to miniaturize the apparatus, and the control system of the weeder is configured simply from the camera (280), the CED neural network for crop-row recognition (230), the CED neural network for weed identification and localization (260), the CPU (100), and the GPU (110).
PCT/KR2018/012883 2017-10-27 2018-10-29 Method and device for crop and weed classification using neural network learning WO2019083336A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2020512648A JP6771800B2 (en) 2017-10-27 2018-10-29 Devices and methods for identifying crops and weeds by learning neural networks

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR20170140783 2017-10-27
KR10-2017-0140783 2017-10-27
KR1020180129482A KR102188521B1 (en) 2017-10-27 2018-10-29 Method and Apparatus for Identification of Crops and Weeds with Neural Network Learning
KR10-2018-0129482 2018-10-29

Publications (1)

Publication Number Publication Date
WO2019083336A1 true WO2019083336A1 (en) 2019-05-02

Family

ID=66247539

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2018/012883 WO2019083336A1 (en) 2017-10-27 2018-10-29 Method and device for crop and weed classification using neural network learning

Country Status (1)

Country Link
WO (1) WO2019083336A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09128538A (en) * 1995-10-26 1997-05-16 Norin Suisansyo Hokkaido Nogyo Shikenjo Detecting method for farm products
JP2003102275A (en) * 2001-09-28 2003-04-08 National Agricultural Research Organization Algorithm for detecting position of crop
KR20080049472A (en) * 2006-11-30 2008-06-04 (주)한백시스템 Information detecting system using photographing apparatus load in vehicle and artificial neural network
KR20170028591A (en) * 2015-09-04 2017-03-14 한국전자통신연구원 Apparatus and method for object recognition with convolution neural network
KR101763835B1 (en) * 2015-10-30 2017-08-03 사단법인 한국온실작물연구소 System for distinguishing image divided by crop organ using image in colony


Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3811748A1 (en) 2019-10-24 2021-04-28 Ekobot Ab A weeding machine and a method for carrying out weeding using the weeding machine
CN114761183A (en) * 2019-12-03 2022-07-15 西门子股份公司 Computerized engineering tool and method for developing neurological skills for robotic systems
CN111325240A (en) * 2020-01-23 2020-06-23 杭州睿琪软件有限公司 Weed-related computer-executable method and computer system
JP2021136032A (en) * 2020-02-25 2021-09-13 ベイジン バイドゥ ネットコム サイエンス アンド テクノロジー カンパニー リミテッド Method and apparatus for detecting mobile traffic light, electronic device and storage medium
JP7164644B2 (en) 2020-02-25 2022-11-01 ベイジン バイドゥ ネットコム サイエンス テクノロジー カンパニー リミテッド Mobile traffic light detection method, device, electronic device and storage medium
US11508162B2 (en) 2020-02-25 2022-11-22 Beijing Baidu Netcom Science And Technology Co., Ltd. Method and apparatus for detecting mobile traffic light
CN111414805A (en) * 2020-02-27 2020-07-14 华南农业大学 Rice-grass identification device and method with intelligent touch sense
CN111414805B (en) * 2020-02-27 2023-10-24 华南农业大学 Tactile intelligent rice-grass identification device and method
CN111556157A (en) * 2020-05-06 2020-08-18 中南民族大学 Crop distribution monitoring method, equipment, storage medium and device
CN113647281A (en) * 2021-07-22 2021-11-16 盘锦光合蟹业有限公司 Weeding method and system
CN114818909A (en) * 2022-04-22 2022-07-29 北大荒信息有限公司 Weed detection method and device based on crop growth characteristics
CN114818909B (en) * 2022-04-22 2023-09-15 北大荒信息有限公司 Weed detection method and device based on crop growth characteristics

Similar Documents

Publication Publication Date Title
WO2019083336A1 (en) Method and device for crop and weed classification using neural network learning
Tian et al. Machine vision identification of tomato seedlings for automated weed control
Ge et al. Fruit localization and environment perception for strawberry harvesting robots
JP6771800B2 (en) Devices and methods for identifying crops and weeds by learning neural networks
Zermas et al. 3D model processing for high throughput phenotype extraction–the case of corn
Dyrmann et al. Pixel-wise classification of weeds and crop in images by using a fully convolutional neural network.
Cheng et al. A feature-based machine learning agent for automatic rice and weed discrimination
CN109886155B (en) Single-plant rice detection and positioning method, system, equipment and medium based on deep learning
Huang et al. Deep localization model for intra-row crop detection in paddy field
Ajayi et al. Effect of varying training epochs of a faster region-based convolutional neural network on the accuracy of an automatic weed classification scheme
de Silva et al. Towards agricultural autonomy: crop row detection under varying field conditions using deep learning
Zermas et al. Extracting phenotypic characteristics of corn crops through the use of reconstructed 3D models
Fernando et al. Ai based greenhouse farming support system with robotic monitoring
Fernando et al. Intelligent disease detection system for greenhouse with a robotic monitoring system
Czymmek et al. Vision based crop row detection for low cost uav imagery in organic agriculture
Dandekar et al. Weed Plant Detection from Agricultural Field Images using YOLOv3 Algorithm
Wang et al. The identification of straight-curved rice seedling rows for automatic row avoidance and weeding system
WO2021198731A1 (en) An artificial-intelligence-based method of agricultural and horticultural plants' physical characteristics and health diagnosing and development assessment.
Higgs et al. ProTractor: a lightweight ground imaging and analysis system for early-season field phenotyping
CN117036926A (en) Weed identification method integrating deep learning and image processing
De Silva et al. Towards infield navigation: leveraging simulated data for crop row detection
Husin et al. Plant chili disease detection using the RGB color model
Goondram et al. Strawberry Detection using Mixed Training on Simulated and Real Data
CN114757891A (en) Plant growth state identification method based on machine vision technology
Avilés-Mejia et al. Autonomous vision-based navigation and control for intra-row weeding

Legal Events

Date Code Title Description
121  Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 18871359; Country of ref document: EP; Kind code of ref document: A1)
DPE2  Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101)
ENP  Entry into the national phase (Ref document number: 2020512648; Country of ref document: JP; Kind code of ref document: A)
NENP  Non-entry into the national phase (Ref country code: DE)
122  Ep: pct application non-entry in european phase (Ref document number: 18871359; Country of ref document: EP; Kind code of ref document: A1)