WO2019083336A1 - Method and device for classifying crops and weeds by means of neural network learning - Google Patents

Method and device for classifying crops and weeds by means of neural network learning

Info

Publication number
WO2019083336A1
Authority
WO
WIPO (PCT)
Prior art keywords
neural network
crop
image
ced
crops
Prior art date
Application number
PCT/KR2018/012883
Other languages
English (en)
Korean (ko)
Inventor
김형석
박동선
아디카리샴
양희찬
김용진
양창주
Original Assignee
전북대학교산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 전북대학교산학협력단 filed Critical 전북대학교산학협력단
Priority to JP2020512648A priority Critical patent/JP6771800B2/ja
Priority claimed from KR1020180129482A external-priority patent/KR102188521B1/ko
Publication of WO2019083336A1 publication Critical patent/WO2019083336A1/fr

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features

Definitions

  • The present invention belongs to the field of image recognition using neural networks.
  • A CED neural network is used to recognize the rows of a crop, all plants located off those rows are regarded as weeds and removed collectively, and the small number of weeds present within the crop rows are precisely identified, through the learning of another CED neural network, so that they too can be weeded out.
  • Zhang et al. (1995) analyzed and presented the criteria that can distinguish weeds found in wheat fields from color, shape, and texture.
  • Woebbecke et al. (1995a) performed color analysis for separating weeds from the background in images; in particular, the modified hue 2g-r-b in chromatic coordinates distinguished the weeds well from the surrounding environment.
  • Tian et al. (1997) developed and tested a machine vision system that can locate young tomato plants and weeds in the open field. In Korea, research in this area has also been conducted; the authors (1999) carried out a study showing the possibility of detecting weeds in the field by extracting geometric features of the weeds.
  • Weed recognition methods using plant color, shape, and texture in this way are rule-based and therefore do not adapt well to varied environments and plant shapes.
  • The present invention allows a technique for extracting crop rows from a crop image using artificial intelligence, a technique for identifying crops and weeds, and a technique for designating the positions to be weeded to be constructed by the neural networks themselves through learning.
  • Specifically, the present invention extracts the rows of crops using Convolutional Encoder-Decoder (CED) neural network technology so that plants located off the rows are removed as weeds,
  • CED Convolutional Encoder-Decoder
  • and an additional CED neural network is used to identify and remove the weeds growing among the crops.
  • The system consists of a CED neural network used for crop row extraction and a CED neural network used for crop and weed identification.
  • The CED neural networks are composed of several stages between the input and output stages.
  • They have a structure that narrows toward the middle stage, with convolutional computation performed at each stage.
  • The CED neural network also has various structures modified from this basic structure. Although they differ in performance and characteristics, a CED neural network of any such modified structure can be used for crop row extraction and for crop and weed identification. A minimal sketch of the basic structure is given below.
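The following sketch (PyTorch; the layer counts, filter numbers, and class name SimpleCED are illustrative assumptions, not the patent's actual configuration) shows such an encoder-decoder: the encoder stages shrink the feature maps toward the middle of the network, and the decoder stages expand them back to the input resolution.

```python
# Minimal convolutional encoder-decoder (CED) sketch.
# Layer counts and filter numbers are illustrative assumptions.
import torch.nn as nn

class SimpleCED(nn.Module):
    def __init__(self, in_ch=3, out_ch=1):
        super().__init__()
        # Encoder: feature maps shrink toward the middle stage.
        self.encoder = nn.Sequential(
            nn.Conv2d(in_ch, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        # Decoder: feature maps expand back to the input resolution.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),
            nn.ConvTranspose2d(16, out_ch, 2, stride=2),
            nn.Sigmoid(),  # output image with pixel values in [0, 1]
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))
```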
  • A large number of learning data sets are prepared in which a crop image is used as the input image of the CED neural network and a line image, in which the positions of the crop rows in the input image are graphically displayed, is used as the learning target image;
  • the CED neural network repeatedly learns this data set and thereby acquires by itself the technique of extracting crop rows from a crop image. A sketch of how such image-target pairs can be organized follows below.
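As an illustration of how such a learning data set could be organized (the directory layout, file naming, and class name CropRowDataset are assumptions, not taken from the patent), each crop photograph is paired with a row-line target image of the same name:

```python
# Hypothetical paired dataset: crop photographs and graphic row-line target
# images stored under matching file names in two directories.
import os
from PIL import Image
from torch.utils.data import Dataset
import torchvision.transforms.functional as TF

class CropRowDataset(Dataset):
    def __init__(self, image_dir, target_dir):
        self.image_dir = image_dir
        self.target_dir = target_dir
        self.names = sorted(os.listdir(image_dir))  # one target per input image

    def __len__(self):
        return len(self.names)

    def __getitem__(self, idx):
        name = self.names[idx]
        image = Image.open(os.path.join(self.image_dir, name)).convert("RGB")
        target = Image.open(os.path.join(self.target_dir, name)).convert("L")
        return TF.to_tensor(image), TF.to_tensor(target)
```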
  • For crop and weed identification, close-up images in which the shapes of individual crops and weeds can be distinguished are used as input images, and a large number of learning data sets are prepared in which an image marking the crops and weeds in each input image is used as the learning target image;
  • this data set is repeatedly learned, and through this learning the CED neural network acquires the technique of identifying the crops and weeds in a field or paddy image and designating their positions.
  • Instead of relying on algorithms based on conventional image processing techniques, the present invention directly draws the image that is desired as the output and has the network learn it. This provides a new development approach in which the CED neural network constructs, through learning, the technology required to obtain the desired image recognition result. It also has the effect of recognizing the crop rows and identifying crops and weeds with this technology, so that a weeding machine can weed precisely and automatically.
  • Figure 1 is a convolutional encoder-decoder neural network structure usable in the invention
  • Figure 2 shows the structure of crop row and crop-weed identification and position detection system for weeding
  • Figure 3 illustrates some of the training data of the CED neural network for crop row extraction
  • FIG. 4 is an illustration of test results of the learned CED neural network for crop row extraction
  • FIG. 5 illustrates a part of the learning data of the CED neural network for identifying rice seedlings and barnyard grass
  • FIG. 6 illustrates an example of the test results of the learned CED neural network.
  • The accuracy of the CED neural network using the U-Net of Fig. 1(c) is higher than that of the CED neural networks shown in Fig. 1(a) and (b).
  • The CED neural network using DenseNet is superior in accuracy, being somewhat better than the CED neural network using the U-Net.
  • However, it requires far more parameters than a CED neural network using the U-Net, and it is also difficult to select appropriate hyper-parameters.
  • In addition, the CED neural network using DenseNet has the disadvantage that its execution speed is slower than that of the CED neural network using the U-Net.
  • Therefore, the best embodiment, considering accuracy, ease of implementation, and economy together, is one that includes a CED neural network using the U-Net.
  • The convolutional encoder-decoder (CED) neural network used in the present invention has a plurality of stages between its input and output, with the stages becoming smaller toward the middle.
  • The first half, in which the stages become progressively smaller, is referred to as the encoder portion, and the latter half, in which they become progressively larger, is referred to as the decoder portion.
  • The CED neural network structures of Fig. 1(a) and Fig. 1(b), or both, may be used to implement the crop row detection and crop-weed identification methods of the present invention.
  • A CED neural network with a skip structure, i.e., a U-Net, in which the output of each layer of the encoder portion is passed directly to the input of the corresponding layer of the decoder portion, can also be used.
  • A modified CED neural network structure called DenseNet, which adds connections across the layers of the U-Net structure shown in Fig. 1, can also be used, and other neural networks modified from the CED neural network may be used as well.
  • The present invention can also be implemented by adopting different numbers of layers and different numbers of filters for each structure, and its object is still achieved. A minimal sketch of the skip-connection idea appears below.
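The sketch below illustrates the skip connections mentioned above: each encoder feature map is concatenated with the decoder feature map of the same resolution, U-Net style. It is a toy example under assumed channel sizes, not the network of Fig. 1.

```python
# Toy U-Net-style CED: encoder outputs are passed ("skipped") directly to the
# decoder layer of the same resolution via channel concatenation.
import torch
import torch.nn as nn

class TinyUNet(nn.Module):
    def __init__(self, in_ch=3, out_ch=1):
        super().__init__()
        self.enc1 = nn.Sequential(nn.Conv2d(in_ch, 16, 3, padding=1), nn.ReLU())
        self.enc2 = nn.Sequential(nn.Conv2d(16, 32, 3, padding=1), nn.ReLU())
        self.pool = nn.MaxPool2d(2)
        self.up = nn.ConvTranspose2d(32, 16, 2, stride=2)
        # The decoder sees its own 16 channels plus 16 skipped from enc1.
        self.dec = nn.Sequential(nn.Conv2d(32, out_ch, 3, padding=1), nn.Sigmoid())

    def forward(self, x):
        e1 = self.enc1(x)                # full resolution
        e2 = self.enc2(self.pool(e1))    # half resolution
        d1 = self.up(e2)                 # back to full resolution
        return self.dec(torch.cat([d1, e1], dim=1))  # skip connection
```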
  • The present invention comprises a crop row recognition step that recognizes the rows of crops using Convolutional Encoder-Decoder (CED) neural network technology, and a crop-weed identification step that identifies, using an additional CED neural network, the small number of weeds present on the crop rows.
  • The hardware for this is composed of two neural networks: a CED neural network 230 used in the step of recognizing the crop rows and a CED neural network 260 used in the step of identifying the weeds among the crops.
  • For each CED neural network, the task to be performed is exemplified by storing the input images together with the output data to be obtained in the learning databases 220 and 250, and each neural network acquires, through learning, the technology to perform that task.
  • To this end, a large number of crop images are first obtained, and for each crop image a graphic display image 210 of the crop row positions and a symbol display image 240 of the crop and weed positions are produced; these are organized into a crop image-crop row image database 220 and a crop image-crop and weed type and position image database 250 and stored on a computer hard disk.
  • The crop image-crop row image database 220 and the crop image-crop and weed type and position image database 250 are then learned by the crop row recognition CED neural network 230 and by the crop and weed identification and position recognition CED neural network 260, respectively.
  • The learned neural networks take the form of sets of connection parameter values for the pre-designed neural network structures. The following is a detailed description of the learning process of the CED neural network 230 for crop row recognition and of the CED neural network 260 for crop and weed identification and position recognition.
  • FIG. 3 illustrates the case where the crop is rice.
  • The CED neural network is assigned the task of extracting the crop rows from an input image of this crop and presenting them as line images like the right-hand images of FIG. 3.
  • The learning target images display graphic lines, like the right-hand images of FIG. 3, along the positions corresponding to the rice seedling rows shown in the left-hand images. That is, each target image is produced so that, when the right image is superimposed on the left image, each line of the right image lies along the center of a crop row in the left image.
  • With these data sets, the neural network is trained. The training uses the backpropagation learning method commonly employed for neural networks and is repeated until the error falls below a predetermined threshold, as sketched below.
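A hedged sketch of such a training loop is shown here: backpropagation with a pixel-wise loss, repeated until the mean error falls below a chosen threshold or an epoch limit is reached. The loss function, optimizer settings, and threshold value are assumptions, not values stated in the patent.

```python
# Backpropagation training loop, repeated until the error drops below a
# threshold. Optimizer, loss, and threshold are illustrative assumptions.
import torch
from torch.utils.data import DataLoader

def train(model, dataset, threshold=0.01, max_epochs=200, device="cuda"):
    loader = DataLoader(dataset, batch_size=8, shuffle=True)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    criterion = torch.nn.BCELoss()   # target line images treated as binary masks
    model.to(device)
    for epoch in range(max_epochs):
        total = 0.0
        for image, target in loader:
            image, target = image.to(device), target.to(device)
            optimizer.zero_grad()
            loss = criterion(model(image), target)
            loss.backward()          # backpropagation of the error
            optimizer.step()
            total += loss.item()
        if total / len(loader) < threshold:   # stop once the error is small enough
            break
    return model
```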
  • FIG. 4 illustrates test results of the CED neural network trained for the above-mentioned crop row recognition.
  • When input images such as those in the leftmost column are applied to the trained CED neural network, output line images can be obtained.
  • To check whether the lines in the output images correctly indicate the crop row positions of the input images, the outputs were superimposed on the input images, giving the rightmost images in FIG. 4.
  • The positions of the output lines precisely match the positions of the crop rows in the input images.
  • a neural network having a structure similar to that of the Convolutional Encoder-Decoder (CED) is used for discrimination between crops and weeds.
  • For the learning data, a learning target image is created in which one kind of symbol is displayed at the base of each crop plant and a different kind of symbol at the base of each weed, and the neural network learns these images. Since a symbol marks the base of each plant, effective weeding is possible because the weeding machine can act on the plant at exactly that position.
  • The left images in FIG. 5 are images of rice and barnyard grass (a common paddy weed) used as the neural network inputs; the right-hand target images are examples in which gray circular symbols are displayed at the bases of the rice plants in the left images and black circular symbols at the bases of the barnyard grass.
  • The training data for the CED neural network that distinguishes rice seedlings from barnyard grass are prepared by capturing a large number of images such as the left images and creating, for each input image, a learning target image such as the right image.
  • In this example the learning target image is a blank image on which gray and black circular symbols are drawn at the base positions of the rice seedlings and the barnyard grass, but the symbols may instead be superimposed on the input image.
  • the color and shape of the symbol to be displayed can be selected in various ways.
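For illustration, a target image of this kind could be produced as in the sketch below, which draws gray and black filled circles at annotated base positions on a blank (white) image using OpenCV. The function name, gray levels, radius, and annotation format are assumptions.

```python
# Hypothetical helper for creating learning-target images: circular symbols
# of different gray levels drawn at the base positions of crops and weeds.
import numpy as np
import cv2

def make_target(height, width, crop_points, weed_points, radius=8):
    target = np.full((height, width), 255, dtype=np.uint8)  # blank (white) image
    for (x, y) in crop_points:                 # e.g. rice seedling bases
        cv2.circle(target, (x, y), radius, 128, thickness=-1)  # gray symbol
    for (x, y) in weed_points:                 # e.g. barnyard grass bases
        cv2.circle(target, (x, y), radius, 0, thickness=-1)    # black symbol
    return target
```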
  • FIG. 6 illustrates some of the test results of the CED neural network for crop and weed identification.
  • the leftmost images in FIG. 6 are input test images that are not included in the learning data.
  • the intermediate images in FIG. 6 are neural network output images, and the right images are images in which an input image and an output image are superimposed.
  • The gray circular symbols indicate the positions of the rice plants,
  • and the black circular symbols indicate the positions of the barnyard grass.
  • The recognition result images are sent to a weeding robot so that it can perform weeding at the actual positions, sparing the rice seedlings and removing the barnyard grass. A sketch of this position-extraction step follows below.
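The sketch below shows one way the post-processing implied here could turn the network's symbol image into coordinate lists for a weeding robot, using simple gray-level thresholding and connected-component centroids. The gray-level ranges and function name are assumptions.

```python
# Extract symbol centroids from the network's output image so the positions
# can be sent to a weeding robot. Gray-level ranges are assumptions.
import cv2

def symbol_positions(output_img, lo, hi):
    """Return (x, y) centroids of symbols whose gray level lies in [lo, hi]."""
    mask = cv2.inRange(output_img, lo, hi)
    n, _, _, centroids = cv2.connectedComponentsWithStats(mask)
    return [tuple(map(int, c)) for c in centroids[1:]]  # index 0 is background

# Example use on an 8-bit single-channel output image 'out':
# rice_xy = symbol_positions(out, 100, 160)   # gray symbols (rice)
# weed_xy = symbol_positions(out, 0, 40)      # black symbols (barnyard grass)
```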
  • In operation, an image is acquired in real time through a camera 280, and the acquired image must be applied simultaneously to the crop row recognition CED neural network 230 and to the crop and weed identification CED neural network 260 and processed at high speed.
  • the CPU 100 is used to control the flow of information and configure the neural network in software.
  • The CPU 100 post-processes the outputs of the neural networks, analyzes the results, and uses them to generate the control signals necessary for autonomous traveling of the external weeding machines. During this process, particularly high-speed signal processing is required to construct the CED neural networks and perform their operations.
  • the GPU 110 is used as an auxiliary device.
  • Since the parameters of the learned CED neural networks already contain the information necessary for crop row recognition and crop-weed identification, extracted from the crop image-crop row image database 220 and the crop image-crop and weed type and position image database 250, the databases can be removed, and the weeding machine control system 300 can be configured simply with a camera 280, the crop row recognition CED neural network 230, the crop and weed identification and position recognition CED neural network 260, the CPU 100, and the GPU 110.
  • The weeding machine control system configured as described above first acquires an image in real time through the camera 280 and transmits the acquired image simultaneously to the crop row recognition CED neural network 230 and to the crop and weed identification and position recognition CED neural network 260.
  • the CPU 100 is used to control the flow of information and configure the neural network in software.
  • The CPU 100 post-processes the neural network outputs, analyzes the results, and generates the control signals necessary for the autonomous weeding operation of the externally connected weeder 290.
  • Here again, high-speed signal processing is required to construct the CED neural networks and perform their operations, and a sketch of this real-time processing flow is given below.
  • the GPU 110 is used as an auxiliary device.
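The sketch below illustrates the real-time flow described above: each camera frame is fed to both learned CED networks on the GPU, and CPU-side code post-processes the two output maps before control commands are generated for the weeder. The capture API, device name, and function names are assumptions for illustration.

```python
# Real-time flow sketch: one camera frame -> both CED networks on the GPU ->
# CPU-side post-processing. Names and devices are illustrative assumptions.
import cv2
import torch

def process_frame(frame_bgr, row_net, weed_net, device="cuda"):
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    x = (torch.from_numpy(rgb).permute(2, 0, 1).float() / 255.0).unsqueeze(0).to(device)
    with torch.no_grad():                  # inference only, no learning
        row_map = row_net(x)               # crop-row line image
        weed_map = weed_net(x)             # crop/weed symbol image
    # The CPU then post-processes these maps into steering and weeding commands
    # for the externally connected weeder.
    return row_map.squeeze().cpu().numpy(), weed_map.squeeze().cpu().numpy()

# cap = cv2.VideoCapture(0)        # camera 280 in the figures
# ok, frame = cap.read()
# row_img, weed_img = process_frame(frame, row_net, weed_net)
```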
  • The present invention aims at automating and mechanizing weeding work, which occupies a large part of the farmer's labor in crop cultivation, thereby improving the efficiency of cultivation and reducing production costs such as the cost of employing workers for weeding. It is therefore highly likely to be industrially applicable to the cultivation of crops that require weeding work.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computational Linguistics (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Health & Medical Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The present invention relates to a technology for classifying crops and weeds, comprising the steps of: recognizing crop rows using a convolutional encoder-decoder (CED) neural network technique, so that all plants located between the rows are classified as weeds; and, for the small number of weeds located within the crop rows, distinguishing the weeds from the crops. A CED neural network for extracting crop rows receives a crop image as an input image and learns a target image obtained by marking graphic lines on the input image at the positions corresponding to the crop rows, so as to develop, by itself, the skill of extracting crop rows from a crop image; and a CED neural network for weed classification receives, as an input image, a close-up image of crops and learns a target image obtained by marking different types (colors/shapes) of symbols, according to the crop or weed species, on the input image at the positions corresponding to the crops or weeds, so as to develop, by itself, the skill of classifying crops and weeds. Information on the positions of the crop rows and on the positions of the weeds within the rows, acquired by the technology of the present invention, can be transmitted to a weeding machine so that the weeding machine can move its weeding tool to the corresponding positions and perform weeding.
PCT/KR2018/012883 2017-10-27 2018-10-29 Procédé et dispositif de classification de cultures et de mauvaises herbes au moyen d'un apprentissage de réseau neuronal WO2019083336A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2020512648A JP6771800B2 (ja) 2017-10-27 2018-10-29 神経回路網の学習による作物と雑草を識別する装置及び方法

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2017-0140783 2017-10-27
KR20170140783 2017-10-27
KR1020180129482A KR102188521B1 (ko) 2017-10-27 2018-10-29 신경회로망 학습에 의한 작물과 잡초 식별 방법 및 장치
KR10-2018-0129482 2018-10-29

Publications (1)

Publication Number Publication Date
WO2019083336A1 true WO2019083336A1 (fr) 2019-05-02

Family

ID=66247539

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2018/012883 WO2019083336A1 (fr) 2017-10-27 2018-10-29 Procédé et dispositif de classification de cultures et de mauvaises herbes au moyen d'un apprentissage de réseau neuronal

Country Status (1)

Country Link
WO (1) WO2019083336A1 (fr)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111325240A (zh) * 2020-01-23 2020-06-23 杭州睿琪软件有限公司 与杂草相关的计算机可执行的方法和计算机系统
CN111414805A (zh) * 2020-02-27 2020-07-14 华南农业大学 一种触觉智能的稻-草辨识装置和方法
CN111556157A (zh) * 2020-05-06 2020-08-18 中南民族大学 农作物分布的监测方法、设备、存储介质及装置
EP3811748A1 (fr) 2019-10-24 2021-04-28 Ekobot Ab Machine de désherbage et procédé permettant de mettre en œuvre le désherbage à l'aide de la machine de désherbage
JP2021136032A (ja) * 2020-02-25 2021-09-13 ベイジン バイドゥ ネットコム サイエンス アンド テクノロジー カンパニー リミテッド 移動式信号機の検出方法、装置、電子機器及び記憶媒体
CN113647281A (zh) * 2021-07-22 2021-11-16 盘锦光合蟹业有限公司 一种除草方法及系统
CN114761183A (zh) * 2019-12-03 2022-07-15 西门子股份公司 用于为机器人系统开发神经技能的计算机化工程工具和方法
CN114818909A (zh) * 2022-04-22 2022-07-29 北大荒信息有限公司 一种基于作物长势特征的杂草检测方法和装置

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09128538A (ja) * 1995-10-26 1997-05-16 Norin Suisansyo Hokkaido Nogyo Shikenjo 作物の検出方法
JP2003102275A (ja) * 2001-09-28 2003-04-08 National Agricultural Research Organization 作物位置検出のためのアルゴリズム
KR20080049472A (ko) * 2006-11-30 2008-06-04 (주)한백시스템 차량 탑재형 촬영장치 및 인공신경회로망을 이용한정보검출시스템
KR20170028591A (ko) * 2015-09-04 2017-03-14 한국전자통신연구원 컨볼루션 신경망을 이용한 객체 인식 장치 및 방법
KR101763835B1 (ko) * 2015-10-30 2017-08-03 사단법인 한국온실작물연구소 군락에서 영상이미지를 통한 작물기관별 이미지식별시스템

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09128538A (ja) * 1995-10-26 1997-05-16 Norin Suisansyo Hokkaido Nogyo Shikenjo 作物の検出方法
JP2003102275A (ja) * 2001-09-28 2003-04-08 National Agricultural Research Organization 作物位置検出のためのアルゴリズム
KR20080049472A (ko) * 2006-11-30 2008-06-04 (주)한백시스템 차량 탑재형 촬영장치 및 인공신경회로망을 이용한정보검출시스템
KR20170028591A (ko) * 2015-09-04 2017-03-14 한국전자통신연구원 컨볼루션 신경망을 이용한 객체 인식 장치 및 방법
KR101763835B1 (ko) * 2015-10-30 2017-08-03 사단법인 한국온실작물연구소 군락에서 영상이미지를 통한 작물기관별 이미지식별시스템

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3811748A1 (fr) 2019-10-24 2021-04-28 Ekobot Ab Machine de désherbage et procédé permettant de mettre en œuvre le désherbage à l'aide de la machine de désherbage
CN114761183A (zh) * 2019-12-03 2022-07-15 西门子股份公司 用于为机器人系统开发神经技能的计算机化工程工具和方法
CN111325240A (zh) * 2020-01-23 2020-06-23 杭州睿琪软件有限公司 与杂草相关的计算机可执行的方法和计算机系统
JP2021136032A (ja) * 2020-02-25 2021-09-13 ベイジン バイドゥ ネットコム サイエンス アンド テクノロジー カンパニー リミテッド 移動式信号機の検出方法、装置、電子機器及び記憶媒体
JP7164644B2 (ja) 2020-02-25 2022-11-01 ベイジン バイドゥ ネットコム サイエンス テクノロジー カンパニー リミテッド 移動式信号機の検出方法、装置、電子機器及び記憶媒体
US11508162B2 (en) 2020-02-25 2022-11-22 Beijing Baidu Netcom Science And Technology Co., Ltd. Method and apparatus for detecting mobile traffic light
CN111414805A (zh) * 2020-02-27 2020-07-14 华南农业大学 一种触觉智能的稻-草辨识装置和方法
CN111414805B (zh) * 2020-02-27 2023-10-24 华南农业大学 一种触觉智能的稻-草辨识装置和方法
CN111556157A (zh) * 2020-05-06 2020-08-18 中南民族大学 农作物分布的监测方法、设备、存储介质及装置
CN113647281A (zh) * 2021-07-22 2021-11-16 盘锦光合蟹业有限公司 一种除草方法及系统
CN114818909A (zh) * 2022-04-22 2022-07-29 北大荒信息有限公司 一种基于作物长势特征的杂草检测方法和装置
CN114818909B (zh) * 2022-04-22 2023-09-15 北大荒信息有限公司 一种基于作物长势特征的杂草检测方法和装置

Similar Documents

Publication Publication Date Title
WO2019083336A1 (fr) Procédé et dispositif de classification de cultures et de mauvaises herbes au moyen d'un apprentissage de réseau neuronal
Tian et al. Machine vision identification of tomato seedlings for automated weed control
Ge et al. Fruit localization and environment perception for strawberry harvesting robots
JP6771800B2 (ja) 神経回路網の学習による作物と雑草を識別する装置及び方法
Zermas et al. 3D model processing for high throughput phenotype extraction–the case of corn
Dyrmann et al. Pixel-wise classification of weeds and crop in images by using a fully convolutional neural network.
Cheng et al. A feature-based machine learning agent for automatic rice and weed discrimination
CN109886155B (zh) 基于深度学习的单株水稻检测定位方法、系统、设备及介质
Huang et al. Deep localization model for intra-row crop detection in paddy field
Ajayi et al. Effect of varying training epochs of a faster region-based convolutional neural network on the accuracy of an automatic weed classification scheme
de Silva et al. Towards agricultural autonomy: crop row detection under varying field conditions using deep learning
Zermas et al. Extracting phenotypic characteristics of corn crops through the use of reconstructed 3D models
Fernando et al. Ai based greenhouse farming support system with robotic monitoring
Fernando et al. Intelligent disease detection system for greenhouse with a robotic monitoring system
Czymmek et al. Vision based crop row detection for low cost uav imagery in organic agriculture
Dandekar et al. Weed Plant Detection from Agricultural Field Images using YOLOv3 Algorithm
Wang et al. The identification of straight-curved rice seedling rows for automatic row avoidance and weeding system
WO2021198731A1 (fr) Procédé de diagnostic de santé et d'évaluation du développement de caractéristiques physiques de plantes agricoles et horticoles basé sur l'intelligence artificielle
CN117036926A (zh) 一种融合深度学习与图像处理的杂草识别方法
De Silva et al. Towards infield navigation: leveraging simulated data for crop row detection
Husin et al. Plant chili disease detection using the RGB color model
CN114757891A (zh) 一种基于机器视觉技术的植物生长状态的识别方法
Avilés-Mejia et al. Autonomous vision-based navigation and control for intra-row weeding
Wu Detection of salient region of in-field rapeseed plant images based-on visual attention model
Herrera et al. A new combined strategy for discrimination between types of weed

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18871359

Country of ref document: EP

Kind code of ref document: A1

DPE2 Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101)
ENP Entry into the national phase

Ref document number: 2020512648

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18871359

Country of ref document: EP

Kind code of ref document: A1