CN116912702B - Weed coverage determination method, system and device and electronic equipment


Info

Publication number
CN116912702B
CN116912702B (application CN202311186203.XA)
Authority
CN
China
Prior art keywords
type
image
weeds
crop
target area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311186203.XA
Other languages
Chinese (zh)
Other versions
CN116912702A (en)
Inventor
于佳琳
刘腾
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong Provincial Laboratory Of Weifang Modern Agriculture
Original Assignee
Shandong Provincial Laboratory Of Weifang Modern Agriculture
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong Provincial Laboratory Of Weifang Modern Agriculture
Priority to CN202311186203.XA
Publication of CN116912702A
Application granted
Publication of CN116912702B


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/188 Vegetation
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • G06V10/26 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764 Arrangements using classification, e.g. of video objects
    • G06V10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/774 Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
    • G06V10/82 Arrangements using neural networks


Abstract

The application discloses a weed coverage determination method, system, device and electronic equipment. The method comprises: determining the crop type of the crop in a target area; when the crop type is a first type, determining, with a first identification model, the number of pixels corresponding to weeds in the image of the target area; when the crop type is a second type, determining, with a second identification model, the number of detection grids containing weeds; and determining the weed coverage of the target area from the number of weed pixels in the first case, and from the number of weed-containing detection grids in the second case. The method solves the technical problems of low efficiency and poor accuracy of weed coverage evaluation caused by the predominantly manual, naked-eye evaluation used in the related art.

Description

Weed coverage determination method, system and device and electronic equipment
Technical Field
The application relates to the technical field of automation, and in particular to a weed coverage determination method, system, device and electronic equipment.
Background
Weeds compete with crops for light, moisture and soil nutrients and harbor plant diseases and insect pests, reducing grain yield. Field weeds often grow in aggregated patches at different locations in the field. Field weed coverage and distribution therefore need to be detected and evaluated in real time so that weed control decisions can be made promptly. By evaluating weed coverage, farmers and researchers can understand the influence of weeds on crop growth and yield and formulate corresponding control measures.
However, existing field weed surveys are mainly manual: an observer walks a W- or Z-shaped path through the field and estimates weed coverage and distribution by eye, which makes the survey inefficient and inaccurate.
In view of the above problems, no effective solution has been proposed at present.
Disclosure of Invention
The embodiments of the application provide a weed coverage determination method, system, device and electronic equipment, which at least solve the technical problems of low efficiency and poor accuracy of weed coverage evaluation caused by the predominantly manual, naked-eye evaluation used in the related art.
According to one aspect of the embodiments of the present application, there is provided a weed coverage determination method comprising: determining the crop type of the crop in a target area; when the crop type is a first type, determining, with a first identification model, the number of pixels corresponding to weeds in the image of the target area, wherein the planting density of the first type of crop is lower than a preset density threshold; when the crop type is a second type, dividing the image of the target area into a plurality of detection grids with a second identification model and determining the number of detection grids containing weeds, wherein the planting density of the second type of crop is not lower than the preset density threshold; and determining the weed coverage of the target area from the number of weed pixels when the crop type is the first type, and from the number of weed-containing detection grids when the crop type is the second type.
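The two-branch scheme above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the model interfaces (`first_model`, `second_model`) and the fixed square grid size are assumptions standing in for the two identification models described in the claims.

```python
def weed_coverage(image, crop_type, first_model, second_model, grid_size=32):
    """Dispatch to the pixel-based or grid-based coverage estimator by crop type.

    image: 2-D list of pixel values (rows of equal length).
    first_model(image) -> number of weed pixels (first-type crops).
    second_model(image, top, left, grid_size) -> True if that grid has weeds.
    """
    h, w = len(image), len(image[0])
    if crop_type == "first":            # sparsely planted crops (corn, soybean, ...)
        weed_pixels = first_model(image)
        return weed_pixels / (h * w)    # weed pixels over all pixels
    elif crop_type == "second":         # densely planted crops (rice, wheat, ...)
        total, weedy = 0, 0
        for top in range(0, h, grid_size):
            for left in range(0, w, grid_size):
                total += 1
                if second_model(image, top, left, grid_size):
                    weedy += 1
        return weedy / total            # weedy grids over all grids
    raise ValueError(f"unknown crop type: {crop_type}")
```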
Optionally, determining, with the first identification model, the number of pixels corresponding to weeds in the image of the target area comprises: acquiring the original image captured by an unmanned aerial vehicle over the target area; identifying the crop in the original image with the first identification model and removing the identified crop from the original image to obtain a first image; converting the first image into a second image in a preset format, the preset format being the hue-saturation-value (HSV) color space; and determining the pixels corresponding to weeds in the second image according to preset color thresholds, the preset color thresholds comprising a hue threshold, a saturation threshold and a value (brightness) threshold.
Optionally, determining the pixels corresponding to weeds in the second image according to the preset color thresholds comprises: extracting the pixel region of a target color from the second image according to the preset color thresholds, the target color being the color corresponding to weeds; binarizing the second image to obtain a third image in which the pixel region of the target color is a first color and all other regions are a second color; performing erosion and/or dilation on the third image to obtain a plurality of connected regions of the target color; and counting the pixels in each connected region of the third image to obtain the number of pixels corresponding to weeds.
Optionally, the first identification model is deployed on the cloud server and is obtained by training, on a first training data set, an initial model equipped with an attention mechanism comprising channel attention and spatial attention. The first training data set comprises a plurality of training images of the first type of crop at different growth stages and the crop name corresponding to each training image.
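The patent names channel and spatial attention but does not give the architecture. The NumPy sketch below is therefore only an assumed, parameter-free illustration of how the two attention maps could reweight a feature tensor (a learned module would insert small MLPs/convolutions before each sigmoid):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_spatial_attention(features):
    """Apply channel attention, then spatial attention, to a (C, H, W) tensor."""
    c, h, w = features.shape
    # Channel attention: pool each channel (avg + max), squash to a weight per channel.
    chan_desc = features.mean(axis=(1, 2)) + features.max(axis=(1, 2))
    x = features * sigmoid(chan_desc).reshape(c, 1, 1)
    # Spatial attention: pool across channels, squash to a weight per pixel.
    spat_desc = x.mean(axis=0) + x.max(axis=0)
    return x * sigmoid(spat_desc).reshape(1, h, w)
```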
Optionally, determining the number of detection grids containing weeds comprises: acquiring the original image captured by an unmanned aerial vehicle over the target area; dividing the original image into a plurality of detection grids, each containing the same number of pixels; and judging with the second identification model whether each detection grid contains weeds, and counting the number of weed-containing detection grids in the original image.
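The equal-pixel grid division can be sketched as below. The tile size and the requirement that the image dimensions divide evenly are illustrative assumptions; the patent only states that every detection grid holds the same number of pixels.

```python
def split_into_grids(image, tile_h, tile_w):
    """Split a 2-D image (list of equal-length rows) into equally sized tiles.

    Raising on non-divisible dimensions is one simple way to guarantee
    every grid has the same pixel count, as the patent requires.
    """
    h, w = len(image), len(image[0])
    if h % tile_h or w % tile_w:
        raise ValueError("image dimensions must be multiples of the tile size")
    tiles = []
    for top in range(0, h, tile_h):
        for left in range(0, w, tile_w):
            tiles.append([row[left:left + tile_w] for row in image[top:top + tile_h]])
    return tiles
```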
Optionally, the second identification model is deployed on the cloud server and is obtained by training, on a second training data set, an initial model equipped with an attention mechanism comprising channel attention and spatial attention. The second training data set comprises a plurality of training grids obtained by dividing images of the second type of crop, and a target label for each training grid indicating whether that grid contains weeds.
Optionally, determining the weed coverage of the target area from the number of pixels corresponding to weeds comprises: when the crop type is the first type, calculating a first ratio of the number of weed pixels in the image of the target area to the total number of pixels in the image, and taking the first ratio as the weed coverage.
Optionally, determining the weed coverage of the target area from the number of weed-containing detection grids comprises: when the crop type is the second type, calculating a second ratio of the number of weed-containing detection grids in the image of the target area to the total number of detection grids in the image, and taking the second ratio as the weed coverage.
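Both coverage figures are plain ratios; a minimal sketch (function names are illustrative, not from the patent):

```python
def pixel_coverage(weed_pixels, image_width, image_height):
    """First-type crops: weed pixels over all pixels in the image."""
    return weed_pixels / (image_width * image_height)

def grid_coverage(weedy_grids, total_grids):
    """Second-type crops: weed-containing grids over all detection grids."""
    return weedy_grids / total_grids
```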
According to another aspect of the embodiments of the present application, there is also provided a weed coverage determination system comprising an unmanned aerial vehicle, a terminal device and a cloud server. The unmanned aerial vehicle is used for responding to an acquisition instruction sent by the terminal device, acquiring an image of a target area and sending the image to the cloud server. The cloud server is used for determining the crop type of the crop in the target area; when the crop type is a first type, determining, with a first identification model, the number of pixels corresponding to weeds in the image of the target area, the planting density of the first type of crop being lower than a preset density threshold; when the crop type is a second type, dividing the image of the target area into a plurality of detection grids with a second identification model and determining the number of detection grids containing weeds, the planting density of the second type of crop being not lower than the preset density threshold; and determining the weed coverage of the target area from the number of weed pixels when the crop type is the first type, and from the number of weed-containing detection grids when the crop type is the second type.
According to another aspect of the embodiments of the present application, there is also provided a weed coverage determining apparatus comprising: a crop type determination module for determining the crop type of the crop in a target area; a first identification module for determining, with a first identification model, the number of pixels corresponding to weeds in the image of the target area when the crop type is a first type, the planting density of the first type of crop being lower than a preset density threshold; a second identification module for dividing the image of the target area into a plurality of detection grids with a second identification model and determining the number of weed-containing detection grids when the crop type is a second type, the planting density of the second type of crop being not lower than the preset density threshold; and a coverage determining module for determining the weed coverage of the target area from the number of weed pixels when the crop type is the first type, and from the number of weed-containing detection grids when the crop type is the second type.
According to still another aspect of the embodiments of the present application, there is also provided an electronic device comprising a memory and a processor for running a program stored in the memory, wherein the program, when run, performs the weed coverage determination method.
According to still another aspect of the embodiments of the present application, there is also provided a non-volatile storage medium comprising a stored computer program, wherein the device on which the non-volatile storage medium resides performs the weed coverage determination method by running the computer program.
In the embodiments of the present application, the crop type of the crop in a target area is determined; when the crop type is the first type, a first identification model determines the number of pixels corresponding to weeds in the image of the target area, the planting density of the first type of crop being lower than a preset density threshold; when the crop type is the second type, a second identification model divides the image of the target area into a plurality of detection grids and determines the number of weed-containing detection grids, the planting density of the second type of crop being not lower than the preset density threshold; the weed coverage is then determined from the number of weed pixels in the first case and from the number of weed-containing detection grids in the second case. By building a computer vision model suited to each crop type and estimating weed coverage in real time with the appropriate model, the scheme improves the efficiency and precision of weed coverage detection, thereby solving the technical problems of low efficiency and poor accuracy caused by the predominantly manual, naked-eye evaluation used in the related art.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute an undue limitation to the application. In the drawings:
FIG. 1 is a block diagram of the hardware architecture of a computer terminal (or electronic device) for implementing a method of weed coverage determination, provided in accordance with an embodiment of the present application;
FIG. 2 is a schematic illustration of a method flow of weed coverage determination provided in accordance with an embodiment of the present application;
FIG. 3 is a schematic diagram of a target frame for identifying crop plant formation provided in accordance with an embodiment of the present application;
FIG. 4 is a schematic illustration of a first image provided in accordance with an embodiment of the present application;
FIG. 5 is a schematic illustration of a third image provided in accordance with an embodiment of the present application;
FIG. 6 is a schematic illustration of a determination of weed coverage from a detection grid, provided in accordance with an embodiment of the present application;
FIG. 7 is a schematic diagram of a weed coverage determination system provided in accordance with an embodiment of the present application;
FIG. 8 is a schematic diagram of a real-time detection process for field weed according to an embodiment of the present application;
FIG. 9 is a schematic illustration of the overall flow of a weed coverage detection model provided in accordance with embodiments of the present application;
Fig. 10 is a schematic structural view of a weed coverage determining apparatus provided according to an embodiment of the present application.
Detailed Description
To help those skilled in the art better understand the present solution, the technical solutions in the embodiments of the present application are described below with reference to the accompanying drawings. The described embodiments are clearly only some, not all, of the embodiments of the present application. All other embodiments obtained by one of ordinary skill in the art based on the embodiments herein without inventive effort shall fall within the scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that embodiments of the present application described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
During the growing season, the damage caused by weeds invading farmland crops is usually expressed as weed coverage, which needs to be detected promptly to provide effective weed-control decision information. The related-art methods of surveying weed coverage mainly have the following problems:
1) Current field weed surveys are mainly manual: an observer walks a W- or Z-shaped path through the field and evaluates weed coverage and distribution by eye. When the field is large, even a long walk with multi-point sampling cannot cover every plot panoramically, so the survey data are error-prone.
2) In field herbicide trials, after a herbicide treatment is sprayed, the weeding effect in each experimental plot is periodically evaluated by eye. This is time-consuming and laborious, researchers often work for long periods in hot or cold field conditions, and the labor cost is high.
3) Weed researchers typically place fixed grids (quadrats) at random positions in a field test plot, count the grids containing weeds, and estimate weed coverage against the total number of grids, which is time-consuming, labor-intensive and error-prone.
4) Naked-eye evaluation of weed coverage is influenced by subjective factors, is highly error-prone, generally cannot faithfully reflect field weed coverage, and lacks a unified objective evaluation standard.
5) Because the visual range is limited, manual naked-eye estimation cannot assess the weeds of a whole farmland panoramically, so the surveyed weed population distribution and coverage do not reflect the actual situation.
6) With manual naked-eye evaluation, site information must be marked on a map by hand, which is time-consuming, laborious and prone to mislabeling.
In order to solve the above-mentioned problems, related solutions are provided in the embodiments of the present application, and the following detailed description is provided.
In accordance with the embodiments of the present application, a method embodiment of weed coverage determination is provided. It should be noted that the steps illustrated in the flow diagrams of the figures may be performed in a computer system, such as a set of computer-executable instructions, and, although a logical order is shown in the flow diagrams, in some cases the steps may be performed in an order different from that illustrated or described herein.
The method embodiments provided by the embodiments of the present application may be performed in a mobile terminal, a computer terminal, or similar computing device. Fig. 1 shows a hardware block diagram of a computer terminal (or electronic device) for implementing the weed coverage determination method. As shown in fig. 1, the computer terminal 10 (or electronic device) may include one or more processors 102 (shown as 102a, 102b, … …,102 n) which may include, but are not limited to, a microprocessor MCU or a processing device such as a programmable logic device FPGA, a memory 104 for storing data, and a transmission device 106 for communication functions. In addition, the method may further include: a display, an input/output interface (I/O interface), a Universal Serial BUS (USB) port (which may be included as one of the ports of the BUS), a network interface, a power supply, and/or a camera. It will be appreciated by those of ordinary skill in the art that the configuration shown in fig. 1 is merely illustrative and is not intended to limit the configuration of the electronic device described above. For example, the computer terminal 10 may also include more or fewer components than shown in FIG. 1, or have a different configuration than shown in FIG. 1.
It should be noted that the one or more processors 102 and/or other data processing circuits described above may be referred to generally herein as a "data processing circuit". The data processing circuit may be embodied in whole or in part in software, hardware, firmware, or any combination thereof. Furthermore, the data processing circuit may be a single stand-alone processing module, or incorporated, in whole or in part, into any of the other elements in the computer terminal 10 (or electronic device). As referred to in the embodiments of the present application, the data processing circuit acts as a kind of processor control (for example, selecting the path of a variable-resistance terminal connected to an interface).
The memory 104 may be used to store software programs and modules of application software, such as program instructions/data storage devices corresponding to the weed coverage determination methods in the embodiments of the present application, and the processor 102 executes the software programs and modules stored in the memory 104, thereby performing various functional applications and data processing, i.e., implementing the weed coverage determination methods described above. Memory 104 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory located remotely from the processor 102, which may be connected to the computer terminal 10 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission means 106 is arranged to receive or transmit data via a network. The specific examples of the network described above may include a wireless network provided by a communication provider of the computer terminal 10. In one example, the transmission device 106 includes a network adapter (Network Interface Controller, NIC) that can connect to other network devices through a base station to communicate with the internet. In one example, the transmission device 106 may be a Radio Frequency (RF) module for communicating with the internet wirelessly.
The display may be, for example, a touch screen type Liquid Crystal Display (LCD) that may enable a user to interact with a user interface of the computer terminal 10 (or electronic device).
In the above operating environment, the present embodiment provides a weed coverage determining method. Fig. 2 is a schematic flow diagram of the weed coverage determination method according to the present embodiment; as shown in Fig. 2, the method includes the following steps:
step S202, determining the crop type of crops in a target area;
crops are divided into two crop types in this example: a first type and a second type, wherein the planting density of the crop of the first type is below a preset density threshold, for example: corn, sorghum, soybean, cotton, vegetables, etc., are often poorly effective in directly identifying weeds due to inconsistent field weed species, density, plant size, height, but the plant morphology of the first type of crop is often uniform and consistent, and easy to identify.
The planting density of the second type of crop is not lower than a preset density threshold, for example: lawns, rice, wheat, barley, oats, rye, etc., such crops and weeds tend to interleave, and it is difficult to identify individual crop targets by object detection models.
The preset density threshold can be adjusted according to actual requirements.
For the two different crop characteristics described above, the present application provides a different weed coverage calculation scheme for each crop type, as further described below.
Step S204: when the crop type is a first type, determining, with the first identification model, the number of pixels corresponding to weeds in the image of the target area, wherein the planting density of the first type of crop is lower than the preset density threshold;
In the technical solution provided in step S204, the first identification model is deployed on a cloud server and is obtained by training, on a first training data set, an initial model equipped with an attention mechanism comprising channel attention and spatial attention. The first training data set comprises a plurality of training images of the first type of crop at different growth stages and the crop name corresponding to each training image.
Specifically, for the first type of crop, the scheme improves a target detection model by adding an attention mechanism, collects data of first-type crops such as corn, sorghum, soybean, cotton and vegetables at different growth stages (for example, training images of different growth periods and the corresponding crop names), and trains the target detection network at scale to obtain the first identification model. The first identification model accurately identifies the crop in the image and removes it, thereby identifying weeds indirectly: all green ground vegetation other than the crop is treated as weeds. The weed coverage percentage is then calculated as the proportion of the whole image occupied by weeds.
The method of weed coverage calculation for the first type of crop described above is further described below.
In some embodiments of the present application, determining, with the first identification model, the number of pixels corresponding to weeds in the image of the target area includes the following steps: acquiring the original image captured by the unmanned aerial vehicle over the target area; identifying the crop in the original image with the first identification model and removing the identified crop to obtain a first image; converting the first image into a second image in a preset format, the preset format being the hue-saturation-value (HSV) color space; and determining the pixels corresponding to weeds in the second image according to preset color thresholds, the preset color thresholds comprising a hue threshold, a saturation threshold and a value (brightness) threshold.
Specifically, a user can log in a console through a mobile terminal to operate an unmanned aerial vehicle to a farmland land block needing to survey weeds, select a working area (namely the target area), acquire field image information according to Real-time kinematic (RTK) positioning information, acquire an original image through aerial photography, and upload the original image to a cloud server.
The first recognition model in the cloud server firstly recognizes crop plants in the original image to form a target frame, as shown in fig. 3; then, removing crop plants in all target frames in the original image to obtain a first image, as shown in fig. 4; then converting the first image of the RGB color mode into a second image of a hue-saturation-value HSV color space format; and finally, determining the pixel points corresponding to weeds in the second image in the HSV format.
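The RGB-to-HSV conversion described above can be sketched per pixel with Python's standard library. This is an illustrative check rather than the patent's implementation, and the sample pixel value is hypothetical:

```python
import colorsys

def rgb_pixel_to_hsv_degrees(r, g, b):
    """Convert one 8-bit RGB pixel to (hue in degrees, saturation, value)."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return h * 360.0, s, v

# A saturated green pixel lands near hue 120 degrees in the standard HSV scale.
hue, sat, val = rgb_pixel_to_hsv_degrees(0, 200, 0)
```

In practice the whole first image would be converted array-wise (e.g., with an image library), but the per-pixel mapping is the same.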
The method of determining the pixel points corresponding to weeds in the second image in HSV format is further described below.
In some embodiments of the present application, determining the pixel points corresponding to weeds in the second image according to the preset color threshold includes the following steps: extracting a pixel point area of a target color in the second image according to the preset color threshold, wherein the target color is the color corresponding to weeds; performing binarization processing on the second image to obtain a third image, wherein the pixel point area of the target color in the third image is a first color, and the other areas except the pixel point area are a second color; performing erosion and/or dilation operations on the third image to obtain a plurality of connected regions corresponding to the target color; and counting the number of pixel points in each connected region in the third image to obtain the number of pixels corresponding to weeds.
Specifically, a suitable color threshold (the preset color threshold) is first selected to extract the green area (i.e., the pixel point area of the target color) in the second image. In the HSV color space, the Hue of green typically lies around 120° on the standard 0–360° hue scale (around 60 in libraries such as OpenCV that map hue to 0–179), while the Saturation and Value bounds can be adjusted according to the specific situation; selecting a suitable color threshold extracts the green area in the image and thereby obtains the pixel point area of the target color in the second image.
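The threshold-based extraction of the green pixel area can be sketched as follows. The specific hue/saturation/value bounds here are illustrative assumptions, since the patent leaves the preset color threshold to be tuned per situation:

```python
import numpy as np

# Hypothetical HSV thresholds for "green" (hue in degrees, s/v in [0, 1]).
H_LO, H_HI = 90.0, 150.0
S_LO, V_LO = 0.3, 0.2

def green_mask(hsv):
    """hsv: (H, W, 3) array of (hue_deg, sat, val). Returns a boolean weed mask."""
    h, s, v = hsv[..., 0], hsv[..., 1], hsv[..., 2]
    return (h >= H_LO) & (h <= H_HI) & (s >= S_LO) & (v >= V_LO)

# Two pixels: one green weed pixel, one brown soil pixel.
hsv = np.array([[[120.0, 0.8, 0.6], [30.0, 0.5, 0.4]]])
mask = green_mask(hsv)
```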
Then, a binary mask is created for binarization processing; that is, the extracted pixel point area of the target color (the green area) is converted into a binary image (i.e., the above-described third image) using a threshold segmentation operation, in which the green area is white and the other areas are black, as shown in fig. 5, so as to better distinguish green weeds from the background.
According to the noise and detail characteristics of the third image obtained after the binarization processing, morphological operations (such as erosion and/or dilation) can be applied to remove noise or disconnected green weed areas in the third image, so as to obtain a plurality of connected regions corresponding to the target color; in addition, a region filling algorithm can be used to fill the interior of each green weed region, and a boundary line can be drawn around its boundary, in order to better visualize the segmentation result.
Finally, the total number of pixels of all green connected regions in the third image is counted to obtain the number of pixels corresponding to weeds.
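The per-region pixel counting can be sketched with a simple 4-connected flood fill, as a minimal stand-in for the morphology-plus-labeling pipeline (the tiny mask is hypothetical):

```python
import numpy as np
from collections import deque

def count_weed_pixels(binary):
    """Return the pixel count of every 4-connected white region in a 0/1 mask."""
    h, w = binary.shape
    seen = np.zeros_like(binary, dtype=bool)
    region_sizes = []
    for y in range(h):
        for x in range(w):
            if binary[y, x] and not seen[y, x]:
                size, q = 0, deque([(y, x)])
                seen[y, x] = True
                while q:  # breadth-first flood fill over one region
                    cy, cx = q.popleft()
                    size += 1
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w and binary[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            q.append((ny, nx))
                region_sizes.append(size)
    return region_sizes

mask = np.array([[1, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 1]])
sizes = count_weed_pixels(mask)  # two regions in this toy mask
```

Summing the returned region sizes gives the total number of weed pixels.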
Step S206, under the condition that the crop type is a second type, dividing an image corresponding to the target area into a plurality of detection grids by adopting a second identification model, and determining the grid number of the detection grids containing weeds, wherein the planting density of the crop of the second type is not lower than a preset density threshold value;
in the technical solution provided in step S206, the second recognition model is deployed in the cloud server, and is obtained by training an initial model with an attention mechanism through a second training data set, where the attention mechanism includes: channel attention and spatial attention, the second training dataset comprising: the training grids are obtained by dividing images of the crops of the second type, and target labels corresponding to the training grids are used for indicating whether weeds are contained in the training grids.
Specifically, for crops of the second type, the scheme of the present application divides pictures of second-type crops such as lawns, rice, wheat, barley, oat and rye, collected by the unmanned aerial vehicle, into training grids with the same number of pixels; training grids containing weeds are labeled as the positive class and training grids without weeds as the negative class, thereby constructing the second training set. The second training set is used to train a target classification convolutional neural network to obtain the second recognition model, which infers how many grids in an image contain weeds; the percentage of the number of grids containing weeds to the total number of grids in the whole image is then estimated as the weed coverage.
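The grid-division-and-count idea can be sketched as follows, with a trivial stand-in for the second recognition model's per-grid classifier (any flagged pixel marks the grid as containing weeds; an assumption for illustration only):

```python
import numpy as np

def grid_weed_coverage(label_map, grid):
    """Split a (H, W) weed map into grid x grid tiles; a tile counts as a weed
    grid if any pixel is flagged. Stand-in for the CNN per-grid classifier."""
    h, w = label_map.shape
    th, tw = h // grid, w // grid
    weed_grids, total = 0, grid * grid
    for gy in range(grid):
        for gx in range(grid):
            tile = label_map[gy * th:(gy + 1) * th, gx * tw:(gx + 1) * tw]
            if tile.any():
                weed_grids += 1
    return weed_grids, weed_grids / total

label_map = np.zeros((8, 8), dtype=int)
label_map[0, 0] = 1   # weed in the top-left tile only
n, cov = grid_weed_coverage(label_map, grid=4)
```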
It should be noted that, in the embodiments of the present application, both the first recognition model and the second recognition model introduce attention mechanisms during building and training to improve the expressive power and perceptual ability of the models. Specifically, a self-attention mechanism (self-attention) can be used to implement channel attention and spatial attention, allowing the model to take information at other locations into account while computing the representation of each location.
In this embodiment, the introduced channel attention and spatial attention mechanisms are added to the first layer and the last layer of the backbone network, respectively, so as to minimize modification of the original backbone network structure. For channel attention, the global feature of each channel is obtained by global average pooling, and a fully connected or convolutional layer is then applied to map the global features to an attention weight vector. This attention weight vector is applied to the original feature map to weight each channel, thereby producing the effect of channel attention.
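A minimal numpy sketch of such a channel attention block (squeeze-and-excitation style: global average pooling, a bottleneck, then sigmoid channel weights); the weight matrices here are hypothetical placeholders, not trained parameters:

```python
import numpy as np

def channel_attention(feat, w1, w2):
    """Channel attention on a (C, H, W) feature map.
    w1: (C//r, C) and w2: (C, C//r) are hypothetical learned weights."""
    squeezed = feat.mean(axis=(1, 2))                # global average pooling -> (C,)
    hidden = np.maximum(w1 @ squeezed, 0.0)          # ReLU bottleneck
    weights = 1.0 / (1.0 + np.exp(-(w2 @ hidden)))   # sigmoid attention weights
    return feat * weights[:, None, None]             # reweight each channel

feat = np.ones((4, 2, 2))
w1 = np.eye(2, 4)   # toy weights for a 4-channel map with reduction ratio 2
w2 = np.eye(4, 2)
out = channel_attention(feat, w1, w2)
```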
For spatial attention, this is achieved by introducing position coding (position encoding) in the self-attention mechanism. Position coding is a method of embedding position information into a feature representation. The position information is encoded by using sine and cosine functions and the position encoded vector is added to the original feature vector. Thus, the model may take into account the location information at the same time when performing the self-attention calculations.
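The sine/cosine position coding described here can be sketched with the standard transformer-style formulation (dimensions are illustrative):

```python
import numpy as np

def sinusoidal_position_encoding(n_positions, d_model):
    """Transformer-style sine/cosine position codes: (n_positions, d_model)."""
    pos = np.arange(n_positions)[:, None]
    i = np.arange(d_model // 2)[None, :]
    angles = pos / np.power(10000.0, 2 * i / d_model)
    pe = np.zeros((n_positions, d_model))
    pe[:, 0::2] = np.sin(angles)   # even dimensions use sine
    pe[:, 1::2] = np.cos(angles)   # odd dimensions use cosine
    return pe

# Adding the position codes to a (toy) feature sequence, as described above.
features = np.zeros((5, 8))
features_with_pos = features + sinusoidal_position_encoding(5, 8)
```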
The scheme of the present application enhances the expressive power of the first recognition model and the second recognition model by introducing channel attention and spatial attention mechanisms. The channel attention mechanism can adaptively learn the importance of each channel, thereby improving the model's perception of different channel features. The spatial attention mechanism can help the model capture important position information in image or sequence data, thereby improving the model's spatial perception. Moreover, the self-attention mechanism can model the global context in the input sequence or feature map, so that the model can better understand the dependencies between different positions in the sequence or image. The attention mechanism can also concentrate the model's attention on important features, suppress the influence of noise and redundant information, and improve the robustness and generalization of the model. Finally, the attention mechanism can provide an explanation of the model's decisions, i.e., the model can indicate which locations or channels it focuses on to make predictions; by visualizing the attention weights, the model's focus on the input can be better understood, enhancing the interpretability of the model.
After the first recognition model and the second recognition model built according to different crop types are trained, the first recognition model and the second recognition model are deployed in a cloud server, and the real-time weed coverage detection efficiency is improved by means of the calculation force of the cloud large server.
The method of calculating weed coverage for the above-described second type of crop is further described below.
In some embodiments of the present application, determining the number of grids comprising the detection grid of weeds comprises the steps of: acquiring an original image obtained after the unmanned aerial vehicle acquires the image of the target area; dividing an original image into a plurality of detection grids, wherein the number of pixel points in each detection grid is the same; and judging whether the detection grids contain weeds or not by adopting a second identification model, and counting the number of grids of the detection grids containing the weeds in the original image.
Specifically, a user can log in a control console through a mobile terminal to operate the unmanned aerial vehicle to a farmland land block needing to survey weeds, a working area (namely the target area) is selected, and according to RTK positioning information, the unmanned aerial vehicle acquires field image information in an aerial way, obtains an original image and uploads the original image to a cloud server.
Dividing an original image into a plurality of detection grids with equal pixels, and identifying how many detection grids in the original image contain weeds by adopting a second identification model trained in a cloud server to obtain the number of the grids containing the detection grids of the weeds.
Step S208, determining the weed coverage of the target area according to the pixel number corresponding to the weeds when the crop type is the first type, and determining the weed coverage of the target area according to the grid number of the detection grids containing the weeds when the crop type is the second type.
In some embodiments of the present application, determining the weed coverage of the target area according to the number of pixels to which the weed corresponds comprises the steps of: and under the condition that the crop type is the first type, calculating a first ratio of the number of pixels corresponding to weeds in the image corresponding to the target area to the total number of pixels of the image, and determining the first ratio as the weed coverage.
Specifically, in the case where the crop type is the first type, the weed density (weed coverage) is calculated as the ratio of weed pixels (i.e., the number of pixels corresponding to weeds) to the entire image. The calculation formula is as follows:

$$C = \frac{\sum_{i=1}^{N} P_i}{P_{\text{total}}} \times 100\%$$

wherein $N$ represents the total number of green connected regions (i.e., the connected regions corresponding to the above-mentioned target color), $P_i$ represents the number of pixels of the $i$-th green connected region, and $P_{\text{total}}$ represents the total number of pixels of the image.
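A minimal sketch of this coverage computation from per-region pixel counts (function name and sample counts are illustrative):

```python
def weed_coverage_from_regions(region_pixel_counts, total_pixels):
    """Coverage (%) = sum of pixels over all green connected regions / total pixels."""
    return 100.0 * sum(region_pixel_counts) / total_pixels

# Two weed regions of 120 and 80 pixels in a 100x100-pixel image.
cov = weed_coverage_from_regions([120, 80], 10000)
```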
In some embodiments of the present application, determining weed coverage of a target area from the number of grids comprising the weed detection grid comprises the steps of: and under the condition that the crop type is the second type, calculating a second ratio of the grid number of the detection grids containing the weeds in the image corresponding to the target area to the total number of the detection grids in the image, and determining the second ratio as the weed coverage.
Specifically, in the case where the crop type is the second type, the percentage of the number of detection grids containing weeds to the total number of grids in the image is taken as the weed coverage. As shown in fig. 6, there are 100 detection grids in total in the image and 5 detection grids contain weeds, so the weed coverage of the corresponding target area is 5%.
Through the above steps, computer vision models corresponding to different crop types are established respectively, and the weed coverage is estimated in real time through these effective computer vision models. This achieves the purpose of improving the efficiency and precision of weed coverage detection, and solves the technical problems of low efficiency and poor accuracy caused by the manual visual assessment used in the related art for weed coverage evaluation.
Embodiments of a weed coverage determination system are also provided according to embodiments of the present application. Fig. 7 is a schematic diagram of the structure of a weed coverage determination system provided according to an embodiment of the present application. As shown in fig. 7, the system includes: a drone 70, a terminal device 72, a cloud server 74, wherein,
the unmanned aerial vehicle 70 is configured to collect an image of the target area in response to the collection instruction sent by the terminal device 72, and send the image to the cloud server 74;
A cloud server 74 for determining a crop type of the crop in the target area; under the condition that the crop type is a first type, determining the number of pixels corresponding to weeds in an image corresponding to a target area by adopting a first identification model, wherein the planting density of the crop of the first type is lower than a preset density threshold; under the condition that the crop type is a second type, dividing an image corresponding to a target area into a plurality of detection grids by adopting a second identification model, and determining the grid number of the detection grids containing weeds, wherein the planting density of the crop of the second type is not lower than a preset density threshold value; and determining the weed coverage of the target area according to the number of pixels corresponding to the weeds when the crop type is the first type, and determining the weed coverage of the target area according to the number of grids of the detection grid containing the weeds when the crop type is the second type.
The weed coverage determination system is further described below. Fig. 8 is a schematic diagram of a real-time field weed detection process provided according to an embodiment of the present application.
First, weed coverage detection models (comprising the first recognition model and the second recognition model) are established for different crop types; the overall flow of the weed coverage detection models is shown in fig. 9. If the crop is a first-type crop such as corn, sorghum, soybean, cotton or vegetables, an object detection model (i.e., the first recognition model) is used to identify the crop, and all green ground plants other than the crop are identified as weeds; multi-class training of the computer vision model can also cover weed categories including broadleaf weeds, grasses and sedges, and a computer vision model can likewise be created to identify a particular weed. If the crop is a second-type crop such as lawn, rice, wheat, barley, oat or rye, the image acquired by the unmanned aerial vehicle is divided into grids with the same number of pixels, grids containing weeds are labeled as the positive class and grids without weeds as the negative class, and a picture classification convolutional neural network model (i.e., the second recognition model) is established to distinguish whether weeds are present in a grid. The recognition models established by the present application for different crop types essentially cover the current main crops and have a wide application range.
After model training is completed, the models are deployed to the cloud server. Since the computer vision models are deployed only on the cloud server, the user does not need to download a model to the mobile phone, the updating process is transparent to the user, and the user only needs to select the corresponding crop type to detect and estimate the weed coverage in real time, which facilitates user operation.
The user can log in the console through the mobile terminal to operate the unmanned aerial vehicle to a farmland land block needing to survey weeds, an operation area (target area) is selected, field image information is acquired through aerial photography of the unmanned aerial vehicle according to RTK positioning information, an original image is obtained, the acquired original image information is transmitted to the cloud server through a 4G/5G network or other wireless communication technologies, and the weed coverage is determined in real time through a weed coverage detection model (comprising a first recognition model and a second recognition model) deployed by the cloud server.
The flow of the method for determining the weed coverage is shown in fig. 2, and therefore, the explanation of the method for determining the weed coverage is also applicable to the embodiments of the present application, and is not repeated herein.
In addition, longitude and latitude coordinates corresponding to the image acquired by the unmanned aerial vehicle can be marked by using RTK positioning information; in the application scene of a user for investigating herbicide effect in a farmland experimental cell, when an unmanned aerial vehicle flies to a designated field experimental cell, image information is collected, each cell can be numbered according to longitude and latitude, and each number corresponds to herbicide treatment. Meanwhile, the herbicide application time and weather conditions (such as temperature, humidity and the like) are tracked and recorded and used as auxiliary information for herbicide efficacy investigation.
After the cloud server determines the weed coverage of the target area through the weed coverage detection model (comprising a first identification model and a second identification model), the weed coverage and weed species information result can be transmitted to the mobile phone terminal through a 4G/5G network or other wireless communication technology network, so that a user can view the detection result in real time. Meanwhile, the weed coverage data can be stored in the cloud, and a user can download and check the weed coverage data at the same time through a mobile phone terminal or a computer, so that the subsequent data analysis is convenient.
As an alternative embodiment, the weed detection coverage profile of the whole field can also be presented by a panoramic mode; by identifying weed coverage with different color depth information, different color depths represent different percentages of weed coverage, e.g., a dark color represents high weed coverage and a light color represents low weed coverage. Meanwhile, weed detection data and distribution diagram data are also stored in the cloud, and a user can access and analyze the data through a mobile phone terminal or a computer at any time.
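The color-depth rendering described here can be sketched by mapping a coverage percentage to a grayscale shade; the linear dark-for-high mapping is an assumed convention for illustration only:

```python
def coverage_to_shade(coverage_pct):
    """Map weed coverage (0-100 %) to an 8-bit shade: darker = more weeds."""
    coverage_pct = max(0.0, min(100.0, coverage_pct))  # clamp out-of-range input
    return int(round(255 * (1.0 - coverage_pct / 100.0)))

shade = coverage_to_shade(5.0)  # shade for a plot with 5 % weed coverage
```

Each field plot's pixel region in the panoramic map would then be filled with its shade.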
Compared with the manual evaluation methods in the related art, the scheme of the present application, which uses an unmanned aerial vehicle to detect weed coverage and weed distribution, has the following advantages. The weed coverage is estimated in real time by an effective computer vision model with high precision, whereas traditional manual visual observation of weed coverage is influenced by subjective factors and prone to large errors. The unmanned aerial vehicle detects the weed coverage from directly above the farmland, providing a panoramic, dead-angle-free survey, whereas observation with the naked eye is limited by visual range and angle and prone to larger errors. The detected weed coverage data are stored in the cloud for the user to download and transfer to a computer, which facilitates later data analysis, whereas manual weed surveys are usually recorded on paper and then manually entered into a computer, a time-consuming, labor-intensive and error-prone process. Moreover, the scheme requires no manual field survey of weed data: the unmanned aerial vehicle is operated through the mobile-terminal console to survey weed density, which is convenient and greatly saves labor cost.
It should be noted that, the weed coverage determining system provided in the present embodiment may be used to perform the weed coverage determining method shown in fig. 2, and therefore, the explanation of the weed coverage determining method described above is also applicable to the embodiments of the present application, and is not repeated here.
According to an embodiment of the present application, there is also provided an embodiment of a weed coverage determining apparatus. Fig. 10 is a schematic structural view of a weed coverage determining apparatus provided according to an embodiment of the present application. As shown in fig. 10, the apparatus includes:
a crop type determination module 1000 for determining a crop type of a crop in the target area;
the first recognition module 1002 is configured to determine, using a first recognition model, a number of pixels corresponding to weeds in an image corresponding to the target area when the crop type is a first type, where a planting density of the crop of the first type is lower than a preset density threshold;
the second recognition module 1004 is configured to divide, with a second recognition model, an image corresponding to the target area into a plurality of detection grids, and determine the number of grids including the detection grids of the weeds, where a planting density of crops of the second type is not lower than a preset density threshold;
The coverage determining module 1006 is configured to determine, in case of the crop type being of a first type, a weed coverage of the target area according to a number of pixels corresponding to the weeds, and determine, in case of the crop type being of a second type, the weed coverage of the target area according to a number of grids including a detection grid of the weeds.
It should be noted that each module in the weed coverage determination apparatus may be a program module (for example, a set of program instructions for realizing a specific function) or a hardware module; for the latter, it may take, but is not limited to, the following forms: each module is implemented as a processor, or the functions of the modules are realized by a single processor.
It should be noted that, the weed coverage determining apparatus provided in the present embodiment may be used to perform the weed coverage determining method shown in fig. 2, and therefore, the explanation of the weed coverage determining method described above is also applicable to the embodiments of the present application, and is not repeated here.
The embodiment of the application also provides a nonvolatile storage medium, which comprises a stored computer program, wherein the equipment where the nonvolatile storage medium is located executes the following weed coverage determining method by running the computer program: determining a crop type of the crop in the target area; under the condition that the crop type is a first type, determining the number of pixels corresponding to weeds in an image corresponding to a target area by adopting a first identification model, wherein the planting density of the crop of the first type is lower than a preset density threshold; under the condition that the crop type is a second type, dividing an image corresponding to a target area into a plurality of detection grids by adopting a second identification model, and determining the grid number of the detection grids containing weeds, wherein the planting density of the crop of the second type is not lower than a preset density threshold value; and determining the weed coverage of the target area according to the number of pixels corresponding to the weeds when the crop type is the first type, and determining the weed coverage of the target area according to the number of grids of the detection grid containing the weeds when the crop type is the second type.
The foregoing embodiment numbers of the present application are merely for describing, and do not represent advantages or disadvantages of the embodiments.
In the foregoing embodiments of the present application, the descriptions of the embodiments are emphasized, and for a portion of this disclosure that is not described in detail in this embodiment, reference is made to the related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed technology content may be implemented in other manners. The above-described embodiments of the apparatus are merely exemplary, and the division of the units, for example, may be a logic function division, and may be implemented in another manner, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be through some interfaces, units or modules, or may be in electrical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be embodied in essence or a part contributing to the prior art or all or part of the technical solution in the form of a software product stored in a storage medium, including several instructions to cause a computer device (which may be a personal computer, a server or a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a removable hard disk, a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely a preferred embodiment of the present application and it should be noted that modifications and adaptations to those skilled in the art may be made without departing from the principles of the present application and are intended to be comprehended within the scope of the present application.

Claims (10)

1. A method of weed coverage determination, comprising:
determining a crop type of the crop in the target area;
under the condition that the crop type is the first type, determining the number of pixels corresponding to weeds in the image corresponding to the target area by adopting a first identification model comprises the following steps: acquiring an original image obtained after the unmanned aerial vehicle acquires the image of the target area; identifying crops in the original image by adopting the first identification model, and removing the identified crops from the original image to obtain a first image; converting the first image into a second image with a preset format, wherein the preset format is a hue-saturation-color brightness HSV color space format; determining pixel points corresponding to weeds in the second image according to a preset color threshold, wherein the preset color threshold comprises: a hue threshold, a saturation threshold, and a color brightness threshold, wherein the planting density of the first type of crop is less than a preset density threshold, and the plant morphology of the first type of crop is uniform and consistent and different from the plant morphology of the weed;
In the case that the crop type is a second type, dividing the image corresponding to the target area into a plurality of detection grids by using a second recognition model, and determining the grid number of the detection grids containing weeds, wherein the method comprises the following steps: acquiring an original image obtained after the unmanned aerial vehicle acquires the image of the target area; dividing the original image into a plurality of detection grids, wherein the number of pixel points in each detection grid is the same; judging whether weeds are contained in the detection grids by adopting the second identification model, and counting the grid number of the detection grids containing the weeds in the original image, wherein the planting density of the second type of crops is not lower than the preset density threshold value, and the second type of crops are interwoven with the weeds in the target area;
and determining the weed coverage of the target area according to the pixel number corresponding to the weeds when the crop type is the first type, and determining the weed coverage of the target area according to the grid number of the detection grids containing the weeds when the crop type is the second type.
2. The weed coverage determination method according to claim 1, wherein determining the pixel points corresponding to the weeds in the second image according to the preset color threshold value comprises:
extracting a pixel point region of a target color in the second image according to the preset color threshold, wherein the target color is a color corresponding to weeds;
performing binarization processing on the second image to obtain a third image, wherein a pixel point area of the target color in the third image is a first color, and other areas except the pixel point area are second colors;
performing erosion and/or dilation operations on the third image to obtain a plurality of connected regions corresponding to the target color;
and counting the number of pixel points in each connected region in the third image to obtain the number of pixels corresponding to the weeds.
3. The weed coverage determination method according to claim 1, wherein the first recognition model is deployed in a cloud server, trained from a first training data set by an initial model with an attentiveness mechanism, wherein the attentiveness mechanism comprises: channel attention and spatial attention, the first training dataset comprising: a plurality of training images of different growth periods of the first type of crop, and a crop name corresponding to the training images.
4. The weed coverage determination method according to claim 1, wherein the second recognition model is deployed in a cloud server, trained from a second training data set by an initial model with an attentiveness mechanism, wherein the attentiveness mechanism comprises: channel attention and spatial attention, the second training dataset comprising: the system comprises a training grid obtained by dividing the image of the crop of the second type and target labels corresponding to each training grid, wherein the target labels are used for indicating whether weeds are contained in the training grids.
5. The weed coverage determination method according to claim 1, wherein determining the weed coverage of the target area according to the number of pixels corresponding to the weeds comprises:
in the case that the crop type is the first type, calculating a first ratio of the number of pixels corresponding to weeds in the image corresponding to the target area to the total number of pixels of the image, and determining the first ratio as the weed coverage.
6. The weed coverage determination method according to claim 1, wherein determining the weed coverage of the target area according to the number of detection grids containing weeds comprises:
in the case that the crop type is the second type, calculating a second ratio of the number of detection grids containing weeds in the image corresponding to the target area to the total number of detection grids in the image, and determining the second ratio as the weed coverage.
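The two coverage definitions in claims 5 and 6 reduce to simple ratios. A short sketch (function names are illustrative, not from the patent):

```python
def coverage_first_type(weed_pixels: int, total_pixels: int) -> float:
    """Claim 5: first ratio = weed pixels / total pixels in the image."""
    if total_pixels <= 0:
        raise ValueError("image must contain at least one pixel")
    return weed_pixels / total_pixels

def coverage_second_type(weed_grids: int, total_grids: int) -> float:
    """Claim 6: second ratio = grids containing weeds / total detection grids."""
    if total_grids <= 0:
        raise ValueError("image must be divided into at least one detection grid")
    return weed_grids / total_grids
```

For example, 250 weed pixels in a 1000-pixel image and 3 weedy grids out of 12 both yield a coverage of 0.25.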
7. A weed coverage determination system, comprising: an unmanned aerial vehicle, a terminal device, and a cloud server, wherein
the unmanned aerial vehicle is configured to acquire an image of a target area in response to an acquisition instruction sent by the terminal device, and to send the image to the cloud server;
the cloud server is configured to determine the crop type of the crops in the target area;
in the case that the crop type is a first type, determine the number of pixels corresponding to weeds in the image corresponding to the target area by using a first recognition model, which comprises: acquiring the original image captured by the unmanned aerial vehicle over the target area; identifying crops in the original image by using the first recognition model, and removing the identified crops from the original image to obtain a first image; converting the first image into a second image in a preset format, wherein the preset format is the hue-saturation-value (HSV) color space format; and determining pixel points corresponding to weeds in the second image according to a preset color threshold, wherein the preset color threshold comprises a hue threshold, a saturation threshold, and a value (brightness) threshold, the planting density of the first type of crop is less than a preset density threshold, and the plant morphology of the first type of crop is uniform and distinct from that of the weeds;
in the case that the crop type is a second type, divide the image corresponding to the target area into a plurality of detection grids by using a second recognition model and determine the number of detection grids containing weeds, which comprises: acquiring the original image captured by the unmanned aerial vehicle over the target area; dividing the original image into a plurality of detection grids, wherein each detection grid contains the same number of pixel points; and judging, by using the second recognition model, whether each detection grid contains weeds, and counting the number of detection grids containing weeds in the original image, wherein the planting density of the second type of crop is not lower than the preset density threshold, and the second type of crop is interwoven with the weeds in the target area;
and determine the weed coverage of the target area according to the number of pixels corresponding to the weeds when the crop type is the first type, and according to the number of detection grids containing weeds when the crop type is the second type.
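The second-type branch divides the original image into detection grids with an equal number of pixel points each, then counts the grids a classifier flags as weedy. A minimal sketch, where `contains_weeds` is any predicate standing in for the patent's second recognition model:

```python
def split_into_grids(image, grid_h, grid_w):
    """Divide an image (a list of pixel rows) into grid_h x grid_w detection grids.

    Every grid holds the same number of pixel points, so the image dimensions
    must be exact multiples of the grid size.
    """
    rows, cols = len(image), len(image[0])
    if rows % grid_h or cols % grid_w:
        raise ValueError("image dimensions must be divisible by the grid size")
    grids = []
    for r0 in range(0, rows, grid_h):
        for c0 in range(0, cols, grid_w):
            grids.append([row[c0:c0 + grid_w] for row in image[r0:r0 + grid_h]])
    return grids

def count_weed_grids(grids, contains_weeds):
    """Count detection grids flagged by the classifier predicate."""
    return sum(1 for grid in grids if contains_weeds(grid))
```

Dividing a 4x4 image into 2x2 grids yields four grids of four pixels each; the weedy-grid count divided by four would then give the claim 6 coverage ratio.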
8. A weed coverage determining apparatus, comprising:
a crop type determination module for determining a crop type of the crop in the target area;
the first recognition module is configured to, in the case that the crop type is a first type, determine the number of pixels corresponding to weeds in the image corresponding to the target area by using a first recognition model, which comprises: acquiring the original image captured by the unmanned aerial vehicle over the target area; identifying crops in the original image by using the first recognition model, and removing the identified crops from the original image to obtain a first image; converting the first image into a second image in a preset format, wherein the preset format is the hue-saturation-value (HSV) color space format; and determining pixel points corresponding to weeds in the second image according to a preset color threshold, wherein the preset color threshold comprises a hue threshold, a saturation threshold, and a value (brightness) threshold, the planting density of the first type of crop is less than a preset density threshold, and the plant morphology of the first type of crop is uniform and distinct from that of the weeds;
the second recognition module is configured to, in the case that the crop type is a second type, divide the image corresponding to the target area into a plurality of detection grids by using a second recognition model and determine the number of detection grids containing weeds, which comprises: acquiring the original image captured by the unmanned aerial vehicle over the target area; dividing the original image into a plurality of detection grids, wherein each detection grid contains the same number of pixel points; and judging, by using the second recognition model, whether each detection grid contains weeds, and counting the number of detection grids containing weeds in the original image, wherein the planting density of the second type of crop is not lower than the preset density threshold, and the second type of crop is interwoven with the weeds in the target area;
the coverage determination module is configured to determine the weed coverage of the target area according to the number of pixels corresponding to the weeds when the crop type is the first type, and according to the number of detection grids containing weeds when the crop type is the second type.
9. An electronic device, comprising: a memory and a processor, wherein the processor is configured to run a program stored in the memory, and the program, when run, performs the weed coverage determination method according to any one of claims 1 to 6.
10. A non-volatile storage medium storing a computer program, wherein a device on which the non-volatile storage medium resides performs the weed coverage determination method according to any one of claims 1 to 6 when the computer program is run.
CN202311186203.XA 2023-09-14 2023-09-14 Weed coverage determination method, system and device and electronic equipment Active CN116912702B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311186203.XA CN116912702B (en) 2023-09-14 2023-09-14 Weed coverage determination method, system and device and electronic equipment


Publications (2)

Publication Number Publication Date
CN116912702A CN116912702A (en) 2023-10-20
CN116912702B true CN116912702B (en) 2024-01-26

Family

ID=88351569

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311186203.XA Active CN116912702B (en) 2023-09-14 2023-09-14 Weed coverage determination method, system and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN116912702B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111523457A (en) * 2020-04-22 2020-08-11 七海行(深圳)科技有限公司 Weed identification method and weed treatment equipment
AU2020103332A4 (en) * 2020-11-09 2021-01-21 Bhatt, Kaushal MR IMLS-Weed Recognition/Classification: Intelligent Weed Recognition /Classification using Machine Learning System
CN113795846A (en) * 2020-06-24 2021-12-14 深圳市大疆创新科技有限公司 Method, device and computer storage medium for determining crop planting information
CN113807130A (en) * 2020-06-12 2021-12-17 广州极飞科技股份有限公司 Weed identification method and device, computing equipment and storage medium
WO2022079172A1 (en) * 2020-10-14 2022-04-21 Basf Agro Trademarks Gmbh Treatment system for plant specific treatment
CN115018770A (en) * 2022-05-19 2022-09-06 北京大学现代农业研究院 Method and device for determining weeding operation area and weeding equipment
WO2023001526A1 (en) * 2021-07-23 2023-01-26 Robert Bosch Gmbh Weed detection device, method for detecting weeds, computer program, and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3244343A1 (en) * 2016-05-12 2017-11-15 Bayer Cropscience AG Recognition of weed in a natural environment



Similar Documents

Publication Publication Date Title
CN106971167B (en) Crop growth analysis method and system based on unmanned aerial vehicle platform
Yu et al. Automatic image-based detection technology for two critical growth stages of maize: Emergence and three-leaf stage
GB2618896A (en) System and method for crop monitoring
Kawamura et al. Discriminating crops/weeds in an upland rice field from UAV images with the SLIC-RF algorithm
Rasti et al. A survey of high resolution image processing techniques for cereal crop growth monitoring
WO2001033505A2 (en) Multi-variable model for identifying crop response zones in a field
CN103489006A (en) Computer vision-based rice disease, pest and weed diagnostic method
WO2020000043A1 (en) Plant growth feature monitoring
Liu et al. 3DBunch: A novel iOS-smartphone application to evaluate the number of grape berries per bunch using image analysis techniques
CN108776106A (en) A kind of crop condition monitoring method and system based on unmanned plane low-altitude remote sensing
Sunoj et al. Digital image analysis estimates of biomass, carbon, and nitrogen uptake of winter cereal cover crops
CN115687850A (en) Method and device for calculating irrigation water demand of farmland
CN115015258A (en) Crop growth and soil moisture association determination method and related device
Rocha et al. Automatic detection and evaluation of sugarcane planting rows in aerial images
Xiang et al. PhenoStereo: a high-throughput stereo vision system for field-based plant phenotyping-with an application in sorghum stem diameter estimation
CN112541383B (en) Method and device for identifying weed area
CN111985472A (en) Trough hay temperature image processing method based on artificial intelligence and active ball machine
CN116912702B (en) Weed coverage determination method, system and device and electronic equipment
CN113807143A (en) Crop connected domain identification method and device and operation system
CN110781865A (en) Crop growth control system
CN116739739A (en) Loan amount evaluation method and device, electronic equipment and storage medium
de Ocampo et al. Integrated Weed Estimation and Pest Damage Detection in Solanum melongena Plantation via Aerial Vision-based Proximal Sensing.
CN114663652A (en) Image processing method, image processing apparatus, management system, electronic device, and storage medium
CN113807129A (en) Crop area identification method and device, computer equipment and storage medium
CN112733582A (en) Crop yield determination method and device and nonvolatile storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant