CN117557924B - Agricultural environment monitoring method, device, system and storage medium - Google Patents

Agricultural environment monitoring method, device, system and storage medium

Info

Publication number
CN117557924B
CN117557924B
Authority
CN
China
Prior art keywords
plant
objects
structural
value
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311598081.5A
Other languages
Chinese (zh)
Other versions
CN117557924A (en)
Inventor
李轶骥
刘丽
朱润华
胡思雪
阳圣莹
夏武奇
刘爱华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Science And Technology Guarantee Center Of Sichuan Academy Of Agricultural Sciences
Original Assignee
Science And Technology Guarantee Center Of Sichuan Academy Of Agricultural Sciences
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Science And Technology Guarantee Center Of Sichuan Academy Of Agricultural Sciences
Priority to CN202311598081.5A
Publication of CN117557924A
Application granted
Publication of CN117557924B
Active legal status
Anticipated expiration legal status

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/10 - Terrestrial scenes
    • G06V20/17 - Terrestrial scenes taken from planes or by drones
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01D - MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D21/00 - Measuring or testing not otherwise provided for
    • G01D21/02 - Measuring two or more variables by means not covered by a single other subclass
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/60 - Analysis of geometric attributes
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/40 - Extraction of image or video features
    • G06V10/56 - Extraction of image or video features relating to colour
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/10 - Terrestrial scenes
    • G06V20/188 - Vegetation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30181 - Earth observation
    • G06T2207/30188 - Vegetation; Agriculture

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Remote Sensing (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Geometry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The application relates to an agricultural environment monitoring method, device, system and storage medium. The method comprises: obtaining a plurality of plant objects; constructing a selection range and selecting a plurality of structural objects within the selection range; acquiring height data of the structural objects and grouping the structural objects by the height data; acquiring color data and size data of all structural objects in each group; comparing the obtained color data value with a corresponding color data threshold value to obtain a first ratio value, and comparing the obtained width data average value with a corresponding size data threshold value to obtain a second ratio value; and calculating the deviation degree of the first ratio and the second ratio to obtain environment evaluation value data. With the agricultural environment monitoring method, device, system and storage medium disclosed by the application, information about crop growth is obtained through quantitative analysis of the acquired images, so as to determine whether the crop growth environment meets requirements, and long-term monitoring can be realized.

Description

Agricultural environment monitoring method, device, system and storage medium
Technical Field
The present application relates to the field of data processing technologies, and in particular, to a method, an apparatus, a system, and a storage medium for monitoring agricultural environments.
Background
Agricultural environment monitoring uses technical means to understand the growth process, growth environment and so on of crops. Ground monitoring relies on various sensors, remote photography and the like, while aerial monitoring obtains data by remote sensing.
However, for agricultural environment monitoring over a large area, the amount of data a sensor-based approach can acquire is limited and cannot provide enough data for analysis, so this form of data acquisition has certain shortcomings. Remote sensing data mainly rely on high-altitude detection means such as satellites; this approach offers wide coverage but falls short in timeliness, pertinence and the like.
With the rise of unmanned aerial vehicle technology, acquiring information by unmanned aerial vehicle has advantages in convenience of acquisition and in data volume, but no standard has yet formed for processing the acquired images, for example how to obtain information about crop growth by analyzing the images and then determine whether the crop growth environment is suitable.
Disclosure of Invention
The application provides an agricultural environment monitoring method, device, system and storage medium, which obtain information about crop growth through quantitative analysis of acquired images, so as to determine whether the crop growth environment meets requirements.
The above object of the present application is achieved by the following technical solutions:
in a first aspect, the present application provides an agricultural environment monitoring method, comprising:
responding to the acquired image, analyzing the image to obtain a plurality of plant objects;
constructing a selection range by taking the center of the plant object as a reference, and selecting a plurality of structural objects in the selection range;
acquiring height data of the structural objects and grouping the structural objects by using the height data;
acquiring color data and size data of all structural objects in each group;
calculating the color data value and the width data average value of all the structural objects in each group;
comparing the obtained color data value with a corresponding color data threshold value to obtain a first ratio value, and comparing the obtained width data average value with a corresponding size data threshold value to obtain a second ratio value; and
calculating the deviation degree of the first ratio and the second ratio to obtain environment evaluation value data.
In a possible implementation manner of the first aspect, parsing the image to obtain a plurality of plant objects includes determining plant objects according to color boundaries.
In a possible implementation manner of the first aspect, parsing the image to obtain a plurality of plant objects includes dividing the image into a plurality of sub-images according to the position coordinates, each sub-image being a plant object.
In a possible implementation manner of the first aspect, parsing the image to obtain a plurality of plant objects includes:
identifying a plant object type in the image;
determining the central position of a plant object according to the color difference area distribution;
defining plant ranges according to the central position areas, wherein no overlapping areas exist between adjacent plant ranges; and
screening out the structural objects belonging to the plant object within each plant range, and retaining the structural objects associated with the plant object of that plant range.
In a possible implementation manner of the first aspect, the screening of the structural objects belonging to the plant object within the plant range includes:
decomposing the structural objects in the image to obtain a plurality of structural objects;
determining, according to the width change of the structural object, whether the structural object is associated with a plant object within the plant range; and
retaining the structural objects associated with a plant object within the plant range, and deleting the structural objects not associated with any plant object within the plant range.
In a possible implementation manner of the first aspect, calculating the color data value of the structural object includes:
separating the structural object from the image;
processing the structural objects separated from the image by using a plurality of gray screening domains to obtain a plurality of structural object gray level images, wherein the gray screening domains comprise a normal gray screening domain and an abnormal gray screening domain;
calculating the number of region points in the gray level image of the structural object;
accumulating the number of region points in the structural object gray level images obtained by processing with the abnormal gray screening domain, to obtain an accumulated value; and
taking the accumulated value as the color data value of the structural object.
In a possible implementation manner of the first aspect, calculating the width data mean of the structural object includes:
determining a boundary of the structural object and dividing the boundary into a plurality of line segments;
determining the included angle of any two adjacent line segments, and taking the point corresponding to the smallest included angle as a starting point;
drawing a simulated center line using the starting point and the boundary of the structural object;
obtaining width data change values of the structural object according to the simulated center line; and
carrying out average value calculation on the obtained width data change values to obtain a width data average value.
In a second aspect, the present application provides an agricultural environment monitoring device comprising:
The image analysis unit is used for responding to the acquired image and analyzing the image to obtain a plurality of plant objects;
the selecting unit is used for constructing a selecting range by taking the center of the plant object as a reference and selecting a plurality of structural objects in the selecting range;
A first processing unit for acquiring height data of the structural object and grouping the structural object using the height data;
the second processing unit is used for acquiring color data and size data of all the structural objects in each group;
A numerical value calculating unit for calculating color data values and width data average values of all the structural objects in each group;
The numerical value comparison unit is used for comparing the obtained color data value with the corresponding color data threshold value to obtain a first ratio value, and comparing the obtained width data average value with the corresponding size data threshold value to obtain a second ratio value; and
The numerical value evaluation unit is used for calculating the deviation degree of the first ratio and the second ratio to obtain environment evaluation value data.
In a third aspect, the present application provides an agricultural environment monitoring system, the system comprising:
One or more memories for storing instructions; and
One or more processors configured to invoke and execute the instructions from the memory, to perform the method as described in the first aspect and any possible implementation of the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium comprising:
A program which, when executed by a processor, performs a method as described in the first aspect and any possible implementation of the first aspect.
In a fifth aspect, the present application provides a computer program product comprising program instructions which, when executed by a computing device, perform a method as described in the first aspect and any possible implementation of the first aspect.
In a sixth aspect, the present application provides a chip system comprising a processor for implementing the functions involved in the above aspects, e.g. generating, receiving, transmitting, or processing data and/or information involved in the above methods.
The chip system may consist of chips, or may comprise chips and other discrete devices.
In one possible design, the system on a chip also includes memory to hold the necessary program instructions and data. The processor and the memory may be decoupled, provided on different devices, respectively, connected by wire or wirelessly, or the processor and the memory may be coupled on the same device.
Drawings
Fig. 1 is a schematic flow chart of steps of an agricultural environment monitoring method provided by the application.
Fig. 2 is a schematic diagram of the first manner of obtaining plant objects provided by the application.
Fig. 3 is a schematic diagram of the second manner of obtaining plant objects provided by the application.
Fig. 4 is a schematic diagram of the third manner of obtaining plant objects provided by the application.
Fig. 5 is a schematic block diagram of the third manner of obtaining plant objects provided by the application.
Detailed Description
The technical scheme in the application is further described in detail below with reference to the accompanying drawings.
The application discloses an agricultural environment monitoring method, referring to fig. 1, comprising the following steps:
S101, responding to the acquired image, and analyzing the image to obtain a plurality of plant objects;
S102, constructing a selection range by taking the center of a plant object as a reference, and selecting a plurality of structural objects in the selection range;
S103, acquiring height data of the structural objects and grouping the structural objects by using the height data;
S104, acquiring color data and size data of all the structural objects in each group;
S105, calculating the color data value and the width data average value of all the structural objects in each group;
S106, comparing the obtained color data value with a corresponding color data threshold value to obtain a first ratio value, and comparing the obtained width data average value with a corresponding size data threshold value to obtain a second ratio value; and
S107, calculating the deviation degree of the first ratio and the second ratio to obtain environment evaluation value data.
It should first be noted that the agricultural environment monitoring method disclosed by the application is implemented by means of an unmanned aerial vehicle carrying an image sensor and a laser radar: the image sensor collects the image data and the laser radar collects three-dimensional spatial distance data.
The collected data may be processed in a processor of the unmanned aerial vehicle or sent to a data processing server for processing; for convenience of description, both are referred to below simply as the server.
In step S101, the image is sent to the server. The received image first needs to be parsed to obtain a plurality of plant objects, where a plant object refers to a crop growing on the ground. Specific examples of the image parsing method follow:
In a first manner, referring to fig. 2, obtaining a plurality of plant objects includes determining the plant objects according to color boundaries: obvious boundaries exist between plant objects, and these can be delimited by color or recognized using a neural network.
In a second manner, referring to fig. 3, obtaining a plurality of plant objects includes dividing the image into a plurality of sub-images (the dashed rectangular boxes in fig. 3) according to position coordinates, each sub-image being treated as one plant object. This manner suits situations of high planting density or luxuriant foliage, where individual plants are hard to separate by recognition, so region division is used instead, as in the sketch below.
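As an illustration of the second manner, the following is a minimal sketch of grid-based sub-image division, assuming the aerial image is available as a NumPy array and the grid dimensions are chosen from the known planting layout; the function name and parameters are illustrative, not taken from the patent.

```python
import numpy as np

def split_into_plant_objects(image: np.ndarray, rows: int, cols: int):
    """Divide an aerial image into a rows x cols grid of sub-images,
    each treated as one plant object (dense-canopy case)."""
    h, w = image.shape[:2]
    row_edges = np.linspace(0, h, rows + 1, dtype=int)
    col_edges = np.linspace(0, w, cols + 1, dtype=int)
    sub_images = []
    for r in range(rows):
        for c in range(cols):
            sub = image[row_edges[r]:row_edges[r + 1],
                        col_edges[c]:col_edges[c + 1]]
            sub_images.append(sub)
    return sub_images
```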
In a third manner, referring to fig. 4 and fig. 5, the steps are as follows:
S201, identifying the type of plant object in the image;
S202, determining the center position of the plant object according to the color difference region distribution;
S203, defining plant ranges according to the center position regions, wherein no overlapping region exists between adjacent plant ranges; and
S204, screening out the structural objects belonging to the plant object within each plant range, and retaining the structural objects associated with the plant object of that plant range.
In steps S201 to S204, the type of plant object in the image is first identified, and then the center position of the plant object is determined from the color difference region distribution. Taking corn as an example, the tassel and the leaves of corn have clearly distinguishable characteristics in both color and shape.
However, because the plant object appears only two-dimensionally in the image, its shape is not clearly displayed. The main reason is that, under the top-down viewing angle, features such as flowers and stamens occupy only a very small area while the main body of the plant object is hidden, which makes recognition genuinely difficult. To solve this problem, the application identifies plants by the distribution of color difference regions, i.e., by finding distinct color areas, such as the yellow area of the corn tassel and the green areas of the leaves.
For other plants, the colors of flowers, pistils and the like are likewise clearly distinguishable from the leaves and can be separated using the color difference region distribution. When such features are not obvious, similar color distinctions on the plant object itself can be used instead, for example new leaves being noticeably lighter in color than old leaves. A sketch of locating the center from a color difference region follows.
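A minimal sketch of the color difference region approach, assuming OpenCV and a BGR image of one plant region, with an illustrative yellow hue range standing in for the distinct color area; the range and the function name are assumptions, not values from the patent.

```python
import cv2
import numpy as np

def plant_center_from_color(image_bgr: np.ndarray):
    """Locate a plant-object center as the centroid of a distinct color
    region (e.g. the yellow tassel of corn against green leaves)."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    # Illustrative hue/saturation/value bounds for yellow; tune per crop and lighting.
    yellow_mask = cv2.inRange(hsv, (20, 80, 80), (35, 255, 255))
    moments = cv2.moments(yellow_mask, binaryImage=True)
    if moments["m00"] == 0:
        return None  # no distinct color region found
    cx = moments["m10"] / moments["m00"]
    cy = moments["m01"] / moments["m00"]
    return (cx, cy)
```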
In step S102, a selection range is constructed with the center of the plant object as a reference, and a plurality of structural objects are selected within the selection range; here a structural object mainly refers to a leaf of the plant object.
Then, in step S103, height data of the structural objects are acquired and the structural objects are grouped using the height data. The purpose of the grouping is to account for the actual influence of height on color: sunlight intensity differs at different heights, and shading relationships cause color changes. A grouping sketch is given below.
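The grouping can be pictured as binning leaves into height bands, for example using heights derived from the laser radar data; a minimal sketch, in which the bin width, the dictionary structure and the key names are assumptions.

```python
from collections import defaultdict

def group_by_height(structures, bin_size_m=0.25):
    """Group structural objects (e.g. leaves) into height bands so that
    color is later compared only within the same illumination layer.
    `structures` is a list of dicts, each with a 'height' value in metres."""
    groups = defaultdict(list)
    for s in structures:
        band = int(s["height"] // bin_size_m)   # index of the height band
        groups[band].append(s)
    return dict(groups)
```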
In step S104, color data and size data of all the structural objects in each group are acquired; the color data characterize the color of the structural objects, and the size data characterize the size and the size variation of the structural objects (mainly leaves). Both are described further below.
In step S105, the color data value and the width data average value of all the structural objects in each group are calculated; the color data value represents the color distribution at a given height value or height range, and the width data average value represents the width and the width variation of the structural objects (mainly leaves).
In step S106, the obtained color data value is compared with the corresponding color data threshold value to obtain a first ratio, and the obtained width data average value is compared with the corresponding size data threshold value to obtain a second ratio. Both thresholds are preset data.
There are two possible comparison outcomes: the value lies within the threshold (color data threshold or size data threshold), or it lies outside the threshold. In the first case a fixed value, e.g. zero, is used in the subsequent deviation calculation; in the second case the value is calculated from the (positive) difference, or by a step calculation in which each step corresponds to a fixed value. A sketch of this comparison is given below.
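One way to read this comparison rule is sketched below, assuming the threshold is expressed as a band [lower, upper]; the optional step mode returns one fixed value per step of exceedance. The function name, parameters and the band form are assumptions, not taken from the patent.

```python
def ratio_against_threshold(value, lower, upper, step=None, step_value=1.0):
    """Turn a measured value (color data value or width data average value)
    into a deviation figure against a preset threshold band [lower, upper]:
    inside the band -> fixed value 0; outside -> the positive difference,
    or a stepped amount when `step` is given."""
    if lower <= value <= upper:
        return 0.0                      # within threshold: fixed value
    diff = (lower - value) if value < lower else (value - upper)
    if step is None:
        return diff                     # plain positive difference
    return (diff // step + 1) * step_value  # one fixed value per step
```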
Finally, in step S107, the deviation degree of the first ratio and the second ratio is calculated to obtain environment evaluation value data. The environment evaluation value data are given according to the result of the deviation calculation: the deviation result may be used directly as the environment evaluation value data, or a value may be selected according to the range in which the deviation result falls.
The environment of the plant object is then evaluated from the resulting environment evaluation value data: if the value is within the allowable range, the growing environment of the plant object is good; if it is outside the allowable range, the growing environment is problematic and corrective measures are needed.
For the whole area, the ratio of the number of environment evaluation values within the allowable range to the total number is determined. A sketch of the evaluation and the area-level ratio is given below.
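A minimal sketch of the per-plant evaluation and the area-level pass rate, assuming the deviation degree is a weighted combination of the two ratios; the weights and the allowable maximum are illustrative assumptions, not values from the patent.

```python
def environment_evaluation(first_ratio, second_ratio, w_color=0.5, w_width=0.5):
    """Combine the color deviation (first ratio) and the width deviation
    (second ratio) into one environment evaluation value."""
    return w_color * first_ratio + w_width * second_ratio

def area_pass_rate(evaluations, allowed_max):
    """Share of plant objects whose evaluation value stays within the
    allowable range, used to judge the whole monitored area."""
    if not evaluations:
        return 0.0
    ok = sum(1 for e in evaluations if e <= allowed_max)
    return ok / len(evaluations)
```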
In some examples, screening structural objects belonging to a plant object within a plant range includes the steps of:
S301, decomposing a structural object in an image to obtain a plurality of structural objects;
S302, determining whether the structural object is associated with a plant object in a plant range according to the width change of the structural object; and
S303, retaining the structural objects associated with a plant object within the plant range, and deleting the structural objects not associated with any plant object within the plant range.
Steps S301 to S303 remove the structural objects that are not associated with the plant object, so that structural objects belonging to other plant objects do not affect the result. Taking two adjacent plant objects as an example, at the same height the colors of structural objects belonging to the two plants may differ; if un-associated structural objects were used directly in the subsequent calculation, uncontrollable deviations of the result could occur.
The width change of a structural object means that the structural object (e.g. a leaf) widens in the direction towards the stem of its plant object; from the direction of this width change, the relationship between the structural object (e.g. a leaf) and a plant object can be determined, as in the sketch below.
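A minimal sketch of this association test, assuming each leaf already has an axis sampled from tip to base with widths measured at those samples; the data layout and names are assumptions, not from the patent.

```python
import math

def is_associated_with_plant(leaf_axis_points, leaf_widths, plant_center):
    """A leaf widens towards its stem, so the wide end of its axis should
    be the end closer to the center of the plant it belongs to.
    `leaf_axis_points`: (x, y) samples along the leaf from tip to base.
    `leaf_widths`: widths measured at those samples."""
    tip, base = leaf_axis_points[0], leaf_axis_points[-1]
    widening_towards_base = leaf_widths[-1] > leaf_widths[0]
    wide_end = base if widening_towards_base else tip
    narrow_end = tip if widening_towards_base else base
    # Associated if the wide end points at this plant's center.
    return math.dist(wide_end, plant_center) < math.dist(narrow_end, plant_center)
```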
In some examples, calculating the color data value of the structural object includes the steps of:
S401, separating the structural object from the image;
S402, processing the structural object separated from the image by using a plurality of gray screening domains to obtain a plurality of structural object gray level images, wherein the gray screening domains comprise a normal gray screening domain and an abnormal gray screening domain;
S403, calculating the number of region points in the gray level image of the structural object;
S404, accumulating the number of region points in the structural object gray level images obtained by processing with the abnormal gray screening domain, to obtain an accumulated value; and
S405, taking the accumulated value as the color data value of the structural object.
In steps S401 to S405, a plurality of gray screening domains are used to process the structural object; abnormal gray screening domains at different gray values can be applied during processing so as to find more abnormal color areas on the structural object. The abnormal color areas are then accumulated to obtain an accumulated value, and the accumulated value is finally taken as the color data value of the structural object. A sketch is given below.
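A minimal sketch of this counting, assuming OpenCV, a BGR crop of one leaf and a binary mask of the leaf area; each (lo, hi) pair plays the role of one abnormal gray screening domain, and the concrete ranges are illustrative assumptions, not values from the patent.

```python
import cv2
import numpy as np

def color_data_value(leaf_bgr: np.ndarray, leaf_mask: np.ndarray,
                     abnormal_ranges=((0, 60), (200, 255))):
    """Accumulated count of 'abnormal' pixels on one structural object.
    `leaf_mask` is a uint8 mask (255 inside the leaf, 0 elsewhere)."""
    gray = cv2.cvtColor(leaf_bgr, cv2.COLOR_BGR2GRAY)
    total = 0
    for lo, hi in abnormal_ranges:
        in_range = cv2.inRange(gray, lo, hi)             # 255 where gray in [lo, hi]
        in_range = cv2.bitwise_and(in_range, leaf_mask)  # restrict to the leaf area
        total += int(cv2.countNonZero(in_range))
    return total
```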
In some examples, calculating the width data mean of the structural object includes:
S501, determining the boundary of the structural object and dividing the boundary into a plurality of line segments;
S502, determining the included angle of any two adjacent line segments, and taking the point corresponding to the smallest included angle as a starting point;
S503, drawing a simulated center line by using the starting point and the boundary of the structural object;
S504, obtaining width data change values of the structural object according to the simulated center line; and
S505, carrying out average value calculation on the obtained width data change values to obtain a width data average value.
In steps S501 to S505, a simulated center line is used; it represents the symmetry line of the structural object. From this symmetry line the width and the width variation of the structural object can be determined, and averaging the obtained width data change values gives the width data average value.
The purpose of finding the symmetry line is to obtain the starting point of each structural object, typically the tip of a leaf. For the width of the structural object, the width at a fixed distance from the tip can be taken as the width of the structural object; the width data average value in that case is simply the mean of these width values.
For the width variation of the structural object, values can be sampled at several fixed distances from the tip and their variation obtained; the width data average value is then calculated from the values at the corresponding positions. A sketch of the starting-point selection and the averaging is given below.
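A minimal sketch of the starting-point (tip) selection and the averaging, assuming the leaf boundary is available as an ordered array of points and the widths have already been sampled along the simulated center line; the names and the small epsilon are assumptions.

```python
import numpy as np

def find_leaf_tip(contour: np.ndarray):
    """Starting point of a leaf = the boundary point whose two adjacent
    boundary segments form the smallest included angle (the sharp tip).
    `contour` is an (N, 2) array of boundary points in order."""
    pts = contour.astype(float)
    prev = np.roll(pts, 1, axis=0) - pts      # vector to the previous point
    nxt = np.roll(pts, -1, axis=0) - pts      # vector to the next point
    cos_angle = np.sum(prev * nxt, axis=1) / (
        np.linalg.norm(prev, axis=1) * np.linalg.norm(nxt, axis=1) + 1e-9)
    angles = np.arccos(np.clip(cos_angle, -1.0, 1.0))
    return pts[np.argmin(angles)]             # smallest angle -> sharpest point

def width_data_mean(width_samples):
    """Mean of the widths sampled at fixed distances from the tip
    along the simulated center line."""
    return float(np.mean(width_samples))
```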
In some examples, the width changes may be displayed as polylines placed in the same coordinate system. For example, a standard polyline is provided in the coordinate system, and comparing the measured polyline with the standard polyline yields a difference (e.g. the area of the region between the two polylines) that can be used to evaluate how far the measurement departs from the standard, as in the sketch below.
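A minimal sketch of that comparison, assuming the measured and standard widths are sampled at the same positions; the trapezoidal sum of the absolute gap approximates the area between the two polylines (the names are assumptions).

```python
import numpy as np

def polyline_difference_area(positions, measured, standard):
    """Approximate area enclosed between a measured width polyline and a
    standard polyline sampled at the same positions (trapezoidal rule on
    the absolute gap)."""
    x = np.asarray(positions, dtype=float)
    gap = np.abs(np.asarray(measured, dtype=float) - np.asarray(standard, dtype=float))
    return float(np.sum((gap[1:] + gap[:-1]) / 2.0 * np.diff(x)))
```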
The application also provides an agricultural environment monitoring device, which comprises:
The image analysis unit is used for responding to the acquired image and analyzing the image to obtain a plurality of plant objects;
the selecting unit is used for constructing a selecting range by taking the center of the plant object as a reference and selecting a plurality of structural objects in the selecting range;
A first processing unit for acquiring height data of the structural object and grouping the structural object using the height data;
the second processing unit is used for acquiring color data and size data of all the structural objects in each group;
A numerical value calculating unit for calculating color data values and width data average values of all the structural objects in each group;
The numerical value comparison unit is used for comparing the obtained color data value with the corresponding color data threshold value to obtain a first ratio value, and comparing the obtained width data average value with the corresponding size data threshold value to obtain a second ratio value; and
The numerical value evaluation unit is used for calculating the deviation degree of the first ratio and the second ratio to obtain environment evaluation value data.
Further, parsing the image to obtain a plurality of plant objects includes determining plant objects according to the color boundaries.
Further, parsing the image to obtain a plurality of plant objects includes dividing the image into a plurality of sub-images according to the location coordinates, each sub-image being a plant object.
Further, the method further comprises the following steps:
the type identification unit is used for identifying the type of the plant object in the image;
The position determining unit is used for determining the central position of the plant object according to the color difference area distribution;
The range defining unit is used for defining plant ranges according to the central position area, and no overlapping area exists between adjacent plant ranges; and
The screening unit is used for screening out the structural objects belonging to the plant object within each plant range and retaining the structural objects associated with the plant object of that plant range.
Further, the method further comprises the following steps:
The image decomposition unit is used for decomposing the structural objects in the image to obtain a plurality of structural objects;
the relation determining unit is used for determining whether the structural object is associated with the plant object in the plant range according to the width change of the structural object; and
The object screening unit is used for retaining the structural objects associated with a plant object within the plant range and deleting the structural objects not associated with any plant object within the plant range.
Further, the method further comprises the following steps:
a separation unit for separating the structural object from the image;
A gray level processing unit, configured to process the structural objects separated from the image by using a plurality of gray level screening fields, so as to obtain a plurality of structural object gray level images, where the gray level screening fields include a normal gray level screening field and an abnormal gray level screening field;
a first number calculation unit for calculating the number of region points in the gray image of the structural object;
The quantity accumulating unit is used for accumulating the quantity of the regional points in the gray level image of the structural object obtained by using the abnormal gray level screening domain processing to obtain an accumulated value; and
A result unit, for taking the accumulated value as the color data value of the structural object.
Further, the method further comprises the following steps:
a third processing unit for determining a boundary of the structural object and dividing the boundary into a plurality of line segments;
The fourth processing unit is used for determining the included angle of any two adjacent line segments and taking the point corresponding to the included angle with the smallest value as a starting point;
a drawing unit for drawing a simulated center line using the starting point and a boundary of the structural object;
The quantity acquisition unit is used for obtaining the width data change value of the structural object according to the simulation center line; and
The second number calculating unit is used for carrying out average value calculation on the obtained width data change values to obtain a width data average value.
In one example, the units in any of the above apparatuses may be one or more integrated circuits configured to implement the above methods, for example: one or more application specific integrated circuits (ASIC), or one or more digital signal processors (DSP), or one or more field programmable gate arrays (FPGA), or a combination of at least two of these integrated circuit forms.
For another example, when the units in the apparatus are implemented in the form of a scheduling program of a processing element, the processing element may be a general-purpose processor, such as a central processing unit (CPU) or another processor that can invoke a program. For another example, the units may be integrated together and implemented in the form of a system-on-a-chip (SoC).
Various objects such as messages, information, devices, network elements, systems, apparatuses, actions, operations, processes and concepts may be named in the present application. It should be understood that these specific names do not limit the related objects; the names may change with the scenario, context or usage habit, and the technical meaning of a term in the application should be determined mainly from the function and technical effect it embodies in the technical solution.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the several embodiments provided by the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It should also be understood that in the various embodiments of the present application, "first", "second", and the like are merely intended to indicate that multiple objects are different. For example, a first time window and a second time window merely denote different time windows, with no effect on the time windows themselves, and such terms should not impose any limitation on the embodiments of the present application.
It is also to be understood that in the various embodiments of the application, where no special description or logic conflict exists, the terms and/or descriptions between the various embodiments are consistent and may reference each other, and features of the various embodiments may be combined to form new embodiments in accordance with their inherent logic relationships.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a computer-readable storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. The aforementioned computer-readable storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program code.
The application also provides an agricultural environment monitoring system, which comprises:
One or more memories for storing instructions; and
One or more processors configured to invoke and execute the instructions from the memory to perform the method as described above.
The present application also provides a computer program product comprising instructions which, when executed, cause the terminal device and the network device to perform operations of the terminal device and the network device corresponding to the above method.
The present application also provides a chip system comprising a processor for implementing the functions involved in the above, e.g. generating, receiving, transmitting, or processing data and/or information involved in the above method.
The chip system can be composed of chips, and can also comprise chips and other discrete devices.
The processor referred to in any of the foregoing may be a CPU, microprocessor, ASIC, or integrated circuit that performs one or more of the procedures for controlling the transmission of feedback information described above.
In one possible design, the system on a chip also includes memory to hold the necessary program instructions and data. The processor and the memory may be decoupled, and disposed on different devices, respectively, and connected by wired or wireless means, so as to support the chip system to implement the various functions in the foregoing embodiments. Or the processor and the memory may be coupled to the same device.
Optionally, the computer instructions are stored in a memory.
Alternatively, the memory may be a storage unit in the chip, such as a register, a cache, etc., and the memory may also be a storage unit in the terminal located outside the chip, such as a ROM or other type of static storage device, a RAM, etc., that may store static information and instructions.
It will be appreciated that the memory in the present application can be either volatile memory or nonvolatile memory, or can include both volatile and nonvolatile memory.
The non-volatile memory may be ROM, programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), or flash memory.
The volatile memory may be RAM, which acts as an external cache. Many types of RAM exist, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct Rambus RAM (DR RAM).
The embodiments described above are preferred embodiments of the present application and are not intended to limit its scope of protection; therefore, all equivalent changes made according to the structure, shape and principle of the present application shall fall within its scope of protection.

Claims (8)

1. An agricultural environment monitoring method, comprising:
responding to the acquired image, analyzing the image to obtain a plurality of plant objects;
constructing a selection range by taking the center of the plant object as a reference, and selecting a plurality of structural objects in the selection range;
acquiring height data of the structural objects and grouping the structural objects by using the height data;
acquiring color data and size data of all structural objects in each group;
calculating the color data value and the width data average value of all the structural objects in each group;
comparing the obtained color data value with a corresponding color data threshold value to obtain a first ratio value, and comparing the obtained width data average value with a corresponding size data threshold value to obtain a second ratio value; and
calculating the deviation degree of the first ratio and the second ratio to obtain environment evaluation value data;
calculating the color data value of the structural object includes:
separating the structural object from the image;
processing the structural objects separated from the image by using a plurality of gray screening domains to obtain a plurality of structural object gray level images, wherein the gray screening domains comprise a normal gray screening domain and an abnormal gray screening domain;
calculating the number of region points in the gray level image of the structural object;
accumulating the number of region points in the structural object gray level images obtained by processing with the abnormal gray screening domain, to obtain an accumulated value;
taking the accumulated value as the color data value of the structural object;
Calculating the width data mean of the structural object includes:
determining a boundary of the structural object and dividing the boundary into a plurality of line segments;
determining the included angle of any two adjacent line segments, and taking the point corresponding to the smallest included angle as a starting point;
drawing a simulated center line using the starting point and the boundary of the structural object;
obtaining width data change values of the structural object according to the simulated center line;
carrying out average value calculation on the obtained width data change values to obtain a width data average value.
2. The agricultural environment monitoring method according to claim 1, wherein parsing the image to obtain a plurality of plant objects includes determining plant objects based on color boundaries.
3. The agricultural environment monitoring method according to claim 1, wherein parsing the image to obtain a plurality of plant objects includes dividing the image into a plurality of sub-images according to the position coordinates, each sub-image being a plant object.
4. The agricultural environment monitoring method of claim 1, wherein parsing the image to obtain a plurality of plant objects comprises:
identifying a plant object type in the image;
determining the central position of a plant object according to the color difference area distribution;
defining plant ranges according to the central position areas, wherein no overlapping areas exist between adjacent plant ranges; and
screening out the structural objects belonging to the plant object within each plant range, and retaining the structural objects associated with the plant object of that plant range.
5. The agricultural environment monitoring method according to claim 4, wherein the screening of the structural objects belonging to the plant object within the plant area includes:
decomposing the structural objects in the image to obtain a plurality of structural objects;
Determining whether the structural object is associated with a plant object in a plant range according to the width change of the structural object; and
retaining the structural objects associated with a plant object within the plant range, and deleting the structural objects not associated with any plant object within the plant range.
6. An agricultural environment monitoring device, comprising:
The image analysis unit is used for responding to the acquired image and analyzing the image to obtain a plurality of plant objects;
the selecting unit is used for constructing a selecting range by taking the center of the plant object as a reference and selecting a plurality of structural objects in the selecting range;
A first processing unit for acquiring height data of the structural object and grouping the structural object using the height data;
the second processing unit is used for acquiring color data and size data of all the structural objects in each group;
A numerical value calculating unit for calculating color data values and width data average values of all the structural objects in each group;
The numerical value comparison unit is used for comparing the obtained color data value with the corresponding color data threshold value to obtain a first ratio value, and comparing the obtained width data average value with the corresponding size data threshold value to obtain a second ratio value;
the numerical value evaluation unit is used for calculating the deviation degree of the first ratio and the second ratio to obtain environment evaluation value data;
a separation unit for separating the structural object from the image;
A gray level processing unit, configured to process the structural objects separated from the image by using a plurality of gray level screening fields, so as to obtain a plurality of structural object gray level images, where the gray level screening fields include a normal gray level screening field and an abnormal gray level screening field;
a first number calculation unit for calculating the number of region points in the gray image of the structural object;
the quantity accumulating unit is used for accumulating the quantity of the regional points in the gray level image of the structural object obtained by using the abnormal gray level screening domain processing to obtain an accumulated value;
a result unit for taking the accumulated value as the color data value of the structural object;
a third processing unit for determining a boundary of the structural object and dividing the boundary into a plurality of line segments;
The fourth processing unit is used for determining the included angle of any two adjacent line segments and taking the point corresponding to the included angle with the smallest value as a starting point;
a drawing unit for drawing a simulated center line using the starting point and a boundary of the structural object;
The quantity acquisition unit is used for obtaining the width data change value of the structural object according to the simulation center line; and
a second number calculating unit, used for carrying out average value calculation on the obtained width data change values to obtain a width data average value.
7. An agricultural environment monitoring system, the system comprising:
One or more memories for storing instructions; and
One or more processors to invoke and execute the instructions from the memory to perform the method of any of claims 1 to 5.
8. A computer-readable storage medium, the computer-readable storage medium comprising:
Program which, when executed by a processor, performs the method according to any one of claims 1 to 5.
CN202311598081.5A 2023-11-28 2023-11-28 Agricultural environment monitoring method, device, system and storage medium Active CN117557924B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311598081.5A CN117557924B (en) 2023-11-28 2023-11-28 Agricultural environment monitoring method, device, system and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311598081.5A CN117557924B (en) 2023-11-28 2023-11-28 Agricultural environment monitoring method, device, system and storage medium

Publications (2)

Publication Number Publication Date
CN117557924A CN117557924A (en) 2024-02-13
CN117557924B (en) 2024-06-25

Family

ID=89810717

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311598081.5A Active CN117557924B (en) 2023-11-28 2023-11-28 Agricultural environment monitoring method, device, system and storage medium

Country Status (1)

Country Link
CN (1) CN117557924B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107452014A (en) * 2017-07-11 2017-12-08 中国农业科学院农业信息研究所 A kind of image partition method and device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112164089A (en) * 2020-08-20 2021-01-01 浙江大学 Satellite image-based farmland boundary extraction method and device, electronic equipment and storage medium
CN114463412A (en) * 2022-02-07 2022-05-10 浙江托普云农科技股份有限公司 Plant phenotype measuring method, system and device based on computer vision
CN217954949U (en) * 2022-08-22 2022-12-02 四川省农业科学院科技保障中心 Wisdom farming data monitoring and processing apparatus

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107452014A (en) * 2017-07-11 2017-12-08 中国农业科学院农业信息研究所 A kind of image partition method and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Design of a wireless monitoring and control system platform for the agricultural greenhouse environment; 李轶骥; 《通信设计与应用》 (Communication Design and Applications); 2019-05-31; pp. 77-78 *

Also Published As

Publication number Publication date
CN117557924A (en) 2024-02-13


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant