CN117935068B - Crop disease analysis method and analysis system - Google Patents

Crop disease analysis method and analysis system

Info

Publication number
CN117935068B
CN117935068B (application CN202410339267.7A)
Authority
CN
China
Prior art keywords
sub
feature
characteristic
feature objects
objects
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202410339267.7A
Other languages
Chinese (zh)
Other versions
CN117935068A (en)
Inventor
高青山
彭旭敏
柳德军
肖静
罗劲旭
徐洪伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sichuan Branch Of China Pingan Property Insurance Co ltd
Original Assignee
Sichuan Branch Of China Pingan Property Insurance Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sichuan Branch Of China Pingan Property Insurance Co ltd filed Critical Sichuan Branch Of China Pingan Property Insurance Co ltd
Priority to CN202410339267.7A priority Critical patent/CN117935068B/en
Publication of CN117935068A publication Critical patent/CN117935068A/en
Application granted granted Critical
Publication of CN117935068B publication Critical patent/CN117935068B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)

Abstract

The application relates to a crop disease analysis method and an analysis system. The method comprises: in response to a received instruction, acquiring images of the area indicated by the instruction to obtain an image group; identifying feature objects in the images to be processed and separating the feature objects from the images; decomposing the feature objects to obtain a plurality of sub-feature objects; counting characteristic parameters of the sub-feature objects and grouping the sub-feature objects according to those parameters; comparing the similarity of any two sub-feature objects in each group, performing a secondary grouping of the sub-feature objects within a group according to the similarity results and assigning labels; and counting the number of labels of each feature object in the area indicated by the instruction. By combining regional population analysis with individual difference analysis, the crop disease analysis method and analysis system can judge the disease condition of crops in an area and thereby improve the probability of detecting diseases at an early stage.

Description

Crop disease analysis method and analysis system
Technical Field
The application relates to the technical field of data processing, in particular to a crop disease analysis method and an analysis system.
Background
Airborne imaging systems equipped with multispectral and hyperspectral cameras are widely used to detect and map crop diseases. In the laboratory, judgment models for the diseases of various crops have been successfully built and specifically trained on image data, and such models achieve a high recognition rate.
However, in practical application this approach has been found to have certain defects. The images used in the laboratory must meet certain quality requirements, and practical factors such as distance and illumination directly influence the judgment result, so the current judgment approach still falls short for images captured in actual planting fields.
Early detection of diseases remains a challenge, because trait expression is not obvious at that stage and early disease is difficult to recognize from middle- and late-stage appearances. Although both airborne imaging and satellite imaging have been successfully applied to detecting and mapping diseases of numerous crops, in most cases damage may already have occurred by the time disease symptoms appear in remote sensing images. If a disease is found early enough, prevention and control measures can be taken in time to reduce further damage to the crop; for some crops, however, it may already be too late to prevent the spread of infection during the current growing season.
In addition, as the number of judgment models grows, their economic value is also challenged: in actual use it is difficult to build a judgment model for every disease of every type of crop, and models built this way lack universality.
Disclosure of Invention
The application provides a crop disease analysis method and an analysis system, which can judge the crop disease condition in an area through regional population analysis and individual difference analysis, so as to improve the probability of detecting diseases at an early stage.
The above object of the present application is achieved by the following technical solutions:
In a first aspect, the present application provides a method for crop disease analysis comprising:
responding to the acquired instruction, and carrying out image acquisition on the area pointed by the instruction to obtain an image group, wherein the image group comprises a plurality of images;
Cutting the images according to the generation positions of the images to obtain images to be processed, wherein the corresponding areas of any two images to be processed in the same image group are not overlapped;
Identifying a feature object in the image to be processed, and separating the feature object from the image to be processed;
decomposing the feature objects to obtain a plurality of sub-feature objects, and grouping the sub-feature objects;
comparing the similarity of any two sub-feature objects in each group, and carrying out secondary grouping and label assignment on the sub-feature objects in one group according to the similarity result; and
counting the number of labels of each feature object of the region pointed to by the instruction.
In a possible implementation manner of the first aspect, decomposing the feature object includes:
drawing the outline of the characteristic object, wherein the outline is a closed curve;
Determining a central area of the outline;
and drawing the feature contour of the feature object and dividing it according to the central area to obtain a plurality of sub-feature contours, wherein each sub-feature contour corresponds to one sub-feature object.
In a possible implementation manner of the first aspect, the method further includes:
randomly selecting a plurality of judgment points on the outline;
constructing line segments by using any two judgment points and obtaining a distance value group on the line segments at intervals;
adjusting the judgment point at either end of the line segment according to the change trend of the distance value group and reconstructing the line segment, wherein the change trend comprises an increasing trend and a decreasing trend; and
Obtaining intersection areas of a plurality of line segments and taking the intersection areas as a central area of the outline;
Wherein the number of the central areas of the outline is at least one.
In a possible implementation manner of the first aspect, grouping sub-feature objects includes:
Determining symmetry lines of the sub-feature objects and determining the height change trend of the symmetry lines;
grouping the sub-feature objects according to the height change trend, wherein the height change trend of symmetry lines of the sub-feature objects belonging to the same class is the same;
wherein the symmetry line of the sub-feature object points to the central area of the outline;
the length of the symmetry line of the sub-feature objects belonging to the same class is within the required interval.
In a possible implementation manner of the first aspect, the similarity of the sub-feature objects includes color difference and edge uniformity.
In a possible implementation manner of the first aspect, obtaining the color difference in the similarity of the sub-feature objects includes:
overlapping the two sub-feature objects to obtain an overlapping region, and simultaneously moving any one of the two sub-feature objects to maximize the area of the overlapping region;
Calculating the difference value of two pixel values on each pixel point, wherein the difference value comprises a positive difference value and a negative difference value;
and taking the sum of the positive difference value quantity and the negative difference value quantity as the chromatic aberration of the sub-characteristic object.
In a possible implementation manner of the first aspect, obtaining edge uniformity in similarity of the sub-feature objects includes:
obtaining the edge profile of the sub-feature object;
Segmenting the edge contour of the sub-feature object to obtain a plurality of sub-edge contour segments, wherein the lengths of each sub-edge contour segment are the same;
calculating the separation degree of the sub-edge contour segments, wherein the calculation mode of the separation degree is to establish standard line segments by using two ends of the sub-edge contour segments, randomly selecting a plurality of points on the sub-edge contour segments and calculating the minimum linear distance between each point and the standard line segments;
and accumulating the separation degrees of the sub-edge contour segments to obtain the edge uniformity in the similarity of the sub-feature objects.
In a second aspect, the present application provides a crop disease analysis apparatus comprising:
The image acquisition unit is used for responding to the acquired instruction, carrying out image acquisition on the area pointed by the instruction to obtain an image group, wherein the image group comprises a plurality of images;
The image cutting unit is used for cutting the images according to the generation positions of the images to obtain images to be processed, and the corresponding areas of any two images to be processed in the same image group are not overlapped;
The identification and separation unit is used for identifying the characteristic object in the image to be processed and separating the characteristic object from the image to be processed;
The decomposition unit is used for decomposing the feature objects to obtain a plurality of sub-feature objects and grouping the sub-feature objects;
The first processing unit is used for comparing the similarity of any two sub-feature objects in each group, and carrying out secondary grouping on the sub-feature objects in one group and giving labels according to the similarity result; and
And the counting unit is used for counting the number of labels of each characteristic object of the instruction pointing region.
In a third aspect, the present application provides a crop disease analysis system, the system comprising:
one or more memories for storing instructions; and
One or more processors configured to invoke and execute the instructions from the memory to perform the method as described in the first aspect and any possible implementation manner of the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium comprising:
A program which, when executed by a processor, performs the method of the first aspect and any possible implementation manner of the first aspect.
In a fifth aspect, the present application provides a computer program product comprising program instructions which, when executed by a computing device, perform a method as described in the first aspect and any possible implementation of the first aspect.
In a sixth aspect, the present application provides a chip system comprising a processor for implementing the functions involved in the above aspects, e.g. generating, receiving, transmitting, or processing data and/or information involved in the above methods.
The chip system can be composed of chips, and can also comprise chips and other discrete devices.
In one possible design, the system on a chip also includes memory to hold the necessary program instructions and data. The processor and the memory may be decoupled, provided on different devices, respectively, connected by wire or wirelessly, or the processor and the memory may be coupled on the same device.
The beneficial effects of the application are as follows:
The crop disease analysis method and the crop disease analysis system can discover abnormal individuals through regional population analysis and judge whether the region is diseased or not. This approach can discover diseased plants at the early stage of disease occurrence, so that intervention measures can be applied as early as possible.
Drawings
Fig. 1 is a schematic block diagram of a flow chart of steps of a crop disease analysis method provided by the application.
Fig. 2 is a schematic diagram of image acquisition according to the present application.
Fig. 3 is a schematic diagram of a process for creating a line segment according to the present application.
Fig. 4 is a schematic diagram of a process for obtaining a distance value set according to the present application.
Fig. 5 is a schematic diagram of a process for calculating a difference between two pixel values at a pixel point according to the present application.
Fig. 6 is a schematic diagram of a principle of calculating the separation of sub-edge contour segments according to the present application.
Detailed Description
The technical scheme in the application is further described in detail below with reference to the accompanying drawings.
The application discloses a crop disease analysis method, in some examples, referring to fig. 1, the crop disease analysis method disclosed by the application comprises the following steps:
S101, responding to an acquired instruction, and carrying out image acquisition on an area pointed by the instruction to obtain an image group, wherein the image group comprises a plurality of images;
S102, cutting the image according to the generation position of the image to obtain an image to be processed, wherein the corresponding areas of any two images to be processed in the same image group are not overlapped, as shown in fig. 2;
s103, identifying a feature object in the image to be processed, and separating the feature object from the image to be processed;
S104, decomposing the feature objects to obtain a plurality of sub-feature objects, and grouping the sub-feature objects;
S105, comparing the similarity of any two sub-feature objects in each group, and carrying out secondary grouping and label assignment on the sub-feature objects in one group according to the similarity result; and
S106, counting the number of labels of each feature object of the pointed region.
In the crop disease analysis method disclosed by the application, the images are acquired by means of small flying equipment such as an unmanned aerial vehicle, and the acquired images are sent to a server for analysis; the unmanned aerial vehicle and the server, taken as a whole, are collectively called the analysis system.
In step S101, according to the received instruction, the analysis system performs image acquisition on the area pointed by the instruction, so as to obtain an image group, where the image group includes a plurality of images, and the area pointed by the instruction refers to the planting area in fig. 2.
The obtained image is processed in step S102, and in step S102, the image is cut according to the generation position of the image, so as to obtain an image to be processed, and corresponding areas of any two images to be processed in the same image group are not overlapped.
It will be appreciated that the ground area covered by an image is positively correlated with the height of the analysis system. In terms of image quality, however, the quality is high near the position of the analysis system and low farther away, because image distortion worsens with increasing distance. To solve this problem, the application cuts the captured image: when the analysis system moves to a shooting position, an image is taken, and the image is then cut according to that position and a set size.
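For illustration only, the following Python sketch shows one way such non-overlapping cutting could be organized; the tile size, the grid step, and the mapping from shooting position to retained window are assumptions and are not specified by the application.

import numpy as np

def crop_center(image: np.ndarray, tile_h: int, tile_w: int) -> np.ndarray:
    """Keep only the central window of a captured frame, where distortion is lowest."""
    h, w = image.shape[:2]
    top = (h - tile_h) // 2
    left = (w - tile_w) // 2
    return image[top:top + tile_h, left:left + tile_w]

def build_image_group(frames, positions, tile_h=512, tile_w=512, grid_step=5.0):
    """Assign each shooting position to a coarse grid cell and keep one central tile
    per cell, so that retained tiles of the same image group do not overlap on the
    ground (tile sizes and grid_step are assumed values)."""
    group = {}
    for frame, (x, y) in zip(frames, positions):
        cell = (round(x / grid_step), round(y / grid_step))
        if cell not in group:          # first frame seen for this ground cell wins
            group[cell] = crop_center(frame, tile_h, tile_w)
    return group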
In step S103, a feature object in the image to be processed is identified and separated from the image to be processed, where the feature object refers to a crop. Then in step S104, the feature objects are decomposed to obtain a plurality of sub-feature objects, where the sub-feature objects refer to parts on the crop, such as stems and leaves, and after obtaining a plurality of sub-feature objects, the sub-feature objects need to be grouped, where the purpose of grouping is to collect sub-feature objects of the same class, and then compare the sub-feature objects.
In step S105, the similarity of any two sub-feature objects in each group is compared, and the sub-feature objects in a group are secondarily grouped and labeled according to the similarity result. The secondary grouping divides the sub-feature objects that belong to the same class once more, so that sub-feature objects sharing certain common features are put into one group.
Taking leaves as an example, in step S104 all the leaves are put into one group, and in step S105 leaves sharing certain common characteristics are put into one group; three groups may be obtained, or four, as the number of groups is not fixed in advance. Labels are assigned according to the result of the secondary grouping, which is described in the following content.
Finally, in step S106, the number of labels of each feature object of the region pointed to by the instruction is counted. What is counted here is the number of labels of the feature objects in the region pointed to by the instruction (in step S101); for example, there may be ten feature objects at this time. As described above, labels are assigned to sub-feature objects after the secondary grouping, and these labeled sub-feature objects are directly associated with their feature objects, so the number of labels of each feature object can be obtained accordingly.
The greater the number of labels carried by a feature object, the greater the probability that the feature object has a disease; the disease concerned here is primarily one at its early stage of manifestation.
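As a minimal sketch of the counting in step S106, the following Python fragment tallies labels per feature object; the data layout, in which every sub-feature object carries its parent feature identifier and its assigned labels, is an assumption introduced here for illustration.

from collections import Counter

def count_labels_per_feature(sub_features):
    """sub_features: iterable of (feature_id, labels) pairs, where labels is the set
    of labels a sub-feature object received during the secondary grouping."""
    totals = Counter()
    for feature_id, labels in sub_features:
        totals[feature_id] += len(labels)      # each label on a sub-feature counts once
    return dict(totals)

# Example with three sub-feature objects: feature 3 carries the most labels and is
# therefore the most likely early-disease candidate under this scheme.
counts = count_labels_per_feature([(1, {"color"}), (3, {"color", "edge"}), (3, {"edge"})])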
In the crop disease analysis method described above, abnormal individuals are found by analyzing the crops in one area as a whole and by comparison between individuals. The advantage of this approach is that it avoids building an analysis model for each disease of each crop, and it therefore has better universality.
Meanwhile, the images used in the processing procedure are constrained, so the accuracy is higher, and crops with diseases or suspected diseases can be found at the initial stage, so that timely intervention measures can be adopted and the loss can be kept within a limited range.
The step of decomposing the feature object is as follows:
s201, drawing the outline of the characteristic object, wherein the outline is a closed curve;
S202, determining a central area of the outline;
s203, drawing the characteristic contours of the characteristic objects, and dividing the characteristic contours according to the central area to obtain a plurality of sub-characteristic contours, wherein each sub-characteristic contour corresponds to one sub-characteristic object.
Specifically, in steps S201 and S202, the outline and the central area of the feature object, that is, its edge and its center, are first determined. The outline of the feature object is determined in order to separate the feature object individually and to define a clear range for the sub-feature objects obtained subsequently.
In step S203, when there is a clear relationship between a sub-feature contour and the central region, that is, when the sub-feature object extends to the central area of the outline, that sub-feature contour can be determined directly; once such sub-feature contours are determined, the remaining feature contours are segmented by means of intersection connection.
The specific way is as follows: first, the intersection points on the remaining feature contours (points where two line segments of the contour meet) are determined; then each intersection point is connected to the central area to obtain a parting line; finally, the parting line is used as an edge of a sub-feature contour.
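One possible way to realize this division is sketched below with OpenCV on a binary mask; the mask representation, the parting-line thickness, and the use of connected components are assumptions for illustration rather than the application's prescribed implementation.

import cv2
import numpy as np

def split_feature_mask(feature_mask: np.ndarray, intersections, center) -> np.ndarray:
    """Split a binary crop mask into sub-feature regions by drawing parting lines
    from each contour intersection point to the central area, then labelling the
    resulting connected components (the 3-pixel line width is an assumed value)."""
    work = feature_mask.copy()
    cx, cy = center
    for (ix, iy) in intersections:
        cv2.line(work, (int(ix), int(iy)), (int(cx), int(cy)), color=0, thickness=3)
    n_labels, labels = cv2.connectedComponents(work.astype(np.uint8))
    return labels   # label 0 is background; labels 1..n_labels-1 are sub-feature objects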
The specific way to obtain the central area of the outline is as follows:
s301, randomly selecting a plurality of judgment points on the outline;
s302, constructing line segments by using any two judgment points and obtaining distance value groups on the line segments at intervals;
S303, adjusting judgment points at any end of the line segment according to the change trend of the distance value group, and reconstructing the line segment, wherein the change trend comprises trend increase and trend decrease; and
S304, obtaining intersection areas of a plurality of line segments and taking the intersection areas as the central area of the outline;
Wherein the number of the central areas of the outline is at least one.
In steps S301 to S304, line segments are used to determine the central area of the outline. The principle is to establish line segments (shown in fig. 3) and obtain distance value groups (the dotted lines in fig. 4) on them. Because a crop is characteristically high in the middle and low at the edges, the change trend of each line segment (increasing or decreasing) can be determined by establishing a plurality of line segments and obtaining the distance value groups on them; once such line segments are obtained, the intersection area of the line segments can be used as the central area of the outline.
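The following simplified sketch illustrates the idea under stated assumptions: it presumes a per-pixel height map is available, keeps only chords whose height profile rises toward the middle from both ends, and approximates the intersection area of the retained chords by the mean of their midpoints instead of iteratively adjusting endpoints as described above.

import numpy as np

def sample_heights(height_map, p0, p1, n=20):
    """Heights read along the chord p0->p1 at n evenly spaced sample points
    (the 'distance value group', assuming a per-pixel height map)."""
    ts = np.linspace(0.0, 1.0, n)
    pts = (1 - ts)[:, None] * np.asarray(p0) + ts[:, None] * np.asarray(p1)
    rows = np.clip(pts[:, 1].astype(int), 0, height_map.shape[0] - 1)
    cols = np.clip(pts[:, 0].astype(int), 0, height_map.shape[1] - 1)
    return height_map[rows, cols]

def estimate_center(contour_pts, height_map, n_chords=30, rng=np.random.default_rng(0)):
    """Keep chords whose height profile rises toward the middle from both ends
    (the 'middle high, edge low' trend) and take the mean of their midpoints as a
    rough stand-in for the intersection area of the retained chords."""
    midpoints = []
    for _ in range(n_chords):
        i, j = rng.choice(len(contour_pts), size=2, replace=False)
        p0, p1 = contour_pts[i], contour_pts[j]
        h = sample_heights(height_map, p0, p1)
        k = len(h) // 2
        if h[:k].mean() < h[k] and h[k + 1:].mean() < h[k]:   # rising then falling profile
            midpoints.append((np.asarray(p0, float) + np.asarray(p1, float)) / 2.0)
    return np.mean(midpoints, axis=0) if midpoints else None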
The sub-feature objects are grouped as follows:
S401, determining symmetry lines of the sub-feature objects and determining the height change trend of the symmetry lines;
s402, grouping sub-feature objects according to the height change trend, wherein the height change trend of symmetry lines of the sub-feature objects belonging to the same class is the same;
wherein the symmetry line of the sub-feature object points to the central area of the outline;
the length of the symmetry line of the sub-feature objects belonging to the same class is within the required interval.
Steps S401 and S402 group the sub-feature objects according to the height change trend of their symmetry lines, and this grouping method has the advantage of universality. Basic grouping is also performed in combination with color and shape at an initial stage, which mainly separates flowers, leaves and other parts, since flowers and leaves differ clearly in color and shape.
Among the leaves, however, there are diseased leaves and normal leaves, and these may also differ clearly in color and shape; at that point grouping by color and shape cannot be used, because it could cause grouping errors. To solve this problem, the application uses the height change trend of the symmetry line.
The symmetry line of a sub-feature object points to the central area of the outline, and its height change may rise, fall, rise and then fall, and so on, so the sub-feature objects can be grouped according to the height change trend. Meanwhile, to further improve the grouping accuracy, the sub-feature objects also need to be divided by length interval, because the leaf length directly influences the height change trend of the symmetry line.
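A minimal sketch of this grouping is given below; the trend categories, the tolerance, and the length-bin width are assumed values, and the height profile along each symmetry line is taken as already measured.

import numpy as np

def classify_trend(heights, tol=1e-3):
    """Classify the height profile sampled along a sub-feature's symmetry line as
    'rising', 'falling', 'rise_then_fall', or 'other'."""
    h = np.asarray(heights, dtype=float)
    d = np.diff(h)
    if np.all(d >= -tol):
        return "rising"
    if np.all(d <= tol):
        return "falling"
    peak = int(np.argmax(h))
    if np.all(d[:peak] >= -tol) and np.all(d[peak:] <= tol):
        return "rise_then_fall"
    return "other"

def group_by_symmetry_line(sub_features, length_bin=2.0):
    """sub_features: iterable of (id, heights, length). Sub-features fall into the same
    group when their trends match and their symmetry-line lengths land in the same
    length interval (length_bin is an assumed bin width)."""
    groups = {}
    for sid, heights, length in sub_features:
        key = (classify_trend(heights), int(length // length_bin))
        groups.setdefault(key, []).append(sid)
    return groups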
The similarity in step S105, which includes color difference and edge uniformity in the present application, is specifically that the color difference in the similarity of the sub-feature objects is obtained by:
overlapping the two sub-feature objects to obtain an overlapping region, and simultaneously moving any one of the two sub-feature objects to maximize the area of the overlapping region;
Calculating a difference value of two pixel values on each pixel point, referring to fig. 5, the difference value includes a positive difference value and a negative difference value;
and taking the sum of the positive difference value quantity and the negative difference value quantity as the chromatic aberration of the sub-characteristic object.
Specifically, the two sub-feature objects are overlapped so that their symmetry lines coincide, and then the difference between the two pixel values at each pixel point is calculated; a difference may be positive or negative.
As for the differences, there are allowable differences and abnormal differences. An allowable difference is one within the allowable range and is regarded as normal; a difference outside the allowable range is an abnormal difference.
The label mentioned here refers to a color anomaly region on the sub-feature object, or more specifically to a certain number of color anomaly regions. In the actual process, a color anomaly region should appear in a plurality of comparisons, and the required number of occurrences is set according to the desired sensitivity.
When the two sub-feature objects being compared have different lengths, after the comparison at one position is completed the sub-feature object needs to be moved to the next position for comparison, until all areas on both sub-feature objects have taken part in the comparison; the edge areas on both sides of the symmetry line of the sub-feature object are ignored at this time.
The advantage of this comparison is that local color difference, rather than average color difference, is used as the color difference. At the initial stage of a disease the color difference may appear as isolated points, which could go undetected if the average color difference were used.
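A minimal sketch of this local color-difference count is shown below; it assumes the two sub-feature patches have already been aligned on their symmetry lines and cropped to the overlapping region, and the allowable-difference threshold is an assumed value.

import numpy as np

def color_difference(patch_a: np.ndarray, patch_b: np.ndarray, tol: int = 12) -> int:
    """Count per-pixel differences that exceed the allowable tolerance; the result is
    the number of abnormal positive differences plus abnormal negative differences."""
    diff = patch_a.astype(int) - patch_b.astype(int)
    positive = np.count_nonzero(diff > tol)     # abnormal positive differences
    negative = np.count_nonzero(diff < -tol)    # abnormal negative differences
    return positive + negative

# An isolated early-stage lesion yields a small but non-zero count even when the
# average difference over the whole patch is negligible.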
The specific way to get edge uniformity in similarity of sub-feature objects is as follows:
obtaining the edge profile of the sub-feature object;
Segmenting the edge contour of the sub-feature object to obtain a plurality of sub-edge contour segments, wherein the lengths of each sub-edge contour segment are the same;
The degree of separation of the sub-edge contour segments is calculated, see fig. 6, by creating standard line segments using both ends of the sub-edge contour segments, randomly selecting a plurality of points on the sub-edge contour segments, and calculating the minimum linear distance (dotted line in fig. 6) of each point to the standard line segments.
And accumulating the separation degrees of the sub-edge contour segments to obtain the edge uniformity in the similarity of the sub-feature objects.
Specifically, the edge contour of the sub-feature object is segmented and the separation degree of each sub-edge contour segment is calculated. The edge contour segments of a healthy leaf are smooth straight lines or curves, but when a crop suffers from a disease the leaf edges change.
When the separation degree of the sub-edge contour segments is calculated at that point, the value obtained is larger than the separation degree of the sub-edge contour segments at the corresponding positions of a leaf under normal conditions. For the leaf edges, the leaf also needs to be scanned by a laser radar at the same time to obtain the edge data of the leaf.
The edge uniformity is a specific value, which needs to be set according to the specific type of crop.
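The separation degree and its accumulation can be sketched as follows; the segment length, the number of sampled points, and the use of randomly chosen contour indices are assumptions for illustration.

import numpy as np

def separation_degree(segment: np.ndarray, n_samples: int = 10,
                      rng=np.random.default_rng(0)) -> float:
    """Separation of one sub-edge contour segment: build the standard line segment
    between its two endpoints, pick random points on the contour segment and sum
    their minimum distances to that standard line segment."""
    a, b = segment[0].astype(float), segment[-1].astype(float)
    ab = b - a
    denom = float(np.dot(ab, ab)) or 1.0
    idx = rng.choice(len(segment), size=min(n_samples, len(segment)), replace=False)
    total = 0.0
    for p in segment[idx].astype(float):
        t = np.clip(np.dot(p - a, ab) / denom, 0.0, 1.0)   # closest point on the segment
        total += float(np.linalg.norm(p - (a + t * ab)))
    return total

def edge_uniformity(edge_contour: np.ndarray, seg_len: int = 20) -> float:
    """Cut the edge contour into equal-length pieces and accumulate their separation
    degrees; a larger value suggests a more ragged (possibly diseased) leaf edge."""
    pieces = [edge_contour[i:i + seg_len]
              for i in range(0, len(edge_contour) - seg_len + 1, seg_len)]
    return sum(separation_degree(p) for p in pieces)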
The application also provides a crop disease analysis device, which comprises:
The image acquisition unit is used for responding to the acquired instruction, carrying out image acquisition on the area pointed by the instruction to obtain an image group, wherein the image group comprises a plurality of images;
The image cutting unit is used for cutting the images according to the generation positions of the images to obtain images to be processed, and the corresponding areas of any two images to be processed in the same image group are not overlapped;
The identification and separation unit is used for identifying the characteristic object in the image to be processed and separating the characteristic object from the image to be processed;
The decomposition unit is used for decomposing the feature objects to obtain a plurality of sub-feature objects and grouping the sub-feature objects;
The first processing unit is used for comparing the similarity of any two sub-feature objects in each group, and carrying out secondary grouping on the sub-feature objects in one group and giving labels according to the similarity result; and
And the counting unit is used for counting the number of labels of each characteristic object of the instruction pointing region.
Further, the method further comprises the following steps:
The first curve processing unit is used for drawing the outline of the characteristic object, wherein the outline is a closed curve;
A second curve processing unit for determining a central area of the outline;
and the third curve processing unit is used for drawing the characteristic contours of the characteristic objects and dividing the characteristic contours according to the central area to obtain a plurality of sub-characteristic contours, and each sub-characteristic contour corresponds to one sub-characteristic object.
Further, the method further comprises the following steps:
The first selecting unit is used for randomly selecting a plurality of judging points on the outline;
the second processing unit is used for constructing line segments by using any two judgment points and obtaining distance value groups on the line segments at intervals;
the third processing unit is used for adjusting the judgment points at any end of the line segment according to the change trend of the distance value group and reconstructing the line segment, wherein the change trend comprises trend increase and trend decrease; and
A fourth processing unit, configured to obtain intersection areas of the plurality of line segments and use the intersection areas as a center area of the outline;
Wherein the number of the central areas of the outline is at least one.
Further, the method further comprises the following steps:
A determining unit for determining a symmetry line of the sub-feature object and determining a height variation trend of the symmetry line;
the grouping unit is used for grouping the sub-feature objects according to the height variation trend, and the height variation trend of the symmetry lines belonging to the same class of sub-feature objects is the same;
wherein the symmetry line of the sub-feature object points to the central area of the outline;
the length of the symmetry line of the sub-feature objects belonging to the same class is within the required interval.
Further, the similarity of the sub-feature objects includes color difference and edge uniformity.
Further, the method further comprises the following steps:
A fifth processing unit, configured to perform overlapping processing on the two sub-feature objects to obtain an overlapping area, and simultaneously move any one of the two sub-feature objects to maximize an area of the overlapping area;
a first calculating unit, configured to calculate a difference value between two pixel values at each pixel point, where the difference value includes a positive difference value and a negative difference value;
And the first assignment unit is used for taking the sum of the positive difference value quantity and the negative difference value quantity as the chromatic aberration of the sub-feature object.
Further, the method further comprises the following steps:
A sixth processing unit, configured to obtain an edge contour of the sub-feature object;
a seventh processing unit, configured to segment an edge contour of the sub-feature object to obtain a plurality of sub-edge contour segments, where a length of each sub-edge contour segment is the same;
The second calculation unit is used for calculating the separation degree of the sub-edge contour segments, wherein the separation degree is calculated by using two ends of the sub-edge contour segments to establish standard line segments, randomly selecting a plurality of points on the sub-edge contour segments and calculating the minimum linear distance between each point and the standard line segments;
And the second assignment unit is used for accumulating the separation degree of the sub-edge contour segments to obtain the edge uniformity in the similarity of the sub-feature objects.
In one example, the units in any of the above apparatuses may be one or more integrated circuits configured to implement the above methods, for example: one or more application specific integrated circuits (ASIC), or one or more digital signal processors (DSP), or one or more field programmable gate arrays (FPGA), or a combination of at least two of these integrated circuit forms.
For another example, when the units in the apparatus are implemented in the form of a processing element scheduling a program, the processing element may be a general-purpose processor, such as a central processing unit (CPU) or another processor that can invoke a program. For another example, the units may be integrated together and implemented in the form of a system-on-a-chip (SOC).
Various objects such as various messages/information/devices/network elements/systems/devices/actions/operations/processes/concepts may be named in the present application, and it should be understood that these specific names do not constitute limitations on related objects, and that the named names may be changed according to the scenario, context, or usage habit, etc., and understanding of technical meaning of technical terms in the present application should be mainly determined from functions and technical effects that are embodied/performed in the technical solution.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the several embodiments provided by the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It should also be understood that in various embodiments of the present application, first, second, etc. are merely intended to represent that multiple objects are different. For example, the first time window and the second time window are only intended to represent different time windows. Without any effect on the time window itself, the first, second, etc. mentioned above should not impose any limitation on the embodiments of the present application.
It is also to be understood that in the various embodiments of the application, where no special description or logic conflict exists, the terms and/or descriptions between the various embodiments are consistent and may reference each other, and features of the various embodiments may be combined to form new embodiments in accordance with their inherent logic relationships.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a computer-readable storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. And the aforementioned computer-readable storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The application also provides a crop disease analysis system, which comprises:
one or more memories for storing instructions; and
One or more processors configured to invoke and execute the instructions from the memory to perform the method as set forth above.
The present application also provides a computer program product comprising instructions which, when executed, cause a computing device to perform the operations corresponding to the above method.
The present application also provides a chip system comprising a processor for implementing the functions involved in the above, e.g. generating, receiving, transmitting, or processing data and/or information involved in the above method.
The chip system can be composed of chips, and can also comprise chips and other discrete devices.
The processor referred to in any of the foregoing may be a CPU, microprocessor, ASIC, or integrated circuit that performs one or more of the procedures for controlling the transmission of feedback information described above.
In one possible design, the system on a chip also includes memory to hold the necessary program instructions and data. The processor and the memory may be decoupled, and disposed on different devices, respectively, and connected by wired or wireless means, so as to support the chip system to implement the various functions in the foregoing embodiments. Or the processor and the memory may be coupled to the same device.
Optionally, the computer instructions are stored in a memory.
Alternatively, the memory may be a storage unit in the chip, such as a register, a cache, etc., and the memory may also be a storage unit in the terminal located outside the chip, such as a ROM or other type of static storage device, a RAM, etc., that may store static information and instructions.
It will be appreciated that the memory in the present application can be either volatile memory or nonvolatile memory, or can include both volatile and nonvolatile memory.
The non-volatile memory may be a ROM, programmable ROM (PROM), erasable programmable ROM (erasable PROM, EPROM), electrically erasable programmable EPROM (EEPROM), or flash memory.
The volatile memory may be RAM, which acts as an external cache. There are many different types of RAM, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synclink DRAM (SLDRAM), and direct Rambus RAM (DR RAM).
The embodiments of the present application are all preferred embodiments of the present application, and are not intended to limit the scope of the present application in this way, therefore: all equivalent changes in structure, shape and principle of the application should be covered in the scope of protection of the application.

Claims (7)

1. A method for analyzing crop disease, comprising:
responding to the acquired instruction, and carrying out image acquisition on the area pointed by the instruction to obtain an image group, wherein the image group comprises a plurality of images;
Cutting the images according to the generation positions of the images to obtain images to be processed, wherein the corresponding areas of any two images to be processed in the same image group are not overlapped;
Identifying a feature object in the image to be processed, and separating the feature object from the image to be processed;
decomposing the feature objects to obtain a plurality of sub-feature objects, and grouping the sub-feature objects;
comparing the similarity of any two sub-feature objects in each group, and carrying out secondary grouping and label assignment on the sub-feature objects in one group according to the similarity result; and
Counting the number of labels of each characteristic object of the pointed area of the instruction, wherein the number of labels carried by the characteristic objects is positively correlated with the probability of suffering from the disease;
Decomposing the feature object includes:
drawing the outline of the characteristic object, wherein the outline is a closed curve;
Determining a central area of the outline;
Drawing a characteristic contour of a characteristic object, and dividing the characteristic contour according to a central area to obtain a plurality of sub-characteristic contours, wherein each sub-characteristic contour corresponds to one sub-characteristic object;
Further comprises:
randomly selecting a plurality of judgment points on the outline;
constructing line segments by using any two judgment points and obtaining a distance value group on the line segments at intervals;
the judgment points at any end of the line segment are adjusted according to the change trend of the distance value group, the line segment is reconstructed, and the change trend comprises trend increase and trend decrease; and
Obtaining intersection areas of a plurality of line segments, taking the intersection areas as central areas of outline contours, wherein the number of the central areas of the outline contours is at least one;
Grouping sub-feature objects includes:
Determining symmetry lines of the sub-feature objects and determining the height change trend of the symmetry lines;
grouping the sub-feature objects according to the height change trend, wherein the height change trend of symmetry lines of the sub-feature objects belonging to the same class is the same;
wherein the symmetry line of the sub-feature object points to the central area of the outline;
the length of the symmetry line of the sub-feature objects belonging to the same class is within the required interval.
2. The crop disease analysis method of claim 1, wherein the similarity of the sub-feature objects includes chromatic aberration and edge uniformity.
3. The crop disease analysis method according to claim 2, wherein obtaining the color difference in the similarity of the sub-feature objects comprises:
overlapping the two sub-feature objects to obtain an overlapping region, and simultaneously moving any one of the two sub-feature objects to maximize the area of the overlapping region;
Calculating the difference value of two pixel values on each pixel point, wherein the difference value comprises a positive difference value and a negative difference value;
and taking the sum of the positive difference value quantity and the negative difference value quantity as the chromatic aberration of the sub-characteristic object.
4. A crop disease analysis method as claimed in claim 2 or claim 3, wherein obtaining edge uniformity in similarity of sub-feature objects comprises:
obtaining the edge profile of the sub-feature object;
Segmenting the edge contour of the sub-feature object to obtain a plurality of sub-edge contour segments, wherein the lengths of each sub-edge contour segment are the same;
calculating the separation degree of the sub-edge contour segments, wherein the calculation mode of the separation degree is to establish standard line segments by using two ends of the sub-edge contour segments, randomly selecting a plurality of points on the sub-edge contour segments and calculating the minimum linear distance between each point and the standard line segments;
and accumulating the separation degrees of the sub-edge contour segments to obtain the edge uniformity in the similarity of the sub-feature objects.
5. A crop disease analysis apparatus, comprising:
The image acquisition unit is used for responding to the acquired instruction, carrying out image acquisition on the area pointed by the instruction to obtain an image group, wherein the image group comprises a plurality of images;
The image cutting unit is used for cutting the images according to the generation positions of the images to obtain images to be processed, and the corresponding areas of any two images to be processed in the same image group are not overlapped;
The identification and separation unit is used for identifying the characteristic object in the image to be processed and separating the characteristic object from the image to be processed;
The decomposition unit is used for decomposing the feature objects to obtain a plurality of sub-feature objects and grouping the sub-feature objects;
the first processing unit is used for comparing the similarity of any two sub-feature objects in each group, and carrying out secondary grouping on the sub-feature objects in one group and giving labels according to the similarity result;
the statistics unit is used for counting the number of labels of each characteristic object of the instruction pointing region, and the number of labels carried by the characteristic objects is positively correlated with the probability of suffering from the disease;
The first curve processing unit is used for drawing the outline of the characteristic object, wherein the outline is a closed curve;
A second curve processing unit for determining a central area of the outline;
the third curve processing unit is used for drawing the characteristic contours of the characteristic objects and dividing the characteristic contours according to the central area to obtain a plurality of sub-characteristic contours, and each sub-characteristic contour corresponds to one sub-characteristic object;
The first selecting unit is used for randomly selecting a plurality of judging points on the outline;
the second processing unit is used for constructing line segments by using any two judgment points and obtaining distance value groups on the line segments at intervals;
the third processing unit is used for adjusting the judgment points at any end of the line segment according to the change trend of the distance value group and reconstructing the line segment, wherein the change trend comprises trend increase and trend decrease;
A fourth processing unit, configured to obtain intersection areas of the plurality of line segments, and take the intersection areas as a central area of the outline, where the number of the central areas of the outline is at least one;
A determining unit for determining a symmetry line of the sub-feature object and determining a height variation trend of the symmetry line;
the grouping unit is used for grouping the sub-feature objects according to the height variation trend, and the height variation trend of the symmetry lines belonging to the same class of sub-feature objects is the same;
wherein the symmetry line of the sub-feature object points to the central area of the outline;
the length of the symmetry line of the sub-feature objects belonging to the same class is within the required interval.
6. A crop disease analysis system, the system comprising:
one or more memories for storing instructions; and
One or more processors to invoke and execute the instructions from the memory to perform the crop disease analysis method of any of claims 1 to 4.
7. A computer-readable storage medium, the computer-readable storage medium comprising:
a program which, when executed by a processor, performs the crop disease analysis method as claimed in any one of claims 1 to 4.
CN202410339267.7A 2024-03-25 2024-03-25 Crop disease analysis method and analysis system Active CN117935068B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410339267.7A CN117935068B (en) 2024-03-25 2024-03-25 Crop disease analysis method and analysis system

Publications (2)

Publication Number Publication Date
CN117935068A CN117935068A (en) 2024-04-26
CN117935068B (en) 2024-05-24

Family

ID=90761246

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410339267.7A Active CN117935068B (en) 2024-03-25 2024-03-25 Crop disease analysis method and analysis system

Country Status (1)

Country Link
CN (1) CN117935068B (en)


Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11227382B2 (en) * 2018-01-11 2022-01-18 Intelinair, Inc. Change detection system
CA3125790A1 (en) * 2019-02-04 2020-08-13 Farmers Edge Inc. Shadow and cloud masking for remote sensing images in agriculture applications using multilayer perceptron
US11425852B2 (en) * 2020-10-16 2022-08-30 Verdant Robotics, Inc. Autonomous detection and control of vegetation

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109766904A (en) * 2015-07-27 2019-05-17 蚌埠医学院 The innovatory algorithm of medical domain image, semantic similarity matrix
CN105957063A (en) * 2016-04-22 2016-09-21 北京理工大学 CT image liver segmentation method and system based on multi-scale weighting similarity measure
CN106932780A (en) * 2017-03-14 2017-07-07 北京京东尚科信息技术有限公司 Object positioning method, device and system
WO2018176624A1 (en) * 2017-03-28 2018-10-04 Zhejiang Dahua Technology Co., Ltd. Methods and systems for fire detection
WO2019015477A1 (en) * 2017-07-18 2019-01-24 Oppo广东移动通信有限公司 Image correction method, computer readable storage medium and computer device
CN114766041A (en) * 2019-12-03 2022-07-19 巴斯夫欧洲公司 System and method for determining crop damage
CN113344055A (en) * 2021-05-28 2021-09-03 北京百度网讯科技有限公司 Image recognition method, image recognition device, electronic equipment and medium
AU2021103121A4 (en) * 2021-06-04 2022-03-24 Baba Farid College Of Engineering And Technology Bathinda A system of roi & k-means clustering based hybrid technique for detection of yellow spot disease in crops
CN115690919A (en) * 2021-07-23 2023-02-03 北京嘀嘀无限科技发展有限公司 Living body detection method, living body detection device, living body detection equipment, storage medium and program product
CN113822247A (en) * 2021-11-22 2021-12-21 广东泰一高新技术发展有限公司 Method and system for identifying illegal building based on aerial image
WO2023242236A1 (en) * 2022-06-14 2023-12-21 Basf Se Synthetic generation of training data
CN115995005A (en) * 2023-03-22 2023-04-21 航天宏图信息技术股份有限公司 Crop extraction method and device based on single-period high-resolution remote sensing image
CN116524369A (en) * 2023-04-18 2023-08-01 中国地质大学(武汉) Remote sensing image segmentation model construction method and device and remote sensing image interpretation method
CN117496109A (en) * 2023-09-23 2024-02-02 深圳市中科中天科技有限公司 Image comparison and analysis method and device, electronic equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
田蕾 et al., "山地乔灌木斑块空间分布格局及其多样性沿海拔梯度的变化规律" [Spatial distribution pattern of montane tree and shrub patches and the variation of their diversity along an elevation gradient], 《生态学报》 (Acta Ecologica Sinica), 2023, Vol. 43, No. 24, pp. 10320-10333. *

Also Published As

Publication number Publication date
CN117935068A (en) 2024-04-26

Similar Documents

Publication Publication Date Title
CN111398176B (en) Water body water color abnormity remote sensing identification method and device based on pixel scale characteristics
CN112686274B (en) Target object detection method and device
CN110033009B (en) Method for processing image data in a connected network
CN111666797B (en) Vehicle positioning method, device and computer equipment
CN111192239B (en) Remote sensing image change area detection method and device, storage medium and electronic equipment
CN112084869A (en) Compact quadrilateral representation-based building target detection method
CN113989305A (en) Target semantic segmentation method and street target abnormity detection method applying same
CN112183212A (en) Weed identification method and device, terminal equipment and readable storage medium
CN113487607A (en) Defect detection method and device based on multi-view-field image
CN109034239B (en) Remote sensing image classification method, and site selection method and device for distributed wind power plant
CN112101195A (en) Crowd density estimation method and device, computer equipment and storage medium
CN117935068B (en) Crop disease analysis method and analysis system
CN114926732A (en) Multi-sensor fusion crop deep learning identification method and system
US20210365721A1 (en) Systems and methods for creating a parking map
CN115171031A (en) Method and device for detecting surface water accumulation based on vehicle reference object and application
CN117557924B (en) Agricultural environment monitoring method, device, system and storage medium
CN115115542A (en) Rapid restoration method of color difference strip after remote sensing image mosaic based on cloud platform
CN114359748A (en) Target classification extraction method and device
CN112132215A (en) Method and device for identifying object type and computer readable storage medium
CN112669346A (en) Method and device for determining road surface emergency
CN115546621B (en) Crop growth condition analysis method, device and application
CN113408514B (en) Method and device for detecting berths of roadside parking lot based on deep learning
CN117635839B (en) Crop information acquisition and three-dimensional image presentation method, device and system
CN115546208B (en) Method, device and application for measuring plant height of field crops
CN116449875B (en) Unmanned aerial vehicle inspection method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant