CN117036677A - Method and device for monitoring field moth pests, electronic equipment and storage medium


Info

Publication number
CN117036677A
Authority
CN
China
Prior art keywords
target
determining
trapping device
target object
control instruction
Prior art date
Legal status
Pending
Application number
CN202310977024.1A
Other languages
Chinese (zh)
Inventor
杨保军
李航
刘淑华
姚青
唐健
王爱英
罗举
Current Assignee
Zhejiang Sci Tech University ZSTU
China National Rice Research Institute
Original Assignee
Zhejiang Sci Tech University ZSTU
China National Rice Research Institute
Priority date
Filing date
Publication date
Application filed by Zhejiang Sci Tech University ZSTU, China National Rice Research Institute filed Critical Zhejiang Sci Tech University ZSTU
Priority to CN202310977024.1A
Publication of CN117036677A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/20 Image preprocessing
    • G06V 10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M 1/00 Stationary means for catching or killing insects
    • A01M 1/02 Stationary means for catching or killing insects with devices or substances, e.g. food, pheromones, attracting the insects
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M 1/00 Stationary means for catching or killing insects
    • A01M 1/02 Stationary means for catching or killing insects with devices or substances, e.g. food, pheromones, attracting the insects
    • A01M 1/026 Stationary means for catching or killing insects with devices or substances, e.g. food, pheromones, attracting the insects, combined with devices for monitoring insect presence, e.g. termites
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M 2200/00 Kind of animal
    • A01M 2200/01 Insects
    • A01M 2200/012 Flying insects

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Pest Control & Pesticides (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Environmental Sciences (AREA)
  • Zoology (AREA)
  • Wood Science & Technology (AREA)
  • Insects & Arthropods (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Catching Or Destruction (AREA)

Abstract

The embodiment of the specification discloses a field moth pest monitoring method, a field moth pest monitoring device, electronic equipment and a storage medium. The method includes determining a target pest species and determining a corresponding target trapping device; sending a first control instruction to a target object to instruct the target object to place the target trapping device in a monitoring area; and, each time a first preset period elapses, determining target spatial parameters of the target trapping device based on meteorological data and sending a second control instruction to the target object to instruct it to adjust the device's current spatial parameters accordingly. According to the embodiments of the specification, the target trapping device can be placed automatically according to the target pest species to be monitored, and the spatial parameters of each target trapping device can be continuously adjusted according to meteorological data, so that the monitoring image data obtained are highly accurate and the monitoring requirements for moth pests are met.

Description

Method and device for monitoring field moth pests, electronic equipment and storage medium
Technical Field
One or more embodiments of the present disclosure relate to pest monitoring technology, and in particular to a method, an apparatus, an electronic device, and a storage medium for monitoring field moth pests.
Background
Moth pests are one of the important causes of crop yield loss. Field moth pest monitoring can be used to determine the migration time of adults, the base number of insect sources, and the progress of population development. Traditional field monitoring relies on manual surveys, in which species are identified, counted, and reported by hand, or on light or pheromone trapping (insect-situation reporting lamps or instruments); such approaches demand strong expertise, are time-consuming and labor-intensive, and are not real-time. In recent years, automatic monitoring methods using photoelectric counting modules and machine vision modules have appeared, but it remains difficult to adjust the monitoring method, position, and so on in real time as the complex field environment changes, so data accuracy is poor and the monitoring requirements for moth pests cannot be met.
Disclosure of Invention
To solve the above problems, one or more embodiments of the present specification describe a field moth pest monitoring method, apparatus, electronic device, and storage medium.
According to a first aspect, there is provided a field moth pest monitoring method comprising:
determining a target pest species and determining a target trapping device corresponding to the target pest species;
sending a first control instruction to a target object to instruct the target object to place the target trapping device in a monitoring area;
and, each time a first preset period elapses, determining target spatial parameters of the target trapping device based on meteorological data, and sending a second control instruction to the target object to instruct the target object to adjust the current spatial parameters of the target trapping device based on the target spatial parameters.
Preferably, the determining a target pest species and determining a target trap corresponding to the target pest species includes:
acquiring a monitoring target, and determining the target pest species according to the monitoring target;
determining a target pest characteristic corresponding to the target pest species, and determining the target trapping device based on the target pest characteristic.
Preferably, the sending a first control instruction to a target object, for instructing the target object to place the target trap in a monitoring area, includes:
determining the initial number of the target trapping devices according to the area of the monitoring area, and acquiring initial spatial parameters of the target trapping devices, wherein the spatial parameters include the arrangement mode of the target trapping devices, a first distance between a target trapping device and the field ridge, a second distance between target trapping devices, and the height of the insect-attracting part;
sending a first control instruction to a target object to instruct the target object to place the target trapping devices in the monitoring area according to the initial number and the initial spatial parameters.
Preferably, the sending the first control instruction to the target object includes:
when a non-target trapping device exists in the monitoring area, determining a minimum distance between the target trapping device and the non-target trapping device, and sending a first control instruction and a third control instruction to the target object, wherein the third control instruction instructs the target object to adjust a third distance between any target trapping device and the non-target trapping device so that it is not smaller than the minimum distance.
Preferably, each time the first preset time period passes, determining a target space parameter of the target trapping device based on meteorological data, and sending a second control instruction to the target object to instruct the target object to adjust a current space parameter of the target trapping device based on the target space parameter, including:
each time a first preset period elapses, determining target spatial parameters of the target trapping device based on meteorological data, and determining a target placement area and a target number of target trapping devices based on monitoring area characteristics;
sending a second control instruction to the target object to instruct the target object, within the target placement area, to adjust the current number and the current spatial parameters of the target trapping devices based on the target number and the target spatial parameters.
Preferably, the determining the target space parameter of the target trapping device based on meteorological data comprises:
acquiring meteorological data at the current moment, and querying a parameter database corresponding to the target trapping device based on the meteorological data to obtain the target spatial parameters, wherein the parameter database stores mapping relations between meteorological data and spatial parameters.
Preferably, the method further comprises:
sending a fourth control instruction to the target object each time a second preset period elapses, wherein the fourth control instruction controls the target object to acquire and upload images of the insect-attracting part of the target trapping device;
and analyzing the insect-attracting part images to identify and count the target pests.
According to a second aspect, there is provided a field moth pest monitoring device, the device comprising:
a determining module for determining a target pest species and determining a target trap corresponding to the target pest species;
the first sending module is used for sending a first control instruction to a target object to instruct the target object to place the target trapping device in a monitoring area;
and the second sending module is used for determining, each time a first preset period elapses, the target spatial parameters of the target trapping device based on meteorological data, and for sending a second control instruction to the target object to instruct the target object to adjust the current spatial parameters of the target trapping device based on the target spatial parameters.
According to a third aspect, there is provided an electronic device comprising a processor and a memory;
the processor is connected with the memory;
the memory is used for storing executable program codes;
the processor runs a program corresponding to executable program code stored in the memory by reading the executable program code for performing the steps of the method as provided in the first aspect or any one of the possible implementations of the first aspect.
According to a fourth aspect, there is provided a computer-readable storage medium having stored thereon a computer program containing instructions which, when run on a computer or processor, cause the computer or processor to perform the method provided by the first aspect or any one of its possible implementations.
According to the method and the device provided by the embodiments of the specification, the target trapping device can be placed automatically according to the target pest species to be monitored, and the spatial parameters of each target trapping device can be continuously adjusted according to meteorological data, so that the monitoring image data obtained are highly accurate and the monitoring requirements for moth pests are met.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required by the embodiments are briefly described below. It is apparent that the drawings in the following description show only some embodiments of the present application, and that other drawings may be obtained from them without inventive effort by a person skilled in the art.
Fig. 1 is a flow chart of a method of monitoring a field moth pest in one embodiment of the present disclosure.
Fig. 2 is a schematic structural view of a field moth pest monitoring device according to an embodiment of the present disclosure.
Fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application.
In the following description, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. The following description provides various embodiments of the application that may be substituted or combined, so the application is also to be considered as embracing all possible combinations of the embodiments described. Thus, if one embodiment includes features A, B, and C and another embodiment includes features B and D, the present application should also be considered to include embodiments containing one or more of all other possible combinations of A, B, C, and D, although such an embodiment may not be explicitly recited in the following text.
The following description provides examples and does not limit the scope, applicability, or examples set forth in the claims. Changes may be made in the function and arrangement of elements described without departing from the scope of the application. Various examples may omit, replace, or add various procedures or components as appropriate. For example, the described methods may be performed in a different order than described, and various steps may be added, omitted, or combined. Furthermore, features described with respect to some examples may be combined into other examples.
Referring to fig. 1, fig. 1 is a schematic flow chart of a method for monitoring field moth pests according to an embodiment of the present application. In an embodiment of the present application, the method includes:
s101, determining a target pest type and determining a target trapping device corresponding to the target pest type.
The execution subject of the present application may be a cloud server.
In the embodiment of the present disclosure, the cloud server first determines the target pest species to be monitored. Depending on the pest species, different sex trapping devices are used to trap the pest. The cloud server therefore determines the target trapping device for this monitoring task according to the target pest species.
For example, a sex trapping device is provided with a lure core containing slow-release sex pheromone components; the lure is species-specific and attracts one kind of moth pest. Common shapes of sex traps include boat-shaped traps, cylindrical traps with diamond-shaped inlets, and bell-jar inverted-funnel traps. The structure of the sex trap affects which insects are attracted and in what numbers; for example, noctuids need to be trapped with the cylindrical diamond-inlet type, while borers need the bell-jar inverted-funnel type. A sticky insect board can also be used in the insect-attracting area to recover insect samples.
In one embodiment, step S101 includes:
acquiring a monitoring target, and determining the target pest species according to the monitoring target;
determining a target pest characteristic corresponding to the target pest species, and determining the target trapping device based on the target pest characteristic.
In the embodiment of the specification, a worker sets a monitoring target on a terminal according to the pests to be monitored, and the cloud server determines the target pest species by acquiring this monitoring target. Each pest species is associated with corresponding pest characteristics according to its traits (such as noctuid characteristics or borer characteristics), and the cloud server can look up the corresponding target trapping device according to the target pest characteristics.
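As a minimal illustration of this lookup (not part of the patent; the species names, feature labels, trap names, and the select_trap helper below are all hypothetical), the species-to-feature and feature-to-trap mappings can be kept in simple tables:

# Hypothetical sketch of the species -> feature -> trap lookup described above.
PEST_FEATURES = {
    "rice_leaf_roller": "borer_like",   # borer-type pest characteristic
    "armyworm": "noctuid_like",         # noctuid-type pest characteristic
}

TRAP_FOR_FEATURE = {
    "noctuid_like": "cylindrical_diamond_inlet_trap",
    "borer_like": "bell_jar_inverted_funnel_trap",
}

def select_trap(target_species: str) -> str:
    """Return the trap type for a monitored pest species."""
    feature = PEST_FEATURES[target_species]
    return TRAP_FOR_FEATURE[feature]

print(select_trap("armyworm"))  # -> cylindrical_diamond_inlet_trap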
S102, sending a first control instruction to a target object to instruct the target object to place the target trapping device in a monitoring area.
In the present description, the target object may be an AGV cart, an intelligent robot, or another device capable of moving in a field environment. After the cloud server determines the target trapping device, it sends a first control instruction to the target object. On receiving the first control instruction, the target object can pick up the target trapping device and place it in the designated monitoring area. The monitoring area is an area delineated manually in advance; by fixing the monitoring area to a field where the specific moth pest occurs (or to a plant protection department's observation garden), data are provided for global monitoring.
Specifically, the target object may travel to the area where the target trapping devices are stored, determine the specific position of each target trapping device according to an identification sensor provided on it, identify the gripping portion of the target trapping device through the identification sensor, and grip the identified gripping portion with its mechanical arm, gripping member, or the like to pick up the device. A travel route can then be planned according to the target object's current position and the monitoring area. After reaching the monitoring area along the travel route, the target object confirms the placement pose of the target trapping device through the identification sensor's height and azimuth readings, then releases the gripping portion to set the device down. The identification sensor may be an infrared sensor, a photoelectric sensor, or the like. The travel route may be planned with any existing artificial intelligence path planning method, which is not described in detail here.
In one embodiment, step S102 includes:
determining the initial number of the target trapping devices according to the area of the monitoring area, and acquiring initial spatial parameters of the target trapping devices, wherein the spatial parameters include the arrangement mode of the target trapping devices, a first distance between a target trapping device and the field ridge, a second distance between target trapping devices, and the height of the insect-attracting part;
sending a first control instruction to a target object to instruct the target object to place the target trapping devices in the monitoring area according to the initial number and the initial spatial parameters.
In the embodiment of the specification, in order to diversify the obtained data, reduce misjudgments, and improve monitoring accuracy, the cloud server can control the target object to place multiple target trapping devices in the monitoring area; to ensure that each device is effective after placement, their positions, distribution, and so on need to be adjusted. Specifically, for each kind of target trapping device, a mapping between device count and monitoring area is preset empirically in the database, so the cloud server can determine the initial number of target trapping devices according to the area of the monitoring area. Likewise, for each kind of target trapping device, corresponding initial spatial parameters can be preset from human experience: the distance to keep between a device and the field ridge, the spacing to maintain between devices, the height of the insect-attracting part, and so on. The target object then places the devices in the monitoring area according to the initial number and the initial spatial parameters, where the first device's initial placement position may be an edge or the center of the area. The height of the insect-attracting part can be adjusted by an electrically controlled telescopic piece mounted on the target trapping device: when the target object grips the device, it can transmit telescopic parameters to the telescopic piece's controller by electric signal, and the telescopic piece extends or retracts to the specified height in response.
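A minimal sketch of this lookup, assuming a fixed coverage area per trap; COVERAGE_PER_TRAP_M2, the SpatialParams fields, and initial_trap_count are illustrative names and values, not taken from the patent:

import math
from dataclasses import dataclass

@dataclass
class SpatialParams:
    arrangement: str         # e.g. grid or line arrangement of the traps
    ridge_distance_m: float  # first distance: trap to the field ridge
    trap_spacing_m: float    # second distance: between traps
    lure_height_m: float     # height of the insect-attracting part

# Assumed empirical values; the patent stores these per trap type in a database.
COVERAGE_PER_TRAP_M2 = 2000.0
INITIAL_PARAMS = SpatialParams("grid", ridge_distance_m=5.0,
                               trap_spacing_m=30.0, lure_height_m=1.2)

def initial_trap_count(area_m2: float) -> int:
    """Initial number of traps for a monitoring area of the given size."""
    return max(1, math.ceil(area_m2 / COVERAGE_PER_TRAP_M2))

print(initial_trap_count(8500.0))  # -> 5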
In one embodiment, the sending the first control instruction to the target object includes:
when a non-target trapping device exists in the monitoring area, determining a minimum distance between the target trapping device and the non-target trapping device, and sending a first control instruction and a third control instruction to the target object, wherein the third control instruction instructs the target object to adjust a third distance between any target trapping device and the non-target trapping device so that it is not smaller than the minimum distance.
In the embodiment of the specification, within the same period, the monitoring area may hold several kinds of trapping devices to trap several field moth pests. To avoid interference between different sex trapping devices, reasonable spacing and arrangement must be provided. Specifically, the database can also be pre-populated empirically with the minimum distances between the various kinds of sex trapping devices. The cloud server sends the third control instruction together with the first control instruction to instruct the target object to correct the position of the target trapping device, so that the third distance between the target trapping device and the non-target trapping device is not smaller than the minimum distance.
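A simplified sketch of such a correction (a hypothetical helper; a full implementation would re-check all pairs after each move, which this single pass does not):

import math

def enforce_min_distance(target_pos, non_target_positions, min_dist):
    """Push a target trap away from any too-close non-target trap until the
    third distance is no smaller than the required minimum (illustrative)."""
    x, y = target_pos
    for nx, ny in non_target_positions:
        d = math.hypot(x - nx, y - ny)
        if 0 < d < min_dist:
            # Move the target trap outward along the line joining the two traps.
            scale = min_dist / d
            x, y = nx + (x - nx) * scale, ny + (y - ny) * scale
    return (x, y)

print(enforce_min_distance((3.0, 0.0), [(0.0, 0.0)], 10.0))  # -> about (10.0, 0.0)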
S103, each time a first preset period elapses, determining target spatial parameters of the target trapping device based on meteorological data, and sending a second control instruction to the target object to instruct the target object to adjust the current spatial parameters of the target trapping device based on the target spatial parameters.
In the embodiment of the present specification, under heavy wind, rainfall, or a change of wind direction, the activity position, activity height, and so on of moth pests change accordingly. To trap target pests more accurately, the specific location, distribution, and height of the target trapping device should be adjusted with changes in the meteorological data. Therefore, each time a first preset period (for example, 24 hours) elapses, the cloud server acquires the meteorological data at the current moment and determines the spatial parameters best suited to that weather, i.e., the target spatial parameters, according to historical meteorological data and the current field vegetation information. The cloud server then sends a second control instruction to the target object to instruct it to grip the target trapping device and adjust its current spatial parameters, namely the first distance, the second distance, and the insect-attracting part height, according to the target spatial parameters. In this way the arrangement of the target trapping device can be adapted to the activity ranges of the target pests under different weather, improving trapping accuracy. The meteorological data may include night wind force, wind direction, rainfall, temperature, humidity, and so on.
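A simplified sketch of this periodic adjustment loop; get_weather, query_param_db, and send_instruction are placeholder callables standing in for the cloud server's actual interfaces, which the patent does not specify:

import time

FIRST_PRESET_PERIOD_S = 24 * 3600  # e.g. 24 hours, as in the example above

def monitoring_loop(get_weather, query_param_db, send_instruction):
    """Periodically derive target spatial parameters from weather data and
    instruct the target object to adjust the traps (illustrative sketch)."""
    while True:
        weather = get_weather()                  # night wind, direction, rain, ...
        target_params = query_param_db(weather)  # mapping stored in the database
        send_instruction({"type": "second_control",
                          "target_spatial_params": target_params})
        time.sleep(FIRST_PRESET_PERIOD_S)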
In one embodiment, step S103 includes:
each time a first preset period elapses, determining target spatial parameters of the target trapping device based on meteorological data, and determining a target placement area and a target number of target trapping devices based on monitoring area characteristics;
sending a second control instruction to the target object to instruct the target object, within the target placement area, to adjust the current number and the current spatial parameters of the target trapping devices based on the target number and the target spatial parameters.
In the present embodiments, it is recognized that, depending on the situation, not all locations within the monitoring area are suitable for placing target trapping devices. Therefore, besides determining the target spatial parameters from meteorological data, the cloud server can obtain the monitoring area characteristics from the map data and planting data of the area, and from these determine the area where target trapping devices can actually be placed, i.e., the target placement area, together with the target number for that area. The second control instruction generated by the cloud server can then instruct the target object to adjust the actual state of the target trapping devices in the target placement area according to the target number and the target spatial parameters, further ensuring pest monitoring accuracy. The monitoring area characteristics may include the crop planting structure, such as crop varieties, layout, planting periods, and the surrounding topography.
In one embodiment, the determining the target spatial parameter of the target trap based on meteorological data comprises:
acquiring meteorological data at the current moment, and querying a parameter database corresponding to the target trapping device based on the meteorological data to obtain the target spatial parameters, wherein the parameter database stores mapping relations between meteorological data and spatial parameters.
In the embodiment of the specification, a parameter database is preset, in which mapping relations between meteorological data and spatial parameters are constructed from the activity patterns of moth pests recorded in historical data under different weather. After the cloud server acquires the meteorological data at the current moment, it feeds the data into the parameter database for query and calculation to obtain the target spatial parameters.
Specifically, the mapping relation may be built by determining the spatial distribution of the moth pests under each kind of meteorological data and then setting corresponding spatial parameters for each weather condition from human experience, or by training a neural network (the embodiment mentions a CNN) on historical data and using its mapping function as the mapping relation.
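A sketch of such a learned mapping, written here as a small fully connected network in PyTorch for illustration; the patent only says a network is trained on historical data, so the feature list, layer sizes, and library choice are all assumptions:

import torch
import torch.nn as nn

# Hypothetical learned mapping from weather features to spatial parameters.
# Input: [wind_speed, wind_dir_sin, wind_dir_cos, rainfall, temperature, humidity]
# Output: [ridge_distance, trap_spacing, lure_height]
class WeatherToParams(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(6, 32), nn.ReLU(),
            nn.Linear(32, 32), nn.ReLU(),
            nn.Linear(32, 3),
        )

    def forward(self, weather: torch.Tensor) -> torch.Tensor:
        return self.net(weather)

model = WeatherToParams()
weather = torch.tensor([[3.2, 0.5, 0.87, 0.0, 24.0, 0.65]])
print(model(weather).shape)  # torch.Size([1, 3])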
In one embodiment, the method further comprises:
sending a fourth control instruction to the target object each time a second preset period elapses, wherein the fourth control instruction controls the target object to acquire and upload images of the insect-attracting part of the target trapping device;
and analyzing the insect-attracting part images to identify and count the target pests.
In this embodiment of the present disclosure, each time a second preset period elapses, the cloud server may further send a fourth control instruction to the target object to control it to capture images of the insect-attracting part of the target trapping device with its camera. The target object uploads the captured images to the cloud server, which identifies the target pests by image recognition on the insect-attracting part images, determines their number, and determines the growth stage of each target pest from the successive images. The second preset period may equal the first preset period, so that the target object captures images each time it adjusts a target trapping device, or may differ from it, so that spatial adjustment and image capture are staggered. Moreover, in some stable environments the target object may not need to adjust the spatial parameters every first preset period, yet it still completes image capture according to the fourth control instruction generated every second preset period, and vice versa.
In addition, additional information can be attached to the image data of the captured insect-attracting part images, such as the target species of the trap's lure core, placement time, placement position, crop name, and growth-period information. Workers can then combine this information to compute the occurrence dynamics of the targets in a specific period, display the moth peak period within a given interval, infer the larval occurrence (damage) period, and recommend the optimal window for control.
The specific analysis and identification process of the insect attracting part image can be as follows:
step 1, image generation and brightness pretreatment:
and extracting database pictures by using Java imageIO, and carrying out various processes on the original pictures received by the cloud server, wherein the processes comprise the steps of backing up the original pictures, generating compressed pictures, enhancing data and preprocessing the pictures.
The original picture is in jpg format and is used for high-definition display, compressed picture generation, model training, result drawing display and the like.
The compressed picture is in jpg format and is used for thumbnail display in the user system, speeding up picture transmission.
The corresponding path of the picture in the cloud server is stored in the database.
Data enhancement is mainly used to expand the dataset, preparing for subsequent model training and improving the robustness of the model. It mainly uses image flipping and rotation and noise-adding methods.
The data enhancement pictures are stored in an image dataset server for training of subsequent models.
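A minimal sketch of the flipping, rotation, and noise-adding augmentations using OpenCV and NumPy; the augment helper and its parameter values are illustrative, not from the patent:

import cv2
import numpy as np

def augment(image: np.ndarray) -> list[np.ndarray]:
    """Flip, rotate, and add noise to expand the dataset, as described above."""
    flipped = cv2.flip(image, 1)                          # horizontal flip
    rotated = cv2.rotate(image, cv2.ROTATE_90_CLOCKWISE)  # 90-degree rotation
    noise = np.random.normal(0, 10, image.shape).astype(np.float32)
    noisy = np.clip(image.astype(np.float32) + noise, 0, 255).astype(np.uint8)
    return [flipped, rotated, noisy]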
Step 2, image preprocessing: adaptive brightness adjustment and local highlight processing:
database pictures were extracted with the openCV of python and the pictures were preprocessed.
The sticky sex-trap board images have complex backgrounds, and differences in the sticky glue and illumination on the boards cause uneven brightness and local highlights in the images.
For the uneven-brightness problem, an image preprocessing method based on linear transformation is adopted to obtain an image with uniform background brightness. Specifically, the image histogram is analyzed by an adaptive brightness adjustment algorithm: the histograms of the three RGB channels are computed, the RGB color space is converted to the HSI color space, histogram equalization is performed on the intensity channel, and after processing the image is converted back from HSI to RGB. Histogram equalization gives a preliminary brightness correction and reduces overexposure. The image brightness is then adjusted in the following steps:
(1) The brightness of the image is first obtained according to the following formula.
Brightness = 0.299 × R + 0.587 × G + 0.114 × B
where R, G, and B are the pixel values of the image's Red, Green, and Blue channels respectively, and Brightness is the computed image brightness.
(2) After obtaining the image brightness, the original image f is linearly transformed by formula (2) to generate a new image g.
g(i, j) = goal_Brightness / Brightness × f(i, j)
where (i, j) is the position of a pixel in the image, f(i, j) is the RGB pixel value of image f at position (i, j), and g(i, j) is the RGB pixel value of image g at position (i, j). goal_Brightness is the target brightness, in the range [0, 255]. To find a suitable value, this embodiment ran multiple tests with goal_Brightness set within [150, 200]; goal_Brightness = 180 gave the best result, making the overall brightness of all images the same.
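A sketch of this linear brightness normalization in Python/OpenCV; the preceding HSI histogram-equalization step is omitted, and normalize_brightness is a hypothetical helper name:

import cv2
import numpy as np

GOAL_BRIGHTNESS = 180.0  # value found best in the tests described above

def normalize_brightness(img_bgr: np.ndarray) -> np.ndarray:
    """Linearly rescale an image so its mean brightness matches the target,
    following Brightness = 0.299*R + 0.587*G + 0.114*B (sketch)."""
    b, g, r = cv2.split(img_bgr.astype(np.float32))
    brightness = float(np.mean(0.299 * r + 0.587 * g + 0.114 * b))
    scaled = img_bgr.astype(np.float32) * (GOAL_BRIGHTNESS / brightness)
    return np.clip(scaled, 0, 255).astype(np.uint8)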
For the local-highlight problem, a grayscale image is generated first; experiments show that regions above pixel value 220 are highlight regions, which can be screened out with a binary mask. The traditional way to fill a highlight region is with white or with the most frequent color in the image, but here the background is complex, that color is generally hard to determine, and the generated image looks unnatural, which hinders subsequent recognition by the model. This algorithm instead adopts a block color-picking strategy, operating separately on the R, G, and B channels. For each channel the original pixel value range is 0-255; the color block size is set by a blocksize parameter (typically a factor of 256), and after multiple attempts blocksize = 16 worked best. The pixel value interval is divided into 16 equal parts, each block covering a continuous segment of color values, so the first fill-color selection is reduced from 256 × 256 × 256 colors to 16 × 16 × 16 color blocks, which better matches the colors perceived by the naked eye. Because the background is complex, some individual color may appear more often than the true dominant tone; but since boards of the same color are used, the overall background colors are similar and tend to fall within one color block. Therefore the occurrence count of each color block in the image is computed, and the block with the most occurrences is the block kmax needed for filling. The specific fill color is then obtained by continuing the block strategy inside block kmax, repeatedly subdividing the most frequent block until the blocksize reaches 1; when blocksize is 1, the color value represented by the most frequent block is used as the fill value for that channel. This block color-picking strategy quickly determines the background color interval, reduces the influence of background interferents, makes the finally selected color match the overall trend, and generates a more reasonable image.
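A sketch of the coarse-to-fine block color-picking and highlight filling; the helper names are hypothetical, while the 220 threshold and blocksize of 16 follow the description above:

import numpy as np

def dominant_fill_value(channel: np.ndarray) -> int:
    """Pick a fill value for one channel with the coarse-to-fine block
    strategy described above: 16 blocks of 16 values, then single values."""
    values = channel.ravel()
    # Coarse pass: 16 blocks of 16 pixel values each.
    hist, edges = np.histogram(values, bins=16, range=(0, 256))
    k = int(np.argmax(hist))
    lo, hi = int(edges[k]), int(edges[k + 1])
    # Fine pass: individual values inside the winning block.
    sub = values[(values >= lo) & (values < hi)]
    hist2, _ = np.histogram(sub, bins=hi - lo, range=(lo, hi))
    return lo + int(np.argmax(hist2))

def fill_highlights(img_bgr: np.ndarray, gray: np.ndarray) -> np.ndarray:
    """Replace highlight pixels (> 220 in the gray image) channel by channel."""
    out = img_bgr.copy()
    mask = gray > 220
    for c in range(3):
        out[..., c][mask] = dominant_fill_value(img_bgr[..., c][~mask])
    return out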
The corresponding paths of the pictures in the cloud server are stored in a database.
The highlight-removed picture is transmitted to the Django framework over the HTTP protocol, and the Django framework calls the model to identify it. The image is compressed proportionally to meet the size requirement of the model input, and the blank part is padded with white or handled by an SPP algorithm.
Step 3, the detection model:
Recognition and detection are performed with the YOLO-SFD convolutional network. Because there are many impurities on the sticky board images, different insect bodies may adhere to one another, and the imperfect specificity of the sex attractant brings in similar non-target pests, this scheme proposes an image detection model, YOLO-SFD, built on YOLOX, to improve the detection precision of Spodoptera frugiperda and reduce false detection of similar non-target pests. The model consists of a feature extraction module and a decoupled head module. After the preprocessed image passes through CSPDarknet to extract multi-scale features, a new attention module, the Channel and Space Transformer (CST), formed by fusing CBAM with a Transformer, enhances the key information in the input feature map. To preserve global features, the CST uses the feature fusion of a Transformer: the multi-scale output of CSPDarknet is first input to the CST and passes successively through a channel attention mechanism and a spatial attention mechanism to produce an enhanced feature map, which is then fused with the input through the Transformer's Add & Norm structure to generate the output feature. The CST thus amplifies the key information of the channel and spatial layers while retaining all the information of the input features.
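The patent does not publish the YOLO-SFD code; the PyTorch sketch below only illustrates the pattern described, i.e., CBAM-style channel and spatial attention followed by a Transformer-style Add & Norm residual fusion, with all layer sizes assumed:

import torch
import torch.nn as nn

class CSTSketch(nn.Module):
    """Illustrative stand-in for the CST module: channel attention, then
    spatial attention, then a residual Add & Norm fusion with the input."""
    def __init__(self, channels: int):
        super().__init__()
        self.channel_fc = nn.Sequential(
            nn.Linear(channels, channels // 4), nn.ReLU(),
            nn.Linear(channels // 4, channels), nn.Sigmoid(),
        )
        self.spatial_conv = nn.Conv2d(2, 1, kernel_size=7, padding=3)
        self.norm = nn.LayerNorm(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        # Channel attention: weight each channel by its global average response.
        ca = self.channel_fc(x.mean(dim=(2, 3))).view(b, c, 1, 1)
        y = x * ca
        # Spatial attention: weight each location from pooled channel statistics.
        pooled = torch.cat([y.mean(1, keepdim=True), y.amax(1, keepdim=True)], 1)
        y = y * torch.sigmoid(self.spatial_conv(pooled))
        # Add & Norm: residual fusion that retains the input feature map.
        return self.norm((x + y).permute(0, 2, 3, 1)).permute(0, 3, 1, 2)

feat = torch.randn(1, 64, 32, 32)
print(CSTSketch(64)(feat).shape)  # torch.Size([1, 64, 32, 32])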
Step 4, the final recognition result of YOLO-SFD is the position information of each target in the picture, in the format (abscissa of the target in the image, ordinate of the target in the image, width of the target, height of the target). The algorithm server of the cloud server returns the position information and the corresponding target information to the Web server.
Step 5, a result graph is drawn on the original picture according to the information returned by the algorithm server: the target positions are framed in the original picture, different targets are marked with different colors, the counts of the different targets are displayed in the upper-right corner of the picture, and a compressed version of the result graph is generated. The paths of the result graph and its compressed version on the server are stored in the database, and the state of the board is judged by a new-versus-old board verification algorithm. The captured picture thumbnails and the corresponding result thumbnails are loaded into the interface in parallel for display.
Here, judging the state of the board means deciding, from the previous and current pictures, whether the current board is a new board relative to the previous one. Specifically, the judgment is made from the position information in the result graphs of the two pictures. The position information of both pictures is known; every 4 numbers describe one target, in the format: abscissa of the target in the image, ordinate of the target in the image, width of the target, height of the target. Targets in the two pictures are compared one by one to decide whether they coincide and whether they are the same pest. For each qualifying pair, the IOU of the overlap is computed, and a pair with IOU greater than 0.5 is recorded as a matching target. After all targets are processed, the number of matching targets is counted: if it exceeds 2/3 of the target count in the previous image, the board is judged to be the same old board, and during data analysis only the non-overlapping targets in the new image are counted; otherwise the board is a new board, and all targets are counted as newly added during data analysis.
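A sketch of this matching check, assuming boxes are given as (x, y, w, h) with (x, y) the top-left corner (the patent does not state the corner convention):

def iou(a, b):
    """IOU of two boxes given as (x, y, w, h), with (x, y) the top-left corner."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    ix = max(0, min(ax + aw, bx + bw) - max(ax, bx))
    iy = max(0, min(ay + ah, by + bh) - max(ay, by))
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union else 0.0

def is_same_board(prev_boxes, new_boxes, iou_thresh=0.5):
    """Judge whether the new image shows the same (old) sticky board:
    matched targets must exceed 2/3 of the previous image's targets."""
    matched = sum(any(iou(p, n) > iou_thresh for n in new_boxes)
                  for p in prev_boxes)
    return matched > len(prev_boxes) * 2 / 3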
The field moth pest monitoring device provided by the embodiment of the application is described in detail below with reference to fig. 2. It should be noted that the field moth pest monitoring device shown in fig. 2 is used to execute the method of the embodiment shown in fig. 1. For convenience of explanation, only the portions relevant to the embodiments of the present application are shown; for technical details not disclosed here, please refer to the embodiment shown in fig. 1 of the present application.
Referring to fig. 2, fig. 2 is a schematic structural diagram of a field moth pest monitoring device according to an embodiment of the present application. As shown in fig. 2, the apparatus includes:
a determining module 201 for determining a target pest species and determining a target trap corresponding to the target pest species;
a first sending module 202, configured to send a first control instruction to a target object to instruct the target object to place the target trapping device in a monitoring area;
and a second sending module 203, configured to determine, each time a first preset period elapses, the target spatial parameters of the target trapping device based on meteorological data, and to send a second control instruction to the target object to instruct the target object to adjust the current spatial parameters of the target trapping device based on the target spatial parameters.
In one embodiment, the determining module 201 is specifically configured to:
acquiring a monitoring target, and determining the target pest species according to the monitoring target;
determining a target pest characteristic corresponding to the target pest species, and determining the target trapping device based on the target pest characteristic.
In one embodiment, the first sending module 202 is specifically configured to:
determining the initial number of the target trapping devices according to the area of the monitoring area, and acquiring initial spatial parameters of the target trapping devices, wherein the spatial parameters include the arrangement mode of the target trapping devices, a first distance between a target trapping device and the field ridge, a second distance between target trapping devices, and the height of the insect-attracting part;
sending a first control instruction to a target object to instruct the target object to place the target trapping devices in the monitoring area according to the initial number and the initial spatial parameters.
In one embodiment, the first sending module 202 is specifically further configured to:
when a non-target trapping device exists in the monitoring area, determining a minimum distance between the target trapping device and the non-target trapping device, and sending a first control instruction and a third control instruction to the target object, wherein the third control instruction instructs the target object to adjust a third distance between any target trapping device and the non-target trapping device so that it is not smaller than the minimum distance.
In one implementation, the second sending module 203 is specifically configured to:
each time a first preset period elapses, determining target spatial parameters of the target trapping device based on meteorological data, and determining a target placement area and a target number of target trapping devices based on monitoring area characteristics;
sending a second control instruction to the target object to instruct the target object, within the target placement area, to adjust the current number and the current spatial parameters of the target trapping devices based on the target number and the target spatial parameters.
In one embodiment, the second sending module 203 is specifically further configured to:
acquiring meteorological data at the current moment, and querying a parameter database corresponding to the target trapping device based on the meteorological data to obtain the target spatial parameters, wherein the parameter database stores mapping relations between meteorological data and spatial parameters.
In one embodiment, the apparatus further comprises:
the third sending module is used for sending a fourth control instruction to the target object every time a second preset time period passes, and controlling the target object to acquire and upload an insect attracting part image of the target trapping device;
The analyzing module is used for analyzing the insect attracting part image, identifying target pests and counting the target pests.
It will be clear to those skilled in the art that the technical solutions of the embodiments of the present application may be implemented by means of software and/or hardware. "Unit" and "module" in this specification refer to software and/or hardware capable of performing a specific function, either alone or in combination with other components, such as Field programmable gate arrays (Field-Programmable Gate Array, FPGAs), integrated circuits (Integrated Circuit, ICs), etc.
The processing units and/or modules of the embodiments of the present application may be implemented by an analog circuit that implements the functions described in the embodiments of the present application, or may be implemented by software that executes the functions described in the embodiments of the present application.
Referring to fig. 3, a schematic structural diagram of an electronic device according to an embodiment of the present application is shown, where the electronic device may be used to implement the method in the embodiment shown in fig. 1. As shown in fig. 3, the electronic device 300 may include: at least one central processor 301, at least one network interface 304, a user interface 303, a memory 305, at least one communication bus 302.
Wherein the communication bus 302 is used to enable connected communication between these components.
The user interface 303 may include a display screen (Display) and a camera (Camera); optionally, the user interface 303 may further include a standard wired interface and a wireless interface.
The network interface 304 may optionally include a standard wired interface, a wireless interface (e.g., WI-FI interface), among others.
The central processor 301 may include one or more processing cores. Using various interfaces and lines to connect the parts of the electronic device 300, the central processor 301 performs the various functions of the device and processes data by running or executing the instructions, programs, code sets, or instruction sets stored in the memory 305 and by invoking data stored in the memory 305. Optionally, the central processor 301 may be implemented in at least one hardware form among digital signal processing (Digital Signal Processing, DSP), field-programmable gate array (Field-Programmable Gate Array, FPGA), and programmable logic array (Programmable Logic Array, PLA). The central processor 301 may integrate one or a combination of a central processing unit (Central Processing Unit, CPU), a graphics processing unit (Graphics Processing Unit, GPU), a modem, and the like. The CPU mainly handles the operating system, user interface, application programs, and so on; the GPU renders and draws the content to be displayed on the display screen; the modem handles wireless communication. It will be appreciated that the modem may also not be integrated into the central processor 301 and may be implemented by a separate chip.
The memory 305 may include a random access memory (Random Access Memory, RAM) or a Read-only memory (Read-only memory). Optionally, the memory 305 includes a non-transitory computer readable medium (non-transitory computer-readable storage medium). Memory 305 may be used to store instructions, programs, code, sets of codes, or sets of instructions. The memory 305 may include a stored program area and a stored data area, wherein the stored program area may store instructions for implementing an operating system, instructions for at least one function (such as a touch function, a sound playing function, an image playing function, etc.), instructions for implementing the above-described respective method embodiments, etc.; the storage data area may store data or the like referred to in the above respective method embodiments. The memory 305 may also optionally be at least one storage device located remotely from the aforementioned central processor 301. As shown in fig. 3, an operating system, a network communication module, a user interface module, and program instructions may be included in the memory 305, which is a type of computer storage medium.
In the electronic device 300 shown in fig. 3, the user interface 303 is mainly used for providing an input interface for a user, and acquiring data input by the user; and central processor 301 may be configured to invoke the field moth pest monitoring application program stored in memory 305 and specifically perform the following operations:
determining a target pest species and determining a target trapping device corresponding to the target pest species;
sending a first control instruction to a target object to instruct the target object to place the target trapping device in a monitoring area;
and, each time a first preset period elapses, determining target spatial parameters of the target trapping device based on meteorological data, and sending a second control instruction to the target object to instruct the target object to adjust the current spatial parameters of the target trapping device based on the target spatial parameters.
The present application also provides a computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of the above method. The computer readable storage medium may include, among other things, any type of disk including floppy disks, optical disks, DVDs, CD-ROMs, micro-drives, and magneto-optical disks, ROM, RAM, EPROM, EEPROM, DRAM, VRAM, flash memory devices, magnetic or optical cards, nanosystems (including molecular memory ICs), or any type of media or device suitable for storing instructions and/or data.
It should be noted that, for simplicity of description, the foregoing method embodiments are all described as a series of acts, but it should be understood by those skilled in the art that the present application is not limited by the order of acts described, as some steps may be performed in other orders or concurrently in accordance with the present application. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all preferred embodiments, and that the acts and modules referred to are not necessarily required for the present application.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to related descriptions of other embodiments.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, such as the division of the units, merely a logical function division, and there may be additional manners of dividing the actual implementation, such as multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be through some service interface, device or unit indirect coupling or communication connection, electrical or otherwise.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable memory. Based on this understanding, the technical solution of the present application may be embodied essentially or partly in the form of a software product, or all or part of the technical solution, which is stored in a memory, and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the method according to the embodiments of the present application. And the aforementioned memory includes: a U-disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a removable hard disk, a magnetic disk, or an optical disk, or other various media capable of storing program codes.
Those of ordinary skill in the art will appreciate that all or a portion of the steps in the various methods of the above embodiments may be performed by hardware associated with a program that is stored in a computer readable memory, which may include: flash disk, read-Only Memory (ROM), random-access Memory (Random Access Memory, RAM), magnetic or optical disk, and the like.
The foregoing is merely exemplary embodiments of the present disclosure and is not intended to limit the scope of the present disclosure. That is, equivalent changes and modifications are contemplated by the teachings of this disclosure, which fall within the scope of the present disclosure. Embodiments of the present disclosure will be readily apparent to those skilled in the art from consideration of the specification and practice of the disclosure herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a scope and spirit of the disclosure being indicated by the claims.

Claims (10)

1. A method for monitoring field moth pests, the method comprising:
determining a target pest species and determining a target trap corresponding to the target pest species;
sending a first control instruction to a target object to instruct the target object to place the target trap in a monitoring area;
and determining a target space parameter of the target trapping device based on meteorological data every time a first preset time period passes, and sending a second control instruction to the target object to instruct the target object to adjust the current space parameter of the target trapping device based on the target space parameter.
2. The method of claim 1, wherein the determining a target pest species and determining a target trapping device corresponding to the target pest species comprises:
acquiring a monitoring target, and determining the target pest species according to the monitoring target;
and determining a target pest characteristic corresponding to the target pest species, and determining the target trapping device based on the target pest characteristic.
3. The method of claim 1, wherein the sending a first control instruction to a target object to instruct the target object to place the target trapping device in a monitoring area comprises:
determining an initial number of target trapping devices according to the area of the monitoring area, and acquiring initial spatial parameters of the target trapping devices, wherein the spatial parameters comprise an arrangement mode of the target trapping devices, a first distance between the target trapping devices and a field ridge, a second distance between adjacent target trapping devices, and a height of an insect-attracting part;
and sending the first control instruction to the target object to instruct the target object to place the target trapping devices in the monitoring area according to the initial number and the initial spatial parameters.
4. The method of claim 1, wherein the sending the first control instruction to the target object comprises:
when a non-target trapping device exists in the monitoring area, determining a minimum distance between the target trapping device and the non-target trapping device, and sending the first control instruction and a third control instruction to the target object, wherein the third control instruction is used for instructing the target object to adjust a third distance between any target trapping device and the non-target trapping device to be not smaller than the minimum distance.
5. The method of claim 1, wherein the determining, every time a first preset time period elapses, a target spatial parameter of the target trapping device based on meteorological data, and sending a second control instruction to the target object to instruct the target object to adjust the current spatial parameter of the target trapping device based on the target spatial parameter comprises:
every time the first preset time period elapses, determining target spatial parameters of the target trapping device based on meteorological data, and determining a target placement area and a target number of the target trapping devices based on characteristics of the monitoring area;
and sending the second control instruction to the target object to instruct the target object to adjust, within the target placement area, the current number and the current spatial parameters of the target trapping devices based on the target number and the target spatial parameters.
6. The method of claim 5, wherein the determining the target spatial parameters of the target trapping device based on meteorological data comprises:
acquiring meteorological data at the current moment, and querying a parameter database corresponding to the target trapping device based on the meteorological data to obtain the target spatial parameters, wherein the parameter database stores mapping relationships between meteorological data and spatial parameters.
7. The method according to claim 1, wherein the method further comprises:
sending a fourth control instruction to the target object every time a second preset time period elapses, wherein the fourth control instruction is used for controlling the target object to acquire and upload an image of the insect-attracting part of the target trapping device;
and analyzing the insect-attracting-part image to identify and count the target pests.
8. A field moth pest monitoring device comprising:
a determining module for determining a target pest species and determining a target trapping device corresponding to the target pest species;
a first sending module for sending a first control instruction to a target object to instruct the target object to place the target trapping device in a monitoring area;
and a second sending module for determining, every time a first preset time period elapses, a target spatial parameter of the target trapping device based on meteorological data, and sending a second control instruction to the target object to instruct the target object to adjust the current spatial parameter of the target trapping device based on the target spatial parameter.
9. An electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the steps of the method according to any one of claims 1-7.
10. A computer-readable storage medium having stored thereon a computer program comprising instructions which, when run on a computer or processor, cause the computer or processor to perform the steps of the method according to any one of claims 1-7.
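
The claims above define the method in functional terms only. As a non-normative illustration, the following minimal Python sketch shows one way the periodic adjustment loop of claims 1, 5 and 6 could be structured, with the spatial parameters of claim 3 gathered into a record and a toy stand-in for the parameter database of claim 6. All names (SpatialParams, ParameterDB, send_second_instruction, field-unit-01), the wind-speed binning, and the numeric values are hypothetical placeholders, not part of the disclosure:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class SpatialParams:
    arrangement: str         # arrangement mode of the traps, e.g. "grid"
    ridge_distance_m: float  # first distance: trap to field ridge
    trap_spacing_m: float    # second distance: between adjacent traps
    lure_height_m: float     # height of the insect-attracting part


class ParameterDB:
    """Toy stand-in for the parameter database of claim 6: it stores
    mappings from (binned) meteorological data to spatial parameters."""

    def __init__(self) -> None:
        # Values are illustrative placeholders, not agronomic guidance.
        self._table = {
            ("calm", False): SpatialParams("grid", 1.0, 20.0, 1.2),
            ("windy", False): SpatialParams("grid", 1.0, 15.0, 0.8),
            ("calm", True): SpatialParams("perimeter", 2.0, 20.0, 1.0),
            ("windy", True): SpatialParams("perimeter", 2.0, 15.0, 0.8),
        }

    def query(self, wind_speed_ms: float, raining: bool) -> SpatialParams:
        # Bin the current meteorological data, then look up the mapping.
        wind_bin = "windy" if wind_speed_ms >= 5.0 else "calm"
        return self._table[(wind_bin, raining)]


def send_second_instruction(target_object: str, params: SpatialParams) -> None:
    # A real system would dispatch this to a field robot or operator terminal.
    print(f"[to {target_object}] adjust traps -> {params}")


def adjustment_tick(db: ParameterDB, wind_speed_ms: float, raining: bool) -> None:
    """One iteration of the loop that claim 1 runs every first preset period."""
    send_second_instruction("field-unit-01", db.query(wind_speed_ms, raining))


if __name__ == "__main__":
    adjustment_tick(ParameterDB(), wind_speed_ms=6.2, raining=False)
```

The table lookup mirrors the claim-6 wording ("mapping relationships between meteorological data and spatial parameters"); a production system might instead interpolate or learn these mappings, which the claims leave open.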
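Similarly, claim 7 requires only that the uploaded insect-attracting-part image be analyzed so that target pests are identified and counted; it does not prescribe a recognition model. A minimal counting sketch follows, with a fabricated detector standing in for any image-recognition backend (fake_detector, the example species names, and the confidence threshold are all hypothetical):

```python
from collections import Counter
from typing import NamedTuple


class Detection(NamedTuple):
    species: str
    confidence: float


def fake_detector(image_bytes: bytes) -> list[Detection]:
    """Hypothetical placeholder for whatever recognition model is applied
    to the uploaded insect-attracting-part image."""
    return [
        Detection("Chilo suppressalis", 0.91),
        Detection("Chilo suppressalis", 0.84),
        Detection("Cnaphalocrocis medinalis", 0.77),
    ]


def count_target_pests(image_bytes: bytes, target_species: set[str],
                       min_confidence: float = 0.5) -> Counter:
    """Identify target pests in a trap image and tally them per species."""
    return Counter(
        d.species
        for d in fake_detector(image_bytes)
        if d.species in target_species and d.confidence >= min_confidence
    )


if __name__ == "__main__":
    print(count_target_pests(b"<image bytes>", {"Chilo suppressalis"}))
    # -> Counter({'Chilo suppressalis': 2})
```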
CN202310977024.1A 2023-08-04 2023-08-04 Method and device for monitoring field moth pests, electronic equipment and storage medium Pending CN117036677A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310977024.1A CN117036677A (en) 2023-08-04 2023-08-04 Method and device for monitoring field moth pests, electronic equipment and storage medium


Publications (1)

Publication Number Publication Date
CN117036677A 2023-11-10

Family

ID=88625593

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310977024.1A Pending CN117036677A (en) 2023-08-04 2023-08-04 Method and device for monitoring field moth pests, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117036677A (en)

Similar Documents

Publication Publication Date Title
US10977494B2 (en) Recognition of weed in a natural environment
US11048940B2 (en) Recognition of weed in a natural environment
CN108073908B (en) Pest identification method and device, computer device and storage medium
JP2019187259A (en) Culture support method, culture support program, culture support device and culture support system
JP2013111078A (en) Method and system for discriminating plant disease, and recording medium for the same
CN107463958A (en) Insect identifies method for early warning and system
CN107742290A (en) Plant disease identifies method for early warning and device
CN111460990A (en) Big data-based alpine pastoral area grassland insect pest monitoring and early warning system and method
CN111753646A (en) Agricultural pest detection and classification method fusing population season growth and elimination information
CN110991222B (en) Object state monitoring and sow oestrus monitoring method, device and system
CN108829762A (en) The Small object recognition methods of view-based access control model and device
CN115661650A (en) Farm management system based on data monitoring of Internet of things
JPWO2020137085A1 (en) Information processing equipment and information processing system
CN115294518B (en) Intelligent monitoring method and system for precise greenhouse cultivation of horticultural plants
CN112465109A (en) Green house controlling means based on cloud limit is in coordination
CN112686862A (en) Pest identification and counting method, system and device and readable storage medium
CN112580671A (en) Automatic detection method and system for multiple development stages of rice ears based on deep learning
CN108874910A (en) The Small object identifying system of view-based access control model
CN113377062B (en) Multifunctional early warning system with disease and pest damage and drought monitoring functions
CN107064159B (en) Device and system for detecting and judging growth trend according to yellow leaves of plants
CN113377141A (en) Artificial intelligence agricultural automatic management system
CN117036677A (en) Method and device for monitoring field moth pests, electronic equipment and storage medium
CN113807143A (en) Crop connected domain identification method and device and operation system
CN115424151A (en) Agricultural intelligent platform based on image processing
CN114651283A (en) Seedling emergence by search function

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination