CN117290762A - Insect pest drop-in identification method, type identification method, device, insect trap and system


Info

Publication number
CN117290762A
Authority
CN
China
Prior art keywords
pest
insect
infrared data
trap
waveform
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202311316016.9A
Other languages
Chinese (zh)
Other versions
CN117290762B (en)
Inventor
马传阳 (Ma Chuanyang)
周慧玲 (Zhou Huiling)
彭德刚 (Peng Degang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jiuxing Xinda Technology Co., Ltd.
Beijing University of Posts and Telecommunications
Original Assignee
Beijing Jiuxing Xinda Technology Co., Ltd.
Beijing University of Posts and Telecommunications
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jiuxing Xinda Technology Co., Ltd. and Beijing University of Posts and Telecommunications
Priority to CN202311316016.9A (granted as CN117290762B)
Priority claimed by application CN202410281131.5A (published as CN118077660A)
Publication of CN117290762A
Application granted; publication of CN117290762B
Legal status: Active; anticipated expiration tracked

Classifications

    • A - HUMAN NECESSITIES
    • A01 - AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M - CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M1/00 - Stationary means for catching or killing insects
    • A01M1/02 - Stationary means for catching or killing insects with devices or substances, e.g. food, pheromones, attracting the insects
    • A01M1/026 - Stationary means for catching or killing insects with devices or substances, e.g. food, pheromones, attracting the insects, combined with devices for monitoring insect presence, e.g. termites
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50 - Constructional details
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/66 - Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N23/661 - Transmitting camera control signals through networks, e.g. control via the Internet
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 - Television systems
    • H04N7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Pest Control & Pesticides (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Insects & Arthropods (AREA)
  • Wood Science & Technology (AREA)
  • Zoology (AREA)
  • Environmental Sciences (AREA)
  • Catching Or Destruction (AREA)

Abstract

The application provides a pest drop-in identification method, a pest type identification method, a device, an insect trap and a system, and relates to the technical field of machine learning. The pest drop-in identification method comprises: receiving a two-channel discrete time series acquired from within the insect trap; obtaining, based on a time series classification model, the waveform identification type corresponding to the two-channel discrete time series; and, if the waveform identification type is a pest detection waveform, confirming that a pest has currently fallen into the insect trap and controlling the trap to acquire images of the pest that has fallen inside. The method can effectively identify whether an object falling into the insect trap is a pest, improving the accuracy and reliability of pest drop-in identification, and triggers timely and effective image acquisition only when a pest is confirmed. This effectively reduces the invalid acquisitions and frequent start-ups of the image acquisition device in the insect trap, which in turn lowers the trap's energy consumption and keeps heat from accumulating.

Description

Insect pest drop-in identification method, type identification method, device, insect trap and system
Technical Field
The present application relates to the technical field of machine learning, and in particular to a pest drop-in identification method, a pest type identification method, a device, an insect trap and a system.
Background
Stored-grain pests are among the major threats to the safe storage of grain: they cause losses in both grain quantity and quality and can create food-safety problems. At present, sampling and sieving are the methods commonly used to assess pest occurrence, but they require substantial labor, are cumbersome to operate, and their results lag behind the actual infestation. In particular, an accurate assessment requires the number of samples and the amount per sample to grow with the insect population density, which is difficult to achieve in practical management. For this reason, insect traps such as electronic probe traps placed in grain piles have become the trend for automatically monitoring grain-pile pests.
At present, an insect trap such as an electronic probe trap in a grain pile is usually fitted with an image acquisition device that captures images of the trap interior in real time or at fixed intervals and sends them to other processing equipment, which stores the images and identifies the type of pest present in the area where the trap is located, so that targeted regional pest treatment can be carried out.
However, with existing insect traps, if the image acquisition interval is too long, pest infestations in the area may not be treated in time. If the interval is too short, frequently starting the image acquisition device accumulates considerable heat, which increases its energy consumption, accelerates pest reproduction and growth, poses a significant fire hazard, and produces many useless images containing no pests, thereby reducing the efficiency and effectiveness of regional pest treatment.
Based on this, there is a need for a design that effectively avoids invalid image acquisitions and frequent start-ups of image acquisition inside the insect trap.
Disclosure of Invention
In view of this, embodiments of the present application provide pest drop-in identification methods, type identification methods, devices, traps, and systems to obviate or ameliorate one or more of the disadvantages of the prior art.
One aspect of the present application provides a pest drop-in identification method, comprising:
receiving target infrared data, acquired from within the insect trap, that contains a two-channel discrete time series;
and obtaining, based on a preset time series classification model, the waveform identification type corresponding to the two-channel discrete time series in the target infrared data, and, if the waveform identification type is a pest detection waveform, confirming that a pest has currently fallen into the insect trap and controlling the trap to acquire images of the pest that has fallen inside.
In some embodiments of the present application, obtaining, based on a preset time series classification model, the waveform identification type corresponding to the two-channel discrete time series in the target infrared data comprises:
extracting global features from the two-channel discrete time series in the target infrared data to obtain the global features of the target infrared data;
inputting the two-channel discrete time series and the global features into the preset time series classification model, so that the model extracts the local features corresponding to the two-channel discrete time series and fuses them with the global features to obtain waveform-type identification result data for the target infrared data, where the result data comprises the probabilities of the different waveform identification types, including a pest detection waveform and at least one non-pest detection waveform.
The benefit of this technical solution is that extracting global features from the two-channel discrete time series, extracting local features with the time series classification model, and fusing each local feature with the global features effectively improve the accuracy and validity of the waveform-type identification result data for the target infrared data.
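For orientation, the following is a minimal sketch, in PyTorch, of how such a model could fuse learned local (shapelet) features with precomputed global features. The layer sizes, shapelet count and helper names are illustrative assumptions, not details disclosed by the patent; the class count of four matches the pest detection waveform plus the three non-pest waveforms listed later.

    import torch
    import torch.nn as nn

    class TimeSeriesClassifier(nn.Module):
        """Illustrative fusion of learned shapelet features and global features."""
        def __init__(self, n_shapelets=32, shapelet_len=16, n_global=6, n_classes=4):
            super().__init__()
            # One bank of learnable shapelets, shared by both infrared channels.
            self.shapelets = nn.Parameter(torch.randn(n_shapelets, shapelet_len))
            self.fc = nn.Linear(2 * n_shapelets + n_global, n_classes)

        def min_dist(self, x):  # x: (batch, series_length)
            # Sliding windows of shapelet length, mean squared distance to every
            # shapelet, then the minimum over window positions (shapelet transform).
            w = x.unfold(1, self.shapelets.shape[1], 1)               # (B, W, L)
            d = ((w.unsqueeze(2) - self.shapelets) ** 2).mean(dim=3)  # (B, W, K)
            return d.min(dim=1).values                                # (B, K)

        def forward(self, ch1, ch2, global_feats):
            local = torch.cat([self.min_dist(ch1), self.min_dist(ch2)], dim=1)
            fused = torch.cat([torch.relu(local), global_feats], dim=1)
            return torch.softmax(self.fc(fused), dim=1)  # per-type probabilities

With 128-point channels, calling the model on the two channels and the global features returns one probability per waveform identification type, from which the pest detection waveform can be recognized.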
In some embodiments of the present application, extracting global features from the two-channel discrete time series in the target infrared data to obtain the global features of the target infrared data comprises:
translating each waveform of the two-channel discrete time series in the target infrared data downward so that its minimum value is 0, to obtain the two preprocessed time series corresponding to the target infrared data;
selecting valid sampling points from each of the two preprocessed time series based on a preset validity threshold, to form the reaction region of each preprocessed time series;
and determining the global features of the target infrared data from the reaction regions of the two preprocessed time series.
The benefit of this technical solution is that preprocessing the two-channel discrete time series and generating the reaction regions effectively improves the quality of the global feature extraction, which in turn further improves the accuracy and validity of the waveform-type identification result data for the infrared data.
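A minimal sketch of this preprocessing, under our own reading of the description; the threshold value and the choice of statistics are assumptions, since the patent names neither:

    import numpy as np

    def preprocess_channel(seq, valid_threshold=0.1):
        shifted = np.asarray(seq, dtype=float)
        shifted = shifted - shifted.min()    # waveform translated down, minimum = 0
        mask = shifted > valid_threshold     # valid sampling points
        return shifted, mask

    def global_features(ch1, ch2, valid_threshold=0.1):
        feats = []
        for seq in (ch1, ch2):
            shifted, mask = preprocess_channel(seq, valid_threshold)
            region = shifted[mask]           # the channel's reaction region
            # Plausible global statistics of the reaction region: its length,
            # its peak value and its mean value (assumed, not specified).
            feats += [mask.sum(),
                      region.max() if region.size else 0.0,
                      region.mean() if region.size else 0.0]
        return np.array(feats, dtype=float)

This yields six global features per data item (three per channel), matching the n_global=6 assumed in the earlier model sketch.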
In some embodiments of the present application, before obtaining, based on the preset time series classification model, the waveform identification type corresponding to the two-channel discrete time series in the target infrared data, the method further comprises:
acquiring historical infrared data items, each containing a different two-channel discrete time series, together with their corresponding waveform identification type labels, where the labels cover a pest detection waveform and at least one non-pest detection waveform;
extracting global features from the two-channel discrete time series in each historical infrared data item to obtain the global features corresponding to each item;
and training a preset feature extraction and classification network with a learnable time series classification algorithm, based on each historical infrared data item, its corresponding global features and its waveform identification type label, until the network becomes a time series classification model that identifies the waveform identification type of infrared data.
The benefit of this technical solution is that forming the time series classification model by training the feature extraction and classification network further improves the validity and accuracy of the waveform identification performed with the model.
In some embodiments of the present application, extracting global features from the two-channel discrete time series in each historical infrared data item to obtain the global features corresponding to each item comprises:
translating each waveform of the two-channel discrete time series in each historical infrared data item downward so that its minimum value is 0, to obtain the two preprocessed time series corresponding to that item;
selecting valid sampling points from each of the two preprocessed time series of each item, based on a preset validity threshold, to form the reaction regions of those series;
obtaining at least one sub-reaction region of each preprocessed time series and selecting the one with the largest number of valid sampling points as the maximum-length reaction region of that series;
and determining the global features of each historical infrared data item from the maximum-length reaction regions of its two preprocessed time series.
The benefit of this technical solution is that preprocessing the training data and generating the reaction regions effectively improve the reliability and effectiveness of the process of training the feature extraction and classification network.
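The selection of the maximum-length reaction region can be sketched as follows; splitting the valid-point mask into contiguous runs is our assumed reading of "sub-reaction regions":

    import numpy as np

    def max_length_subregion(mask):
        """Return the (start, end) slice of the longest run of valid points."""
        runs, start = [], None
        for i, valid in enumerate(mask):
            if valid and start is None:
                start = i                          # a sub-reaction region begins
            elif not valid and start is not None:
                runs.append((start, i)); start = None
        if start is not None:
            runs.append((start, len(mask)))
        return max(runs, key=lambda r: r[1] - r[0], default=None)

    # Example: the longest run in this mask spans indices 4..6.
    print(max_length_subregion(np.array([0, 1, 1, 0, 1, 1, 1, 0], dtype=bool)))  # (4, 7)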
In some embodiments of the present application, the feature extraction and classification network comprises:
an input layer for receiving the two-channel discrete time series, the corresponding global features and the waveform identification type label of each historical infrared data item;
a local feature extraction layer, comprising several feature extraction blocks, for extracting the local features of each two-channel discrete time series in the historical infrared data based on a learnable time series classification algorithm;
and a feature fusion layer for applying activation function calculation, feature fusion and probability conversion to the global and local features of each historical infrared data item, to obtain the probabilities of the different waveform identification types for that item, and for computing, with a cross-entropy loss function, the loss between those probabilities and the item's labeled waveform identification type.
The benefit of this technical solution is that designing a specific architecture for the feature extraction and classification network around a learnable time series classification algorithm effectively improves the validity and accuracy with which the network identifies local features, which further improves the reliability and effectiveness of its training.
In some embodiments of the present application, before training the preset feature extraction and classification network with the learnable time series classification algorithm based on each historical infrared data item, its corresponding global features and its waveform identification type label, the method further comprises:
extracting from each historical infrared data item, with a sliding-window method, subsequences of the same length as the feature extraction blocks in the local feature extraction layer;
and obtaining the cluster centers of these subsequences with a clustering algorithm and using the cluster centers as the initial parameters of the local feature extraction layer, thereby completing its initialization.
The benefit of this technical solution is that initializing the local feature extraction layer in this way further improves the validity and accuracy with which the feature extraction and classification network identifies local features.
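A compact sketch of this initialization; the use of k-means is our assumption, since the patent only says "a clustering algorithm":

    import numpy as np
    from sklearn.cluster import KMeans

    def init_shapelets(series_list, shapelet_len=16, n_shapelets=32):
        # All sliding-window subsequences with the feature-extraction-block length.
        windows = np.array([s[i:i + shapelet_len]
                            for s in series_list
                            for i in range(len(s) - shapelet_len + 1)])
        # The cluster centres become the layer's initial shapelet parameters.
        km = KMeans(n_clusters=n_shapelets, n_init=10).fit(windows)
        return km.cluster_centers_  # shape (n_shapelets, shapelet_len)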
In some embodiments of the present application, the time series classification model comprises:
an input layer for receiving the two-channel discrete time series and the corresponding global features of the target infrared data;
a local feature extraction layer for extracting the local features of the two-channel discrete time series in the target infrared data based on a learnable time series classification algorithm;
and a feature fusion layer for applying activation function calculation, feature fusion and probability conversion to the global and local features of the target infrared data, to obtain the probabilities of the different waveform identification types.
The benefit of this technical solution is that designing a specific architecture for the time series classification model around the learnable time series classification algorithm effectively improves the validity and accuracy with which the model identifies local features.
In some embodiments of the present application, the non-pest detection waveforms comprise: a booklice-triggered waveform, a pest repeated-trigger waveform and a false-trigger waveform.
The benefit of this technical solution is that distinguishing the pest detection waveform from booklice-triggered, pest repeated-trigger and false-trigger waveforms further improves the accuracy and reliability of the waveform identification results for the two-channel discrete time series in the target infrared data.
A second aspect of the present application provides a pest type identification method, comprising:
when the pest drop-in identification method provided in the first aspect of the present application finds that the waveform identification type is a pest detection waveform, thereby confirming that a pest has currently fallen into the insect trap and controlling the trap to acquire images of the pest inside it, acquiring pest type identification result data for the current insect trap based on a preset pest type identification model.
The benefit of this technical solution is that acquiring the pest type identification result data with a preset pest type identification model increases the automation and efficiency of pest type identification on top of the confirmed pest drop-in, and effectively improves the timeliness and precision with which a user can treat pests in the area where the insect trap is located based on the identification results.
In some embodiments of the present application, acquiring the pest type identification result data in the insect trap based on the preset pest type identification model comprises:
receiving the live-action image data, collected by the insect trap, of the pest that has fallen inside it;
and inputting the live-action image data into a preset pest type identification model so that it outputs the corresponding pest type identification result data, where the model comprises a first convolutional neural network.
The benefit of this technical solution is that automatically identifying the pest type from the live-action image data effectively improves the accuracy and efficiency of pest type identification.
In some embodiments of the present application, acquiring the pest type identification result data in the insect trap based on the preset pest type identification model comprises:
converting the two-channel discrete time series in the target infrared data into a two-dimensional recurrence image based on a preset recurrence plot algorithm;
and inputting the two-dimensional recurrence image into a preset pest type identification model so that it outputs the corresponding pest type identification result data, where the model comprises a second convolutional neural network.
The benefit of this technical solution is that identifying the pest type from the two-dimensional recurrence image converted from the waveform effectively improves the convenience and efficiency of pest type identification.
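As an illustration, a textbook recurrence plot, which is our assumed reading of the patent's recurrence plot algorithm; the threshold value eps is an illustrative assumption:

    import numpy as np

    def recurrence_plot(seq, eps=0.1):
        """Pixel (i, j) is set when samples i and j are within eps of each other."""
        x = np.asarray(seq, dtype=float)
        dist = np.abs(x[:, None] - x[None, :])   # pairwise sample distances
        return (dist <= eps).astype(np.uint8)    # binary 2-D recurrence image

    # A 128-point channel yields a 128x128 image; the two channels could be
    # converted separately and stacked before being fed to the CNN.
    image = recurrence_plot(np.random.rand(128))
    print(image.shape)  # (128, 128)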
In some embodiments of the present application, the pest type identification method further includes:
when the pest type identification model comprises both a first convolutional neural network, which identifies the pest type from the live-action image data of pests that have fallen into the insect trap, and a second convolutional neural network, which identifies the pest type from the two-dimensional recurrence image converted from the two-channel discrete time series in the target infrared data, iteratively optimizing the second convolutional neural network based on the pest type identification result data that the first convolutional neural network outputs for the live-action image data.
The benefit of this technical solution is that iteratively optimizing the second convolutional neural network with the pest type identification results obtained from the live-action image data effectively improves the accuracy and validity of pest type identification from the converted two-dimensional recurrence images.
In some embodiments of the present application, the second convolutional neural network is an Inception v3 network.
The benefit of this technical solution is that using an Inception v3 network further improves the accuracy of pest type identification from the converted two-dimensional recurrence images.
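For orientation, a plausible instantiation with torchvision's standard Inception v3 (our assumption; the patent does not disclose its code), with eight output classes matching the pest identification types listed in the next embodiment:

    import torch
    from torchvision.models import inception_v3

    model = inception_v3(weights=None, num_classes=8)
    model.eval()  # in eval mode the auxiliary classifier output is dropped

    with torch.no_grad():
        # Inception v3 expects 3x299x299 inputs, so a 128x128 recurrence image
        # would be resized and replicated across 3 channels beforehand.
        dummy = torch.randn(1, 3, 299, 299)
        probabilities = torch.softmax(model(dummy), dim=1)
    print(probabilities.shape)  # torch.Size([1, 8])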
In some embodiments of the present application, the pest type identification result data comprises the probability of each of the different pest identification types;
wherein the pest identification types include: the flat grain beetle, the Turkish grain beetle, the saw-toothed grain beetle, the red flour beetle, the confused flour beetle, the lesser grain borer, the rice weevil and the maize weevil.
The benefit of this technical solution is that covering these types effectively improves the applicability and comprehensiveness of pest type identification, which in turn improves the timeliness and precision with which a user can treat pests in the area where the insect trap is located based on the identification results.
A third aspect of the present application provides a pest drop-in identification device comprising:
an infrared data receiving module for receiving target infrared data, acquired from within the insect trap, that contains a two-channel discrete time series;
and a pest drop-in identification and image acquisition control module for obtaining, based on a preset time series classification model, the waveform identification type corresponding to the two-channel discrete time series in the target infrared data, and, if the waveform identification type is a pest detection waveform, confirming that a pest has currently fallen into the insect trap and controlling the trap to acquire images of the pest inside it.
A fourth aspect of the present application provides an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, where the processor, when executing the computer program, implements the pest drop-in identification method of the first aspect or the pest type identification method of the second aspect.
A fifth aspect of the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the pest drop-in identification method of the first aspect or the pest type identification method of the second aspect.
A sixth aspect of the present application provides an insect trap configured to collect, from its interior, target infrared data containing a two-channel discrete time series and to send it to a cloud server, so that the cloud server executes the pest drop-in identification method of the first aspect or the pest type identification method of the second aspect;
the insect trap is further configured to acquire images of the pests that have fallen into it when or after receiving an image acquisition control instruction from the cloud server.
The benefit of this technical solution is that it effectively identifies whether an object falling into the insect trap is a pest, improves the accuracy and reliability of pest drop-in identification, and performs timely and effective image acquisition only once a pest is confirmed, thereby reducing the invalid acquisitions and frequent start-ups of the image acquisition device in the trap, lowering the trap's energy consumption and keeping heat from accumulating.
In some embodiments of the present application, the insect trap comprises an insect-trap body that includes a control module, a trapping module and a monitoring module;
the trapping module comprises a first tube and an insect-collecting funnel arranged coaxially inside it; the funnel's end in a first direction is its first end and the other its second end, its diameter narrows gradually from the first end to the second end, and an insect-passing hole is formed at the second end;
the monitoring module comprises a second tube, coaxial with and connected to the first tube, and an infrared data acquisition assembly and an insect-shooting assembly both mounted inside the second tube and both communicatively connected to the control module; the infrared data acquisition assembly collects, on instruction from the control module, the target infrared data containing the two-channel discrete time series inside the trap, and the insect-shooting assembly acquires, on instruction from the control module, images of the pests that have fallen into the trap.
The benefit of this technical solution is that, in use, the insect trap is placed in a grain pile with the first tube above the second tube. Insects in the pile enter the first tube and the insect-collecting funnel; the light emitted by the first infrared transmitting tube and receivable by the first infrared receiving tube covers the insect-passing hole in the first direction, so an insect dropping out of the hole passes through that light range. The control module receives the signal collected by the infrared data acquisition assembly and controls the insect-shooting assembly to capture images of the insect, so the camera can be started only when an object drops through the insect-passing hole instead of being kept permanently on. This reduces energy consumption, makes heat less likely to accumulate, weakens the conditions favourable to pest growth and reproduction compared with existing insect traps, and lowers the probability of safety accidents.
In some embodiments of the present application, the infrared data acquisition assembly comprises a first infrared transmitting tube and a first infrared receiving tube; the insect-passing hole is arranged near the assembly, the projection of the hole onto a first plane lies within the range in which the first infrared receiving tube can receive the light emitted by the first infrared transmitting tube, the first plane is perpendicular to the first direction, the axes of the first infrared transmitting tube and the first infrared receiving tube both lie in that plane, and the first direction is the axial direction of the first tube;
the infrared data acquisition assembly further comprises a second infrared receiving tube and a second infrared transmitting tube arranged coaxially, with their common axis also lying in the first plane; the first infrared transmitting tube and the first infrared receiving tube are likewise coaxial. Taking the axis of the first infrared receiving tube as a first axis and the axis of the second infrared receiving tube as a second axis, the intersection of the two axes lies at the center of the projection of the insect-passing hole onto the first plane, and that projection lies within the range in which the second infrared receiving tube can receive the light emitted by the second infrared transmitting tube.
The benefit of this technical solution is that increasing the number of infrared receiving and transmitting tubes raises the infrared monitoring sensitivity, so that the control module receives a timely and accurate signal when an object drops from the insect-passing hole. Placing the intersection of the first and second axes at the center of the hole's projection onto the first plane keeps the projection centered within both receivable light ranges, so the projection cannot drift to one side and partly fall outside a light range; this avoids the loss of monitoring sensitivity that would occur if a falling object could pass outside the covered range.
In some embodiments of the present application, the first axis is perpendicular to the second axis.
The benefit of this technical solution is that the overlap between the light range the first infrared receiving tube can receive from the first infrared transmitting tube and the range the second infrared receiving tube can receive from the second infrared transmitting tube is large, so an object dropping from the insect-passing hole is unlikely to fall outside the covered range, further improving the infrared monitoring sensitivity.
In some embodiments of the present application, the distance between the first infrared transmitting tube and the projection of the insect-passing hole onto the first plane is greater than the distance between the first infrared receiving tube and that projection, and the distance between the second infrared transmitting tube and the projection is greater than the distance between the second infrared receiving tube and the projection.
The benefit of this technical solution is that, because the light emitted by an infrared transmitting tube spreads in a cone, its coverage grows with distance from the tube; keeping the projection of the insect-passing hole relatively far from the transmitting tubes therefore makes it easy for the emitted light to cover the projection. Conversely, the greater the distance between a receiving tube and a transmitting tube, the smaller the region in which the emitted light can reach the receiving tube directly, so placing the receiving tubes closer to the projection of the hole makes the projection more likely to fall within the receivable light range.
In some embodiments of the present application, the insect-shooting assembly comprises a camera and a shooting barrel. A shooting chamber is formed in the barrel, with a chamber inlet at one end of the barrel in the first direction and a chamber outlet at the other end; the insect-passing hole communicates with the chamber inlet, and the infrared data acquisition assembly is located between the insect-collecting funnel and the chamber inlet in the first direction. The camera, fixed to the inner wall of the chamber near its inlet and communicatively connected to the control module, shoots inside the chamber.
The benefit of this technical solution is that when an insect drops out of the insect-passing hole and triggers the infrared data acquisition assembly, the control module receives the assembly's signal and controls the camera to photograph the object while it falls through, and then rests in, the shooting chamber, which gives the camera sufficient shooting time.
In some embodiments of the present application, the diameter of the shooting barrel tapers gradually from the chamber inlet to the chamber outlet.
The benefit of this technical solution is that, because the chamber diameter decreases gradually from inlet to outlet, an object contacts the side wall of the chamber after entering it and before dropping out of the outlet, so friction slows its fall and gives the camera ample shooting time.
In some embodiments of the present application, the monitoring module further comprises a cleaning assembly mounted in the second tube. The cleaning assembly comprises a housing connected to the shooting barrel, a driving member mounted in the housing, and an insect-carrying platform mounted on the driving member. The driving member, communicatively connected to the control module, drives the platform to reciprocate between a first position, in which the platform mates with the chamber outlet, and a second position, in which a gap is left between the platform and the chamber outlet.
The benefit of this technical solution is that after the camera finishes photographing an object that fell into the shooting chamber, the control module can start the driving member to move the insect-carrying platform from the first position to the second so the object drops out through the chamber outlet; before the next round of shooting, the control module starts the driving member again to move the platform back from the second position to the first. Cleaning is thus automatic, the overall structure remains relatively simple, and labor costs are reduced. The housing is connected to the shooting barrel with screws and nuts.
In some embodiments of the present application, the driving member comprises a driving member body and a first output shaft mounted on it; the driving member is a rotary drive, the first output shaft is arranged coaxially with the second tube, and the cleaning assembly further comprises a threaded rod mounted on the first output shaft that mates with a threaded hole formed in the housing.
The benefit of this technical solution is that when the first output shaft rotates it turns the threaded rod, and because the rod mates with the threaded hole in the housing, its rotation moves the rod, the driving member and the insect-carrying platform along the first direction; reversing the rotation of the first output shaft reverses the rotation of the threaded rod and hence the direction in which the driving member and platform move.
In some embodiments of the present application, the cleaning assembly further comprises a trigger piece and a first and a second limit switch, both switches being mounted on the housing and communicatively connected to the control module. The trigger piece is mounted on the driving member body and moves with the insect-carrying platform, so that it contacts the first limit switch when the platform is in the first position and the second limit switch when the platform is in the second position.
The benefit of this technical solution is that when the platform reaches the first position the trigger piece actuates the first limit switch and the platform's movement toward the shooting chamber stops automatically, and when it reaches the second position the trigger piece actuates the second limit switch and its movement away from the chamber stops automatically.
In some embodiments of the present application, the driving member further comprises a second output shaft coaxial with the first output shaft; the two shafts are located at opposite ends of the driving member body in the first direction, and the insect-carrying platform is fixed to the second output shaft.
The benefit of this technical solution is that the driving member body drives both output shafts simultaneously. After the shooting of an object in the chamber is completed, the control module starts the driving member, which rotates both shafts: this moves the insect-carrying platform from the first position to the second and at the same time spins the platform, so that objects that fell onto it are thrown off by centrifugal force. Once the platform is cleared, the control module reverses the rotation of the shafts to move the platform back from the second position to the first.
In some embodiments of the present application, the insect-carrying platform has a top surface facing the shooting chamber on which a first rib and a second rib are formed; the length directions of both ribs are parallel to the top surface, the two ribs cross perpendicularly, and a convex cone is formed at their intersection.
In some embodiments of the present application, the insect-trap body further comprises a collecting tube arranged coaxially with the first tube and detachably connected to the end of the second tube remote from the first tube; a collecting cavity communicating with the first tube is formed inside the collecting tube.
The benefit of this technical solution is that after the insect trap has been used for a period of time, the collecting tube can be detached from the second tube and the objects in the collecting cavity poured out.
In some embodiments of the application, the insect-trap body further comprises a wiring tube arranged coaxially with the first tube; one end of the wiring tube extends to the end of the first tube remote from the second tube, the other end extends to the insect-collecting funnel, a wire-passing hole is formed in the funnel, and the wiring tube communicates with that hole.
The benefit of this technical solution is that the wiring tube makes it easy to route wires inside the insect-trap body, so the wires are less likely to interfere with the components inside the body or to obstruct the movement of insects and other objects within it.
A seventh aspect of the present application provides a pest drop-in identification system comprising a cloud server and the insect trap provided in the sixth aspect of the present application, communicating with each other;
the cloud server is configured to execute the pest drop-in identification method provided in the first aspect of the present application or the pest type identification method provided in the second aspect of the present application.
The benefit of this technical solution is the same as that of the insect trap of the sixth aspect: it effectively identifies whether an object falling into the trap is a pest, improves the accuracy and reliability of pest drop-in identification, acquires images only once a pest is confirmed, and thereby reduces invalid acquisitions, frequent start-ups, energy consumption and heat accumulation.
In the pest drop-in identification method of the present application, target infrared data containing a two-channel discrete time series is received from within the insect trap; the waveform identification type corresponding to that series is obtained with a preset time series classification model, and if it is a pest detection waveform, it is confirmed that a pest has currently fallen into the trap and the trap is controlled to acquire images of it. The method can thus effectively identify whether an object falling into the trap is a pest, improving the accuracy and reliability of pest drop-in identification, and triggers timely and effective image acquisition only when a pest is confirmed. This reduces the invalid acquisitions and frequent start-ups of the image acquisition device in the trap, lowers the trap's energy consumption, keeps heat from accumulating, weakens the conditions favourable to pest growth and reproduction compared with existing traps, and reduces the probability of safety accidents.
Additional advantages, objects, and features of the application will be set forth in part in the description which follows and in part will become apparent to those having ordinary skill in the art upon examination of the following or may be learned from practice of the application. The objectives and other advantages of the application may be realized and attained by the structure particularly pointed out in the written description and drawings.
It will be appreciated by those skilled in the art that the objects and advantages achievable with the present application are not limited to those detailed above, and that these and other achievable objects will be understood more clearly from the detailed description that follows.
Drawings
The accompanying drawings are included to provide a further understanding of the application, and are incorporated in and constitute a part of this application. The components in the figures are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the application. Corresponding parts in the drawings may be exaggerated, i.e. made larger relative to other parts in an exemplary device actually manufactured according to the present application, for convenience in showing and describing some parts of the present application. In the drawings:
Fig. 1 is a schematic diagram of a first flow chart of the pest drop-in identification method in an embodiment of the present application.
Fig. 2 is a schematic diagram of a second flow chart of the pest drop-in identification method in an embodiment of the present application.
Fig. 3 is a schematic diagram of a third flow chart of the pest drop-in identification method in an embodiment of the present application.
Fig. 4 is a schematic diagram illustrating an example network architecture of a feature extraction and classification network according to an embodiment of the present application.
Fig. 5 is a schematic diagram of a first flow chart of a pest type identification method in an embodiment of the present application.
Fig. 6 is a schematic diagram of a second flow chart of a pest type identification method in an embodiment of the present application.
Fig. 7 is a schematic diagram of a third flow chart of a pest type identification method in an embodiment of the present application.
Fig. 8 is a schematic diagram illustrating the flow of converting raw data into a two-dimensional image using a recurrence plot algorithm in an embodiment of the present application.
Fig. 9 is an exemplary graph of H(Z) as a function of E (0 < E < 0.24) in the examples of the present application.
Fig. 10 is a schematic diagram of a fourth flow chart of a pest type identification method in the embodiment of the present application.
Fig. 11 (a) is a schematic diagram illustrating a first structure of the Inception module in an embodiment of the present application.
Fig. 11 (b) is a schematic diagram illustrating a second structure of the Inception module in an embodiment of the present application.
Fig. 11 (c) is a schematic diagram illustrating a third structure of the Inception module in an embodiment of the present application.
Fig. 12 is a schematic structural view of a pest drop-in identification device in an embodiment of the present application.
Fig. 13 is a schematic structural view of a pest type identification device in an embodiment of the present application.
Fig. 14 is a schematic perspective view of an insect trap according to an embodiment of the present application.
Fig. 15 is a schematic front view of an insect-trap body according to an embodiment of the present application.
Fig. 16 is a schematic cross-sectional view of the structure at A-A in fig. 15.
Fig. 17 is a partially enlarged view at B in fig. 16.
Fig. 18 is a schematic partial perspective view of an insect trap according to an embodiment of the present application.
Fig. 19 is a schematic partial left view of an insect trap according to an embodiment of the present application.
Fig. 20 is a schematic partial top view of an insect trap according to an embodiment of the present application.
Fig. 21 to fig. 23 are schematic partial top views of an infrared data acquisition assembly according to an embodiment of the present application.
Fig. 24 is a schematic partial perspective view of an insect trap according to an embodiment of the present application.
Fig. 25 is a schematic partial left view of an insect trap according to an embodiment of the present application.
Fig. 26 is a schematic cross-sectional view of the structure at C-C in fig. 25.
Fig. 27 is a schematic top view of an insect-carrying platform according to an embodiment of the present application.
Fig. 28 is a schematic diagram of the connection relationships of the pest drop-in identification system according to an embodiment of the present application.
Fig. 29 is a functional block diagram of a main control circuit board provided by an application example of the present application.
Fig. 30 is a diagram showing an example of pest voltage sequences provided in the application example of the present application.
Fig. 31 is a graph showing an example of a voltage sequence of grain detritus provided in the application example of the present application.
Fig. 32 is a schematic diagram of the pest drop-in identification algorithm workflow provided by an application example of the present application.
Fig. 33 is an exemplary diagram of pest identification waveforms provided in an application example of the present application.
Fig. 34 (a) is an exemplary diagram of a booklice-triggered waveform, triggered by booklice, among the non-pest identification waveforms provided in the application example of the present application.
Fig. 34 (b) is an exemplary diagram of a pest repeated-trigger waveform, triggered repeatedly by a pest staying in the detection area, among the non-pest identification waveforms provided in the application example of the present application.
Fig. 34 (c) is an exemplary diagram of a first false-trigger waveform, triggered by two objects falling in succession, among the non-pest identification waveforms provided in the application example of the present application.
Fig. 34 (d) is an exemplary diagram of a second false-trigger waveform, caused by a device malfunction or strong shaking, among the non-pest identification waveforms provided in the application example of the present application.
Fig. 35 is a schematic waveform diagram of retained data in the manual-cleaning example of the pest species identification data set provided in the application example of the present application.
Fig. 36 is a schematic waveform diagram of data to be cleaned in the manual-cleaning example of the pest species identification data set provided in the application example of the present application.
Reference numerals:
100 - insect-trap body;
110 - trapping module;
111 - first tube;
111a - cap section;
111b - trapping section;
111ba - insect-trapping holes;
112 - insect-collecting funnel;
112a - through hole;
112b - insect-passing hole;
112b' - projection of the insect-passing hole;
113 - insect-passing tube;
114 - cable transition plate;
115 - light trap;
120 - monitoring module;
121 - second tube;
122 - shooting barrel;
122a - shooting chamber;
122b - shooting chamber inlet;
122c - shooting chamber outlet;
123 - cleaning assembly;
123a - insect-carrying platform;
123aa - first rib;
123ab - convex cone;
123ac - second rib;
123b - housing;
123ba - wiring hole;
123c - threaded rod;
123d - driving member;
124 - infrared data acquisition assembly;
124a - first infrared receiving tube;
124b - first infrared transmitting tube;
124c - infrared data acquisition circuit board;
124d - second infrared receiving tube;
124e - second infrared transmitting tube;
125 - camera;
130 - collecting tube;
140 - wiring tube;
150 - control module;
200 - cable;
300 - wireless communication device;
400 - cable gland.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the present application more apparent, the present application will be described in further detail with reference to the embodiments and the accompanying drawings. The exemplary embodiments of the present application and their descriptions are used herein to explain the present application, but are not intended to be limiting of the present application.
It should be noted here that, to avoid obscuring the present application with unnecessary detail, only the structures and/or processing steps closely related to the solution of the present application are shown in the drawings, while other details of little relevance to the present application are omitted.
It should be emphasized that the term "comprises/comprising" when used herein is taken to specify the presence of stated features, elements, steps or components, but does not preclude the presence or addition of one or more other features, elements, steps or components.
It is also noted herein that the term "coupled" may refer to not only a direct connection, but also an indirect connection in which an intermediate is present, unless otherwise specified.
Hereinafter, embodiments of the present application will be described with reference to the drawings. In the drawings, the same reference numerals represent the same or similar components, or the same or similar steps.
To provide a design that effectively avoids invalid image acquisitions and frequent start-ups of image acquisition in an insect trap, the embodiments of the present application provide a pest drop-in identification method, a pest type identification method, a pest drop-in identification device for executing the former, a pest type identification device for executing the latter, an electronic device, a computer-readable storage medium, an insect trap and a pest drop-in identification system. These can effectively identify whether an object falling into the insect trap is a pest, improve the accuracy and reliability of pest drop-in identification, and control the trap to acquire images of a pest promptly and effectively only once one is confirmed, thereby reducing the invalid acquisitions and frequent start-ups of the image acquisition device in the trap, lowering its energy consumption and keeping heat from accumulating.
The following examples are provided to illustrate the invention in more detail.
Based on this, an embodiment of the present application provides a pest drop-in identification method that can be executed by a pest drop-in identification device. Referring to fig. 1, the method specifically comprises the following steps:
step 1000: and receiving target infrared data which is currently acquired from the insect trap and contains a double-channel discrete time sequence.
In one or more embodiments of the present application, the target infrared data including the two-channel discrete time sequence acquired from within the insect trap may be acquired by an infrared data acquisition component disposed within the insect trap, and then the acquired two-channel discrete time sequence is transmitted to the pest drop identification device through the communication module by the control module within the insect trap, where the pest drop identification device may be specifically implemented in a cloud server.
It can be understood that the target infrared data may simply be referred to as target raw data. The waveform of the target raw data is composed of two related discrete time sequences, that is, the target infrared data comprises a two-channel discrete time sequence, where each discrete time sequence is a sequence of voltage values sampled over time.
In one example of the present application, the length of each time sequence may be set to 128; the chosen length is related to the ADC sampling frequency. Experiments show that with the sampling rate set to 10 kHz and the time sequence length set to 128, the capture window is 12.8 ms, which is long enough to capture the complete falling process of a pest (usually less than 10 ms); a 128-point sequence also keeps the stored data small while still recording the details of the fall. The total length of the corresponding two-channel discrete time sequence is 256.
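For illustration only, the arithmetic behind these figures can be checked with a short Python sketch (the constant names are assumptions, not from the patent):

SAMPLE_RATE_HZ = 10_000   # assumed ADC sampling rate of 10 kHz
SEQ_LEN = 128             # samples per channel
NUM_CHANNELS = 2

capture_window_ms = SEQ_LEN / SAMPLE_RATE_HZ * 1_000   # duration of one capture
total_points = SEQ_LEN * NUM_CHANNELS                  # two-channel sequence length

assert abs(capture_window_ms - 12.8) < 1e-9   # long enough for a fall of < 10 ms
assert total_points == 256
print(f"capture window: {capture_window_ms:.1f} ms, total points: {total_points}")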
Step 2000: and acquiring a waveform identification type corresponding to the discrete time sequence of the two channels in the target infrared data based on a preset time sequence classification model, and if the waveform identification type is a pest detection waveform, confirming that pests fall into the pest trap at present and controlling the pest trap to acquire images of the pests falling into the pest trap.
In step 2000, the time-series classification model may specifically be implemented with a feature extraction and classification network trained by a learnable time-series classification algorithm. The core of the shapelets algorithm is to find highly informative subsequences in the time-series data set, called shapelets. The biggest difference between the learning-shapelets algorithm and the traditional shapelets algorithm is that it can automatically search, in an iterative manner, for the shapelets that give the best time-series classification performance.
Step 1000 is executed on the premise that the insect-photographing assembly in the insect trap is currently in a dormant or off state. Only when step 2000 identifies the waveform identification type corresponding to the two-channel discrete time sequence as a pest detection waveform does the pest fall-in identification device send a corresponding notification message to the insect trap. Upon learning from the notification message that the waveform identification type is a pest detection waveform, the controller of the insect trap sends a start instruction to the insect-photographing assembly, so that the assembly acquires images of the pest that has fallen into the insect trap and sends the acquired live-action image data to equipment for storing and displaying it; this equipment may be the pest fall-in identification device executing the pest fall-in identification method, or other equipment with data processing and display functions.
From the above description, it can be seen that the pest fall-in identification method provided by the embodiments of the present application can effectively identify and distinguish whether an object falling into the insect trap is a pest, and can effectively improve the accuracy and reliability of pest fall-in identification. The insect trap is controlled to acquire images of a pest falling into it in a timely and effective manner only once the fall-in is confirmed, so the invalid acquisitions and frequent start-ups of the image acquisition device in the insect trap are effectively reduced. This in turn lowers the energy consumption of the insect trap, makes heat less likely to accumulate, weakens, compared with existing insect traps, the favorable influence of such heat on the growth and reproduction of pests, and reduces the probability of safety accidents.
In order to further improve accuracy and effectiveness of the waveform type identification result data of the target infrared data, in the pest falling identification method provided in the embodiment of the present application, referring to fig. 2, step 2000 of the pest falling identification method specifically includes the following:
step 2100: and carrying out global feature extraction on the two-channel discrete time sequence in the target infrared data to obtain global features of the target infrared data.
Step 2200: inputting the two-channel discrete time sequence and the global feature into a preset time sequence classification model, so that the time sequence classification model extracts local features corresponding to the two-channel discrete time sequence, and fusing the local features with the global feature to obtain waveform type identification result data of the target infrared data, wherein the waveform type identification result data comprises: probabilities of different waveform identification types including pest detection waveforms and at least one non-pest detection waveform.
Step 2300: if the waveform type identification result data shows that the waveform identification type of the target infrared data is a pest detection waveform, confirming that pests fall into the pest trap at present and controlling the pest trap to collect images of the pests falling into the pest trap.
As can be seen from the above description, according to the pest drop-in identification method provided by the embodiment of the application, by performing global feature extraction of the two-channel discrete time sequence and implementing local feature extraction by adopting the time sequence classification model, and fusing each local feature with the global feature, accuracy and effectiveness of waveform type identification result data of target infrared data can be effectively improved.
In order to further improve accuracy and effectiveness of global feature extraction, in a pest falling identification method provided in the embodiment of the present application, referring to fig. 3, step 2100 in the pest falling identification method specifically includes the following:
step 2110: and respectively carrying out waveform downward translation processing with a minimum value of 0 on the two-channel discrete time sequences in the target infrared data so as to obtain two preprocessed time sequences corresponding to the target infrared data.
Step 2120: and respectively selecting effective sampling points from the two pre-processed time sequences based on a preset effective threshold value to form reaction areas corresponding to the two pre-processed time sequences.
In step 2120, the effective threshold may be equal to a first percentage of the largest sample value in the post-preprocessing time series, which may be set between 5% and 50%, preferably 20%.
Specifically, since the 128 sampling points contained in a preprocessed time sequence are consecutive time samples, only some of those sampling points correspond to an object actually falling. When an object falls through the detection area, the sampled value first rises, then falls, and finally flattens out. In order to concentrate the features extracted by the algorithm on the portion where an object falls, in one example of the present application, sampling points whose sampled values are greater than 20% of the maximum value in the time sequence are regarded as the effective sampling points that deserve more attention, and the set of these effective sampling points is defined as the reaction region.
Step 2130: and determining global features of the target infrared data according to the corresponding reaction areas of the two preprocessed time sequences.
In the case where only one pest falls through the detection area there is only one reaction region, but for the various waveforms encountered in an actual grain bin there may be more than one. To obtain the number of reaction regions in different waveforms, the set Ω of all subsets of the reaction region M that satisfy a given condition (hereinafter, sub-reaction regions) is introduced; in the case of only one reaction region, Ω = {M}. Based on this, in one or more embodiments of the present application, the global features of the target infrared data may include: the maximum sampled value in each preprocessed time sequence, the sub-reaction regions corresponding to the two preprocessed time sequences, and the result of the fast Fourier transform (FFT) of each preprocessed time sequence.
As can be seen from the above description, according to the pest drop-in identification method provided by the embodiment of the application, the validity of global feature extraction can be effectively improved by preprocessing the two-channel discrete time sequence and generating the reaction region, so that the accuracy and the validity of the waveform type identification result data of the infrared data can be further improved.
In order to further improve the effectiveness and accuracy of waveform identification with the time-series classification model, in the pest fall-in identification method provided in the embodiments of the present application, referring to fig. 2, the following is performed before step 1000 or step 2000:
step 0100: and acquiring each historical infrared data respectively comprising different two-channel discrete time sequences and each corresponding waveform identification type label, wherein the waveform identification type labels comprise labels corresponding to pest detection waveforms and at least one non-pest detection waveform.
In step 0100, in order to further improve accuracy and reliability of a waveform identification type result corresponding to the two-channel discrete time sequence in the target infrared data, the non-pest detection waveform includes: a booklice trigger waveform, a pest repetition trigger waveform and a false trigger waveform.
Step 0200: and carrying out global feature extraction on the two-channel discrete time sequences in each historical infrared data to obtain global features corresponding to each historical infrared data.
Step 0300: based on each historical infrared data, the corresponding global feature and the waveform identification type label, training a preset feature extraction and classification network by a learnable time sequence classification algorithm so as to train the feature extraction and classification network into a time sequence classification model for identifying the waveform identification type of the infrared data.
From the above description, it can be seen that the pest fall-in identification method provided by the embodiments of the present application trains the feature extraction and classification network into a time-series classification model, which further improves the effectiveness and accuracy of waveform identification performed with that model.
In order to improve the reliability and effectiveness of the training feature extraction and classification network process, in the pest drop-in identification method provided in the embodiment of the present application, referring to fig. 3, step 0200 in the pest drop-in identification method specifically includes the following contents:
step 0210: and respectively carrying out waveform downward translation processing with a minimum value of 0 on the two-channel discrete time sequences in each historical infrared data so as to obtain two preprocessed time sequences corresponding to each historical infrared data.
In step 0210, the historical infrared data may be written as a raw data waveform, which consists of two related discrete time sequences, each of length 128. In the embodiments of the present application, the time-series features of a single sequence are computed first, and the two groups of features then form the global feature of one piece of data. Define the original time sequence of either group as f(k), k ∈ K, where K is the set of sampling points, K = {k | k ∈ ℤ, 0 ≤ k ≤ 127}. The time sequence is preprocessed by translating it downward so that its minimum value is 0, giving the preprocessed time sequence F(k) = f(k) − min[f(k)].
Step 0220: based on a preset effective threshold value, effective sampling points are respectively selected from the two preprocessed time sequences corresponding to the historical infrared data respectively, so that reaction areas of the two preprocessed time sequences corresponding to the historical infrared data respectively are formed.
In step 0220, for the time sequence F(k), since the 128 sampling points contained in it are consecutive time samples, only some of them correspond to an object actually falling. When an object falls through the detection area, the sampled value first rises, then falls, and finally flattens out. In order to concentrate the features extracted by the algorithm on the portion where an object falls, the present application regards sampling points whose sampled values are greater than 20% of the maximum value in the time sequence as the effective sampling points t that deserve more attention, and defines the set of these effective sampling points as the reaction region M = {t | t ∈ K, F(t) > 0.2 · max[F(k)]}.
Step 0230: and acquiring at least one sub-reaction zone of each pre-processed time sequence, and selecting one with the largest effective sampling point number from the sub-reaction zones corresponding to each pre-processed time sequence as the maximum length reaction zone of the pre-processed time sequence.
Specifically, in the case where only one pest falls through the detection area there is only one reaction region, but for the various waveforms encountered in an actual grain bin there may be more than one. To obtain the number of reaction regions in different waveforms, the set Ω of all subsets of the reaction region M that satisfy the following condition (hereinafter, sub-reaction regions) is introduced. In the case of only one reaction region, Ω = {M}; otherwise Ω is given by equation (1), which partitions M into its maximal runs of consecutive sampling points A_i, so that adjacent sub-reaction regions A_i and A_{i+1} are separated by at least one invalid sampling point.

The number of sampling points contained in a reaction region is defined as the length of that region. For the sub-reaction regions in Ω, the one with the largest length is denoted G:

G = argmax_{A_i ∈ Ω} card(A_i)    (2)

where i denotes an index, A_i and A_{i+1} denote different subsets of M, and card(A_i) denotes the number of elements in A_i.
Step 0240: and determining global features of the historical infrared data according to the maximum length reaction regions of the two preprocessed time sequences corresponding to the historical infrared data.
After the above computation, the global features of a time sequence are as listed in Table 1:

TABLE 1

Feature | Meaning
max[F(k)] | maximum value in F(k)
card(Ω) | number of sub-reaction regions in Ω
F(i) | amplitude of the sampling point with index i in the sequence
min(G) | minimum element of G
max(G) | maximum element of G
FFT[F(k)] | FFT result of the sequence F(k); since the energy of the four waveform classes is concentrated at low frequencies, only the magnitudes of the 8 frequency points nearest the zero frequency are used as features
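As an informal illustration of steps 0210 to 0240 and the features of Table 1, the following Python sketch (function names, the toy waveform and the exact feature layout are assumptions) computes the preprocessing, the reaction region, the sub-reaction regions and a per-channel global feature vector:

import numpy as np

def preprocess(f):
    """Shift the waveform down so its minimum becomes 0 (step 0210)."""
    f = np.asarray(f, dtype=float)
    return f - f.min()

def reaction_region(F, ratio=0.2):
    """Indices t with F(t) > ratio * max F(k) (the reaction region M)."""
    return np.flatnonzero(F > ratio * F.max())

def sub_regions(M):
    """Split M into runs of consecutive indices (the set Omega)."""
    if M.size == 0:
        return []
    return np.split(M, np.flatnonzero(np.diff(M) > 1) + 1)

def global_features(f, n_fft_bins=8):
    """Per-channel global features in the spirit of Table 1: max value,
    number of sub-regions, bounds and length of the longest sub-region G,
    and the first low-frequency FFT magnitudes."""
    F = preprocess(f)
    omega = sub_regions(reaction_region(F))
    G = max(omega, key=len) if omega else np.array([], dtype=int)
    fft_mag = np.abs(np.fft.rfft(F))[:n_fft_bins]
    return np.concatenate([
        [F.max(), len(omega),
         G.min() if G.size else 0, G.max() if G.size else 0, G.size],
        fft_mag,
    ])

# Two-channel feature vector: concatenate the per-channel features.
rng = np.random.default_rng(0)
chan = np.exp(-((np.arange(128) - 40) / 6.0) ** 2) + 0.02 * rng.standard_normal(128)
feats = np.concatenate([global_features(chan), global_features(chan)])
print(feats.shape)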
From the above description, it can be seen that the pest drop-in identification method provided by the embodiment of the present application can effectively improve the reliability and effectiveness of the training feature extraction and classification network process by preprocessing the training data and generating the reaction area.
In order to further improve the reliability and effectiveness of the training feature extraction and classification network process, in the pest falling identification method provided in the embodiment of the application, the feature extraction and classification network specifically includes the following contents:
the input layer, the local feature extraction layer and the feature fusion layer are sequentially connected.
The input layer is used for receiving the two-channel discrete time sequence, the corresponding global characteristic and the waveform identification type label in the historical infrared data;
The local feature extraction layer is used for respectively carrying out local feature extraction on the two-channel discrete time sequences in the historical infrared data based on a learnable time sequence classification algorithm so as to obtain local features corresponding to the two-channel discrete time sequences, wherein the local feature extraction layer comprises a plurality of feature extraction blocks;
the feature fusion layer is used for performing activation function calculation, feature fusion and probability conversion processing on the global features and the local features corresponding to the historical infrared data to obtain probabilities of different waveform identification types corresponding to the historical infrared data, and calculating loss values of the probabilities of the waveform identification types of the historical infrared data and the different waveform identification types corresponding to the historical infrared data based on a cross entropy loss function.
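A minimal PyTorch sketch of such a fusion layer is given below. The dimensions follow the architecture described later in this section (the local features are reduced to 24 and concatenated into a 52-dimensional fused vector); the 153-dimensional local input, the 28-dimensional global input, the 4-class output and all names are assumptions for illustration:

import torch
from torch import nn

class FusionHead(nn.Module):
    """Sketch of the feature fusion layer: sigmoid activations before
    concatenation, then a fully connected layer producing per-class
    probabilities scored with a cross-entropy-style loss."""
    def __init__(self, n_local=153, n_global=28, n_classes=4):
        super().__init__()
        self.reduce = nn.Linear(n_local, 24)
        self.classify = nn.Linear(24 + n_global, n_classes)

    def forward(self, local_feats, global_feats):
        z = torch.sigmoid(self.reduce(local_feats))   # (B, 24)
        g = torch.sigmoid(global_feats)               # (B, n_global)
        fused = torch.cat([z, g], dim=-1)             # (B, 52)
        return torch.sigmoid(self.classify(fused))    # class probabilities

head = FusionHead()
probs = head(torch.randn(4, 153), torch.randn(4, 28))
loss = nn.functional.binary_cross_entropy(probs, torch.eye(4))
print(probs.shape, loss.item())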
In particular, the local feature extraction layer may be abbreviated as the Soft-min layer. Referring to fig. 4, the network architecture of the feature extraction and classification network implements local feature extraction of the data waveform and fuses the local features with the global features of the data waveform within the network, thereby realizing waveform classification.
As shown in fig. 4, the input to the feature extraction and classification network is raw time-series waveform data of length Q = 256. The Soft-min layer uses the learning-shapelets algorithm to extract local features of the input data. The traditional shapelets algorithm uses the distance d_{m,n} from each shapelet to the original time sequence as a feature:

d_{m,n} = min_j (1/L) · Σ_{l=1}^{L} (TS_{m,j+l−1} − S_{n,l})²    (3)

where TS_m denotes the m-th sequence in the data set, S_n denotes the n-th shapelet found, L is the length of the shapelet, j and l are indices, TS_{m,j+l−1} denotes the (j+l−1)-th element of the m-th sequence, and S_{n,l} denotes the l-th element of the n-th shapelet. This distance, however, is not differentiable with respect to S_n.

The learnable shapelets algorithm therefore uses a differentiable soft-minimum to estimate d_{m,n}; the value α = −100 in equation (4) was obtained experimentally as a parameter that estimates d_{m,n} accurately:

d̂_{m,n} = soft-min-dist = Σ_j D_{m,n,j} · e^{α·D_{m,n,j}} / Σ_{j′} e^{α·D_{m,n,j′}}    (4)

where soft-min-dist denotes the (soft) minimum distance from the shapelet to the original time sequence, D_{m,n,j} denotes the Euclidean distance between the j-th subsequence cut from the m-th sequence in the data set and the n-th shapelet, j′ is a summation index, and the exponential terms and the normalizing denominator are the intermediate quantities defined by equation (5).
Each block in the Soft-min layer outputs the soft-min-dist between itself and the input time sequence as a local feature of the input sequence. The number of blocks in the Soft-min layer and the length of each block are settable hyperparameters. Since the original time-sequence length is H = 256, the lengths of the blocks in the Soft-min layer are set to 0.25H, 0.5H and 0.75H respectively, and the number of blocks of each length is set to 0.2H ≈ 51, so the feature vector output by the Soft-min layer has dimension (1, 51×3). After a fully connected layer this feature vector becomes (1, 24) and is concatenated with the global features of the input sequence computed as in fig. 4; the fused feature vector after concatenation has dimension (1, 52). All features pass through a sigmoid activation function before concatenation. The fused features are converted into the probabilities of the four waveform classes through a fully connected layer and a sigmoid activation function, and the loss value is computed against the true class label of the input data with a cross-entropy loss function.
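The soft-minimum feature extraction itself can be sketched in a few lines of PyTorch. The exact form of equations (4) and (5) is assumed here to be the standard softmax-weighted distance of the learning-shapelets algorithm, and all names and sizes are illustrative:

import torch

def soft_min_dist(ts, shapelets, alpha=-100.0):
    """Differentiable soft-minimum distance from each block to a sequence.

    ts        : (B, Q) batch of input sequences
    shapelets : (K, L) learnable blocks, L < Q
    returns   : (B, K) one soft-min distance per block
    """
    B, Q = ts.shape
    K, L = shapelets.shape
    windows = ts.unfold(dimension=1, size=L, step=1)         # (B, J, L) sliding windows
    diff = windows.unsqueeze(2) - shapelets.view(1, 1, K, L)
    d = (diff ** 2).mean(dim=-1)                             # (B, J, K) mean sq. distance
    w = torch.softmax(alpha * d, dim=1)                      # alpha < 0 weights the minima
    return (w * d).sum(dim=1)                                # (B, K)

ts = torch.randn(4, 256)                                     # Q = 256, as in the text
shapelets = torch.nn.Parameter(torch.randn(51, 64))          # 51 blocks of length 0.25H
feats = soft_min_dist(ts, shapelets)
print(feats.shape)  # torch.Size([4, 51])

Because the softmax weights are differentiable in the shapelet values, the blocks can be updated by ordinary gradient descent, which is the point of replacing the hard minimum of equation (3).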
As can be seen from the above description, according to the pest drop-in identification method provided by the embodiment of the application, by designing a specific architecture of the feature extraction and classification network based on the learnable time sequence classification algorithm, the effectiveness and accuracy of feature extraction and classification network for identifying local features can be effectively improved, and the reliability and effectiveness of the training feature extraction and classification network process can be further improved.
In order to further improve the effectiveness and accuracy with which the feature extraction and classification network identifies local features, in the pest fall-in identification method provided in the embodiments of the present application, referring to fig. 3, the following is performed between step 0200 and step 0300:
step 0250: and acquiring subsequences with the same length as each feature extraction block in the local feature extraction layer from each historical infrared data based on a sliding window method.
Step 0260: and obtaining a clustering center of each subsequence based on a clustering algorithm, and taking the clustering center as an initial parameter of the local feature extraction layer to finish initializing the local feature extraction layer.
From the above description, it can be seen that by initializing the local feature extraction layer in this way, the pest fall-in identification method can further improve the effectiveness and accuracy with which the feature extraction and classification network identifies local features.
In order to further improve the application reliability and effectiveness of the time series classification model, in the pest falling identification method provided by the embodiment of the application, the time series classification model specifically includes the following contents:
An input layer for receiving a two-channel discrete time sequence and corresponding global features in the target infrared data;
the local feature extraction layer is used for respectively extracting local features of the two-channel discrete time sequences in the target infrared data based on a learnable time sequence classification algorithm so as to obtain local features corresponding to the two-channel discrete time sequences;
and the feature fusion layer is used for carrying out activation function calculation, feature fusion and probability conversion processing on the global features and the local features corresponding to the target infrared data so as to obtain different probabilities of the waveform identification types.
It is understood that the network architecture of the time series classification model may be the same as the feature extraction and classification network, wherein the feature fusion layer of the time series classification model does not need to calculate a loss value of probability of waveform identification type and label.
From the above description, it can be seen that the pest drop-in identification method provided by the embodiment of the application can effectively improve the effectiveness and accuracy of identifying local features by designing a specific architecture of a time sequence classification model based on a learnable time sequence classification algorithm.
The present application also provides an embodiment of a pest type recognition method that can be implemented by a pest type recognition device, referring to fig. 5, the pest type recognition method specifically includes the following contents:
step 3000: acquiring a notification message of the pest detection waveform of the waveform identification type corresponding to the target infrared data;
in step 3000, the pest type identification device may acquire, from other devices, the notification message that the waveform identification type corresponding to the target infrared data is a pest detection waveform, or it may be deployed in the same cloud server as the pest fall-in identification device; in the latter case, the pest type identification device may directly determine whether to trigger step 4000 according to the identification result of the pest fall-in identification method in the foregoing embodiments.
Step 4000: and acquiring pest type identification result data in the current insect trap based on a preset pest type identification model.
For example, if the pest fall-in identification method provided in the foregoing embodiments of the present application has found the waveform identification type to be a pest detection waveform, confirmed that a pest has fallen into the insect trap and controlled the insect trap to acquire images of that pest, step 4000 then obtains the pest type identification result data for the current insect trap based on a preset pest type identification model.
As can be seen from the above description, the pest type identification method provided by the embodiments of the present application obtains pest type identification result data for the current insect trap based on a preset pest type identification model, thereby adding automated and efficient pest type identification on top of the identified pest fall-in, which in turn effectively improves the timeliness and pertinence with which a user disposes of pests in the region where the insect trap is located based on the pest type identification result.
In order to improve accuracy and efficiency of pest type identification, in the pest type identification method provided in the present application, referring to fig. 6, step 4000 of the pest type identification method may include the following:
step 4100: and receiving live-action image data of the insect falling into the insect trap, wherein the live-action image data is collected by the insect trap.
Step 4200: inputting the live-action image data into a preset pest type recognition model so that the preset pest type recognition model correspondingly outputs pest type recognition result data of the live-action image data, wherein the pest type recognition model comprises a first convolutional neural network.
From the above description, it can be seen that the pest type identification method provided by the embodiment of the application can effectively improve accuracy and efficiency of pest type identification by performing automated pest type identification on live-action image data.
In order to improve accuracy and efficiency of pest type identification, in the pest type identification method provided in the present application, referring to fig. 7, step 4000 in the pest type identification method may further include the following:
step 4300: and converting the double-channel discrete time sequence in the target infrared data into a two-dimensional recursive image based on a preset recursive graph algorithm.
Step 4400: inputting the two-dimensional recursive image into a preset pest type recognition model so that the preset pest type recognition model correspondingly outputs pest type recognition result data of the two-dimensional recursive image data, wherein the pest type recognition model comprises a second convolutional neural network.
Specifically, to address the problem that the raw one-dimensional time-series data generated when the infrared electronic-probe trap captures a pest does not contain enough features to distinguish pest species of similar body size, the embodiments of the present application start from deeply mining the features of the raw pest-detection waveform data: the target infrared data is converted from two-channel one-dimensional time-series data into a two-dimensional image through a recursion plot algorithm, and a deep neural network then performs feature extraction and classification on the converted two-dimensional image, thereby identifying the species of the trapped pest.
From the above description, it can be seen that the pest type identification method provided by the embodiment of the present application can effectively improve convenience and efficiency of pest type identification by performing pest type identification on the two-dimensional recursive image after waveform conversion.
Specifically, the pest type recognition algorithm is implemented as follows:
(1) converting raw data into two-dimensional images using a recursive graph algorithm
First, the phase space of the original time sequence is generated by time-delay embedding. Assume the original time sequence X is:

X = {x_i | i = 1, 2, …, N}    (6)

where x_i denotes the i-th element of the original sequence and N denotes its length.

By selecting the embedding dimension d and the time delay τ, the subsequences v_i of X can be obtained:

v_i(d, τ) = {x_{i+kτ} ∈ X | k = 0, 1, 2, …, d−1}    (7)

where v_i(d, τ) denotes a subsequence of X obtained with embedding dimension d and time delay τ, and x_{i+kτ} denotes the (i+kτ)-th element of the sequence. Since every element of v_i must belong to X, i.e. the subscript in equation (7) cannot exceed N, it must hold that:

i ≤ N − (d−1)τ    (8)
All subsequences v_i of X satisfying equation (8) constitute a vector space of dimension (d, N − (d−1)τ), called the phase space of the original sequence X. The recursion plot indicates whether recursion exists between any two vectors in the phase space; that is, the recursion plot R is a square matrix of order N − (d−1)τ (9).

Any element R_ij of the recursion plot R represents the degree of recursion between the vectors v_i and v_j in the phase space and is computed by equation (10):

R_ij = θ(t)    (10)

where t is the Euclidean distance between v_i and v_j:

t = ‖v_i − v_j‖    (11)

where θ denotes a function and v_j denotes the j-th vector of the phase space.
The function θ can take many forms. Here the function θ_opt(t) proposed in the optimized recursion plot algorithm is used, which makes R_ij vary continuously with the Euclidean distance between v_i and v_j.

First, θ_opt(t) is decreasing with respect to t and its rate of decrease gradually diminishes, where ε is a threshold; θ_opt(t) therefore satisfies the differential equation (12). In addition, θ_opt(t) satisfies two boundary conditions: letting D be the set of Euclidean distances between any two vectors in the phase space, the boundary conditions are given by equations (13) and (14), where:

θ_opt(max(D)) = 0    (14)

and max(D) denotes the maximum of the elements of D, i.e. the largest Euclidean distance between any two vectors in the phase space.

Combining equations (12) to (14), θ_opt can be solved in closed form as equations (15) to (17), in which ln(ct) and ln(cε) appear as intermediate quantities.
The above flow converts a single time sequence into a two-dimensional recursion plot. However, the pest-detection waveform data of the trap consists of two channels of time sequences, each of length 128. If the two sequences were simply spliced into one sequence of length 256 and then converted into a recursion plot, the individual state information of the two channels and the correlation information between them would be mixed together. The present application therefore designs a method of converting the raw time sequences of the trap into a three-channel optimized recursion plot: first, phase spaces V_ch1 and V_ch2 are generated for the two channels of raw data according to equations (6) to (8); then a three-channel recursion plot is generated according to equations (18) to (20), where channels 1 and 2 represent the recursion of the vectors within V_ch1 and within V_ch2 respectively, and channel 3 represents the recursion between any vector in V_ch1 and any vector in V_ch2:

R_ch1,ij = θ_opt(‖v_i^{ch1} − v_j^{ch1}‖)    (18)
R_ch2,ij = θ_opt(‖v_i^{ch2} − v_j^{ch2}‖)    (19)
R_ch3,ij = θ_opt(‖v_i^{ch1} − v_j^{ch2}‖)    (20)

where R_ch1 denotes the recursion plot of channel 1, R_ch2 that of channel 2, R_ch3 that of channel 3, and R_ij denotes an element of a recursion plot R.
In summary, a process of converting raw data into a two-dimensional image using a recursive graph algorithm is illustrated in fig. 8.
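As a rough illustration of this conversion (with a hard threshold standing in for the patent's θ_opt, and toy signals in place of real trap data), the three-channel construction can be sketched as follows:

import numpy as np

def phase_space(x, d=3, tau=2):
    """Time-delay embedding: row i is v_i = (x_i, x_{i+tau}, ..., x_{i+(d-1)tau})."""
    n = len(x) - (d - 1) * tau
    return np.stack([x[i:i + n] for i in range(0, d * tau, tau)], axis=1)

def recurrence(Va, Vb, eps):
    """Pairwise Euclidean distances mapped through a simple threshold;
    the optimized theta_opt would replace the hard comparison here."""
    t = np.linalg.norm(Va[:, None, :] - Vb[None, :, :], axis=-1)
    return (t < eps).astype(float)

x1 = np.sin(np.linspace(0, 8 * np.pi, 128))   # channel 1 (toy data)
x2 = np.cos(np.linspace(0, 8 * np.pi, 128))   # channel 2 (toy data)
V1, V2 = phase_space(x1), phase_space(x2)
img = np.stack([recurrence(V1, V1, 0.5),      # channel 1: recursion within V_ch1
                recurrence(V2, V2, 0.5),      # channel 2: recursion within V_ch2
                recurrence(V1, V2, 0.5)])     # channel 3: cross-channel recursion
print(img.shape)  # (3, 124, 124)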
(2) Parameter selection for recursive graph algorithm
In the above method for converting raw pest-detection time-series data into a two-dimensional recursion plot, there are three adjustable parameters: the embedding dimension d, the embedding time delay τ, and the threshold ε used when computing vector recursion. The embedding dimension d and the embedding delay τ only affect the dimension of the generated phase space, while the threshold ε only affects the specific recursion plot generated from the phase space.
The practical meaning of the phase space is to represent all possible states of the system; to measure the quality of a phase space reconstructed from a single time sequence by time-delay embedding, one needs the similarity between the reconstructed phase space and the true phase space of the system. In practice, however, a pest may fall through the detection area in innumerable ways, since it is impossible to determine in what pose the pest passes through, so the true phase space of the system is difficult to compute, i.e. the quality of the phase-space reconstruction cannot be quantified. Moreover, what directly determines the two-dimensional recursion plot obtained from the raw data is the threshold ε rather than the embedding dimension d and embedding delay τ, so the value of ε is given priority when selecting parameters.
When ε takes the value 0, all elements of the recursion plot will be 0, and when ε takes the value 1, all elements will be 1. When the values of the elements in the recursion plot are too concentrated, their features disappear; it is therefore desirable that those values be as dispersed as possible. Let the set U be the set of all values taken by R_ij, let Z be a discrete random variable defined on U, and let P_Z(z) be the probability density function of Z. Information entropy is used to evaluate whether the distribution of Z is dispersed or concentrated:

H(Z) = −Σ_k P_Z(z_k) · ln P_Z(z_k)    (21)

where H(Z) denotes the information entropy of Z, P_Z denotes the probability density function of Z, z denotes a value of the random variable Z, and z_k denotes its k-th value.

Clearly H(Z) depends on ε, and optimizing the parameter ε means making H(Z) as large as possible. In practice, to avoid all elements of the recursion plot being 0 or 1, the value of the parameter ε should satisfy:

min(D) < ε < max(D)    (22)

where min(D) denotes the minimum of the elements of the set D.
The optimal parameter ε is searched for by uniform sampling within this value range: 10 points are selected uniformly within the range of equation (22) as candidate values of ε, and the curve of H(Z) versus ε is plotted as shown in fig. 9.
When generating recursion plots for all the data in the data set, it is found that, when the phase space of each channel's time sequence is converted into a recursion plot, H(Z) varies with ε as shown in fig. 9. The present application therefore determines the value of ε by the following procedure, the finally determined value being ε = 0.0473:

1) The averages of min(D) and max(D) over all data in the data set are taken as min(D) and max(D) for the entire data set.

2) ε is determined using equation (23).
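Under the assumption that equation (23) simply selects the candidate ε maximizing H(Z), the search described above can be sketched as:

import numpy as np

def entropy_of_plot(R, bins=32):
    """Information entropy H(Z) of the value distribution of a recursion
    plot R (eq. (21)); used to score a candidate threshold epsilon."""
    p, _ = np.histogram(R, bins=bins, range=(0.0, 1.0))
    p = p / p.sum()
    p = p[p > 0]
    return -(p * np.log(p)).sum()

def choose_eps(dist_matrix, n_candidates=10):
    """Uniformly sample candidate epsilons strictly inside
    (min(D), max(D)) and keep the one maximizing H(Z)."""
    lo, hi = dist_matrix.min(), dist_matrix.max()
    candidates = np.linspace(lo, hi, n_candidates + 2)[1:-1]
    scores = [entropy_of_plot((dist_matrix < e).astype(float)) for e in candidates]
    return candidates[int(np.argmax(scores))]

D = np.abs(np.random.default_rng(0).standard_normal((124, 124)))  # toy distance matrix
print(choose_eps(D))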
In order to improve accuracy and efficiency of pest type identification, in the pest type identification method provided in the present application, referring to fig. 10, step 4000 in the pest type identification method may further include the following:
step 5000: if the pest type identification model comprises a first convolutional neural network and a second convolutional neural network, where the first convolutional neural network identifies pest type identification result data from the live-action image data of pests that have fallen into the insect trap and the second convolutional neural network identifies pest type identification result data from the two-dimensional recursion image converted from the two-channel discrete time sequence in the target infrared data, then the second convolutional neural network is iteratively optimized based on the pest type identification result data corresponding to the live-action image data output by the first convolutional neural network.
As can be seen from the above description, according to the pest type identification method provided by the embodiment of the present application, by performing iterative optimization on the second convolutional neural network by using pest type identification result data corresponding to live-action image data, accuracy and effectiveness of pest type identification on the two-dimensional recursive image after waveform conversion can be effectively improved.
In order to further improve the accuracy of pest type identification performed on the waveform-converted two-dimensional recursion image, in the pest type identification method provided by the present application, the second convolutional neural network is an Inception v3 network.
That is, after a single piece of time-series data is converted into a two-dimensional recursion plot, features can be extracted from it and classified using an Inception v3 network. The layer list of Inception v3 is shown in Table 2, and the structures of the Inception modules are shown in figs. 11(a) to 11(c), in which Base represents the input and Filter Concat represents concatenation along the convolution-kernel channel dimension.
TABLE 2
Layer type | Patch size (kernel size / stride / remark) | Input size
Conv | 3×3 / 2 | 299×299×3
Conv | 3×3 / 1 | 149×149×32
Conv padded | 3×3 / 1 | 147×147×32
Pool | 3×3 / 2 | 147×147×64
Conv | 3×3 / 1 | 73×73×64
Conv | 3×3 / 2 | 71×71×80
Conv | 3×3 / 1 | 35×35×192
3×Inception | as shown in fig. 11(a) | 35×35×288
5×Inception | as shown in fig. 11(b) | 17×17×768
2×Inception | as shown in fig. 11(c) | 8×8×1280
Pool | 8×8 | 8×8×2048
Linear | / | 1×1×2048
Softmax | / | 1×1×8
Here Conv denotes convolution; Conv padded denotes padded convolution; Pool denotes pooling; Inception denotes a block of Inception modules with factorized convolutions; Linear denotes a fully connected layer; and Softmax denotes the softmax classification layer.
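A minimal sketch of using such a network, assuming the torchvision (≥ 0.13) implementation of Inception v3 with its 1000-class head replaced by an 8-way classifier for the pest species listed below:

import torch
from torchvision import models

NUM_SPECIES = 8  # one output per pest species
net = models.inception_v3(weights=None, aux_logits=False,
                          num_classes=NUM_SPECIES, init_weights=True)

net.eval()
with torch.no_grad():
    x = torch.randn(1, 3, 299, 299)   # a three-channel recursion image
    print(net(x).shape)               # torch.Size([1, 8])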
In order to further improve the timeliness and pertinence with which a user disposes of pests in the region where the insect trap is located based on the pest type identification result, in the pest type identification method provided by the embodiments of the present application, the pest types identified specifically include: the long-horned flat grain beetle, the Turkish flat grain beetle, the saw-toothed grain beetle, the red flour beetle, the confused flour beetle, the bark beetle, the rice weevil and the maize weevil.
From the above description, it can be seen that the pest type identification method provided by the embodiment of the application can effectively improve the applicability and comprehensiveness of pest type identification, and further can effectively improve the timeliness and pertinence of pest treatment on the region where the pest catcher is located by a user based on the pest type identification result.
In terms of software, the present application further provides a pest falling identification device for executing all or part of the pest falling identification method, referring to fig. 12, the pest falling identification device specifically includes the following contents:
the infrared data receiving module 10 is used for receiving target infrared data which is currently acquired from the insect trap and contains a double-channel discrete time sequence;
the pest falling into recognition and control image acquisition module 20 is configured to obtain a waveform recognition type corresponding to a discrete time sequence of the two channels in the target infrared data based on a preset time sequence classification model, and if the waveform recognition type is a pest detection waveform, confirm that the pest falls into the pest trap at present and control the pest trap to perform image acquisition on the pest falling into the pest trap.
The embodiment of the pest drop identifying apparatus provided in the present application may be specifically used to execute the process flow of the embodiment of the pest drop identifying method in the above embodiment, and the functions thereof are not described herein in detail, and reference may be made to the detailed description of the embodiment of the pest drop identifying method.
The part of the pest fall-in identification device that performs pest fall-in identification may be executed in a server, or may be completed in a client device. The choice may be made according to the processing capability of the client device and the restrictions of the user's usage scenario; the present application is not limited in this regard. If all operations are completed in the client device, the client device may further include a processor for the specific handling of pest fall-in identification.
The client device may have a communication module (i.e. a communication unit) and may be connected to a remote server in a communication manner, so as to implement data transmission with the server. The server may include a server on the side of the task scheduling center, and in other implementations may include a server of an intermediate platform, such as a server of a third party server platform having a communication link with the task scheduling center server. The server may include a single computer device, a server cluster formed by a plurality of servers, or a server structure of a distributed device.
Any suitable network protocol may be used for communication between the server and the client device, including those not yet developed at the filing date of this application. The network protocols may include, for example, TCP/IP protocol, UDP/IP protocol, HTTP protocol, HTTPS protocol, etc. Of course, the network protocol may also include, for example, RPC protocol (Remote Procedure Call Protocol ), REST protocol (Representational State Transfer, representational state transfer protocol), etc. used above the above-described protocol.
From the above description, it can be seen that the pest fall-in identification device provided by the embodiments of the present application can effectively identify and distinguish whether an object falling into the insect trap is a pest, and can effectively improve the accuracy and reliability of pest fall-in identification. The insect trap is controlled to acquire images of a pest falling into it in a timely and effective manner only once the fall-in is confirmed, so the invalid acquisitions and frequent start-ups of the image acquisition device in the insect trap are effectively reduced. This in turn lowers the energy consumption of the insect trap, makes heat less likely to accumulate, weakens, compared with existing insect traps, the favorable influence of such heat on the growth and reproduction of pests, and reduces the probability of safety accidents.
In terms of software, the present application also provides a pest type recognition device for executing all or part of the pest type recognition method, referring to fig. 13, the pest type recognition device specifically includes the following contents:
the trigger detection module 30 is configured to obtain a notification message that the waveform identification type corresponding to the target infrared data is a pest detection waveform;
in the trigger detection module 30, the pest type identification device may acquire, from other devices, the notification message that the waveform identification type corresponding to the target infrared data is a pest detection waveform, or it may be deployed in the same cloud server as the pest fall-in identification device; in the latter case, the pest type identification device may directly determine whether to trigger the type identification module 40 according to the identification result of the pest fall-in identification method in the foregoing embodiments.
And a type recognition module 40 for acquiring pest type recognition result data in the current insect trap based on a preset pest type recognition model.
The embodiments of the pest type recognition device provided in the present application may be specifically used to execute the process flow of the embodiments of the pest type recognition method in the above embodiments, and the functions thereof are not described herein in detail, and reference may be made to the detailed description of the embodiments of the pest type recognition method.
The part of the pest type recognition device for pest type recognition may be executed in a server or may be completed in a client device.
As can be seen from the above description, according to the pest type recognition device provided by the embodiment of the present application, by acquiring pest type recognition result data in the pest trap based on the preset pest type recognition model, the degree of automation and efficiency of pest type recognition can be further realized on the basis that the recognized pests fall into the pest trap, and thus the timeliness and pertinence of pest disposal of the region where the pest trap is located by a user based on the pest type recognition result can be effectively improved.
The embodiment of the application also provides an electronic device, which may be a cloud server, where the cloud server may include a processor, a memory, a receiver, and a transmitter, where the processor is configured to execute the pest drop identification method or the pest type identification method mentioned in the foregoing embodiment, and the processor and the memory may be connected by a bus or other manners, for example, through a bus connection. The receiver may be connected to the processor, memory, by wire or wirelessly.
The processor may be a central processing unit (Central Processing Unit, CPU). The processor may also be any other general purpose processor, digital signal processor (Digital Signal Processor, DSP), application specific integrated circuit (Application Specific Integrated Circuit, ASIC), field programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof.
The memory, as a non-transitory computer readable storage medium, may be used to store a non-transitory software program, a non-transitory computer executable program, and a module, such as program instructions/modules corresponding to the pest fall into the identification method or the pest type identification method in the embodiments of the present application. The processor executes various functional applications of the processor and data processing by running non-transitory software programs, instructions, and modules stored in the memory, that is, implements the pest fall-into recognition method or pest type recognition method in the above-described method embodiments.
The memory may include a memory program area and a memory data area, wherein the memory program area may store an operating system, at least one application program required for a function; the storage data area may store data created by the processor, etc. In addition, the memory may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, the memory may optionally include memory located remotely from the processor, the remote memory being connectable to the processor through a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The one or more modules are stored in the memory that, when executed by the processor, perform the pest fall into the identification method or the pest type identification method in the embodiments.
In some embodiments of the present application, the user equipment may include a processor, a memory, and a transceiver unit, where the transceiver unit may include a receiver and a transmitter, and the processor, the memory, the receiver, and the transmitter may be connected by a bus system, the memory storing computer instructions, and the processor executing the computer instructions stored in the memory to control the transceiver unit to transmit and receive signals.
As an implementation manner, the functions of the receiver and the transmitter in the present application may be considered to be implemented by a transceiver circuit or a dedicated chip for transceiver, and the processor may be considered to be implemented by a dedicated processing chip, a processing circuit or a general-purpose chip.
As another implementation manner, a manner of using a general-purpose computer may be considered to implement the server provided in the embodiments of the present application. I.e. program code for implementing the functions of the processor, the receiver and the transmitter are stored in the memory, and the general purpose processor implements the functions of the processor, the receiver and the transmitter by executing the code in the memory.
Embodiments of the present application also provide a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the aforementioned pest fall-into-recognition method or pest-type recognition method. The computer readable storage medium may be a tangible storage medium such as Random Access Memory (RAM), memory, read Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, floppy disks, hard disk, a removable memory disk, a CD-ROM, or any other form of storage medium known in the art.
The application also provides an embodiment of an insect trap, wherein the insect trap is configured to acquire target infrared data containing a two-channel discrete time sequence and to send the target infrared data to a cloud server, so that the cloud server executes the pest fall-in identification method of the first aspect or implements the pest type identification method of the second aspect;
the insect trap is also used for collecting images of pests falling into the insect trap when or after receiving the image collection control instruction sent by the cloud server.
From the above description, it can be seen that the insect trap provided by the embodiments of the present application can effectively identify and distinguish whether an object falling into it is a pest, can effectively improve the accuracy and reliability of pest fall-in identification, and acquires images of a pest falling into it in a timely and effective manner only once the fall-in is confirmed, so that the invalid acquisitions and frequent start-ups of the image acquisition device in the insect trap are effectively reduced, which in turn lowers energy consumption and prevents heat from accumulating.
Existing electronic probes with an imaging function operate in a normally-open mode, which leads to high system power dissipation and heat accumulation, and the accumulated heat can in turn promote the growth and reproduction of pests. To address this, in the insect trap provided by the embodiments of the present application, referring to figs. 14 through 27, the insect trap specifically contains the following:
an insect catching body 100, the insect catching body 100 including a control module 150, a trapping module 110, and a monitoring module 120;
the trapping module 110 includes a first tube 111 and an insect-collecting funnel 112 installed in the first tube 111, the insect-collecting funnel 112 being coaxial with the first tube 111; in a first direction, one end of the insect-collecting funnel 112 is a first end and the other end is a second end, the diameter of the insect-collecting funnel 112 gradually shrinking from the first end to the second end, and the second end being provided with an insect-passing hole 112b;
the monitoring module 120 includes a second tube 121, with an infrared data acquisition assembly and an insect-photographing assembly both installed in the second tube 121; the second tube 121 is coaxial with and connected to the first tube 111; the infrared data acquisition assembly 124 and the insect-photographing assembly are both communicatively connected to the control module 150; the infrared data acquisition assembly 124 includes a first infrared emitting tube 124b and a first infrared receiving tube 124a; the insect-passing hole 112b is disposed close to the infrared data acquisition assembly 124, and the projection 112b' of the insect-passing hole 112b on a first plane lies within the range in which the first infrared receiving tube 124a can receive the light emitted by the first infrared emitting tube 124b, the first plane being perpendicular to the first direction with the axes of the first infrared emitting tube 124b and the first infrared receiving tube 124a both lying in this plane, and the first direction being the axial direction of the first tube 111.
In this embodiment, the insect-passing hole 112b being disposed close to the infrared data acquisition assembly 124 means that the second end of the insect-collecting funnel 112 is closer to the infrared data acquisition assembly 124 than the first end is; the insect-photographing assembly is the assembly that photographs insects; the edge of the insect-collecting funnel 112 is preferably clamped between the first tube 111 and the second tube 121; and the insect-passing hole 112b has a diameter of 2 mm to 10 mm (preferably 4 mm). As shown in fig. 21, the range between the solid lines on the two sides of the projection 112b' of the insect-passing hole 112b on the first plane is the range of light emitted by the first infrared emitting tube 124b, and the range between the dashed lines on the two sides of the projection 112b' is the range within which the first infrared receiving tube 124a can receive the light emitted by the first infrared emitting tube 124b.
Preferably, the insect trap provided in the embodiments of the present application further includes a cable gland 400, a cable 200 (which may be a thicker bus cable or a thinner wire, made of a tensile and wear-resistant material) and the wireless communication device 300. The first tube 111 includes a cap section 111a and a trapping section 111b connected to each other; the trapping section 111b is connected to the second tube 121, and the cap section 111a is located at the end of the trapping section 111b away from the second tube 121. The cap section 111a is preferably a hollow cone structure, the trapping section 111b is preferably a circular hollow plastic tube, and a plurality of insect-trapping holes 111ba with diameters not greater than 2.5 mm are distributed on the trapping section 111b. The cable gland 400 is fixed at the end of the cap section 111a; one end of the cable 200 passes through the cable gland 400, enters the insect-catching body 100 through the cap section 111a and connects to the electrical components in the insect-catching body 100, and the wireless communication device 300 is disposed at the other end of the cable 200. The trapping module 110 further comprises a cable conversion plate 114 and a light trapping member 115 mounted on the cable conversion plate 114; the light trapping member 115 is preferably a surface-mount light-emitting diode with adjustable wavelength, set to the waveband with the best trapping effect. Both the cable conversion plate 114 and the light trapping member 115 are mounted within the cap section 111a, with the light trapping member 115 disposed close to the trapping section 111b. The light-emitting diode is fixed to the bottom surface of the cable conversion plate 114 by soldering; the light trapping member 115 may also be a fixed-wavelength surface-mount diode or a through-hole diode.
Preferably, the trapping module 110 further comprises an insect passing pipe 113; the top end of the insect passing pipe 113 communicates with the insect passing hole 112b, the bottom end extends to the infrared data acquisition assembly 124, and objects falling out of the insect passing hole 112b enter the insect passing pipe 113 and fall through it to the infrared data acquisition assembly 124.
When the insect trap provided by the embodiment of the present application is in use, it is placed in a grain pile with the first tube 111 located above the second tube 121. Insects in the grain pile enter the first tube 111 and then the insect collecting funnel 112. Because the range within which the first infrared receiving tube 124a can receive light emitted by the first infrared emitting tube 124b covers the insect passing hole 112b in the first direction, an insect falling out of the insect passing hole 112b of the insect collecting funnel 112 passes through that light range; the control module 150 receives the signal acquired by the infrared data acquisition assembly 124, and the insect photographing assembly takes a photograph of the insect.
As shown in figs. 21 to 23, optionally, the infrared data acquisition assembly 124 further includes a second infrared receiving tube 124d and a second infrared emitting tube 124e that are coaxially arranged, the axes of which both lie in the first plane. The first infrared emitting tube 124b and the first infrared receiving tube 124a are also coaxially arranged; the axis of the first infrared receiving tube 124a is a first axis, the axis of the second infrared receiving tube 124d is a second axis, the intersection of the first axis and the second axis is located at the center of the projection 112b' of the insect passing hole 112b in the first plane, and that projection lies within the range in which the second infrared receiving tube 124d can receive light emitted by the second infrared emitting tube 124e. It should be understood that the first axis and the second axis are both straight lines extending indefinitely. Increasing the number of infrared receiving and emitting tubes improves the infrared monitoring sensitivity, so that when an object falls through the insect passing hole 112b, the control module 150 receives the signal promptly and accurately. Locating the intersection of the two axes at the center of the projection 112b' ensures that the two receivable light ranges cover the projection accurately; otherwise, if the projection were offset to one side, part of it would lie outside the light range, an object falling through the insect passing hole 112b might then fall outside the light range, and the infrared monitoring sensitivity would be reduced. In fig. 22, the range between the two broken lines on either side of the first infrared emitting tube 124b is the range of light it emits, and the range between the two broken lines on either side of the second infrared emitting tube 124e is the range of light it emits. Fig. 23 is a schematic partial top view of an embodiment of the infrared data acquisition assembly 124; in fig. 23 the diameters of the first infrared receiving tube 124a, the first infrared emitting tube 124b, the second infrared receiving tube 124d and the second infrared emitting tube 124e are all 5 mm, and all values marked in fig. 23 are in millimeters except 45.0°.
Optionally, the first axis is perpendicular to the second axis. In this way, the overlap between the range in which the first infrared receiving tube 124a can receive light emitted by the first infrared emitting tube 124b and the range in which the second infrared receiving tube 124d can receive light emitted by the second infrared emitting tube 124e is larger, so an object falling through the insect passing hole 112b is unlikely to fall outside the light range, further improving the infrared monitoring sensitivity.
Optionally, the distance between the first infrared emitting tube 124b and the projection 112b' of the insect passing hole 112b in the first plane is greater than the distance between the first infrared receiving tube 124a and that projection, and likewise the distance between the second infrared emitting tube 124e and the projection is greater than the distance between the second infrared receiving tube 124d and the projection. Because the light emitted by an infrared emitting tube spreads in a cone, the farther from the tube, the larger the area the light covers; placing the projection 112b' relatively far from the emitting tube therefore makes it easy for the emitted light to cover the projection. Conversely, the greater the distance between a receiving tube and an emitting tube, the smaller the portion of the emitted light that falls directly on the receiving tube; placing the receiving tube closer to the projection 112b' therefore makes it easier for the projection to fall within the range the receiving tube can receive.
Optionally, the insect photographing assembly includes a camera 125 and a shooting barrel 122. A shooting bin 122a is formed in the shooting barrel 122, with a shooting bin inlet 122b at one end of the shooting barrel 122 in the first direction and a shooting bin outlet 122c at the other end. The insect passing hole 112b communicates with the shooting bin inlet 122b, and in the first direction the infrared data acquisition assembly 124 is located between the insect collecting funnel 112 and the shooting bin inlet 122b. The camera 125 is arranged close to the shooting bin inlet 122b, fixed on the inner wall of the shooting bin 122a, and communicatively connected with the control module 150 so as to photograph into the shooting bin 122a. Thus, when an insect falling through the insect passing hole 112b triggers the infrared data acquisition assembly 124, the control module 150 receives the signal and controls the camera 125 to photograph into the shooting bin 122a while the object falls through it, the object's transit through the shooting bin 122a giving the camera 125 sufficient shooting time. Preferably, the camera 125 is embedded in a groove at the top of the shooting bin 122a; the insect photographing assembly also preferably includes a fill light that is switched on and off together with the camera 125.
Optionally, the diameter of the shooting bin 122a gradually decreases from the shooting bin inlet 122b to the shooting bin outlet 122c. Because of this taper, an object that falls into the shooting bin 122a contacts the side wall before falling out of the shooting bin outlet 122c, and friction slows its fall, giving the camera 125 sufficient shooting time.
In this embodiment, the control module 150 is preferably a main control circuit board comprising six parts: a power module, a microprocessor, a communication module, a multidimensional data acquisition module, a camera driving module and an infrared signal processing module. The power module outputs appropriate voltage and current to power the system; the microprocessor runs the processing program, drives each acquisition module and executes instructions issued by the host computer; the communication module sends information from the microprocessor to the bus and fetches instructions from the bus for the microprocessor; the multidimensional data acquisition module acquires multidimensional data in the grain pile such as carbon dioxide concentration, oxygen concentration, temperature and humidity; the camera driving module outputs an appropriate driving supply and control signals to the camera module and reads image information from it; and the infrared signal processing module collects infrared waveform data of pests and determines whether a pest has fallen in, so as to decide whether to start the camera 125 to acquire an image.
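As an illustration of this division of labor, the following minimal Python sketch shows the decision the infrared signal processing module makes; the names read_waveform, classify_waveform and start_camera are hypothetical placeholders, not the firmware's actual interfaces:

# Hypothetical sketch of the infrared signal processing module's role.
# All three callables are illustrative placeholders.
def on_infrared_trigger(read_waveform, classify_waveform, start_camera):
    waveform = read_waveform()                 # two-channel voltage sequence
    if classify_waveform(waveform) == "pest":  # pest vs. debris / false trigger
        start_camera()                         # image acquired only for pests
    # otherwise the camera stays off: fewer invalid acquisitions, less heat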
In this embodiment, the infrared data acquisition assembly 124 preferably further includes an infrared data acquisition circuit board 124c; the first infrared emitting tube 124b and the first infrared receiving tube 124a are both mounted on the infrared data acquisition circuit board 124c, which is connected to the main control circuit board through pin headers and bus headers. The pin headers and bus headers not only fix the boards in position but also transmit power and signals.
Existing traps lack a pest cleaning mechanism and rely on manual labor to clean out pests. This increases the operational complexity of the monitoring system and raises labor costs; in particular, as the number of traps deployed in a grain bin grows, monitoring and cleaning pests becomes more difficult and time-consuming. In this embodiment, optionally, the monitoring module 120 further includes a cleaning assembly installed in the second tube 121. The cleaning assembly 123 includes a housing 123b, an insect-carrying platform 123a and a driving member 123d; the housing 123b is connected with the shooting barrel 122, the driving member 123d is installed in the housing 123b, and the insect-carrying platform 123a is mounted on the driving member 123d. The driving member 123d drives the insect-carrying platform 123a to reciprocate between a first position, in which the platform mates with the shooting bin outlet 122c, and a second position, in which a gap is formed between the platform and the shooting bin outlet 122c; the driving member 123d is communicatively connected with the control module 150. Thus, after the camera 125 finishes photographing an object that has fallen into the shooting bin 122a, the control module 150 starts the driving member 123d to move the insect-carrying platform 123a from the first position to the second position so the object can fall out of the shooting bin outlet 122c; before the next round of photographing, the control module 150 starts the driving member 123d again to move the platform back from the second position to the first position. This realizes automatic cleaning with a relatively simple overall structure and reduces labor costs. The housing 123b is connected to the shooting bin 122a by screws and nuts.
Optionally, the driving member 123d includes a driving member body and a first output shaft mounted on the body; the driving member 123d is a rotary driving member, and the first output shaft is coaxial with the second tube 121. The cleaning assembly 123 further includes a threaded rod 123c mounted on the first output shaft, and a threaded hole formed in the housing 123b mates with the threaded rod 123c. Thus, when the first output shaft rotates it drives the threaded rod 123c to rotate; because the threaded rod 123c mates with the threaded hole in the housing 123b, its rotation moves the threaded rod 123c, the driving member 123d and the insect-carrying platform 123a in the first direction, and reversing the rotation direction of the first output shaft reverses the rotation of the threaded rod 123c, so the driving member 123d and the insect-carrying platform 123a correspondingly reverse their direction of movement along the first direction. Of course, the driving member 123d may instead be a linear driving member, with the insect-carrying platform 123a connected to its driving rod and driven by it between the first position and the second position. Preferably, the rotary driving member is a direct current motor.
Optionally, the cleaning assembly 123 further includes a trigger member, and a first limit switch and a second limit switch both installed on the housing 123b; the two limit switches are communicatively connected with the control module 150. The trigger member is installed on the body of the driving member 123d and moves with the insect-carrying platform 123a, so that it contacts the first limit switch when the platform is in the first position and contacts the second limit switch when the platform is in the second position. Thus, when the insect-carrying platform 123a reaches the first position, the trigger member trips the first limit switch and the platform automatically stops moving toward the shooting bin 122a; when it reaches the second position, the trigger member trips the second limit switch and the platform automatically stops moving away from the shooting bin 122a.
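The cleaning cycle just described can be summarized in the following hedged Python sketch; the motor and limit-switch objects and their methods (run, stop, pressed) are hypothetical placeholders for whatever interface the motor driver board exposes:

# Hypothetical sketch of one cleaning cycle using the two limit switches.
def clean_cycle(motor, first_limit, second_limit):
    motor.run(direction="open")         # platform: first -> second position
    while not second_limit.pressed():   # second limit switch ends the travel
        pass
    motor.stop()                        # gap open; objects can be spun off here
    motor.run(direction="close")        # platform: second -> first position
    while not first_limit.pressed():    # first limit switch ends the travel
        pass
    motor.stop()                        # shooting bin outlet mated again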
Some existing probe traps with a cleaning function have structural design defects and cannot effectively clear live insects with strong climbing ability. For example, in designs where a stepping motor controls the turning of the insect-dropping plate, some pests with strong climbing ability cannot be cleaned off, which affects monitoring accuracy. Other probe traps use a motorized strip brush to clean pests; when cleaning live pests, the pests may cling to the brush, causing the cleaning to fail. In this embodiment, optionally, the driving member 123d further includes a second output shaft coaxial with the first output shaft; in the first direction, the first and second output shafts are located at opposite ends of the body of the driving member 123d, and the insect-carrying platform 123a is fixed on the second output shaft. That is, both output shafts are driven by the body of the driving member 123d. After photographing of an object in the shooting bin 122a is completed, the control module 150 starts the driving member 123d, which drives the first and second output shafts to rotate: on one hand the insect-carrying platform 123a moves from the first position to the second position, and on the other hand the platform itself is rotated, so objects that have fallen onto it are thrown off by centrifugal force. Once the objects are thrown off, the control module 150 drives the driving member 123d to reverse the rotation of the output shafts, and the platform moves back from the second position to the first position. Preferably, the first output shaft is drivingly connected to a first reduction gear set, with the threaded rod 123c mounted on that gear set's output shaft, and the second output shaft is drivingly connected to a second reduction gear set, with the insect-carrying platform 123a mounted on that gear set's output shaft.
Optionally, the insect-carrying platform 123a includes a top surface facing the shooting bin 122a. A first rib 123aa and a second rib 123ac are formed on the top surface, their length directions parallel to the top surface; the two ribs intersect perpendicularly, and a convex cone 123ab is formed at their intersection.
Preferably, the control module 150 further includes a motor driver board. Each limit switch bounds the travel of the driving member 123d and the insect-carrying platform 123a: when a preset position is reached, the corresponding limit switch is triggered and provides a control signal to the motor driver board. A wiring hole 123ba is formed in the housing 123b, and the power cord of the motor driver board passes through it and connects to the infrared data acquisition circuit board 124c. The housing 123b is formed of two parts connected to each other by a snap fit.
Optionally, the insect catching body 100 further includes a collecting tube 130 coaxial with the first tube 111. The collecting tube 130 is detachably connected to the end of the second tube 121 remote from the first tube 111, and a collecting cavity communicating with the first tube 111 is formed in it. In use, the insect trap is placed in a grain pile with the first tube 111 above the second tube 121 and the second tube 121 above the collecting tube 130; objects thrown off the insect-carrying platform 123a by centrifugal force fall down into the collecting tube 130, and after the trap has been used for a period of time the collecting tube 130 is detached from the second tube 121 so the objects in the collecting cavity can be poured out. In this embodiment, the cap section 111a, the trapping section 111b, the second tube 121 and the collecting tube 130 are preferably threadedly connected in sequence.
Optionally, the insect catching body 100 further includes a wiring tube 140 coaxial with the first tube 111; one end of the wiring tube 140 extends to the end of the first tube 111 away from the second tube 121, the other end extends to the insect collecting funnel 112, and a wire passing hole 112a formed in the insect collecting funnel 112 communicates with the wiring tube 140. The wiring tube 140 facilitates routing wires inside the insect catching body 100, so the wires do not easily interfere with components or obstruct the movement of insects and other objects within the body. When the first tube 111 includes the cap section 111a and the trapping section 111b, the cable 200 passes through the cable gland 400 into the cap section 111a and connects to the cable conversion plate 114; at the cable conversion plate 114 the thicker cable 200 is converted into thinner wires, which enter the wiring tube 140 and extend along it to the main control circuit board. In this embodiment, the wiring tube 140 is preferably a metal tube: a hollow long tube with a diameter of 2 mm to 10 mm (preferably 4 mm) through which wires can be threaded. The inner diameter of the wire passing hole 112a is slightly larger than the diameter of the wiring tube 140, the wiring tube is clamped and fixed in a through hole reserved at the center of the cable conversion plate 114, and the wiring tube 140 may also be made of other materials such as plastic. The main control circuit board is fixed to the side of the insect collecting funnel 112 by screws and nuts, and the infrared data acquisition circuit board 124c is fixed between the top of the shooting bin 122a and the bottom of the insect collecting funnel 112 by screws and nuts.
To better explain the insect trap provided by the present application, an application example is also provided, as follows:
The application example provides an electronic probe trap for grain pile pests.
Externally, the electronic probe trap is composed of an electronic probe body and a wireless communication device, connected by a tensile, wear-resistant cable that supplies power to the body and transmits signals.
The electronic probe body is connected to the cable by a cable gland.
The electronic probe body comprises a cap section, a trapping section, an electronic monitoring section and a pest collecting section. The cap section and the trapping section, the trapping section and the electronic monitoring section, and the electronic monitoring section and the pest collecting section are each connected by threads.
The cap section is a hollow conical structure with an opening at its top for installing the cable gland; the interior of the cap section contains a cable conversion plate, a light trapping module and a metal wiring tube.
The wiring metal tube is a hollow long tube with a diameter of 2 mm to 10 mm (preferably 4 mm), through which a wire can be threaded.
The cable conversion plate converts the thicker bus cable outside the probe into thinner wires, which are then routed through the wiring metal tube to the main control circuit board inside the electronic monitoring section.
The light trapping module uses a wavelength-adjustable surface-mount light emitting diode, which can be set to the wave band with the best trapping effect.
The light trapping module is fixed to the bottom surface of the cable conversion plate by soldering.
The wiring metal tube is clamped and fixed in the through hole reserved at the center of the cable conversion plate.
The trapping section is a circular hollow plastic tube on which a plurality of insect trapping holes with diameters not greater than 2.5 mm are distributed.
The electronic monitoring section comprises the following parts: the device comprises an insect collecting funnel, a wiring hole, a main control circuit board, a camera connector, an infrared data acquisition circuit board, a camera module, a shooting bin, an insect carrying platform and a cleaning mechanism.
The edge of the insect collecting funnel is clamped between the housings of the trapping section and the electronic monitoring section.
The middle of the insect collecting funnel is provided with a wiring hole and an insect passing hole.
The inner diameter of the wiring hole is slightly larger than the diameter of the wiring metal tube.
The wiring metal tube is inserted into the wiring hole of the insect collecting funnel to fix it in place, and the wires pass through the wiring hole to connect to the main control circuit board.
The insect passing hole has a diameter of 2 mm to 10 mm (preferably 4 mm).
The main control circuit board is fixed on the side surface of the insect collecting funnel through screws and nuts.
The infrared data acquisition circuit board is connected to the main control circuit board through pin headers and bus headers, which not only fix the boards in position but also transmit power and signals.
The infrared data acquisition circuit board is fixed between the top of the shooting bin and the bottom of the insect collecting funnel by using screws and nuts.
A white surface-mount light emitting diode is soldered to the bottom of the infrared data acquisition circuit board to supplement light in the shooting bin when the camera shoots.
The camera module is embedded in the groove at the top of the shooting bin.
The insect carrying platform is connected with the cleaning mechanism through a rotating shaft.
The cleaning mechanism is connected with the shooting bin through a screw and a nut.
The main control circuit board mainly comprises a power module, a microprocessor, a communication module, a multidimensional data acquisition module, a camera driving module and an infrared signal processing module. The power module outputs appropriate voltage and current to power the system; the microprocessor runs the processing program, drives each acquisition module and executes instructions issued by the host computer; the communication module sends information from the microprocessor to the bus and fetches instructions from the bus for the microprocessor; the multidimensional data acquisition module acquires multidimensional data in the grain pile such as carbon dioxide concentration, oxygen concentration, temperature and humidity; the camera driving module outputs an appropriate driving supply and control signals to the camera module and reads image information from it; and the infrared signal processing module collects infrared waveform data of pests and determines whether a pest has fallen in, so as to decide whether to start the camera to acquire an image.
The infrared data acquisition circuit board uses two pairs of 5 mm-diameter infrared emitting tubes and infrared receiving tubes.
Each infrared emitting tube and its receiving tube face each other on the same straight line, and the two pairs are arranged perpendicular to each other.
The insect passing hole in the insect collecting funnel is aligned with the space between the infrared emitting and receiving tubes and is fully covered by the light emitted by the infrared emitting tubes.
The cleaning mechanism comprises a rotating shaft, a cleaning mechanism housing, a wiring hole, a first reduction gear set, a driving circuit board, limit switches, a direct current motor, a second reduction gear set and a threaded rod. The rotating shaft snaps into the bottom of the insect-carrying platform and drives it to rotate; the shaft of the direct current motor, geared down through the first and second reduction gear sets, drives the rotating shaft and the threaded rod respectively, and the rotation of the threaded rod produces relative motion between the internal structure of the cleaning mechanism and its housing. The limit switches bound the travel of the internal structure: when it reaches a preset position, a limit switch is triggered and provides a control signal to the motor driver board. The power cord of the motor driver board passes through the wiring hole and connects to the infrared data acquisition circuit board. The cleaning mechanism housing consists of two parts joined by a snap fit.
The diameter and length of each portion of the electronic probe body may be varied and still realize this application example.
The diameter of the wiring metal tube may likewise be varied.
The wiring metal tube is made of metal, but other materials (such as plastic) can also realize this application example.
The bus cable uses thicker wires and therefore requires the cable conversion plate; if thinner wires are used for the bus, this application example can also be implemented without the cable conversion plate.
The light trapping module uses a wavelength-adjustable surface-mount light emitting diode, but a fixed-wavelength surface-mount diode or a through-hole diode can also realize this application example.
Fixing the wiring metal tube to the cable conversion plate with an adhesive is also possible.
The trapping section is a transparent hollow plastic tube with an inner diameter of 44 mm and an outer diameter of 50 mm; scaling its inner or outer diameter, changing its transparency or replacing its material can also realize this application example.
The insect trapping holes have a diameter of 2 mm and are angled 45° upward; scaling their diameter or changing their angle can also realize this application example.
The edge of the insect collecting funnel is fixed between the housings of the trapping section and the electronic monitoring section by clamping, but using an adhesive can also realize this application example.
The inner diameter of the wiring hole is 4.1 mm; scaling this inner diameter can also realize this application example.
The insect passing hole has a diameter of 4 mm; scaling this diameter can also realize this application example.
The main control circuit board and the insect collecting funnel are joined with screws and nuts, but an adhesive or a snap-fit or clamping connection can also realize this application example.
The infrared data acquisition circuit board and the main control circuit board are connected with pin headers and bus headers, but wire connections can also realize this application example.
The infrared data acquisition circuit board is connected to the shooting bin and the insect collecting funnel with screws and nuts, but an adhesive or a snap-fit or clamping connection can also realize this application example.
The fill-light diode used is a white surface-mount light emitting diode, but surface-mount or through-hole diodes of other colors can also realize this application example.
The cleaning mechanism is connected to the shooting bin with screws and nuts, but an adhesive or a snap-fit or clamping connection can also realize this application example.
The infrared emitting tubes and receiving tubes used have a diameter of 5 mm, but emitting and receiving tubes of other diameters (for example 3 mm) can also realize this application example.
The infrared emitting tubes, infrared receiving tubes and insect passing hole are in the relative positions shown on the infrared data acquisition circuit board, but translating or rotating the emitting and receiving tubes can also realize this application example.
The surface of the insect-carrying platform has two mutually perpendicular ribs and a cone at their intersection, but omitting the ribs or the cone, or using ribs of other shapes (such as wavy ribs), can also realize this application example.
The present application example may also be realized without using a reduction gear.
Limit switches provide the control signal bounding the movement travel, but a pressure sensor or an infrared sensor can also realize this application example.
A direct current motor is used, but a stepping motor can also realize this application example.
Based on the above embodiments of the insect trap, the cloud server and the like, the present application also provides an embodiment of a pest fall-in identification system; referring to fig. 28, the pest fall-in identification system specifically comprises the following:
the cloud server is communicatively connected with the insect trap, and is used for executing the pest fall-in identification method or the pest type identification method.
As can be seen from the above description, the pest fall-in identification system provided by the embodiment of the present application can effectively identify whether an object falling into the insect trap is a pest, improving the accuracy and reliability of pest fall-in identification. The insect trap is controlled to perform timely and effective image acquisition only when a pest is confirmed to have fallen in, which effectively reduces the number of invalid acquisitions and frequent start-ups of the image acquisition device in the insect trap, thereby lowering the trap's energy consumption and making heat less likely to accumulate.
To further explain the above embodiments, the present application also provides an electronic probe trap with an automatic cleaning function and a real-time pest monitoring system implemented with the above insect trap (i.e., the electronic probe trap) and the various methods. These relate to the field of grain depot pest monitoring, i.e., how to trap and accurately detect pests in grain piles during raw grain storage.
Existing electronic probe traps and systems for monitoring pests in grain piles have the following problems:
1. Some probe traps lack a pest cleaning mechanism and rely on manual labor to clean out pests. This increases the operational complexity of the monitoring system and raises labor costs; in particular, as the number of traps deployed in a grain bin grows, monitoring and cleaning pests becomes more difficult and time-consuming.
2. Some counting-type probe traps with a cleaning function require a negative-pressure suction pipe to be laid in the grain pile, with an external suction pump cleaning pests out of the trap through the pipe. This deployment limits the system's flexibility, and the suction pipe is easily blocked; blockages must be cleared manually, which increases labor costs. In addition, the counting method has a certain error: the suction operation is performed irregularly, the bodies of pests trapped in the probe dehydrate after death, and pests are damaged during negative-pressure suction, making it hard for grain depot custodians to check and judge the pest species and possibly affecting control.
3. Some probe traps with a cleaning function have structural design defects and cannot effectively clear live insects with strong climbing ability. For example, in designs where a stepping motor controls the turning of the insect-dropping plate, some pests with strong climbing ability cannot be cleaned off, which affects monitoring accuracy. Other probe traps use a motorized strip brush to clean pests; when cleaning live pests, the pests may cling to the brush, causing the cleaning to fail.
4. Existing electronic probe traps cannot effectively distinguish grain debris or impurities from pests, so pests may be miscounted. For electronic probe traps that use cameras, this also causes false camera start-ups; frequent start-ups generate considerable heat, which favors pest reproduction and growth in the grain pile and poses a significant fire safety hazard.
5. The dimensionality of the collected information is low. Most electronic probe traps are designed only to count trapped pests; a few also collect temperature and humidity data in the grain pile, but such data are insufficient to comprehensively reflect the state of the grain-pile ecosystem and cannot effectively predict pest occurrence.
6. The probe trap's driver program cannot be upgraded remotely. This makes program version iteration difficult: when the trap's functionality or performance needs improvement, its driver cannot be updated remotely, and a dedicated person or technical team must perform an on-site upgrade, which consumes human and material resources and increases maintenance and upgrade costs.
To solve the above problems, in the electronic probe trap provided in this application example, the functions of the main control circuit board are shown in fig. 29. When a pest or grain debris falls through the insect passing hole, the infrared data acquisition circuit board obtains a continuous voltage sequence by continuously monitoring the voltage of the infrared receiving tubes; fig. 30 is an example of a pest voltage sequence, and fig. 31 is an example of a grain debris voltage sequence.
The microcontroller of the system transmits the acquired voltage sequence data to the cloud server through the communication module. At the cloud server, a machine learning algorithm is invoked to classify the waveform in two steps: first, determine whether the waveform was triggered by a pest or by grain debris; then, if it was triggered by a pest, classify the pest waveform and give the probabilities of it belonging to particular pest species.
The flow of the pest fall-in identification algorithm is shown in fig. 32: the raw data first undergo data preprocessing and are then passed through a feature extraction and classification network to classify the waveform data.
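A minimal Python sketch of this preprocessing and global-feature step, following the operations described in the claims below (shifting each channel down so its minimum is 0, selecting valid sampling points above a preset effective threshold to form reaction regions, and taking the maximum-length reaction region); the threshold value and the particular features returned are illustrative assumptions, not the patent's exact parameters:

import numpy as np

def preprocess_channel(seq, valid_threshold=0.05):
    # Shift the waveform down so its minimum is 0, then mark the
    # valid sampling points; valid_threshold is an assumed value.
    shifted = np.asarray(seq, dtype=float)
    shifted = shifted - shifted.min()
    valid = shifted > valid_threshold
    return shifted, valid

def max_reaction_len(valid):
    # Length of the longest run of consecutive valid points
    # (the "maximum length reaction zone").
    best = cur = 0
    for v in valid:
        cur = cur + 1 if v else 0
        best = max(best, cur)
    return best

def global_features(ch1, ch2, valid_threshold=0.05):
    # Illustrative global features per channel: number of valid points,
    # longest reaction region, peak amplitude.
    feats = []
    for ch in (ch1, ch2):
        shifted, valid = preprocess_channel(ch, valid_threshold)
        feats.extend([valid.sum(), max_reaction_len(valid), shifted.max()])
    return np.array(feats, dtype=float)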
First, the raw data are collected and labeled. The real-bin data come from a real-bin data acquisition experiment carried out in 2021 in flat warehouse No. 12 of the modern logistics center depot in Lu'an, Anhui. In the experiment, a monitoring system consisting of 51 electronic probe traps was installed. During the experiment, the actual number of pests trapped in each trap's collector and the number of detections reported by the trap were manually checked and recorded eight times in total, covering the periods July 24-27, July 27-30, July 30-August 3, August 5-6, August 6-9, August 9-10, August 10-11 and August 11-13. In addition, because the real-bin experiment was located in the medium-temperature, high-humidity grain storage region (one of China's seven grain storage ecological regions) and took place in summer, a large number of booklice were present in the warehouse; booklice, however, lie outside the trap's effective monitoring range. They were therefore taken into account when collecting and labeling the raw data.
The raw data statistics obtained during the experiment are shown in table 3.
Table 3
When manually labeling the raw data, the waveforms collected when the trap catches a pest are distinguished from the waveforms collected when the trap is falsely triggered; typical comparisons of several data types with normal pest data are shown in fig. 33 and figs. 34(a) to 34(d).
Based on the real-bin data acquisition conditions of the electronic probe trap, during manual labeling the normal pest detection waveforms with obvious characteristics are classified as class 1, the waveforms triggered by booklice (fig. 34(a)) as class 2, and the double waveforms generated when a pest triggers the sensor repeatedly (fig. 34(b)) as class 3. The falsely triggered waveforms of the remaining cases (fig. 34(c), fig. 34(d)) are classified as class 4, because they are extreme cases with small data volumes or, as in fig. 34(d), show no obvious regularity.
Data screening and manual labeling were carried out under these classification rules; the profile of the constructed real-bin pest detection waveform dataset is shown in Table 4.
Table 4
Waveform class    Labeled data volume
1                 9891
2                 7712
3                 18502
4                 1339
Model training process: the waveform feature fusion classification network is trained on the real-bin detection dataset obtained from the real-bin data acquisition experiment; the training environment is shown in Table 5.
Table 5
During training, the parameters of the soft-min layer are first initialized: a sliding window is applied to the raw data in the dataset to obtain subsequences with the same length as each block in the soft-min layer, a k-means clustering algorithm is applied to the subsequences, and the resulting cluster centers are taken as the initial parameters of the soft-min layer. The dataset is then divided into a training set and a validation set at a ratio of 7:3, and the network is trained on the training set with a learning rate of 0.01 for 30 epochs.
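The initialization just described can be sketched in Python as follows; block_len and n_shapelets stand in for each soft-min block's actual size, which the text does not specify:

import numpy as np
from sklearn.cluster import KMeans

def init_softmin_block(sequences, block_len, n_shapelets):
    # Collect every sliding-window subsequence of length block_len,
    # cluster them with k-means, and return the cluster centers as the
    # block's initial parameters.
    windows = [seq[i:i + block_len]
               for seq in sequences
               for i in range(len(seq) - block_len + 1)]
    km = KMeans(n_clusters=n_shapelets, n_init=10).fit(np.stack(windows))
    return km.cluster_centers_   # shape: (n_shapelets, block_len)

The returned centers would be loaded as the initial parameters of one block of the soft-min layer before the 30-epoch training run described above.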
After the identification algorithm classifies a pest waveform, probabilities for the four classes are obtained, and the class with the highest probability is taken as the identification result; class 1 is considered a normal waveform, while classes 2, 3 and 4 are considered abnormal waveforms. Pest species identification is then performed only for class-1 waveforms.
In this application example, the flow of the pest species identification algorithm is as follows:
Pest species identification dataset establishment: the data are waveforms generated when the electronic probe trap actually detects 8 common stored-grain pest species; the pest data were collected by manually dropping pests into the trap. The data quantity for each pest category in the dataset is shown in Table 6.
Table 6
Pest species                 Data quantity
Flat grain beetle            871
Turkish flat grain beetle    885
Sawtoothed grain beetle      912
Red flour beetle             785
Confused flour beetle        1122
Lesser grain borer           984
Rice weevil                  876
Maize weevil                 865
After the raw data are converted into two-dimensional multi-channel optimized recursion plots, the image size is (112, 112, 3). When training the InceptionV3 network, the loss function is the cross entropy loss; the optimizer is Adam; the initial learning rate is 0.001, decayed by 10% every 5000 iterations; the input batch_size is 16; and the training-to-test set ratio is 8:2. The equipment and environment used for network training are shown in Table 7.
Table 7
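Since the text does not name the training framework, the following PyTorch sketch is only one way to realize the stated configuration (cross entropy loss, Adam, initial learning rate 0.001 decayed by 10% every 5000 iterations, batch size 16); reading "reduced 10%" as multiplying the learning rate by 0.9 is an assumption:

import torch.nn as nn
from torch.optim import Adam
from torch.optim.lr_scheduler import StepLR
from torchvision.models import inception_v3

# 8 output classes per Table 6 (the later merge of the two flour-beetle
# species would reduce this to 7).
model = inception_v3(weights=None, num_classes=8, aux_logits=False)
criterion = nn.CrossEntropyLoss()
optimizer = Adam(model.parameters(), lr=0.001)
scheduler = StepLR(optimizer, step_size=5000, gamma=0.9)  # -10% per 5000 steps

def train_step(images, labels):
    # images: (16, 3, 112, 112) recursion-plot images; labels: (16,)
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    scheduler.step()   # per-iteration decay
    return loss.item()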
In addition, to address overfitting during network training, manual data cleaning was first performed on the pest species identification dataset, removing data that differ excessively from the rest of their species. Taking the flat grain beetle (CF) as an example, retained data and cleaned-out data are shown in figs. 35 and 36, respectively.
After data cleaning, the data in the dataset are augmented in the following two ways (a code sketch follows the list):
(1) Shift the original data left or right by 5 units, filling the vacated sampling points on either side with the sequence endpoint values, to obtain new data.
(2) Flip (swap) the two channels of the original data to obtain a new set of data.
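A hedged sketch of the two augmentations; in mode (2), "flipping" is read here as swapping the two channels, though reversing each channel in time is another plausible reading:

import numpy as np

def shift_pad(seq, shift=5):
    # Mode (1): shift left or right by `shift` samples and fill the
    # vacated positions with the sequence's endpoint value.
    seq = np.asarray(seq)
    left = np.concatenate([seq[shift:], np.full(shift, seq[-1])])
    right = np.concatenate([np.full(shift, seq[0]), seq[:-shift]])
    return left, right

def flip_channels(two_ch):
    # Mode (2): swap the two channels of a (2, length) sample.
    return np.asarray(two_ch)[::-1].copy()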
In addition, the red flour beetle (TC) and the confused flour beetle (TCH) among the pests to be classified are too similar in appearance, and because the pests are detected with an infrared photoelectric sensor, the underlying resolution is not high. Therefore, to further reduce overfitting of the classification network and reduce network complexity, the present application merges these two species into a single flour-beetle class for network training.
In summary, this application example provides a new counting algorithm flow for actual stored-grain pests based on the electronic probe trap: waveform classification is first performed on the raw data detected by the trap, specifically:
(1) A waveform dataset acquired by traps in real bins is established, and the raw data are manually divided into 4 typical waveform classes according to how they were generated.
(2) A trap-detection-waveform feature extraction and classification network fusing global and local features is designed, realizing classification of the waveform data in the dataset.
The counting algorithm then counts the number of each type of waveform collected by the trap and predicts the number of pests trapped by the trap using linear regression.
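A minimal sketch of that regression step; the count matrix and pest totals below are placeholder values, not the experiment's data:

import numpy as np
from sklearn.linear_model import LinearRegression

# Each row: counts of class-1..class-4 waveforms one trap collected in a
# checked period; y: pests actually found in that trap's collector.
X = np.array([[12, 3, 5, 1],
              [30, 8, 9, 2],
              [ 7, 1, 2, 0]])
y = np.array([15, 36, 8])

reg = LinearRegression().fit(X, y)
estimate = reg.predict([[20, 4, 6, 1]])   # predicted trapped-pest count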
The application example also provides a stored-grain pest species identification method based on the electronic probe trap: the raw pest data acquired by the trap are first converted into two-dimensional images using an optimized multi-channel recursion plot algorithm, and the converted two-dimensional images are then feature-extracted and classified with the InceptionV3 network, a classical convolutional neural network structure, realizing species identification of 8 common stored-grain pests.
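The text does not disclose the exact "optimized" recursion-plot construction, so the following sketch uses the generic unthresholded recurrence plot and an assumed three-channel layout (one plot per channel plus a cross-channel plot) to produce the (112, 112, 3) images mentioned above:

import numpy as np

def recurrence_plot(seq, size=112):
    # Generic unthresholded recurrence plot: R[i, j] = |x_i - x_j|,
    # with the sequence resampled to `size` points.
    idx = np.linspace(0, len(seq) - 1, size).astype(int)
    x = np.asarray(seq, dtype=float)[idx]
    return np.abs(x[:, None] - x[None, :])

def to_image(ch1, ch2, size=112):
    # Assumed channel layout: plot(ch1), plot(ch2), cross-plot(ch1, ch2).
    a = np.asarray(ch1, float)[np.linspace(0, len(ch1) - 1, size).astype(int)]
    b = np.asarray(ch2, float)[np.linspace(0, len(ch2) - 1, size).astype(int)]
    img = np.stack([recurrence_plot(ch1, size),
                    recurrence_plot(ch2, size),
                    np.abs(a[:, None] - b[None, :])], axis=-1)
    return img / (img.max() + 1e-8)   # normalized (112, 112, 3) image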
By analyzing the infrared waveforms triggered by pests, this application example can monitor pest occurrence and identify pest species, realizing effective real-time monitoring of stored-grain pests. In addition, the system records historical data and generates reports, providing key information for pest management; it can thus play an important role in monitoring and controlling stored-grain pests.
In addition, in the embodiments and application examples mentioned above in the present application, the following alternatives are also possible:
1. At the data acquisition end, the number of infrared photodiode pairs in the trap can be increased, for example from the existing two pairs to three or four pairs, raising the dimensionality of the raw pest data acquired by the trap; pest counting and species identification can then still be realized following the approach of the present invention.
2. In classifying the waveforms of the data the trap collects in real bins, waveform classification can also be realized with different time-series feature extraction methods (such as wavelet analysis, transform-based methods or LSTM) or with different classifiers (such as SVM, XGBoost or KNN). In addition, when regressing the counts of the different waveform classes, different regression algorithms such as logistic regression or ridge regression can be used.
3. In identifying the species of insects trapped by the trap, the one-dimensional raw time-series data acquired by the trap can be converted into two-dimensional images by different methods, such as the Gramian angular field (GAF) method, the Markov transition field method, the graph difference method or the relative position matrix method. The two-dimensional images can then be feature-extracted and classified with different deep learning networks, such as ResNet, VGG or DenseNet.
Those of ordinary skill in the art will appreciate that the various illustrative components, systems and methods described in connection with the embodiments disclosed herein can be implemented as hardware, software, or a combination of both. Whether a particular implementation is hardware or software depends on the specific application of the technical solution and its design constraints. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application. When implemented in hardware, it may be, for example, an electronic circuit, an application-specific integrated circuit (ASIC), suitable firmware, a plug-in, a function card, or the like. When implemented in software, the elements of the present application are the programs or code segments used to perform the required tasks. The program or code segments may be stored in a machine-readable medium or transmitted over a transmission medium or communication link by a data signal carried in a carrier wave.
It should be clear that the present application is not limited to the particular arrangements and processes described above and illustrated in the drawings. For the sake of brevity, a detailed description of known methods is omitted here. In the above embodiments, several specific steps are described and shown as examples. However, the method processes of the present application are not limited to the specific steps described and illustrated, and those skilled in the art can make various changes, modifications, and additions, or change the order between steps, after appreciating the spirit of the present application.
The features described and/or illustrated in this application for one embodiment may be used in the same way or in a similar way in one or more other embodiments and/or in combination with or instead of the features of the other embodiments.
The foregoing description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and variations may be made to the embodiment of the present application by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present application should be included in the protection scope of the present application.

Claims (33)

1. A pest drop-in identification method, comprising:
receiving target infrared data, acquired from the insect trap, containing a two-channel discrete time sequence;
and acquiring a waveform identification type corresponding to the two-channel discrete time sequence in the target infrared data based on a preset time sequence classification model, and, if the waveform identification type is a pest detection waveform, confirming that a pest currently falls into the pest trap and controlling the pest trap to acquire images of the pest falling into it.
2. The pest drop-in identification method according to claim 1, wherein the acquiring of the waveform identification type corresponding to the two-channel discrete time sequence in the target infrared data based on the preset time sequence classification model includes:
global feature extraction is carried out on the two-channel discrete time sequence in the target infrared data so as to obtain global features of the target infrared data;
inputting the two-channel discrete time sequence and the global features into a preset time sequence classification model, so that the time sequence classification model extracts local features corresponding to the two-channel discrete time sequence and fuses the local features with the global features to obtain waveform type identification result data of the target infrared data, wherein the waveform type identification result data comprises: probabilities of different waveform identification types including a pest detection waveform and at least one non-pest detection waveform.
3. The pest drop-in identification method according to claim 2, wherein the performing global feature extraction on the two-channel discrete time sequence in the target infrared data to obtain global features of the target infrared data comprises:
respectively carrying out waveform downward translation processing with a minimum value of 0 on the two-channel discrete time sequences in the target infrared data so as to obtain two preprocessed time sequences corresponding to the target infrared data;
based on a preset effective threshold value, effective sampling points are respectively selected from the two preprocessed time sequences to form reaction areas corresponding to the two preprocessed time sequences respectively;
and determining global features of the target infrared data according to the corresponding reaction areas of the two preprocessed time sequences.
4. The pest drop-in identification method according to claim 1, further comprising, before the acquiring of the waveform identification type corresponding to the two-channel discrete time sequence in the target infrared data based on the preset time sequence classification model:
acquiring each historical infrared data respectively comprising different two-channel discrete time sequences and each corresponding waveform identification type label, wherein the waveform identification type labels comprise labels corresponding to pest detection waveforms and at least one non-pest detection waveform;
global feature extraction is carried out on the two-channel discrete time sequences in each historical infrared data so as to obtain global features corresponding to each historical infrared data;
based on each historical infrared data, the corresponding global feature and the waveform identification type label, training a preset feature extraction and classification network by a learnable time sequence classification algorithm so as to train the feature extraction and classification network into a time sequence classification model for identifying the waveform identification type of the infrared data.
5. The pest drop-in identification method according to claim 4, wherein the global feature extraction of the two-channel discrete time sequence in each of the historical infrared data to obtain the global feature corresponding to each of the historical infrared data comprises:
performing downward translation processing on the waveform with the minimum value of 0 on the two-channel discrete time sequences in each historical infrared data respectively to obtain two preprocessed time sequences corresponding to each historical infrared data;
based on a preset effective threshold value, effective sampling points are respectively selected from the two preprocessed time sequences corresponding to the historical infrared data respectively so as to form reaction areas of the two preprocessed time sequences corresponding to the historical infrared data respectively;
acquiring at least one sub-reaction zone of each pre-processed time sequence, and selecting the one with the largest number of effective sampling points from the sub-reaction zones corresponding to each pre-processed time sequence as the maximum length reaction zone of that pre-processed time sequence;
and determining global features of the historical infrared data according to the maximum length reaction regions of the two preprocessed time sequences corresponding to the historical infrared data.
6. The pest drop-in identification method according to claim 4, wherein the feature extraction and classification network comprises:
the input layer is used for receiving the two-channel discrete time sequence, the corresponding global feature and the waveform identification type tag in the historical infrared data;
the local feature extraction layer is used for respectively carrying out local feature extraction on the two-channel discrete time sequences in the historical infrared data based on a learnable time sequence classification algorithm so as to obtain local features corresponding to the two-channel discrete time sequences, wherein the local feature extraction layer comprises a plurality of feature extraction blocks;
and the feature fusion layer is used for carrying out activation function calculation, feature fusion and probability conversion processing on the global features and the local features corresponding to the historical infrared data so as to obtain probabilities of different waveform identification types corresponding to the historical infrared data, and calculating the loss value of the probabilities of the waveform identification types of the historical infrared data and the different waveform identification types corresponding to the historical infrared data based on a cross entropy loss function.
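One plausible reading of the "learnable time-series classification algorithm" in claim 6 is a learnable-shapelet layer; the PyTorch sketch below follows that reading, with shapelet counts, lengths and layer sizes chosen arbitrarily for illustration:

```python
import torch
from torch import nn

class ShapeletBlock(nn.Module):
    """One feature extraction block: a bank of learnable subsequences
    (shapelets). Its output is the minimum sliding-window distance from
    the input series to each shapelet."""
    def __init__(self, n_shapelets, length):
        super().__init__()
        self.shapelets = nn.Parameter(torch.randn(n_shapelets, length))

    def forward(self, x):                            # x: (batch, time)
        windows = x.unfold(1, self.shapelets.shape[1], 1)        # (B, W, L)
        banks = self.shapelets.unsqueeze(0).expand(x.size(0), -1, -1)
        return torch.cdist(windows, banks).min(dim=1).values     # (B, K)

class PestWaveformNet(nn.Module):
    def __init__(self, n_global, n_classes):
        super().__init__()
        # Three blocks with different shapelet lengths, shared by both channels.
        self.blocks = nn.ModuleList(ShapeletBlock(8, L) for L in (16, 32, 64))
        self.fuse = nn.Linear(2 * 3 * 8 + n_global, n_classes)

    def forward(self, x, g):                 # x: (B, 2, T), g: (B, n_global)
        local = [blk(x[:, c]) for c in range(2) for blk in self.blocks]
        z = torch.relu(torch.cat(local + [g], dim=1))   # activation + fusion
        return self.fuse(z)                  # class logits

net = PestWaveformNet(n_global=6, n_classes=4)
logits = net(torch.randn(5, 2, 256), torch.randn(5, 6))
probs = torch.softmax(logits, dim=1)         # probability conversion
loss = nn.CrossEntropyLoss()(logits, torch.randint(0, 4, (5,)))
```

Returning logits and applying softmax separately keeps the probability conversion of the claim while letting the standard cross-entropy loss operate on logits.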
7. The pest drop-in identification method of claim 6, further comprising, before the training of the preset feature extraction and classification network with a learnable time-series classification algorithm based on each item of historical infrared data, the corresponding global features and the waveform identification type labels:
acquiring, from each item of historical infrared data by a sliding-window method, subsequences of the same length as each feature extraction block in the local feature extraction layer;
and obtaining the cluster centres of the subsequences by a clustering algorithm, and using the cluster centres as the initial parameters of the local feature extraction layer, so as to complete the initialization of the local feature extraction layer.
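Claim 7's initialization maps naturally onto k-means over sliding windows; the sketch below uses scikit-learn, with window length and cluster count as assumptions:

```python
import numpy as np
from sklearn.cluster import KMeans

def init_block_shapelets(series_list, length, n_shapelets):
    """Slide a window of the block's length over every historical sequence,
    cluster the subsequences, and return the cluster centres as initial
    shapelets for that feature extraction block."""
    subseqs = np.array([s[i:i + length]
                        for s in series_list
                        for i in range(len(s) - length + 1)])
    return KMeans(n_clusters=n_shapelets, n_init=10).fit(subseqs).cluster_centers_

# Toy historical single-channel sequences; all sizes are assumptions.
hist = [np.random.rand(256) for _ in range(20)]
init = init_block_shapelets(hist, length=32, n_shapelets=8)   # (8, 32)
```

The returned centres could then be loaded into the corresponding block's learnable parameters (for example the ShapeletBlock sketched under claim 6) before training begins.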
8. The pest drop-in identification method according to claim 1 or 4, wherein the time-series classification model comprises:
an input layer for receiving the two-channel discrete time sequence in the target infrared data and the corresponding global features;
a local feature extraction layer for performing local feature extraction on the two-channel discrete time sequence in the target infrared data based on a learnable time-series classification algorithm, to obtain the local features corresponding to the two-channel discrete time sequence;
and a feature fusion layer for performing activation function calculation, feature fusion and probability conversion on the global features and local features corresponding to the target infrared data, to obtain the probabilities of the different waveform identification types.
9. The pest drop-in identification method of claim 2, wherein the non-pest detection waveform comprises: a booklouse trigger waveform, a repeated pest trigger waveform and a false trigger waveform.
10. A pest type identification method, characterized by comprising:
if the pest drop-in identification method according to any one of claims 1 to 9 yields a waveform identification type that is a pest detection waveform, so that a pest is confirmed to have currently fallen into the insect trap and the insect trap is controlled to capture an image of that pest, acquiring pest type identification result data for the insect trap based on a preset pest type identification model.
11. The pest type identification method according to claim 10, wherein the acquiring pest type identification result data for the insect trap based on a preset pest type identification model comprises:
receiving live-action image data, collected by the insect trap, of the pest that has fallen into the insect trap;
and inputting the live-action image data into the preset pest type identification model, so that the model outputs the pest type identification result data for the live-action image data, wherein the pest type identification model comprises a first convolutional neural network.
12. The pest type identification method according to claim 10, wherein the acquiring pest type identification result data for the insect trap based on a preset pest type identification model comprises:
converting the two-channel discrete time sequence in the target infrared data into a two-dimensional recurrence image based on a preset recurrence plot algorithm;
and inputting the two-dimensional recurrence image into the preset pest type identification model, so that the model outputs the pest type identification result data for the two-dimensional recurrence image, wherein the pest type identification model comprises a second convolutional neural network.
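A minimal recurrence-plot conversion consistent with claim 12; the distance measure and the threshold rule are assumptions, since the claim only names a recurrence plot algorithm:

```python
import numpy as np

def recurrence_plot(x, eps=None):
    """Binary recurrence image: R[i, j] = 1 when |x[i] - x[j]| <= eps."""
    x = np.asarray(x, dtype=float)
    d = np.abs(x[:, None] - x[None, :])
    if eps is None:
        eps = 0.1 * d.max()                 # assumed default threshold
    return (d <= eps).astype(np.uint8)

# One image per channel; stacking both yields a 2-channel CNN input.
chan_a, chan_b = np.random.rand(128), np.random.rand(128)
image = np.stack([recurrence_plot(chan_a), recurrence_plot(chan_b)])  # (2, 128, 128)
```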
13. The pest type identification method of claim 10, further comprising:
if the pest type identification model comprises a first convolutional neural network and a second convolutional neural network, wherein the first convolutional neural network identifies the pest type identification result data corresponding to the live-action image data of the pest that has fallen into the insect trap, and the second convolutional neural network identifies the pest type identification result data corresponding to the two-dimensional recurrence image converted from the two-channel discrete time sequence in the target infrared data, iteratively optimizing the second convolutional neural network based on the pest type identification result data, output by the first convolutional neural network, for the live-action image data.
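The iterative optimization of claim 13 reads like pseudo-label distillation: predictions of the image network become soft targets for the recurrence-image network. The sketch below assumes that reading, plus toy stand-in networks and a KL-divergence loss:

```python
import torch
from torch import nn

def distill_step(cnn_image, cnn_recurrence, live_images, rec_images, opt):
    """One optimization step: the image network's prediction on the live
    photo becomes the soft target for the recurrence-image network on the
    matching infrared sample."""
    cnn_image.eval()
    with torch.no_grad():
        target = cnn_image(live_images).softmax(dim=1)   # soft pseudo-labels
    cnn_recurrence.train()
    opt.zero_grad()
    log_probs = cnn_recurrence(rec_images).log_softmax(dim=1)
    loss = nn.KLDivLoss(reduction="batchmean")(log_probs, target)
    loss.backward()
    opt.step()
    return loss.item()

# Toy stand-ins: both networks output logits over 8 pest classes.
cnn_image = nn.Sequential(nn.Flatten(), nn.Linear(3 * 64 * 64, 8))
cnn_recurrence = nn.Sequential(nn.Flatten(), nn.Linear(2 * 128 * 128, 8))
opt = torch.optim.Adam(cnn_recurrence.parameters(), lr=1e-4)
distill_step(cnn_image, cnn_recurrence,
             torch.randn(4, 3, 64, 64), torch.randn(4, 2, 128, 128), opt)
```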
14. The pest type identification method according to claim 12 or 13, wherein the second convolutional neural network is an InceptionV3 network.
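Assuming the torchvision implementation (the claim only names the architecture), adapting InceptionV3 to the eight pest classes of claim 15 might look like this:

```python
import torch
from torchvision.models import inception_v3

# Untrained InceptionV3 with an 8-class head; it expects 299x299 RGB input.
net = inception_v3(weights=None, num_classes=8, init_weights=True)
net.eval()
with torch.no_grad():
    probs = net(torch.randn(1, 3, 299, 299)).softmax(dim=1)  # (1, 8)
```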
15. The pest type identification method according to any one of claims 10 to 13, wherein the pest type identification result data comprises a probability for each of the different pest identification types;
wherein the pest identification types include: the long-headed flat grain beetle, the Turkish flat grain beetle, the saw-toothed grain beetle, the red flour beetle, the confused flour beetle, the lesser grain borer, the rice weevil and the maize weevil.
16. A pest drop-in identification device, characterized by comprising:
an infrared data receiving module for receiving target infrared data, collected from the insect trap, that contains a two-channel discrete time sequence;
and a pest drop-in identification and image acquisition control module for acquiring, based on a preset time-series classification model, the waveform identification type corresponding to the two-channel discrete time sequence in the target infrared data, and, if the waveform identification type is a pest detection waveform, confirming that a pest has currently fallen into the insect trap and controlling the insect trap to capture an image of that pest.
17. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the pest drop-in identification method of any one of claims 1 to 9 or the pest type identification method of any one of claims 10 to 15.
18. A computer-readable storage medium on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the pest drop-in identification method of any one of claims 1 to 9 or the pest type identification method of any one of claims 10 to 15.
19. An insect trap, characterized in that the insect trap is configured to collect target infrared data containing a two-channel discrete time sequence and send the target infrared data to a cloud server, so that the cloud server executes the pest drop-in identification method of any one of claims 1 to 9;
the insect trap is further configured to capture an image of the pest that has fallen into it when or after it receives an image acquisition control instruction from the cloud server.
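The claims do not specify the uplink protocol; purely as an assumed illustration, a JSON-over-HTTP exchange between trap and cloud server could look like this, with the endpoint and payload schema being hypothetical:

```python
import json
import urllib.request

CLOUD = "https://cloud.example.com/api"     # hypothetical endpoint

def upload_infrared(channel_a, channel_b):
    """Send one two-channel sample; the reply may ask the trap to take a photo."""
    body = json.dumps({"channel_a": channel_a, "channel_b": channel_b}).encode()
    req = urllib.request.Request(f"{CLOUD}/infrared", data=body,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)              # e.g. {"capture_image": true}

# reply = upload_infrared([0.1, 0.8, 0.2], [0.0, 0.7, 0.1])
# if reply.get("capture_image"):
#     ...  # trigger the camera assembly, then upload the photo
```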
20. The insect trap of claim 19, comprising an insect trap body, wherein the insect trap body comprises a control module, a trapping module and a monitoring module;
the trapping module comprises a first tube and an insect collecting funnel arranged in the first tube, wherein the insect collecting funnel and the first tube are coaxially arranged, one end of the insect collecting funnel in a first direction is a first end and the other end is a second end, the diameter of the insect collecting funnel tapers from the first end to the second end, and an insect passing hole is formed in the second end;
the monitoring module comprises a second tube, and an infrared data acquisition assembly and an insect photographing assembly both mounted in the second tube, wherein the second tube and the first tube are coaxially arranged and connected to each other, and the infrared data acquisition assembly and the insect photographing assembly are both in communication connection with the control module; the infrared data acquisition assembly is configured to collect, according to an instruction of the control module, the target infrared data containing the two-channel discrete time sequence inside the insect trap, and the insect photographing assembly is configured to capture, according to an instruction of the control module, images of pests that have fallen into the insect trap.
21. The insect trap of claim 20, wherein the infrared data acquisition assembly comprises a first infrared emitting tube and a first infrared receiving tube, the insect passing hole is disposed close to the infrared data acquisition assembly, the projection of the insect passing hole onto a first plane lies within the range in which the first infrared receiving tube can receive light emitted by the first infrared emitting tube, the first plane is perpendicular to the first direction, the axes of the first infrared emitting tube and the first infrared receiving tube both lie in the first plane, and the first direction is the axial direction of the first tube;
the infrared data acquisition assembly further comprises a second infrared emitting tube and a second infrared receiving tube which are coaxially arranged, the axes of the second infrared emitting tube and the second infrared receiving tube both lie in the first plane, the first infrared emitting tube and the first infrared receiving tube are coaxially arranged, the axis of the first infrared receiving tube is a first axis and the axis of the second infrared receiving tube is a second axis, the intersection of the first axis and the second axis lies at the centre of the projection of the insect passing hole onto the first plane, and the projection of the insect passing hole onto the first plane lies within the range in which the second infrared receiving tube can receive light emitted by the second infrared emitting tube.
22. The insect trap of claim 21, wherein the first axis is perpendicular to the second axis.
23. The insect trap of claim 22, wherein the distance between the first infrared emitting tube and the projection of the insect passing hole onto the first plane is greater than the distance between the first infrared receiving tube and that projection, and the distance between the second infrared emitting tube and the projection of the insect passing hole onto the first plane is greater than the distance between the second infrared receiving tube and that projection.
24. The insect trap of claim 20, wherein the insect photographing assembly comprises a camera and a photographing barrel, a photographing chamber is formed in the photographing barrel, a chamber inlet is formed at one end of the photographing barrel in the first direction and a chamber outlet at the other end, the insect passing hole communicates with the chamber inlet, the infrared data acquisition assembly is located between the insect collecting funnel and the chamber inlet in the first direction, and the camera is disposed close to the chamber inlet, fixed on the inner wall of the photographing chamber, and in communication connection with the control module so as to photograph into the photographing chamber.
25. The insect trap of claim 24, wherein the diameter of the photographing barrel tapers from the chamber inlet to the chamber outlet.
26. The insect trap of claim 24, wherein the monitoring module further comprises a cleaning assembly mounted in the second tube, the cleaning assembly comprises a housing, an insect-carrying platform and a driving member, the housing is connected to the photographing barrel, the driving member is mounted on the housing, the insect-carrying platform is mounted on the driving member, and the driving member drives the insect-carrying platform to reciprocate between a first position, in which the insect-carrying platform mates with the chamber outlet, and a second position, in which a gap is formed between the insect-carrying platform and the chamber outlet, the driving member being in communication connection with the control module.
27. The insect trap of claim 26, wherein the driving member comprises a driving member body and a first output shaft mounted on the driving member body, the driving member is a rotary driving member, the first output shaft is coaxial with the second tube, the cleaning assembly further comprises a threaded rod mounted on the first output shaft, a threaded hole is formed in the housing, and the threaded rod mates with the threaded hole.
28. The insect trap of claim 27, wherein the cleaning assembly further comprises a trigger piece, and a first limit switch and a second limit switch both mounted on the housing, the first limit switch and the second limit switch are both in communication connection with the control module, the trigger piece is mounted on the driving member body and moves with the insect-carrying platform, such that the trigger piece contacts the first limit switch when the insect-carrying platform is in the first position, and contacts the second limit switch when the insect-carrying platform is in the second position.
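The limit switches of claim 28 suggest the usual end-stop control loop: run the drive until the trigger piece closes the switch for the requested position, with a timeout as a fail-safe. The sketch below is an assumption throughout; the Motor and LimitSwitch stubs stand in for whatever driver the control module actually uses:

```python
import time

class Motor:                      # stub for the rotary driving member
    def run(self, direction): print(f"motor running: {direction}")
    def stop(self): print("motor stopped")

class LimitSwitch:                # stub: closes 50 ms after the move starts
    def __init__(self): self._t = time.monotonic() + 0.05
    def is_pressed(self): return time.monotonic() >= self._t

def move_platform(motor, switch, direction, timeout_s=10.0):
    """Drive the insect-carrying platform until its trigger piece closes the
    limit switch for the requested end position; stop on a timeout fail-safe."""
    motor.run(direction)
    deadline = time.monotonic() + timeout_s
    while not switch.is_pressed() and time.monotonic() < deadline:
        time.sleep(0.005)
    motor.stop()

move_platform(Motor(), LimitSwitch(), "open")   # toward the second position
```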
29. The insect trap of claim 27, wherein the driving member further comprises a second output shaft, the first output shaft and the second output shaft are coaxial and located at opposite ends of the driving member body in the first direction, and the insect-carrying platform is fixed on the second output shaft.
30. The insect trap of claim 29, wherein the insect-carrying platform comprises a top surface facing the photographing chamber, a first rib and a second rib are formed on the top surface, the length directions of the first rib and the second rib are parallel to the top surface, the first rib and the second rib intersect perpendicularly, and a raised cone is further formed at the intersection of the first rib and the second rib.
31. The insect trap of claim 29, wherein the insect trap body further comprises a collection tube coaxially arranged with the first tube, the collection tube is removably connected to the end of the second tube remote from the first tube, and a collection chamber communicating with the first tube is formed in the collection tube.
32. The insect trap of any one of claims 20 to 31, wherein the insect trap body further comprises a routing tube coaxially arranged with the first tube, one end of the routing tube extends to the end of the first tube remote from the second tube, the other end of the routing tube extends to the insect collecting funnel, a wire passing hole is formed in the insect collecting funnel, and the routing tube communicates with the wire passing hole.
33. A pest drop-in identification system, characterized by comprising: a cloud server and an insect trap according to any one of claims 20 to 32, in communication with each other;
the cloud server is configured to execute the pest drop-in identification method of any one of claims 1 to 9 or the pest type identification method of any one of claims 10 to 15.
CN202311316016.9A 2023-10-11 2023-10-11 Insect pest falling-in identification method, type identification method, device, insect trap and system Active CN117290762B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202410281131.5A CN118077660A (en) 2023-10-11 2023-10-11 Insect trap and system
CN202311316016.9A CN117290762B (en) 2023-10-11 2023-10-11 Insect pest falling-in identification method, type identification method, device, insect trap and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311316016.9A CN117290762B (en) 2023-10-11 2023-10-11 Insect pest falling-in identification method, type identification method, device, insect trap and system

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202410281131.5A Division CN118077660A (en) 2023-10-11 2023-10-11 Insect trap and system

Publications (2)

Publication Number Publication Date
CN117290762A 2023-12-26
CN117290762B CN117290762B (en) 2024-04-02

Family

ID=89258458

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202410281131.5A Pending CN118077660A (en) 2023-10-11 2023-10-11 Insect trap and system
CN202311316016.9A Active CN117290762B (en) 2023-10-11 2023-10-11 Insect pest falling-in identification method, type identification method, device, insect trap and system

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202410281131.5A Pending CN118077660A (en) 2023-10-11 2023-10-11 Insect trap and system

Country Status (1)

Country Link
CN (2) CN118077660A (en)

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006056032A1 (en) * 2004-11-26 2006-06-01 Panov, Ivan, Yordanov Methods and devices for pest control
CN102915446A (en) * 2012-09-20 2013-02-06 复旦大学 Plant disease and pest detection method based on SVM (support vector machine) learning
WO2016045002A1 (en) * 2014-09-24 2016-03-31 上海星让实业有限公司 Smart imaging system and insect-trapping apparatus provided with same
KR20180057851A (en) * 2016-11-23 2018-05-31 이선희 The pest recognition system, and method for controlling pest using the same
US20200111335A1 (en) * 2018-10-04 2020-04-09 9138-4529 Québec Inc. Infrared motion sensing device and method
CN111814866A (en) * 2020-07-02 2020-10-23 深圳市万物云科技有限公司 Disease and pest early warning method and device, computer equipment and storage medium
CN111931581A (en) * 2020-07-10 2020-11-13 威海精讯畅通电子科技有限公司 Agricultural pest identification method based on convolutional neural network, terminal and readable storage medium
AU2020103613A4 (en) * 2020-11-23 2021-02-04 Agricultural Information and Rural Economic Research Institute of Sichuan Academy of Agricultural Sciences Cnn and transfer learning based disease intelligent identification method and system
CN114897090A (en) * 2022-05-26 2022-08-12 浙江工业大学 Long-time sequence classification method based on graph neural network
CN115226684A (en) * 2022-08-22 2022-10-25 河南云飞科技发展有限公司 Insect sucking tower capable of identifying insect situations
CN116758415A (en) * 2023-05-29 2023-09-15 李晖 Lightweight pest identification method based on two-dimensional discrete wavelet transformation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LIU Siqi et al., "Research on discriminating the activity level of stored-grain pests based on a video object tracking algorithm", Journal of the Chinese Cereals and Oils Association, vol. 36, no. 11, 18 March 2021 (2021-03-18), pages 179-186 *

Also Published As

Publication number Publication date
CN117290762B (en) 2024-04-02
CN118077660A (en) 2024-05-28

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant