CN112418052A - Mosquito recognition and expelling device and method based on deep learning - Google Patents
- Publication number
- CN112418052A (application number CN202011294922.XA)
- Authority
- CN
- China
- Prior art keywords
- mosquito
- microphone array
- mosquitoes
- controller
- deep learning
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06F2218/12 — Aspects of pattern recognition specially adapted for signal processing: classification; matching
- A01M1/02 — Stationary means for catching or killing insects with devices or substances, e.g. food, pheromones, attracting the insects
- A01M1/04 — Attracting insects by using illumination or colours
- A01M29/18 — Scaring or repelling devices using sound waves, using ultrasonic signals
- G06F18/214 — Pattern recognition: generating training patterns; bootstrap methods, e.g. bagging or boosting
- G06F18/2411 — Classification techniques based on the proximity to a decision surface, e.g. support vector machines
- G06N3/045 — Neural network architectures: combinations of networks
- G06N3/08 — Neural networks: learning methods
- A01M2200/012 — Kind of animal: flying insects
- G06F2218/08 — Pattern recognition for signal processing: feature extraction
Abstract
The mosquito recognition and repelling device comprises a controller and a plurality of microphone arrays arranged to form an annular structure, the controller being connected to the microphone arrays. The microphone arrays collect the wing-beat sounds of mosquitoes around the annular area, and the mosquito species in the area is determined by deep learning, improving the accuracy of mosquito identification.
Description
Technical Field
The invention relates to the field of mosquito identification and repelling, in particular to a mosquito identification and repelling device and method based on deep learning.
Background
People travelling in summer, especially by the waterside or outdoors, are particularly prone to mosquito bites. In most cases a mosquito bite has no serious consequences, but an allergy to the venom of certain insects, or a large number of bites, can endanger the health of the person bitten. To address this, the present invention provides a mosquito recognition and repelling device and method based on deep learning.
Disclosure of Invention
To overcome the defects of the prior art, the invention aims to provide a deep-learning-based mosquito recognition and repelling device and method that solve the problem of mosquito bites outdoors while travelling.
In order to achieve the purpose, the invention adopts the technical scheme that:
the utility model provides a mosquito discernment and drives away device based on deep learning, includes controller 1 and microphone array 3, microphone array 3 is a plurality of, forms ring structure, controller 1 links to each other with microphone array 3 that forms ring structure.
The controller 1 is connected with the microphone array 3 through a low-current high-voltage line, and a protection net is arranged on the outer side of the controller.
Three microphones with staggered angles are respectively embedded in the front and the back of each microphone array 3, white light LEDs 7 are arranged in the microphone arrays 3, and ultraviolet fluorescent lamps 6 are arranged on the outer sides of the microphone arrays 3.
The surface of the controller 1 is provided with a power supply module 2, a keyboard module 4, a display module 5 and a sounder 8, wherein the power supply module 2 provides power for the device; the controller 1 receives signal input of a microphone array 3 and a keyboard module 4; the controller 1 controls the signal outputs of the display module 5, the ultraviolet fluorescent lamp 6, the white light LED7 and the sound generator 8.
The microphone array 3 collects the wing-beat sounds of mosquitoes around the area, trains a mosquito recognition model on the collected sound by deep learning, and determines the mosquito species in the area from the training result.
The controller 1 judges from the mosquito information collected by the microphone array 3 whether the surrounding mosquitoes have been attracted; once they have, it drives the sounder 8 chip to emit ultrasonic signals resembling male mosquitoes or natural enemies of mosquitoes, and flashes the white-light LED 7 to startle the mosquitoes, repelling them quickly.
The display module 5 displays the mosquito species recognized by the microphone array 3 and the working state of the microphones, whether attracting or repelling mosquitoes.
The keyboard module 4 is used for setting system related parameters.
A mosquito identification and repelling method based on deep learning comprises the following steps;
the method comprises the following steps:
collecting sound by a microphone array 3;
step two:
separating the signals by adopting an instantaneous linear aliasing blind separation algorithm;
step three:
carrying out wavelet threshold denoising processing on the separated sound;
step four:
after the denoising of step three, performing signal analysis on the sound spectrum and extracting its characteristic parameters;
step five:
dividing the extracted characteristic parameters into two parts, wherein one part is a training set and the other part is a testing set;
training the training set by using the training data to establish a complete convolutional neural network;
testing the accuracy of the test set by using the test data, and readjusting and re-optimizing the parameters of the convolutional neural network;
step six:
performing mosquito species recognition on the mosquito sounds collected by the microphone array 3 with the optimized convolutional neural network; the recognition result is transmitted to the controller 1, which controls the sounder 8 to emit a biological frequency close to the mosquitoes' own and drives the ultraviolet fluorescent lamp 6 to light, attracting mosquitoes in the surrounding area. Once the controller 1 judges from the detected sound intensity that the mosquitoes have been attracted, it controls the sounder 8 to emit ultrasonic signals resembling natural enemies of mosquitoes and lights the white-light LED 7, so that the mosquitoes are quickly driven away.
The transient linear aliasing mathematical model of step two is:

x(t) = A s(t)  (aliasing model)
y(t) = W x(t)  (separation model)

where s(t) = [s1(t), s2(t), ..., sn(t)]^T is the source signal vector, x(t) = [x1(t), x2(t), ..., xm(t)]^T is the observed signal vector, y(t) = [y1(t), y2(t), ..., yn(t)]^T is the separated signal vector, and A is the m×n aliasing matrix;
The sound signals collected by the microphone array are blind-separated through the transient linear aliasing mathematical model: the source signal waveforms and parameters are estimated from x(t) so that the separated signal satisfies y(t) ≈ s(t).
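As an illustration of the aliasing and separation models above, here is a minimal numpy sketch. All signals and the mixing matrix A are invented for the demo; a real blind algorithm must estimate W from x(t) alone, whereas this toy uses the ideal W = A⁻¹ to show what a correct separation looks like:

```python
import numpy as np

# Two source signals s(t): a sine and a square wave (n = 2 sources)
t = np.linspace(0, 1, 400)
s = np.vstack([np.sin(2 * np.pi * 5 * t),
               np.sign(np.sin(2 * np.pi * 3 * t))])   # shape (n, T)

# Aliasing model x(t) = A s(t); A is the m x n mixing matrix (m = n = 2)
A = np.array([[1.0, 0.5],
              [0.3, 1.0]])
x = A @ s                        # observed (mixed) signals

# Separation model y(t) = W x(t).  A blind algorithm must estimate W from
# x alone; this toy uses the ideal unmixing matrix W = A^-1 to show that
# a correct W makes the separated signals satisfy y(t) ≈ s(t).
W = np.linalg.inv(A)
y = W @ x

print(np.allclose(y, s))         # True
```

In practice W would be estimated by an algorithm such as FastICA rather than by inverting a known A.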
The wavelet threshold denoising treatment in the third step specifically comprises the following steps:
the wavelet transform is linear transform, and discrete wavelet transform is performed on the signal x (t) containing noise to obtain wavelet coefficient omegax(j, k) as ωj,kIt is still composed of two parts: one part is original letterWavelet coefficient ω of number s (t)x(j, k) as uj,k(ii) a The other part is a wavelet coefficient omega corresponding to the noise n (t)n(j, k) as vj,k;
When ω isj,kBelow a certain threshold, ωj,kMainly caused by noise, and can be considered as omegaj,k≈uj,kDirectly taking omega by using a hard threshold functionj,k=uj,kFor ω larger than the threshold portionj,kProcessing to obtain wavelet coefficientWavelet reconstruction is carried out on the signal to obtain a denoised signal
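The hard-threshold rule described above can be sketched with a one-level Haar transform written directly in numpy. The Haar basis, the threshold value and the test signal are illustrative choices, not specified by the patent:

```python
import numpy as np

def haar_dwt(x):
    """One-level Haar DWT: (approximation, detail) coefficient arrays."""
    return (x[0::2] + x[1::2]) / np.sqrt(2), (x[0::2] - x[1::2]) / np.sqrt(2)

def haar_idwt(a, d):
    """Inverse of the one-level Haar DWT above."""
    x = np.empty(2 * a.size)
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def hard_threshold_denoise(x, thr):
    """Hard threshold: detail coefficients below thr are treated as noise
    and zeroed; those above thr are kept unchanged, then the signal is
    wavelet-reconstructed."""
    a, d = haar_dwt(x)
    d = np.where(np.abs(d) < thr, 0.0, d)
    return haar_idwt(a, d)

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 512)
clean = np.sin(2 * np.pi * 4 * t)                 # stand-in for a clean tone
noisy = clean + 0.2 * rng.standard_normal(t.size)

denoised = hard_threshold_denoise(noisy, thr=0.4)
# Denoising should bring the signal closer to the clean reference
print(np.mean((denoised - clean) ** 2) < np.mean((noisy - clean) ** 2))
```

A production implementation would use a multi-level transform from a wavelet library and a data-driven threshold rather than the fixed value used here.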
The fourth step is specifically as follows:
wavelet packet analysis is adopted to extract signal characteristic information, proper wavelet basis functions are selected to carry out wavelet packet transformation on the signals, and energy on each frequency band is extracted to form a characteristic vector.
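The band-energy feature vector described above can be sketched as follows, using the Haar basis (the patent does not specify the wavelet basis or the decomposition depth; both are assumptions of this demo):

```python
import numpy as np

def haar_split(x):
    """Haar filter pair: low-pass (approximation) and high-pass (detail)."""
    return (x[0::2] + x[1::2]) / np.sqrt(2), (x[0::2] - x[1::2]) / np.sqrt(2)

def wavelet_packet_energies(x, levels=2):
    """Full wavelet-packet decomposition down to `levels`, then the energy
    of each terminal sub-band, normalized into a feature vector."""
    bands = [np.asarray(x, dtype=float)]
    for _ in range(levels):                 # split every band, low AND high
        bands = [half for b in bands for half in haar_split(b)]
    energies = np.array([np.sum(b ** 2) for b in bands])
    return energies / energies.sum()        # energy share per frequency band

t = np.linspace(0, 1, 512, endpoint=False)
sig = np.sin(2 * np.pi * 10 * t)            # stand-in for a wing-beat tone
feat = wavelet_packet_energies(sig, levels=2)
print(feat.shape)                           # (4,): 2**levels sub-bands
print(abs(feat.sum() - 1.0) < 1e-9)         # True: normalized
```

Unlike a plain wavelet transform, the wavelet packet splits the detail bands as well, which is why `levels` levels give 2**levels sub-bands.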
The fifth step is specifically as follows:
the training process can be divided into a forward propagation stage and a backward propagation stage;
in the forward propagation stage, small random numbers with zero mean drawn from a Gaussian distribution (chosen with the nonlinear mapping in mind) serve as initial values of the weights and biases to be trained; after parameter initialization, the training samples are fed into the network, and the model's predicted output is obtained by layer-by-layer computation;
back propagation stage: the error between the predicted class and the actual class is computed by the cross-entropy loss

E = -(1/N) Σ_{i=1}^{N} ln( exp(h_{y_i}) / Σ_{j=1}^{K} exp(h_j) )

where E is the cross-entropy loss function; K and N are the number of classes and the number of samples; h_j is the model's predicted output for class j; h_{y_i} is the predicted output corresponding to the actual class of the i-th sample;
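A numpy version of this cross-entropy, matching the symbols above (h_j is the raw model output for class j, h_{y_i} the output for the true class of sample i); the toy outputs and labels are invented for the demo:

```python
import numpy as np

def cross_entropy(h, y):
    """E = -(1/N) * sum_i ln( exp(h[i, y_i]) / sum_j exp(h[i, j]) ).

    h : (N, K) raw model outputs for K classes; y : (N,) true class indices.
    """
    h = h - h.max(axis=1, keepdims=True)          # numerical stability
    log_softmax = h - np.log(np.exp(h).sum(axis=1, keepdims=True))
    return -np.mean(log_softmax[np.arange(len(y)), y])

# Confident, correct predictions give a small loss ...
h = np.array([[5.0, 0.0, 0.0],
              [0.0, 5.0, 0.0]])
y = np.array([0, 1])
print(cross_entropy(h, y) < 0.1)                   # True (loss ≈ 0.013)

# ... while the same outputs scored against wrong labels give a large one
print(cross_entropy(h, np.array([1, 0])) > 1.0)    # True (loss ≈ 5.0)
```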
The accuracy on the test set is evaluated and the parameters of the convolutional neural network are readjusted and re-optimized; the extracted features are handed to an SVM classifier for data partitioning, using

δ_i = -2C · y_i · max(0, 1 - y_i ω^T h_i) · ω

as the sensitivity function of the CNN-SVM output layer; where C is a penalty factor; ω is the normal vector of the support vector machine and ω^T its transpose; h_i is the activation-function output of the second-to-last layer of the model.
The invention has the beneficial effects that:
the microphone array is used for collecting the sound of the mosquito wing vibration sound around the annular area, and the mosquito category of the area is judged in a deep learning mode, so that the accuracy of mosquito information judgment is improved. This device lures the mosquito and adopts sound wave and light dual mode simultaneously with the mosquito repellent, and this device is for splicing into annular region not of uniform size according to the demand of oneself, and the controller manages regional equipment through CAN bus or carrier wave, helps solving the condition that open-air mosquito bites when travelling in summer now, easily carries, and is simple nimble.
Drawings
Fig. 1 is a schematic block diagram of the present invention.
Fig. 2 is a schematic diagram of an embodiment of the present invention.
Fig. 3 is a basic flow chart of mosquito recognition model training according to the present invention.
Fig. 4 is a schematic view of the overall structure of the present invention.
Fig. 5 is a schematic diagram of the controller structure of the present invention.
Fig. 6 is a schematic structural diagram of the microphone array 3 of the present invention.
Detailed Description
The present invention will be described in further detail with reference to examples.
Referring to fig. 1, a deep-learning-based mosquito recognition and repelling device and method include a controller 1, a power module 2, a microphone array 3, a keyboard module 4, a display module 5, an ultraviolet fluorescent lamp 6, a white-light LED 7 and a sounder 8. The power module 2 supplies power to the system; the controller 1 receives signal inputs from the microphone array 3 and the keyboard module 4, and controls the signal outputs of the display module 5, the ultraviolet fluorescent lamp 6, the white-light LED 7 and the sounder 8. The microphone array 3 collects the wing-beat sounds of mosquitoes around the area, trains a mosquito recognition model on the collected sound by deep learning, and determines the mosquito species in the area from the training result. The microphone array 3 comprises at least one microphone for collecting the sound emitted by a sound source; the microphones are spaced a certain distance apart and adjacent microphones face in different directions, so that the sound-collection area of the microphone array 3 fully covers the annular region. The keyboard module 4 sets the relevant system parameters. The display module 5 displays the installation position and working state of the microphones in the microphone array 3, whether attracting or repelling mosquitoes.
the controller 1 drives the sounder 8 chip to release biological frequency close to the mosquitoes according to the recognized mosquito categories, and simultaneously drives the ultraviolet fluorescent lamp 6 to light up to attract the mosquitoes in the surrounding area. In particular, many insects prefer light at night, as do mosquitoes, and are preferred to ultraviolet light. For mosquitoes, blue light with a wavelength below 500nm has a very strong attraction to mosquitoes. Therefore, when the type of the surrounding mosquitoes is judged by utilizing the phototaxis of the mosquitoes, the ultraviolet fluorescent lamp 6 is driven to be lightened to emit ultraviolet light with the wavelength between 300nm and 400nm, and the acoustic generator 8 is driven to release the biological frequency close to the self frequency of the mosquitoes to attract the surrounding mosquitoes.
The controller 1 judges from the mosquito information collected by the microphone array 3 whether the surrounding mosquitoes have been attracted; once they have, it drives the sounder 8 to emit ultrasonic signals resembling natural enemies of mosquitoes and lights the white-light LED 7, so that the mosquitoes are quickly driven away. Specifically, experiments have shown that only female mosquitoes in the reproductive stage bite humans, taking blood to nourish their embryos, and that pregnant female mosquitoes actively avoid male mosquitoes and refuse contact with them during reproduction. Meanwhile, the dragonfly is a natural enemy of the mosquito, and mosquitoes actively avoid nearby dragonflies. Accordingly, the frequency of the ultrasonic signal can be made equivalent to the sound of a male mosquito, or of a dragonfly moving or hunting; when a mosquito receives the ultrasonic signal from the sounder 8, it mistakes it for a male mosquito or a dragonfly and quickly retreats, so the mosquito is thoroughly repelled and bites are prevented. At the same time, mosquitoes visibly avoid light with wavelengths above 500 nm, and especially above 560 nm; individuals that cannot escape the light show disordered flight, reduced activity and similar states. This ensures that the mosquitoes quickly fly out of the annular area.
As shown in figs. 2, 4, 5 and 6, the device encloses the area to be protected into an annular region. At least one microphone with a staggered orientation is arranged at the edge of the device, and a single controller 1 on the annular region controls the microphone array 3. The device can be spliced into annular regions of different sizes according to actual needs, with the controllers managed uniformly over a CAN bus or carrier wave; likewise, the controller 1 can manage any microphone in the microphone array 3 over the CAN bus or carrier, making the device simpler and more reliable. The microphones in the microphone array 3 are installed several meters apart, and their orientation is taken into account during installation: adjacent microphones face the annular region from different directions, so that the sound-collection range of the microphone array 3 fully covers the annular region.
Referring to fig. 3, the sound collected by the microphone array 3 is first blind-separated, using the linear blind-separation algorithm of the cerebellar-model neural network because it separates the signals faster and with smaller error. After wavelet-threshold denoising, the sound spectrum is analyzed and its characteristic parameters are extracted; after denoising and feature extraction, an SVM with strong generalization capability and a CNN are combined into a hybrid recognition model, improving the accuracy and reliability of recognition.
A support vector machine (SVM) is a kernel-based machine-learning classification algorithm. The kernel parameters are tuned by cross-validation on the training samples to control over-fitting, with some support vectors generated on the hyperplanes and some between them. The output layer of the CNN replaces the usual softmax classifier with a one-against-all multi-SVM classifier trained by back-propagation of gradients, using the least-squares L2-SVM loss:

J = (1/2) ω^T ω + C Σ_i max(0, 1 - y_i ω^T φ(x_i))^2    (1)

where C is a penalty factor; ω is the normal vector of the support vector machine and ω^T its transpose; φ is the mapping to the high-dimensional space implied by the kernel function K; ‖·‖² denotes the squared 2-norm of a vector. Based on the L2-SVM loss function of formula (1), the CNN-SVM model is trained layer by layer from the SVM output layer by back-propagating gradients. Let h_i be the activation-function output of the second-to-last layer of the model; substituting h_i for x_i in formula (1) and differentiating gives:

∂J/∂h_i = -2C · y_i · max(0, 1 - y_i ω^T φ(h_i)) · ω    (2)

With a linear support-vector-machine classifier (φ the identity), formula (2) can be written as:

δ_i = -2C · y_i · max(0, 1 - y_i ω^T h_i) · ω    (3)

Formula (3) is adopted as the sensitivity function of the CNN-SVM output layer, and from there the standard back-propagation algorithm of a CNN with a softmax classifier proceeds.
First, the sampled sound signal is segmented into groups and processed to form an experimental data set {(x_i^1, y_i^1), (x_i^2, y_i^2), (x_i^3, y_i^3)}, i = 1, ..., N: each group of inputs comprises 3 consecutive sound signals, each of duration ΔT. The 3 consecutive signals are converted into spectrogram representations and PCA-whitened, giving the data x_i^1, x_i^2, x_i^3 used as inputs to the CNN-SVM model, where y_i are the corresponding class labels. The experimental data are divided into a training set and a test set. The training set is trained by the back-propagation algorithm to obtain the convolutional-layer weight parameters, establishing a complete convolutional neural network; the accuracy on the test set is then evaluated, the parameters of the convolutional neural network are readjusted and re-optimized, and the optimized network identifies the mosquito species in the collected mosquito sounds.
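The PCA whitening step mentioned above can be sketched in numpy; the mixing matrix used to create correlated toy data is invented for the demo, and `eps` guards against division by near-zero eigenvalues:

```python
import numpy as np

def pca_whiten(X, eps=1e-8):
    """PCA-whiten the rows of X (samples x features): zero mean and
    (approximately) identity covariance after the transform."""
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / Xc.shape[0]
    eigval, eigvec = np.linalg.eigh(cov)          # eigendecomposition of cov
    return Xc @ eigvec / np.sqrt(eigval + eps)    # rotate, then rescale

rng = np.random.default_rng(2)
# Correlated toy data standing in for flattened spectrogram frames
X = rng.standard_normal((500, 3)) @ np.array([[2.0, 0.0, 0.0],
                                              [1.0, 1.0, 0.0],
                                              [0.0, 0.5, 0.2]])
Xw = pca_whiten(X)
cov_w = Xw.T @ Xw / Xw.shape[0]
print(np.allclose(cov_w, np.eye(3), atol=1e-6))   # True: whitened cov ≈ I
```

Whitening decorrelates the spectrogram features and equalizes their variances, which typically speeds up the subsequent CNN training.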
Claims (10)
1. A deep-learning-based mosquito recognition and repelling device, characterized in that it comprises a controller (1) and a plurality of microphone arrays (3) forming an annular structure, the controller (1) being connected to the microphone arrays (3) forming the annular structure.
2. The mosquito recognition and repelling device based on deep learning of claim 1, wherein the controller (1) and the microphone array (3) are connected by low current high voltage lines, and a protective net is arranged outside.
3. The mosquito recognition and repelling device based on deep learning of claim 1, wherein three microphones with staggered angles are embedded in the front and back of each microphone array (3), white light LEDs (7) are arranged inside the microphone arrays (3), and ultraviolet fluorescent lamps (6) are arranged outside the microphone arrays.
4. The mosquito recognition and repelling device based on deep learning of claim 3, wherein the surface of the controller (1) is provided with a power module (2), a keyboard module (4), a display module (5) and a sounder (8), and the power module (2) provides power for the device; the controller (1) receives signal inputs of a microphone array (3) and a keyboard module (4); the controller (1) controls the signal output of the display module (5), the ultraviolet fluorescent lamp (6), the white light LED (7) and the sounder (8).
5. The mosquito recognition and repelling device based on deep learning of claim 1, wherein the microphone array (3) collects sounds of the flapping sounds of the mosquito wings around the area, performs model training of mosquito recognition on the collected sounds in a deep learning manner, and judges the mosquito type of the area according to the training result;
the microphone array (3) comprises at least one microphone for collecting the sound emitted by a sound source; the microphones are spaced a certain distance apart and adjacent microphones face in different directions, so that the sound-collection range of the microphone array (3) can fully cover an annular area, and the controller (1) manages and controls the microphone array through a CAN bus or carrier wave;
the controller (1) judges whether the surrounding mosquitoes are attracted according to the surrounding mosquito information collected by the microphone array (3), and after the surrounding mosquitoes are attracted, the chip of the sounder (8) is driven to release ultrasonic signals of male mosquitoes or similar natural enemies (dragonflies) of the mosquitoes, and the white light LED (7) is controlled to be instantly lightened to frighten the mosquitoes, so that the mosquitoes are quickly repelled;
the display module (5) is used for displaying the mosquito types identified in the microphone array (3) and the working state of the microphones and attracting or repelling mosquitoes.
6. A deep-learning-based mosquito recognition and repelling method using the device of claim 1, characterized by comprising the following steps:
the method comprises the following steps:
collecting sound by a microphone array (3);
step two:
separating the signals by adopting an instantaneous linear aliasing blind separation algorithm;
step three:
carrying out wavelet threshold denoising processing on the separated sound;
step four:
after the denoising of step three, performing signal analysis on the sound spectrum and extracting its characteristic parameters;
step five:
dividing the extracted characteristic parameters into two parts, wherein one part is a training set and the other part is a testing set;
training the training set by using the training data to establish a complete convolutional neural network;
testing the accuracy of the test set by using the test data, and readjusting and re-optimizing the parameters of the convolutional neural network;
step six:
performing mosquito species recognition on the mosquito sounds collected by the microphone array (3) with the convolutional neural network obtained after optimization; the recognition result is transmitted to the controller (1), which controls the sounder (8) to emit a biological frequency close to the mosquitoes' own and drives the ultraviolet fluorescent lamp (6) to light, attracting the mosquitoes in the surrounding area; once the controller (1) judges from the detected sound intensity that the mosquitoes have been attracted, it controls the sounder (8) to emit ultrasonic signals resembling natural enemies of mosquitoes and controls the white-light LED (7) to light, so that the mosquitoes are quickly driven away.
7. The deep-learning-based mosquito identification and repelling method according to claim 6, wherein the instantaneous linear aliasing mathematical model of the second step is as follows:
x(t) = A s(t) is the aliasing model and y(t) = W x(t) is the separation model, where s(t) = [s1(t), s2(t), ..., sn(t)]^T is the source signal vector, x(t) = [x1(t), x2(t), ..., xm(t)]^T is the observed signal vector, y(t) = [y1(t), y2(t), ..., yn(t)]^T is the separated signal vector, and A is the m×n aliasing matrix;
the sound signals collected by the microphone array are blindly separated through the instantaneous linear aliasing mathematical model, estimating the source signal waveforms and parameters from x(t) so that the separated signals satisfy y(t) ≈ s(t).
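A minimal numerical illustration of the two models x(t) = A s(t) and y(t) = W x(t). For clarity the mixing matrix A is taken as known, so W is simply its pseudo-inverse; a true blind-separation algorithm, as in step two, must estimate W from the observations x(t) alone.

```python
import numpy as np

# Two synthetic source signals (a sine and a square wave)
t = np.linspace(0.0, 1.0, 500)
s = np.vstack([np.sin(2 * np.pi * 5 * t),
               np.sign(np.sin(2 * np.pi * 3 * t))])   # s(t): n=2 sources

A = np.array([[1.0, 0.5],
              [0.3, 1.0]])        # m x n aliasing (mixing) matrix

x = A @ s                          # aliasing model: observed microphone signals
W = np.linalg.pinv(A)              # separation matrix (known-A shortcut)
y = W @ x                          # separation model: y(t) recovers s(t)
```

With A known the recovery is exact; blind algorithms instead exploit statistical independence of the sources, and recover them only up to permutation and scaling.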
8. The method for mosquito recognition and repelling based on deep learning of claim 6, wherein the wavelet threshold denoising process in the third step is specifically:
the wavelet transform is a linear transform; performing discrete wavelet transform on the noisy signal x(t) yields wavelet coefficients ωx(j,k), denoted ωj,k, which still consist of two parts: the wavelet coefficients ωs(j,k) of the original signal s(t), denoted uj,k, and the wavelet coefficients ωn(j,k) of the noise n(t), denoted vj,k;
when ωj,k is below a certain threshold it is mainly caused by noise (ωj,k ≈ vj,k) and is set to zero by the hard threshold function, while ωj,k above the threshold is mainly caused by the signal (ωj,k ≈ uj,k) and is retained as the estimated wavelet coefficient; wavelet reconstruction of the estimated coefficients then yields the denoised signal.
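A sketch of hard-threshold wavelet denoising under these definitions. The Haar wavelet, three decomposition levels, and the universal threshold σ√(2 ln N) are illustrative choices; the claim does not fix the wavelet basis or the threshold rule.

```python
import numpy as np

def haar_dwt(x):
    # one level of the orthonormal Haar transform
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # detail coefficients
    return a, d

def haar_idwt(a, d):
    x = np.empty(2 * a.size)
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x

def denoise(x, levels=3, sigma=0.2):
    thr = sigma * np.sqrt(2.0 * np.log(x.size))       # universal threshold
    approx, details = np.asarray(x, dtype=float), []
    for _ in range(levels):
        approx, d = haar_dwt(approx)
        # hard threshold: keep coefficients above thr, zero the rest
        details.append(np.where(np.abs(d) > thr, d, 0.0))
    for d in reversed(details):                        # wavelet reconstruction
        approx = haar_idwt(approx, d)
    return approx

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 1024)
clean = np.sin(2 * np.pi * 4 * t)
noisy = clean + rng.normal(0.0, 0.2, t.size)
denoised = denoise(noisy)
```

For a smooth signal most detail coefficients are noise-dominated, so hard thresholding removes the bulk of the noise energy while leaving the signal largely intact.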
9. The method for mosquito recognition and repelling based on deep learning of claim 6, wherein the fourth step is specifically:
wavelet packet analysis is adopted to extract signal characteristic information, proper wavelet basis functions are selected to carry out wavelet packet transformation on the signals, and energy on each frequency band is extracted to form a characteristic vector.
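A sketch of this feature-extraction step using a 2-level Haar wavelet-packet decomposition, which splits the signal into four frequency bands whose energies form the feature vector. The basis and depth are illustrative; the claim leaves the wavelet basis function to be chosen.

```python
import numpy as np

def haar_split(x):
    # orthonormal Haar split into low- and high-frequency halves
    return ((x[0::2] + x[1::2]) / np.sqrt(2.0),
            (x[0::2] - x[1::2]) / np.sqrt(2.0))

def wavelet_packet_energies(x, levels=2):
    # full wavelet-packet tree: split every node at every level
    nodes = [np.asarray(x, dtype=float)]
    for _ in range(levels):
        nodes = [half for node in nodes for half in haar_split(node)]
    return np.array([np.sum(n ** 2) for n in nodes])  # one energy per band

t = np.linspace(0.0, 1.0, 512, endpoint=False)
sig = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)
features = wavelet_packet_energies(sig)               # 4-element feature vector
```

Because the Haar transform is orthonormal, the band energies sum to the total signal energy, so the feature vector is an energy partition across frequency bands.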
10. The method for mosquito recognition and repelling based on deep learning of claim 6, wherein the step five is specifically:
the training process can be divided into a forward propagation stage and a backward propagation stage;
in the forward propagation stage, small random numbers drawn from a zero-mean Gaussian distribution (chosen with the nonlinear mapping in mind) are used as initial values of the weights and biases to be trained; after the parameters are initialized, training samples are input into the network, and the model's predicted output is obtained after layer-by-layer computation;
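A minimal sketch of this forward-propagation stage: zero-mean Gaussian initialization of the weights and biases, then layer-by-layer computation to a predicted class distribution. The layer sizes, ReLU hidden activation, and softmax output are assumptions, as the patent does not fix the architecture.

```python
import numpy as np

rng = np.random.default_rng(42)

def init_layer(n_in, n_out, std=0.01):
    # small zero-mean Gaussian initial weights and biases
    return rng.normal(0.0, std, (n_in, n_out)), rng.normal(0.0, std, n_out)

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))    # numerically stable
    return e / e.sum(axis=-1, keepdims=True)

W1, b1 = init_layer(16, 8)      # hidden layer (sizes illustrative)
W2, b2 = init_layer(8, 4)       # output layer: 4 hypothetical mosquito classes

x = rng.normal(size=(1, 16))                # one extracted feature vector
h = np.maximum(0.0, x @ W1 + b1)            # layer-by-layer: ReLU hidden layer
probs = softmax(h @ W2 + b2)                # model's predicted output
```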
the back propagation stage: the error between the predicted category and the actual category is calculated according to the cross-entropy loss E = -(1/N) Σ_{i=1}^{N} ln h_{y_i};
wherein E is the cross-entropy loss function; K and N are the numbers of categories and samples; h_j is the predicted output of the model for class j; h_{y_i} is the predicted output corresponding to the actual category y_i of the i-th sample;
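A numerical sketch of this error computation, taking E = -(1/N) Σ ln h_{y_i} as the standard cross-entropy form consistent with the symbols defined here (h holds the per-class predicted probabilities and y the actual class index of each sample); the values below are synthetic.

```python
import numpy as np

def cross_entropy(h, y):
    # E = -(1/N) * sum_i ln h_{y_i}: mean negative log-probability
    # assigned to each sample's actual class
    n = h.shape[0]
    return -np.mean(np.log(h[np.arange(n), y]))

h_good = np.array([[0.7, 0.2, 0.1],    # sample 0: confident and correct
                   [0.1, 0.8, 0.1]])   # sample 1: confident and correct
h_bad = np.array([[0.1, 0.7, 0.2],     # sample 0: mass on the wrong class
                  [0.6, 0.2, 0.2]])    # sample 1: mass on the wrong class
y = np.array([0, 1])                   # actual categories

loss_good = cross_entropy(h_good, y)   # small: predictions match labels
loss_bad = cross_entropy(h_bad, y)     # large: predictions miss labels
```

During back propagation this loss is differentiated with respect to the network parameters to drive the weight updates.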
testing the accuracy on the test set, readjusting and optimizing the parameters of the convolutional neural network, and classifying the extracted features with an SVM classifier, using the soft-margin objective (1/2)ω^Tω + C Σ_i max(0, 1 − y_i(ω^T h_i + b)) as the sensitivity function of the CNN-SVM output layer;
wherein: C is the penalty factor; ω is the normal vector of the support vector machine and ω^T is the transpose of ω; h_i is the activation function output of the 2nd layer of the model for the i-th sample.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011294922.XA CN112418052A (en) | 2020-11-18 | 2020-11-18 | Mosquito recognition and expelling device and method based on deep learning |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112418052A true CN112418052A (en) | 2021-02-26 |
Family
ID=74774276
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011294922.XA Pending CN112418052A (en) | 2020-11-18 | 2020-11-18 | Mosquito recognition and expelling device and method based on deep learning |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112418052A (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN205828817U (en) * | 2016-05-26 | 2016-12-21 | 宋光荣 | A kind of anophelifuge fly socket |
CN106852309A (en) * | 2016-12-31 | 2017-06-16 | 肇庆高新区黑阳科技有限公司 | A kind of environmentally friendly mosquito killing device based on biological nature |
CN208001974U (en) * | 2018-01-25 | 2018-10-26 | 永春县书杰家庭农场 | A kind of vehicle environment protection mosquito repellant |
CN208300837U (en) * | 2018-05-08 | 2019-01-01 | 福建光能能源科技有限公司 | Portable type solar energy mosquito repellant |
CN109751547A (en) * | 2019-01-15 | 2019-05-14 | 西北工业大学 | Street lamp with sound positioning and identification function |
CN109793506A (en) * | 2019-01-18 | 2019-05-24 | 合肥工业大学 | A kind of contactless radial artery Wave shape extracting method |
CN110347187A (en) * | 2019-08-09 | 2019-10-18 | 北京机械设备研究所 | A kind of target detection tracing System and method for based on sound and image information |
CN111131939A (en) * | 2019-12-23 | 2020-05-08 | 无锡中感微电子股份有限公司 | Audio playing device |
CN111640437A (en) * | 2020-05-25 | 2020-09-08 | 中国科学院空间应用工程与技术中心 | Voiceprint recognition method and system based on deep learning |
Non-Patent Citations (1)
Title |
---|
Wang Yonglan, Cheng Lirong (eds.): "LED Lighting Design and Application", 30 June 2017, Chengdu: University of Electronic Science and Technology of China Press, pages: 99 - 100 *
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113143570A (en) * | 2021-04-27 | 2021-07-23 | 福州大学 | Multi-sensor fusion feedback adjustment snore stopping pillow |
CN113143570B (en) * | 2021-04-27 | 2023-08-11 | 福州大学 | Snore relieving pillow with multiple sensors integrated with feedback adjustment |
CN113261758A (en) * | 2021-06-01 | 2021-08-17 | 江西恒必达实业有限公司 | Intelligent outdoor alpenstock system with talkback function |
CN113940327A (en) * | 2021-10-14 | 2022-01-18 | 怀化市正泰农机装备有限公司 | Self-cleaning household energy-saving insect trap |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112418052A (en) | Mosquito recognition and expelling device and method based on deep learning | |
Lima et al. | Automatic detection and monitoring of insect pests—A review | |
US11617353B2 (en) | Animal sensing system | |
CN102340988B (en) | Photonic fence | |
Rydhmer et al. | Automating insect monitoring using unsupervised near-infrared sensors | |
CN106793768B (en) | Photon fence | |
Dangles et al. | Spider's attack versus cricket's escape: velocity modes determine success | |
Kodama et al. | Development of classification system of rice disease using artificial intelligence | |
Clark et al. | Evolution and ecology of silent flight in owls and other flying vertebrates | |
CN204502024U | Device for simulating severe psychiatric stress in large-scale experiments | |
CN204500970U | Experimental device for studying anxiety/fright behavioral responses in small animals | |
WO2018065308A1 (en) | Identification of beneficial insects and/or pollutants in a field for crop plants | |
Parmezan et al. | Changes in the wing-beat frequency of bees and wasps depending on environmental conditions: a study with optical sensors | |
König | IndusBee 4.0–integrated intelligent sensory systems for advanced bee hive instrumentation and hive keepers' assistance systems | |
JP4469961B2 (en) | Pest collection and detection device using attraction and evasion by physical and mechanical action | |
Carlile et al. | Detection of a looming stimulus by the Jacky dragon: selective sensitivity to characteristics of an aerial predator | |
CN114022656A (en) | Intelligent air-suction type LED characteristic spectrum pest trapping and killing system and method | |
Hartbauer | Artificial neuronal networks are revolutionizing entomological research | |
Van Goethem et al. | An IoT solution for measuring bee pollination efficacy | |
CN108304818A (en) | A kind of mosquito matchmaker automatic distinguishing method for image | |
Fitt et al. | Methods for studying behavior | |
CN112200368A (en) | Mosquito quantity prediction method and system | |
Ali et al. | Multi-Features and Multi-Deep Learning Networks to identify, prevent and control pests in tremendous farm fields combining IoT and pests sound analysis | |
Kanwal | Sonic and ultrasonic communication in bats: acoustics, perception, and production | |
PushpaLakshmi | Development of an IoT-Based Bird Control System Using a Hybrid Deep Learning CNN-SVM Classifier Model |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20210226 |