CN113545332A - Intelligent mouse trap - Google Patents

Intelligent mouse trap

Info

Publication number
CN113545332A
CN113545332A (application CN202110773066.4A)
Authority
CN
China
Prior art keywords
module
unit
information
detection
audio
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110773066.4A
Other languages
Chinese (zh)
Other versions
CN113545332B (en)
Inventor
黄吉双
何汉宗
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujian Bangheng Environmental Science Co ltd
Original Assignee
Zhongwei Zhilian Fujian Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhongwei Zhilian Fujian Information Technology Co ltd filed Critical Zhongwei Zhilian Fujian Information Technology Co ltd
Priority to CN202110773066.4A
Publication of CN113545332A
Application granted
Publication of CN113545332B
Status: Active
Anticipated expiration

Classifications

    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M23/00 Traps for animals
    • A01M23/38 Electric traps

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Pest Control & Pesticides (AREA)
  • Engineering & Computer Science (AREA)
  • Insects & Arthropods (AREA)
  • Wood Science & Technology (AREA)
  • Zoology (AREA)
  • Environmental Sciences (AREA)
  • Catching Or Destruction (AREA)

Abstract

The invention discloses an intelligent mousetrap comprising a detection module, a control module, a human-computer interaction module, a trapping module, a data transmission module and an intelligent terminal module. The detection module acquires video image information, voice information, environmental information, mouse information and mouse bait information for a detection area; the control module processes the detected information and then controls the human-computer interaction module and the trapping module; the human-computer interaction module issues alarm signals and receives the user's operating instructions; the trapping module traps mice in the detection area; and the intelligent terminal module lets the user remotely obtain mouse-trapping information and send remote control instructions to the control module. Because the control module identifies both the captured mouse images and the captured mouse sounds, the accuracy of mouse identification is improved and the probability of killing other animals by mistake is reduced, while remote access to the detection information through the intelligent terminal module saves labour.

Description

Intelligent mouse trap
Technical Field
The invention relates to the field of mousetraps, in particular to an intelligent mousetrap.
Background
Although some existing rodent-control devices can kill rats in a short time, they cannot monitor the number of rats or reveal the regional population distribution, density index and seasonal fluctuation of the rat population, so appropriate rodent-control measures cannot be planned.
A new technical solution is therefore urgently needed to address these problems.
Disclosure of Invention
In view of this, the invention provides an intelligent mouse trap to solve the above technical problems.
In order to achieve the purpose, the invention provides the following technical scheme:
an intelligent mouse trap comprises a detection module, a control module, a human-computer interaction module, a trapping module, a data transmission module and an intelligent terminal module.
In the above scheme, the detection module is configured to obtain video image information, voice information, environmental information, mouse information, and rodent bait information of the detection area.
In the above scheme, the control module is connected to the detection module, the human-computer interaction module and the trapping module are both connected to the control module, and the control module is configured to process information detected by the detection module and then control the human-computer interaction module and the trapping module to operate.
In the above scheme, the human-computer interaction module is configured to send an alarm signal under the control of the control module, send operation instruction information of a user to the control module, and receive receipt information sent by the control module.
In the above scheme, the trapping module is used for trapping mice in the detection area under the control of the control module.
In the above scheme, the intelligent terminal module communicates with the control module through the data transmission module, and the intelligent terminal module is used for a user to remotely obtain mouse trapping information and send a remote control instruction to the control module.
In the above scheme, the detection module includes a sensor unit, the sensor unit includes a biological sensing module and an environmental information acquisition module, and the biological sensing module acquires biological information and distance information of a detection area through a plurality of infrared sensing receivers; the environment information acquisition module comprises an air quality sensor, a temperature and humidity sensor and a brightness sensor, the air quality sensor is used for acquiring air quality information of a detection area, the temperature and humidity sensor is used for acquiring environment temperature and humidity information of the detection area, and the brightness sensor is used for acquiring environment brightness information of the detection area.
In the above scheme, the detection module further comprises a bait detection unit. The bait detection unit comprises a material level detector, a first weight sensor and a miniature camera: the material level detector acquires information on the amount of bait remaining in the bait box, the first weight sensor acquires the weight of the bait in the bait box, and the miniature camera acquires image information of the bait in the bait box.
In the above scheme, the detection module further comprises a video acquisition unit and a mouse weight detection unit. The video acquisition unit comprises a rotatable infrared camera, a stepping motor and a relay; the stepping motor and the relay are both connected to the rotatable infrared camera, the rotatable infrared camera captures video information of the detection area, the stepping motor drives the camera to rotate, and the relay switches the camera on and off. The mouse weight detection unit acquires the weight of a mouse through a second weight sensor arranged in front of the bait box.
In the above scheme, the detection module further comprises a voice acquisition unit for collecting the sounds made by mice. The voice acquisition unit comprises an audio collection module and a preprocessing module: the audio collection module acquires audio information of the detection area through a plurality of omnidirectional microphones, and the preprocessing module, connected to the audio collection module, comprises an audio filter that filters the audio collected by the microphones and an audio signal amplifier, connected to the audio filter, that amplifies the filtered audio signal.
In the above scheme, the control module comprises a cache unit and an analysis and comparison unit. The cache unit stores the information sent by the detection module, the preset value of each parameter, an image database and an audio database; the analysis and comparison unit is connected to the cache unit and analyzes and compares the information stored in the cache unit.
In the above scheme, the analysis and comparison unit comprises a video preprocessing unit and an image recognition unit. The video preprocessing unit comprises a video conversion module, a denoising module, a color recognition module and a graying module: the video conversion module converts the video of the detection area acquired by the detection module into image frames; the denoising module, connected to the video conversion module, suppresses noise in these frames with a median filtering algorithm; the color recognition module, connected to the denoising module, detects the color model of the denoised image; and the graying module, connected to the color recognition module, converts the image to grayscale when the color recognition module determines that it is an RGB (color) image. The image recognition unit is connected to the video preprocessing unit and comprises a feature extraction module and a feature matching module: the feature extraction module extracts features from the preprocessed images by calling a feature extraction algorithm of the OpenCV (Open Source Computer Vision) library, and the feature matching module, connected to the feature extraction module, matches the extracted features against the feature information in the image database with a deep learning algorithm to obtain a matching rate, from which the image recognition result is derived.
In the above scheme, the analysis and comparison unit further comprises an audio recognition unit and a data comparison unit. The audio recognition unit comprises an audio conversion module, a feature extraction module and a recognition module. The audio conversion module comprises a framing unit, a windowing unit, a Fourier transform unit, a spectrogram forming unit and a spectrogram conversion unit: the framing unit splits the voice information stored in the cache unit into frames; the windowing unit, connected to the framing unit, applies a window function to the framed signal; the Fourier transform unit, connected to the windowing unit, Fourier-transforms the windowed signal; the spectrogram forming unit, connected to the Fourier transform unit, takes the squared modulus of the transformed signal to obtain a spectrogram; and the spectrogram conversion unit, connected to the spectrogram forming unit, converts the spectrogram into a Mel spectrogram through a Mel filter bank. The feature extraction module, connected to the audio conversion module, extracts features from the Mel spectrogram with a trained residual network model, and the recognition module, connected to the feature extraction module, classifies the extracted features with a random forest classifier. The data comparison unit compares each parameter stored in the cache unit with its corresponding preset value and sends a control signal to the corresponding module according to the comparison result.
In the above scheme, the human-computer interaction module comprises an alarm unit, a display unit and an indication unit. The alarm unit comprises a two-color LED lamp that indicates whether the bait in the bait box is sufficient and a voice announcer that announces bait-shortage and mouse-killing information; the display unit shows mouse-trapping information on an LCD touch screen and forwards the user's touch commands to the control module; and the indication unit comprises a plurality of single-color LED indicator lamps showing the working status of each module.
In the above scheme, the trapping module comprises a driving unit and a killing unit, the driving unit is connected with the killing unit, the driving unit is used for driving the killing unit to kill rats, the driving unit comprises a gate driving motor and a silicon controlled trigger circuit, and the killing unit comprises a high-voltage device.
In conclusion, the beneficial effects of the invention are as follows: the detection module acquires video image, voice, environmental, mouse and bait information for the detection area; the control module identifies both the acquired mouse images and the acquired mouse sounds, which improves the accuracy of mouse identification and reduces the probability of killing other animals by mistake; and the intelligent terminal module lets the user obtain the detection information remotely, saving labour.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments of the invention and, together with the description, serve to explain the invention and not to limit the invention.
Fig. 1 is a schematic composition diagram of the intelligent mousetrap.
Fig. 2 is a schematic diagram of the detection module.
Fig. 3 is a schematic composition diagram of a sensor unit.
Fig. 4 is a schematic composition diagram of a bait detection unit.
Fig. 5 is a schematic diagram of the composition of the video capture unit.
Fig. 6 is a schematic diagram of the components of the voice acquisition unit.
Fig. 7 is a schematic diagram of the control module.
FIG. 8 is a schematic diagram showing the composition of an analysis and comparison unit.
Fig. 9 is a schematic diagram of the composition of the video pre-processing unit.
Fig. 10 is a schematic diagram of the composition of the image recognition unit.
Fig. 11 is a schematic diagram of the composition of the audio recognition unit.
Fig. 12 is a schematic diagram of the audio conversion module.
FIG. 13 is a schematic diagram of the components of the human interaction module.
Figure 14 is a schematic composition diagram of a trap module.
Fig. 15 is a schematic structural view of the intelligent mousetrap of the invention.
List of reference numerals: the induction receiving head 1, the induction transmitting head 2 and the disposable bait box 3.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in further detail with reference to the following embodiments and accompanying drawings. The exemplary embodiments and descriptions of the present invention are provided to explain the present invention, but not to limit the present invention.
As shown in fig. 1, the intelligent mousetrap of the invention comprises a detection module, a control module, a human-computer interaction module, a trapping module, a data transmission module and an intelligent terminal module.
The connection relationship between the above modules of the present invention will be further described in detail with reference to the accompanying drawings.
Further, the detection module is used for acquiring video image information, voice information, environment information, mouse information and mouse bait information of a detection area; the control module is connected with the detection module, the human-computer interaction module and the trapping module are both connected with the control module, and the control module is used for processing the information detected by the detection module and then controlling the human-computer interaction module and the trapping module to work; the man-machine interaction module is used for sending an alarm signal under the control of the control module, sending operation instruction information of a user to the control module and then receiving receipt information sent by the control module; the trapping module is used for trapping mice in the detection area under the control of the control module; the intelligent terminal module is communicated with the control module through the data transmission module, and the intelligent terminal module is used for remotely acquiring mouse trapping information by a user and sending a remote control instruction to the control module.
In this embodiment, the data transmission module includes one or more of a WIFI communication unit, a 4G communication unit, and a 5G communication unit.
In this embodiment, the intelligent terminal module includes one or more of a smart phone, an iPad, a notebook computer, and the like.
As shown in fig. 2 and 3, the detection module includes a sensor unit including a bio-sensing module and an environmental information collection module, the bio-sensing module acquires bio-information and distance information of a detection area through a plurality of infrared sensing receivers; the environment information acquisition module comprises an air quality sensor, a temperature and humidity sensor and a brightness sensor, the air quality sensor is used for acquiring air quality information of a detection area, the temperature and humidity sensor is used for acquiring environment temperature and humidity information of the detection area, and the brightness sensor is used for acquiring environment brightness information of the detection area.
As shown in fig. 2 and 4, the detection module further comprises a bait detection unit. The bait detection unit comprises a material level detector, a first weight sensor and a miniature camera: the material level detector acquires information on the amount of bait remaining in the bait box, the first weight sensor acquires the weight of the bait in the bait box, and the miniature camera acquires image information of the bait in the bait box.
As shown in fig. 2 and 5, the detection module further comprises a video acquisition unit and a mouse weight detection unit. The video acquisition unit comprises a rotatable infrared camera, a stepping motor and a relay; the stepping motor and the relay are both connected to the rotatable infrared camera, the rotatable infrared camera captures video information of the detection area, the stepping motor drives the camera to rotate, and the relay switches the camera on and off. The mouse weight detection unit acquires the weight of a mouse through a second weight sensor arranged in front of the bait box.
As shown in fig. 2 and 6, the detection module further comprises a voice acquisition unit for collecting the sounds made by mice. The voice acquisition unit comprises an audio collection module and a preprocessing module: the audio collection module acquires audio information of the detection area through a plurality of omnidirectional microphones, and the preprocessing module, connected to the audio collection module, comprises an audio filter that filters the audio collected by the microphones and an audio signal amplifier, connected to the audio filter, that amplifies the filtered audio signal.
As shown in fig. 7, the control module comprises a cache unit and an analysis and comparison unit. The cache unit stores the information sent by the detection module, the preset value of each parameter, an image database and an audio database; the analysis and comparison unit is connected to the cache unit and analyzes and compares the information stored in the cache unit.
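The behaviour of the cache unit and the data comparison logic can be illustrated with a minimal Python sketch. The parameter names, preset values and target modules below are illustrative assumptions only; the patent does not specify them.

# Minimal sketch (assumed names and thresholds) of the idea that cached
# readings are compared against preset values and each out-of-range
# parameter yields a control signal for a specific module.
PARAM_PRESETS = {
    "bait_weight_g": 50.0,     # below this: bait considered insufficient
    "mouse_weight_g": 250.0,   # above this: possibly a pregnant mouse
    "brightness_lux": 10.0,    # below this: rely on the infrared camera
}

def check_parameters(cache):
    """Compare cached readings against presets and return control signals."""
    signals = []
    for name, preset in PARAM_PRESETS.items():
        value = cache.get(name)
        if value is None:
            continue
        if name == "bait_weight_g" and value < preset:
            signals.append(("alarm_unit", "bait_low"))
        elif name == "mouse_weight_g" and value > preset:
            signals.append(("terminal_module", "possible_pregnant_mouse"))
        elif name == "brightness_lux" and value < preset:
            signals.append(("video_unit", "use_infrared"))
    return signals

print(check_parameters({"bait_weight_g": 12.0, "mouse_weight_g": 310.0}))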
As shown in fig. 8 to 10, the analysis and comparison unit comprises a video preprocessing unit and an image recognition unit. The video preprocessing unit comprises a video conversion module, a denoising module, a color recognition module and a graying module: the video conversion module converts the video of the detection area acquired by the detection module into image frames; the denoising module, connected to the video conversion module, suppresses noise in these frames with a median filtering algorithm; the color recognition module, connected to the denoising module, detects the color model of the denoised image; and the graying module, connected to the color recognition module, converts the image to grayscale when the color recognition module determines that it is an RGB (color) image. The image recognition unit is connected to the video preprocessing unit and comprises a feature extraction module and a feature matching module: the feature extraction module extracts features from the preprocessed images by calling a feature extraction algorithm of the OpenCV (Open Source Computer Vision) library, and the feature matching module, connected to the feature extraction module, matches the extracted features against the feature information in the image database with a deep learning algorithm to obtain a matching rate, from which the image recognition result is derived.
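A minimal sketch of this preprocessing and matching chain, using OpenCV from Python, is given below. ORB features and a brute-force Hamming matcher stand in for the unspecified OpenCV feature extraction algorithm and the deep-learning matcher of the patent; the distance threshold and the crude match-rate formula are assumptions for illustration only.

import cv2
import numpy as np

def preprocess_frame(frame):
    """Median-filter the frame and convert colour frames to grayscale."""
    denoised = cv2.medianBlur(frame, 5)           # median filtering step
    if denoised.ndim == 3:                        # colour (RGB/BGR) image
        return cv2.cvtColor(denoised, cv2.COLOR_BGR2GRAY)
    return denoised

def match_rate(frame, reference):
    """Extract ORB features from both images and return a rough match rate."""
    orb = cv2.ORB_create()
    _, desc_a = orb.detectAndCompute(preprocess_frame(frame), None)
    _, desc_b = orb.detectAndCompute(preprocess_frame(reference), None)
    if desc_a is None or desc_b is None:
        return 0.0
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(desc_a, desc_b)
    good = [m for m in matches if m.distance < 40]
    return len(good) / max(len(desc_a), 1)

# Matching a random test frame against itself should give a high rate.
test = np.random.randint(0, 255, (240, 320, 3), dtype=np.uint8)
print("match rate vs. itself:", match_rate(test, test))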
As shown in fig. 11 and 12, the analysis and comparison unit further comprises an audio recognition unit and a data comparison unit. The audio recognition unit comprises an audio conversion module, a feature extraction module and a recognition module. The audio conversion module comprises a framing unit, a windowing unit, a Fourier transform unit, a spectrogram forming unit and a spectrogram conversion unit: the framing unit splits the voice information stored in the cache unit into frames; the windowing unit, connected to the framing unit, applies a window function to the framed signal; the Fourier transform unit, connected to the windowing unit, Fourier-transforms the windowed signal; the spectrogram forming unit, connected to the Fourier transform unit, takes the squared modulus of the transformed signal to obtain a spectrogram; and the spectrogram conversion unit, connected to the spectrogram forming unit, converts the spectrogram into a Mel spectrogram through a Mel filter bank. The feature extraction module, connected to the audio conversion module, extracts features from the Mel spectrogram with a trained residual network model, and the recognition module, connected to the feature extraction module, classifies the extracted features with a random forest classifier. The data comparison unit compares each parameter stored in the cache unit with its corresponding preset value and sends a control signal to the corresponding module according to the comparison result.
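The framing, windowing, Fourier transform, squared-modulus and Mel-filter steps of the audio conversion module can be sketched as follows with NumPy and librosa. The frame length, hop size and number of Mel bands are assumed values; the resulting Mel spectrogram would then be passed to the trained residual network and random forest classifier, which are not reproduced here.

import numpy as np
import librosa

def mel_spectrogram(signal, sr=16000, frame_len=400, hop=160, n_mels=64):
    """Framing -> windowing -> FFT -> squared modulus -> Mel filter bank."""
    window = np.hanning(frame_len)
    n_frames = 1 + (len(signal) - frame_len) // hop      # assumes signal >= one frame
    frames = np.stack([signal[i * hop:i * hop + frame_len] * window
                       for i in range(n_frames)])        # framing + windowing
    spectrum = np.fft.rfft(frames, n=frame_len, axis=1)  # Fourier transform
    power = np.abs(spectrum) ** 2                        # squared modulus
    mel_fb = librosa.filters.mel(sr=sr, n_fft=frame_len, n_mels=n_mels)
    return power @ mel_fb.T                              # Mel spectrogram

noise = np.random.randn(16000).astype(np.float32)   # 1 s of dummy audio
print(mel_spectrogram(noise).shape)                  # (98, 64) with these settings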
In this embodiment, the control module combines the recognition results of the image recognition unit and the audio recognition unit through the weighted sum S = aI + bV, where S is the final matching rate between the detected animal and a given mouse species, I is the matching rate obtained by the image recognition unit, V is the matching rate obtained by the audio recognition unit, and a and b are weighting coefficients. When the final matching rate exceeds the preset value, the control module judges that the detected animal is a mouse of that species, which improves the accuracy of mouse identification.
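As a worked illustration of this weighted fusion, suppose the image weight is a = 0.6, the audio weight is b = 0.4 and the decision threshold is 0.8; none of these values are fixed by the patent.

def fuse(image_rate, audio_rate, a=0.6, b=0.4, threshold=0.8):
    """Weighted fusion of image and audio matching rates: S = a*I + b*V."""
    s = a * image_rate + b * audio_rate
    return s, s >= threshold

print(fuse(0.9, 0.7))   # 0.6*0.9 + 0.4*0.7 = 0.82 -> (0.82, True)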
As shown in fig. 13, the human-computer interaction module comprises an alarm unit, a display unit and an indication unit. The alarm unit comprises a two-color LED lamp that indicates whether the bait in the bait box is sufficient and a voice announcer that announces bait-shortage and mouse-killing information; the display unit shows mouse-trapping information on an LCD touch screen and forwards the user's touch commands to the control module; and the indication unit comprises a plurality of single-color LED indicator lamps showing the working status of each module.
In this embodiment, the two-color LED lamp shows green when the bait in the bait box is sufficient and red when the bait falls below a preset value, and each single-color LED indicator lamp is lit while the module it indicates is operating and off otherwise.
In this embodiment, the control module builds an air quality vs. rat count curve, a temperature and humidity vs. rat count curve and a brightness vs. rat count curve from the environmental information and the number of rats detected over a period of time, and displays the curves on the LCD touch screen or on the intelligent terminal module, so that the user can accurately anticipate rodent activity and prepare control measures based on the environmental and rat information.
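A minimal sketch of how such environment vs. rat-count curves might be produced from logged readings follows; the log format, the dummy values and the output file name are illustrative assumptions, not part of the patent.

import matplotlib.pyplot as plt

# Hypothetical daily log rows: (temperature in C, humidity in %, brightness in lux, mice counted)
log = [
    (18.0, 60.0, 5.0, 3),
    (21.0, 70.0, 8.0, 5),
    (24.0, 75.0, 2.0, 9),
    (27.0, 80.0, 1.0, 12),
]

temperature = [row[0] for row in log]
brightness = [row[2] for row in log]
mice = [row[3] for row in log]

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))
ax1.plot(temperature, mice, marker="o")
ax1.set_xlabel("temperature (C)")
ax1.set_ylabel("mice counted")
ax2.plot(brightness, mice, marker="o")
ax2.set_xlabel("brightness (lux)")
ax2.set_ylabel("mice counted")
fig.tight_layout()
fig.savefig("rat_environment_curves.png")   # curve image shown on the LCD screen or pushed to the terminal module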
As shown in fig. 14, the trap module includes a driving unit and a killing unit, the driving unit is connected to the killing unit, the driving unit is used for driving the killing unit to kill rats, the driving unit includes a gate driving motor and a silicon controlled trigger circuit, and the killing unit includes a high voltage device.
As shown in fig. 15, the intelligent mousetrap is made of a corrosion-resistant material and comprises induction receiving heads 1, induction transmitting heads 2 and a disposable bait box 3. The disposable bait box 3 holds the mouse bait and attracts mice, and one miniature camera arranged behind the bait box photographs the state of the feed inside it. Four induction receiving heads 1 and four induction transmitting heads 2 are arranged on the two sides of the passage of the mouse box: when the infrared sensing receivers on the two sides of the passage detect that an object has entered the box, the control module starts the rotatable infrared camera and closes the entrances on both sides to prevent the object from escaping, and if the camera identifies the object as a non-mouse, the control module reopens the entrances so that other animals are not killed by mistake. The other four receivers and transmitters are arranged in the middle of the passage and are used to count the number of mice. A weight sensor installed in the bottom of the passage directly in front of the disposable bait box weighs the mouse accurately; this weight is transmitted to the control module, which compares it with the preset mouse weight through the data comparison unit to judge whether the mouse is pregnant, so that more accurate mouse information is collected. The killing is done with high voltage: a high-voltage device is installed in the bottom of the passage and a silicon-controlled trigger circuit is also installed in the mouse box, so that the trapped mouse can be electrocuted through the high-voltage device.
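The trapping sequence just described (confine, recognise, weigh, then kill or release) can be summarised in a hypothetical control-flow sketch. The hardware callbacks and the pregnancy-weight preset are placeholders, not an actual driver interface of the device.

from dataclasses import dataclass

PREGNANCY_WEIGHT_G = 250.0   # assumed preset, not stated in the patent

@dataclass
class Detection:
    species: str       # result of image/audio recognition
    match_rate: float  # fused matching rate S
    weight_g: float    # reading of the passage weight sensor

def handle_detection(d, close_gates, open_gates, trigger_kill, notify):
    """Confine the animal, release non-mice, weigh mice, then electrocute."""
    close_gates()
    if d.species != "mouse":
        open_gates()          # avoid killing non-target animals
        return
    if d.weight_g > PREGNANCY_WEIGHT_G:
        notify(f"possible pregnant mouse ({d.weight_g:.0f} g)")
    trigger_kill()            # silicon-controlled trigger fires the high-voltage device
    notify(f"mouse killed (match rate {d.match_rate:.2f})")

handle_detection(Detection("mouse", 0.87, 310.0),
                 close_gates=lambda: print("gates closed"),
                 open_gates=lambda: print("gates opened"),
                 trigger_kill=lambda: print("high-voltage pulse"),
                 notify=print)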
The above description is only a preferred embodiment of the present invention, and is not intended to limit the present invention, and various modifications and changes may be made to the embodiment of the present invention by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. An intelligent mouse trap, characterized by comprising: a detection module, a control module, a human-computer interaction module, a trapping module, a data transmission module and an intelligent terminal module;
the detection module is used for acquiring video image information, voice information, environment information, mouse information and mouse bait information of a detection area;
the control module is connected with the detection module, the human-computer interaction module and the trapping module are both connected with the control module, and the control module is used for processing the information detected by the detection module and then controlling the human-computer interaction module and the trapping module to work;
the man-machine interaction module is used for sending an alarm signal under the control of the control module, sending operation instruction information of a user to the control module and then receiving receipt information sent by the control module;
the trapping module is used for trapping mice in the detection area under the control of the control module;
the intelligent terminal module is communicated with the control module through the data transmission module, and the intelligent terminal module is used for remotely acquiring mouse trapping information by a user and sending a remote control instruction to the control module.
2. The intelligent mousetrap according to claim 1, wherein the detection module comprises a sensor unit, the sensor unit comprises a biosensing module and an environmental information acquisition module, and the biosensing module acquires biological information and distance information of a detection area through a plurality of infrared induction receivers; the environment information acquisition module comprises an air quality sensor, a temperature and humidity sensor and a brightness sensor, the air quality sensor is used for acquiring air quality information of a detection area, the temperature and humidity sensor is used for acquiring environment temperature and humidity information of the detection area, and the brightness sensor is used for acquiring environment brightness information of the detection area.
3. The intelligent mousetrap of claim 2, wherein the detection module further comprises a bait detection unit, the bait detection unit comprises a material level detector, a first weight sensor and a miniature camera, the material level detector is used for acquiring information on the amount of bait remaining in the bait box, the first weight sensor is used for acquiring weight information of the bait in the bait box, and the miniature camera is used for acquiring image information of the bait in the bait box.
4. The intelligent mousetrap of claim 3, wherein the detection module further comprises a video acquisition unit and a mouse weight detection unit, the video acquisition unit comprises a rotatable infrared camera, a stepping motor and a relay, the stepping motor and the relay are both connected with the rotatable infrared camera, the rotatable infrared camera is used for acquiring video information of a detection area, the stepping motor is used for driving the rotatable infrared camera to rotate, the relay is used for controlling the rotatable infrared camera to be opened and closed, and the mouse weight detection unit acquires the weight information of a mouse through a second weight sensor arranged in front of the bait box.
5. The intelligent mousetrap of claim 4, wherein the detection module further comprises a voice collection unit, the voice collection unit is configured to collect sounds made by a mouse, the voice collection unit comprises an audio collection module and a preprocessing module, the audio collection module obtains audio information of a detection area through a plurality of omnidirectional microphones, the preprocessing module is connected to the audio collection module, the preprocessing module comprises an audio filter and an audio signal amplifier, the audio filter is configured to filter the audio information collected by the omnidirectional microphones, the audio signal amplifier is connected to the audio filter, and the audio signal amplifier is configured to amplify the audio signal processed by the audio filter.
6. The intelligent mousetrap according to claim 1, wherein the control module comprises a cache unit and an analysis and comparison unit, the cache unit is configured to store the information sent by the detection module, the preset value information of each parameter, an image database and an audio database, the analysis and comparison unit is connected to the cache unit, and the analysis and comparison unit is configured to analyze and compare the information stored in the cache unit.
7. The intelligent mousetrap according to claim 6, wherein the analysis and comparison unit comprises a video preprocessing unit and an image recognition unit, the video preprocessing unit comprises a video conversion module, a denoising module, a color recognition module and a graying module, the video conversion module is used for converting video information of the detection area obtained by the detection module into image frames, the denoising module is connected with the video conversion module and suppresses noise in the image frames through a median filtering algorithm, the color recognition module is connected with the denoising module and detects the color model of the denoised image, and the graying module is connected with the color recognition module and performs graying processing on the image when the color recognition module detects that the image is an RGB model image; the image recognition unit is connected with the video preprocessing unit and comprises a feature extraction module and a feature matching module, the feature extraction module is used for extracting features of the images processed by the video preprocessing unit by calling a feature extraction algorithm of the OpenCV (Open Source Computer Vision) library, the feature matching module is connected with the feature extraction module and performs feature matching between the feature information extracted by the feature extraction module and the feature information of the image database through a deep learning algorithm to obtain a matching rate, and an image recognition result is obtained according to the matching rate.
8. The intelligent mousetrap of claim 7, wherein the analysis and comparison unit further comprises an audio recognition unit and a data comparison unit, the audio recognition unit comprises an audio conversion module, a feature extraction module and a recognition module, the audio conversion module comprises a framing unit, a windowing unit, a Fourier transform unit, a spectrogram forming unit and a spectrogram conversion unit, the framing unit is used for framing the voice information stored in the cache unit, the windowing unit is connected with the framing unit, the windowing unit is used for windowing the voice signal passing through the framing unit, the Fourier transform unit is connected with the windowing unit, the Fourier transform unit is used for Fourier transforming the voice signal passing through the windowing unit, and the spectrogram forming unit is connected with the Fourier transform unit, the spectrogram forming unit is used for performing modulus squaring on the signal passing through the Fourier transform unit to obtain a spectrogram, the spectrogram conversion unit is connected with the spectrogram forming unit and is used for converting the spectrogram into a Mel spectrogram through a Mel filter, the feature extraction module is connected with the audio conversion module and is used for performing feature extraction on the Mel spectrogram through a trained residual error network model, the recognition module is connected with the feature extraction module and is used for classifying and recognizing the features extracted by the feature extraction module through a random forest classifier; the data comparison unit is used for comparing each parameter stored in the cache unit with a parameter preset value corresponding to each parameter, and sending a control signal to a corresponding module according to a comparison result.
9. The intelligent mousetrap of claim 1, wherein the human-computer interaction module comprises an alarm unit, a display unit and an indication unit, the alarm unit comprises a bicolor LED lamp and a voice broadcast device, the bicolor LED lamp is used for indicating whether bait in a bait box is sufficient or not, the voice broadcast device is used for broadcasting insufficient bait information and mouse catching and killing information, the display unit displays mouse trapping information through an LCD touch display screen and sends touch operation instructions of a user to the control module, and the indication unit comprises a plurality of monochromatic LED indication lamps to indicate the working conditions of the modules.
10. The intelligent mousetrap of claim 1, wherein the trapping module comprises a driving unit and a killing unit, the driving unit is connected with the killing unit and is used for driving the killing unit to kill the mouse, the driving unit comprises a door driving motor and a silicon controlled trigger circuit, and the killing unit comprises a high-voltage device.
CN202110773066.4A 2021-07-08 2021-07-08 Intelligent mouse trap Active CN113545332B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110773066.4A CN113545332B (en) 2021-07-08 2021-07-08 Intelligent mouse trap

Publications (2)

Publication Number Publication Date
CN113545332A true CN113545332A (en) 2021-10-26
CN113545332B CN113545332B (en) 2023-03-28

Family

ID=78102809

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110773066.4A Active CN113545332B (en) 2021-07-08 2021-07-08 Intelligent mouse trap

Country Status (1)

Country Link
CN (1) CN113545332B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003070408A (en) * 2001-09-05 2003-03-11 Hideaki Murakami Rat expeller
CN205624111U (en) * 2016-05-25 2016-10-12 黄天琪 Open electron trapper
CN111387170A (en) * 2020-03-05 2020-07-10 中国地质大学(武汉) Intelligent inspection mouse trapping robot device
CN111866455A (en) * 2020-07-03 2020-10-30 山东科技大学 Intelligent electronic sensing device
CN112189656A (en) * 2020-09-19 2021-01-08 庾舜 Intelligent mouse blocking board for internet of things

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118020749A (en) * 2024-04-09 2024-05-14 深圳市纬信科技有限公司 Rat repelling method and system based on sensor data analysis
CN118020749B (en) * 2024-04-09 2024-06-07 深圳市纬信科技有限公司 Rat repelling method and system based on sensor data analysis

Also Published As

Publication number Publication date
CN113545332B (en) 2023-03-28

Similar Documents

Publication Publication Date Title
CN109886999B (en) Position determination method, device, storage medium and processor
JP7018462B2 (en) Target object monitoring methods, devices and systems
Miller A method for determining relative activity of free flying bats using a new activity index for acoustic monitoring
Fanioudakis et al. Mosquito wingbeat analysis and classification using deep learning
CN109299683B (en) Security protection evaluation system based on face recognition and behavior big data
CN109145032A (en) A kind of bee raising intelligent monitoring method and system
KR101118245B1 (en) A preconsideration management system of fruit tree insect pest by extracting insect type and distribution from photograph images
CN109984054B (en) Estrus detection method, estrus detection device and estrus detection system
CN106332855A (en) Automatic early warning system for pests and diseases
CN203324781U (en) Pest trapping apparatus and pest remote identifying and monitoring system
JP3796526B2 (en) Pest counting device
CN110728810B (en) Distributed target monitoring system and method
CN107346424A (en) Lamp lures insect identification method of counting and system
CN112106747A (en) Intelligent agricultural insect pest remote automatic monitoring system
CN113545332B (en) Intelligent mouse trap
CN106570534A (en) Automatic small insect trapping detection method and system thereof
CN107681784B (en) Steel tower ground wire monitoring system and method based on multisource data fusion
CN109757395A (en) A kind of pet behavioral value monitoring system and method
CN107609600A (en) A kind of greenhouse insect-sticking plate insect automatic recognition classification method and system
CN113419445A (en) Intelligent elevator control system and control method based on Internet of things and AI
CN113135368A (en) Intelligent garbage front-end classification system and method
CN114460080A (en) Rice disease and pest intelligent monitoring system
CN111652128B (en) High-altitude power operation safety monitoring method, system and storage device
CN201004268Y (en) Mouse type and density monitor
CN108462855A (en) A kind of trapping lamp long-distance video monitoring system that can observe desinsection situation in real time

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20230105

Address after: 350004 27 Business Office, 34th Floor, B1 # Building, Fuli Business Center (Zone 2) (Zone B of Fuli Center), No. 6, Xiangban Street, Ninghua Street, Taijiang District, Fuzhou City, Fujian Province

Applicant after: Fujian bangheng Environmental Science Co.,Ltd.

Address before: No. 6, Xiangban Street, Ninghua Street, Taijiang District, Fuzhou City, Fujian Province (former south side of Shangpu Road), 23/F, B1 # 2, Fuli Business Center (Zone 2) (Zone B, Fuli Center), 350004

Applicant before: Zhongwei Zhilian (Fujian) Information Technology Co.,Ltd.

GR01 Patent grant