CN116168415A - Animal monitoring and identifying method, device, equipment and storage medium


Info

Publication number
CN116168415A
CN116168415A
Authority
CN
China
Prior art keywords
environment
animal
sound data
animal monitoring
sound
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211623835.3A
Other languages
Chinese (zh)
Inventor
孔庆云
李杨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xi'an Tianhe Defense Technology Co., Ltd.
Original Assignee
Xi'an Tianhe Defense Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xi'an Tianhe Defense Technology Co., Ltd.
Priority to CN202211623835.3A
Publication of CN116168415A

Classifications

    • G - PHYSICS
    • G10 - MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L - SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L 17/00 - Speaker identification or verification techniques
    • G10L 17/26 - Recognition of special voice characteristics, e.g. for use in lie detectors; Recognition of animal voices
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/08 - Learning methods
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 - Arrangements for image or video recognition or understanding
    • G06V 10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 - Scenes; Scene-specific elements
    • G06V 20/40 - Scenes; Scene-specific elements in video content
    • G06V 20/41 - Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Software Systems (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Computational Linguistics (AREA)
  • Human Computer Interaction (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Databases & Information Systems (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Acoustics & Sound (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Data Mining & Analysis (AREA)
  • Molecular Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Emergency Alarm Devices (AREA)

Abstract

The application is applicable to the technical field of target identification, and provides an animal monitoring and identifying method, device, equipment and storage medium. The animal monitoring and identifying method comprises the following steps: collecting sound in an environment to obtain sound data of the environment; shooting a target in the environment according to the sound data to obtain video image data of the environment; and performing animal monitoring and identification according to the sound data and the video image data to obtain and output an animal monitoring and identification result of the environment. The method and the device can enhance the information sensing capability of animal monitoring and identification and improve the accuracy of animal monitoring and identification.

Description

Animal monitoring and identifying method, device, equipment and storage medium
Technical Field
The application belongs to the technical field of target identification, and particularly relates to an animal monitoring and identification method, an animal monitoring and identification device, animal monitoring and identification equipment and a storage medium.
Background
Currently, field animal monitoring and identification mainly relies on shooting with a camera and analyzing the video images. In a wild jungle environment, such methods are generally limited by light and visibility, weak concealment, cluttered backgrounds, susceptibility to electromagnetic interference and the like, so the information sensing capability of the monitoring and identifying equipment is poor; moreover, because of the complexity of the environment, animals are often occluded, and relying on video images alone leads to poor accuracy of animal monitoring and identification.
Disclosure of Invention
The embodiments of the present application provide an animal monitoring and identifying method, device, equipment and storage medium, which can solve the problems of poor information sensing capability and poor accuracy of animal monitoring and identification in the prior art.
A first aspect of embodiments of the present application provides an animal monitoring and identification method, including:
collecting sound in an environment to obtain sound data of the environment;
according to the sound data, shooting a target in the environment to obtain video image data of the environment;
and performing animal monitoring and identification according to the sound data and the video image data to obtain and output an animal monitoring and identification result of the environment.
A second aspect of embodiments of the present application provides an animal monitoring and identification device comprising:
the sound collection module is used for collecting sound in the environment to obtain sound data of the environment;
the image acquisition module is used for shooting a target in the environment according to the sound data to obtain video image data of the environment;
and the monitoring and identifying module is used for performing animal monitoring and identification according to the sound data and the video image data to obtain and output an animal monitoring and identification result of the environment.
A third aspect of the embodiments of the present application provides a terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the animal monitoring and identification method as described above when executing the computer program.
A fourth aspect of the embodiments of the present application provides a computer readable storage medium storing a computer program which when executed by a processor implements an animal monitoring and identification method as described above.
According to the animal monitoring and identifying method provided by the first aspect of the embodiments of the present application, sound collection is performed in the environment to obtain sound data of the environment, target shooting is performed in the environment according to the sound data to obtain video image data of the environment, and animal monitoring and identification is performed according to the sound data and the video image data to obtain and output an animal monitoring and identification result of the environment. This can enhance the information perception capability of animal monitoring and identification and improve the accuracy of animal monitoring and identification.
It will be appreciated that the advantages of the second, third and fourth aspects may be found in the relevant description of the first aspect and are not repeated here.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application or in the prior art, the drawings required in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and a person skilled in the art may obtain other drawings from these drawings without inventive effort.
Fig. 1 is a schematic flow chart of a first method for monitoring and identifying animals according to an embodiment of the present application;
FIG. 2 is a schematic flow chart of a second method for monitoring and identifying animals according to an embodiment of the present application;
fig. 3 is a schematic flow chart of a third method for monitoring and identifying animals according to an embodiment of the present application;
fig. 4 is a schematic flow chart of a fourth method for monitoring and identifying animals according to an embodiment of the present application;
fig. 5 is a schematic flow chart of a fifth method for monitoring and identifying animals according to an embodiment of the present application;
FIG. 6 is a general flow chart of an animal monitoring and identification method provided in an embodiment of the present application;
fig. 7 is a schematic structural diagram of an animal monitoring and identifying device according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system configurations, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in this specification and the appended claims, the term "if" may be interpreted, depending on the context, as "when", "once", "in response to determining" or "in response to detecting". Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted, depending on the context, as "upon determining", "in response to determining", "upon detecting [the described condition or event]" or "in response to detecting [the described condition or event]".
In addition, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used merely to distinguish between descriptions and are not to be construed as indicating or implying relative importance.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise. "plurality" means "two or more".
Embodiment 1
The first embodiment of the present application provides an animal monitoring and identifying method, which may be executed by a processor of a terminal device when running a corresponding computer program. The method performs sound collection in an environment to obtain sound data of the environment, performs target shooting in the environment according to the sound data to obtain video image data of the environment, and performs animal monitoring and identification according to the sound data and the video image data to obtain and output an animal monitoring and identification result of the environment, thereby enhancing the information sensing capability of animal monitoring and identification and improving the accuracy of animal monitoring and identification.
As shown in fig. 1, the animal monitoring and identifying method provided in this embodiment includes the following steps S11 to S13:
s11, collecting sound in the environment to obtain sound data of the environment.
In application, the sound collection in the environment may be front-end, edge-side sound collection performed by a sound sensor of a wild animal monitoring device in an environment where wild animals need to be monitored and identified, such as a wild jungle environment or a grassland environment. The sound collection may capture the various sounds emitted in the environment, including animal sounds. Sound is an important information carrier and has characteristics such as a longer wavelength and good penetration; applying sound data to animal monitoring and identification can effectively enhance the information perception capability of wild animal monitoring equipment in a wild jungle environment.
And S12, shooting a target in the environment according to the sound data to obtain video image data of the environment.
In application, shooting a target in the environment to obtain video image data of the environment may mean that the shooting device of the wild animal monitoring device shoots the target that emits the sound within its field of view, obtaining video image data containing the target, where the target includes a wild animal. The shooting device may be a visible light camera or an infrared camera. Shooting the target in the environment according to the sound data may mean that the shooting device is controlled to start when the sound data meets a preset condition, and the shooting device then shoots the target in the environment, where the shooting duration may be 10 to 60 seconds. For example, the preset condition may be that the frequency of the sound data falls within a preset frequency interval, or that the sound data is detected as the sound of certain target animals.
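A minimal sketch of the frequency-interval form of the preset condition is given below; the band limits, sample rate and function names are illustrative assumptions rather than values specified by this embodiment.

```python
import numpy as np

# Hypothetical frequency interval for animal calls (an assumption, not a value
# taken from this embodiment).
TARGET_BAND_HZ = (300.0, 8000.0)

def dominant_frequency(samples: np.ndarray, sample_rate: int) -> float:
    """Return the frequency (Hz) with the largest magnitude in the spectrum."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    return float(freqs[np.argmax(spectrum)])

def should_start_camera(samples: np.ndarray, sample_rate: int) -> bool:
    """Trigger the shooting device when the dominant frequency lies in the preset band."""
    f0 = dominant_frequency(samples, sample_rate)
    return TARGET_BAND_HZ[0] <= f0 <= TARGET_BAND_HZ[1]
```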
And S13, performing animal monitoring and identification according to the sound data and the video image data to obtain and output an animal monitoring and identification result of the environment.
In application, performing animal monitoring and identification according to the sound data and the video image data to obtain and output an animal monitoring and identification result of the environment may mean that the sound data and the video image data, named with the current timestamp, are transmitted back to background software through a communication network, where the communication network may include wired, wireless, IoT and other modes. The background software then performs fusion monitoring and identification on the sound data and the video image data through a preset algorithm program, monitors whether a target animal exists in the environment, identifies the category of the target animal if one exists, and finally outputs the identification result. By fusing the sound data and the video image data for monitoring and identification, the low accuracy of video-image-only identification caused by occlusion of wild animals moving in jungle and similar environments can be effectively mitigated.
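A minimal sketch of naming and pairing the captured files by a shared timestamp is shown below; the file formats, paths and function name are assumptions, and the actual transmission to the background software is not shown.

```python
import time
from pathlib import Path

def save_capture(audio_bytes: bytes, video_bytes: bytes, out_dir: str = "captures") -> str:
    """Store the sound data and video image data under one timestamp name."""
    stamp = time.strftime("%Y%m%d_%H%M%S")          # shared name for both files
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    (out / f"{stamp}.wav").write_bytes(audio_bytes)  # sound data
    (out / f"{stamp}.mp4").write_bytes(video_bytes)  # video image data
    return stamp                                     # the backend pairs files by this stamp
```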
According to the animal monitoring and identifying method provided in this embodiment, sound collection is performed in the environment to obtain sound data of the environment, target shooting is performed in the environment according to the sound data to obtain video image data of the environment, and animal monitoring and identification is performed according to the sound data and the video image data to obtain and output an animal monitoring and identification result of the environment, which can enhance the information perception capability of animal monitoring and identification and improve the accuracy of animal monitoring and identification.
Embodiment 2
The second embodiment of the present application provides an animal monitoring and identifying method implemented on the basis of the first embodiment, which may be executed by a processor of a terminal device when running a corresponding computer program.
As shown in fig. 2, step S11 includes: S21, performing infrared heat source sensing in the environment; and S22, when an infrared heat source is sensed, performing sound collection in the environment to obtain sound data of the environment.
In application, the infrared heat source sensing in the environment may be performed by an infrared sensor of the wild animal monitoring device. When no animal appears in the environment, the wild animal monitoring equipment as a whole does not work: only the built-in main control board remains in ultra-low-power standby and the infrared sensor operates. When the infrared sensor senses an infrared heat source, the main control board starts the sound sensor of the wild animal monitoring equipment to collect sound in the environment. Starting the sound sensor only when an infrared heat source is sensed reduces false starts of the wild animal monitoring equipment and prolongs its service life.
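A minimal sketch of this standby and wake-up flow is given below; the sensor objects and their method names are placeholders, not a real device API.

```python
import time

def wait_and_record(pir_sensor, sound_sensor, record_seconds: int = 5,
                    poll_interval_s: float = 0.5):
    """Stay in low-power standby; wake the microphone when a heat source is sensed."""
    while not pir_sensor.heat_source_detected():    # only the infrared sensor is active
        time.sleep(poll_interval_s)
    sound_sensor.power_on()                         # main control board wakes the sound sensor
    audio = sound_sensor.record(seconds=record_seconds)
    sound_sensor.power_off()
    return audio                                    # handed to the trigger decision stage
```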
In one embodiment, as shown in fig. 3, step S12 includes: S31, determining whether a target animal appears in the environment according to the sound data and the infrared heat source; and S32, when it is determined that a target animal appears in the environment, shooting the target in the environment to obtain video image data of the environment. Optionally, as shown in fig. 4, step S31 includes: S41, detecting the sound data through a first preset algorithm to obtain the probability that the sound data is an animal sound; and S42, when the probability is greater than or equal to a preset threshold, or when the probability is smaller than the preset threshold and the stay time of the infrared heat source is longer than a first preset duration, determining that a target animal appears in the environment. Optionally, step S31 further includes: when the sound data is not collected and the stay time of the infrared heat source is longer than a second preset duration, determining that a target animal appears in the environment.
In application, after the sound data of the environment is obtained by collecting sound in the environment through the above-mentioned sound sensor, the sound data collected by the sound sensor can be transmitted to the main control board of the wild animal monitoring equipment, and an embedded judgment and trigger module of the main control board determines whether a target animal appears. Specifically, the sound data can be detected by a front-end embedded software algorithm program built into the main control board to obtain the probability that the sound data is an animal sound. If this probability is greater than or equal to a preset threshold, such as 70%, it is determined that a target animal appears in the environment; the shooting device of the wild animal monitoring device is then started to shoot the target animal in the environment. If the probability is smaller than the preset threshold, such as 70%, but the stay time of the infrared heat source is longer than a first preset duration, such as 1 second, it is likewise determined that a target animal appears in the environment, and the shooting device is started to shoot the target animal in the environment.
In application, when the sound sensor does not collect any sound data, whether a target animal appears in the environment can be judged from the stay time of the infrared heat source alone. Specifically, when the stay time of the infrared heat source is longer than a second preset duration, such as 2 seconds, it is determined that a target animal appears in the environment, and the shooting device of the wild animal monitoring device is controlled to start and shoot the target animal in the environment; otherwise, the sound sensor is turned off and the wild animal monitoring device returns to the standby state. Using both the sound data and the infrared heat source to control whether the shooting device starts further reduces the false start rate of the wild animal monitoring equipment and helps meet its requirements of low cost, low power consumption and high reliability during long-term operation.
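A minimal sketch of this combined sound/infrared trigger decision is given below, using the example values mentioned above (70% threshold, 1 s and 2 s dwell times); the probability itself is assumed to be supplied by the first preset algorithm, which is not implemented here.

```python
from typing import Optional

PROB_THRESHOLD = 0.70   # preset threshold for the "animal sound" probability
FIRST_DWELL_S = 1.0     # first preset duration for the infrared heat source
SECOND_DWELL_S = 2.0    # second preset duration, used when no sound was collected

def target_animal_present(sound_prob: Optional[float], ir_dwell_s: float) -> bool:
    """Decide whether to start the shooting device.

    sound_prob is None when the sound sensor collected no sound data.
    """
    if sound_prob is None:                # no sound data: rely on IR dwell time only
        return ir_dwell_s > SECOND_DWELL_S
    if sound_prob >= PROB_THRESHOLD:      # confident animal sound
        return True
    return ir_dwell_s > FIRST_DWELL_S     # low-confidence sound, but the heat source lingers
```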
In one embodiment, as shown in fig. 5, step S13 includes: S51, recognizing the sound data through a second preset algorithm to obtain a first recognition result; S52, recognizing the video image data through a third preset algorithm to obtain a second recognition result; S53, fusing the first recognition result and the second recognition result to obtain a fused recognition result; and S54, recognizing the fused recognition result through a fourth preset algorithm to obtain and output an animal monitoring and identification result of the environment. Optionally, step S13 further includes: when the sound data is not collected, recognizing the video image data through the third preset algorithm to obtain and output an animal monitoring and identification result of the environment.
In application, after the sound data and the video image data, named with the current timestamp, are transmitted back to the background software through the communication network, if sound data and video image data both exist for the same timestamp, the sound data can be recognized through a deep residual convolutional neural network algorithm to obtain a first recognition result, and the video image data can be recognized through the YOLOv5 algorithm to obtain a second recognition result. During training, the deep residual convolutional neural network can fuse three features, MFCCs, log-mel spectrograms and short-time energy, which performs better in practice than any single feature. At the input end, the YOLOv5 algorithm can adopt measures such as Mosaic data augmentation, adaptive anchor box calculation and adaptive image scaling, and an FPN+PAN structure is added, which improves the accuracy and robustness of the algorithm; compared with other algorithms, both its speed and accuracy are greatly improved.
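A minimal sketch of extracting the three audio features with librosa is shown below; the frame sizes and feature dimensions are assumptions, and how the features are fused into the network input is not specified here.

```python
import numpy as np
import librosa

def extract_audio_features(wav_path: str, sr: int = 16000,
                           n_fft: int = 2048, hop: int = 512):
    """Return MFCC, log-mel and short-time energy features for one recording."""
    y, sr = librosa.load(wav_path, sr=sr)

    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13,
                                n_fft=n_fft, hop_length=hop)           # (13, T)
    mel = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=64,
                                         n_fft=n_fft, hop_length=hop)
    log_mel = librosa.power_to_db(mel)                                  # (64, T)

    frames = librosa.util.frame(y, frame_length=n_fft, hop_length=hop)  # (n_fft, T')
    short_time_energy = np.sum(frames ** 2, axis=0, keepdims=True)      # (1, T')

    return mfcc, log_mel, short_time_energy
```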
In application, the first recognition result and the second recognition result are fused at the decision level to obtain a fused recognition result, and the final target category of the target animal is confirmed from the fused recognition result through a Bayesian network. Given the practical requirements of wild animal monitoring equipment, the fusion recognition algorithm must have a small computation load; at the same time, since sensors may be added or removed in different environments, the equipment must be easy to extend while maintaining detection accuracy under the given environmental conditions, so a Bayesian network can be selected for fusion recognition to achieve the expected effect. If the sound sensor collects no sound data and only video image data exists for a given timestamp, animal category identification is performed on the video image data alone through the YOLOv5 algorithm.
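A minimal sketch of decision-level fusion of the two recognizers' per-class scores is shown below; treating the scores as conditionally independent class likelihoods combined under a class prior is an assumption standing in for the Bayesian network, whose exact structure is not given here.

```python
import numpy as np

def fuse_decisions(audio_scores: np.ndarray,
                   video_scores: np.ndarray,
                   class_prior: np.ndarray) -> int:
    """Return the index of the fused target category.

    All inputs are 1-D arrays over the same set of animal categories; the two
    score vectors are treated as class-conditional likelihoods.
    """
    eps = 1e-12
    posterior = class_prior * (audio_scores + eps) * (video_scores + eps)
    posterior /= posterior.sum()          # normalize to a probability distribution
    return int(np.argmax(posterior))      # final target category

# Example: three categories with a uniform prior.
# fuse_decisions(np.array([0.7, 0.2, 0.1]),
#                np.array([0.5, 0.4, 0.1]),
#                np.ones(3) / 3)
```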
Fig. 6 is a general flow chart of the animal monitoring and identifying method provided in this embodiment. As shown in fig. 6, the general flow is as follows: the main control board runs in an ultra-low-power standby state while the infrared sensor operates; when the infrared sensor senses an infrared heat source, the main control board starts the sound sensor to collect sound in the environment and obtain sound data, and starts the front-end embedded software algorithm program in the main control board in real time to detect the sound data; whether a target animal exists in the environment is then judged by combining the sound data and the infrared heat source, and when a target animal exists in the environment, the shooting equipment is started to shoot the target and obtain video image data; finally, the sound data and the video image data, named by timestamp, are transmitted back to the background software through the communication network, and animal category identification is performed by fusion identification of sound and video.
According to the animal monitoring and identifying method, the identification result of the sound data and the identification result of the video image data are fused and identified, so that final animal category identification is performed, and the accuracy of animal monitoring and identifying can be improved.
Embodiment 3
As shown in fig. 7, the present embodiment further provides an animal monitoring and identifying device, the animal monitoring and identifying device 700 includes:
the sound collection module 701 is configured to collect sound in an environment to obtain sound data of the environment;
the image acquisition module 702 is configured to perform target shooting in the environment according to the sound data, so as to obtain video image data of the environment;
and the monitoring and identifying module 703 is configured to perform animal monitoring and identification according to the sound data and the video image data, and to obtain and output an animal monitoring and identification result of the environment.
Optionally, the sound collection module 701 includes:
the infrared sensing unit is used for performing infrared heat source sensing in the environment;
and the sound collection unit is used for collecting sound in the environment when the infrared heat source is sensed, so as to obtain sound data of the environment.
Optionally, the image acquisition module 702 includes:
the target judging unit is used for determining whether a target animal appears in the environment according to the sound data and the infrared heat source;
and the image acquisition unit is used for shooting a target in the environment when the presence of the target animal in the environment is determined, so as to obtain video image data of the environment.
Optionally, the target judgment unit includes:
the probability calculation unit is used for detecting the sound data through a first preset algorithm to obtain the probability that the sound data is animal sound;
the target determining unit is used for determining that a target animal appears in the environment when the probability is larger than or equal to a preset threshold value or the probability is smaller than the preset threshold value and the stay time of the infrared heat source is longer than a first preset time length.
Optionally, the target judging unit is configured to determine that a target animal appears in the environment when the sound data is not collected and the stay time of the infrared heat source is longer than a second preset time.
Optionally, the monitoring and identifying module 703 includes:
the first recognition unit is used for recognizing the sound data through a second preset algorithm to obtain a first recognition result;
the second identification unit is used for identifying the video image data through a third preset algorithm to obtain a second identification result;
the identification fusion unit is used for fusing the first identification result and the second identification result to obtain a fusion identification result;
and the final recognition unit is used for recognizing the fusion identification result through a fourth preset algorithm to obtain and output an animal monitoring and identification result of the environment.
Optionally, the monitoring and identifying module 703 is configured to identify the video image data by a third preset algorithm when the sound data is not collected, and obtain and output an animal monitoring and identifying result of the environment.
It should be noted that, because the content of information interaction and execution process between the above devices/units is based on the same concept as the method embodiment of the present application, specific functions and technical effects thereof may be referred to in the method embodiment section, and will not be described herein again.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
The embodiment of the present application further provides a terminal device 800, as shown in fig. 8, including a memory 801, a processor 802, and a computer program 803 stored in the memory 801 and executable on the processor 802, where the processor 802 implements the steps of the animal monitoring and identifying method provided in the first aspect when executing the computer program 803.
In application, the terminal device may include, but is not limited to, a processor and a memory, fig. 8 is merely an example of the terminal device and does not constitute limitation of the terminal device, and may include more or less components than illustrated, or combine some components, or different components, such as an input-output device, a network access device, and the like. The input output devices may include cameras, audio acquisition/playback devices, display screens, and the like. The network access device may include a network module for wireless networking with an external device.
In application, the processor may be a central processing unit (Central Processing Unit, CPU), which may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), field-programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
In applications, the memory may in some embodiments be an internal storage unit of the terminal device, such as a hard disk or a memory of the terminal device. The memory may in other embodiments also be an external storage device of the terminal device, for example a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash Card (Flash Card) or the like, which are provided on the terminal device. The memory may also include both internal storage units of the terminal device and external storage devices. The memory is used to store an operating system, application programs, boot Loader (Boot Loader), data, and other programs, etc., such as program code for a computer program, etc. The memory may also be used to temporarily store data that has been output or is to be output.
The embodiments of the present application also provide a computer readable storage medium storing a computer program, where the computer program can implement the steps in the above-mentioned method embodiments when executed by a processor.
All or part of the processes in the methods of the above embodiments may be implemented by a computer program, which may be stored in a computer readable storage medium and which, when executed by a processor, implements the steps of the method embodiments described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer readable medium may include at least: any entity or device capable of carrying the computer program code to the terminal device, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunication signal, and a software distribution medium, such as a USB flash drive, a removable hard disk, a magnetic disk or an optical disk.
In the foregoing embodiments, the description of each embodiment has its own emphasis; for parts that are not described or detailed in a particular embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative apparatus and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative; the coupling, direct coupling or communication connection shown or discussed between components may be implemented through some interfaces, and the indirect coupling or communication connection between apparatuses or units may be electrical, mechanical or in other forms.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (10)

1. A method of monitoring and identifying animals, comprising:
collecting sound in an environment to obtain sound data of the environment;
according to the sound data, shooting a target in the environment to obtain video image data of the environment;
and performing animal monitoring and identification according to the sound data and the video image data to obtain and output an animal monitoring and identification result of the environment.
2. The method of claim 1, wherein the performing sound collection in the environment to obtain sound data of the environment comprises:
performing infrared heat source sensing in the environment;
and when an infrared heat source is sensed, performing sound collection in the environment to obtain sound data of the environment.
3. The method of claim 2, wherein the shooting a target in the environment according to the sound data to obtain video image data of the environment comprises:
determining whether a target animal is present in the environment based on the sound data and the infrared heat source;
and when the presence of the target animal in the environment is determined, shooting the target in the environment to obtain video image data of the environment.
4. The method of claim 3, wherein said determining whether a target animal is present in said environment based on said sound data and said infrared heat source comprises:
detecting the sound data through a first preset algorithm to obtain the probability that the sound data is animal sound;
and when the probability is greater than or equal to a preset threshold value, or the probability is less than the preset threshold value and the stay time of the infrared heat source is longer than a first preset time length, determining that the target animal appears in the environment.
5. The method of claim 3, wherein said determining whether a target animal is present in said environment based on said sound data and said infrared heat source further comprises:
and when the sound data are not acquired and the stay time of the infrared heat source is longer than a second preset time, determining that the target animal appears in the environment.
6. The animal monitoring and identification method according to any one of claims 1 to 5, wherein the performing animal monitoring and identification according to the sound data and the video image data to obtain and output an animal monitoring and identification result of the environment comprises:
identifying the sound data through a second preset algorithm to obtain a first identification result;
identifying the video image data through a third preset algorithm to obtain a second identification result;
fusing the first identification result and the second identification result to obtain a fused identification result;
and identifying the fused identification result through a fourth preset algorithm to obtain and output an animal monitoring and identification result of the environment.
7. The animal monitoring and identification method according to any one of claims 1 to 5, wherein the performing animal monitoring and identification according to the sound data and the video image data to obtain and output an animal monitoring and identification result of the environment further comprises:
and when the sound data is not acquired, the video image data is identified through a third preset algorithm, and an animal monitoring and identifying result of the environment is obtained and output.
8. An animal monitoring and identification device, comprising:
the sound collection module is used for collecting sound in the environment to obtain sound data of the environment;
the image acquisition module is used for shooting a target in the environment according to the sound data to obtain video image data of the environment;
and the monitoring and identifying module is used for performing animal monitoring and identification according to the sound data and the video image data to obtain and output an animal monitoring and identification result of the environment.
9. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the animal monitoring and identification method according to any one of claims 1 to 7 when executing the computer program.
10. A computer readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, implements the animal monitoring and identification method according to any one of claims 1 to 7.
CN202211623835.3A 2022-12-16 2022-12-16 Animal monitoring and identifying method, device, equipment and storage medium Pending CN116168415A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211623835.3A CN116168415A (en) 2022-12-16 2022-12-16 Animal monitoring and identifying method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211623835.3A CN116168415A (en) 2022-12-16 2022-12-16 Animal monitoring and identifying method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116168415A (en) 2023-05-26

Family

ID=86412310

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211623835.3A Pending CN116168415A (en) 2022-12-16 2022-12-16 Animal monitoring and identifying method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116168415A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination