CN115835012B - Artificial intelligence-based active organism shooting method and intelligent hunting camera - Google Patents


Info

Publication number
CN115835012B
Authority
CN
China
Prior art keywords
intelligent
habit
complete
hunting camera
hunting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211450056.8A
Other languages
Chinese (zh)
Other versions
CN115835012A (en)
Inventor
王尔康 (Wang Erkang)
周松河 (Zhou Songhe)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
HUARUI YANNENG TECHNOLOGY (SHENZHEN) CO LTD
Original Assignee
HUARUI YANNENG TECHNOLOGY (SHENZHEN) CO LTD
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by HUARUI YANNENG TECHNOLOGY (SHENZHEN) CO LTD filed Critical HUARUI YANNENG TECHNOLOGY (SHENZHEN) CO LTD
Priority to CN202211450056.8A priority Critical patent/CN115835012B/en
Publication of CN115835012A publication Critical patent/CN115835012A/en
Application granted granted Critical
Publication of CN115835012B publication Critical patent/CN115835012B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Abstract

The method intelligently identifies how completely the habits of an object in the infrared sensing area have already been collected, and accordingly triggers high-pixel shooting, low-pixel shooting, or no shooting at all, so that the intelligent hunting camera can obtain high-definition footage supporting field animal observation and research while greatly extending its battery life.

Description

Artificial intelligence-based active organism shooting method and intelligent hunting camera
Technical Field
The present application relates to the fields of artificial intelligence and intelligent terminals, and in particular to an artificial-intelligence-based method for shooting active organisms and to an intelligent hunting camera.
Background
Hunting cameras (trail cameras) are mainly used for hunting, field animal observation, and related research. Compared with conventional cameras, they offer extremely long standby time, strong waterproofing, and adaptability to a variety of field environments.
With the development of camera technology, the resolution of hunting cameras has improved greatly, from 5 megapixels to 8 megapixels to 12 megapixels and even higher, better meeting the data-collection requirements of field biological research.
However, while higher pixel counts yield higher-definition pictures, they also sharply increase the hunting camera's power consumption, greatly shortening its battery life.
Disclosure of Invention
The present application provides an artificial-intelligence-based method for shooting active organisms and an intelligent hunting camera, which extend the camera's battery life while still allowing it to obtain the high-definition pictures needed for field animal observation and research.
In a first aspect, the present application provides an artificial-intelligence-based method for shooting active organisms, applied to an intelligent hunting camera. The method comprises: when a new living organism enters the infrared sensing area and the current shooting mode is the ecological diversity mode, the intelligent hunting camera shoots a high-pixel image; the intelligent hunting camera inputs the high-pixel image and all media already shot in its gallery into a habit-sampling-completeness analysis intelligent model to obtain the completeness of the collected habits of a first object, where the first object is an active living organism identified in the high-pixel image; the completeness is one of: near habit incomplete; near habit complete but distant habit incomplete; or near habit complete and distant habit complete. When the near habit of the first object is determined to be incomplete, the intelligent hunting camera starts shooting high-pixel video; when the near habit is determined to be complete but the distant habit incomplete, the camera starts shooting low-pixel video, where the resolution of a low-pixel frame is less than 1/2 of the resolution of the high-pixel image; and when both the near habit and the distant habit are determined to be complete, the intelligent hunting camera does not start video recording.
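The three-way decision described in this aspect can be sketched as follows (a minimal illustration; the enum, function name, and action strings are hypothetical, not part of the patent):

```python
from enum import Enum

class Completeness(Enum):
    NEAR_INCOMPLETE = 1                  # close-range habit data still missing
    NEAR_COMPLETE_FAR_INCOMPLETE = 2     # close-range done, long-range missing
    NEAR_AND_FAR_COMPLETE = 3            # all habit data already collected

def choose_action(c: Completeness) -> str:
    """Map the model's completeness verdict to a shooting action."""
    if c is Completeness.NEAR_INCOMPLETE:
        return "record_high_pixel_video"
    if c is Completeness.NEAR_COMPLETE_FAR_INCOMPLETE:
        return "record_low_pixel_video"  # frame resolution < 1/2 of high-pixel
    return "do_not_record"               # reserve battery for other objects
```

The camera would call `choose_action` once per infrared trigger, after the intelligent model returns its verdict on the freshly captured high-pixel image.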
In the above embodiment, when the shooting mode is the ecological diversity mode, the intelligent hunting camera 100 does not directly start shooting high-pixel video after sensing a new living organism in the infrared sensing area. Instead, it first shoots a high-pixel image, uses the habit-sampling-completeness analysis intelligent model to judge how completely the habits of the object in that image are already recorded in the captured media, and then proceeds accordingly. For example, when the near habit of the object is determined to be complete but its distant habit is not, only low-pixel video is shot, satisfying the data-collection needs of biodiversity habit research at lower power consumption. When both the near and distant habits are determined to be complete, video recording may simply not be started at all, reserving more battery for objects whose habit data is still incomplete and greatly extending the battery life of the intelligent hunting camera 100 in ecological diversity habit research.
With reference to some embodiments of the first aspect, in some embodiments, the step in which the intelligent hunting camera inputs the high-pixel image and all media already shot in its gallery into the habit-sampling-completeness analysis intelligent model and obtains the completeness of the collected habits of the first object specifically includes: the intelligent hunting camera identifies the first object in the high-pixel image through the model; the model analyzes whether the captured media include N preset near-distance situations and M preset far-distance situations of the first object, where N and M are positive integers greater than or equal to 2; when the N preset near-distance situations of the first object are not all included in the captured media, the camera receives a "near habit incomplete" result output by the model; when the N near-distance situations are all included but the M far-distance situations are not, the camera receives a "near habit complete but distant habit incomplete" result; and when both the N near-distance situations and the M far-distance situations are all included, the camera receives a "near habit complete and distant habit complete" result.
In the above embodiment, the habit-sampling-completeness analysis intelligent model determines whether the near or distant habit of the target object is complete according to whether the captured media cover the N preset near-distance situations and M preset far-distance situations of that object, improving the accuracy of the completeness judgment.
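Treating the preset situations as label sets, the coverage judgment above reduces to subset checks (a sketch under that assumption; in the patent, the intelligent model itself recognizes the situations from images):

```python
def habit_completeness(captured, near_situations, far_situations):
    """Judge completeness from whether the media library covers all preset
    near-distance (N >= 2) and far-distance (M >= 2) situations.

    `captured` holds the situation labels already present in the gallery;
    the label names used here are purely illustrative.
    """
    near_ok = set(near_situations) <= set(captured)  # all N near covered?
    far_ok = set(far_situations) <= set(captured)    # all M far covered?
    if not near_ok:
        return "near_incomplete"
    if not far_ok:
        return "near_complete_far_incomplete"
    return "near_and_far_complete"
```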
With reference to some embodiments of the first aspect, in some embodiments, before the step of the intelligent hunting camera shooting a high-pixel image, the method further includes: the intelligent hunting camera determines whether the currently set shooting mode is the normal mode or the ecological diversity mode; when the shooting mode is determined to be the normal mode, the intelligent hunting camera starts shooting high-pixel video; and when it is determined that no living organisms remain in the infrared sensing area, the intelligent hunting camera stops shooting.
In the above embodiment, the intelligent hunting camera can also shoot other required scenes in the normal mode, improving its applicability to different scenarios.
In some embodiments, when shooting in the ecological diversity mode meets a preset condition, the camera can automatically switch to the normal mode, further improving its ability to meet different usage requirements.
With reference to some embodiments of the first aspect, in some embodiments, when the shooting mode is determined to be the ecological diversity mode, before the step of shooting a high-pixel image the method further includes: the intelligent hunting camera caches the current infrared sensing feature data, which include the shape and size of the triggered infrared-radiation change region and the radiation values of feature points within that region; the camera then determines whether the diversity complete result library contains infrared sensing feature data whose similarity to the current data exceeds a preset approximation threshold. The diversity complete result library records a number of infrared-completeness correspondences, each being a correspondence between the feature identifier of an object whose collected habits were determined to be complete for both the near habit and the distant habit, and the infrared sensing feature data recorded when that object entered the infrared sensing area. When such sufficiently similar data exist in the library, the intelligent hunting camera does not start video shooting.
In the above embodiment, the intelligent hunting camera maintains a diversity complete result library of correspondences between the feature identifiers of objects with complete habit data and their infrared sensing feature data. When the infrared sensing feature data show that the moving organism entering the sensing area is an object whose habits are already complete, neither the high-pixel image capture nor the habit-sampling-completeness analysis is triggered, further saving the camera's electric energy while still meeting the needs of biodiversity habit research.
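The library lookup can be sketched as follows (a toy example; the patent does not specify how the approximation degree is computed, so a simple normalized-difference measure over a feature vector is assumed, and the record layout is hypothetical):

```python
def similarity(a, b):
    """Assumed metric: 1 minus the mean normalized absolute difference."""
    diffs = [abs(x - y) / max(abs(x), abs(y), 1e-9) for x, y in zip(a, b)]
    return 1.0 - sum(diffs) / len(diffs)

def should_skip_shooting(current, library, threshold=0.9):
    """True when some recorded object's infrared signature is similar enough
    to the current one, i.e. its habit data is already complete."""
    return any(similarity(current, rec["ir_features"]) > threshold
               for rec in library)
```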
With reference to some embodiments of the first aspect, in some embodiments, the method further includes: when the diversity complete result library contains infrared sensing feature data whose similarity to the current data exceeds the preset approximation threshold, the intelligent hunting camera determines whether the current data are identical to those matched data; if identical, the camera clears the cached current infrared sensing feature data; if different, the camera establishes a correspondence between the feature identifier of the object associated with the matched data and the current infrared sensing feature data, as a new infrared-completeness correspondence, and records it into the diversity complete result library.
In the above embodiment, when the current infrared sensing feature data are found to exceed the preset approximation threshold against a record in the diversity complete result library, the camera further checks whether the two are identical, to decide whether to add the current data to the library. When they are identical, discarding them saves the storage occupied by the library; when they differ, adding them further improves the accuracy of subsequent infrared-feature comparisons.
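The identical-versus-merely-similar handling might look like this (hypothetical record layout, and an assumed similarity metric since the patent specifies none):

```python
def similarity(a, b):
    """Assumed metric: 1 minus the mean normalized absolute difference."""
    diffs = [abs(x - y) / max(abs(x), abs(y), 1e-9) for x, y in zip(a, b)]
    return 1.0 - sum(diffs) / len(diffs)

def record_if_new(current, library, threshold=0.9):
    """`current` has already matched some library record above `threshold`.
    Identical -> discard (saves storage); different -> store it under the
    matched object's identifier (sharpens later comparisons)."""
    match = next(r for r in library
                 if similarity(current, r["ir_features"]) > threshold)
    if current == match["ir_features"]:
        return False          # same data: clear the cache, record nothing
    library.append({"object_id": match["object_id"], "ir_features": current})
    return True               # new infrared-completeness correspondence added
```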
With reference to some embodiments of the first aspect, in some embodiments, the method further includes: when a preset synchronization condition is met, the intelligent hunting camera sends the data in its diversity complete result library to the other intelligent hunting cameras within communication range, and receives the diversity-complete-result-library data sent by those cameras; it then processes the received data so that its own diversity complete result library includes the infrared-completeness correspondences of all other intelligent hunting cameras within communication range.
In the above embodiment, by synchronizing the diversity complete result library among multiple intelligent hunting cameras, each camera can avoid shooting objects whose complete habit data has already been captured by any camera in the communicating group, greatly reducing the shooting of duplicate habit data, extending battery life, and preventing repetitive low-value data from occupying storage space.
With reference to the embodiments of the first aspect, in some embodiments, the preset synchronization condition is that the number of newly added infrared-completeness correspondences in the diversity complete result library exceeds a preset synchronization value.
In the above embodiment, synchronization of the diversity complete result library is triggered only when the number of newly added infrared-completeness correspondences exceeds the preset synchronization value, reducing the power consumed by synchronization, reducing the amount of data to be transmitted, and improving synchronization efficiency.
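The threshold-gated synchronization could be sketched as follows (the record structure, threshold value, and deduplication key are assumptions for illustration):

```python
def maybe_synchronize(library, new_count, peer_libraries, sync_threshold=10):
    """Merge peer libraries into ours only once enough new infrared-completeness
    correspondences have accumulated; returns (library, whether_synced)."""
    if new_count < sync_threshold:
        return library, False
    seen = {(r["object_id"], tuple(r["ir_features"])) for r in library}
    for peer in peer_libraries:
        for rec in peer:
            key = (rec["object_id"], tuple(rec["ir_features"]))
            if key not in seen:          # skip correspondences we already hold
                seen.add(key)
                library.append(rec)
    return library, True
```

In a full system each camera would both broadcast its own library and merge the ones it receives; only the merge side is shown here.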
In a second aspect, an embodiment of the present application provides an intelligent hunting camera comprising: a processor, a memory, an infrared monitor, and a camera; the infrared monitor is configured to receive infrared radiation data from the infrared sensing area and transmit them to the processor; the camera is configured to start or stop shooting on instruction from the processor and to transmit captured images to the processor; the memory is coupled to the processor and stores computer program code comprising computer instructions, which the processor invokes to cause the intelligent hunting camera to perform the method described in the first aspect and any possible implementation thereof.
In a third aspect, embodiments of the present application provide a computer program product comprising instructions that, when run on an intelligent hunting camera, cause it to perform the method described in the first aspect and any possible implementation thereof.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium comprising instructions that, when executed on an intelligent hunting camera, cause it to perform the method described in the first aspect and any possible implementation thereof.
It will be appreciated that the electronic device provided in the second aspect, the computer program product provided in the third aspect, and the computer storage medium provided in the fourth aspect are all configured to perform the methods provided by the embodiments of the present application. For their advantages, reference may therefore be made to the advantages of the corresponding methods, which are not repeated here.
One or more technical solutions provided in the embodiments of the present application at least have the following technical effects or advantages:
1. When the shooting mode is the ecological diversity mode, the intelligent hunting camera 100 does not directly start shooting high-pixel video after sensing a new living organism in the infrared sensing area; it first shoots a high-pixel image, judges through the habit-sampling-completeness analysis intelligent model how completely the habits of the object in that image are recorded in the captured media, and proceeds accordingly, possibly shooting at low resolution or not shooting at all. This effectively solves the problem in the related art that continuous high-pixel shooting consumes so much power that long-duration biodiversity research shooting tasks cannot be completed, greatly extending the camera's battery life in ecological diversity habit research.
2. Because the intelligent hunting camera records a diversity complete result library of correspondences between the feature identifiers of objects with complete habit data and their infrared sensing feature data, when the infrared sensing feature data show that the moving organism entering the sensing area is such an object, neither the high-pixel image capture nor the habit-sampling-completeness analysis is triggered, further saving electric energy while still meeting the needs of biodiversity habit research.
3. By synchronizing the diversity complete result library among multiple intelligent hunting cameras, each camera can avoid shooting objects whose complete habit data has already been captured by any communicable camera, greatly reducing duplicate habit-data shooting, extending these cameras' battery life, and preventing repetitive low-value data from occupying their storage space.
Drawings
FIG. 1 is a schematic view of a scene in which a hunting camera is used in the related art;
FIG. 2 is a schematic view of a case of shooting in field study using a hunting camera of the related art;
FIG. 3 is a schematic diagram of a smart hunting camera according to an embodiment of the present application in field study shots;
Fig. 4 is a schematic structural diagram of an intelligent hunting camera 100 according to an embodiment of the present application;
FIG. 5 is an exemplary diagram of the functionality and training data of the habit sample completeness analysis intelligent model in an embodiment of the application;
FIG. 6 is a schematic flow chart of an artificial intelligence based live biological shooting method in an embodiment of the present application;
FIG. 7 is an exemplary schematic diagram of an intelligent hunting camera 100 analyzing an intelligent model using the habit sampling completeness in accordance with an embodiment of the present application;
FIG. 8 is another exemplary flow diagram of an artificial intelligence based live biological imaging method in an embodiment of the application;
FIG. 9 is an exemplary diagram of the storage of content in a diverse complete results library in accordance with embodiments of the present application;
FIG. 10 is another exemplary flow diagram of an artificial intelligence based live biological imaging method in an embodiment of the application;
FIG. 11 is a schematic view of an exemplary scenario in which multiple intelligent hunting cameras interact in an embodiment of the present application;
fig. 12 is a schematic block diagram of the intelligent hunting camera 100 according to the embodiment of the present application.
Detailed Description
The terminology used in the following embodiments of the application is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used in this disclosure refers to and encompasses any and all possible combinations of one or more of the listed items.
The terms "first," "second," and the like are used below for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features referred to. Thus, a feature qualified by "first" or "second" may explicitly or implicitly include one or more such features. In the description of the embodiments of the application, unless otherwise indicated, "a plurality" means two or more.
Because the embodiments of the application involve artificial intelligence technology, the concept of an artificial intelligence model is briefly described below for ease of understanding:
An artificial intelligence model is a deep learning model built on an artificial neural network. A general-purpose or customized neural network is trained with a large amount of labeled training data, so that the model learns by itself and acquires the corresponding functions.
For example, by training an artificial intelligence model on a large number of pictures labeled with person positions, the model can acquire the ability to mark person positions in input pictures.
For another example, by training on pictures or videos labeled with different human actions or expressions, the model can acquire the ability to recognize the actions or expressions of people in input pictures.
The hunting camera according to the present application is generally used in a field environment; fig. 1 is a schematic view of such a usage scene.
A hunting camera generally comprises a lens, an infrared monitor, a light-supplementing (fill) lamp, and a fixing buckle. The lens is used for photographing or recording video; the infrared monitor senses infrared heat sources in the surrounding environment and triggers the photographing or recording function to start or stop; the light-supplementing lamp adds illumination when the environment is too dark; and the fixing buckle fixes the hunting camera firmly at a chosen position in the environment.
As shown in fig. 1 (a), after the hunting camera is fixed to a tree with the fixing buckle and activated, the infrared monitor monitors the infrared sensing area in front of the camera. Although a tiger is not far from the hunting camera, the camera does not trigger shooting because the tiger has not entered its infrared sensing area.
As shown in fig. 1 (b), after the tiger moves into the infrared sensing area, the infrared monitor senses the change in ambient infrared energy caused by the tiger's own infrared radiation and determines that a living organism has entered the area, triggering the hunting camera to start shooting and record the tiger's movements, accumulating material for subsequent research on the tiger's habits.
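The trigger condition can be sketched as a comparison of summed infrared readings against an ambient baseline (the threshold value and data layout are illustrative assumptions, not taken from the patent):

```python
def organism_entered(baseline, current, trigger_delta=5.0):
    """Trigger shooting when the summed infrared radiation over the sensing
    area's sample points rises noticeably above the ambient baseline."""
    return sum(current) - sum(baseline) > trigger_delta
```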
Fig. 2 illustrates shooting during a field study using a hunting camera of the related art.
To obtain higher-definition research material, high-pixel hunting cameras are generally used at present. A high-pixel camera not only consumes more electric energy to process the large amount of image pixel data while shooting, but the resulting high-pixel images are also very large, so storing them consumes a large amount of power as well. Field studies of wild animals generally last a relatively long time; a study period may be 4 months, for example. Because so much power is consumed, a high-pixel hunting camera often cannot keep operating until the planned field-observation duration is reached. As shown in fig. 2, after high-pixel shooting is triggered many times, the camera shuts down from insufficient power before completing the 4 months.
In addition, the storage space of a hunting camera is limited. Occupied by a large number of high-pixel images, each requiring a great deal of space, the storage often fills up before the planned field-observation duration is reached, and shooting cannot continue even when battery power remains. As shown in fig. 2, after high-pixel shooting is triggered many times, shooting fails for lack of storage space even though the 4 months have not elapsed.
With the artificial-intelligence-based active-organism shooting method and the intelligent hunting camera provided by the embodiments of the application, the camera can intelligently identify how completely the habits of the object currently in the infrared sensing area have been collected, and accordingly trigger high-pixel shooting, low-pixel shooting, or no shooting at all, greatly extending battery life while still obtaining the high-definition pictures needed for field animal observation and research.
Fig. 3 is a schematic diagram of field-study shooting with the intelligent hunting camera according to an embodiment of the application. Shortly after being started, the camera is triggered by infrared sensing and determines that the near habit of object A (a tiger) entering the infrared sensing area is incomplete, so it starts high-pixel shooting and can clearly capture close-range detailed habit actions such as the tiger hunting and fighting. The camera stops shooting after the tiger leaves.
After many such high-pixel shots, the tiger enters the infrared sensing area again one day. Having determined that the tiger's near habit is now complete, the intelligent hunting camera no longer starts high-pixel shooting but starts low-pixel shooting instead, capturing long-range overall habit behavior such as the tiger's activity routes and resting places. The camera stops shooting after the tiger leaves.
Because the camera intelligently switches to low-pixel shooting once the condition is satisfied, power consumption is greatly reduced, and the camera easily lasts through a typical field-observation study duration (for example, 4 months).
After the tiger has been shot at low resolution several times, it enters the infrared sensing area again one day. Having determined that all of the tiger's habits are complete, the intelligent hunting camera may not trigger shooting at all, further saving electric energy and extending battery life.
On a later day, the infrared sensor is triggered and the camera determines that the near habit of object B (a snake) entering the sensing area is incomplete, so it starts high-pixel shooting and clearly captures the snake's detailed habit actions.
Thus, with the artificial-intelligence-based active-organism shooting method and the intelligent hunting camera, a large number of high-definition pictures can be obtained to support field animal observation and research, while the camera's battery life is greatly extended, yielding more valuable research footage.
An exemplary smart hunting camera 100 provided by an embodiment of the present application is first described below.
Fig. 4 is a schematic structural diagram of an intelligent hunting camera 100 according to an embodiment of the present application.
The following describes embodiments by taking the intelligent hunting camera 100 as an example. It should be understood that the intelligent hunting camera 100 may have more or fewer components than shown in the figures, may combine two or more components, or may have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The intelligent hunting camera 100 may include: a processor 101, a camera 102, a memory 103, keys 104, an LED 105, a battery 106, an infrared monitor 107, a display 108, and the like.
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the intelligent hunting camera 100. In other embodiments of the present application, smart hunting camera 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 101 may include one or more processing units, such as: the processor 101 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
Wherein the controller may be a neural hub and command center of the intelligent hunting camera 100. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 101 for storing instructions and data. In some embodiments, the memory in the processor 101 is a cache memory. The memory may hold instructions or data that the processor 101 has just used or will use cyclically. If the processor 101 needs to reuse the instruction or data, it may be called directly from the memory, thereby avoiding repeated accesses and reducing the latency of the processor 101, which improves the efficiency of the system.
In some embodiments, processor 101 may include one or more interfaces to communicate information with other modules.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and is not meant to limit the structure of the intelligent hunting camera 100.
The battery 106 may power the processor 101, camera 102, memory 103, keys 104, LED105, infrared monitor 107, display 108, etc.
The intelligent hunting camera 100 may implement display functions through a GPU, a display screen 108, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 108 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 101 may include one or more GPUs that execute program instructions to generate or change display information. In some embodiments, the smart hunting camera 100 may also be devoid of the display screen 108, not limited herein.
The intelligent hunting camera 100 may implement photographing functions through an ISP, a video camera 102, a video codec, a GPU, an application processor, and the like.
The ISP is used to process the data fed back by the camera 102. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, an ISP may be provided in the camera 102.
The camera 102 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, the smart hunting camera 100 may include 1 or N cameras 102, N being a positive integer greater than 1.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent cognition of the intelligent hunting camera 100 may be implemented by the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
Memory 103 may include one or more random access memories (random access memory, RAM) and one or more non-volatile memories (NVM).
The random access memory may be read directly from and written to by the processor 101, may be used to store executable programs (e.g., machine instructions) for an operating system or other on-the-fly programs, may also be used to store data for users and applications, and the like.
The nonvolatile memory may store executable programs, store data of users and applications, and the like, and may be loaded into the random access memory in advance for the processor 101 to directly read and write.
Memory 103 may include a memory card for storing photographs or video taken by smart hunting camera 100.
The keys 104 include a power on key, a volume key, etc. The keys 104 may be mechanical keys or touch keys. The intelligent hunting camera 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the intelligent hunting camera 100.
The LED 105 may serve as a light supplement lamp for supplementing illumination during shooting when the ambient light is too dark.
The infrared monitor 107 may sense an infrared heat source of the surrounding environment, thereby triggering the photographing or video recording function of the intelligent hunting camera 100 to be turned on or off.
In order to enable the intelligent hunting camera 100 to have the capability of intelligently judging the degree of completeness of the habit recorded by the photographed object, a pre-trained habit sampling completeness analysis intelligent model is stored in the memory 103 of the intelligent hunting camera 100.
FIG. 5 is a schematic diagram showing the functions of the habit sample completeness analysis intelligent model and training data according to an embodiment of the application.
And taking a picture of a certain object and a picture and/or video collection of the object as input data, and inputting the picture and/or video collection into the habit sampling completeness analysis intelligent model to obtain a completeness result of the habit of the object recorded in the picture and/or video collection as output data.
The habit sampling completeness analysis intelligent model outputs a target completeness result with three possibilities: "object near habit is incomplete", "object near habit is complete, distance habit is incomplete" and "object near habit is complete, and distance habit is complete".
It will be appreciated that the near habit of an object represents the habits of the object that need to be clearly observed at a relatively short distance. The distance habit of an object represents the habits of the object that can be observed and studied at a longer distance. For the near habit and the distance habit of an object, some preset situations may be configured in advance, which may be called near-distance situations and distance situations respectively.
If all preset near-distance situations are included in the picture or video obtained by taking a subject, the near-distance habit of the subject can be considered to be complete, otherwise, the near-distance habit of the subject can be determined to be incomplete.
Similarly, if all preset distance situations are included in a photograph or video obtained by taking an object, the distance habit of the object can be considered to be complete, otherwise, the distance habit of the object can be determined to be incomplete.
It will be appreciated that in some cases, if a photograph or video of an object is taken, both the near and far habits of the object are incomplete, in which case the habit sample completeness analysis intelligent model outputs a result of "incomplete near habit of the object". After the picture or video of the object is continuously shot, the intelligent model for analyzing the habit sampling completeness can output the result of 'the object near habit is complete and the distant habit is incomplete' under the condition that the near habit of the object is complete and the distant habit is not complete. After the picture or video of the object is continuously shot, the habit sampling completeness analysis intelligent model can output the result of 'the object near habit is complete and the distance habit is complete' under the condition that the distant habit of the object is also complete.
It should be noted that in some cases, there may be a case where a picture or video obtained by taking an object has a complete distant habit, and the near habit of the object is not complete. In this case, the habit sampling completeness analysis intelligent model will only output the result of "object near habit incomplete". After the shooting is continued to complete the near habit of the object, the habit sampling completeness analysis intelligent model directly outputs the result of 'the near habit of the object is complete and the far habit is complete'.
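The three-way completeness result described above (including the rule that an incomplete near habit always yields "near habit incomplete", even if the distance habit happens to be complete) can be sketched as a simple set-coverage check. This is a minimal illustrative sketch, not the patent's model; the function name, situation names, and string labels are assumptions.

```python
def habit_completeness(observed, near_presets, far_presets):
    """Classify the completeness of an object's recorded habits from the
    set of situations already captured (a sketch of the three possible
    outputs described above)."""
    near_complete = near_presets <= observed   # all preset near situations covered?
    far_complete = far_presets <= observed     # all preset distance situations covered?
    if not near_complete:
        # Per the description, even a complete distance habit alone still
        # yields only "near habit incomplete".
        return "near habit incomplete"
    if not far_complete:
        return "near habit complete, distance habit incomplete"
    return "near habit complete, distance habit complete"

# Illustrative situation names taken from the tiger example later in the text:
near = {"feeding", "fighting", "mating"}
far = {"strolling", "path selection", "observation"}
```

For instance, an object whose gallery covers all three near situations but only "strolling" at distance would be classified as "near habit complete, distance habit incomplete", matching the tiger example in FIG. 7.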
To implement the functionality of the habit sampling completeness analysis intelligent model, the model may be provided with a target recognition AI model and a target habit completeness analysis AI model. The target recognition AI model is used for recognizing a target object from a picture or video. The target habit completeness analysis AI model is used for analyzing which situation an image including the target object belongs to, and whether the corresponding situations in the picture and video collection already cover all preset near-distance situations or distance situations.
Specifically, a large amount of training data can be used for training the habit sampling completeness analysis intelligent model, so that the habit sampling completeness analysis intelligent model acquires this capability. One piece of training data may include a picture of an object, a picture and/or video collection including the object (for example, pictures 1-6 of an object A, or picture 7 of an object B, etc.), and a pre-labeled completeness degree of the habit of the object recorded in the picture and/or video collection (for example, habit completeness degrees 1-3 of the object A, or habit completeness degree 1 of the object B, etc.).
The following specifically describes an artificial intelligence-based live organism shooting method in the embodiment of the present application, in combination with the above-described hardware structure of the exemplary intelligent hunting camera 100 and the habit sampling completeness analysis intelligent model stored in the intelligent hunting camera 100 in advance:
Referring to fig. 6, an exemplary flowchart of an artificial intelligence-based live biological shooting method according to an embodiment of the present application is shown.
S601, determining that a new living organism enters an infrared induction area;
the intelligent hunting camera 100 may continuously sense changes in infrared radiant energy in the infrared sensing area using the infrared monitor 107, and determine that a new living creature is likely to have entered the infrared sensing area if the amount of infrared radiant energy sensed from the infrared sensing area per unit time has increased, compared with the amount sensed in the absence of living creatures, by more than a radiation threshold.
In some embodiments, if there are already living creatures in the current infrared sensing area, it is likewise determined that a new living creature may have entered the infrared sensing area when the sensed increase in the amount of infrared radiant energy exceeds the radiation threshold.
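The trigger condition of step S601 can be sketched as a simple threshold comparison on the per-unit-time radiant energy. This is an illustrative sketch only; the function and parameter names are assumptions, not from the patent.

```python
def new_organism_detected(sensed_energy, baseline_energy, radiation_threshold):
    """Return True when the infrared radiant energy sensed per unit time
    rises above the baseline (the level with no living creature present,
    or the current level if creatures are already in the area) by more
    than the radiation threshold, as in step S601."""
    return (sensed_energy - baseline_energy) > radiation_threshold

# e.g. a reading of 10.0 units against a baseline of 3.0 with threshold 5.0
triggered = new_organism_detected(10.0, 3.0, 5.0)
```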
S602, determining whether a currently set shooting mode is a common mode or an ecological diversity mode;
after the intelligent hunting camera 100 determines that a new living creature enters the infrared sensing area, it may determine whether the currently set shooting mode is the normal mode or the ecological diversity mode; the normal mode is mainly used for ordinary hunting shooting, and the ecological diversity mode is mainly used for acquiring habit research data of the diverse organisms in a region.
After the user places the intelligent hunting camera 100 and turns it on, the shooting mode may be set to the normal mode or the ecological diversity mode through the keys 104. For example, the ecological diversity mode may be enabled by toggling down one of the keys 104 near the camera body.
After determining that a new living organism enters the infrared sensing area, if the shooting mode is a normal mode, executing step S603; if the shooting mode is the ecological diversity mode, step S605 is executed.
In some embodiments, the intelligent hunting camera 100 may also have only an ecology diversity mode and default to the ecology diversity mode, so that step S602 may also be absent, which is not limited herein.
S603, under the condition that the normal mode is determined, shooting high-pixel video is started;
in the event that the normal mode is determined, the intelligent hunting camera 100 may activate the camera 102 to take a high pixel video.
It will be appreciated that the images in a high-pixel video have very high definition, so shooting high-pixel video requires more power and more storage space than shooting low-pixel video. However, it can clearly show the details of the photographed object, and detailed features in the image remain clear even after repeated magnification.
S604, stopping shooting under the condition that the infrared induction area is determined to have no active living things;
in case that the intelligent hunting camera 100 determines through the infrared monitor 107 that there is no longer any living creature active in the infrared sensing area, the shooting may be stopped, and step S601 of determining whether a new living creature enters the infrared sensing area may be triggered.
It can be understood that the power consumption of infrared induction is far less than that of taking pictures and processing, so that the shooting is stopped under the condition that no active living things exist in the infrared induction area, the electric quantity can be greatly saved, and the endurance time is prolonged.
S605, under the condition that the biological diversity mode is determined, shooting a high-pixel image;
in the event that the biodiversity mode is determined, the intelligent hunting camera 100 may first activate the camera 102 to capture a high-pixel image, rather than directly recording video.
The purpose of capturing the high-pixel image is to acquire an image of the living organism that enters the infrared sensing region, so as to facilitate the subsequent step judgment processing.
In some embodiments, after capturing a high-pixel image, the intelligent hunting camera 100 may first determine whether there is an active living creature in the high-pixel image. If it is determined that there is no living creature active, this may indicate a false alarm of the infrared monitor 107, and step S601 may be triggered directly to perform the determination again.
In some embodiments, to avoid misjudging from a single image, the intelligent hunting camera 100 may take a preset number of high-pixel images (e.g., 3) at preset time intervals (e.g., 2 seconds) and determine whether an active creature is present in any of them. If no active living creature exists in any of the high-pixel images, it is determined that the infrared monitor 107 raised a false alarm, and step S601 may be triggered directly to continue detection.
If it is determined that there is an active living being in these images, execution of step S606 may be triggered.
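The false-alarm confirmation loop described above (several images taken at fixed intervals before deciding) can be sketched as follows. The callables standing in for the camera and the creature detector, and the function name itself, are illustrative assumptions.

```python
import time

def confirm_living_creature(capture_image, contains_creature,
                            count=3, interval_s=2.0):
    """Take `count` high-pixel images `interval_s` seconds apart and report
    whether any of them contains an active creature. True means proceed to
    step S606; False means treat the trigger as an infrared false alarm
    and return to step S601."""
    for i in range(count):
        if contains_creature(capture_image()):
            return True
        if i < count - 1:        # no need to wait after the last image
            time.sleep(interval_s)
    return False
```

In a real device `capture_image` would drive the camera 102 and `contains_creature` would invoke the target recognition AI model; here they are plain callables so the control flow can be tested in isolation.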
S606, inputting all media in the high-pixel image and the camera gallery into a habit sampling completeness analysis intelligent model, and determining the completeness of the habit of the acquired first object;
after capturing the high pixel image containing the movable living being (the first object), the intelligent hunting camera 100 may input the high pixel image into a pre-stored habit sampling completeness analysis intelligent model, and at the same time, all media (pictures and/or video collections) in the camera gallery are also used as an input of the habit sampling completeness analysis intelligent model, so as to determine the completeness of the habit of the first object contained in the collected media, the first object being the movable living being in the high pixel image.
Step S603 is triggered and executed when the output result of the habit sampling completeness analysis intelligent model is "near habit incomplete";
step S607 is triggered and executed when the output result of the habit sampling completeness analysis intelligent model is that the near habit is complete and the distant habit is not complete;
if the output result of the habit sampling completeness analysis intelligent model is "the near habit is complete and the distant habit is complete", step S608 is triggered and executed.
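The three-way branching of steps S603/S607/S608 on the model's output can be sketched as a simple dispatch. The string labels and step descriptions mirror the text above; the function name is an assumption.

```python
def dispatch_on_completeness(result):
    """Map the output of the habit sampling completeness analysis intelligent
    model to the next step of the flow in FIG. 6."""
    if result == "near habit incomplete":
        return "S603: start shooting high-pixel video"
    if result == "near habit complete, distance habit incomplete":
        return "S607: start shooting low-pixel video"
    if result == "near habit complete, distance habit complete":
        return "S608: do not start video shooting"
    raise ValueError(f"unexpected model output: {result}")
```

Keeping the dispatch separate from the model makes the power-saving policy (high-pixel, low-pixel, or no video) easy to adjust without retraining anything.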
FIG. 7 is a schematic diagram of an exemplary use of the habit sampling completeness analysis intelligent model by the intelligent hunting camera 100 according to an embodiment of the present application.
The intelligent hunting camera 100 inputs the captured high-pixel image A into the habit sampling completeness analysis intelligent model; at the same time, all multimedia data (images, videos, etc.) stored in the gallery of the hunting camera's memory card are used as a persistent input of the habit sampling completeness analysis intelligent model.
The target recognition AI model may recognize that the object in the high-pixel image A is a tiger, and the target habit completeness analysis AI model may determine that the pictures and videos in the gallery already include all preset near-distance situations of the tiger (such as feeding, fighting and mating), but among the preset distance situations only the strolling situation is included, while neither the path selection nor the observation situation has been shot yet. Therefore, the output result of the habit sampling completeness analysis intelligent model is "the near habit of the object A is complete, the distance habit is incomplete", and execution of S607 may be triggered.
S607, under the condition that the close-range habit is determined to be complete and the distant habit is determined to be incomplete, starting shooting of low-pixel video;
in the event that it is determined that the near habit is complete and the distance habit is incomplete, the intelligent hunting camera 100 may activate the camera 102 to take a low pixel video. The pixels of the low-pixel image are lower than the high-pixel image, e.g. only 1/4 or less of the pixels of the high-pixel image, but the low-pixel image has been able to meet the need to learn the distance habit of the object from the image.
Therefore, in the case that the near habit is complete and the distant habit is not complete, only the low-pixel video is shot instead of the high-pixel video, and in the case that the habit research data of the target object is acquired, the electric energy is greatly saved, the duration of the intelligent hunting camera 100 is prolonged, the storage space of the memory 103 of the intelligent hunting camera 100 is also saved, and more valuable contents can be stored.
In the process of executing step S607, the determination of whether the condition for executing step S604 is satisfied may be continued, and when the condition is satisfied, step S604 may be executed.
S608, under the condition that the near habit and the far habit are determined to be complete, video shooting is not started.
Under the condition that the near habit and the distance habit are both complete, it means that the habit research data of the object required in the ecological diversity mode has been fully collected, and more electric quantity should be reserved for shooting objects whose habit data are not yet complete. Therefore, the intelligent hunting camera 100 may not start video shooting, and triggers step S601.
In some embodiments, it may also be configured that, in the case that the preset near-distance habit and far-distance habit of the multiple target objects are complete, the intelligent hunting camera 100 may automatically switch the shooting mode from the biodiversity mode to the normal mode, which is not limited herein.
In the embodiment of the present application, when the shooting mode is the biodiversity mode, after the intelligent hunting camera 100 senses that a new living creature exists in the infrared sensing area, the shooting of the high-pixel video is not directly started, but a high-pixel image is shot first, the completeness degree of habit of the object in the high-pixel image recorded in the currently shot medium is judged by the habit sampling completeness degree analysis intelligent model, and then corresponding processing is performed according to the completeness degree. For example, in the case that the near habit of the object is determined to be complete and the far habit of the object is not complete, only the low-pixel video is shot, so as to satisfy the acquisition requirement of the biodiversity habit research data with less power consumption. Even under the condition that the near habit and the far habit of the object are determined to be complete, video shooting can be directly not started, more electric quantity is reserved for shooting the object with incomplete habit data, and the duration of the intelligent hunting camera 100 in ecological diversity habit research is greatly improved.
In the above embodiment, the intelligent hunting camera 100 may intelligently determine the completeness of the habit of an object by shooting a high-pixel image of the object entering the sensing area, and then prolong the duration of the intelligent hunting camera 100 by choosing to shoot high-pixel video, low-pixel video, or no video at all. In some embodiments, enhanced application of low-power-consumption infrared sensing technology in this scenario may be used to further reduce the power consumption of the intelligent hunting camera 100.
Referring to fig. 8, another exemplary flowchart of an artificial intelligence-based live biological shooting method according to an embodiment of the present application is shown.
In connection with the embodiment shown in fig. 6, steps S801 and S802 may be performed first, between performing steps S602 and S605:
s801, caching the infrared induction characteristic data;
when the intelligent hunting camera 100 determines that a new living creature enters the infrared sensing area and the current shooting mode is the biodiversity mode, the infrared sensing characteristic data sensed by the infrared monitor 107 can be cached. The infrared sensing characteristic data may include the shape, size, and radiation value of the characteristic point in the triggered infrared radiation change region.
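The cached infrared sensing characteristic data of step S801 can be sketched as a small record holding the fields named above (shape, size, and radiation values at the feature points). This is a minimal sketch; the class and field names are assumptions.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class InfraredFeature:
    """Infrared sensing characteristic data cached in step S801."""
    region_shape: str        # shape of the triggered infrared radiation change region
    region_size: float       # size of that region
    radiation_values: tuple  # radiation values at the feature points selected
                             # by the feature point model

# e.g. a cached reading for a creature that just triggered the sensor
reading = InfraredFeature("oval", 1.5, (0.82, 0.91, 0.77))
```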
S802, determining whether infrared sensing characteristic data with the similarity to the current infrared sensing characteristic data exceeding a preset approximate threshold exists in a diversity complete result library;
the intelligent hunting camera 100 maintains a continuously updated diversity complete result library in which the correspondence between the characteristic identification of a complete object and the infrared induction characteristic data caused by the object is recorded. The complete object refers to an object that has complete both the near habit and the distant habit determined in step S606. The correspondence is recorded and updated in step S803.
That is, the diversity complete result library records the infrared sensing characteristic data of all the objects, which are currently determined to be complete by the intelligent hunting camera 100, in both the near habit and the distance habit.
Fig. 9 is an exemplary schematic diagram of the content stored in the diversity complete result library according to the embodiment of the present application. The diversity complete result library records the identifiers of tigers, lions, rabbits, other objects (such as the feature identifier of the object B), and the like, all being objects whose near habit and distance habit are both complete. Each object identifier corresponds to the infrared sensing characteristic data caused by that object entering the infrared sensing area. For example, when a tiger enters the infrared sensing area, the shape and size of the induced infrared radiation change region are S1, and the set of infrared radiation values at the feature points selected by the feature point model is F1. These differ from the shape and size S2 of the infrared radiation change region and the infrared radiation value set F2 at the selected feature points caused by a lion entering the infrared sensing area. Similarly, the shape and size S3 of the infrared radiation change region caused by a rabbit entering the infrared sensing area and the corresponding infrared radiation value set F3 are also recorded in the diversity complete result library. Infrared sensing characteristic data of many other objects (e.g., the object B) may also be recorded, which is not limited herein.
After caching the current infrared induction characteristic data, the intelligent hunting camera 100 can compare the current infrared induction characteristic data with the infrared induction characteristic data of the completed objects stored in the diversity complete result library, and determine whether the infrared induction characteristic data with the similarity exceeding a preset approximate threshold value exists in the current infrared induction characteristic data;
if so, the moving organism entering the infrared sensing area is an object with complete habit, and the step S608 can be directly triggered and executed without shooting a high-pixel image to judge the completeness of habit;
if not, the step S605 is triggered to be executed to capture the high-pixel image and use the habit sampling completeness analysis intelligent model to perform corresponding judgment and processing.
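The lookup of steps S802-S803 can be sketched as a similarity scan over the library, modeled here as a mapping from feature identifiers to the list of infrared readings recorded for each habit-complete object. The function names and the scoring callable are illustrative assumptions.

```python
def find_complete_object(library, current, similarity, threshold):
    """Return the feature identifier of a habit-complete object whose cached
    infrared sensing characteristic data is similar enough to the current
    reading, or None if no match exceeds the preset approximate threshold.
    A match means step S608 can be triggered directly without shooting a
    high-pixel image; None means proceed to step S605."""
    for object_id, recorded_features in library.items():
        for feature in recorded_features:
            if similarity(feature, current) > threshold:
                return object_id
    return None
```

In a real device `similarity` would compare region shape, size, and the radiation values at the feature points; here any scoring callable returning a number works.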
In connection with the embodiment shown in fig. 6, after step S608 is performed, step S803 may also be performed:
s803, recording the current infrared complete correspondence in the diversity complete result library;
the intelligent hunting camera 100 can record the corresponding relationship between the feature identifier of the object and the current infrared induction feature as the current infrared complete corresponding relationship in the diversity complete result library under the condition that the near-distance habit and the far-distance habit of the active organism triggering the infrared induction are determined to be complete and video shooting is not started. The feature identifier of the object is used to uniquely identify the object in the diverse complete result library, and may be, for example, a name of the object, a label of the object, a portion of a certain feature value of the object, etc., which is not limited herein. For example, if the object is a first object, the present-time ir complete correspondence is a correspondence between the present-time ir sensing feature and a feature identifier of the first object.
It should be noted that, if a correspondence between the feature identifier of the object and previously acquired infrared sensing characteristic data already exists in the diversity complete result library, the current infrared complete correspondence may also be added to the diversity complete result library, so as to enhance the comprehensiveness of the diversity complete result library. For example, suppose the diversity complete result library already holds a record of a first infrared complete correspondence: the correspondence between the feature identifier of the first object and the infrared sensing characteristic data acquired when the first object entered the infrared sensing area for the nth time. Then, when the first object enters the infrared sensing area again, a record of a second infrared complete correspondence may also be added to the diversity complete result library: the correspondence between the feature identifier of the first object and the infrared sensing characteristic data acquired when the first object enters the infrared sensing area for the (n+1)th time. In some embodiments, only one correspondence between the feature identifier of the first object and its infrared sensing characteristic data may be retained in the diversity complete result library, which is not limited herein.
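The record-keeping of step S803, in the multi-entry variant where every correspondence for an object is kept, can be sketched in one line over the same library shape (feature identifier mapped to a list of readings). The function name is an assumption.

```python
def record_complete_correspondence(library, object_id, infrared_feature):
    """Append the current infrared complete correspondence (step S803) for a
    habit-complete object to the diversity complete result library, creating
    the object's entry on first sight and accumulating repeat entries."""
    library.setdefault(object_id, []).append(infrared_feature)

# Two visits by the same tiger each add a record under the same identifier:
lib = {}
record_complete_correspondence(lib, "tiger", ("S1", "F1"))
record_complete_correspondence(lib, "tiger", ("S1b", "F1b"))
```

The single-entry variant mentioned in the text would instead overwrite the list, trading comprehensiveness of the library for storage space.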
In the embodiment of the present application, by continuously updating the diversity complete result library recorded with the correspondence between the characteristic identifier of the object with complete habit and the infrared sensing characteristic data in the intelligent hunting camera 100, when the moving creature entering the infrared sensing area is determined to be the object with complete habit through the infrared sensing characteristic data, the shooting of the high-pixel image and the judgment of the habit sampling completion analysis intelligent model are not triggered any more, and under the condition of meeting the requirement of the study of the biodiversity habit, the electric energy of the intelligent hunting camera 100 is further saved.
In the above embodiment, by comparing and judging the infrared sensing characteristic data against the diversity complete result library, the intelligent hunting camera 100 can reduce repeated shooting of creatures whose habit data are already complete. In some embodiments, in order to obtain biodiversity data over a wider range, a plurality of intelligent hunting cameras 100 are typically placed across a large area for data acquisition. These intelligent hunting cameras 100 often shoot duplicated, low-value material; this problem can be solved by sharing the diversity complete result library, which greatly prolongs the duration of the intelligent hunting cameras 100.
For convenience of description and understanding, only two intelligent hunting cameras 100 (intelligent hunting camera a and intelligent hunting camera B) are taken as examples in the embodiment of the present application, and in practical application, more intelligent hunting cameras 100 may be matched with each other, which is not limited herein.
In some embodiments, smart hunting camera a may also be referred to as a first smart hunting camera, and smart hunting camera B may also be referred to as a second smart hunting camera; in other embodiments, smart hunting camera a may be referred to as a second smart hunting camera, smart hunting camera B may be referred to as a first smart hunting camera, and smart hunting camera a or smart hunting camera B may be referred to as other smart hunting cameras, without limitation.
The following describes embodiments of the present application in three stages in conjunction with the specific application scenario exemplary diagram shown in fig. 11. Referring to fig. 10, another exemplary flowchart of an artificial intelligence-based live biological shooting method according to an embodiment of the present application is shown.
Stage one: the content of object A captured by intelligent hunting camera B is incomplete;
S1001, intelligent hunting camera B determines that a living organism has entered the infrared sensing area;
When object A enters the infrared sensing area of intelligent hunting camera B, intelligent hunting camera B determines that a living creature has entered the infrared sensing area.
S1002, determining a shooting mode as a biodiversity mode;
S1003, determining that no infrared sensing characteristic data whose similarity to the current infrared sensing characteristic data exceeds the preset approximation threshold exists in the diversity complete result library;
S1004, shooting a high-pixel image;
S1005, analyzing the high-pixel image with the habit sampling completeness analysis intelligent model and determining that the near-distance habit is incomplete;
S1006, starting to shoot high-pixel video;
S1007, stopping shooting;
After a very short period of high-pixel video recording, the living creature leaves the infrared sensing area of intelligent hunting camera B, and intelligent hunting camera B stops shooting.
The specific execution process of steps S1001 to S1007 is similar to the steps in the embodiment shown in fig. 6 and the embodiment shown in fig. 8, and will not be repeated here.
In stage one, the content previously captured by intelligent hunting camera B does not cover the near-distance habit of object A. Therefore, after object A enters the infrared sensing range of intelligent hunting camera B, the camera analyzes the captured high-pixel image with the habit sampling completeness analysis intelligent model and, upon determining that the near-distance habit is incomplete, directly starts high-pixel video recording.
Stage two: the content of object A captured by intelligent hunting camera A is complete;
S1008, determining that a living organism has entered the infrared sensing area;
After a period of time, object A becomes active in the infrared sensing area of intelligent hunting camera A, and intelligent hunting camera A determines that a living organism has entered its infrared sensing area.
S1009, determining that infrared sensing characteristic data whose similarity to the current infrared sensing characteristic data exceeds the preset approximation threshold exists in the diversity complete result library;
S1010, shooting is not started;
the specific execution process of steps S1008 to S1010 is similar to that of the embodiment shown in fig. 8, and will not be repeated here.
In stage two, as shown in fig. 11 (a), intelligent hunting camera A has previously captured sufficient data of object A (a tiger): both the near-distance habit and the distant habit of object A are complete, and the infrared sensing characteristic data of object A has been recorded in the diversity complete result library. When object A enters the infrared sensing area of intelligent hunting camera A, the camera compares the current infrared sensing characteristic data with the data in the diversity complete result library, directly determines that object A is a habit-complete object, and therefore does not start shooting.
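The comparison performed in stage two can be sketched as follows. This is an illustrative sketch only, not the patented implementation: the cosine-similarity measure, the flat feature-vector representation, and the 0.9 threshold are all assumptions introduced here for illustration.

```python
import math

def similarity(a, b):
    """Cosine similarity between two infrared sensing feature vectors
    (e.g. encoding the shape and size of the radiation change area and
    the radiation values of feature points)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)

def is_habit_complete_object(current, complete_library, threshold=0.9):
    """Return True if the diversity complete result library holds a record
    whose similarity to the current infrared sensing feature data exceeds
    the approximation threshold; in that case shooting is not started."""
    return any(similarity(current, record) > threshold
               for record in complete_library)
```

With this sketch, a camera would skip shooting exactly when `is_habit_complete_object(...)` returns True for the freshly sensed feature data.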
Stage three: the content of object A captured by intelligent hunting camera B is still incomplete, but because the cameras synchronize the diversity complete result library, intelligent hunting camera B does not shoot object A, which has already been fully captured by intelligent hunting camera A.
S1011, synchronizing the diversity complete result library among the intelligent hunting cameras;
The intelligent hunting cameras may synchronize the diversity complete result library according to certain rules or conditions. For example, the library may be synchronized once every preset period of time (e.g., once per day), or synchronization may be triggered each time the diversity complete result library changes; the present application is not limited in this respect.
Each intelligent hunting camera may integrate a communication module, through which multiple intelligent hunting cameras can communicate. To save power, the communication modules of the cameras may be activated simultaneously at fixed time intervals to enable communication among them, which is not limited herein.
When synchronizing the diversity complete result library, an intelligent hunting camera may transmit its entire library to all other reachable intelligent hunting cameras; after receiving one or more such libraries, a camera may merge all of the data into its own library. Alternatively, to reduce the cost of data transmission, only the data changed since the last synchronization may be transmitted to the other reachable cameras; this is not limited herein.
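A delta-style synchronization of the kind described above might look like the following sketch. The dictionary representation (object characteristic identifier mapped to an infrared sensing feature record) and the function names are assumptions made here for illustration, not the patent's implementation.

```python
def delta_since_last_sync(library, synced_ids):
    """Collect only the correspondences added since the last synchronization,
    so that a camera transmits a small delta instead of the whole library."""
    return {obj_id: rec for obj_id, rec in library.items()
            if obj_id not in synced_ids}

def merge_received(local_library, received):
    """Merge correspondences received from another camera into the local
    diversity complete result library; existing local entries are kept."""
    for obj_id, record in received.items():
        local_library.setdefault(obj_id, record)
    return local_library
```

For example, if camera A has already synchronized the record for one object, only records for newly completed objects are sent to camera B, which merges them without overwriting its own entries.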
As shown in fig. 11 (a), intelligent hunting camera A may synchronize the diversity complete result library with intelligent hunting camera B; the synchronized data includes the infrared sensing characteristic data of object A, which is not yet present in the diversity complete result library of intelligent hunting camera B.
S1012, determining that a living organism has entered the infrared sensing area;
When object A enters the infrared sensing area of intelligent hunting camera B again, intelligent hunting camera B determines that a living creature has entered the infrared sensing area.
S1013, determining that infrared sensing characteristic data whose similarity to the current infrared sensing characteristic data exceeds the preset approximation threshold exists in the diversity complete result library;
Because the data of the diversity complete result library of intelligent hunting camera A has been synchronized, the diversity complete result library of intelligent hunting camera B now also includes the infrared sensing characteristic data of object A. By comparing the current infrared sensing characteristic data with the data in the library, intelligent hunting camera B can determine that the library contains infrared sensing characteristic data whose similarity to the current data exceeds the preset approximation threshold, i.e., that object A is an object whose habit record is complete.
S1014, not starting shooting.
As shown in fig. 11 (B), although the media content stored by intelligent hunting camera B does not cover the near-distance and distant habits of object A, the camera does not start shooting, because its diversity complete result library contains infrared sensing characteristic data whose similarity to the data sensed when object A first entered the infrared sensing area exceeds the preset approximation threshold.
According to the embodiment of the application, by synchronizing the diversity complete result libraries among a plurality of intelligent hunting cameras, each camera avoids shooting objects whose habit information has already been fully captured by any reachable camera. This greatly reduces repeated shooting of habit information, extends the battery endurance of the intelligent hunting cameras, and prevents their storage space from being occupied by repeated low-value material.
The functional modular structure of the intelligent hunting camera 100 according to the embodiment of the present application is described below in conjunction with the above-described artificial intelligence-based active organism shooting method. Referring to fig. 12, a schematic block diagram of a smart hunting camera 100 according to the present application is shown.
The intelligent hunting camera 100 includes:
an image capturing module 1201, configured to capture a high-pixel image when it is determined that a new living organism enters the infrared sensing area and the current capturing mode is an ecological diversity mode;
a completeness intelligent analysis module 1202, configured to obtain the completeness of the habit of the collected first object after inputting the high-pixel image and all media captured in the gallery of the intelligent hunting camera into a habit sampling completeness analysis intelligent model; wherein the first object is a living creature identified from the high-pixel image; the completeness of the habit is any one of: the near-distance habit is incomplete; the near-distance habit is complete but the distant habit is incomplete; or both the near-distance habit and the distant habit are complete;
The high-pixel video recording module 1203 is configured to start shooting a high-pixel video recording if it is determined that the degree of completeness of the habit of the acquired first object is incomplete in the near habit;
a low-pixel video recording module 1204, configured to start shooting a low-pixel video recording if it is determined that the degree of completeness of the habit of the acquired first object is complete in the near habit but incomplete in the distance habit; the resolution of the low-pixel image obtained by the low-pixel video is less than 1/2 of the resolution of the high-pixel image;
the shooting blocking module 1205 is configured to not start video shooting when it is determined that the degree of completeness of the habit of the acquired first object is complete in the near habit and complete in the distance habit.
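The three-way decision implemented by modules 1203-1205 can be summarized in a small dispatch sketch. The enum names and the function are illustrative assumptions introduced here, not the patent's code.

```python
from enum import Enum

class HabitCompleteness(Enum):
    NEAR_INCOMPLETE = "near-distance habit incomplete"
    FAR_INCOMPLETE = "near-distance habit complete, distant habit incomplete"
    COMPLETE = "near-distance and distant habits complete"

def select_recording_mode(completeness):
    """Map the model's completeness verdict to the recording action:
    high-pixel video, low-pixel video (resolution below 1/2 of the
    high-pixel image), or no shooting at all."""
    if completeness is HabitCompleteness.NEAR_INCOMPLETE:
        return "high_pixel_video"
    if completeness is HabitCompleteness.FAR_INCOMPLETE:
        return "low_pixel_video"
    return None  # habit complete: do not start video shooting
```

Returning `None` for the complete case mirrors the shooting blocking module: the power-hungry recording path is simply never entered.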
Optionally, in some embodiments, the completeness intelligent analysis module 1202 may specifically include:
an object recognition unit 12021 for recognizing the first object in the high-pixel image by the habit sampling completeness analysis intelligent model;
a situation coverage analysis unit 12022, configured to analyze, through the habit sampling completeness analysis intelligent model, whether all captured media include the N preset near-distance situations and the M preset distant situations of the first object; N and M are positive integers greater than or equal to 2;
a near-distance result unit 12023, configured to receive the near-distance-habit-incomplete result output by the habit sampling completeness analysis intelligent model in the case where it is determined that the N preset near-distance situations of the first object are not all included in the captured media;
a distant result unit 12024, configured to receive the result, output by the habit sampling completeness analysis intelligent model, that the near-distance habit is complete but the distant habit is incomplete, in the case where it is determined that the N preset near-distance situations of the first object are all included in the captured media but the M preset distant situations are not;
a complete result unit 12025, configured to receive the result, output by the habit sampling completeness analysis intelligent model, that both the near-distance habit and the distant habit are complete, in the case where it is determined that the N preset near-distance situations and the M preset distant situations of the first object are all included in the captured media.
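The situation-coverage analysis performed by units 12022-12025 amounts to a set-coverage check. The sketch below assumes situations are represented as string labels; the label names and the function are invented here for illustration, not taken from the patent.

```python
def analyse_coverage(media_situations, near_required, far_required):
    """media_situations: set of situation labels found across all stored media.
    near_required / far_required: the N preset near-distance situations and
    the M preset distant situations of the first object (N, M >= 2).
    Returns the same three-way verdict the model units receive."""
    if not near_required <= media_situations:
        return "near-distance habit incomplete"
    if not far_required <= media_situations:
        return "near-distance habit complete, distant habit incomplete"
    return "near-distance and distant habits complete"
```

The ordering of the checks matters: the near-distance set is tested first, so a near-incomplete verdict is returned even if some distant situations are also missing, matching the priority of unit 12023 over unit 12024.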
Optionally, in some embodiments, the intelligent hunting camera 100 may further include:
a mode determining module 1206, configured to determine whether a currently set shooting mode is a normal mode or the ecological diversity mode;
The high-pixel video recording module 1203 is further configured to start shooting the high-pixel video recording if the shooting mode is determined to be the normal mode;
a stop shooting module 1207 for controlling to stop shooting in the case that it is determined that there is no more living organism in the infrared sensing area.
Optionally, in some embodiments, the intelligent hunting camera 100 may further include:
the infrared feature buffer module 1208 is configured to buffer the current infrared sensing feature data, where the infrared sensing feature data includes a shape and a size of a triggered infrared radiation change area, and a radiation value of a feature point in the area;
an infrared feature comparison module 1209, configured to determine whether the diversity complete result library contains infrared sensing characteristic data whose similarity to the current infrared sensing characteristic data exceeds a preset approximation threshold; the diversity complete result library records a plurality of infrared complete correspondences, each of which is a correspondence between the characteristic identifier of an object and the infrared sensing characteristic data recorded when that object entered the infrared sensing area and its collected habit was determined to be complete in both the near-distance habit and the distant habit;
The shooting blocking module 1205 is further configured to not start video shooting when it is determined that the diversity complete result library contains infrared sensing characteristic data whose similarity to the current infrared sensing characteristic data exceeds the preset approximation threshold.
Optionally, in some embodiments, the intelligent hunting camera 100 may further include:
the infrared repeat comparison module 1210, configured to, when it is determined that the diversity complete result library contains infrared sensing characteristic data whose similarity to the current infrared sensing characteristic data exceeds the preset approximation threshold, determine whether the current infrared sensing characteristic data is identical to that matching data;
a cache clearing module 1211, configured to clear the cached current infrared sensing characteristic data if the two are identical;
the infrared complete establishing module 1212, configured to, if the two differ, establish a correspondence between the current infrared sensing characteristic data and the characteristic identifier of the object associated with the matching data in the diversity complete result library, as a new infrared complete correspondence;
an infrared complete adding module 1213, configured to record the new infrared complete correspondence into the diversity complete result library.
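The cooperation of modules 1210-1213 can be sketched as follows. The record layout (a list of identifier/feature-vector pairs) and the function name are assumptions for illustration only.

```python
def handle_match(library, current_features, match_obj_id, cache):
    """library: list of (object_id, feature_vector) infrared complete
    correspondences; match_obj_id: identifier of the record already found
    to exceed the approximation threshold; cache: the buffered current data.
    If the current data is identical to the matched record, only the cache
    is cleared (module 1211); otherwise a new correspondence pairing the
    matched object's identifier with the current feature data is recorded
    (modules 1212-1213)."""
    matched = next(f for oid, f in library if oid == match_obj_id)
    if current_features == matched:
        cache.clear()  # identical sensing data: nothing new to keep
    else:
        library.append((match_obj_id, current_features))
    return library
```

Recording the new correspondence under the same object identifier means the next time the animal produces this slightly different infrared signature, the comparison of module 1209 matches it directly.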
Optionally, in some embodiments, the intelligent hunting camera 100 may further include:
a complete library synchronization module 1214, configured to send the data in the diversity complete result library to the other intelligent hunting cameras within communication range when a preset synchronization condition is satisfied, and to receive the diversity complete result library data sent by those cameras;
a complete library updating module 1215, configured to process the data received from the diversity complete result libraries of the other intelligent hunting cameras, so that the local diversity complete result library includes the infrared complete correspondences in the diversity complete result libraries of all other intelligent hunting cameras within communication range.
Optionally, in some embodiments, the preset synchronization condition is that the number of the newly added infrared complete correspondence in the diversity complete result library exceeds a preset synchronization value.
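The preset synchronization condition used by modules 1214-1215 reduces to a simple counter check; the sketch below assumes a threshold of 5 newly added correspondences, a value invented here for illustration.

```python
def should_synchronize(new_correspondences, preset_sync_value=5):
    """Trigger synchronization only once the number of infrared complete
    correspondences added since the last sync exceeds the preset value,
    keeping the power-hungry communication module off most of the time."""
    return new_correspondences > preset_sync_value
```

Batching several new correspondences per radio activation is the design choice that lets the cameras share results without paying a transmission cost for every single capture.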
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the application.
As used in the above embodiments, the term "when …" may be interpreted to mean "if …" or "after …" or "in response to determination …" or "in response to detection …" depending on the context. Similarly, the phrase "at the time of determination …" or "if detected (a stated condition or event)" may be interpreted to mean "if determined …" or "in response to determination …" or "at the time of detection (a stated condition or event)" or "in response to detection (a stated condition or event)" depending on the context.
In the above embodiments, it may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When loaded and executed on a computer, produces a flow or function in accordance with embodiments of the present application, in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium, for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by a wired (e.g., coaxial cable, fiber optic, digital subscriber line), or wireless (e.g., infrared, wireless, microwave, etc.). The computer readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server, data center, etc. that contains an integration of one or more available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid state disk), etc.
Those of ordinary skill in the art will appreciate that implementing all or part of the above-described method embodiments may be accomplished by a computer program to instruct related hardware, the program may be stored in a computer readable storage medium, and the program may include the above-described method embodiments when executed. And the aforementioned storage medium includes: ROM or random access memory RAM, magnetic or optical disk, etc.

Claims (8)

1. An artificial intelligence-based active organism shooting method applied to an intelligent hunting camera, which is characterized by comprising the following steps:
under the condition that a new living organism enters an infrared sensing area and the current shooting mode is an ecological diversity mode, the intelligent hunting camera shoots a high-pixel image;
the intelligent hunting camera inputs the high-pixel image and all media captured in the gallery of the intelligent hunting camera into a habit sampling completeness analysis intelligent model to obtain the completeness of the habit of the collected first object; wherein the first object is a living creature identified from the high-pixel image; the completeness of the habit is any one of: the near-distance habit is incomplete; the near-distance habit is complete but the distant habit is incomplete; or both the near-distance habit and the distant habit are complete;
Under the condition that the completeness of the habit of the collected first object is determined to be incomplete in the short-distance habit, the intelligent hunting camera starts shooting high-pixel video;
under the condition that the completeness of the habit of the collected first object is determined to be complete in the near habit but incomplete in the distant habit, the intelligent hunting camera starts shooting the low-pixel video; the resolution of the low-pixel image obtained by the low-pixel video is less than 1/2 of the resolution of the high-pixel image;
under the condition that the completeness of the habit of the collected first object is determined to be complete in the near habit and complete in the distant habit, the intelligent hunting camera does not start video shooting;
the step in which the intelligent hunting camera inputs the high-pixel image and all media captured in the gallery of the intelligent hunting camera into the habit sampling completeness analysis intelligent model to obtain the completeness of the habit of the collected first object specifically comprises:
the intelligent hunting camera identifies the first object in the high-pixel image through the habit sampling completeness analysis intelligent model;
the intelligent hunting camera determines, through the habit sampling completeness analysis intelligent model, whether all captured media include the N preset near-distance situations and the M preset distant situations of the first object; N and M are positive integers greater than or equal to 2;
Under the condition that N preset near-distance situations of the first object are not completely included in all shot media, the intelligent hunting camera receives a near-distance habit incomplete result output by the habit sampling completeness analysis intelligent model;
in the case that the N preset near-distance situations of the first object are completely included in all the shot media, but the M far-distance situations of the first object are not completely included, the intelligent hunting camera receives a result that the near-distance habit output by the habit sampling completeness analysis intelligent model is complete but the far-distance habit is incomplete;
in the case that it is determined that the N preset near-distance situations of the first object have been completely included in all the photographed media and the M far-distance situations of the first object have been completely included, the intelligent hunting camera receives a result that the near-distance habit outputted by the habit sampling completeness analysis intelligent model is complete and the far-distance habit is complete.
2. The method of claim 1, wherein prior to the step of capturing a high pixel image by the smart hunting camera, the method further comprises:
The intelligent hunting camera determines whether a currently set shooting mode is a common mode or the ecological diversity mode;
under the condition that the shooting mode is determined to be the normal mode, the intelligent hunting camera starts shooting the high-pixel video;
in the event that it is determined that there are no more living organisms in the infrared sensing region, the intelligent hunting camera stops shooting.
3. The method according to claim 1 or 2, wherein in case it is determined that the photographing mode is the ecological diversity mode, before the step of photographing a high pixel image by the intelligent hunting camera, the method further comprises:
the intelligent hunting camera caches the infrared induction characteristic data, wherein the infrared induction characteristic data comprises the shape and the size of a triggered infrared radiation change area and the radiation value of a characteristic point in the area;
the intelligent hunting camera determines whether the diversity complete result library contains infrared sensing characteristic data whose similarity to the current infrared sensing characteristic data exceeds a preset approximation threshold; the diversity complete result library records a plurality of infrared complete correspondences, each of which is a correspondence between the characteristic identifier of an object and the infrared sensing characteristic data recorded when that object entered the infrared sensing area and its collected habit was determined to be complete in both the near-distance habit and the distant habit;
And under the condition that the infrared sensing characteristic data with the similarity with the current infrared sensing characteristic data exceeding a preset approximate threshold exists in the diversity complete result library, the intelligent hunting camera does not start video shooting.
4. A method according to claim 3, characterized in that the method further comprises:
in the case where it is determined that the diversity complete result library contains infrared sensing characteristic data whose similarity to the current infrared sensing characteristic data exceeds the preset approximation threshold, the intelligent hunting camera determines whether the current infrared sensing characteristic data is identical to that matching data;
if the two are identical, the intelligent hunting camera clears the cached current infrared sensing characteristic data;
if the two differ, the intelligent hunting camera establishes a correspondence between the current infrared sensing characteristic data and the characteristic identifier of the object associated with the matching data in the diversity complete result library, as a new infrared complete correspondence;
the intelligent hunting camera records the new infrared complete corresponding relation into the diversity complete result library.
5. A method according to claim 3, characterized in that the method further comprises:
when a preset synchronization condition is met, the intelligent hunting camera sends the data in the diversity complete result library to the other intelligent hunting cameras within communication range, and receives the diversity complete result library data sent by those cameras;
the intelligent hunting camera processes the data received from the diversity complete result libraries of the other intelligent hunting cameras, so that its diversity complete result library includes the infrared complete correspondences in the diversity complete result libraries of all other intelligent hunting cameras within communication range.
6. The method of claim 5, wherein the predetermined synchronization condition is that a number of newly added ir complete correspondence in the diversity complete result library exceeds a predetermined synchronization value.
7. An intelligent hunting camera, characterized in that the intelligent hunting camera comprises: the device comprises a processor, a memory, an infrared monitor and a camera;
the infrared monitor is used for receiving infrared radiation data of the infrared induction area and transmitting the infrared radiation data to the processor; the camera is used for receiving the instruction of the processor to start or stop shooting and transmitting the shot image to the processor;
The memory is coupled to the processor, the memory for storing computer program code comprising computer instructions that the processor invokes to cause the intelligent hunting camera to perform the method of any of claims 1-6.
8. A computer readable storage medium comprising instructions that, when run on a smart hunting camera, cause the smart hunting camera to perform the method of any of claims 1-6.
CN202211450056.8A 2022-11-19 2022-11-19 Artificial intelligence-based active organism shooting method and intelligent hunting camera Active CN115835012B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211450056.8A CN115835012B (en) 2022-11-19 2022-11-19 Artificial intelligence-based active organism shooting method and intelligent hunting camera


Publications (2)

Publication Number Publication Date
CN115835012A CN115835012A (en) 2023-03-21
CN115835012B true CN115835012B (en) 2023-09-15

Family

ID=85529362

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211450056.8A Active CN115835012B (en) 2022-11-19 2022-11-19 Artificial intelligence-based active organism shooting method and intelligent hunting camera

Country Status (1)

Country Link
CN (1) CN115835012B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101033237B1 (en) * 2011-02-18 2011-05-06 (주)테라테코 Multi-function detecting system for vehicles and security using 360 deg. wide image and method of detecting thereof
KR101625471B1 (en) * 2014-12-30 2016-05-30 목원대학교 산학협력단 Method and apparatus for enhancing resolution of popular low cost thermal image camera
CN106973235A (en) * 2017-04-28 2017-07-21 深圳东方红鹰科技有限公司 The image pickup method and device detected based on rpyroelectric infrared
CN108737716A (en) * 2018-03-21 2018-11-02 北京猎户星空科技有限公司 Image pickup method, device and smart machine
CN112235338A (en) * 2020-07-27 2021-01-15 北京图力普联科技有限公司 Animal husbandry breeding monitoring system with artificial intelligence
CN112637503A (en) * 2020-12-22 2021-04-09 深圳市九洲电器有限公司 Photographing apparatus, photographing method, and computer-readable storage medium
CN113411504A (en) * 2021-08-18 2021-09-17 成都大熊猫繁育研究基地 Intelligent shooting method and system for field infrared camera

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009037460A (en) * 2007-08-02 2009-02-19 Sanyo Electric Co Ltd Image processing method, image processor, and electronic equipment equipped with image processor




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant