CN115835012A - Moving organism shooting method based on artificial intelligence and intelligent hunting camera


Info

Publication number
CN115835012A
Authority
CN
China
Prior art keywords
intelligent, habit, complete, hunting, hunting camera
Prior art date
Legal status
Granted
Application number
CN202211450056.8A
Other languages
Chinese (zh)
Other versions
CN115835012B (en)
Inventor
王尔康
周松河
Current Assignee
HUARUI YANNENG TECHNOLOGY (SHENZHEN) CO LTD
Original Assignee
HUARUI YANNENG TECHNOLOGY (SHENZHEN) CO LTD
Priority date
Filing date
Publication date
Application filed by HUARUI YANNENG TECHNOLOGY (SHENZHEN) CO LTD
Priority to CN202211450056.8A
Publication of CN115835012A
Application granted
Publication of CN115835012B
Status: Active

Landscapes

  • Studio Devices (AREA)

Abstract

The method can intelligently identify how completely the habits of the object currently in the infrared sensing area have already been collected, and accordingly triggers high-pixel shooting, low-pixel shooting, or no shooting at all, so that the intelligent hunting camera greatly extends its endurance time while still acquiring the high-definition footage that supports field animal observation and research.

Description

Moving organism shooting method based on artificial intelligence and intelligent hunting camera
Technical Field
The application relates to the field of artificial intelligence and intelligent terminals, in particular to an artificial intelligence-based moving organism shooting method and an intelligent hunting camera.
Background
Hunting cameras are mainly used for hunting, field animal observation, research, and the like. Compared with a traditional ordinary camera, a hunting camera has a very long standby time and strong waterproofing, and can adapt to a variety of field environments.
With the development of camera technology, the resolution of hunting cameras has greatly improved, from 5 megapixels to 8 megapixels and then to 12 megapixels or even higher, which better meets the data-collection requirements of field biological research.
However, while higher resolution brings clearer pictures, it also sharply increases the hunting camera's power consumption, which greatly shortens its endurance time.
Disclosure of Invention
The application provides a moving organism shooting method based on artificial intelligence and an intelligent hunting camera, so that the intelligent hunting camera can acquire high-definition footage supporting field animal observation and research while its endurance time is extended.
In a first aspect, the present application provides a moving organism shooting method based on artificial intelligence, applied to an intelligent hunting camera, comprising: when a new moving creature enters the infrared sensing area and the current shooting mode is the ecological diversity mode, the intelligent hunting camera shoots a high-pixel image; the intelligent hunting camera inputs the high-pixel image and all media already shot in its gallery into a habit sampling completeness analysis intelligent model to obtain the completeness of the acquired habit of a first object, where the first object is the movable creature identified from the high-pixel image; the completeness of the habit is one of three levels: near-distance habit incomplete, near-distance habit complete but far-distance habit incomplete, or near-distance habit complete and far-distance habit complete; when the completeness of the acquired habit of the first object is determined to be near-distance habit incomplete, the intelligent hunting camera starts to shoot a high-pixel video; when the completeness is determined to be near-distance habit complete but far-distance habit incomplete, the intelligent hunting camera starts to shoot a low-pixel video, where the resolution of a low-pixel image obtained from the low-pixel video is less than 1/2 of the resolution of the high-pixel image; and when the completeness is determined to be near-distance habit complete and far-distance habit complete, the intelligent hunting camera does not start video shooting.
In the above embodiment, when the shooting mode is the biodiversity mode, the intelligent hunting camera 100 does not directly start shooting a high-pixel video after sensing that a new living creature is in the infrared sensing area. Instead, it first shoots a high-pixel image, uses the habit sampling completeness analysis intelligent model to judge how completely the habit of the object in the high-pixel image has been recorded in the media already shot, and then acts accordingly. For example, when the near-distance habit of the subject is determined to be complete but the far-distance habit is not, only a low-pixel video is shot, so the acquisition requirements of biodiversity habit research data can be met with lower power consumption. When both the near-distance and far-distance habits of the object are determined to be complete, video shooting is not started at all, reserving more battery power for shooting objects whose habit data is still incomplete, which greatly extends the endurance of the intelligent hunting camera 100 during ecological diversity habit research.
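To make this decision flow concrete, the following is a minimal Python sketch under stated assumptions: the helper names (shoot_high_pixel_image, all_media, analyze, start_video) and the result strings are illustrative inventions for this sketch, not interfaces defined by the application.

```python
# Minimal sketch of the ecological-diversity-mode decision flow (illustrative names only).
NEAR_INCOMPLETE = "near habit incomplete"
NEAR_COMPLETE_FAR_INCOMPLETE = "near habit complete, far habit incomplete"
ALL_COMPLETE = "near habit complete, far habit complete"

def on_new_creature_in_diversity_mode(camera, gallery, model):
    """Triggered when the infrared monitor senses a new moving creature."""
    high_pixel_image = camera.shoot_high_pixel_image()
    completeness = model.analyze(high_pixel_image, gallery.all_media())

    if completeness == NEAR_INCOMPLETE:
        camera.start_video(resolution="high")   # close-range detail still needed
    elif completeness == NEAR_COMPLETE_FAR_INCOMPLETE:
        camera.start_video(resolution="low")    # less than 1/2 of the high-pixel resolution
    else:                                       # both habits complete
        pass                                    # do not start video shooting, save power
```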
With reference to some embodiments of the first aspect, in some embodiments, the intelligent hunting camera inputting the high-pixel image and all media shot in its gallery into the habit sampling completeness analysis intelligent model and acquiring the completeness of the acquired habit of the first object specifically includes: the intelligent hunting camera identifies the first object in the high-pixel image through the habit sampling completeness analysis intelligent model; the intelligent hunting camera analyzes, through the habit sampling completeness analysis intelligent model, whether all the media already shot include the N preset near-distance situations and the M preset far-distance situations of the first object, where N and M are both positive integers greater than or equal to 2; when the N preset near-distance situations of the first object are not all included in the shot media, the intelligent hunting camera receives the result, output by the model, that the near-distance habit is incomplete; when the N preset near-distance situations of the first object are all included in the shot media but the M far-distance situations are not all included, the intelligent hunting camera receives the result that the near-distance habit is complete but the far-distance habit is incomplete; and when the N preset near-distance situations and the M far-distance situations of the first object are all included in the shot media, the intelligent hunting camera receives the result that the near-distance habit and the far-distance habit are both complete.
In the above embodiment, the habit sampling completeness analysis intelligent model determines whether the near-distance habit or the far-distance habit of the target object is complete according to whether the shot media cover the N preset near-distance situations and the M preset far-distance situations of the target object, which improves the accuracy of the model's completeness judgment.
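A minimal sketch of this coverage rule follows, assuming the covered and preset situations are represented as Python sets (the set representation is an assumption made for clarity, not part of the application):

```python
# Completeness classification based on coverage of preset near/far situations.
def habit_completeness(covered_near: set, covered_far: set,
                       preset_near: set, preset_far: set) -> str:
    if not preset_near <= covered_near:   # some of the N near situations are missing
        return "near habit incomplete"
    if not preset_far <= covered_far:     # near covered, some of the M far situations missing
        return "near habit complete, far habit incomplete"
    return "near habit complete, far habit complete"
```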
With reference to some embodiments of the first aspect, in some embodiments, before the step of taking a high-pixel image by the smart hunting camera, the method further includes: the intelligent hunting camera determines whether the current set shooting mode is a common mode or the ecological diversity mode; under the condition that the shooting mode is determined to be the common mode, the intelligent hunting camera starts to shoot the high-pixel video; in the event that it is determined that there are no more living creatures in the infrared sensing area, the intelligent hunting camera stops shooting.
In the above embodiment, the intelligent hunting camera can also adopt a common mode to shoot in other required scenes, so that the applicability of the intelligent hunting camera to different scenes is improved.
In some embodiments, the camera can also switch automatically to the normal mode when, in the ecological diversity mode, the photographed objects meet a preset condition, further improving the intelligent hunting camera's ability to meet different usage requirements.
With reference to some embodiments of the first aspect, in some embodiments, when the shooting mode is determined to be the ecological diversity mode, before the step of the intelligent hunting camera shooting a high-pixel image, the method further comprises: the intelligent hunting camera caches the infrared sensing characteristic data, where the infrared sensing characteristic data includes the shape and size of the triggered infrared radiation change region and the radiation values at feature points within that region; the intelligent hunting camera determines whether the diversity complete result library contains infrared sensing characteristic data whose similarity to the cached data exceeds a preset approximation threshold; the diversity complete result library records a plurality of infrared complete correspondences, where one infrared complete correspondence is a correspondence, recorded when the intelligent hunting camera determines that the completeness of an object's acquired habit is near-distance habit complete and far-distance habit complete, between the feature identifier of that object and the infrared sensing characteristic data caused when the object entered the infrared sensing area; and when the diversity complete result library contains infrared sensing characteristic data whose similarity exceeds the preset approximation threshold, the intelligent hunting camera does not start video shooting.
In the above embodiment, the intelligent hunting camera maintains a diversity complete result library recording the correspondence between the feature identifiers of habit-complete objects and their infrared sensing characteristic data. When the infrared sensing characteristic data indicates that the living creature entering the infrared sensing area this time is an object whose habit is already complete, neither the shooting of a high-pixel image nor the judgment by the habit sampling completeness analysis intelligent model is triggered, further saving the intelligent hunting camera's power while still meeting the requirements of biodiversity habit research.
With reference to some embodiments of the first aspect, in some embodiments, the method further comprises: when the diversity complete result library contains infrared sensing characteristic data whose similarity exceeds the preset approximation threshold, the intelligent hunting camera determines whether this time's infrared sensing characteristic data is identical to that matching data; if they are identical, the intelligent hunting camera clears the cached infrared sensing characteristic data; if they are not identical, the intelligent hunting camera establishes a new infrared complete correspondence between this time's infrared sensing characteristic data and the feature identifier of the object corresponding to the matching data in the diversity complete result library, and records the new infrared complete correspondence into the diversity complete result library.
In the above embodiment, when it is determined that the similarity between the current infrared sensing characteristic data and a record in the diversity complete result library exceeds the preset approximation threshold, it is further determined whether the current data is identical to that record, so as to decide whether to add the current data to the library. Not adding identical data saves the storage resources occupied by the diversity complete result library; adding data that is similar but not identical further improves the accuracy of subsequent infrared sensing characteristic data comparisons.
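The following sketch combines the similarity check and the identical/non-identical handling described above. The library interface (most_similar, add), the similarity score, and the returned strings are assumptions made for illustration only.

```python
# Sketch of the cached-infrared-data check against the diversity complete result library.
def handle_cached_ir_data(ir_data, library, approx_threshold):
    match = library.most_similar(ir_data)   # assumed to return None or the best match
    if match is None or match.similarity <= approx_threshold:
        return "not a habit-complete object: shoot high-pixel image and analyze"
    if match.stored_ir_data == ir_data:
        return "identical record exists: clear cache, do not shoot"
    # Similar but not identical: reuse the matched object's feature identifier and
    # record this correspondence so later comparisons become more accurate.
    library.add(feature_id=match.feature_id, ir_data=ir_data)
    return "habit-complete object: do not shoot"
```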
With reference to some embodiments of the first aspect, in some embodiments, the method further comprises: when a preset synchronization condition is met, the intelligent hunting camera sends the data in its diversity complete result library to the other intelligent hunting cameras within communication range, and receives the diversity complete result library data sent by those cameras; the intelligent hunting camera processes the received data so that its diversity complete result library includes the infrared complete correspondences from the diversity complete result libraries of all the other intelligent hunting cameras within communication range.
In the above embodiment, the diversity complete result library is synchronized among the plurality of intelligent hunting cameras, so that an intelligent hunting camera no longer shoots objects whose habit data has already been completely captured by any of the communicable intelligent hunting cameras. This greatly reduces the shooting of repeated habit data, extends the endurance time of the intelligent hunting cameras, and avoids low-value repetitive data occupying their storage space.
With reference to some embodiments of the first aspect, in some embodiments, the preset synchronization condition is that the number of the infrared complete correspondences newly added in the complete result library of diversity exceeds a preset synchronization value.
In the above embodiment, synchronization of the diversity complete result library may be triggered only when the number of newly added infrared complete correspondences exceeds the preset synchronization value, which reduces the power consumed by synchronization, reduces the amount of data to be transmitted, and improves synchronization efficiency.
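A minimal sketch of this synchronization rule follows; the threshold value, the library methods, and the peer interface are all illustrative assumptions rather than interfaces defined by the application.

```python
# Synchronize the diversity complete result library only after enough new entries accumulate.
PRESET_SYNC_VALUE = 10   # hypothetical "preset synchronization value"

def maybe_synchronize(local_library, peers):
    if local_library.new_entries_since_last_sync <= PRESET_SYNC_VALUE:
        return
    outgoing = local_library.export_entries()
    for peer in peers:                               # other cameras in communication range
        peer.receive_entries(outgoing)               # send our correspondences
        local_library.merge(peer.export_entries())   # take the union of their correspondences
    local_library.new_entries_since_last_sync = 0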
In a second aspect, an embodiment of the present application provides an intelligent hunting camera, including: the device comprises a processor, a memory, an infrared monitor and a camera; the infrared monitor is used for receiving infrared radiation data of the infrared sensing area and transmitting the infrared radiation data to the processor; the camera is used for receiving the instruction of the processor to start or stop shooting and transmitting the shot image to the processor; the memory is coupled to the processor, the memory for storing computer program code comprising computer instructions, the processor invoking the computer instructions to cause the intelligent hunting camera to perform the method as described in the first aspect and any possible implementation manner of the first aspect.
In a third aspect, embodiments of the present application provide a computer program product including instructions, which, when run on an intelligent hunting camera, causes the intelligent hunting camera to perform the method as described in the first aspect and any possible implementation manner of the first aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, which includes instructions that, when executed on an intelligent hunting camera, cause the intelligent hunting camera to perform the method as described in the first aspect and any possible implementation manner of the first aspect.
It is understood that the electronic device provided by the second aspect, the computer program product provided by the third aspect, and the computer storage medium provided by the fourth aspect are all used to execute the method provided by the embodiments of the present application. Therefore, the beneficial effects achieved by the method can refer to the beneficial effects in the corresponding method, and the details are not repeated here.
One or more technical solutions provided in the embodiments of the present application have at least the following technical effects or advantages:
1. When the shooting mode is the biodiversity mode, after the intelligent hunting camera 100 senses a new living creature in the infrared sensing area, it does not directly start shooting a high-pixel video. Instead, it first shoots a high-pixel image, uses the habit sampling completeness analysis intelligent model to judge how completely the habit of the object in that image has been recorded in the media already shot, and then acts accordingly, possibly shooting at low pixels or not shooting at all. This effectively solves the problem in the related art that continuous high-pixel shooting consumes so much power that long-term biodiversity research shooting tasks cannot be completed, and greatly extends the endurance of the intelligent hunting camera during ecological diversity habit research.
2. Because the intelligent hunting camera maintains a diversity complete result library recording the correspondence between the feature identifiers of habit-complete objects and their infrared sensing characteristic data, when the infrared sensing characteristic data indicates that the living creature entering the infrared sensing area is an object whose habit is already complete, neither the shooting of a high-pixel image nor the judgment by the habit sampling completeness analysis intelligent model is triggered, further saving the intelligent hunting camera's power while still meeting the requirements of biodiversity habit research.
3. By synchronizing the diversity complete result library among multiple intelligent hunting cameras, an intelligent hunting camera no longer shoots objects whose habit data has already been completely captured by any communicable intelligent hunting camera. This greatly reduces the shooting of repeated habit data, extends the endurance time of the intelligent hunting cameras, and avoids repetitive low-value data occupying their storage space.
Drawings
FIG. 1 is a schematic diagram of a scenario in which a hunting camera is used in the related art;
FIG. 2 is a schematic view showing a case where a hunting camera of the related art is used for field study photographing;
FIG. 3 is a schematic diagram of a situation when a field research photography is performed by using the intelligent hunting camera in the embodiment of the present application;
FIG. 4 is a schematic structural diagram of an intelligent hunting camera 100 provided in the embodiment of the present application;
FIG. 5 is an exemplary diagram of the functionality and training data of the habit sample completeness analysis intelligent model in the embodiment of the present application;
FIG. 6 is a schematic flow chart of an example of the artificial intelligence-based moving organism shooting method in an embodiment of the present application;
FIG. 7 is an exemplary diagram of an intelligent hunting camera 100 using the habit sample completeness analysis intelligent model in an embodiment of the present application;
FIG. 8 is a schematic flow chart of another example of the artificial intelligence-based moving organism shooting method in an embodiment of the present application;
FIG. 9 is an exemplary diagram of the contents stored in the diversity complete result library in the embodiment of the present application;
FIG. 10 is a schematic flow chart of yet another example of the artificial intelligence-based moving organism shooting method in an embodiment of the present application;
FIG. 11 is a schematic diagram of an exemplary scenario in which multiple intelligent hunting cameras cooperate in an embodiment of the present application;
FIG. 12 is a schematic diagram of a modular structure of the intelligent hunting camera 100 according to the embodiment of the present application.
Detailed Description
The terminology used in the following embodiments of the present application is for the purpose of describing particular embodiments only and is not intended to limit the present application. As used in the specification of this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the listed items.
In the following, the terms "first" and "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature, and in the description of embodiments of the application, unless stated otherwise, "plurality" means two or more.
Since the embodiment of the present application relates to the application of the artificial intelligence technology, for easy understanding, the following is a brief introduction to the concept of the artificial intelligence model:
the artificial intelligence model in the application is a deep learning model constructed based on an artificial neural network. The characteristic training is carried out on the general or customized neural network through a large amount of marked training data, so that the artificial intelligence model can complete self-learning and has corresponding functions.
For example, an artificial intelligence model may be trained using a large number of pictures labeled with the position of a person as training data, after which the model can mark the position of a person in pictures that are input to it.
For another example, by using pictures or videos labeled with different character motions or expressions as training data, the artificial intelligence model can be trained to recognize character motions or expressions in the pictures or videos that are input to it.
The hunting camera is generally applied in field environment, as shown in fig. 1, which is a schematic view of a scene using the hunting camera.
Hunting cameras generally include a lens, an infrared monitor, a fill-in light, and a fixing buckle. The lens is used for taking pictures or recording videos; the infrared monitor is used for sensing infrared heat sources in the surrounding environment and triggering the photographing or video recording function to start or stop; the fill-in light is used to supplement light when the environment is too dark; and the fixing buckle is used to fix the hunting camera stably at a certain position in the environment.
As shown in fig. 1 (a), after the hunting camera is fixed on a tree using the fixing buckle and started, the infrared monitor monitors the infrared sensing area in front of the hunting camera. Although there is a tiger not far from the hunting camera, the hunting camera does not trigger shooting because the tiger has not entered its infrared sensing area.
As shown in fig. 1 (b), after the tiger moves into the infrared sensing area of the hunting camera, the infrared monitor senses that the tiger's infrared radiation changes the infrared energy in the environment and determines that a living creature has entered the infrared sensing area. The hunting camera is therefore triggered to start shooting and record the tiger's activity, accumulating material for subsequent research on the tiger's habits.
Fig. 2 is a schematic diagram showing a situation of field study photographing using a hunting camera in the related art.
In order to acquire higher-definition research material, high-pixel hunting cameras are generally adopted for shooting at present. A high-pixel camera not only consumes more power to process the large amount of image pixel data during shooting, but the high-pixel images it produces are also very large, so storing them consumes a large amount of power as well. Meanwhile, wild-animal field studies typically last a long time; for example, a study period may be 4 months. Because of the high power consumption, a high-pixel hunting camera often cannot keep working until the expected field observation and research duration is reached. As shown in fig. 2, after high-pixel shooting has been triggered many times, the camera shuts down due to power shortage before completing the full 4 months.
In addition, the storage space of a hunting camera is limited. When a large number of high-pixel images requiring very large storage space accumulate, even if battery power remains, the storage space may be exhausted before the camera has operated for the expected field observation and research duration, and shooting cannot continue. As shown in fig. 2, after high-pixel shooting has been triggered many times, shooting fails due to insufficient storage space before the 4 months of work are completed.
By adopting the moving organism shooting method based on artificial intelligence and the intelligent hunting camera provided by the embodiments of the application, the camera can intelligently identify how completely the habits of the object in the infrared sensing area have already been acquired, and accordingly trigger high-pixel shooting, low-pixel shooting, or no shooting at all, so that the intelligent hunting camera obtains high-definition footage supporting field animal observation and research while its endurance time is greatly extended.
Fig. 3 is a schematic diagram of a situation in which the intelligent hunting camera in the embodiment of the present application is used for field research shooting. After the intelligent hunting camera is started, infrared sensing is triggered, and the intelligent hunting camera determines that the near-distance habit of an object A (a tiger) entering the infrared sensing area is incomplete, so it starts high-pixel shooting and can clearly capture near-distance detailed habit actions such as the tiger's predation and fighting. The intelligent hunting camera stops shooting after the tiger leaves.
After high-pixel shooting has been carried out multiple times, the tiger enters the infrared sensing area again on a later day. Having determined that the tiger's near-distance habit is now complete, the intelligent hunting camera no longer starts high-pixel shooting but starts low-pixel shooting instead, which is enough to capture long-distance overall habit behaviors such as the tiger's movement routes and resting locations. The intelligent hunting camera stops shooting after the tiger leaves.
Because the camera intelligently switches to low-pixel shooting once the condition is met, power consumption is greatly reduced, and the intelligent hunting camera can easily last the typical field observation and research duration (for example, 4 months).
After the tiger has been shot at low pixels many times, it enters the infrared sensing area again on some later day, and the intelligent hunting camera, having determined that all of the tiger's habits are complete, does not trigger shooting at all. This further saves power and extends endurance time.
On another later day, infrared sensing is triggered, and the intelligent hunting camera determines that the near-distance habit of an object B (a snake) entering the infrared sensing area is incomplete, so it starts high-pixel shooting and clearly captures the snake's detailed habit actions.
Therefore, by adopting the moving creature shooting method based on artificial intelligence and the intelligent hunting camera in the embodiment of the application, a large number of high-definition pictures can be acquired to support the observation and research of field animals, and the endurance time of the intelligent hunting camera can be greatly prolonged, so that more valuable research images can be acquired.
An exemplary intelligent hunting camera 100 provided by embodiments of the present application is first described below.
Fig. 4 is a schematic structural diagram of the intelligent hunting camera 100 according to the embodiment of the present application.
The embodiment will be specifically described below by taking the intelligent hunting camera 100 as an example. It should be understood that the intelligent hunting camera 100 may have more or fewer components than shown in the figures, may combine two or more components, or may have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The intelligent hunting camera 100 may include: a processor 101, a camera 102, a memory 103, keys 104, an LED 105, a battery 106, an infrared monitor 107, a display screen 108, and the like.
It is understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation to the intelligent hunting camera 100. In other embodiments of the present application, the intelligent hunting camera 100 may include more or fewer components than shown, or combine certain components, or split certain components, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 101 may include one or more processing units, such as: the processor 101 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. Wherein, the different processing units may be independent devices or may be integrated in one or more processors.
Wherein the controller may be the neural center and the command center of the intelligent hunting camera 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 101 for storing instructions and data. In some embodiments, the memory in the processor 101 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 101. If the processor 101 needs to use the instruction or data again, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 101, thereby increasing the efficiency of the system.
In some embodiments, processor 101 may include one or more interfaces for communicating information with other modules.
It should be understood that the connection relationship between the modules illustrated in the embodiment of the present application is only an illustration, and does not form a structural limitation on the intelligent hunting camera 100.
The battery 106 may power the processor 101, the camera 102, the memory 103, the keys 104, the LEDs 105, the infrared monitor 107, the display screen 108, and the like.
The intelligent hunting camera 100 may implement display functions via a GPU, a display screen 108, and an application processor, among others. The GPU is a microprocessor for image processing, and is connected to the display screen 108 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 101 may include one or more GPUs that execute program instructions to generate or alter display information. In some embodiments, the intelligent hunting camera 100 may also be without the display screen 108, which is not limited herein.
The intelligent hunting camera 100 may implement a shooting function through an ISP, a camera 102, a video codec, a GPU, an application processor, and the like.
The ISP is used to process the data fed back by the camera 102. For example, when a user takes a picture, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, an optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and converting into an image visible to the naked eye. The ISP can also carry out algorithm optimization on the noise, brightness and color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 102.
The camera 102 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV and other formats. In some embodiments, the smart hunting camera 100 may include 1 or N cameras 102, N being a positive integer greater than 1.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the intelligent hunting camera 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
Memory 103 may include one or more Random Access Memories (RAMs) and one or more non-volatile memories (NVMs).
The random access memory may be read and written directly by the processor 101, may be used to store executable programs (e.g., machine instructions) of an operating system or other programs in operation, and may also be used to store data of users and application programs, etc.
The nonvolatile memory may also store executable programs, data of users and application programs, and the like, and may be loaded in advance into the random access memory for the processor 101 to directly read and write.
The memory 103 may include a memory card for storing pictures or videos taken by the intelligent hunting camera 100.
The keys 104 include a power-on key, a volume key, and the like. The keys 104 may be mechanical keys or touch keys. The intelligent hunting camera 100 may receive key inputs and generate key signal inputs related to user settings and function control of the intelligent hunting camera 100.
The LEDs 105 may constitute a fill-in light for shooting when the ambient light is too dark.
The infrared monitor 107 may sense the infrared heat source of the surrounding environment, thereby triggering the on or off of the photographing or video recording function of the intelligent hunting camera 100.
In order to provide the intelligent hunting camera 100 with the capability of intelligently determining the completeness of the recorded habit of the photographed object, a pre-trained habit sample completeness analyzing intelligent model is stored in the memory 103 of the intelligent hunting camera 100.
Fig. 5 is a schematic diagram illustrating the functions of the training data and the intelligence model for analyzing the completeness of the habit sample in the embodiment of the present application.
After a picture of a certain object and a set of pictures and/or videos of the object are input into the habit sample completeness analysis intelligent model as input data, a completeness result of the object's habit as recorded in that set of pictures and/or videos is obtained as output data.
The completeness result of an object output by the habit sample completeness analysis intelligent model can have three possibilities: "subject near habit is incomplete", "subject near habit is complete, far habit is incomplete", and "subject near habit is complete and far habit is complete".
It will be appreciated that the near-distance habit of an object refers to habits that need to be observed at close range in order to be seen clearly. The far-distance habit of an object refers to habits that can be observed from a longer distance. For both the near-distance and far-distance habits of an object, certain preset conditions can be defined in advance, which may be called near-distance situations and far-distance situations, respectively.
If all preset near situations are included in a picture or video obtained by shooting an object, the near habit of the object can be considered to be complete, otherwise, the near habit of the object is determined to be incomplete.
Similarly, if the pictures or videos obtained by shooting a subject include all the preset far-distance situations, the far-distance habit of the subject is considered to be complete; otherwise, the far-distance habit of the subject is determined to be incomplete.
It is understood that, in some cases, the photographs or videos already taken of an object may leave both its near-distance habit and its far-distance habit incomplete; in this case, the habit sampling completeness analysis intelligent model outputs the result "the near-distance habit of the object is incomplete". After further shooting, once the near-distance habit of the object is complete but the far-distance habit is not, the model outputs "the near-distance habit of the object is complete and the far-distance habit is incomplete". After still further shooting, once the far-distance habit of the object is also complete, the model outputs "the near-distance habit of the object is complete and the far-distance habit is complete".
It should be noted that, in some cases, the photographs or videos taken of a subject may cover its far-distance habit completely while its near-distance habit is still incomplete. In this case, the habit sample completeness analysis intelligent model only outputs the result "the near-distance habit of the object is incomplete". After continued shooting completes the near-distance habit of the subject as well, the model directly outputs the result "the near-distance habit of the object is complete and the far-distance habit is complete".
To implement the function of the habit sample completeness analysis intelligent model, the model can be given the capabilities of a target recognition AI model and a target habit completeness analysis AI model. The target recognition AI model is used to identify the target object in a picture or video. The target habit completeness analysis AI model is used to analyze the situations of the target object and whether the situations covered in the image and video set include all the preset near-distance or far-distance situations.
Specifically, the habit sample completeness analysis intelligent model can be trained with a large amount of training data so that it acquires this capability. A piece of training data may include a picture of a subject (e.g., pictures 1-6 of subject A, or picture 7 of subject B), a set of pictures or videos containing the subject (e.g., picture or video sets 1-6 containing subject A, or picture or video set 7 containing subject B), and a pre-labeled completeness of the subject's habit as recorded in that set (e.g., habit completeness labels 1-3 of subject A, or habit completeness label 1 of subject B).
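As an illustration, one training record could be organized as follows; the field names and file names are assumptions made for this sketch, not a format specified by the application.

```python
# Illustrative shape of a single training record for the model.
training_record = {
    "subject_picture": "subject_A_picture_1.jpg",              # a picture of the subject
    "media_set": ["subject_A_video_1.mp4",                     # pictures/videos containing it
                  "subject_A_picture_2.jpg"],
    "completeness_label": "near habit complete, far habit incomplete",  # pre-labeled result
}
```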
The following describes in detail the moving organism shooting method based on artificial intelligence in the embodiment of the present application, with reference to the hardware structure of the above-mentioned exemplary intelligent hunting camera 100 and the habit sample completeness analysis intelligent model pre-stored in the intelligent hunting camera 100.
Please refer to fig. 6, which is a flowchart illustrating an example of the artificial intelligence-based moving organism shooting method according to an embodiment of the present application.
S601, determining that a new active organism enters an infrared sensing area;
the intelligent hunting apparatus 100 may continuously sense a change in the infrared radiation energy in the infrared sensing area using the infrared monitor 107, and determine that a new active creature may enter the infrared sensing area if the amount of the infrared radiation energy sensed from the infrared sensing area per unit time exceeds a threshold value compared to the amount of the infrared radiation energy sensed without the active creature.
In some embodiments, if there are active creatures in the current infrared sensing area, it is also determined that there may be new active creatures entering the infrared sensing area when the amount of the sensed infrared radiant energy increases beyond the threshold.
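A minimal sketch of the trigger test in S601 follows; the function signature, the baseline value, and the threshold are assumptions chosen for illustration.

```python
# Trigger test: has the sensed infrared radiation energy per unit time risen enough
# above the no-creature baseline to suggest a new moving creature?
def new_creature_detected(sensed_energy_per_unit_time: float,
                          baseline_energy_per_unit_time: float,
                          threshold: float) -> bool:
    return (sensed_energy_per_unit_time - baseline_energy_per_unit_time) > threshold
```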
S602, determining whether the current set shooting mode is a common mode or an ecological diversity mode;
after the intelligent hunting camera 100 determines that a new living creature enters the infrared sensing area, it can be determined whether the currently set shooting mode is a normal mode or an ecological diversity mode; the common mode is mainly used for performing common hunting shooting, and the biodiversity mode is mainly used for acquiring habit research data of a variety of living beings in a region.
After the user places the intelligent hunting camera 100 and turns on, the shooting mode can be set to be the normal mode or the ecological diversity mode through the key 104. For example, the ecological diversity mode may be initiated by toggling one of the sliding keys 104 close to the underside of the camera body.
After determining that a new active creature enters the infrared sensing area, if the shooting mode is the normal mode, executing step S603; if the shooting mode is the ecological diversity mode, step S605 is executed.
In some embodiments, the intelligent hunting camera 100 may also only have the ecological diversity mode and default to the ecological diversity mode, so that step S602 may also not exist, which is not limited herein.
S603, under the condition that the mode is determined to be the common mode, starting to shoot the high-pixel video;
In the case of determining the normal mode, the intelligent hunting camera 100 may start the camera 102 to shoot a high-pixel video.
It can be appreciated that the image in the high-pixel video recording has extremely high definition, so that more power is consumed and more storage space is needed to shoot the high-pixel video recording than to shoot the low-pixel video recording. But the detail of the shooting object can be clearly shown, and the detail characteristics in the image can be clearly shown even after the shooting object is magnified for a plurality of times.
S604, stopping shooting under the condition that the infrared sensing area does not have the movable organisms any more;
When the intelligent hunting camera 100 determines through the infrared monitor 107 that there is no longer a living creature in the infrared sensing area, it may stop shooting and trigger S601 to again determine whether a new living creature enters the infrared sensing area.
It can be understood that, because the power consumption of infrared sensing is far lower than that of shooting and image processing, stopping shooting once there is no longer a moving creature in the infrared sensing area greatly saves power and extends endurance time.
S605, shooting a high-pixel image under the condition that the biological diversity mode is determined;
In the case of determining the biodiversity mode, the intelligent hunting camera 100 may first start the camera 102 to capture a high-pixel image, rather than directly starting video recording.
The purpose of shooting the high-pixel image is to acquire an image of a moving organism entering the infrared sensing area so as to facilitate subsequent step judgment processing.
In some embodiments, after capturing a high-pixel image, the intelligent hunting camera 100 may first determine whether there is a movable creature in the high-pixel image. If it is determined that no movable creature is present, indicating a false alarm by the infrared monitor 107, step S601 may be triggered directly to make a determination again.
In some embodiments, if it is determined that no movable creature is present in the high-pixel images, the intelligent hunting camera 100 may also take a preset number of high-pixel images (e.g., 3) at a preset time interval (e.g., 2 seconds) and determine whether a movable creature is present in the high-pixel images. If no movable living creature exists in the high-pixel images, it can be determined that the infrared monitor 107 has a false alarm, and step S601 can be directly triggered to continue monitoring.
If it is determined that a movable living being is present in the images, then execution of step S606 may be triggered.
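The false-alarm handling above can be sketched as follows; the detector interface and the concrete retry count and interval (3 images, 2 seconds, taken from the examples in the text) are assumptions for illustration.

```python
import time

# Sketch of the S605 false-alarm check: retake a few high-pixel images before
# concluding that the infrared monitor raised a false alarm.
def confirm_moving_creature(camera, detector, extra_shots=3, interval_s=2):
    image = camera.shoot_high_pixel_image()
    if detector.contains_moving_creature(image):
        return image                      # proceed to completeness analysis (S606)
    for _ in range(extra_shots):
        time.sleep(interval_s)
        image = camera.shoot_high_pixel_image()
        if detector.contains_moving_creature(image):
            return image
    return None                           # treat as a false alarm, resume monitoring (S601)
```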
S606, inputting the high-pixel image and all media in the camera gallery into an habit sampling completeness analysis intelligent model, and determining the completeness of the acquired habit of the first object;
after capturing a high-pixel image containing a movable living being (a first object), the intelligent hunting camera 100 may input the high-pixel image into a pre-stored habit sample completeness analysis intelligent model, and simultaneously, all media (pictures and/or video collections) in the camera gallery are also used as an input of the habit sample completeness analysis intelligent model, so as to determine the completeness of the habit of a first object contained in the captured media, where the first object is the movable living being in the high-pixel image.
If the output result of the habit sample completeness analysis intelligent model is "near-distance habit incomplete", step S603 is triggered;
if the output result of the habit sample completeness analysis intelligent model is "near-distance habit complete but far-distance habit incomplete", step S607 is triggered;
if the output result of the habit sample completeness analysis intelligent model is "near-distance habit complete and far-distance habit complete", step S608 is triggered.
FIG. 7 is an exemplary diagram of the intelligent hunting camera 100 using the habit sample completeness analysis intelligent model in an embodiment of the present application.
The intelligent hunting camera 100 inputs the captured high-pixel image A into the habit sample completeness analysis intelligent model, and all multimedia data (images, videos, and the like) stored in the gallery of the hunting camera's memory card are also used as an input of the model.
The target recognition AI model can recognize that the object in the high-pixel image A is a tiger, and the target habit completeness analysis AI model can determine that the pictures and videos in the gallery include all the preset near-distance situations of the tiger (such as eating, fighting, and mating), but among the preset far-distance situations only the strolling situation is included, and no route-selection or observation situation has been shot yet. Therefore, the output result of the habit sample completeness analysis intelligent model is "the near-distance habit of object A is complete and the far-distance habit is incomplete", which triggers the execution of S607.
S607, under the condition that the short-distance habit is complete and the long-distance habit is not complete, starting to shoot a low-pixel video;
In the case where it is determined that the short-distance habit is complete and the long-distance habit is not, the intelligent hunting camera 100 may activate the camera 102 to shoot a low-pixel video. A low-pixel image has a lower resolution than a high-pixel image, for example only 1/4 of the pixels or even fewer, but it is sufficient for learning the long-distance habit of the subject from the image.
Therefore, when the short-distance habit is complete and the long-distance habit is not, only a low-pixel video is shot and high-pixel video is no longer shot. This greatly saves power, extends the endurance time of the intelligent hunting camera 100, and saves the storage space of the memory 103, so that more valuable footage can be stored while the habit research data of the target object is still obtained.
In the process of executing step S607, the determination as to whether or not the condition for executing step S604 is satisfied may be continued, and step S604 may be executed when the condition is satisfied.
And S608, under the condition that the short-distance habit and the long-distance habit are determined to be complete, the video shooting is not started.
When both the short-distance and long-distance habits are determined to be complete, it means that in the ecological diversity mode the habit research data of the subject is already complete, and more power should be reserved to shoot subjects whose habit data is not yet complete. The intelligent hunting camera 100 therefore does not start video shooting, and triggers step S601.
In some embodiments, it may be also configured that, in the case that both the short-distance and long-distance habits of the preset target objects are complete, the intelligent hunting camera 100 may automatically switch the shooting mode from the bio-diversity mode to the normal mode, which is not limited herein.
In the embodiment of the present application, when the shooting mode is the biodiversity mode, after the intelligent hunting camera 100 senses a new living creature in the infrared sensing area, it does not directly start shooting a high-pixel video. Instead, it first shoots a high-pixel image, uses the habit sampling completeness analysis intelligent model to judge how completely the habit of the object in the high-pixel image has been recorded in the media already shot, and then acts accordingly. For example, when the short-distance habit of the subject is determined to be complete but the long-distance habit is not, only a low-pixel video is shot, so the acquisition requirements of biodiversity habit research data can be met with lower power consumption. When both the short-distance and long-distance habits of the object are determined to be complete, video shooting is not started at all, reserving more battery power for shooting objects whose habit data is still incomplete, which greatly extends the endurance of the intelligent hunting camera 100 during ecological diversity habit research.
In the above embodiment, the intelligent hunting camera 100 can intelligently determine the completeness of the habit of an object entering the sensing area by shooting a high-pixel image of it, and accordingly choose high-pixel recording, low-pixel recording, or no recording at all, thereby improving the endurance of the intelligent hunting camera 100. In some embodiments, low-power infrared sensing technology can additionally be applied in this scenario to further reduce the power consumption of the intelligent hunting camera 100.
Please refer to fig. 8, which is a schematic flow chart illustrating another example of the artificial intelligence-based moving organism shooting method according to an embodiment of the present application.
In connection with the embodiment shown in fig. 6, between the step S602 and the step S605, the steps S801 and S802 may be executed:
S801, caching the infrared induction characteristic data;
when the intelligent hunting camera 100 determines that a new living creature enters the infrared sensing area and the current shooting mode is the biodiversity mode, the infrared sensing characteristic data sensed by the infrared monitor 107 at this time can be cached. The infrared sensing characteristic data can comprise the shape and the size of the triggered infrared radiation change area, the radiation value of the characteristic point in the area and the like.
S802, determining whether infrared induction characteristic data with the similarity degree exceeding a preset approximate threshold exists in a diversity complete result base;
the intelligent hunting camera 100 maintains a continuously updated complete diversity result library, which records a corresponding relationship between a complete feature identifier of an object and infrared induction feature data caused by the object. The complete object refers to an object having complete near and far habits determined in step S606. The correspondence relationship is recorded and updated in step S803.
That is, the complete result library of diversity records the infrared sensing characteristic data of all the objects whose near-distance and far-distance habits have been determined by the intelligent hunting camera 100 to be complete.
Fig. 9 is a schematic diagram illustrating exemplary content stored in a diversity complete result library according to an embodiment of the present application. The diversity complete result library records the identifiers of a tiger, a lion, a rabbit, and other objects (such as an object B), all of which have complete short-distance and long-distance habits. Each object identifier corresponds to the infrared sensing characteristic data caused when that object enters the infrared sensing area. For example, when the tiger enters the infrared sensing area, the shape and size of the resulting infrared radiation change area are S1, and the set of infrared radiation values at the feature points selected by the feature point model is F1. These differ from the data produced when the lion enters the infrared sensing area, where the shape and size of the change area are S2 and the set of feature point radiation values is F2. Similarly, the shape and size S3 of the infrared radiation change area caused when the rabbit enters the infrared sensing area and the corresponding feature point radiation value set F3 are also recorded in the diversity complete result library. Infrared sensing characteristic data of many other objects (such as the object B) may also be recorded, which is not limited herein.
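One possible in-memory layout of such a library is sketched below, mirroring the fig. 9 example; the dictionary structure and the placeholder strings standing in for the S/F data are assumptions made for illustration.

from collections import defaultdict
from typing import Dict, List

# Feature identifier of a habit-complete object -> list of infrared
# sensing characteristic data records that object has produced.
diversity_complete_library: Dict[str, List[dict]] = defaultdict(list)

# Content corresponding to the fig. 9 example.
diversity_complete_library["tiger"].append({"region": "S1", "feature_values": "F1"})
diversity_complete_library["lion"].append({"region": "S2", "feature_values": "F2"})
diversity_complete_library["rabbit"].append({"region": "S3", "feature_values": "F3"})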
After the current infrared sensing characteristic data is cached, the intelligent hunting camera 100 can compare it with the infrared sensing characteristic data of habit-complete objects stored in the diversity complete result library and determine whether any stored data is closer to the current data than the preset approximate threshold, as sketched below.
If such data exists, the moving organism entering the infrared sensing area is an object whose habits are already completely recorded, and step S608 can be triggered directly without shooting a high-pixel image for the habit completeness judgment.
If no such data exists, the moving organism entering the infrared sensing area may not be an object with complete habits, and step S605 is triggered: the high-pixel image is shot and the habit sampling completeness analysis intelligent model performs the corresponding judgment and processing.
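A minimal sketch of the comparison in step S802 follows. The embodiment does not fix a similarity metric or a threshold value, so the metric used here (based on the mean absolute difference of shared feature-point radiation values) and the default threshold of 0.9 are purely illustrative.

from typing import Dict, Iterable

def similarity(a: Dict[int, float], b: Dict[int, float]) -> float:
    # Toy similarity over the feature points the two records share:
    # 1 / (1 + mean absolute difference); 1.0 means identical values.
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    mean_diff = sum(abs(a[k] - b[k]) for k in shared) / len(shared)
    return 1.0 / (1.0 + mean_diff)

def matches_complete_object(current: Dict[int, float],
                            stored_records: Iterable[Dict[int, float]],
                            approx_threshold: float = 0.9) -> bool:
    # True if any stored record of a habit-complete object is closer to the
    # current data than the preset approximate threshold.
    return any(similarity(current, rec) >= approx_threshold for rec in stored_records)

A True result corresponds to the branch that jumps to step S608; a False result corresponds to the branch that proceeds to step S605.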
In connection with the embodiment shown in fig. 6, after step S608 is executed, step S803 may also be executed:
S803, recording the infrared complete correspondence into the diversity complete result library;
When the short-distance and long-distance habits of the moving organism that triggered this infrared sensing are both complete and video shooting is therefore not started, the intelligent hunting camera 100 can record the correspondence between the feature identifier of the object and this infrared sensing characteristic data, as this infrared complete correspondence, into the diversity complete result library. The feature identifier uniquely identifies the object within the diversity complete result library; it may be, for example, the name of the object, a label of the object, or part of a feature value of the object, which is not limited herein. For example, if the object is a first object, the infrared complete correspondence is the correspondence between this infrared sensing characteristic data and the feature identifier of the first object.
It should be noted that, if the diversity complete result library already contains a correspondence between the feature identifier of the object and previously acquired infrared sensing characteristic data, this infrared complete correspondence may still be added, making the diversity complete result library more comprehensive. For example, suppose a first infrared complete correspondence has already been recorded: the feature identifier of the first object corresponds to the infrared sensing characteristic data acquired when the first object entered the infrared sensing area for the Nth time. Then, when the first object enters the infrared sensing area again, a second infrared complete correspondence may also be added: the feature identifier of the first object corresponds to the infrared sensing characteristic data acquired when the first object entered the infrared sensing area for the (N+1)th time. In some embodiments, only non-duplicate correspondences between the feature identifier of the first object and its infrared sensing characteristic data may be retained in the diversity complete result library, which is not limited herein.
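Step S803 could then be implemented along the lines of the following sketch, which also covers the optional de-duplicated variant; the function name and the keep_duplicates switch are assumptions made for illustration.

from typing import Dict, List

def record_infrared_complete_correspondence(library: Dict[str, List[dict]],
                                            feature_id: str,
                                            ir_feature_data: dict,
                                            keep_duplicates: bool = True) -> None:
    # Append this capture's infrared complete correspondence to the
    # diversity complete result library. With keep_duplicates=False,
    # identical records for the same feature identifier are stored only once.
    records = library.setdefault(feature_id, [])
    if keep_duplicates or ir_feature_data not in records:
        records.append(ir_feature_data)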
In the embodiment of the present application, the intelligent hunting camera 100 continuously updates the diversity complete result library, which records the correspondence between the feature identifiers of habit-complete objects and their infrared sensing characteristic data. When the infrared sensing characteristic data shows that the moving organism entering the infrared sensing area this time is a habit-complete object, neither the high-pixel image shooting nor the judgment by the habit sampling completeness analysis intelligent model is triggered, which further saves the electric energy of the intelligent hunting camera 100 while still meeting the needs of biodiversity habit research.
In the above embodiment, comparing the infrared sensing characteristic data against the diversity complete result library reduces repeated shooting of habit-complete creatures by a single intelligent hunting camera 100. In some embodiments, to acquire biodiversity data over a wider range, multiple intelligent hunting cameras 100 are typically deployed across a large area. These cameras often shoot repeated, low-value material; this problem can be solved by sharing the diversity complete result library, greatly extending the endurance time of the multiple intelligent hunting cameras 100.
For convenience of description and understanding, only two intelligent hunting cameras 100 (intelligent hunting camera A and intelligent hunting camera B) are taken as an example in the embodiment of the present application; in practical applications, more intelligent hunting cameras 100 may cooperate with each other, which is not limited herein.
In some embodiments, intelligent hunting camera A may also be referred to as a first intelligent hunting camera and intelligent hunting camera B as a second intelligent hunting camera; in other embodiments, intelligent hunting camera A may be referred to as a second intelligent hunting camera and intelligent hunting camera B as a first intelligent hunting camera, and either camera may also be referred to by other names, which is not limited herein.
The following describes an embodiment of the present application in three stages with reference to a specific application scenario example diagram shown in fig. 11. Please refer to fig. 10, which is a schematic flowchart illustrating an artificial intelligence-based live creature shooting method according to an embodiment of the present application.
Stage one: the content of object A shot by intelligent hunting camera B is incomplete;
S1001, intelligent hunting camera B determines that a moving organism has entered its infrared sensing area;
When object A enters the infrared sensing area of intelligent hunting camera B, intelligent hunting camera B can determine that a moving organism has entered the infrared sensing area.
S1002, determining a shooting mode as a biodiversity mode;
S1003, determining that the diversity complete result library contains no infrared sensing characteristic data whose similarity to the current infrared sensing characteristic data exceeds the preset approximate threshold;
S1004, shooting a high-pixel image;
S1005, determining, based on the habit sampling completeness analysis intelligent model, that the short-distance habit is incomplete;
S1006, starting to shoot a high-pixel video;
S1007, stopping shooting;
After a very short period of high-pixel recording, the moving organism leaves the infrared sensing area of intelligent hunting camera B, and intelligent hunting camera B stops shooting.
The specific execution process of steps S1001 to S1007 is similar to the steps in the embodiment shown in fig. 6 and the embodiment shown in fig. 8, and is not described here again.
In stage one, the content previously shot by intelligent hunting camera B does not cover the short-distance habit of object A. Therefore, after object A enters the infrared sensing range of intelligent hunting camera B, the shot high-pixel image is analyzed by the habit sampling completeness analysis intelligent model, and since the short-distance habit is determined to be incomplete, the camera directly starts high-pixel video recording.
Stage two: the content of object A shot by intelligent hunting camera A is complete;
S1008, determining that a moving organism has entered the infrared sensing area;
After a period of time, object A moves into the infrared sensing area of intelligent hunting camera A, which determines that a moving organism has entered its infrared sensing area.
S1009, determining that the diversity complete result library contains infrared sensing characteristic data whose similarity to the current infrared sensing characteristic data exceeds the preset approximate threshold;
S1010, not starting shooting;
The specific execution process of steps S1008 to S1010 is similar to that of the embodiment shown in fig. 8 and is not described here again.
In stage two, as shown in fig. 11 (a), intelligent hunting camera A has previously captured enough data of object A (a tiger); that is, the short-distance and long-distance habits of object A are complete, and the infrared sensing characteristic data of object A has been recorded into its diversity complete result library. After object A enters the infrared sensing area of intelligent hunting camera A, the camera can directly determine, by comparing the current infrared sensing characteristic data with the data in the diversity complete result library, that object A is a habit-complete object, and therefore does not start shooting.
Stage three: the content of object A shot by intelligent hunting camera B is still incomplete, but because the diversity complete result library is synchronized between the cameras, intelligent hunting camera B no longer shoots object A, which intelligent hunting camera A has already captured completely.
S1011, synchronizing the diversity complete result library among the intelligent hunting cameras;
The intelligent hunting cameras can synchronize the diversity complete result library according to certain rules or conditions. For example, synchronization may be performed once every preset period (e.g., one day), or may be triggered whenever the diversity complete result library changes, which is not limited herein.
Each intelligent hunting camera can integrate a communication module, through which multiple intelligent hunting cameras communicate with one another. To save power, the communication modules of the multiple intelligent hunting cameras can be switched on simultaneously at fixed intervals to enable this communication, which is not limited herein.
When synchronizing, an intelligent hunting camera can transmit its diversity complete result library to all other reachable intelligent hunting cameras. After receiving one or more diversity complete result libraries, a camera can merge all of their data into its own library. To reduce transmission overhead, only the data that has changed since the last synchronization may be transmitted to the other reachable intelligent hunting cameras, as sketched below, which is not limited herein.
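A rough sketch of this delta-based synchronization is given below; the record layout, the function names, and the trigger check correspond to the optional variants mentioned in this embodiment but are otherwise assumptions.

from typing import Dict, List

def delta_since_last_sync(library: Dict[str, List[dict]],
                          last_synced: Dict[str, List[dict]]) -> Dict[str, List[dict]]:
    # Collect only the correspondences added since the previous
    # synchronization, so that less data has to be transmitted.
    delta: Dict[str, List[dict]] = {}
    for feature_id, records in library.items():
        known = last_synced.get(feature_id, [])
        new_records = [r for r in records if r not in known]
        if new_records:
            delta[feature_id] = new_records
    return delta

def merge_received(library: Dict[str, List[dict]],
                   received: Dict[str, List[dict]]) -> None:
    # Fold another camera's diversity complete result library data
    # into the local library, skipping records already present.
    for feature_id, records in received.items():
        local = library.setdefault(feature_id, [])
        local.extend(r for r in records if r not in local)

def should_sync(newly_added_count: int, preset_sync_value: int) -> bool:
    # One possible preset synchronization condition: enough new infrared
    # complete correspondences have been added since the last sync.
    return newly_added_count > preset_sync_value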
As shown in fig. 11 (a), intelligent hunting camera A can synchronize its diversity complete result library to intelligent hunting camera B; this library contains the infrared sensing characteristic data of object A, which is not yet present in the diversity complete result library of intelligent hunting camera B.
S1012, determining that a moving organism enters an infrared sensing area;
When object A enters the infrared sensing area of intelligent hunting camera B again, intelligent hunting camera B determines that a moving organism has entered the infrared sensing area.
S1013, determining that the diversity complete result library contains infrared sensing characteristic data whose similarity to the current infrared sensing characteristic data exceeds the preset approximate threshold;
Because the data from the diversity complete result library of intelligent hunting camera A has been synchronized, the diversity complete result library of intelligent hunting camera B now also contains the infrared sensing characteristic data of object A. By comparing the current infrared sensing characteristic data with the data in its diversity complete result library, intelligent hunting camera B can determine that the library contains infrared sensing characteristic data whose similarity to the current data exceeds the preset approximate threshold; that is, object A is an object whose habits are already completely recorded.
S1014, not starting shooting.
As shown in fig. 11 (b), although intelligent hunting camera B itself does not store enough media to cover the short-distance and long-distance habits of object A, it does not start shooting, because its diversity complete result library contains infrared sensing characteristic data whose similarity to the data produced when object A entered the infrared sensing area this time exceeds the preset approximate threshold.
In the embodiment of the present application, synchronizing the diversity complete result library among the intelligent hunting cameras means that no camera needs to shoot an object that any communicable intelligent hunting camera has already captured completely. This greatly reduces the shooting of repeated habit data, extends the endurance time of the intelligent hunting cameras, and prevents repetitive, low-value data from occupying their storage space.
The functional modular structure of the intelligent hunting camera 100 in the embodiment of the present application is described below with reference to the above artificial intelligence-based moving organism shooting method. Please refer to fig. 12, which is a schematic diagram of the modular structure of the intelligent hunting camera 100 according to an embodiment of the present application.
The intelligent hunting camera 100 includes:
the image shooting module 1201 is used for shooting a high-pixel image under the condition that a new moving organism enters the infrared sensing area and the current shooting mode is an ecological diversity mode;
a completeness intelligent analysis module 1202, configured to input the high-pixel image and all media captured in the gallery of the intelligent hunting camera into a habit sampling completeness analysis intelligent model to obtain the completeness of the acquired habit of a first object; wherein the first object is a moving organism identified from the high-pixel image; the completeness of the habit is one of three levels: the short-distance habit is incomplete; the short-distance habit is complete but the long-distance habit is incomplete; or the short-distance and long-distance habits are both complete;
a high-pixel video recording module 1203, configured to start shooting a high-pixel video when it is determined that the completeness of the acquired habit of the first object is that the short-distance habit is incomplete;
a low-pixel video module 1204, configured to start shooting a low-pixel video when it is determined that the completeness of the acquired habit of the first object is that the short-distance habit is complete but the long-distance habit is incomplete; the resolution of a low-pixel image obtained from the low-pixel video is less than 1/2 of the resolution of the high-pixel image;
a shooting blocking module 1205, configured to not start video shooting when it is determined that the completeness of the acquired habit of the first object is that the short-distance and long-distance habits are both complete.
Optionally, in some embodiments, the completeness intelligent analysis module 1202 may specifically include:
an object identification unit 12021, configured to identify the first object in the high-pixel image through the habit sampling completeness analysis intelligent model;
a situation coverage analysis unit 12022, configured to analyze, through the habit sampling completeness analysis intelligent model, whether the N preset short-distance situations and M preset long-distance situations of the first object are already included in all captured media (a sketch of this coverage check follows the unit list below); N and M are both positive integers greater than or equal to 2;
a short-distance result unit 12023, configured to receive the result, output by the habit sampling completeness analysis intelligent model, that the short-distance habit is incomplete, if it is determined that the N preset short-distance situations of the first object are not all included in the captured media;
a long-distance result unit 12024, configured to receive the result, output by the habit sampling completeness analysis intelligent model, that the short-distance habit is complete but the long-distance habit is incomplete, if it is determined that the N preset short-distance situations of the first object are all included in the captured media but the M long-distance situations are not;
a complete result unit 12025, configured to receive the result, output by the habit sampling completeness analysis intelligent model, that the short-distance and long-distance habits are both complete, if it is determined that the N preset short-distance situations and the M long-distance situations of the first object are all included in the captured media.
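The following sketch illustrates the coverage check performed by units 12022 to 12025; how an individual situation is detected in the media is left open by the embodiment, so the covered-situation index sets are assumed inputs.

from typing import Set

def habit_completeness(covered_short: Set[int], covered_long: Set[int],
                       n_short: int, m_long: int) -> str:
    # Situations are assumed to be indexed 0..N-1 and 0..M-1; the habit is
    # complete only when every preset situation appears in the captured media.
    short_complete = set(range(n_short)).issubset(covered_short)
    long_complete = set(range(m_long)).issubset(covered_long)
    if not short_complete:
        return "short_distance_incomplete"
    if not long_complete:
        return "short_complete_long_incomplete"
    return "short_and_long_complete"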
Optionally, in some embodiments, the intelligent hunting camera 100 may further include:
a mode determination module 1206 for determining whether a currently set photographing mode is a normal mode or the ecological diversity mode;
the high-pixel video recording module 1203 is further configured to start shooting the high-pixel video recording under the condition that it is determined that the shooting mode is the normal mode;
a shooting stopping module 1207, configured to control to stop shooting when it is determined that there are no more moving creatures in the infrared sensing area.
Optionally, in some embodiments, the intelligent hunting camera 100 may further include:
an infrared characteristic caching module 1208, configured to cache the current infrared sensing characteristic data, where the infrared sensing characteristic data includes the shape and size of the triggered infrared radiation change area and the radiation values of feature points in the area;
an infrared characteristic comparison module 1209, configured to determine whether the diversity complete result library contains infrared sensing characteristic data whose similarity to the current infrared sensing characteristic data exceeds a preset approximate threshold; the diversity complete result library records a plurality of infrared complete correspondences, where one infrared complete correspondence is the correspondence between the feature identifier of an object and the infrared sensing characteristic data caused when that object entered the infrared sensing area, recorded when the intelligent hunting camera determined that the completeness of the acquired habit of the object was that the short-distance and long-distance habits are both complete;
the shooting blocking module 1205 is further configured to not start video shooting when it is determined that the diversity complete result library contains infrared sensing characteristic data whose similarity to the current infrared sensing characteristic data exceeds the preset approximate threshold.
Optionally, in some embodiments, the intelligent hunting camera 100 may further include:
an infrared repeated comparison module 1210, configured to determine, when the diversity complete result library is found to contain infrared sensing characteristic data whose similarity to the current data exceeds the preset approximate threshold, whether the current infrared sensing characteristic data is identical to that matching data;
a cache clearing module 1211, configured to clear the cached current infrared sensing characteristic data if they are identical;
an infrared complete establishing module 1212, configured to, if they are different, establish a correspondence between the feature identifier of the object associated with the matching infrared sensing characteristic data in the diversity complete result library and the current infrared sensing characteristic data, as a new infrared complete correspondence;
an infrared complete adding module 1213, configured to record the new infrared complete correspondence into the diversity complete result library.
Optionally, in some embodiments, the intelligent hunting camera 100 may further include:
a complete library synchronization module 1214, configured to, when a preset synchronization condition is met, send the data in its own diversity complete result library to the other intelligent hunting cameras within communication range and receive the diversity complete result library data sent by those other intelligent hunting cameras;
a complete library updating module 1215, configured to process the received data from the diversity complete result libraries of the other intelligent hunting cameras, so that its own diversity complete result library contains the infrared complete correspondences from the diversity complete result libraries of all other intelligent hunting cameras within communication range.
Optionally, in some embodiments, the preset synchronization condition is that the number of the infrared complete correspondences newly added in the diversity complete result library exceeds a preset synchronization value.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.
As used in the above embodiments, the term "when …" may be interpreted to mean "if …" or "after …" or "in response to determination …" or "in response to detection of …", depending on the context. Similarly, the phrase "in determining …" or "if (a stated condition or event) is detected" may be interpreted to mean "if … is determined" or "in response to … is determined" or "in response to (a stated condition or event) is detected", depending on the context.
In the above embodiments, all or part of the implementation may be realized by software, hardware, firmware, or any combination thereof. When implemented in software, it may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. The procedures or functions described in accordance with the embodiments of the application are all or partially generated when the computer program instructions are loaded and executed on a computer. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device. The computer instructions may be stored in a computer readable storage medium or transmitted from one computer readable storage medium to another, for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, fiber optic, digital subscriber line) or wirelessly (e.g., infrared, wireless, microwave, etc.). The computer-readable storage medium can be any available medium that can be accessed by a computer or a data storage device, such as a server, a data center, etc., that incorporates one or more of the available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid state disk), among others.
One of ordinary skill in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by hardware related to instructions of a computer program, which may be stored in a computer-readable storage medium, and when executed, may include the processes of the above method embodiments. And the aforementioned storage medium includes: various media capable of storing program codes, such as ROM or RAM, magnetic or optical disks, etc.

Claims (10)

1. An artificial intelligence-based moving organism shooting method applied to an intelligent hunting camera is characterized by comprising the following steps:
under the condition that a new moving organism enters an infrared sensing area and the current shooting mode is an ecological diversity mode, shooting a high-pixel image by the intelligent hunting camera;
the intelligent hunting camera inputs the high-pixel image and all media shot in the gallery of the intelligent hunting camera into a habit sampling completeness analysis intelligent model to obtain the completeness of the acquired habit of a first object; wherein the first object is a moving organism identified from the high-pixel image; the completeness of the habit is one of three levels: the short-distance habit is incomplete; the short-distance habit is complete but the long-distance habit is incomplete; or the short-distance and long-distance habits are both complete;
under the condition that the completeness of the acquired habit of the first object is determined to be that the short-distance habit is incomplete, the intelligent hunting camera starts shooting a high-pixel video;
under the condition that the completeness of the acquired habit of the first object is determined to be that the short-distance habit is complete but the long-distance habit is incomplete, the intelligent hunting camera starts shooting a low-pixel video; wherein the resolution of a low-pixel image obtained from the low-pixel video is less than 1/2 of the resolution of the high-pixel image;
under the condition that the completeness of the acquired habit of the first object is determined to be that the short-distance and long-distance habits are both complete, the intelligent hunting camera does not start video shooting.
2. The method of claim 1, wherein the step of the intelligent hunting camera inputting the high-pixel image and all media shot in the gallery of the intelligent hunting camera into the habit sampling completeness analysis intelligent model to obtain the completeness of the acquired habit of the first object specifically comprises:
the intelligent hunting camera identifies the first object in the high-pixel image through the habit sampling completeness analysis intelligent model;
the intelligent hunting camera analyzes, through the habit sampling completeness analysis intelligent model, whether the N preset short-distance situations and M preset long-distance situations of the first object are included in all media that have been shot; N and M are both positive integers greater than or equal to 2;
in the case that it is determined that the N preset short-distance situations of the first object are not all included in the media that have been shot, the intelligent hunting camera receives the result, output by the habit sampling completeness analysis intelligent model, that the short-distance habit is incomplete;
in the case that it is determined that the N preset short-distance situations of the first object are all included in the media that have been shot but the M long-distance situations of the first object are not, the intelligent hunting camera receives the result, output by the habit sampling completeness analysis intelligent model, that the short-distance habit is complete but the long-distance habit is incomplete;
in the case that it is determined that the N preset short-distance situations and the M long-distance situations of the first object are all included in the media that have been shot, the intelligent hunting camera receives the result, output by the habit sampling completeness analysis intelligent model, that the short-distance and long-distance habits are both complete.
3. The method of claim 1, wherein said step of said intelligent hunting camera capturing a high pixel image is preceded by the step of:
the intelligent hunting camera determines whether a currently set shooting mode is a normal mode or the ecological diversity mode;
in the case that it is determined that the shooting mode is the normal mode, the intelligent hunting camera starts shooting the high-pixel video;
in the event that it is determined that there are no more living creatures in the infrared sensing area, the intelligent hunting camera stops shooting.
4. The method according to any one of claims 1 to 3, wherein, in the case that it is determined that the shooting mode is the ecological diversity mode, before the step of the intelligent hunting camera shooting a high-pixel image, the method further comprises:
the intelligent hunting camera caches the current infrared sensing characteristic data, wherein the infrared sensing characteristic data comprises the shape and size of a triggered infrared radiation change area and the radiation values of feature points in the area;
the intelligent hunting camera determines whether a diversity complete result library contains infrared sensing characteristic data whose similarity to the current data exceeds a preset approximate threshold; the diversity complete result library records a plurality of infrared complete correspondences, wherein one infrared complete correspondence is the correspondence between the feature identifier of an object and the infrared sensing characteristic data caused when that object entered the infrared sensing area, recorded when the intelligent hunting camera determined that the completeness of the acquired habit of the object was that the short-distance and long-distance habits are both complete;
under the condition that the diversity complete result library contains infrared sensing characteristic data whose similarity to the current data exceeds the preset approximate threshold, the intelligent hunting camera does not start video shooting.
5. The method of claim 4, further comprising:
under the condition that the diversity complete result library contains infrared sensing characteristic data whose similarity to the current data exceeds the preset approximate threshold, the intelligent hunting camera determines whether the current infrared sensing characteristic data is identical to that matching infrared sensing characteristic data;
if they are identical, the intelligent hunting camera clears the cached current infrared sensing characteristic data;
if they are different, the intelligent hunting camera establishes a correspondence between the feature identifier of the object associated with the matching infrared sensing characteristic data in the diversity complete result library and the current infrared sensing characteristic data, as a new infrared complete correspondence;
and the intelligent hunting camera records the new infrared complete correspondence into the diversity complete result library.
6. The method of claim 4, further comprising:
when a preset synchronization condition is met, the intelligent hunting camera sends the data in its own diversity complete result library to the other intelligent hunting cameras within communication range and receives the diversity complete result library data sent by those other intelligent hunting cameras;
the intelligent hunting camera processes the received data from the diversity complete result libraries of the other intelligent hunting cameras, so that its own diversity complete result library contains the infrared complete correspondences from the diversity complete result libraries of all other intelligent hunting cameras within communication range.
7. The method of claim 6, wherein the preset synchronization condition is that the number of infrared complete correspondences newly added to the diversity complete result library exceeds a preset synchronization value.
8. An intelligent hunting camera, characterized in that the intelligent hunting camera comprises: a processor, a memory, an infrared monitor, and a camera;
the infrared monitor is used for receiving infrared radiation data of the infrared sensing area and transmitting the infrared radiation data to the processor; the camera is used for receiving an instruction of the processor to start or stop shooting and transmitting a shot image to the processor;
the memory is coupled with the processor and is configured to store computer program code, the computer program code comprising computer instructions, and the processor invokes the computer instructions to cause the intelligent hunting camera to perform the method of any one of claims 1-7.
9. A computer program product comprising instructions that, when run on a smart hunting camera, cause the smart hunting camera to perform the method of any one of claims 1-7.
10. A computer-readable storage medium comprising instructions that, when executed on a smart hunting camera, cause the smart hunting camera to perform the method of any one of claims 1-7.
CN202211450056.8A 2022-11-19 2022-11-19 Artificial intelligence-based active organism shooting method and intelligent hunting camera Active CN115835012B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211450056.8A CN115835012B (en) 2022-11-19 2022-11-19 Artificial intelligence-based active organism shooting method and intelligent hunting camera

Publications (2)

Publication Number Publication Date
CN115835012A true CN115835012A (en) 2023-03-21
CN115835012B CN115835012B (en) 2023-09-15

Family

ID=85529362

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211450056.8A Active CN115835012B (en) 2022-11-19 2022-11-19 Artificial intelligence-based active organism shooting method and intelligent hunting camera

Country Status (1)

Country Link
CN (1) CN115835012B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090033792A1 (en) * 2007-08-02 2009-02-05 Sanyo Electric Co., Ltd. Image Processing Apparatus And Method, And Electronic Appliance
KR101033237B1 (en) * 2011-02-18 2011-05-06 (주)테라테코 Multi-function detecting system for vehicles and security using 360 deg. wide image and method of detecting thereof
KR101625471B1 (en) * 2014-12-30 2016-05-30 목원대학교 산학협력단 Method and apparatus for enhancing resolution of popular low cost thermal image camera
CN106973235A (en) * 2017-04-28 2017-07-21 深圳东方红鹰科技有限公司 The image pickup method and device detected based on rpyroelectric infrared
CN108737716A (en) * 2018-03-21 2018-11-02 北京猎户星空科技有限公司 Image pickup method, device and smart machine
CN112235338A (en) * 2020-07-27 2021-01-15 北京图力普联科技有限公司 Animal husbandry breeding monitoring system with artificial intelligence
CN112637503A (en) * 2020-12-22 2021-04-09 深圳市九洲电器有限公司 Photographing apparatus, photographing method, and computer-readable storage medium
CN113411504A (en) * 2021-08-18 2021-09-17 成都大熊猫繁育研究基地 Intelligent shooting method and system for field infrared camera

Also Published As

Publication number Publication date
CN115835012B (en) 2023-09-15

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant