Disclosure of Invention
In order to solve, or at least partially solve, the above technical problems, the present application provides a sow oestrus monitoring method, a sow oestrus monitoring device, a sow oestrus monitoring system, an electronic device and a storage medium.
In a first aspect, the application provides a sow oestrus monitoring method, which comprises the following steps:
acquiring an image to be detected, wherein the image to be detected comprises a sow, a trough and a sow vulva;
identifying, from the image to be detected, the remaining food information of the trough, the posture information of the sow and the sow vulva oestrus information;
and inputting the remaining food information, the posture information and the sow vulva oestrus information into a pre-trained oestrus judgment model to obtain an oestrus judgment result of the sow.
Optionally, the method further includes:
and when the oestrus judgment result indicates that the sow is in oestrus, executing a preset reminding operation.
Optionally, the identifying of the remaining food information of the trough from the image to be detected includes:
identifying a trough area from the image to be detected through a trough detection model;
cutting the image to be detected according to the trough area to obtain a trough image;
and inputting the trough image into a pre-trained remaining food detection model to obtain the remaining food information.
Optionally, the identifying of the posture information of the sow from the image to be detected includes:
identifying a sow pig body from the image to be detected through a pig body detection model;
cutting the sow pig body from the image to be detected to obtain a sow image;
and inputting the sow image into a pre-trained posture detection model to obtain the posture information.
Optionally, the method further includes:
acquiring the acquisition time of the image to be detected;
the identification of the sow vulva oestrus information from the image to be detected comprises the following steps:
identifying and obtaining the position information of the vulva from the image to be detected according to a pre-trained vulva detection model;
cutting the image to be detected according to the position information to obtain a vulva image;
analyzing, according to a pre-trained similarity analysis model, the similarity between two vulva images whose acquisition times are separated by a first time period;
and analyzing the similarity according to a pre-trained classification model to obtain the sow vulva oestrus information corresponding to the vulva image.
Optionally, the inputting of the remaining food information, the posture information and the sow vulva oestrus information into a pre-trained oestrus judgment model to obtain the oestrus judgment result of the sow includes:
acquiring historical remaining food information and historical posture information corresponding to the sow;
determining feeding information of the sow according to the remaining food information and the historical remaining food information;
determining posture change information of the sow according to the posture information and the historical posture information;
and analyzing and judging the feeding information, the posture change information and the sow vulva oestrus information through the oestrus judgment model to obtain the oestrus judgment result.
Optionally, after the acquiring of the image to be detected, the method further includes:
detecting the definition and integrity of a trough, a sow and a vulva in the image to be detected through a pre-trained image detection model to obtain an image quality detection result;
and when the definition and the integrity are determined to be not in accordance with the preset conditions according to the image quality detection result, excluding the image to be detected.
In a second aspect, the present application provides a sow estrus monitoring device, including:
the acquisition module is used for acquiring an image to be detected, wherein the image to be detected comprises a sow, a trough and a sow vulva;
the identification module is used for identifying, from the image to be detected, the remaining food information of the trough, the posture information of the sow and the sow vulva oestrus information;
and the judgment module is used for inputting the remaining food information, the posture information and the sow vulva oestrus information into a pre-trained oestrus judgment model to obtain an oestrus judgment result of the sow.
In a third aspect, the present application provides a sow oestrus monitoring system, comprising a shooting device and a processing device, wherein:
the shooting device is used for shooting from the rear side of the sow to obtain an image to be detected, and the image to be detected comprises the sow, the trough and the vulva of the sow;
the processing device is used for acquiring the image to be detected; identifying, from the image to be detected, the remaining food information of the trough, the posture information of the sow and the sow vulva oestrus information; and inputting the remaining food information, the posture information and the sow vulva oestrus information into a pre-trained oestrus judgment model to obtain an oestrus judgment result of the sow.
Optionally, the system further includes an oestrus reminding device,
and the oestrus reminding device is used for executing a preset reminding operation when the oestrus judgment result indicates that the sow is in oestrus.
In a fourth aspect, the present application provides an electronic device, comprising: the system comprises a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory complete mutual communication through the communication bus;
the memory is used for storing a computer program;
the processor is configured to implement the above method steps when executing the computer program.
In a fifth aspect, the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the above-mentioned method steps.
Compared with the prior art, the technical scheme provided by the embodiment of the application has the following advantages:
the method continuously collects images to be detected that include the target objects, obtains state information of the sow in multiple dimensions based on the images, and comprehensively analyzes this state information to judge whether the sow is in oestrus. In this way, sow oestrus is monitored by means of computer vision, and oestrus can be discovered in real time and accurately without manual inspection rounds or actions such as driving a boar past the sows, so that the sow can be mated in time. This not only improves the reproduction rate of the sows, but also greatly reduces labor cost and improves monitoring efficiency. Meanwhile, the hardware cost of the monitoring equipment is low, and the equipment does not contact the sow, which reduces the risk of epidemic infection and avoids stress that could harm the sow's health.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
First, a sow oestrus monitoring method provided by the embodiment of the invention is introduced below.
Fig. 1 is a flowchart of a sow estrus monitoring method provided in an embodiment of the present application. As shown in fig. 1, the method comprises the steps of:
step S11, acquiring an image to be detected, wherein the image to be detected comprises a sow, a trough and a sow vulva;
step S12, identifying, from the image to be detected, the remaining food information of the trough, the posture information of the sow and the sow vulva oestrus information;
and step S13, inputting the remaining food information, the posture information and the sow vulva oestrus information into a pre-trained oestrus judgment model to obtain the oestrus judgment result of the sow.
In this embodiment, images to be detected that include the target objects are continuously collected, state information of the sow is obtained in multiple dimensions based on the images, and this state information is comprehensively analyzed to judge whether the sow is in oestrus. In this way, sow oestrus is monitored by means of computer vision, and oestrus can be discovered in real time and accurately without manual inspection rounds or actions such as driving a boar past the sows, so that the sow can be mated in time. This not only improves the reproduction rate of the sows, but also greatly reduces labor cost and improves monitoring efficiency. Meanwhile, the hardware cost of the monitoring equipment is low, and the equipment does not contact the sow, which reduces the risk of epidemic infection and avoids stress that could harm the sow's health.
Optionally, the method further includes:
and when the oestrus judgment result indicates that the sow is in oestrus, executing a preset reminding operation.
Wherein, the preset reminding operation includes but is not limited to the following operations:
(1) determining the pen information corresponding to the sow in oestrus, and sending an oestrus reminding message including the pen information to a specified terminal to remind the breeding personnel;
(2) a display device is arranged at each sow pen, and the display device of the pen where the sow in oestrus is located can be controlled to perform an oestrus reminding display;
the display device may be a display screen, which is controlled to display the oestrus information; alternatively, the display device may be a lamp of a specific color that is controlled to light up when the sow is in oestrus;
(3) an electronic collar worn by the sow in oestrus can be controlled to emit light of a specific color to prompt the workers.
In this way, the breeding personnel can mate the sow in the corresponding pen in time according to the oestrus reminder, which improves the reproduction rate of the sows.
Fig. 2 is a flowchart of a sow estrus monitoring method according to another embodiment of the present application. As shown in fig. 2, in step S12, the identification of the remaining food information of the food trough from the image to be detected includes:
step S21, identifying a trough area from the image to be detected through a trough detection model;
step S22, cutting the trough image from the image to be detected according to the trough area;
and step S23, inputting the trough image into a pre-trained remaining food detection model to obtain the remaining food information.
Optionally, the remaining food information may simply indicate whether there is leftover feed, or may indicate the proportion of feed remaining in the trough, and the like.
In this embodiment, the amount of feed remaining in the trough is determined by performing image recognition on the trough. In this way, the feed in the trough does not need to be monitored manually; the feed margin in the trough and the feeding situation of the sow can be known accurately and in real time.
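The remaining-food information of steps S21-S23 is produced by trained models; purely as a hedged illustration, the sketch below shows how a remaining-feed ratio could be computed once a per-pixel segmentation of the cropped trough image is available. The binary-mask representation and the 5% "has leftover" threshold are assumptions for illustration, not part of the application.

```python
# Illustrative sketch (assumptions: binary feed mask, 5% threshold).
# The application itself uses a trained remaining food detection model.

def remaining_food_info(feed_mask, min_ratio=0.05):
    """feed_mask: 2D list of booleans, True where a pixel is classified as feed.

    Returns (has_leftover, ratio), where ratio is the fraction of trough
    pixels still covered by feed.
    """
    total = sum(len(row) for row in feed_mask)
    if total == 0:
        return False, 0.0
    feed = sum(sum(1 for px in row if px) for row in feed_mask)
    ratio = feed / total
    return ratio >= min_ratio, ratio
```

The same function covers both output forms mentioned above: the boolean answers "is there leftover feed", and the ratio gives the proportion remaining.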
Optionally, the remaining food detection model may be obtained by training based on a target detection algorithm, such as YOLOv1, YOLOv2, YOLOv3, R-CNN, Fast R-CNN, Faster R-CNN, SPP-net, R-FCN, SSD, etc., or a target detection algorithm that uses a lightweight network such as MobileNet as the backbone, such as MobileNet-YOLOv1, MobileNet-YOLOv2, MobileNet-YOLOv3, etc., in which the Darknet backbone of YOLO is replaced with MobileNet so as to improve inference speed while maintaining accuracy.
Fig. 3 is a flowchart of a sow oestrus monitoring method according to another embodiment of the present application. As shown in fig. 3, in step S12, the identifying of the posture information of the sow from the image to be detected includes:
step S31, identifying a sow pig body from the image to be detected through a pig body detection model;
step S32, cutting a sow image from the image to be detected according to the sow body;
and step S33, inputting the sow image into a pre-trained posture detection model to obtain the posture information.
Experiments show that the posture changes of a sow in oestrus are obviously different from those outside the oestrus period. In the absence of a boar, a sow in oestrus stands for a longer time and changes posture more often, while during a "boar test" a sow in oestrus remains still for a period of time. Therefore, the oestrus state of the sow can be analyzed through its posture information.
Optionally, the posture detection model may be trained based on a convolutional neural network such as MobileNet-YOLOv1, MobileNet-YOLOv2, MobileNet-YOLOv3, Faster R-CNN, R-FCN, and the like. Sow sample images in the pen are collected through a camera; labels of the sow sample images are obtained, each label including a bounding box framing the pig body and a posture mark, where the posture mark may be standing, sitting, lying on the left side or lying on the right side according to the posture of the pig; and the sow sample images and the corresponding labels are input into the convolutional neural network for training to obtain the posture detection model.
In this embodiment, the posture information of the sow can be rapidly and accurately identified through the posture detection model, so that oestrus judgment can be performed timely and accurately in the subsequent steps.
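As a minimal sketch of the final stage of the posture detection model described above, the snippet below maps a vector of per-class scores to one of the four posture labels; the label order is an assumption made only for this illustration.

```python
# Illustrative sketch: decoding per-class scores into a posture label.
# The label ordering here is an assumption, not fixed by the application.

POSTURES = ["standing", "sitting", "lying_left", "lying_right"]

def decode_posture(scores):
    """scores: one raw score per posture class; returns the top-scoring label."""
    if len(scores) != len(POSTURES):
        raise ValueError("expected one score per posture class")
    best = max(range(len(scores)), key=lambda i: scores[i])
    return POSTURES[best]
```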
Fig. 4 is a flowchart of a sow estrus monitoring method according to another embodiment of the present application. As shown in fig. 4, the method further includes: and acquiring the acquisition time of the image to be detected. In step S12, identifying and obtaining sow vulva oestrus information from the image to be detected includes:
step S41, identifying and obtaining the position information of the vulva from the image to be detected according to the pre-trained vulva detection model;
step S42, cutting the image to be detected according to the position information to obtain a vulva image;
step S43, analyzing, according to a pre-trained similarity analysis model, the similarity between two vulva images whose acquisition times are separated by a first time period;
and step S44, analyzing the similarity according to a pre-trained classification model to obtain the sow vulva oestrus information corresponding to the vulva image.
Practical experience and existing data show that sows generally come into oestrus about 7 days after weaning. Over time, the vulva changes little during the first few days, and the closer the sow is to oestrus, the smoother, more swollen and redder the vulva becomes. The similarity between vulva images taken in the same time period on consecutive days is therefore lower during oestrus than outside it. Accordingly, this embodiment can predict whether the sow is in oestrus at the current moment according to the degree of change in the size and color of the vulva.
To this end, the sow is monitored by a fixed camera from the time it enters the limit stall, and an image of the sow is collected every 5 seconds. Because the vulva occupies only a small part of the picture, it must first be detected; the vulva region is then cropped out according to the detection result, yielding a sequence of vulva images. Finally, a similarity sequence is computed from two groups of vulva images of a fixed duration taken in the same time period on consecutive days, and whether the sow is in oestrus at the current moment is judged by classifying this similarity sequence.
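The cross-day pairing just described can be sketched as follows, assuming the "first time period" is 24 hours and that each image is indexed by its acquisition timestamp; both choices are illustrative assumptions rather than requirements of the application.

```python
# Illustrative sketch: pair each vulva image with the image taken at the
# same time of day on the previous day (assumed period: 24 hours).

from datetime import datetime, timedelta

def day_apart_pairs(images, period=timedelta(days=1)):
    """images: dict mapping acquisition timestamp -> image (any representation).

    Returns (earlier, later) pairs whose timestamps differ by exactly `period`.
    """
    pairs = []
    for ts in sorted(images):
        prev = ts - period
        if prev in images:
            pairs.append((images[prev], images[ts]))
    return pairs
```

Feeding each pair to the similarity analysis model would then yield the similarity sequence that the classifier consumes.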
Optionally, the vulva detection model in step S41 may be obtained based on convolutional neural network training, such as MobileNet-YOLOv1, MobileNet-YOLOv2, MobileNet-YOLOv3, Faster R-CNN, R-FCN, and the like.
Optionally, the similarity analysis model in step S43 may be obtained by training a Siamese neural network built on a deep residual network (ResNet), such as Siamese-ResNet50, Siamese-ResNet101, and so on. The deep residual network extracts the feature vectors of the two target object sample images, and the similarity of the two feature vectors is computed using contrastive loss as the loss function. The similarity analysis model may include 2 deep residual networks that extract the feature vectors of the two sample images in each group in parallel, or only 1 deep residual network that extracts the feature vectors of the two sample images in each group one at a time.
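As a hedged sketch of the contrastive loss mentioned above: using the common convention that the label is 1 for a similar pair and 0 for a dissimilar pair, with the Euclidean distance between the two embedding vectors; the margin value of 1.0 is an assumption.

```python
# Illustrative sketch of contrastive loss (convention and margin assumed).
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def contrastive_loss(a, b, y, margin=1.0):
    """y = 1: pull the embeddings together; y = 0: push them apart up to `margin`."""
    d = euclidean(a, b)
    return y * d ** 2 + (1 - y) * max(margin - d, 0.0) ** 2
```

Minimizing this over labeled pairs is what drives the two ResNet branches to produce close embeddings for similar vulva images and distant embeddings for dissimilar ones.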
Optionally, the classification model in step S44 may be obtained by training based on neural networks such as VGG16, GoogLeNet, MobileNetV2, and the like. The classification model may also use a relatively simple structure, for example only 2 convolutional layers and 1 fully-connected layer: the feature vector of the sequence is obtained through the convolutional layers and classified through the fully-connected layer and a softmax function, where the softmax function maps the output into the (0, 1) interval as class probabilities; the cross-entropy loss is back-propagated until the network converges, yielding the classification model.
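The softmax mapping described above can be sketched as follows; the two-class labels (non-oestrus / oestrus) are an assumption made for this illustration.

```python
# Illustrative sketch: softmax maps raw classifier outputs into (0, 1)
# probabilities that sum to 1; the class labels are assumed.
import math

def softmax(logits):
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def classify(logits, labels=("non_oestrus", "oestrus")):
    probs = softmax(logits)
    best = max(range(len(probs)), key=lambda i: probs[i])
    return labels[best], probs[best]
```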
In the embodiment, the sow images are continuously collected, the vulva images are identified from the sow images, and the similarity between the vulva images obtained at different times is used for determining whether the sow is oestrous.
Fig. 5 is a flowchart of a sow estrus monitoring method according to another embodiment of the present application. As shown in fig. 5, step S13 includes:
step S51, acquiring the historical remaining food information and historical posture information corresponding to the sow;
step S52, determining the feeding information of the sow according to the remaining food information and the historical remaining food information;
step S53, determining the posture change information of the sow according to the posture information and the historical posture information;
and step S54, analyzing and judging the feeding information, the posture change information and the sow vulva oestrus information through the oestrus judging model to obtain an oestrus judging result.
Optionally, in step S52, a feeding curve of the sow can be obtained based on the remaining food information and the historical remaining food information, and the change in the feeding behavior of the sow between the non-oestrus period and the oestrus period can be analyzed from this curve.
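A minimal sketch of how such a feeding curve could be derived from the remaining-food history, assuming each feeding dispenses a fixed ration so that intake = ration × (1 − remaining ratio); the fixed-ration assumption is illustrative only.

```python
# Illustrative sketch (assumption: fixed ration per feeding).
# A drop in the resulting intake curve can hint at reduced appetite.

def feeding_curve(remaining_ratios, ration=1.0):
    """remaining_ratios: per-feeding fraction of feed left in the trough.

    Returns the estimated intake per feeding, in the same unit as `ration`.
    """
    return [ration * (1.0 - r) for r in remaining_ratios]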
Alternatively, in step S53, the posture change information of the sow is determined in two cases:
(I) Sow without boar test
Step S53 includes:
step B1, counting, according to the posture information and the historical posture information, the number of posture changes of the sow and the duration of each posture within a preset time period;
and step B2, obtaining the posture change information of the sow according to the number of posture changes and the duration of each posture.
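Steps B1-B2 can be sketched as follows, assuming the posture information is available as a time-ordered list of (timestamp, posture) observations; that representation is an assumption for illustration.

```python
# Illustrative sketch of steps B1-B2: count posture changes and per-posture
# durations over a window of timestamped observations. Each observation is
# assumed to hold until the next sample.

def posture_stats(observations):
    """observations: list of (timestamp_seconds, posture), sorted by time.

    Returns (change_count, durations) where durations maps each posture to
    the total seconds it was held.
    """
    changes = 0
    durations = {}
    for (t0, p0), (t1, p1) in zip(observations, observations[1:]):
        durations[p0] = durations.get(p0, 0) + (t1 - t0)
        if p1 != p0:
            changes += 1
    return changes, durations
```

A sow in oestrus would be expected to show a higher change count and longer total standing time over the same window.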
(II) Sow with boar test
Step S53 includes:
step C1, acquiring the minimum circumscribed rectangle of the sow from the sow image according to the pre-trained posture detection model;
step C2, when the sow is determined to be in a standing posture according to the posture information, calculating the intersection ratio of the two minimum circumscribed rectangles corresponding to adjacent acquisition times;
step C3, when the intersection ratio is greater than or equal to a second threshold, determining that the sow has not moved; for example, when the intersection-over-union is greater than or equal to 0.98, it is determined that the sow has not moved;
step C4, when the sow is determined not to move, counting a second time period when the sow does not move;
and step C5, obtaining the posture change information of the sow according to the second time period when the sow does not move.
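Steps C2-C3 can be sketched as follows, assuming the minimum circumscribed rectangles are axis-aligned boxes given as (x1, y1, x2, y2); in general such rectangles may be rotated, so this is a simplification for illustration.

```python
# Illustrative sketch of steps C2-C3: intersection-over-union of two
# axis-aligned boxes, with the 0.98 threshold used to decide that the
# standing sow has not moved between adjacent acquisition times.

def iou(a, b):
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))   # overlap width
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))   # overlap height
    inter = iw * ih
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union else 0.0

def sow_is_still(box_prev, box_curr, threshold=0.98):
    return iou(box_prev, box_curr) >= threshold
```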
Optionally, the oestrus judgment model in step S54 may be obtained based on lightweight neural network training, such as Inception, InceptionV3, Xception, and so on.
In this embodiment, the feeding information, the posture change information and the sow vulva oestrus information are input into the oestrus judgment model, which synthesizes the multi-dimensional state information of the sow to comprehensively judge its oestrus state, so the oestrus state can be obtained more accurately. In this way, sow oestrus is monitored by means of computer vision, and oestrus can be discovered in real time and accurately without manual inspection rounds or actions such as driving a boar past the sows, so that the sow can be mated in time; this not only improves the reproduction rate of the sows, but also greatly reduces labor cost and improves monitoring efficiency.
Fig. 6 is a flowchart of a sow estrus monitoring method according to another embodiment of the present application. As shown in fig. 6, in another embodiment, after step S11, the method further includes:
step S61, detecting the definition and integrity of the trough, the sow and the vulva in the image to be detected through a pre-trained image detection model to obtain an image quality detection result;
and step S62, when the definition and the integrity are determined to be not in accordance with the preset conditions according to the image quality detection result, excluding the image to be detected.
In this embodiment, due to shooting conditions, the fence may block the sow vulva or the trough in the image to be detected, or the sow vulva or the trough may be blurred. Therefore, after the image to be detected is acquired, it needs to be screened in advance, and images in which the trough, the sow or the sow vulva is unclear or incomplete are removed. This improves the accuracy of subsequent image recognition and, in turn, the accuracy of the oestrus judgment.
The image detection model can be obtained based on target detection neural network training, such as YOLOv1, YOLOv2, YOLOv3 and the like: images containing a clear trough, sow and sow vulva are input into the target detection neural network for training, yielding an image detection model that can distinguish whether an image contains a clear and complete trough, sow and sow vulva.
Fig. 7 is a flowchart of a sow estrus monitoring method according to another embodiment of the present application. As shown in fig. 7, the method involves a model including:
and the image detection model 71 is used for screening the definition and the integrity of the acquired image to be detected, and inputting the screened image to be detected into the crib detection model 72, the pig body detection model 74 and the vulva detection model 76 respectively.
And the crib detection model 72 is used for identifying the crib from the image to be detected. The trough image cut out from the image to be measured is input to the food remaining detection model 73.
The food remaining detection model 73 recognizes food remaining information from the trough image, and inputs the food remaining information to the estrus determination model 79.
And the pig body detection model 74 is used for identifying the sow pig body from the image to be detected. The sow image cropped from the image to be measured is input to the posture detection model 75.
The posture detection model 75 recognizes the posture information of the sow from the sow image, and inputs the posture information to the oestrus judgment model 79.
The vulva detection model 76 is used for identifying the vulva of the sow from the image to be detected. The vulva image cropped from the image to be detected is input to the similarity analysis model 77.
The similarity analysis model 77 is used for analyzing the similarity of two vulva images whose acquisition times are separated by the first time period, and inputting the obtained similarity into the classification model 78.
The classification model 78 is used for analyzing the similarity to obtain the sow vulva oestrus information corresponding to the vulva images, and inputting this information into the oestrus judgment model 79.
The oestrus judgment model 79 is used for acquiring the historical remaining food information and historical posture information corresponding to the sow; determining the feeding information of the sow according to the remaining food information and the historical remaining food information; determining the posture change information of the sow according to the posture information and the historical posture information; and analyzing and judging the feeding information, the posture change information and the sow vulva oestrus information to obtain the oestrus judgment result.
In this embodiment, all of the above models are deployed in the processing device; after receiving the images to be detected shot by the shooting device, the processing device inputs them into the models in sequence for processing and analysis to obtain the oestrus state of the sow.
The following are embodiments of the disclosed apparatus that may be used to perform embodiments of the disclosed methods.
Fig. 8 is a block diagram of a sow estrus monitoring device provided in an embodiment of the present application, which may be implemented as part or all of an electronic device through software, hardware or a combination of the two. As shown in fig. 8, the sow estrus monitoring device includes:
the acquisition module 81 is used for acquiring an image to be detected, wherein the image to be detected comprises a sow, a trough and a sow vulva;
the identification module 82 is used for identifying, from the image to be detected, the remaining food information of the trough, the posture information of the sow and the sow vulva oestrus information;
and the judgment module 83 is used for inputting the remaining food information, the posture information and the sow vulva oestrus information into a pre-trained oestrus judgment model to obtain the oestrus judgment result of the sow.
Fig. 9 is a block diagram of a sow estrus monitoring system provided in an embodiment of the present application, and as shown in fig. 9, the system includes: a photographing device 91 and a processing device 92.
And the shooting device 91 is used for shooting from the rear side of the sow to obtain an image to be detected, wherein the image to be detected comprises the sow, the trough and the vulva of the sow.
The processing device 92 is used for acquiring the image to be detected; identifying, from the image to be detected, the remaining food information of the trough, the posture information of the sow and the sow vulva oestrus information; and inputting the remaining food information, the posture information and the sow vulva oestrus information into a pre-trained oestrus judgment model to obtain an oestrus judgment result of the sow.
Optionally, the system further comprises: an estrus reminder 93. And the estrus reminding device 93 is used for executing preset reminding operation when the estrus judgment result is the sow estrus.
Each sow pen is provided with a shooting device, which shoots images from the rear side; the captured image includes a clear view of the sow (including the hindquarters and hind legs), the vulva and the trough. With the shooting device, images can be taken many times a day for sow oestrus judgment.
An embodiment of the present application further provides an electronic device, as shown in fig. 10, the electronic device may include: the system comprises a processor 1501, a communication interface 1502, a memory 1503 and a communication bus 1504, wherein the processor 1501, the communication interface 1502 and the memory 1503 complete communication with each other through the communication bus 1504.
A memory 1503 for storing a computer program;
the processor 1501 is configured to implement the steps of the above method embodiments when executing the computer program stored in the memory 1503.
The communication bus mentioned in the electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown, but this is not intended to represent only one bus or type of bus.
The communication interface is used for communication between the electronic equipment and other equipment.
The Memory may include a Random Access Memory (RAM) or a Non-Volatile Memory (NVM), such as at least one disk Memory. Optionally, the memory may also be at least one memory device located remotely from the processor.
The processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
The present application also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, carries out the steps of the above method embodiments.
It should be noted that, for the above-mentioned apparatus, electronic device and computer-readable storage medium embodiments, since they are basically similar to the method embodiments, the description is relatively simple, and for the relevant points, reference may be made to the partial description of the method embodiments.
It is further noted that, herein, relational terms such as "first" and "second," and the like, may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The foregoing are merely exemplary embodiments of the present invention, which enable those skilled in the art to understand or practice the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.