CN110839557B - Sow oestrus monitoring method, device and system, electronic equipment and storage medium - Google Patents


Info

Publication number
CN110839557B
CN110839557B (application CN201910984811.2A)
Authority
CN
China
Prior art keywords
sow
image
information
vulva
oestrus
Prior art date
Legal status
Active
Application number
CN201910984811.2A
Other languages
Chinese (zh)
Other versions
CN110839557A (en)
Inventor
韩旭泉
郑磊
Current Assignee
Jingdong Technology Information Technology Co Ltd
Original Assignee
Jingdong Technology Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Jingdong Technology Information Technology Co Ltd filed Critical Jingdong Technology Information Technology Co Ltd
Priority to CN201910984811.2A priority Critical patent/CN110839557B/en
Publication of CN110839557A publication Critical patent/CN110839557A/en
Application granted granted Critical
Publication of CN110839557B publication Critical patent/CN110839557B/en

Classifications

    • A: HUMAN NECESSITIES
    • A01: AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01K: ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K 29/00: Other apparatus for animal husbandry
    • A01K 29/005: Monitoring or measuring activity, e.g. detecting heat or mating
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/22: Matching criteria, e.g. proximity measures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands

Landscapes

  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Environmental Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Artificial Intelligence (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Animal Husbandry (AREA)
  • Biophysics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The application relates to a sow oestrus monitoring method, device and system, an electronic device and a storage medium. According to the technical scheme, images to be detected that include the target objects are continuously collected, state information about the sow is acquired from multiple dimensions based on the images, and whether the sow is in oestrus is comprehensively analysed and judged based on the state information. In this way, sow oestrus is monitored by computer vision, without manual inspection rounds or driving a boar past the pens, so oestrus can be discovered in real time and accurately, the sow can be bred in time, the reproduction rate of the sow is improved, labour costs are greatly reduced and monitoring efficiency is improved. At the same time, the hardware cost of the monitoring equipment is low, and because there is no contact with the sow, the risk of epidemic infection is reduced and the sow is not stressed in a way that would harm her health.

Description

Sow oestrus monitoring method, device and system, electronic equipment and storage medium
Technical Field
The application relates to the field of artificial intelligence, in particular to a sow oestrus monitoring method, device and system, electronic equipment and a storage medium.
Background
In the process of raising sows, human participation in the sow production workflow is needed. Oestrus is an important part of sow production, and monitoring it makes it possible to breed the sow in time so that she can farrow piglets.
At present, oestrus is monitored by manual inspection: a boar is driven past the pens while workers observe the sow's standing reflex, vulva colour changes and the like. When the sow is in oestrus, she must be artificially inseminated in time. Sows come into oestrus all year round, unrestricted by season, and each oestrus generally lasts about two days. The existing monitoring approach not only consumes labour and time but is also limited by the number of daily inspection rounds, so the oestrus period and the best mating opportunity are often not discovered in time.
Disclosure of Invention
In order to solve the technical problems or at least partially solve the technical problems, the application provides a sow oestrus monitoring method, a sow oestrus monitoring device, a sow oestrus monitoring system, an electronic device and a storage medium.
In a first aspect, the application provides a sow oestrus monitoring method, which comprises the following steps:
acquiring an image to be detected, wherein the image to be detected comprises a sow, a trough and the sow's vulva;
identifying, from the image to be detected, the leftover feed information of the trough, the posture information of the sow and the sow vulva oestrus information;
and inputting the leftover feed information, the posture information and the sow vulva oestrus information into a pre-trained oestrus judgment model to obtain an oestrus judgment result for the sow.
Optionally, the method further includes:
and when the oestrus judgment result indicates that the sow is in oestrus, executing a preset reminding operation.
Optionally, identifying the leftover feed information of the trough from the image to be detected includes:
identifying a trough area in the image to be detected through a trough detection model;
cropping the image to be detected according to the trough area to obtain a trough image;
and inputting the trough image into a pre-trained leftover feed detection model to obtain the leftover feed information.
Optionally, identifying the posture information of the sow from the image to be detected includes:
identifying the sow's body in the image to be detected through a pig body detection model;
cropping the sow's body out of the image to be detected to obtain a sow image;
and inputting the sow image into a pre-trained posture detection model to obtain the posture information.
Optionally, the method further includes:
acquiring the acquisition time of the image to be detected;
identifying the sow vulva oestrus information from the image to be detected comprises the following steps:
identifying the position information of the vulva in the image to be detected with a pre-trained vulva detection model;
cropping the image to be detected according to the position information to obtain a vulva image;
analysing, with a pre-trained similarity analysis model, the similarity of two vulva images whose acquisition times are separated by a first time period;
and analysing the similarity with a pre-trained classification model to obtain the sow vulva oestrus information corresponding to the vulva image.
Optionally, inputting the leftover feed information, the posture information and the sow vulva oestrus information into a pre-trained oestrus judgment model to obtain the oestrus judgment result for the sow includes:
acquiring historical leftover feed information and historical posture information corresponding to the sow;
determining feeding information of the sow according to the leftover feed information and the historical leftover feed information;
determining posture change information of the sow according to the posture information and the historical posture information;
and analysing the feeding information, the posture change information and the sow vulva oestrus information through the oestrus judgment model to obtain the oestrus judgment result.
Optionally, after acquiring the image to be detected, the method further includes:
detecting the definition and integrity of the trough, the sow and the vulva in the image to be detected through a pre-trained image detection model to obtain an image quality detection result;
and when the definition and the integrity are determined not to meet preset conditions according to the image quality detection result, excluding the image to be detected.
In a second aspect, the present application provides a sow estrus monitoring device, including:
the acquisition module is used for acquiring an image to be detected, wherein the image to be detected comprises a sow, a trough and the sow's vulva;
the identification module is used for identifying, from the image to be detected, the leftover feed information of the trough, the posture information of the sow and the sow vulva oestrus information;
and the judgment module is used for inputting the leftover feed information, the posture information and the sow vulva oestrus information into a pre-trained oestrus judgment model to obtain an oestrus judgment result for the sow.
In a third aspect, the present application provides a sow oestrus monitoring system comprising a shooting device and a processing device, wherein
the shooting device is used for shooting from behind the sow to obtain an image to be detected, the image to be detected comprising the sow, the trough and the sow's vulva;
and the processing device is used for acquiring the image to be detected; identifying, from the image to be detected, the leftover feed information of the trough, the posture information of the sow and the sow vulva oestrus information; and inputting the leftover feed information, the posture information and the sow vulva oestrus information into a pre-trained oestrus judgment model to obtain an oestrus judgment result for the sow.
Optionally, the system further includes an oestrus reminding device,
the oestrus reminding device being used for executing a preset reminding operation when the oestrus judgment result indicates that the sow is in oestrus.
In a fourth aspect, the present application provides an electronic device comprising a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory communicate with one another via the communication bus;
the memory is used for storing a computer program;
the processor is configured to implement the above method steps when executing the computer program.
In a fifth aspect, the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the above-mentioned method steps.
Compared with the prior art, the technical scheme provided by the embodiment of the application has the following advantages:
the method comprises the steps of continuously collecting images to be detected including target objects, obtaining state information of the sow from multiple dimensions based on the images, and comprehensively analyzing and judging whether the sow is oestrous or not based on the state information. Like this, the mode through computer vision monitors the sow oestrus, need not the action such as artifical patrolling and examining or driving boar, just can be real-time, accurate discovery sow oestrus for can in time breed the sow, not only improve the reproductive rate of sow, also greatly reduced human cost and monitoring efficiency. Simultaneously, this monitoring facilities's hardware cost is lower, and contactless with the sow, reduces epidemic disease infection risk, can not lead to the stress of sow to cause the sow health to receive the influence.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without inventive exercise.
Fig. 1 is a flowchart of a sow estrus monitoring method provided in an embodiment of the present application;
fig. 2 is a flowchart of a sow estrus monitoring method according to another embodiment of the present application;
fig. 3 is a flowchart of a sow estrus monitoring method according to another embodiment of the present application;
fig. 4 is a flowchart of a sow estrus monitoring method according to another embodiment of the present application;
fig. 5 is a flowchart of a sow estrus monitoring method according to another embodiment of the present application;
fig. 6 is a flowchart of a sow estrus monitoring method according to another embodiment of the present application;
fig. 7 is a flowchart illustrating a sow estrus monitoring method according to another embodiment of the present application;
fig. 8 is a block diagram of a sow estrus monitoring device provided in an embodiment of the present application;
fig. 9 is a block diagram of a sow oestrus monitoring system provided in an embodiment of the present application;
fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
First, the sow oestrus monitoring method provided by an embodiment of the present application is introduced below.
Fig. 1 is a flowchart of a sow estrus monitoring method provided in an embodiment of the present application. As shown in fig. 1, the method comprises the steps of:
step S11, acquiring an image to be detected, wherein the image to be detected comprises a sow, a trough and the sow's vulva;
step S12, identifying, from the image to be detected, the leftover feed information of the trough, the posture information of the sow and the sow vulva oestrus information;
and step S13, inputting the leftover feed information, the posture information and the sow vulva oestrus information into a pre-trained oestrus judgment model to obtain the oestrus judgment result for the sow.
In this embodiment, images to be detected that include the target objects are continuously collected, state information about the sow is obtained from multiple dimensions based on the images, and whether the sow is in oestrus is comprehensively analysed and judged based on that state information. In this way, sow oestrus is monitored by computer vision, without manual inspection rounds or driving a boar past the pens, so oestrus can be discovered in real time and accurately, the sow can be bred in time, the reproduction rate of the sow is improved, labour costs are greatly reduced and monitoring efficiency is improved. At the same time, the hardware cost of the monitoring equipment is low, and because there is no contact with the sow, the risk of epidemic infection is reduced and the sow is not stressed in a way that would harm her health.
Optionally, the method further includes:
and when the oestrus judgment result indicates that the sow is in oestrus, executing a preset reminding operation.
The preset reminding operation includes, but is not limited to, the following operations:
(1) Determining the pen information corresponding to the oestrous sow, and sending an oestrus reminder message including the pen information to a specified terminal so as to alert the feeding staff.
(2) Arranging a display device at the sow's pen, and controlling the display device of the pen where the oestrous sow is located to show an oestrus reminder;
the display device may be a display screen, on which the displayed oestrus information is controlled; alternatively, the display device may be a lamp of a specific colour that is controlled to light up when the sow is in oestrus.
(3) Controlling an electronic collar worn by the oestrous sow to emit light of a specific colour to alert the workers.
In this way, the feeding staff can mate the sow in time in response to the oestrus reminder, improving the reproduction rate of the sow.
Fig. 2 is a flowchart of a sow oestrus monitoring method according to another embodiment of the present application. As shown in fig. 2, in step S12, identifying the leftover feed information of the trough from the image to be detected includes:
step S21, identifying a trough area in the image to be detected through a trough detection model;
step S22, cropping the trough image out of the image to be detected according to the trough area;
and step S23, inputting the trough image into a pre-trained leftover feed detection model to obtain the leftover feed information.
Optionally, the leftover feed information may simply indicate whether any feed remains, or may be the proportion of feed remaining in the trough, and so on.
In this embodiment, the remaining amount of feed in the trough is determined by image recognition of the trough. In this way, the trough does not need to be checked manually; the feed surplus in the trough and the sow's feeding situation can be known accurately and in real time.
Optionally, the leftover feed detection model may be trained based on a target detection algorithm such as YOLOv1, YOLOv2, YOLOv3, R-CNN, Fast R-CNN, SPP-net, Faster R-CNN, R-FCN or SSD, or on a target detection algorithm that uses a lightweight network such as MobileNet as the backbone, e.g. MobileNet-YOLOv1, MobileNet-YOLOv2 or MobileNet-YOLOv3, in which the Darknet backbone of YOLO is replaced with MobileNet so as to improve inference speed while maintaining accuracy.
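As a minimal illustrative sketch (not the patent's implementation), assuming the leftover feed detector yields a per-pixel feed mask for the cropped trough image, the proportion of feed remaining could be derived as follows; the function name and mask format are assumptions:

```python
import numpy as np

def leftover_ratio(feed_mask: np.ndarray) -> float:
    """Fraction of trough pixels still covered by feed.

    feed_mask: boolean array where True marks pixels classified as feed
    (hypothetical output of a segmentation-style leftover detector).
    """
    return float(feed_mask.mean())

# A trough crop where feed covers the left half of the trough:
mask = np.zeros((4, 8), dtype=bool)
mask[:, :4] = True
ratio = leftover_ratio(mask)
```

A bounding-box detector, as the embodiment suggests, would instead report detected feed regions; the ratio view above is only one convenient way to express "proportion of feed remaining".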
Fig. 3 is a flowchart of a sow oestrus monitoring method according to another embodiment of the present application. As shown in fig. 3, in step S12, identifying the posture information of the sow from the image to be detected includes:
step S31, identifying the sow's body in the image to be detected through a pig body detection model;
step S32, cropping a sow image out of the image to be detected according to the sow's body;
and step S33, inputting the sow image into a pre-trained posture detection model to obtain the posture information.
Experiments show that the posture changes of a sow during oestrus differ markedly from those outside oestrus. In the absence of a boar, an oestrous sow stands for longer and changes posture more often, whereas during a "boar oestrus test" the sow remains still for a period of time. Therefore, the sow's oestrus state can be analysed from her posture information.
Optionally, the posture detection model may be trained based on convolutional neural networks such as MobileNet-YOLO, MobileNet-YOLOv1, MobileNet-YOLOv2, MobileNet-YOLOv3, Faster R-CNN and R-FCN. Sow sample images in the pen are collected by a camera; labels for the sow sample images are obtained, each label comprising a bounding box around the pig's body and a posture mark, which may be standing, sitting, lying on the left side or lying on the right side depending on the pig's posture; and the sow sample images with their labels are input into the convolutional neural network for training to obtain the posture detection model.
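To make the labelling scheme concrete, here is a hedged sketch of how the four posture classes described above might be encoded for training; the identifiers and helper are illustrative assumptions, not fixed by the patent:

```python
# Four posture classes from the labelling scheme described above
# (names are illustrative; the patent does not fix string identifiers).
POSTURES = ["standing", "sitting", "lying_left", "lying_right"]
POSTURE_TO_ID = {name: i for i, name in enumerate(POSTURES)}

def one_hot(posture: str) -> list:
    """One-hot target vector for a posture label."""
    vec = [0] * len(POSTURES)
    vec[POSTURE_TO_ID[posture]] = 1
    return vec
```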
In this embodiment, the posture information of the sow can be identified rapidly and accurately through the posture detection model, so that the subsequent oestrus judgment can be made in a timely and accurate manner.
Fig. 4 is a flowchart of a sow oestrus monitoring method according to another embodiment of the present application. As shown in fig. 4, the method further includes acquiring the acquisition time of the image to be detected. In step S12, identifying the sow vulva oestrus information from the image to be detected includes:
step S41, identifying the position information of the vulva in the image to be detected with a pre-trained vulva detection model;
step S42, cropping the image to be detected according to the position information to obtain a vulva image;
step S43, analysing, with a pre-trained similarity analysis model, the similarity of two vulva images whose acquisition times are separated by a first time period;
and step S44, analysing the similarity with a pre-trained classification model to obtain the sow vulva oestrus information corresponding to the vulva image.
Practical experience and existing data show that sows come into oestrus about 7 days after weaning. In the first few days the vulva does not change noticeably, but the closer the sow gets to oestrus, the glossier, more swollen and redder the vulva becomes. Consequently, the similarity between vulva images taken in the same time period on consecutive days is lower during oestrus than outside oestrus, so this embodiment can predict whether the sow is in oestrus at the current moment from the degree of change in the size and colour of the vulva.
To this end, a fixed camera monitors the sow continuously from the time she enters the gestation stall, collecting an image every 5 seconds. Because the vulva occupies only a small part of the frame, it must first be detected; the vulva region is then cropped out according to the detection result, yielding a sequence of vulva images. Finally, a similarity sequence is computed from two groups of vulva images of fixed duration taken in the same time period on consecutive days, and whether the sow is in oestrus at the current moment is judged by classifying this similarity sequence.
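The day-over-day comparison above can be sketched as follows. The patent uses a trained Siamese model to produce each similarity score; here, plain cosine similarity on precomputed feature vectors stands in as an assumed placeholder:

```python
import math

def cosine(u, v):
    """Cosine similarity between two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def similarity_sequence(today_feats, yesterday_feats):
    """One similarity score per time slot, pairing vulva features
    captured in the same slot on consecutive days."""
    return [cosine(u, v) for u, v in zip(today_feats, yesterday_feats)]

# Two slots whose features point in identical directions (no visible change):
seq = similarity_sequence([[1.0, 0.0], [0.0, 2.0]],
                          [[2.0, 0.0], [0.0, 1.0]])
```

The resulting sequence is what the classification model of step S44 would consume; a run of low similarities would suggest the vulva is changing, i.e. approaching oestrus.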
Optionally, the vulva detection model in step S41 may be trained based on convolutional neural networks such as MobileNet-YOLOv1, MobileNet-YOLOv2, MobileNet-YOLOv3, Faster R-CNN and R-FCN.
Optionally, the similarity analysis model in step S43 may be obtained by training a twin (Siamese) neural network built on a deep residual network (ResNet), such as Siamese-ResNet50 or Siamese-ResNet101. The deep residual networks extract the feature vectors of the two target object sample images, and the similarity of the two feature vectors is learned using contrastive loss as the loss function. The similarity analysis model may include two deep residual networks that extract the feature vectors of the two sample images in each pair simultaneously, or a single deep residual network that extracts the two feature vectors one at a time.
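The contrastive loss mentioned above has a standard per-pair form, sketched below; the margin value is an assumption, not taken from the patent:

```python
def contrastive_loss(dist, same, margin=1.0):
    """Contrastive loss for one pair of embeddings.

    dist:   Euclidean distance between the two feature vectors.
    same:   1 if the pair should be similar, 0 otherwise.
    margin: minimum desired distance for dissimilar pairs (assumed value).

    Similar pairs are pulled together (dist^2); dissimilar pairs are
    pushed apart until they exceed the margin (max(margin - dist, 0)^2).
    """
    if same:
        return dist ** 2
    return max(margin - dist, 0.0) ** 2
```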
Optionally, the classification model in step S44 may be trained based on neural networks such as VGG16, GoogLeNet and MobileNetV2. The classification model may also use a relatively simple structure, for example only 2 convolutional layers and 1 fully-connected layer: the feature vector of the sequence is obtained through the convolutional layers and classified through the fully-connected layer and a softmax function, where the softmax function maps the output into the (0, 1) interval as classification probabilities; the cross-entropy loss is back-propagated until the network converges, yielding the classification model.
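The softmax-plus-cross-entropy head described above can be sketched in a few lines (logit values here are arbitrary example numbers):

```python
import math

def softmax(logits):
    """Map raw scores into (0, 1) probabilities that sum to 1."""
    m = max(logits)                       # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def cross_entropy(probs, true_idx):
    """Cross-entropy loss for one sample given its true class index."""
    return -math.log(probs[true_idx])

p = softmax([1.0, 2.0, 3.0])
```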
In this embodiment, sow images are collected continuously, vulva images are identified within them, and the similarity between vulva images obtained at different times is used to determine whether the sow is in oestrus.
Fig. 5 is a flowchart of a sow estrus monitoring method according to another embodiment of the present application. As shown in fig. 5, step S13 includes:
step S51, acquiring historical leftover feed information and historical posture information corresponding to the sow;
step S52, determining feeding information of the sow according to the leftover feed information and the historical leftover feed information;
step S53, determining posture change information of the sow according to the posture information and the historical posture information;
and step S54, analysing the feeding information, the posture change information and the sow vulva oestrus information through the oestrus judgment model to obtain the oestrus judgment result.
Optionally, in step S52, a feeding curve for the sow can be derived from the leftover feed information and the historical leftover feed information, from which the change in the sow's feed intake before and during oestrus can be analysed.
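A hedged sketch of deriving such a feeding curve and flagging a drop in intake; the drop heuristic stands in for the patent's trained judgment model and is purely illustrative, as are the input measurements:

```python
def daily_intake(dispensed, leftover_ratios):
    """Estimated intake per day: feed dispensed minus the fraction
    observed left in the trough (hypothetical measurements)."""
    return [d * (1.0 - r) for d, r in zip(dispensed, leftover_ratios)]

def intake_dropped(intakes, window=3):
    """True when the latest intake falls below the mean of the
    preceding `window` days (illustrative heuristic)."""
    baseline = sum(intakes[-window - 1:-1]) / window
    return intakes[-1] < baseline

# Four days of 2.0 kg dispensed; leftover grows sharply on the last day:
curve = daily_intake([2.0, 2.0, 2.0, 2.0], [0.0, 0.0, 0.1, 0.6])
```

Reduced appetite around oestrus is one of the multi-dimensional signals the judgment model combines; the curve above is merely one way to expose it.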
Optionally, in step S53, the posture change information of the sow is determined in two cases:
(1) Sows not subjected to a boar test
Step S53 includes:
step B1, counting, from the posture information and the historical posture information, the number of posture changes of the sow within a preset time period and the duration of each posture;
and step B2, obtaining the posture change information of the sow from the number of posture changes and the duration of each posture.
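Steps B1 and B2 can be sketched over a sampled posture sequence; the (timestamp, posture) sampling format is an assumption for illustration:

```python
def posture_stats(samples):
    """samples: time-ordered (timestamp_seconds, posture) pairs.

    Returns (number of posture changes, seconds spent in each posture),
    attributing each interval to the posture at its start.
    """
    changes = 0
    durations = {}
    for (t0, p0), (t1, p1) in zip(samples, samples[1:]):
        durations[p0] = durations.get(p0, 0) + (t1 - t0)
        if p1 != p0:
            changes += 1
    return changes, durations

changes, durations = posture_stats(
    [(0, "standing"), (5, "standing"), (10, "sitting"), (15, "standing")]
)
```

More frequent changes and longer standing durations would, per the embodiment above, point toward oestrus in sows not exposed to a boar.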
(2) Sows subjected to a boar test
Step S53 includes:
step C1, obtaining the minimum bounding rectangle of the sow from the sow image with the pre-trained posture detection model;
step C2, when the sow is determined to be in a standing posture according to the posture information, calculating the intersection-over-union (IoU) of the two minimum bounding rectangles corresponding to adjacent acquisition times;
step C3, when the IoU is greater than or equal to a second threshold, determining that the sow has not moved; for example, when the IoU is greater than or equal to 0.98, it is determined that the sow has not moved;
step C4, when the sow is determined not to have moved, accumulating a second time period during which the sow does not move;
and step C5, obtaining the posture change information of the sow from the second time period during which the sow does not move.
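The IoU check in steps C2 and C3 can be sketched as follows; the (x1, y1, x2, y2) box format is an assumption:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union

SECOND_THRESHOLD = 0.98  # example value from the embodiment above

def sow_is_still(prev_box, curr_box):
    """Step C3: no movement when the IoU reaches the second threshold."""
    return iou(prev_box, curr_box) >= SECOND_THRESHOLD
```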
Optionally, the oestrus judgment model in step S54 may be trained based on a lightweight neural network such as Inception, Inception v3 or Xception.
In this embodiment, the feeding information, the posture change information and the sow vulva oestrus information are input into the oestrus judgment model, which synthesises the multi-dimensional state information of the sow to judge her oestrus state comprehensively, so the oestrus state can be obtained more accurately. In this way, sow oestrus is monitored by computer vision, without manual inspection rounds or driving a boar past the pens, so oestrus can be discovered in real time and accurately, the sow can be mated in time, the reproduction rate of the sow is improved, labour costs are greatly reduced and monitoring efficiency is improved.
Fig. 6 is a flowchart of a sow oestrus monitoring method according to another embodiment of the present application. As shown in fig. 6, in another embodiment, after step S11, the method further includes:
step S61, detecting the definition and integrity of the trough, the sow and the vulva in the image to be detected through a pre-trained image detection model to obtain an image quality detection result;
and step S62, when the definition and the integrity are determined not to meet preset conditions according to the image quality detection result, excluding the image to be detected.
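A hedged sketch of the screening rule in steps S61 and S62, assuming the image detection model reports a detection confidence per required part (None when the part is missing); the part names and threshold are illustrative assumptions:

```python
def passes_quality_check(detections, min_conf=0.5):
    """Keep an image only when trough, sow and vulva are all detected
    clearly and completely enough (confidence >= min_conf)."""
    required = ("trough", "sow", "vulva")
    return all(
        detections.get(part) is not None and detections[part] >= min_conf
        for part in required
    )
```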
In this embodiment, owing to shooting conditions, the railing may occlude the sow's vulva or the trough in the image to be detected, or the vulva or trough may appear blurred. Therefore, after the image to be detected is acquired, it must be screened in advance, and images in which the trough, the sow or the sow's vulva is unclear or incomplete are removed. This improves the accuracy of subsequent image recognition and hence of the oestrus judgment.
The image detection model can be trained based on a target detection neural network such as YOLOv1, YOLOv2 or YOLOv3: images containing clear troughs, sows and sow vulvas are input into the target detection network for training, yielding an image detection model that can distinguish whether an image contains a clear and complete trough, sow and sow vulva.
Fig. 7 is a flowchart of a sow estrus monitoring method according to another embodiment of the present application. As shown in fig. 7, the method involves a model including:
and the image detection model 71 is used for screening the definition and the integrity of the acquired image to be detected, and inputting the screened image to be detected into the crib detection model 72, the pig body detection model 74 and the vulva detection model 76 respectively.
And the crib detection model 72 is used for identifying the crib from the image to be detected. The trough image cut out from the image to be measured is input to the food remaining detection model 73.
The food remaining detection model 73 recognizes food remaining information from the trough image, and inputs the food remaining information to the estrus determination model 79.
And the pig body detection model 74 is used for identifying the sow pig body from the image to be detected. The sow image cropped from the image to be measured is input to the posture detection model 75.
The posture detection model 75 recognizes the posture information of the sow from the sow image, and inputs the posture information to the oestrus judgment model 79.
And a vulva detection model 76 for identifying the vulva of the sow from the image to be detected. The vulva image clipped from the image to be measured is input to the similarity analysis model 77.
And the similarity analysis model 77 is used for analyzing the similarity of the two vulva images at the interval of the first time period between the acquisition times and inputting the obtained similarity into the classification model 78.
And the classification model 78 is used for analyzing the similarity to obtain the oestrus information of the female sows corresponding to the images of the female females and inputting the oestrus information into the oestrus judgment model 79.
The oestrus judging model 79 is used for judging the oestrus according to the historical after-eating information and the historical posture information corresponding to the sow; determining feeding information of the sow according to the after-feeding information and the historical after-feeding information; determining posture change information of the sow according to the posture information and the historical posture information; and analyzing and judging the feeding information, the posture change information and the sow vulva oestrus information to obtain an oestrus judgment result.
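The fusion performed by the oestrus judgment model can be illustrated with a simple rule-based stand-in. The patent trains a model for this step; the thresholds, field meanings and majority-vote rule below are illustrative assumptions only, chosen to reflect the three cues the model combines.

```python
# Hypothetical stand-in for the trained oestrus judgment model. Oestrous sows
# typically eat less (more leftover feed), stand still more often, and show
# visible vulva changes (low similarity between images a first time period
# apart). All thresholds here are assumptions, not values from the patent.

def judge_oestrus(leftover_now, leftover_hist_avg,
                  standing_ratio_now, standing_ratio_hist,
                  vulva_similarity):
    """Combine feeding change, posture change and vulva change cues."""
    eats_less = leftover_now > leftover_hist_avg * 1.5
    stands_more = standing_ratio_now > standing_ratio_hist + 0.2
    vulva_changed = vulva_similarity < 0.6
    votes = sum([eats_less, stands_more, vulva_changed])
    return votes >= 2   # simple majority vote over the three cues

print(judge_oestrus(0.8, 0.3, 0.7, 0.4, 0.5))  # True
print(judge_oestrus(0.3, 0.3, 0.4, 0.4, 0.9))  # False
```

The trained model would learn this decision boundary from labelled data rather than use fixed thresholds; the sketch only shows how the three information streams meet in one judgment.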
In this embodiment, all the models are deployed in the processing device. After receiving an image to be detected captured by the shooting device, the processing device inputs it into the models in sequence for processing and analysis, so as to obtain the oestrus state of the sow.
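The chaining of Fig. 7 can be sketched as follows, with each trained network replaced by a stand-in callable. Only the wiring mirrors the figure; the dictionary keys, signatures and stub return values are all assumptions for illustration.

```python
# Sketch of the deployment flow: the processing device feeds each incoming
# image through the model chain of Fig. 7. Model internals are replaced by
# simple stand-in callables; only the wiring mirrors the figure.

def monitor(image, models):
    """Run one image through the model chain; returns None if screened out."""
    if not models["image_detection"](image):        # 71: quality screening
        return None                                 # unclear/incomplete image
    trough = models["trough_detection"](image)      # 72: crop trough region
    leftover = models["leftover_detection"](trough) # 73: leftover feed info
    body = models["pig_body_detection"](image)      # 74: crop sow body
    posture = models["posture_detection"](body)     # 75: posture info
    vulva = models["vulva_detection"](image)        # 76: crop vulva region
    sim = models["similarity_analysis"](vulva)      # 77: vs. earlier image
    vulva_info = models["classification"](sim)      # 78: vulva oestrus info
    # 79: fuse the three cues into the final judgment
    return models["oestrus_judgment"](leftover, posture, vulva_info)

# Stand-in models for demonstration only.
stub = {
    "image_detection": lambda img: True,
    "trough_detection": lambda img: "trough_crop",
    "leftover_detection": lambda crop: 0.7,
    "pig_body_detection": lambda img: "body_crop",
    "posture_detection": lambda crop: "standing",
    "vulva_detection": lambda img: "vulva_crop",
    "similarity_analysis": lambda crop: 0.4,
    "classification": lambda sim: "changed",
    "oestrus_judgment": lambda f, p, v: (f > 0.5 and v == "changed"),
}
print(monitor("frame.jpg", stub))  # True
```

In the actual system each entry would be a trained network loaded on the processing device, but the data flow between models is the same.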
The following are embodiments of the disclosed apparatus that may be used to perform embodiments of the disclosed methods.
Fig. 8 is a block diagram of a sow estrus monitoring device provided in an embodiment of the present application, which may be implemented as part or all of an electronic device through software, hardware or a combination of the two. As shown in fig. 8, the sow estrus monitoring device includes:
the acquisition module 81 is used for acquiring an image to be detected, wherein the image to be detected comprises a sow, a trough and a sow vulva;
the identification module 82 is used for identifying, from the image to be detected, the leftover feed information of the trough, the posture information of the sow and the sow vulva oestrus information;
and the judgment module 83 is used for inputting the leftover feed information, the posture information and the sow vulva oestrus information into a pre-trained oestrus judgment model to obtain an oestrus judgment result of the sow.
Fig. 9 is a block diagram of a sow estrus monitoring system provided in an embodiment of the present application, and as shown in fig. 9, the system includes: a photographing device 91 and a processing device 92.
The shooting device 91 is used for shooting from the rear side of the sow to obtain an image to be detected, wherein the image to be detected comprises the sow, the trough and the sow vulva.
The processing device 92 is used for acquiring the image to be detected; identifying, from the image to be detected, the leftover feed information of the trough, the posture information of the sow and the sow vulva oestrus information; and inputting the leftover feed information, the posture information and the sow vulva oestrus information into a pre-trained oestrus judgment model to obtain an oestrus judgment result of the sow.
Optionally, the system further comprises an estrus reminder 93, which is used for executing a preset reminding operation when the oestrus judgment result indicates that the sow is in oestrus.
Each sow pen is equipped with a shooting device that captures images from the rear side; a captured image includes a clear view of the sow (including the hindquarters and hind legs), the vulva and the trough. With the shooting device, images can be captured many times a day to support the sow oestrus judgment.
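Since images are captured many times a day, the "first time period" pairing used by the similarity analysis of claim 1 can be sketched as follows. The period length, tolerance and timestamp format are assumptions for illustration; the patent does not specify them.

```python
# Hypothetical pairing of captures whose acquisition times are roughly one
# "first time period" apart, producing the vulva-image pairs the similarity
# analysis model compares. Interval and tolerance values are assumptions.
from datetime import datetime, timedelta

FIRST_TIME_PERIOD = timedelta(hours=12)  # assumed interval length
TOLERANCE = timedelta(minutes=30)        # assumed matching tolerance

def pair_images(captures):
    """Pair each capture with an earlier one taken about one period before.

    `captures` is a list of (timestamp, image_id) tuples sorted by time;
    returns (earlier_id, later_id) pairs to feed the similarity model.
    """
    pairs = []
    for i, (t_later, img_later) in enumerate(captures):
        for t_earlier, img_earlier in captures[:i]:
            if abs((t_later - t_earlier) - FIRST_TIME_PERIOD) <= TOLERANCE:
                pairs.append((img_earlier, img_later))
    return pairs

captures = [
    (datetime(2019, 10, 16, 8, 0), "a"),
    (datetime(2019, 10, 16, 14, 0), "b"),
    (datetime(2019, 10, 16, 20, 0), "c"),
]
print(pair_images(captures))  # [('a', 'c')]
```

Only the 08:00 and 20:00 captures are twelve hours apart, so only that pair would be compared for vulva changes.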
An embodiment of the present application further provides an electronic device, as shown in fig. 10, the electronic device may include: the system comprises a processor 1501, a communication interface 1502, a memory 1503 and a communication bus 1504, wherein the processor 1501, the communication interface 1502 and the memory 1503 complete communication with each other through the communication bus 1504.
A memory 1503 for storing a computer program;
the processor 1501, when executing the computer program stored in the memory 1503, implements the steps of the method embodiments described above.
The communication bus mentioned for the electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown, but this is not intended to represent only one bus or one type of bus.
The communication interface is used for communication between the electronic equipment and other equipment.
The memory may include a random access memory (RAM) or a non-volatile memory (NVM), such as at least one disk memory. Optionally, the memory may also be at least one storage device located remotely from the processor.
The processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
The present application also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, carries out the steps of the method embodiments described above.
It should be noted that, for the above-mentioned apparatus, electronic device and computer-readable storage medium embodiments, since they are basically similar to the method embodiments, the description is relatively simple, and for the relevant points, reference may be made to the partial description of the method embodiments.
It is further noted that, herein, relational terms such as "first" and "second" may be used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between those entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The foregoing are merely exemplary embodiments of the present invention, which enable those skilled in the art to understand or practice the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (11)

1. A sow oestrus monitoring method is characterized by comprising the following steps:
acquiring an image to be detected, wherein the image to be detected comprises a sow, a trough and a sow vulva;
identifying, from the image to be detected, the leftover feed information of the trough, the posture information of the sow, and the sow vulva oestrus information;
inputting the leftover feed information, the posture information and the sow vulva oestrus information into a pre-trained oestrus judgment model to obtain an oestrus judgment result of the sow;
the method further comprises the following steps:
acquiring the acquisition time of the image to be detected;
the identification of the sow vulva oestrus information from the image to be detected comprises the following steps:
identifying and obtaining the position information of the vulva from the image to be detected according to a pre-trained vulva detection model;
cutting the image to be detected according to the position information to obtain a vulva image;
analyzing the similarity of the two vulva images separated by a first time period between the acquisition times according to a pre-trained similarity analysis model;
and analyzing the similarity according to a pre-trained classification model to obtain the sow vulva oestrus information corresponding to the vulva image.
2. The method of claim 1, further comprising:
when the oestrus judgment result indicates that the sow is in oestrus, executing a preset reminding operation.
3. The method as claimed in claim 1, wherein the identifying of the leftover feed information of the trough from the image to be detected comprises:
identifying a trough area from the image to be detected through a trough detection model;
cutting the image to be detected according to the trough area to obtain a trough image;
and inputting the trough image into a pre-trained leftover feed detection model to obtain the leftover feed information.
4. The method as claimed in claim 1, wherein the identifying of the posture information of the sow from the image to be detected comprises:
identifying the sow body from the image to be detected through a pig body detection model;
cutting the sow body from the image to be detected to obtain a sow image;
and inputting the sow image into a pre-trained posture detection model to obtain the posture information.
5. The method as claimed in claim 1, wherein the inputting of the leftover feed information, the posture information and the sow vulva oestrus information into a pre-trained oestrus judgment model to obtain the oestrus judgment result of the sow comprises:
acquiring historical leftover feed information and historical posture information corresponding to the sow;
determining feeding information of the sow according to the leftover feed information and the historical leftover feed information;
determining posture change information of the sow according to the posture information and the historical posture information;
and analyzing the feeding information, the posture change information and the sow vulva oestrus information through the oestrus judgment model to obtain the oestrus judgment result.
6. The method of claim 1, wherein after acquiring the image to be detected, the method further comprises:
detecting the definition and integrity of a trough, a sow and a vulva in the image to be detected through a pre-trained image detection model to obtain an image quality detection result;
and when the definition and the integrity are determined to be not in accordance with the preset conditions according to the image quality detection result, removing the image to be detected.
7. A sow oestrus monitoring device, characterized by comprising:
the acquisition module is used for acquiring an image to be detected, wherein the image to be detected comprises a sow, a trough and a sow vulva;
the identification module is used for identifying, from the image to be detected, the leftover feed information of the trough, the posture information of the sow and the sow vulva oestrus information;
the judgment module is used for inputting the leftover feed information, the posture information and the sow vulva oestrus information into a pre-trained oestrus judgment model to obtain an oestrus judgment result of the sow;
the apparatus is further configured to:
acquiring the acquisition time of the image to be detected;
the identification module is configured to:
identifying and obtaining the position information of the vulva from the image to be detected according to a pre-trained vulva detection model;
cutting the image to be detected according to the position information to obtain a vulva image;
analyzing the similarity of the two vulva images separated by a first time period between the acquisition times according to a pre-trained similarity analysis model;
and analyzing the similarity according to a pre-trained classification model to obtain the sow vulva oestrus information corresponding to the vulva image.
8. A sow oestrus monitoring system, characterized by comprising: a shooting device and a processing device, wherein
the shooting device is used for shooting from the rear side of the sow to obtain an image to be detected, and the image to be detected comprises the sow, the trough and the vulva of the sow;
the processing device is used for acquiring the image to be detected; identifying, from the image to be detected, the leftover feed information of the trough, the posture information of the sow and the sow vulva oestrus information; and inputting the leftover feed information, the posture information and the sow vulva oestrus information into a pre-trained oestrus judgment model to obtain an oestrus judgment result of the sow;
the processing device is further configured to:
acquiring the acquisition time of the image to be detected;
the processing device is used for:
identifying and obtaining the position information of the vulva from the image to be detected according to a pre-trained vulva detection model;
cutting the image to be detected according to the position information to obtain a vulva image;
analyzing the similarity of the two vulva images separated by the first time period between the acquisition times according to a pre-trained similarity analysis model;
and analyzing the similarity according to a pre-trained classification model to obtain the sow vulva oestrus information corresponding to the vulva image.
9. The system of claim 8, further comprising: an estrus reminding device,
wherein the estrus reminding device is used for executing a preset reminding operation when the oestrus judgment result indicates that the sow is in oestrus.
10. An electronic device, comprising: the system comprises a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory are communicated with each other through the communication bus;
the memory is used for storing a computer program;
the processor, when executing the computer program, implementing the method steps of any of claims 1-6.
11. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method steps of any one of claims 1 to 6.
CN201910984811.2A 2019-10-16 2019-10-16 Sow oestrus monitoring method, device and system, electronic equipment and storage medium Active CN110839557B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910984811.2A CN110839557B (en) 2019-10-16 2019-10-16 Sow oestrus monitoring method, device and system, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN110839557A CN110839557A (en) 2020-02-28
CN110839557B true CN110839557B (en) 2022-06-07

Family

ID=69597805

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910984811.2A Active CN110839557B (en) 2019-10-16 2019-10-16 Sow oestrus monitoring method, device and system, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110839557B (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111382739A (en) * 2020-03-03 2020-07-07 北京海益同展信息科技有限公司 Method, apparatus, system and computer-readable storage medium for feeding foodstuff
CN111467074B (en) * 2020-05-18 2023-11-03 京东科技信息技术有限公司 Method and device for detecting livestock status
CN111914685B (en) * 2020-07-14 2024-04-09 北京小龙潜行科技有限公司 Sow oestrus detection method and device, electronic equipment and storage medium
CN112640809A (en) * 2020-12-18 2021-04-13 中国农业大学 Sow oestrus detection method and device
CN114041426A (en) * 2020-12-31 2022-02-15 重庆市六九畜牧科技股份有限公司 Backup sow management pigsty
EP4387564A1 (en) * 2021-08-20 2024-06-26 Groupe Ro-Main Inc. Detecting estrus in animals for imsemination
CN113711944B (en) * 2021-08-27 2023-03-03 河南牧原智能科技有限公司 Sow estrus identification method, device and system
CN114403043B (en) * 2021-12-20 2022-11-29 北京市农林科学院智能装备技术研究中心 Sow oestrus searching method, device and system
CN114358163A (en) * 2021-12-28 2022-04-15 东北农业大学 Food intake monitoring method and system based on twin network and depth data
CN115119766B (en) * 2022-06-16 2023-08-18 天津农学院 Sow oestrus detection method based on deep learning and infrared thermal imaging
CN115049934B (en) * 2022-08-11 2022-12-16 山东万牧农业科技有限公司郯城分公司 Poultry feed intelligent detection method based on image processing
CN115943908A (en) * 2022-12-05 2023-04-11 中国农业科学院北京畜牧兽医研究所 Sow oestrus detection method based on adaptive navigation and related equipment
CN116439158B (en) * 2023-06-20 2023-09-12 厦门农芯数字科技有限公司 Sow oestrus checking method, system, equipment and storage medium based on infrared identification


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104936439A (en) * 2012-12-02 2015-09-23 安格瑞卡姆有限公司 Systems and methods for predicting the outcome of a state of a subject
US20150302241A1 (en) * 2012-12-02 2015-10-22 Agricam Ab Systems and methods for predicting the outcome of a state of a subject
CN107771706A (en) * 2017-09-18 2018-03-09 浙江利尔达物联网技术有限公司 A kind of oestrus of sow detection method and system
CN108717523A (en) * 2018-04-26 2018-10-30 华南农业大学 Oestrus of sow behavioral value method based on machine vision
CN108633774A (en) * 2018-05-09 2018-10-12 中国农业科学院北京畜牧兽医研究所 A kind of machine boar for oestrus of sow identification
CN109637549A (en) * 2018-12-13 2019-04-16 北京小龙潜行科技有限公司 A kind of pair of pig carries out the method, apparatus and detection system of sound detection


Similar Documents

Publication Publication Date Title
CN110839557B (en) Sow oestrus monitoring method, device and system, electronic equipment and storage medium
KR102296501B1 (en) System to determine sows' estrus and the right time to fertilize sows using depth image camera and sound sensor
CN110598658B (en) Convolutional network identification method for sow lactation behaviors
CN110741963B (en) Object state monitoring and sow oestrus monitoring method, device and system
CN110135231A (en) Animal face recognition methods, device, computer equipment and storage medium
CN110991222B (en) Object state monitoring and sow oestrus monitoring method, device and system
CN111723729A (en) Intelligent identification method for dog posture and behavior of surveillance video based on knowledge graph
CN112734731B (en) Livestock temperature detection method, device, equipment and storage medium
WO2021104007A1 (en) Method and device for animal state monitoring, electronic device, and storage medium
KR102141582B1 (en) Prediction method and the apparatus for onset time of sow farrowing by image analysis
CN111310596A (en) Animal diseased state monitoring system and method
CN110532899B (en) Sow antenatal behavior classification method and system based on thermal imaging
CN111914685B (en) Sow oestrus detection method and device, electronic equipment and storage medium
CN111476119B (en) Insect behavior identification method and device based on space-time context
CN113537064A (en) Weak pig automatic detection marking method and system
CN116824626A (en) Artificial intelligent identification method for abnormal state of animal
Gu et al. A two-stage recognition method based on deep learning for sheep behavior
CN112101291B (en) Livestock nursing method, device, medium and electronic equipment
CN113221776A (en) Method for identifying general behaviors of ruminant based on artificial intelligence
CN115937982A (en) Pig posture and behavior recognition method and system
US20230057738A1 (en) Detecting estrus in animals for insemination
CN115119766B (en) Sow oestrus detection method based on deep learning and infrared thermal imaging
CN115777560A (en) Intelligent sow feeding system based on machine vision analysis technology
CN113837087A (en) Animal target detection system and method based on YOLOv3
Kawano et al. Toward building a data-driven system for detecting mounting actions of black beef cattle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 601, 6 / F, building 2, No. 18, Kechuang 11th Street, Daxing District, Beijing, 100176

Applicant after: Jingdong Technology Information Technology Co.,Ltd.

Address before: 601, 6 / F, building 2, No. 18, Kechuang 11th Street, Daxing District, Beijing, 100176

Applicant before: Jingdong Shuke Haiyi Information Technology Co.,Ltd.

Address after: 601, 6 / F, building 2, No. 18, Kechuang 11th Street, Daxing District, Beijing, 100176

Applicant after: Jingdong Shuke Haiyi Information Technology Co.,Ltd.

Address before: 601, 6 / F, building 2, No. 18, Kechuang 11th Street, Beijing Economic and Technological Development Zone, Beijing 100176

Applicant before: BEIJING HAIYI TONGZHAN INFORMATION TECHNOLOGY Co.,Ltd.

GR01 Patent grant