CN111914685A - Sow oestrus detection method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN111914685A
CN111914685A (application CN202010677045.8A)
Authority
CN
China
Prior art keywords
sow
oestrus
model
video
posture
Prior art date
Legal status
Granted
Application number
CN202010677045.8A
Other languages
Chinese (zh)
Other versions
CN111914685B (en)
Inventor
鞠铁柱
王宇华
耿科
张震
申光
Current Assignee
Beijing Xiaolongqianxing Technology Co ltd
Original Assignee
Beijing Xiaolongqianxing Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaolongqianxing Technology Co ltd filed Critical Beijing Xiaolongqianxing Technology Co ltd
Priority to CN202010677045.8A priority Critical patent/CN111914685B/en
Publication of CN111914685A publication Critical patent/CN111914685A/en
Application granted granted Critical
Publication of CN111914685B publication Critical patent/CN111914685B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06V 20/46 — Extracting features or characteristics from the video content, e.g. video fingerprints, representative shots or key frames
    • G06V 20/41 — Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • G06N 3/045 — Combinations of networks
    • G06N 3/049 — Temporal neural networks, e.g. delay elements, oscillating neurons or pulsed inputs
    • G06N 3/08 — Learning methods

Abstract

An embodiment of the invention provides a sow oestrus detection method and apparatus, an electronic device, and a storage medium. The method comprises: acquiring a sow monitoring video; identifying, from the monitoring video, whether the sow exhibits oestrus-specific local behaviour; extracting contours from the video images to obtain the sow contour; judging the sow's posture from the contour and excluding images in which the sow is in a sleeping posture; and, if oestrus-specific local behaviour is identified, timing from the moment the sow becomes stationary in the sleep-excluded images and judging whether the duration of the standing-still state exceeds a preset oestrus threshold; if so, the sow is judged to be in oestrus. The method enables automatic and accurate detection of sow oestrus.

Description

Sow oestrus detection method and device, electronic equipment and storage medium
Technical Field
The invention relates to the technical field of computers, and in particular to a sow oestrus detection method and apparatus, an electronic device, and a storage medium.
Background
Oestrus identification in sows is a key link in judging whether a sow can breed, conceive, and farrow normally. Common methods include observation of mental state, observation of genital changes, the back-pressure test, external observation, examination of vaginal mucus, the standing-reflex test, and boar-assisted heat checking.
At home and abroad, pig farms mainly identify sow oestrus with the back-pressure method, in which an experienced professional breeder judges the sow's behaviour: the breeder presses the sow's back or hindquarters by hand, or a teaser boar mounts the sow, and if the sow stands motionless the optimal mating window has arrived.
However, as the breeding industry moves toward concentrated, large-scale operation, the shortage of professional breeders has become a major problem in modern pig production, so a method that can automatically detect sow oestrus is urgently needed.
Disclosure of Invention
To solve the above problems in the prior art, embodiments of the present invention provide a sow oestrus detection method and apparatus, an electronic device, and a storage medium.
Specifically, the embodiment of the invention provides the following technical scheme:
in a first aspect, an embodiment of the present invention provides a method for detecting oestrus of sows, including:
acquiring a sow monitoring video, wherein the sow monitoring video is captured during boar-assisted heat checking;
identifying, from the sow monitoring video, whether the sow exhibits oestrus-specific local behaviour;
extracting contours from video images of the sow monitoring video to obtain the sow contour;
judging the sow's posture from the sow contour and excluding sleeping postures;
if oestrus-specific local behaviour is identified, starting a timer when, in the sleep-excluded video images, the sow is judged to be standing still; judging whether the duration of the standing-still state exceeds a preset oestrus threshold; if so, judging that the sow is in oestrus; otherwise, re-determining the start time of the standing-still state and repeating the judgment.
Further, identifying from the sow monitoring video whether the sow exhibits oestrus-specific local behaviour comprises the following steps:
acquiring, during the heat-checking stage, a first video covering a preset period before oestrus for sows that subsequently came into oestrus, and a second video covering a corresponding period for sows that did not;
splitting the first and second videos into video frames to obtain training samples, and dividing the samples into a training data set and a test data set;
training an initial machine learning model with the training data set as sample input and the presence or absence of oestrus-specific local behaviour as sample labels, to obtain a preliminary sow oestrus local-behaviour judgment model; the initial model comprises a CNN for feature extraction and an LSTM that performs classification on the CNN's feature-extraction results;
testing the preliminary model on the test data set and adjusting its parameters according to the test results until the predictions meet a preset accuracy condition, yielding an optimal sow oestrus local-behaviour judgment model;
and inputting the acquired sow monitoring video into the optimal model and judging from its output whether the sow exhibits oestrus-specific local behaviour.
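The CNN-plus-LSTM arrangement described above — per-frame feature extraction followed by sequence classification — can be sketched as follows. This is an illustrative toy in plain NumPy, not the patent's implementation: the single fixed convolution stage, the LSTM size, and the logistic read-out are all assumptions, and a real system would train such a model in a deep-learning framework.

```python
import numpy as np

def conv_features(frame, kernels):
    """Toy CNN stage: one valid 2-D convolution per kernel,
    ReLU, then global average pooling -> one feature per kernel."""
    H, W = frame.shape
    k = kernels.shape[1]
    feats = []
    for K in kernels:
        out = np.zeros((H - k + 1, W - k + 1))
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                out[i, j] = np.sum(frame[i:i + k, j:j + k] * K)
        feats.append(np.maximum(out, 0).mean())  # ReLU + global avg pool
    return np.array(feats)

def lstm_step(x, h, c, W, U, b):
    """One LSTM step; gate pre-activations stacked as
    [input, forget, output, candidate] along the first axis."""
    z = W @ x + U @ h + b
    n = h.size
    i, f, o = (1 / (1 + np.exp(-z[s * n:(s + 1) * n])) for s in range(3))
    g = np.tanh(z[3 * n:])
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

def classify_clip(frames, kernels, W, U, b, w_out):
    """Run the CNN over each frame, the LSTM over the resulting
    feature sequence, and a logistic read-out on the final state."""
    n = U.shape[1]
    h, c = np.zeros(n), np.zeros(n)
    for frame in frames:
        x = conv_features(frame, kernels)
        h, c = lstm_step(x, h, c, W, U, b)
    return 1 / (1 + np.exp(-(w_out @ h)))  # P(oestrus local behaviour)
```

With trained weights, clips whose probability exceeds a chosen operating point would be flagged as showing the oestrus-specific local behaviour.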
Further, extracting contours from video images of the sow monitoring video to obtain the sow contour comprises the following steps:
collecting in advance, as training samples, images of sows under different conditions such as feeding, sleeping, and oestrus;
annotating the sow contour on each training image with the open-source annotation tool Labelme to form a training set and a test set;
training a Mask R-CNN network with the training-set images as sample input and the corresponding contour annotations as sample output, to obtain a preliminary sow contour extraction model;
testing the preliminary model with the test-set images and their contour annotations, and adjusting it according to the test results until the predictions meet a preset accuracy condition, yielding an optimal sow contour extraction model;
and preprocessing each video image of the sow monitoring video, inputting it into the optimal sow contour extraction model, and obtaining the sow contour from the model's output.
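The plumbing around the segmentation model can be illustrated as follows. Since running Mask R-CNN itself is out of scope here, the `segment` function below is a deliberately simple brightness-threshold stand-in for the network's mask output; the preprocessing and mask-to-contour steps are generic assumptions, not taken from the patent.

```python
import numpy as np

def preprocess(frame, out_hw=(64, 64)):
    """Nearest-neighbour resize plus [0,1] normalisation — a stand-in
    for the pipeline's resize/normalise preprocessing."""
    H, W = frame.shape
    h, w = out_hw
    ys = np.arange(h) * H // h
    xs = np.arange(w) * W // w
    small = frame[np.ix_(ys, xs)].astype(float)
    return (small - small.min()) / (small.max() - small.min() + 1e-9)

def segment(frame, thresh=0.5):
    """Stand-in for Mask R-CNN inference: a brightness threshold
    returning a boolean foreground (pig) mask."""
    return frame > thresh

def mask_to_contour(mask):
    """Contour = foreground pixels with at least one background
    4-neighbour, i.e. the boundary ring of the mask."""
    padded = np.pad(mask, 1, constant_values=False)
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                padded[1:-1, :-2] & padded[1:-1, 2:])
    return mask & ~interior
```

In a real deployment, `segment` would be replaced by a call to the trained Mask R-CNN model, with the rest of the plumbing unchanged.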
Further, judging the sow's posture from the sow contour and excluding sleeping postures comprises the following steps:
collecting, as training samples, contour images of sows in different postures such as standing and sleeping;
using the posture annotations of the training samples as label data to form a training set and a test set, the annotations comprising a standing posture and a sleeping posture;
training a LeNet network with the training-set contour images as sample input and the corresponding labels as sample output, to obtain a preliminary sow posture detection model capable of excluding contour images in which the sow is sleeping;
testing the preliminary model with the test-set contour images and labels, and adjusting it according to the test results until the predictions meet a preset accuracy condition, yielding an optimal sow posture detection model;
and inputting the sow contour into the optimal sow posture detection model and excluding sleeping postures according to its output.
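The posture-filtering step can be sketched as follows. A trained LeNet would do the actual classification; the bounding-box aspect-ratio heuristic below is a hypothetical stand-in used only so the sleep-exclusion logic is runnable.

```python
import numpy as np

def bbox_aspect(mask):
    """Width/height of the bounding box around the contour mask."""
    ys, xs = np.nonzero(mask)
    h = ys.max() - ys.min() + 1
    w = xs.max() - xs.min() + 1
    return w / h

def classify_posture(mask, ratio_thresh=2.5):
    """Heuristic stand-in for the LeNet classifier: the assumption
    (illustrative only) is that a sow lying on its side presents a
    much more elongated silhouette from overhead."""
    return "sleeping" if bbox_aspect(mask) > ratio_thresh else "standing"

def drop_sleeping(masks):
    """Keep only contour images whose posture is not sleeping."""
    return [m for m in masks if classify_posture(m) == "standing"]
```

In the real pipeline, `classify_posture` would be the trained LeNet model's prediction; only the filtering logic would stay as above.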
Further, the step of starting the timer when the sow in the sleep-excluded video images is judged to be standing still, judging whether the duration of the standing-still state exceeds the preset oestrus threshold, judging oestrus if it does, and otherwise re-determining the start time and repeating the judgment, comprises:
judging whether oestrus-specific local behaviour exists; if so, once the sow in the sleep-excluded video images is judged to be stationary, building a sample model from the first acquired sow contour image and starting the timer;
treating each newly acquired sow contour image as the foreground image and the last modelled contour image as the background image, comparing their contours, and judging whether the contour has changed: when the contour change value exceeds a preset standing-judgment threshold, re-modelling and re-timing; when it does not, continuing to accumulate time and judging whether the accumulated time has reached the preset oestrus threshold; if it has, sending the result to a preset terminal to report that the sow has been detected in oestrus; if not, continuing to read sow contour images and repeating the process until the end.
Further, the preset oestrus threshold is 120-150 seconds.
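The timing loop described above — model the first contour, compare each new contour against the background model, re-model and re-time on movement, and report oestrus once the accumulated standing time passes the threshold — might be organised as below. The contour-difference function and the standing-judgment threshold are illustrative assumptions; only the 150 s oestrus threshold comes from the document's stated range.

```python
class StandingOestrusMonitor:
    """Sketch of the standing-duration timing loop; contours are
    compared via a user-supplied difference function."""

    def __init__(self, contour_diff, stand_thresh=0.1, oestrus_thresh_s=150):
        self.diff = contour_diff
        self.stand_thresh = stand_thresh        # standing-judgment threshold
        self.oestrus_thresh_s = oestrus_thresh_s  # 120-150 s per the document
        self.background = None
        self.elapsed = 0.0

    def update(self, contour, dt_s):
        """Feed one new contour observed dt_s seconds after the previous
        one; return True once standing has lasted past the threshold."""
        if self.background is None:
            self.background = contour           # initial sample model
            self.elapsed = 0.0
            return False
        if self.diff(contour, self.background) > self.stand_thresh:
            self.background = contour           # sow moved: re-model, re-time
            self.elapsed = 0.0
            return False
        self.elapsed += dt_s                    # still standing: accumulate
        return self.elapsed >= self.oestrus_thresh_s
```

When `update` returns True, the system would light the pen's warning lamp and push the pen number to the administrator's terminal, as the detailed description explains.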
In a second aspect, an embodiment of the present invention further provides a sow oestrus detection apparatus, including:
the acquisition module, configured to acquire a sow monitoring video, wherein the sow monitoring video is captured during boar-assisted heat checking;
the identification module, configured to identify, from the sow monitoring video, whether the sow exhibits oestrus-specific local behaviour;
the extraction module, configured to extract contours from video images of the sow monitoring video to obtain the sow contour;
the exclusion module, configured to judge the sow's posture from the sow contour and exclude sleeping postures;
and the detection module, configured to, when the sow is identified as exhibiting oestrus-specific local behaviour, start a timer when the sow in the sleep-excluded video images is judged to be standing still, judge whether the duration of the standing-still state exceeds a preset oestrus threshold, judge that the sow is in oestrus if it does, and otherwise re-determine the start time of the standing-still state and repeat the judgment.
Further, the identification module is specifically configured to:
acquire, during the heat-checking stage, a first video covering a preset period before oestrus for sows that subsequently came into oestrus, and a second video covering a corresponding period for sows that did not;
split the first and second videos into video frames to obtain training samples, and divide the samples into a training data set and a test data set;
train an initial machine learning model with the training data set as sample input and the presence or absence of oestrus-specific local behaviour as sample labels, to obtain a preliminary sow oestrus local-behaviour judgment model, wherein the initial model comprises a CNN for feature extraction and an LSTM that performs classification on the CNN's feature-extraction results;
test the preliminary model on the test data set and adjust its parameters according to the test results until the predictions meet a preset accuracy condition, yielding an optimal sow oestrus local-behaviour judgment model;
and input the acquired sow monitoring video into the optimal model and judge from its output whether the sow exhibits oestrus-specific local behaviour.
In a third aspect, an embodiment of the present invention further provides an electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the steps of the sow oestrus detection method of the first aspect.
In a fourth aspect, the present invention further provides a non-transitory computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements the steps of the sow oestrus detection method as described in the first aspect.
As can be seen from the above technical solutions, the method, apparatus, electronic device, and storage medium provided in the embodiments of the present invention identify the sow's oestrus-specific local behaviour from sow monitoring video images and, once that behaviour is identified, judge the standing duration, specifically: obtaining the sow contour from the monitoring video images, excluding sleeping postures according to the contour, and, once the sow's posture is determined not to be sleeping and the sow is standing still, judging whether the duration of the standing-still state exceeds a preset oestrus threshold; if so, the sow is judged to be in oestrus; otherwise timing restarts and the process repeats. The method thus judges oestrus by collecting the sow monitoring video and processing it automatically in the background, realising automatic detection at low cost, and achieves high identification accuracy because the oestrus-specific local behaviour serves as an auxiliary criterion.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and those skilled in the art can also obtain other drawings according to the drawings without creative efforts.
FIG. 1 is a flow chart of a method for detecting oestrus in sows according to an embodiment of the present invention;
fig. 2 is a schematic processing procedure diagram of a sow oestrus detection method according to an embodiment of the invention;
fig. 3 is a schematic view of a sow monitoring video acquisition process provided by an embodiment of the invention;
fig. 4 is a schematic view of a sow monitoring video capturing device according to an embodiment of the present invention;
fig. 5 is a schematic view illustrating the local behavior recognition of the sow oestrus provided by an embodiment of the present invention;
fig. 6 is a schematic view of a process for excluding a sow sleeping posture according to an embodiment of the present invention;
fig. 7 is a schematic diagram of a sow contour extraction process provided by an embodiment of the present invention;
fig. 8 is a schematic view of a sow image provided in accordance with an embodiment of the present invention;
FIG. 9 is a schematic view of contour segmentation of a sow provided in accordance with an embodiment of the present invention;
fig. 10 is a schematic view of the process of monitoring oestrus via sow standing recognition provided by an embodiment of the invention;
fig. 11 is a schematic structural view of a sow oestrus detection device according to an embodiment of the present invention;
fig. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Oestrus identification in sows is a key link in judging whether a sow can breed, conceive, and farrow normally. Common methods include observation of mental state, observation of genital changes, the back-pressure test, external observation, examination of vaginal mucus, the standing-reflex test, and boar-assisted heat checking.
At home and abroad, pig farms mainly identify sow oestrus with the back-pressure method, in which an experienced professional breeder judges the sow's behaviour: the breeder presses the sow's back or hindquarters by hand, or a teaser boar mounts the sow, and if the sow stands motionless the optimal mating window has arrived. However, as the breeding industry moves toward concentrated, large-scale operation, the shortage of professional breeders has become a major problem in modern pig production, so a method that can automatically detect sow oestrus is urgently needed.
At present, infrared sensors are mostly used at home and abroad to judge sow oestrus. For example, S. C. Scrilari and others used infrared thermography to detect changes in the body-surface temperature of the sow's vulva and hindquarters over the oestrus cycle, found that vulval temperature rises significantly at the start of oestrus and drops significantly before ovulation, and thus proposed judging oestrus from vulval temperature changes; infrared thermography, however, is sensitive to many factors — wind speed, pig-house humidity, outside temperature, and the sow's posture and position during measurement — and is therefore unsuitable as a basis for judgment. In addition, Zhangzi Yun and others measured the subcutaneous temperature of sows of different breeds with novel electronic chips and explored the temperature patterns of replacement sows in the oestrus and non-oestrus periods. However, a pig's temperature varies by about 1 °C over the day and differs between individuals; contact temperature sensors cause a stress response in sows; non-contact temperature sensors are expensive, constrain the acquisition distance, and yield poor temperature contrast; so temperature is likewise unsuitable as a basis for judgment.
To solve the above problems, the embodiments of the present invention provide a sow oestrus detection method and apparatus based on video behaviour-data identification. The solution of the invention is explained in detail below through specific embodiments.
Fig. 1 shows a flowchart of the sow oestrus detection method provided by an embodiment of the invention. As shown in fig. 1, the method includes the following steps:
step 101: acquiring a sow monitoring video, wherein the sow monitoring video is captured during boar-assisted heat checking;
step 102: identifying, from the sow monitoring video, whether the sow exhibits oestrus-specific local behaviour;
In this step, the oestrus-specific local behaviour of the sow is identified from the monitoring video images. Such behaviour includes: during oestrus the sow eats poorly or not at all, its back is usually slightly arched, its head is usually held straight forward, its ears are erect, and it acts dazed.
step 103: extracting contours from video images of the sow monitoring video to obtain the sow contour;
step 104: judging the sow's posture from the sow contour and excluding sleeping postures;
step 105: if oestrus-specific local behaviour is identified, starting a timer when, in the sleep-excluded video images, the sow is judged to be standing still; judging whether the duration of the standing-still state exceeds a preset oestrus threshold; if so, judging that the sow is in oestrus; otherwise, re-determining the start time of the standing-still state and repeating the judgment.
In this embodiment, to obtain the sow monitoring video, image-monitoring equipment is arranged above the sow pens; for example, an RGB camera can be mounted above each pen to collect video in real time for oestrus analysis. Note that a dedicated heat-checking area must be set out: a track is laid for a boar cart, and the cart is driven along it at a constant speed so that the boar is in contact with each sow for about 2 min. During boar-assisted heat checking, the sow's standing behaviour (remaining motionless for more than 2 minutes) is the main judgment basis; standing recognition is the core of the solution, and its accuracy directly determines the accuracy of the oestrus judgment, while the specific local behaviour of oestrous sows serves as an auxiliary detection criterion. Each pen is fitted with an oestrus warning lamp, and when oestrus is confirmed the red lamp is lit; the pen numbers of oestrous sows are sent to the server and pushed to the administrator's mobile terminal.
In this embodiment, the sow monitoring video is acquired from the image-monitoring equipment; the oestrus-specific local behaviour is identified from the video images; contours are extracted to obtain the sow contour; the sow's posture is then judged from the contour, and, once the posture is determined not to be sleeping, timing starts when the sow is judged to be standing still. Whether the duration of the standing-still state exceeds the preset oestrus threshold is then judged; if so, the sow is judged to be in oestrus; otherwise the start time of the standing-still state is re-determined and the judgment repeated.
In this embodiment, it should be noted that, in one implementation, step 102 may be omitted. Because the auxiliary judgment based on the oestrus-specific local behaviour is then absent, the preset oestrus threshold needs to be set longer — for example, 180 s — to preserve the accuracy of the oestrus judgment. If step 102 is retained, the judgment is more accurate and the standing-time criterion can be relaxed, i.e. the preset oestrus threshold can be set shorter — for example, 150 s — to speed up the judgment.
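The threshold adjustment described in this paragraph reduces to a one-line rule; the function below merely restates it (the 150 s and 180 s values are the document's, the function name is illustrative).

```python
def oestrus_threshold_s(local_behaviour_detected: bool) -> int:
    """Standing-time threshold per the embodiment: corroborating
    oestrus-specific local behaviour permits a shorter threshold."""
    return 150 if local_behaviour_detected else 180
```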
In this embodiment, it should be noted that detecting oestrus by judging from monitoring video images whether the sow stands still for a long time, assisted by the oestrus-specific local behaviour, is low-cost, highly automated, free of stress response, and more accurate than sensor-based temperature detection, and can therefore be readily popularised in practice.
According to the above technical solution, the sow oestrus detection method provided by the embodiment of the invention first identifies the oestrus-specific local behaviour from the sow monitoring video images — for example: during oestrus the sow eats poorly or not at all, its back is slightly arched, its head is held straight forward, its ears are erect, and it acts dazed. Once this behaviour is identified, the standing duration is judged, specifically: the sow contour is obtained from the monitoring video images; sleeping postures are excluded according to the contour; and, once the posture is determined not to be sleeping and the sow is standing still, whether the duration of the standing-still state exceeds the preset oestrus threshold is judged. If so, the sow is judged to be in oestrus; otherwise timing restarts and the process repeats. The method thus judges oestrus by collecting the monitoring video and processing it automatically in the background, realising automatic detection at low cost, with high identification accuracy thanks to the auxiliary judgment from the oestrus-specific local behaviour. Moreover, compared with other detection approaches that rely on sensors, collecting the video and processing it by computer is cheap, efficient, and non-invasive, and avoids the stress response that sensor-based monitoring causes in sows.
In addition, computer-based detection pinpoints the oestrus time more accurately, providing a precise basis for artificial insemination, raising the success rate of a single insemination, and reducing the disease-transmission risk that human involvement brings.
As shown in fig. 2, this embodiment provides a sow oestrus detection method based on video behaviour-data identification. A real-time sow monitoring video is acquired from the pig-farm cameras. On the one hand, the monitoring video is compared against a sow oestrus behaviour model to determine whether the sow shows the behaviour features specific to oestrous sows; if so, the standing-still criterion for judging oestrus is relaxed, otherwise oestrus is judged against the normal criterion. The oestrus behaviour model is obtained by deep learning on data from several hundred oestrous sows covering the half hour before oestrus. On the other hand, a boar (or an oestrogen substitute) is introduced during the heat-checking stage; the sow's image contour is identified from the monitoring video with a Mask R-CNN-based pig contour recognition model; and, after sleeping postures are excluded, the standing-still threshold (also called the oestrus threshold) is reduced from the original 180 s to 150 s if the early oestrus behaviour is evident, otherwise the normal threshold applies. The standing time is counted from the moment the sow's state in the video is judged to be standing still. The method judges oestrus by collecting the monitoring video and processing it automatically in the background, realising automatic detection and reducing personnel cost; and because the judgment combines the oestrus-specific behaviour features with the standing-still behaviour, the identification accuracy is high.
The following embodiments, which introduce the steps of determining whether the sow exhibits oestrus-specific local behaviors and of detecting the oestrus-checking stage, require video data. The sow video data acquisition process is therefore described first.
The software required: video acquisition platform software; the hardware required: RGB camera, raspberry group 4b +; high performance computing and application service platform (configuration 1-2 GPU card)
The video acquisition process comprises the following steps: video acquisition passes through RGB camera connection raspberry group 4b +, through raspberry group 4b + operation client side acquisition program, sends data to the server, and the server keeps sow video data local after receiving data.
The video acquisition module mainly adopts a C/S structure and is divided into a client and a server, wherein the client is responsible for acquiring video images, data codes are transmitted to the server through socket communication, and the server decodes and transcodes the received data for storage.
As shown in fig. 4, the client side of the video capture software is divided into the following modules:
a communication module: the server is used for being responsible for connecting the server and the client;
a data acquisition module: for video acquisition;
a clock module: the system is used for timing the acquisition equipment;
in this embodiment, the communication module is specifically configured to:
The communication module for video acquisition uses socket communication. Since the server needs to be connected to multiple clients simultaneously, the TCP protocol is adopted: each client periodically probes for the server, sends a connection request when the server is detected, and, once the connection succeeds, starts taking data from the acquisition queue and transmitting it.
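Because TCP delivers a byte stream rather than discrete messages, a client that sends encoded frames from the queue needs some framing convention so the server can split the stream back into frames. The sketch below uses a 4-byte length prefix; the patent does not specify the wire format, so this framing scheme and the function names are illustrative assumptions.

```python
import struct

def pack_frame(payload: bytes) -> bytes:
    """Prefix an encoded frame's bytes with a 4-byte big-endian length header."""
    return struct.pack(">I", len(payload)) + payload

def unpack_frames(buffer: bytes):
    """Split a received TCP byte stream back into complete frames.

    Returns (frames, remainder); the remainder holds a partially
    received frame to be completed by the next recv() call.
    """
    frames = []
    while len(buffer) >= 4:
        (length,) = struct.unpack(">I", buffer[:4])
        if len(buffer) < 4 + length:
            break  # frame not fully received yet
        frames.append(buffer[4:4 + length])
        buffer = buffer[4 + length:]
    return frames, buffer
```

On the server side, `unpack_frames` would be called on the accumulated `recv()` data before decoding and transcoding each frame for storage.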
In this embodiment, the data acquisition module is specifically configured to:
The data acquisition module uses the OpenCV image processing library. OpenCV provides a camera-reading interface; after the image parameters are set, images can be read frame by frame, and the data are stored in the acquisition queue.
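The frame-by-frame capture into the acquisition queue can be sketched as the loop below. The frame reader is injected as a callable so the loop logic stands alone; in the real module it would wrap `cv2.VideoCapture.read()`. The function and parameter names are illustrative, not from the patent.

```python
from queue import Queue

def capture_loop(read_frame, out_queue: Queue, running) -> int:
    """Read frames one at a time and push them into the acquisition queue.

    read_frame: zero-argument callable returning a frame, or None on failure
                (with OpenCV this would wrap cv2.VideoCapture.read()).
    running:    zero-argument callable checked each iteration, so the clock
                module can stop acquisition by flipping a shared flag.
    Returns the number of frames queued.
    """
    count = 0
    while running():
        frame = read_frame()
        if frame is None:  # camera read failed or stream ended
            break
        out_queue.put(frame)
        count += 1
    return count
```

The `running` callable corresponds to the shared flag variables that the clock module modifies to control acquisition.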
In this embodiment, the clock module is specifically configured to:
The clock module is mainly used for timing the acquisition equipment. Variables accessible to both the acquisition module and the transmission module are set in the program as flags for program operation; by modifying these variables, the clock module controls the program.
The following embodiments, which introduce contour extraction and posture judgment, require image sample data of the sow in different states. A sow image sample acquisition method and process are therefore given below.
The software required: data acquisition platform software; the hardware required: RGB cameras (9); high performance computing and application service platform (configuration 1-2 GPU card)
Acquiring test data: the purpose is as follows: acquiring data (acquiring pictures regularly as required by adopting an RGB (red, green and blue) camera); use professional surveillance camera (or professional RGB camera), consider the pig house environment, should consider dustproof and dampproofing. The method is characterized in that a camera is specially arranged on each sow, the timing collection is carried out, the time interval is 1 second, equipment is started (or a trigger type is adopted) before the estrus is checked, in addition, the sow column matching is carried out before the collection, the arrangement scheme (the net height is 1.8-2 m) of the camera above the sow column can adopt RGB cameras, and the RGB cameras are placed above a plurality of sows to shoot a top view, as shown in figure 3.
The specific data acquisition scheme is as follows:
1. establishing a data acquisition site
2. Selecting representative pigs (pens)
Nine cameras are arranged above (at heights of 1.8 m, 2.0 m and 2.2 m), each covering several pigs (the specific number is determined by the coverage range);
for posture judgment of the sow, data of the following states need to be acquired:
1) non-sleep;
2) sleeping;
Data are acquired separately in a feeding period, a sleeping period and an oestrus-checking period (daytime only), taking a picture every 2 seconds. If conditions do not permit this, record throughout the day and process the footage manually later.
Sampling period:
day data was collected for 1 month according to the above requirements.
When data are collected:
1) identify each day's sleep start time and end time (the data in this interval are stored separately);
2) picture naming rule: pen-month-day-minute-second
Data set label:
Each time photos are collected, the administrator enters the acquisition-program start command on the server; the camera is called, a picture is collected every 2 seconds, and each picture is named with the pen where the camera is located and the collection time. The acquired data are stored directly on the server.
Later, the daily sleep-posture data set is intercepted according to the marked sleep start and end times of each day, providing a data set for the posture-judgment learning model.
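The naming rule and the sleep-interval interception above can be sketched as follows. The rule lists pen, month-day and minute-second; an hour field is added here so names stay unique across the day, so the exact format, separators, and helper names are assumptions for illustration.

```python
from datetime import datetime, time

def picture_name(pen: int, ts: datetime) -> str:
    """Name a picture by pen number and capture time (assumed format)."""
    return f"{pen}-{ts:%m%d-%H%M%S}.jpg"

def in_sleep_interval(ts: datetime, start: time, end: time) -> bool:
    """True if a capture time falls inside the marked daily sleep interval."""
    return start <= ts.time() <= end

def sleep_subset(records, start: time, end: time):
    """Intercept the sleep-posture pictures for the posture-judgment model.

    records: iterable of (pen, datetime) capture events.
    """
    return [picture_name(p, t) for p, t in records
            if in_sleep_interval(t, start, end)]
```

Because the capture time is embedded in the name, the daily sleep-posture subset can be rebuilt later from filenames alone.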
Further, based on the content of the above embodiment, in this embodiment, identifying whether the sow has oestrus local behavior according to the sow monitoring video includes:
acquiring, in the oestrus-checking stage, a first video in a preset time period before oestrus of an oestrous sow and a second video in a preset time period before oestrus checking of a non-oestrous sow;
dividing the first video and the second video into video frames, acquiring training samples, and dividing the training samples into a training data set and a test data set;
taking the training data set as sample input data, taking whether the oestrus local behavior exists as sample label data, and training an initial machine learning model to obtain a preliminary sow oestrus local behavior judgment model; the initial machine learning model comprises a CNN model and an LSTM model, the CNN model is used for feature extraction, and the LSTM model is used for classification learning based on feature extraction results of the CNN model;
testing the preliminary sow oestrus local behavior judgment model by using the test data set, and adjusting its parameters according to the test result until the prediction result meets a preset accuracy condition, to obtain an optimal sow oestrus local behavior judgment model;
and inputting the acquired sow monitoring video into the optimal sow oestrus local behavior judgment model, and judging whether the sow has oestrus local behavior or not according to the output result of the optimal sow oestrus local behavior judgment model.
In this embodiment, as shown in fig. 5, in the oestrus-checking stage, a 20-min video before oestrus of each oestrous sow and a 20-min video before oestrus checking of each non-oestrous sow can be collected; the videos are then divided into 10-second frame sequences to obtain training samples, generating a training data set and a test data set. The training data set is input into the CNN model for feature extraction and then into the LSTM model for classification learning. The sow oestrus local behavior judgment model is tested with the test set and its parameters are adjusted according to the test result until the prediction result meets a preset accuracy condition, giving the optimal model. The acquired sow monitoring video is then input into this model, and whether the sow exhibits oestrus local behavior is judged from its output. The preset accuracy condition may be that the prediction accuracy exceeds a preset threshold, for example 90%.
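The division of the 20-min videos into 10-second frame sequences can be sketched as the chunking below. The fixed 10-second window is from the text; the frame-rate parameter, function names, and the choice to drop a trailing partial clip (so every sample fed to the CNN+LSTM has equal length) are assumptions.

```python
def split_into_clips(frames, fps: int, clip_seconds: int = 10):
    """Chunk an ordered frame list into fixed-length clips.

    Each clip holds fps * clip_seconds consecutive frames; a trailing
    partial clip is dropped so every training sample has equal length.
    """
    clip_len = fps * clip_seconds
    return [frames[i:i + clip_len]
            for i in range(0, len(frames) - clip_len + 1, clip_len)]

def label_clips(clips, is_oestrus: bool):
    """Pair every clip with its sample label (oestrus local behavior or not)."""
    return [(clip, 1 if is_oestrus else 0) for clip in clips]
```

Clips from the pre-oestrus videos would be labeled 1 and clips from the non-oestrous videos labeled 0 before the train/test split.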
Further, based on the content of the above embodiment, in this embodiment, acquiring a video image according to the sow monitoring video, and performing contour extraction on the video image to acquire a sow contour includes:
collecting a plurality of images of a sow under different conditions of feeding, sleeping and oestrus in advance as training samples;
marking the contour of the sow on each image in the training sample by using an open source marking tool Labelme to form a training set and a testing set;
taking the images in the training set as sample input data, taking the corresponding contour marking result as sample output data, and carrying out training based on a Mask-Rcnn network model to obtain a preliminary sow contour extraction model;
testing the preliminary sow contour extraction model by using the images in the test set and the corresponding contour marking results, and adjusting the preliminary sow contour extraction model according to the test results until the prediction results meet the preset accuracy condition to obtain an optimal sow contour extraction model;
and taking the sow monitoring video as input, preprocessing each video image in the sow monitoring video, inputting the preprocessed video image into the optimal sow contour extraction model, and acquiring the sow contour according to the output of the optimal sow contour extraction model.
In this embodiment, several images of sows under the different conditions of feeding, sleeping and oestrus are collected in advance through the Raspberry Pi device as training samples. The sow contour on each image in the training samples is marked with the open-source annotation tool Labelme, as shown in figs. 8 and 9, forming a training set and a test set. The images in the training set serve as sample input data and the corresponding contour marking results as sample output data, and training based on the Mask-RCNN network model yields a preliminary sow contour extraction model. This model is tested with the images and contour marking results in the test set and adjusted according to the test results until the prediction results meet a preset accuracy condition (for example, accuracy above 90%), giving the optimal sow contour extraction model. Each sow monitoring image is preprocessed (denoising, histogram equalization and sharpening) and input into the optimal sow contour extraction model, which extracts the sow contour.
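Of the three preprocessing steps (denoising, histogram equalization, sharpening), histogram equalization is sketched below in plain NumPy to show the mapping involved; in practice OpenCV's `cv2.equalizeHist` would typically be used, and this standalone version is an illustration rather than the patent's implementation.

```python
import numpy as np

def equalize_hist(img: np.ndarray) -> np.ndarray:
    """Histogram-equalize an 8-bit grayscale image to reduce the
    influence of uneven barn lighting before contour extraction."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]          # first non-zero CDF value
    total = img.size
    # Standard equalization mapping, scaled back to [0, 255]
    lut = np.clip(np.round((cdf - cdf_min) / max(total - cdf_min, 1) * 255),
                  0, 255).astype(np.uint8)
    return lut[img]
```

Stretching the intensity distribution this way flattens brightness differences between the well-lit and shadowed parts of a pen before the Mask-RCNN model sees the image.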
In this embodiment, owing to the shooting angle, the full contour of the sow directly beneath the camera can be captured, while only partial contours of the sows at the sides may be captured (for example, when a sow sleeps with its back close to the leftmost railing, its limbs may not be fully captured because of the angle). The head in the captured contour also differs between feeding and standing: the head is lower when the sow feeds or drinks, and the sleeping contour includes the limbs and differs from the standing contour. The contour therefore needs to be extracted first, and the standing state and standing duration are then judged from the extracted contour.
In this embodiment, as shown in fig. 7, image recognition and contour extraction are performed by Mask-RCNN-based deep learning. First, several images of sows under feeding, sleeping and oestrus conditions are collected in advance through the Raspberry Pi device as training samples, and the sow contour on each image is marked with the open-source annotation tool Labelme to form a training set and a test set. Histogram equalization preprocessing is then applied to the experimental data to reduce the influence of uneven brightness. Finally, the Mask-RCNN structural parameters are set and the segmentation model is trained with the training samples to obtain the optimal sow image segmentation model.
In this embodiment, it should be noted that Mask-RCNN has three outputs: the classification result, the coordinates of the prediction box, and the object mask. These three output branches are parallel to one another; compared with other instance segmentation algorithms that segment before classifying, this parallel design is both simple and efficient.
Further, based on the content of the above embodiment, in this embodiment, the judging the sow posture according to the sow profile and excluding the sow sleeping posture includes:
collecting profile images of a sow in different postures of standing and sleeping as training samples;
taking the posture marking result of the training sample as label data to form a training set and a testing set; the posture marking result comprises a standing posture and a sleeping posture;
taking the contour image in the training set as sample input data, taking corresponding label data as sample output data, and carrying out training based on a LeNet network model to obtain a primary sow posture detection model; the preliminary sow posture detection model can exclude sows in a sleeping posture in the contour image;
testing the primary sow posture detection model by using the contour image in the test set and the corresponding label data, and adjusting the primary sow posture detection model according to the test result until the prediction result meets the preset accuracy condition to obtain an optimal sow posture detection model;
and inputting the sow profile into the optimal sow posture detection model, and excluding the sow sleeping posture according to the output result of the optimal sow posture detection model.
In the embodiment, firstly, the profile images of the sow in the standing posture and the sleeping posture are collected as training samples; making label data of the standing posture and the sleeping posture in the training sample to form a training set and a testing set; then, using the contour image in the training set as sample input data, using the corresponding posture marking result as sample output data, and carrying out training based on a LeNet network model to obtain a preliminary sow posture detection model; finally, testing the primary sow posture detection model by using the contour images in the test set and the corresponding posture marking results, and adjusting the primary sow posture detection model according to the testing results until the prediction results meet preset accuracy conditions (for example, the accuracy is more than 90%) to obtain an optimal sow posture detection model; and inputting the sow profile into the optimal sow posture detection model, and excluding the sow sleeping posture according to the output result of the optimal sow posture detection model.
In this embodiment, the sow's postures are divided mainly into the standing posture and the sleeping posture, but a long-lasting sleeping posture interferes with judging the sow's oestrus behavior, so the sleeping posture must be excluded. As shown in fig. 6, a LeNet network is used for posture recognition to exclude the sows' sleeping postures. First, sow pictures are acquired and training samples are prepared; the sleeping and non-sleeping postures of the sows in the different pens are then marked in the training samples to form a training set and a test set. The LeNet network structure parameters are set and the recognition model is trained with the training samples to obtain the optimal sow posture detection model. It should be noted that a LeNet network is used here because its precision and speed are clearly superior to those of other small neural network models.
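Each of the models in this document (behavior judgment, contour extraction, posture detection) is adjusted until its test-set accuracy meets the preset accuracy condition, such as exceeding 90%. That evaluate-and-adjust loop can be sketched model-agnostically; the candidate-parameter interface and names below are illustrative assumptions, with each candidate standing in for one parameter setting of the network.

```python
def accuracy(model, test_set):
    """Fraction of test samples the model predicts correctly.

    model:    callable mapping an input sample to a predicted label.
    test_set: iterable of (input, expected_label) pairs.
    """
    correct = sum(1 for x, y in test_set if model(x) == y)
    return correct / len(test_set)

def tune_until_accurate(candidates, test_set, threshold: float = 0.90):
    """Try candidate models/parameter settings in order; return the first
    whose test accuracy meets the preset accuracy condition, else None."""
    for model in candidates:
        if accuracy(model, test_set) >= threshold:
            return model
    return None
```

The returned candidate plays the role of the "optimal" model that subsequent inference steps consume.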
Further, based on the content of the above embodiment, in this embodiment, if the sow is identified as having oestrus local behavior, determining from the video image, after excluding the sow's sleeping posture, that the sow is in the standing state and starting timing, judging whether the duration of the standing state is greater than a preset oestrus threshold, if so judging that the sow is in oestrus, otherwise re-determining the start time of the standing state and repeating the judgment of whether the standing duration exceeds the preset oestrus threshold, includes:
judging whether oestrus local behavior exists; if so, from the moment the sow, with its sleeping posture excluded according to the video image, is judged to be in the standing state, performing sample modeling on the first acquired sow contour image and starting timing;
setting each newly acquired sow contour image as the foreground image and the most recently stored contour image as the background image; comparing the contours of the foreground and background images to judge whether the contour has changed; when the contour change value is greater than a preset standing-judgment threshold, performing modeling and timing again; if the contour change value is less than or equal to the preset standing-judgment threshold, continuing to accumulate time and judging whether the accumulated time has reached the preset oestrus threshold; if so, sending the result to a preset terminal to notify it that sow oestrus has been detected; if not, continuing to read sow contour images and repeating the process until the end.
In this embodiment, whether the oestrous-sow-specific local behavior exists is judged first; if it exists, sample modeling and timing start, from the moment the sow's posture is determined to be standing, on the first sow contour image. Each newly acquired sow contour image is set as the foreground image and the most recently stored contour image as the background image; the contours of the foreground and background images are compared to judge whether a large change has occurred. When the contour change value is greater than the preset standing-judgment threshold, modeling and timing are restarted; if the contour change value is less than or equal to the threshold, time continues to accumulate and is checked against the preset oestrus threshold. Once the accumulated time reaches the threshold, the result is sent to a preset terminal to notify it that sow oestrus has been detected; otherwise the next sow contour image is read and the process repeats until the end.
In this embodiment, as shown in fig. 10, a decision method based on a contour background modeling algorithm is used. First, whether the oestrous-sow-specific local behavior exists is judged; if so, the first sow contour image acquired from the sow video is modeled and timing starts. Each newly acquired picture is set as the foreground picture and the previously acquired picture as the background picture; sow contour images are repeatedly acquired from the video stream, the foreground is compared with the background, and whether an abrupt change has occurred is judged (an abrupt change means the similarity difference between the successive sow contours exceeds the preset static-contour similarity value). If an abrupt change occurs, the foreground image is promoted to the background image, and modeling and timing restart; if the similarity between foreground and background stays within the preset standing-judgment threshold, time continues to accumulate and is checked against the preset oestrus threshold. Once the accumulated time reaches the threshold, the result is sent to a preset terminal to notify it that sow oestrus has been detected; otherwise the next sow contour image is read and the process repeats until the end. The contour background modeling algorithm adapts well to the deployed equipment: comparing contour similarity largely removes the influence of camera shake, and recognition is fast.
In addition, it should be noted that, since oestrus is judged in this embodiment with the assistance of the oestrus-specific local behavior, the preset oestrus threshold can be set lower than in a general oestrus judgment method. For example, when oestrus local behavior is present, a smaller oestrus threshold can be adopted, for example reduced from 180 seconds to 150 or 120 seconds, to increase recognition speed.
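The decision procedure above reduces to a small state machine: each new contour is compared with the background contour, a change above the standing-judgment threshold restarts the timer, and otherwise standing time accumulates until the oestrus threshold (180 s normally, 150 s when oestrus-specific local behavior is present) is reached. The sketch below assumes the contour comparison has already been distilled into one change score per frame interval; the function and parameter names are illustrative.

```python
def detect_oestrus(change_scores, frame_interval_s: float,
                   change_threshold: float, local_behavior: bool) -> bool:
    """Accumulate standing time from per-interval contour change scores.

    change_scores:    contour change value between consecutive frames
                      (foreground vs. background contour comparison).
    change_threshold: preset standing-judgment threshold; a larger
                      change means the sow moved, so timing restarts.
    local_behavior:   whether oestrus-specific local behavior was seen;
                      if so the oestrus threshold drops from 180 s to 150 s.
    Returns True once accumulated standing time reaches the threshold.
    """
    oestrus_threshold_s = 150.0 if local_behavior else 180.0
    standing_s = 0.0
    for score in change_scores:
        if score > change_threshold:
            standing_s = 0.0        # contour changed: re-model, re-time
        else:
            standing_s += frame_interval_s
            if standing_s >= oestrus_threshold_s:
                return True          # would notify the preset terminal
    return False
```

Resetting rather than pausing the timer on movement mirrors the "re-determine the start time of the standing state" step in the method.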
Therefore, as shown in fig. 2, this embodiment provides a sow oestrus detection method based on video behavior data identification. A real-time sow monitoring video is acquired through a pig-farm camera. On the one hand, the monitoring video is compared against a sow oestrus behavior model to judge whether the sow exhibits behavior features specific to an oestrous sow; if such features exist, the standing (resting) criterion for judging oestrus is lowered, otherwise oestrus is judged according to the normal criterion. The sow oestrus behavior model is obtained by deep learning on data from hundreds of oestrous sows covering the half hour before oestrus. On the other hand, a boar is introduced (or an oestrus-inducing hormone is used) in the oestrus-checking stage; the sow's image contour is identified from the monitoring video by a MASK-RCNN-based pig contour identification model, and after the sow's sleeping posture is excluded, the early-oestrus behavior is examined: if it is obvious, the resting threshold (also called the oestrus threshold) is reduced (originally 180 seconds, reduced to 150 seconds), otherwise the normal resting threshold is used. The standing time is counted from the moment the sow's state in the video is judged to be standing. With the sow oestrus detection method provided by this embodiment of the invention, whether a sow is in oestrus can be judged by collecting the sow monitoring video and processing it automatically in the background, realizing automatic detection of sow oestrus and reducing personnel cost. It should also be noted that the oestrus identification accuracy of this embodiment is high because the judgment combines the oestrus-specific behavioral characteristics of the sow with its standing (resting) performance.
Fig. 11 is a schematic structural diagram of a sow oestrus detection device provided by an embodiment of the invention. As shown in fig. 11, the sow estrus detecting device provided by this embodiment includes: an acquisition module 21, an identification module 22, an extraction module 23, an exclusion module 24 and a detection module 25, wherein:
the acquisition module 21 is used for acquiring a sow monitoring video, wherein the sow monitoring video is obtained under the boar oestrus-checking condition;
the identification module 22 is used for identifying whether the sow has oestrus local behavior according to the sow monitoring video;
the extraction module 23 is configured to obtain a video image according to the sow monitoring video, and perform contour extraction on the video image to obtain a sow contour;
the elimination module 24 is used for judging the sow posture according to the sow profile and eliminating the sow sleeping posture;
and the detection module 25 is configured to: when the sow is identified as having oestrus local behavior, determine from the video image, after the sow's sleeping posture has been excluded, that the sow is in the standing state and start timing; judge whether the duration of the standing state is greater than a preset oestrus threshold; if so, judge that the sow is in oestrus; otherwise, re-determine the start time of the standing state and repeat the judgment of whether the standing duration exceeds the preset oestrus threshold.
Further, based on the content of the foregoing embodiment, in this embodiment, the identification module 22 is specifically configured to:
acquiring, in the oestrus-checking stage, a first video in a preset time period before oestrus of an oestrous sow and a second video in a preset time period before oestrus checking of a non-oestrous sow;
dividing the first video and the second video into video frames, acquiring training samples, and dividing the training samples into a training data set and a test data set;
taking the training data set as sample input data, taking whether the oestrus local behavior exists as sample label data, and training an initial machine learning model to obtain a preliminary sow oestrus local behavior judgment model; the initial machine learning model comprises a CNN model and an LSTM model, the CNN model is used for feature extraction, and the LSTM model is used for classification learning based on feature extraction results of the CNN model;
testing the preliminary sow oestrus local behavior judgment model by using the test data set, and adjusting its parameters according to the test result until the prediction result meets a preset accuracy condition, to obtain an optimal sow oestrus local behavior judgment model;
and inputting the acquired sow monitoring video into the optimal sow oestrus local behavior judgment model, and judging whether the sow has oestrus local behavior or not according to the output result of the optimal sow oestrus local behavior judgment model.
The sow oestrus detection device provided by the embodiment of the invention can be used for executing the sow oestrus detection method in the embodiment, and the working principle and the beneficial effect are similar, so detailed description is omitted here, and specific contents can be referred to the introduction of the embodiment.
In this embodiment, it should be noted that each module in the apparatus according to the embodiment of the present invention may be integrated into a whole or may be separately disposed. The modules can be combined into one module, and can also be further split into a plurality of sub-modules.
Based on the same inventive concept, another embodiment of the present invention provides an electronic device, which specifically includes the following components, with reference to fig. 12: a processor 301, a memory 302, a communication interface 303, and a communication bus 304;
the processor 301, the memory 302 and the communication interface 303 complete mutual communication through the communication bus 304;
the processor 301 is configured to call a computer program in the memory 302, and when executing the computer program the processor implements all the steps of the above sow oestrus detection method, for example the following process: acquiring a sow monitoring video, the video being obtained under the boar oestrus-checking condition; identifying whether the sow has oestrus local behavior according to the video; acquiring video images from the video and extracting the sow contour from them; judging the sow posture from the contour and excluding the sleeping posture; if the sow is identified as having oestrus local behavior, starting timing from the moment the sow, with the sleeping posture excluded, is judged to be in the standing state; judging whether the duration of the standing state is greater than a preset oestrus threshold; if so, judging that the sow is in oestrus; otherwise, re-determining the start time of the standing state and repeating the judgment of whether the standing duration exceeds the preset oestrus threshold.
It will be appreciated that the detailed functions and extended functions that the computer program may perform may be as described with reference to the above embodiments.
Based on the same inventive concept, a further embodiment of the present invention provides a non-transitory computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements all the steps of the above sow oestrus detection method, for example the following process: acquiring a sow monitoring video, the video being obtained under the boar oestrus-checking condition; identifying whether the sow has oestrus local behavior according to the video; acquiring video images from the video and extracting the sow contour from them; judging the sow posture from the contour and excluding the sleeping posture; if the sow is identified as having oestrus local behavior, starting timing from the moment the sow, with the sleeping posture excluded, is judged to be in the standing state; judging whether the duration of the standing state is greater than a preset oestrus threshold; if so, judging that the sow is in oestrus; otherwise, re-determining the start time of the standing state and repeating the judgment of whether the standing duration exceeds the preset oestrus threshold.
It will be appreciated that the detailed functions and extended functions that the computer program may perform may be as described with reference to the above embodiments.
In addition, the logic instructions in the memory may be implemented in the form of software functional units and may be stored in a computer readable storage medium when sold or used as a stand-alone product. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the embodiment of the present invention. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. Based on such understanding, the above technical solutions may be embodied in the form of a software product, which may be stored in a computer-readable storage medium, such as a ROM/RAM, a magnetic disk, an optical disk, etc., and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the sow oestrus detection method according to the various embodiments or some parts of the embodiments.
Moreover, in the present invention, relational terms such as first and second are used solely to distinguish one entity or action from another, and do not necessarily require or imply any actual such relationship or order between those entities or actions. Likewise, the terms "comprises," "comprising," and any variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements is not limited to those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising an … …" does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element.
Furthermore, in the present disclosure, reference to the description of the terms "one embodiment," "some embodiments," "an example," "a specific example," or "some examples" or the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present disclosure. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. A sow oestrus detection method is characterized by comprising the following steps:
acquiring a sow monitoring video, wherein the sow monitoring video is captured during a boar-assisted heat check;
identifying, from the sow monitoring video, whether the sow exhibits oestrus local behavior;
acquiring video images from the sow monitoring video and performing contour extraction on the video images to obtain a sow contour;
judging the sow posture according to the sow contour and excluding the sleeping posture;
if oestrus local behavior is identified, starting a timer from the moment the video images (with the sleeping posture excluded) show the sow to be in a standing state, and judging whether the duration of the standing state exceeds a preset oestrus threshold: if so, judging that the sow is in oestrus; otherwise, re-determining the start time of the standing state and repeating the judgment of whether the standing duration exceeds the preset oestrus threshold.
2. The sow oestrus detection method of claim 1, wherein identifying whether the sow exhibits oestrus local behavior from the sow monitoring video comprises:
acquiring, during the heat-check stage, a first video covering a preset time period before oestrus of a sow in oestrus, and a second video of a non-oestrus sow covering a corresponding preset time period;
dividing the first video and the second video into video frames, acquiring training samples, and dividing the training samples into a training data set and a test data set;
taking the training data set as sample input data, taking whether the oestrus local behavior exists as sample label data, and training an initial machine learning model to obtain a preliminary sow oestrus local behavior judgment model; the initial machine learning model comprises a CNN model and an LSTM model, the CNN model is used for feature extraction, and the LSTM model is used for classification learning based on feature extraction results of the CNN model;
testing the preliminary sow oestrus local behavior judgment model using the test data set, and adjusting the parameters of the preliminary model according to the test results until the prediction results meet a preset accuracy condition, thereby obtaining an optimal sow oestrus local behavior judgment model;
and inputting the acquired sow monitoring video into the optimal sow oestrus local behavior judgment model, and judging whether the sow exhibits oestrus local behavior according to the output of the optimal sow oestrus local behavior judgment model.
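The CNN-plus-LSTM arrangement of claim 2 can be sketched in PyTorch. The layer sizes, clip length, and frame resolution below are illustrative assumptions, not values from the patent; the point is only the claimed division of labor, with the CNN extracting per-frame features and the LSTM classifying the frame sequence.

```python
import torch
import torch.nn as nn

class CnnLstmBehaviorModel(nn.Module):
    """Minimal CNN+LSTM video classifier: CNN per frame, LSTM over frames."""

    def __init__(self, feat_dim=32, hidden=64, num_classes=2):
        super().__init__()
        # Per-frame feature extractor (the claim's CNN component).
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 8, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(8, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(16, feat_dim),
        )
        # Sequence classifier over the CNN features (the claim's LSTM component).
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, num_classes)

    def forward(self, clips):
        # clips: (batch, time, channels, height, width)
        b, t = clips.shape[:2]
        feats = self.cnn(clips.flatten(0, 1))   # (batch*time, feat_dim)
        feats = feats.view(b, t, -1)            # (batch, time, feat_dim)
        out, _ = self.lstm(feats)
        return self.head(out[:, -1])            # logits from the last time step

model = CnnLstmBehaviorModel()
logits = model(torch.randn(2, 8, 3, 64, 64))    # two clips of eight 64x64 frames
```

Training would pair such clips with the "oestrus local behavior present / absent" labels described in the claim; a two-logit head corresponds to that binary labeling.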
3. The sow oestrus detection method of claim 1 wherein acquiring video images from the sow monitoring video and performing contour extraction on the video images to acquire sow contours comprises:
collecting a plurality of images of a sow under different conditions of feeding, sleeping and oestrus in advance as training samples;
annotating the sow contour on each image in the training samples using the open-source annotation tool Labelme to form a training set and a test set;
taking the images in the training set as sample input data and the corresponding contour annotations as sample output data, and training a Mask R-CNN network model to obtain a preliminary sow contour extraction model;
testing the preliminary sow contour extraction model by using the images in the test set and the corresponding contour marking results, and adjusting the preliminary sow contour extraction model according to the test results until the prediction results meet the preset accuracy condition to obtain an optimal sow contour extraction model;
and taking the sow monitoring video as input, preprocessing each video image in the sow monitoring video, inputting the preprocessed video image into the optimal sow contour extraction model, and acquiring the sow contour according to the output of the optimal sow contour extraction model.
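Claim 3 performs the "video image to sow contour" step with a trained Mask R-CNN model. As a dependency-free stand-in purely for illustration, the sketch below thresholds a grayscale frame and returns the boundary pixels of the resulting binary mask; it demonstrates what a contour output looks like, not the claimed model, and the threshold value is an arbitrary assumption.

```python
import numpy as np

def extract_contour(frame: np.ndarray, thresh: float = 0.5) -> np.ndarray:
    """Return an (N, 2) array of (row, col) boundary pixels of the thresholded mask.

    Classical stand-in for the patent's Mask R-CNN contour extraction step.
    """
    mask = (frame > thresh).astype(np.uint8)
    # A pixel lies on the contour if it is foreground and at least one of its
    # 4-connected neighbours is background.
    padded = np.pad(mask, 1)
    up, down = padded[:-2, 1:-1], padded[2:, 1:-1]
    left, right = padded[1:-1, :-2], padded[1:-1, 2:]
    boundary = mask & ((up & down & left & right) == 0)
    return np.argwhere(boundary)

frame = np.zeros((6, 6))
frame[2:5, 2:5] = 1.0                # a 3x3 foreground blob standing in for the sow
contour = extract_contour(frame)     # its 8 perimeter pixels; the centre is excluded
```

In the patented pipeline these contour points would instead come from the optimal Mask R-CNN model's predicted instance mask for the sow.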
4. The sow oestrus detection method of claim 1, wherein judging the sow posture according to the sow contour and excluding the sleeping posture comprises:
collecting contour images of sows in different postures, standing and sleeping, as training samples;
taking the posture annotations of the training samples as label data to form a training set and a test set, the posture annotations comprising a standing posture and a sleeping posture;
taking the contour images in the training set as sample input data and the corresponding label data as sample output data, and training a LeNet network model to obtain a preliminary sow posture detection model, the preliminary sow posture detection model being able to exclude sows in a sleeping posture from the contour images;
testing the preliminary sow posture detection model using the contour images in the test set and the corresponding label data, and adjusting the preliminary model according to the test results until the prediction results meet a preset accuracy condition, thereby obtaining an optimal sow posture detection model;
and inputting the sow contour into the optimal sow posture detection model, and excluding the sleeping posture according to the output of the optimal sow posture detection model.
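Claim 4 separates standing from sleeping postures with a trained LeNet classifier. The toy discriminator below deliberately swaps in a much cruder geometric heuristic, purely to make the standing/sleeping decision concrete: a lying sow tends to present a wider, flatter silhouette, so the bounding-box aspect ratio of the contour points is used. The flatness threshold is an arbitrary assumption, not a value from the patent.

```python
import numpy as np

def is_standing(contour_pts: np.ndarray, max_flatness: float = 2.0) -> bool:
    """Toy stand/sleep discriminator over (row, col) contour coordinates.

    Geometric stand-in for the patent's LeNet posture model: a silhouette
    much wider than it is tall is treated as a sleeping (lying) sow.
    """
    rows, cols = contour_pts[:, 0], contour_pts[:, 1]
    height = rows.max() - rows.min() + 1
    width = cols.max() - cols.min() + 1
    return bool(width / height < max_flatness)

# Bounding-box corners of two hypothetical silhouettes:
standing = np.array([[0, 0], [4, 0], [0, 3], [4, 3]])   # 5 tall x 4 wide
sleeping = np.array([[0, 0], [1, 0], [0, 9], [1, 9]])   # 2 tall x 10 wide
```

In the patented pipeline this decision is instead made by the optimal LeNet model over the full contour image, which is robust to cases a single aspect ratio cannot separate.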
5. The sow oestrus detection method of claim 1, wherein, if oestrus local behavior is identified, starting a timer from the moment the video images (with the sleeping posture excluded) show the sow to be in a standing state, judging whether the duration of the standing state exceeds the preset oestrus threshold, judging the sow to be in oestrus if so, and otherwise re-determining the start time of the standing state and repeating the judgment, comprises:
judging whether oestrus local behavior exists; if so, determining from the video images (with the sleeping posture excluded) the moment the sow becomes stationary, building a model from the first acquired sow contour image, and starting the timer;
setting each newly acquired sow contour image as a foreground image and the most recently modeled sow contour image as the background image, comparing the contours of the foreground and background images, and judging whether the contour has changed; when the contour change value is greater than a preset standing-judgment threshold, re-modeling and restarting the timer; when the contour change value is less than or equal to the preset standing-judgment threshold, continuing to accumulate time and judging whether the accumulated time has reached the preset oestrus threshold; if it has, sending the result to a preset terminal to report that the monitored sow is in oestrus; if it has not, continuing to read sow contour images and repeating the process until the end.
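The foreground/background comparison and timing loop of claim 5 can be sketched as a small state machine. For illustration each contour is reduced to a scalar "contour change value" by a caller-supplied function, and thresholds are counted in update calls rather than seconds; both simplifications are assumptions of this sketch, not the patent.

```python
class StandingTimer:
    """State machine for the claim-5 loop: re-model on movement, time stillness."""

    def __init__(self, stand_threshold: float, oestrus_threshold: int):
        self.stand_threshold = stand_threshold      # max contour change while "standing still"
        self.oestrus_threshold = oestrus_threshold  # required consecutive still updates
        self.background = None                      # most recently modeled contour
        self.elapsed = 0

    def update(self, contour, change_fn) -> bool:
        """Feed one newly acquired contour; return True when oestrus is detected."""
        if self.background is None:
            self.background = contour               # initial sample modeling, start timing
            self.elapsed = 0
            return False
        if change_fn(contour, self.background) > self.stand_threshold:
            self.background = contour               # movement: re-model and re-time
            self.elapsed = 0
            return False
        self.elapsed += 1                           # still standing: accumulate time
        return self.elapsed >= self.oestrus_threshold

timer = StandingTimer(stand_threshold=0.1, oestrus_threshold=3)
diff = lambda a, b: abs(a - b)                      # toy contour-change measure
results = [timer.update(c, diff) for c in [1.0, 1.0, 1.0, 1.0]]
```

Here `results` ends in `True` once three consecutive unchanged contours have been accumulated; in the patented method that final `True` is the point at which the result is sent to the preset terminal.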
6. The sow oestrus detection method as claimed in any one of claims 1 to 5, wherein the preset oestrus threshold is 120-150 seconds.
7. A sow oestrus detection device, characterized by comprising:
an acquisition module, used for acquiring a sow monitoring video, wherein the sow monitoring video is captured during a boar-assisted heat check;
an identification module, used for identifying, from the sow monitoring video, whether the sow exhibits oestrus local behavior;
an extraction module, used for acquiring video images from the sow monitoring video and performing contour extraction on the video images to obtain a sow contour;
an elimination module, used for judging the sow posture according to the sow contour and excluding the sleeping posture;
and a detection module, used for, when oestrus local behavior is identified, starting a timer from the moment the video images (with the sleeping posture excluded) show the sow to be in a standing state, judging whether the duration of the standing state exceeds a preset oestrus threshold, judging the sow to be in oestrus if so, and otherwise re-determining the start time of the standing state and repeating the judgment of whether the standing duration exceeds the preset oestrus threshold.
8. The sow oestrus detection device of claim 7, wherein the identification module is specifically configured for:
acquiring, during the heat-check stage, a first video covering a preset time period before oestrus of a sow in oestrus, and a second video of a non-oestrus sow covering a corresponding preset time period;
dividing the first video and the second video into video frames, acquiring training samples, and dividing the training samples into a training data set and a test data set;
taking the training data set as sample input data, taking whether the oestrus local behavior exists as sample label data, and training an initial machine learning model to obtain a preliminary sow oestrus local behavior judgment model; the initial machine learning model comprises a CNN model and an LSTM model, the CNN model is used for feature extraction, and the LSTM model is used for classification learning based on feature extraction results of the CNN model;
testing the preliminary sow oestrus local behavior judgment model using the test data set, and adjusting the parameters of the preliminary model according to the test results until the prediction results meet a preset accuracy condition, thereby obtaining an optimal sow oestrus local behavior judgment model;
and inputting the acquired sow monitoring video into the optimal sow oestrus local behavior judgment model, and judging whether the sow exhibits oestrus local behavior according to the output of the optimal sow oestrus local behavior judgment model.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the steps of the sow oestrus detection method as claimed in any one of claims 1 to 6 are carried out when the processor executes the program.
10. A non-transitory computer readable storage medium having stored thereon a computer program, wherein the computer program, when executed by a processor, implements the steps of the sow oestrus detection method as claimed in any one of claims 1 to 6.
CN202010677045.8A 2020-07-14 2020-07-14 Sow oestrus detection method and device, electronic equipment and storage medium Active CN111914685B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010677045.8A CN111914685B (en) 2020-07-14 2020-07-14 Sow oestrus detection method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010677045.8A CN111914685B (en) 2020-07-14 2020-07-14 Sow oestrus detection method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111914685A true CN111914685A (en) 2020-11-10
CN111914685B CN111914685B (en) 2024-04-09

Family

ID=73280933

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010677045.8A Active CN111914685B (en) 2020-07-14 2020-07-14 Sow oestrus detection method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111914685B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113016657A (en) * 2021-03-05 2021-06-25 河南牧原智能科技有限公司 Pigsty sow oestrus identification system and application method thereof
CN113711944A (en) * 2021-08-27 2021-11-30 河南牧原智能科技有限公司 Sow oestrus identification method, device and system
CN114041426A (en) * 2020-12-31 2022-02-15 重庆市六九畜牧科技股份有限公司 Backup sow management pigsty
CN114403043A (en) * 2021-12-20 2022-04-29 北京市农林科学院智能装备技术研究中心 Sow oestrus searching method, device and system
CN114586701A (en) * 2022-04-15 2022-06-07 东南大学 Milk cow oestrus prediction device based on body temperature and exercise amount data

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1839621A1 (en) * 2006-03-31 2007-10-03 Walter Signorini Method and system to determine a physiological state of a sow
CN201504483U (en) * 2009-05-15 2010-06-16 广东广兴牧业机械设备公司 Sow oestrus monitoring device
CN103858791A (en) * 2012-12-13 2014-06-18 青岛金弘测控技术发展有限公司 Device capable of recognizing livestock estrus state automatically
CN104396865A (en) * 2014-10-29 2015-03-11 中国农业大学 Sow oestrus remote automatic monitoring system and method
CN107027650A (en) * 2017-03-21 2017-08-11 中国农业大学 A kind of boar abnormal state detection method and device based on PSO SVM
CN107133604A (en) * 2017-05-25 2017-09-05 江苏农林职业技术学院 A kind of pig abnormal gait detection method based on ellipse fitting and predictive neutral net
CN207600521U (en) * 2017-12-28 2018-07-10 重庆派森百橙汁有限公司 A kind of oestrus of sow automatic monitoring system
CN108717523A (en) * 2018-04-26 2018-10-30 华南农业大学 Oestrus of sow behavioral value method based on machine vision
CN108766075A (en) * 2018-05-31 2018-11-06 长春博立电子科技有限公司 A kind of individualized education analysis system and method based on video analysis
CN108830144A (en) * 2018-05-03 2018-11-16 华南农业大学 A kind of milking sow gesture recognition method based on improvement Faster-R-CNN
US20190124892A1 (en) * 2017-10-19 2019-05-02 N.V. Nederlandsche Apparatenfabriek Nedap Method and system for determining a condition of at least one pig in a pen
CN109984054A (en) * 2019-04-19 2019-07-09 广州影子科技有限公司 Oestrous detection method, oestrous detection device and oestrous detection system
CN110741963A (en) * 2019-10-16 2020-02-04 北京海益同展信息科技有限公司 Object state monitoring and sow oestrus monitoring method, device and system
CN110839557A (en) * 2019-10-16 2020-02-28 北京海益同展信息科技有限公司 Sow oestrus monitoring method, device and system, electronic equipment and storage medium
CN110866481A (en) * 2019-11-07 2020-03-06 北京小龙潜行科技有限公司 Sow oestrus detection method and device
CN110991222A (en) * 2019-10-16 2020-04-10 北京海益同展信息科技有限公司 Object state monitoring and sow oestrus monitoring method, device and system

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Ye Guiyu, Shi Zili: "Causes of false oestrus in sows and methods of identification", Henan Animal Husbandry and Veterinary Medicine, no. 10 *
Lyu Huixu: "Oestrus and breeding of replacement gilts", Today Animal Husbandry and Veterinary Medicine, no. 09 *
Cheng Jianyun; Zhou Liming: "Feeding and management techniques for breeding sows on large-scale pig farms", Inner Mongolia Agricultural Science and Technology, no. 03 *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114041426A (en) * 2020-12-31 2022-02-15 重庆市六九畜牧科技股份有限公司 Backup sow management pigsty
CN113016657A (en) * 2021-03-05 2021-06-25 河南牧原智能科技有限公司 Pigsty sow oestrus identification system and application method thereof
CN113711944A (en) * 2021-08-27 2021-11-30 河南牧原智能科技有限公司 Sow oestrus identification method, device and system
CN113711944B (en) * 2021-08-27 2023-03-03 河南牧原智能科技有限公司 Sow estrus identification method, device and system
CN114403043A (en) * 2021-12-20 2022-04-29 北京市农林科学院智能装备技术研究中心 Sow oestrus searching method, device and system
CN114403043B (en) * 2021-12-20 2022-11-29 北京市农林科学院智能装备技术研究中心 Sow oestrus searching method, device and system
CN114586701A (en) * 2022-04-15 2022-06-07 东南大学 Milk cow oestrus prediction device based on body temperature and exercise amount data

Also Published As

Publication number Publication date
CN111914685B (en) 2024-04-09

Similar Documents

Publication Publication Date Title
CN111914685A (en) Sow oestrus detection method and device, electronic equipment and storage medium
CN110866481B (en) Sow oestrus detection method and device
CN110839557B (en) Sow oestrus monitoring method, device and system, electronic equipment and storage medium
CN110532899B (en) Sow antenatal behavior classification method and system based on thermal imaging
CN106847262A (en) A kind of porcine respiratory disease automatic identification alarm method
CN111178197A Mask R-CNN and Soft-NMS fusion based group-housed adherent pig instance segmentation method
CN101672839A (en) Device and method for detecting hatching egg incubation quality based on computer vision
CN111639629B (en) Pig weight measurement method and device based on image processing and storage medium
CN112257564B (en) Aquatic product quantity statistical method, terminal equipment and storage medium
CN110991222B (en) Object state monitoring and sow oestrus monitoring method, device and system
CN111161265A (en) Animal counting and image processing method and device
CN112288793B (en) Method and device for detecting backfat of livestock individuals, electronic equipment and storage medium
CN111767794A (en) Cage-rearing poultry abnormal behavior detection method and detection system based on machine vision
CN114596448A (en) Meat duck health management method and management system thereof
CN112232977A (en) Aquatic product cultivation evaluation method, terminal device and storage medium
CN114004866A (en) Mosquito recognition system and method based on image similarity difference
CN112861734A (en) Trough food residue monitoring method and system
CN114581948A (en) Animal face identification method
CN109949272A Data collection method and system for acquiring human skin pictures to identify skin disease types
CN115359418A (en) Livestock delivery monitoring and early warning system and method based on CLIP model
CN115777560A (en) Intelligent sow feeding system based on machine vision analysis technology
WO2023041904A1 (en) Systems and methods for the automated monitoring of animal physiological conditions and for the prediction of animal phenotypes and health outcomes
CN106326882A (en) Fingerprint identification system and fingerprint identification method based on image quality assessment technology
CN113627255A (en) Mouse behavior quantitative analysis method, device, equipment and readable storage medium
CN112215107A (en) Pig behavior identification method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant