CN109984054B - Estrus detection method, estrus detection device and estrus detection system

Info

Publication number: CN109984054B
Application number: CN201910317979.8A
Authority: CN (China)
Prior art keywords: index, sow, estrus, motion, static
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN109984054A
Inventors: 杨翔, 高云, 刘峰, 卢军, 童宇, 管石胜
Current Assignee: Guangzhou Yingzi Technology Co ltd
Original Assignee: Guangzhou Yingzi Technology Co ltd
Events: application filed by Guangzhou Yingzi Technology Co ltd; priority to CN201910317979.8A; publication of CN109984054A; application granted; publication of CN109984054B; legal status: active.


Classifications

    • A: HUMAN NECESSITIES
    • A01: AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01K: ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K29/00: Other apparatus for animal husbandry
    • A01K29/005: Monitoring or measuring activity, e.g. detecting heat or mating

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Environmental Sciences (AREA)
  • Biophysics (AREA)
  • Animal Husbandry (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Image Analysis (AREA)

Abstract

The application provides an estrus detection method, an estrus detection device, and an estrus detection system. The estrus detection method comprises the following steps: acquiring information of a sow to be detected to determine the predicted estrus time of the sow; generating a prediction index according to the current time and the predicted estrus time; acquiring motion information of the sow to generate a static index and a restlessness index of the sow; generating an estrus probability according to the prediction index, the static index, and the restlessness index; and determining that the sow is in estrus when the estrus probability is greater than or equal to a predetermined threshold. Because the prediction index, the static index, and the restlessness index are considered together to generate the estrus probability, and the sow is determined to be in estrus when that probability reaches the predetermined threshold, experienced staff are not required to check for estrus manually. This saves labor cost and enables automatic estrus checking, which improves estrus-checking efficiency. Estrus can also be checked in real time, so the onset of estrus is found promptly, the sow's estrus period is not missed, and production efficiency improves.

Description

Estrus detection method, estrus detection device and estrus detection system
Technical Field
The present application relates to the technical field of livestock breeding, and in particular to an estrus detection method, an estrus detection device, and an estrus detection system.
Background
In existing breeding farms, the estrus condition of the bred animals (such as sows) is usually checked manually by experienced staff. This approach has low estrus-checking efficiency (staff must inspect each sow individually) and high labor cost.
Disclosure of Invention
Embodiments of the present application provide an estrus detection method, an estrus detection device, and an estrus detection system.
The estrus detection method of the present application comprises the following steps: acquiring information of a sow to be detected to determine the predicted estrus time of the sow; generating a prediction index according to the current time and the predicted estrus time; acquiring motion information of the sow to generate a static index and a restlessness index of the sow; generating an estrus probability according to the prediction index, the static index, and the restlessness index; and determining that the sow is in estrus when the estrus probability is greater than or equal to a predetermined threshold.
The estrus detection device of the present application comprises a first obtaining module, a first generating module, a second obtaining module, a second generating module, and a determining module. The first obtaining module is used for acquiring information of a sow to be detected so as to determine the predicted estrus time of the sow; the first generating module is used for generating a prediction index according to the current time and the predicted estrus time; the second obtaining module is used for acquiring the motion information of the sow to generate a static index and a restlessness index of the sow; the second generating module is used for generating an estrus probability according to the prediction index, the static index, and the restlessness index; and the determining module is used for determining that the sow is in estrus when the estrus probability is greater than or equal to a predetermined threshold.
The estrus detection system of the present application comprises a processor configured to: acquire information of a sow to be detected to determine the predicted estrus time of the sow; generate a prediction index according to the current time and the predicted estrus time; acquire motion information of the sow to generate a static index and a restlessness index of the sow; generate an estrus probability according to the prediction index, the static index, and the restlessness index; and determine that the sow is in estrus when the estrus probability is greater than or equal to a predetermined threshold.
In the estrus detection method, the estrus detection device, and the estrus detection system of the present application, the prediction index is generated from the current time and the predicted estrus time, the static index and the restlessness index are generated from the motion information of the sow, and the three indexes are considered together to generate the estrus probability. When the estrus probability reaches the predetermined threshold, the sow is determined to be in estrus, so experienced staff are not required to check for estrus manually. This saves labor cost, and estrus checking can be automated to improve its efficiency. Moreover, estrus can be detected in real time, so the onset of estrus is found promptly and the sow's estrus period is not missed, which improves production efficiency.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic flow chart of an estrus detection method according to some embodiments of the present application;
FIG. 2 is a schematic block diagram of an estrus detection apparatus according to some embodiments of the present application;
FIG. 3 is a schematic block diagram of an estrus detection system according to some embodiments of the present application;
FIG. 4 is a schematic flow chart of an estrus detection method according to some embodiments of the present application;
FIG. 5 is a schematic block diagram of an estrus detection apparatus according to some embodiments of the present application;
FIG. 6 is a schematic flow chart of an estrus detection method according to some embodiments of the present application;
FIG. 7 is a schematic block diagram of an estrus detection apparatus according to some embodiments of the present application;
FIG. 8 is a schematic flow chart of an estrus detection method according to some embodiments of the present application;
FIG. 9 is a schematic block diagram of an estrus detection apparatus according to some embodiments of the present application;
FIG. 10 is a schematic illustration of the principles of an estrus detection method according to some embodiments of the present application;
FIG. 11 is a schematic flow chart of an estrus detection method according to some embodiments of the present application;
FIG. 12 is a schematic block diagram of an estrus detection apparatus according to some embodiments of the present application;
FIG. 13 is a schematic flow chart of an estrus detection method according to some embodiments of the present application;
FIG. 14 is a schematic block diagram of an estrus detection apparatus according to some embodiments of the present application;
FIG. 15 is a schematic illustration of the principles of an estrus detection method according to some embodiments of the present application;
FIG. 16 is a schematic flow chart of an estrus detection method according to some embodiments of the present application;
FIG. 17 is a schematic block diagram of an estrus detection apparatus according to some embodiments of the present application;
FIG. 18 is a schematic flow chart of an estrus detection method according to some embodiments of the present application;
FIG. 19 is a schematic block diagram of an estrus detection apparatus according to some embodiments of the present application;
FIG. 20 is a schematic illustration of the principles of an estrus detection method according to some embodiments of the present application;
FIG. 21 is a schematic block diagram of an estrus detection apparatus according to some embodiments of the present application;
FIG. 22 is a schematic illustration of the principles of an estrus detection method according to some embodiments of the present application;
FIG. 23 is a schematic illustration of the principles of an estrus detection method according to some embodiments of the present application;
FIG. 24 is a schematic block diagram of an estrus detection apparatus according to some embodiments of the present application;
FIG. 25 is a schematic illustration of the principles of an estrus detection method according to some embodiments of the present application;
FIG. 26 is a schematic illustration of the principles of an estrus detection method according to some embodiments of the present application;
FIG. 27 is a schematic block diagram of an estrus detection apparatus according to some embodiments of the present application;
FIG. 28 is a schematic illustration of the principles of an estrus detection method according to some embodiments of the present application;
FIG. 29 is a schematic block diagram of an estrus detection apparatus according to some embodiments of the present application;
FIG. 30 is a schematic illustration of the principles of an estrus detection method according to some embodiments of the present application;
FIG. 31 is a schematic diagram of a scenario of an estrus detection method according to some embodiments of the present application;
FIG. 32 is a schematic diagram of an estrus detection system according to some embodiments of the present application.
Detailed Description
Embodiments of the present application will be further described below with reference to the accompanying drawings. The same or similar reference numbers in the drawings identify the same or similar elements or elements having the same or similar functionality throughout. In addition, the embodiments of the present application described below in conjunction with the accompanying drawings are exemplary and are only for the purpose of explaining the embodiments of the present application, and are not to be construed as limiting the present application.
Referring to fig. 1, an estrus detection method according to an embodiment of the present application includes:
011: acquiring information of a sow to be detected to determine the predicted estrus time of the sow;
012: generating a prediction index according to the current time and the predicted estrus time;
013: acquiring motion information of the sow to generate a static index and a restlessness index of the sow;
015: generating an estrus probability according to the prediction index, the static index, and the restlessness index; and
016: determining that the sow is in estrus when the estrus probability is greater than or equal to a predetermined threshold.
Referring to fig. 2, an estrus detection apparatus 10 according to an embodiment of the present disclosure includes a first obtaining module 11, a first generating module 12, a second obtaining module 13, a second generating module 15, and a determining module 16. The first obtaining module 11 is used for acquiring information of a sow to be detected so as to determine the predicted estrus time of the sow; the first generating module 12 is used for generating a prediction index according to the current time and the predicted estrus time; the second obtaining module 13 is used for acquiring the motion information of the sow to generate a static index and a restlessness index of the sow; the second generating module 15 is used for generating an estrus probability according to the prediction index, the static index, and the restlessness index; and the determining module 16 is used for determining that the sow is in estrus when the estrus probability is greater than or equal to a predetermined threshold. That is, step 011 may be implemented by the first obtaining module 11, step 012 by the first generating module 12, step 013 by the second obtaining module 13, step 015 by the second generating module 15, and step 016 by the determining module 16.
Referring to fig. 3, an estrus detection system 100 according to an embodiment of the present application comprises a processor 20. The processor 20 is configured to: acquire information of a sow to be detected to determine the predicted estrus time of the sow; generate a prediction index according to the current time and the predicted estrus time; acquire motion information of the sow to generate a static index and a restlessness index of the sow; generate an estrus probability according to the prediction index, the static index, and the restlessness index; and determine that the sow is in estrus when the estrus probability is greater than or equal to a predetermined threshold. That is, steps 011, 012, 013, 015, and 016 may be implemented by the processor 20.
Specifically, when checking whether the current sow is in estrus, the processor 20 first acquires information of the current sow, including breed, weight, age in days, and current physiological stage (for example, whether the sow is in the initial estrus stage, the pregnancy stage, the lactation stage, and so on). The predicted estrus time of the current sow can be determined from this information. For example, a sow generally has its first estrus in the initial estrus stage (150-170 days of age), or comes into estrus again within 7 days after weaning (at the end of the lactation stage). The first estrus generally falls on some day within the 150-170-day age range, and the predicted estrus time can differ with breed: the first estrus may occur at, for example, 151, 152, 155, 156, 159, 160, 162, 165, 168, or 170 days of age. The processor 20 may thus determine the predicted estrus time based on the information of the current sow.
After determining the predicted estrus time, the processor 20 may generate a prediction index based on the current time and the predicted estrus time; for example, the closer the current time is to the predicted estrus time, the greater the value of the prediction index.
The processor 20 then acquires the motion information of the sow to determine the sow's static index and restlessness index. Generally, a sow in estrus shows a standing reaction: it stands still in place, and a sow in estrus will keep standing still even when mounted by a boar. From the motion information, the processor 20 judges the current motion state of the sow (lying, standing, moving, and so on) in order to determine the static index and the restlessness index. For example, when the sow is standing, the static index reaches its maximum value of 1 once the standing time reaches a certain duration (such as 2 minutes), and the shorter the standing time, the smaller the static index; the restlessness index is 0 when the sow lies inactive, and the more active the sow is and the longer the activity lasts, the larger the restlessness index.
After obtaining the prediction index, the static index, and the restlessness index, the processor 20 generates an estrus probability from them. For example, when all three indexes are large, the estrus probability of the current sow can be determined to be high (for example, 0.8). Finally, the processor 20 judges whether the estrus probability is greater than or equal to a predetermined threshold (for example, 0.8); if it is, the current sow is determined to be in estrus.
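To make the fusion step concrete, the following Python sketch shows one plausible way to combine the three indexes into an estrus probability and apply the threshold. The weighted-sum form, the weights, and the function names are illustrative assumptions; the patent does not fix them here (a fuzzy decision model variant is described later).

    # Minimal sketch of fusing the three indexes into an estrus probability.
    # Weights, names, and the weighted-sum form are illustrative assumptions.
    def estrus_probability(prediction_idx: float, static_idx: float,
                           restlessness_idx: float,
                           weights=(0.3, 0.4, 0.3)) -> float:
        """Weighted combination of the three indexes, each assumed in [0, 1]."""
        w_p, w_s, w_r = weights
        return w_p * prediction_idx + w_s * static_idx + w_r * restlessness_idx

    def is_in_estrus(probability: float, threshold: float = 0.8) -> bool:
        """The sow is judged in estrus when the probability reaches the threshold."""
        return probability >= threshold

    p = estrus_probability(0.9, 0.9, 0.8)   # all three indexes are high
    print(p, is_in_estrus(p))               # ~0.87 -> True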
In the estrus detection method, the estrus detection device 10, and the estrus detection system 100 described above, the prediction index is generated from the current time and the predicted estrus time, the static index and the restlessness index are generated from the motion information of the sow, and the three indexes are considered together to generate the estrus probability. When the estrus probability reaches the predetermined threshold, the sow is determined to be in estrus, so experienced staff are not required to check for estrus manually. This saves labor cost, and estrus checking can be automated to improve its efficiency. Moreover, estrus can be detected in real time, so the onset of estrus is found promptly and the sow's estrus period is not missed, which improves production efficiency.
Referring to fig. 4, in some embodiments, step 011 includes:
0112: generating an estrus time prediction model according to the estrus information of multiple sows that have previously been in estrus; and
0114: processing the information of the sow to be detected according to the estrus time prediction model to determine the predicted estrus time.
Referring to fig. 5, in some embodiments, the first obtaining module 11 includes a first generating unit 112 and a first processing unit 114. The first generating unit 112 is configured to generate an estrus time prediction model according to the estrus information of a plurality of sows that have previously been in estrus. The first processing unit 114 is configured to process the information of the sow to be detected according to the estrus time prediction model to determine the predicted estrus time. That is, step 0112 may be implemented by the first generating unit 112 and step 0114 by the first processing unit 114.
Referring again to fig. 3, in some embodiments, the processor 20 is further configured to generate an estrus time prediction model based on the estrus information of a plurality of sows that have previously been in estrus, and to process the information of the sow to be detected according to the estrus time prediction model to determine the predicted estrus time. That is, steps 0112 and 0114 may be implemented by the processor 20.
Specifically, the processor 20 may obtain in advance the estrus information of all sows in the pig farm that have previously been in estrus, where the estrus information includes the breed of the sow, the time of estrus, the physiological stage at which the estrus occurred, and so on. For example, the processor 20 may generate an estrus time prediction model for breed A according to the estrus information of all sows of breed A, and an estrus time prediction model for breed B according to the estrus information of all sows of breed B, thereby establishing different estrus time prediction models for sows of different breeds.
In one example, the processor 20 first obtains the estrus times of sows of a given breed in the same physiological stage (such as the initial estrus stage or after weaning) and takes their average as the predicted estrus time for that breed and stage. For example, if the average first-estrus time of all sows of breed A is 165 days of age, then the predicted estrus time of a breed A sow in the initial estrus stage is 165 days of age; likewise, if the average estrus time of all weaned sows of breed A is the seventh day after weaning, then the predicted estrus time of a weaned breed A sow is the seventh day after weaning.
In another example, the processor 20 may tally the estrus times (e.g., to the specific day) of all sows of each breed, both in the initial estrus stage and after weaning, and count the number of sows in estrus at each estrus time. The processor 20 may take the estrus time shared by the largest number of sows in the initial estrus stage as the predicted first-estrus time for the current breed, and the estrus time shared by the largest number of weaned sows as the predicted post-weaning estrus time for the current breed. For example, if the most common first-estrus time of breed B is 164 days of age, the predicted first-estrus time of breed B sows can be set to 164 days of age; likewise, if the most common post-weaning estrus time is the seventh day after weaning, the predicted post-weaning estrus time of breed B sows can be set to the seventh day after weaning.
In this manner, the processor 20 may process the information of the current sow according to the estrus time prediction model to obtain an accurate predicted estrus time.
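A minimal Python sketch of the two history-based rules just described follows; the data layout and function names are assumptions for illustration only.

    # Sketch of the two prediction rules above: per (breed, stage), predict
    # either the mean or the mode of historical estrus times. All names and
    # the data layout are illustrative assumptions.
    from collections import Counter, defaultdict
    from statistics import mean

    history = defaultdict(list)  # (breed, stage) -> list of estrus times (days)

    def record_estrus(breed: str, stage: str, day: int) -> None:
        history[(breed, stage)].append(day)

    def predict_by_mean(breed: str, stage: str) -> float:
        return mean(history[(breed, stage)])

    def predict_by_mode(breed: str, stage: str) -> int:
        # The estrus time shared by the largest number of sows.
        return Counter(history[(breed, stage)]).most_common(1)[0][0]

    for day in (164, 164, 165, 166, 164):
        record_estrus("B", "initial", day)
    print(predict_by_mean("B", "initial"))  # 164.6 days of age
    print(predict_by_mode("B", "initial"))  # 164 days of age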
Referring to fig. 6, in some embodiments, step 012 includes:
0122: generating a prediction index according to the time difference between the current time and the predicted estrus time.
Referring to fig. 7, in some embodiments, the first generation module 12 includes a second generation unit 122. The second generating unit 122 is configured to generate a prediction index according to a time difference between the current time and the predicted estrus time. That is, step 0122 may be implemented by the second generating unit 122.
Referring again to fig. 3, in some embodiments, processor 20 is further configured to generate a prediction index based on a time difference between the current time and the predicted estrus time. That is, step 0122 may be implemented by processor 20.
Specifically, after the processor 20 processes the information of the current sow according to the estrus time prediction model to determine the predicted estrus time, the processor 20 obtains the current time and calculates the time difference between the current time and the predicted estrus time. It will be understood that the closer the current time is to the predicted estrus time (i.e., the smaller the time difference), the higher the probability of estrus. In one example, a larger prediction index indicates a larger estrus probability, and the smaller the time difference calculated by the processor 20, the larger the prediction index. In another example, a larger prediction index indicates a lower estrus probability, and the smaller the time difference, the smaller the prediction index. In this manner, the processor 20 may accurately determine the prediction index from the time difference.
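As a hedged illustration of the first variant (smaller time difference, larger prediction index), the following Python sketch maps the time difference through an exponential decay; the decay form and scale are assumptions, since the patent does not specify the mapping.

    # Sketch: the smaller |current - predicted|, the larger the prediction
    # index. The exponential form and scale constant are assumptions.
    import math

    def prediction_index(current_day: float, predicted_day: float,
                         scale_days: float = 3.0) -> float:
        """Map the time difference into (0, 1]; 1 means the predicted day is now."""
        diff = abs(predicted_day - current_day)
        return math.exp(-diff / scale_days)

    print(prediction_index(163, 165))  # ~0.51, two days away
    print(prediction_index(165, 165))  # 1.0, on the predicted day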
Referring to fig. 8, in some embodiments, the motion information includes ranging data and activity data, and step 013 includes:
0131: acquiring time domain characteristic values and frequency domain characteristic values of ranging data and activity data;
0132: generating sow state data according to the time domain characteristic value and the frequency domain characteristic value; and
0133: generating a static index and a restlessness index according to the sow state data.
Referring to fig. 9, in some embodiments, the second obtaining module 13 includes a first obtaining unit 131, a third generating unit 132, and a fourth generating unit 133. The first obtaining unit 131 is configured to obtain the time domain characteristic values and frequency domain characteristic values of the ranging data and the activity data. The third generating unit 132 is configured to generate sow state data according to the time domain characteristic values and the frequency domain characteristic values. The fourth generating unit 133 is configured to generate a static index and a restlessness index from the sow state data. That is, step 0131 may be implemented by the first obtaining unit 131, step 0132 by the third generating unit 132, and step 0133 by the fourth generating unit 133.
Referring again to fig. 3, in some embodiments, the estrus detection system 100 further comprises a ranging sensor 30 and an activity sensor 40. The ranging sensor 30 is configured to measure ranging data of the sow; the activity sensor 40 is used for collecting activity data of the sow. The motion information comprises the ranging data and the activity data, and the processor 20 is further configured to: acquire the time domain characteristic values and frequency domain characteristic values of the ranging data and the activity data; generate sow state data according to the time domain characteristic values and the frequency domain characteristic values; and generate a static index and a restlessness index according to the sow state data. That is, steps 0131, 0132, and 0133 may be implemented by the processor 20.
Specifically, in the example shown in fig. 3, the estrus detection system 100 further comprises a mounting bracket 60 and a pigsty 70. Each sow is kept in a pigsty 70, and the mounting bracket 60 is disposed at one end of the pigsty 70. The ranging sensors 30 are mounted on the mounting bracket 60 and aimed at the sow. In one example there are two ranging sensors 30: one is aimed at the head of the sow to obtain ranging data of the head, and the other is aimed at the back of the sow to obtain ranging data of the back, so that ranging data of the head and the back are acquired simultaneously. With only one measurement point, postures in which the head and the body differ are easily confused: with head data alone, a sow standing with its head lowered produces ranging data essentially the same as a lying sow, and with back data alone, a sow lying with its head raised can likewise be confused with other postures, so the standing state can be misjudged. Acquiring ranging data of both the head and the back and judging the standing state from both therefore reduces misjudgment and improves the accuracy of the standing-state judgment. In another example, a single ranging sensor 30 is arranged above the pigsty 70 and aimed at the back of the sow to obtain ranging data of the back, which reduces the amount of equipment and saves cost. In other embodiments, more than two ranging sensors 30 can be arranged to acquire the ranging data of the sow, further improving the accuracy of the standing-state judgment.
The ranging data comprise valid ranging data and invalid ranging data. Valid ranging data are measurements of the sow itself; invalid ranging data are measurements of other objects. For example, ranging data produced when a person inadvertently blocks the ranging sensor 30 while working at the pigsty 70 differ greatly from the sow's ranging data and would introduce errors into the subsequently computed time domain and frequency domain characteristic values. Ranging data that deviate greatly from the majority of the measurements can therefore be discarded as invalid, preventing measurements of other objects from contaminating the sow's ranging data.
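One plausible rejection rule for such invalid samples is a median-distance filter, sketched below in Python; the tolerance value is an illustrative assumption.

    # Sketch: discard ranging samples that deviate strongly from the majority
    # (e.g., a person blocking the sensor). Median rule and tolerance are
    # illustrative assumptions.
    from statistics import median

    def remove_invalid(samples: list[float], tolerance: float = 0.5) -> list[float]:
        """Keep samples within `tolerance` (metres) of the median distance."""
        m = median(samples)
        return [s for s in samples if abs(s - m) <= tolerance]

    readings = [1.8, 1.9, 1.85, 0.3, 1.82]   # 0.3 m: someone blocked the sensor
    print(remove_invalid(readings))          # [1.8, 1.9, 1.85, 1.82]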
The activity sensor 40 is also provided on the mounting bracket 60 and aimed at the sow. The activity sensor 40 may collect activity data of the sow and the processor 20 may determine from the activity data whether the sow is active and how active the sow is. The activity sensor 40 includes at least one of an infrared pyroelectric sensor, an infrared proximity switch, and a capacitance sensor.
As shown in fig. 10, after the processor 20 obtains the ranging data and the activity data, it may first apply noise-reduction filtering to remove noise and improve data accuracy. The processor 20 then processes the denoised ranging data and activity data to extract the time domain characteristic values and frequency domain characteristic values of each. The processor 20 can determine whether the sow is standing from the time domain and frequency domain characteristic values of the ranging data; for example, standing and lying each correspond to different ranges of these characteristic values, so the current values indicate the posture. Having determined the posture, the processor 20 determines the intensity of the sow's activity from the time domain and frequency domain characteristic values of the activity data: while the sow is active these values fluctuate continuously, and when the sow stops moving (for example, stands still) they change little, so the processor 20 can infer the activity intensity from their variation.
From the time domain and frequency domain characteristic values of the ranging data and of the activity data, the processor 20 obtains sow state data, which include whether the sow is standing and the intensity of its activity. Finally, the processor 20 generates the corresponding static index and restlessness index from the sow state data within a certain time period (such as 10 minutes, 20 minutes, or half an hour). For example, if the sow state data indicate that the sow is standing and its activity intensity is low, the sow can be considered to be in a static state; the longer the static state lasts, the larger the static index, and correspondingly, because the activity intensity is low, the restlessness index is small. As another example, if the sow state data indicate high activity intensity throughout the period, the sow is considered to be in a restless state; the restlessness index is then high, and no static index is produced, because the sow was never in a static state (the static index requires the sow to be in a static state and is determined by how long that state lasts). As yet another example, if the sow state data indicate a static state in the first half of the period and a restless state in the second half, a static index and a restlessness index are determined for the respective halves. In this way, the static index and restlessness index of the current sow can be obtained accurately from the ranging data and the activity data.
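The following Python sketch illustrates this pipeline for one analysis window: extract simple time domain and frequency domain features, classify the posture, and derive the static and restlessness indexes. The concrete features, thresholds, and window lengths are illustrative assumptions, not values from the patent.

    # Sketch of the ranging/activity pipeline for one window. Features,
    # thresholds, and window lengths are illustrative assumptions.
    import numpy as np

    def features(signal: np.ndarray) -> tuple[float, float]:
        """Time domain feature (mean) and frequency domain feature (peak power)."""
        time_feat = float(np.mean(signal))
        spectrum = np.abs(np.fft.rfft(signal - np.mean(signal)))
        freq_feat = float(spectrum.max()) if len(spectrum) else 0.0
        return time_feat, freq_feat

    def is_standing(range_signal: np.ndarray, stand_height: float = 0.8) -> bool:
        # Assumes an overhead sensor: a smaller distance means a standing sow.
        time_feat, _ = features(range_signal)
        return time_feat <= stand_height

    def activity_intensity(activity_signal: np.ndarray) -> float:
        # Fluctuation of the activity signal as a crude intensity measure.
        return float(np.std(activity_signal))

    def window_indexes(range_win, act_win, max_still_s=120.0, dt=1.0):
        """Return (static_index, restlessness_index) for one analysis window."""
        standing = is_standing(np.asarray(range_win))
        intensity = activity_intensity(np.asarray(act_win))
        still_time = len(range_win) * dt if standing and intensity < 0.1 else 0.0
        static_index = min(1.0, still_time / max_still_s)
        restlessness_index = min(1.0, intensity)
        return static_index, restlessness_index

    rng = np.full(120, 0.7)          # 2 min of readings: sow close to the sensor
    act = np.zeros(120)              # no activity
    print(window_indexes(rng, act))  # (1.0, 0.0): still for the full 2 minutes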
Referring to fig. 11, in some embodiments, step 013 further includes:
0134: processing the motion video images according to a sow action recognition algorithm to generate a motion index; and
0135: generating a static index and a restlessness index according to the motion index.
Referring to fig. 12, in some embodiments, the second obtaining module 13 includes a second processing unit 134 and a fifth generating unit 135. The second processing unit 134 is configured to process the motion video images according to a sow action recognition algorithm to generate a motion index. The fifth generating unit 135 is configured to generate a static index and a restlessness index from the motion index. That is, step 0134 may be implemented by the second processing unit 134, and step 0135 by the fifth generating unit 135.
Referring again to fig. 3, in some embodiments, the estrus detection system 100 further comprises a camera 50, wherein the camera 50 is configured to capture motion video images of the sow. The processor 20 is further configured to process the motion video images according to the sow action recognition algorithm to generate a motion index, and to generate a static index and a restlessness index according to the motion index. That is, steps 0134 and 0135 may be implemented by the processor 20.
Specifically, in the example shown in fig. 3, the camera 50 is mounted on the mounting bracket 60 and aimed at the sow so that the sow is within the field of view of the camera 50. The camera 50 captures motion video images of the sow, and the processor 20 processes the motion video images according to a sow action recognition algorithm to generate a motion index.
In processing the motion video images according to the sow action recognition algorithm, the processor 20 specifically matches the image data of each frame of the current sow against sow action template data in a database to determine whether the current sow is standing. Deep learning can also be applied, so that the sow action template data in the database are continuously updated and learned, improving the accuracy of the standing-state judgment.
Meanwhile, the processor 20 may determine the activity intensity of the sow from the change in pixel values across consecutive frames of motion video images; for example, the larger the change in pixel values, the more intense the sow's activity. The processor 20 can thus determine the motion index from whether the sow is standing and from its activity intensity. The static index and the restlessness index can then be determined from the motion indexes over a period of time (such as 10 minutes or 20 minutes). For example, suppose the static and restlessness indexes are generated from the motion indexes within a 10-minute window: if in the first 6 minutes the motion indexes indicate that the sow is standing with low activity intensity, the sow is in a static state and a static index can be obtained (assuming 10 minutes of stillness gives the maximum value 1, the current static index is 0.6); if in the last 4 minutes the motion indexes indicate that the sow is standing with high activity intensity, a corresponding restlessness index (such as 0.6) is obtained. The processor 20 may thus generate the static index and the restlessness index from the motion index, so that only one camera 50 is needed to obtain both.
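A minimal Python sketch of this frame-differencing approach follows, reproducing the 6-minutes-still-in-10 example from the text; the pixel-change scaling is an illustrative assumption.

    # Sketch: activity intensity from pixel changes across consecutive frames,
    # combined with a standing flag. The scaling constant is an assumption.
    import numpy as np

    def activity_from_frames(frames, scale: float = 20.0) -> float:
        """Mean absolute pixel change between consecutive frames, mapped to [0, 1]."""
        diffs = [np.abs(b.astype(float) - a.astype(float)).mean()
                 for a, b in zip(frames, frames[1:])]
        return min(1.0, float(np.mean(diffs)) / scale)

    def indexes_from_motion(standing: bool, intensity: float,
                            still_minutes: float, max_still: float = 10.0):
        """(static_index, restlessness_index); 10 min of stillness -> static index 1."""
        static_index = min(1.0, still_minutes / max_still) if standing else 0.0
        return static_index, intensity

    frames = [np.zeros((4, 4)), np.zeros((4, 4))]  # identical frames: no motion
    print(activity_from_frames(frames))            # 0.0
    print(indexes_from_motion(True, 0.0, 6.0))     # (0.6, 0.0), matching the text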
Referring to fig. 13, in some embodiments, the motion information includes ranging data and activity data, and step 013 further includes:
0136: detecting whether the pigsty parameters exist or not;
0137: initializing the pigsty parameters when the pigsty parameters do not exist; and
0138: initializing a motion model;
step 0134 includes:
0139: the motion video image is processed according to the motion model to generate a motion indicator.
Referring to fig. 14, in some embodiments, the second obtaining module 13 further includes a detecting unit 136, a first initialization unit 137, and a second initialization unit 138. The detecting unit 136 is used for detecting whether the pigsty parameters exist. The first initialization unit 137 is configured to initialize the pigsty parameters when they do not exist. The second initialization unit 138 is used to initialize the motion model. The second processing unit 134 is further configured to process the motion video image according to the motion model to generate a motion index. That is, step 0136 may be implemented by the detecting unit 136, step 0137 by the first initialization unit 137, step 0138 by the second initialization unit 138, and step 0139 by the second processing unit 134.
Referring again to fig. 3, in some embodiments, the processor 20 is further configured to: detecting whether the pigsty parameters exist or not; initializing the pigsty parameters when the pigsty parameters do not exist; initializing a motion model; and processing the motion video image according to the motion model to generate a motion index. That is, step 0136, step 0137, step 0138 and step 0139 may be implemented by processor 20.
Specifically, consider the first estrus check (the first check after the estrus detection system 100 is installed). As in the example shown in fig. 3, the camera 50 is mounted on the mounting bracket 60 at one end of the pigsty 70. To remove the influence of the pigsty 70 on the sow's motion video images, the image region occupied by the pigsty 70 can be removed. It will be appreciated that, because the sow is enclosed by the pigsty 70, the sow is substantially in the central region of the field of view of the camera 50, and the relative positions of the sow and the pigsty 70 in the image are substantially unchanged. The processor 20 can estimate the image region occupied by the pigsty 70 from consecutive frames of the sow's motion video and thereby initialize the pigsty parameters (including the region parameters of the pigsty 70 in the image). In subsequent processing, the region indicated by the pigsty parameters can be removed quickly, so that the motion video images contain only the sow, improving the accuracy of the subsequently calculated motion index. In checks after the first, the pigsty parameters have already been initialized and can be detected directly to remove the pigsty 70 from the motion video images. Moreover, because the relative positions of the sow and the pigsty 70 in the image are substantially unchanged, the pigsty parameters are substantially unchanged as well; they need to be initialized only once, at the first estrus check, rather than every time a motion video image is acquired.
In the example shown in fig. 15, the processor 20 first acquires a motion video image of the sow and then checks for the pigsty parameters: when they can be detected, the processor 20 initializes the motion model, and when they cannot, the processor 20 initializes the pigsty parameters. Generating the motion index requires the pixel-value changes across consecutive frames. After initialization, the motion model contains multiple consecutive frames of motion video; the processor 20 processes subsequent motion video images against this model, comparing the pixel values of the frames in the model with those of the current frame over a period of time (such as 1 minute or 2 minutes), and combines this with the standing state detected in each frame to generate the motion index.
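The following Python sketch illustrates one way the pigsty parameters could be initialized from near-constant pixels across consecutive frames and then used to mask the pen out of later frames; the static-pixel heuristic and data structures are illustrative assumptions, since the patent does not specify the estimation method.

    # Sketch: estimate the pen region as the pixels that barely change across
    # frames (assumes the pen is the static part of the scene), then mask it.
    import numpy as np

    def init_pigsty_params(frames):
        """Estimate the static pen region from near-constant pixels."""
        stack = np.stack([f.astype(float) for f in frames])
        static_mask = stack.std(axis=0) < 1.0       # near-constant pixels
        ys, xs = np.nonzero(static_mask)
        if len(ys) == 0:
            return None
        return (int(ys.min()), int(ys.max()), int(xs.min()), int(xs.max()))

    def mask_pigsty(frame, params):
        """Zero out the pen region so only the sow contributes to the motion index."""
        out = frame.astype(float).copy()
        y0, y1, x0, x1 = params
        out[y0:y1 + 1, x0:x1 + 1] = 0.0
        return out

    rng = np.random.default_rng(0)
    frames = [rng.integers(0, 256, (8, 8)).astype(float) for _ in range(5)]
    for f in frames:
        f[0:2, :] = 128.0                            # pen bars: two constant rows
    params = init_pigsty_params(frames)
    print(params)                                    # (0, 1, 0, 7): the pen rows
    print(mask_pigsty(frames[0], params)[:2].sum())  # 0.0 after masking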
Referring to fig. 16, in some embodiments, the motion information includes ranging data and activity data, and step 013 further includes:
0140: updating the motion model according to the motion video images every first predetermined time.
Referring to fig. 17, in some embodiments, the second obtaining module 13 further includes an updating unit 140. The updating unit 140 is configured to update the motion model according to the motion video image every first predetermined time. That is, step 0140 may be implemented by update unit 140.
Referring again to fig. 3, in some embodiments, the processor 20 is further configured to: and updating the motion model according to the motion video image every first preset time. That is, step 0140 may be implemented by processor 20.
Specifically, as shown in the example of fig. 15, the motion model may be updated from the motion video images captured within each first predetermined time T1 (e.g., 10, 20, or 30 minutes), so that when the motion index is computed, the capture times of the frames in the motion model and of the current frame do not differ too much. It will be understood that if the frames in the motion model were captured long before the current frame (for example, half a day earlier), factors such as the lighting will have changed significantly, so the pixel values of the model frames and the current frame will necessarily differ greatly, degrading the accuracy of the motion index.
Referring to fig. 18, in some embodiments, the motion information includes ranging data and activity data, and step 013 further includes:
0141: processing the motion video image according to a pig ear automatic detection and posture classification model based on a deep neural network to obtain the posture information of the pig ear;
0142: determining a standing index, a vertical ear index, and a restlessness index according to the posture information;
0143: determining a static index according to the standing index and the vertical ear index; and
0144: determining a restlessness index according to the posture-derived restlessness index.
Referring to fig. 19, in some embodiments, the second obtaining module 13 further includes a third processing unit 141, a first determining unit 142, a second determining unit 143, and a third determining unit 144. The third processing unit 141 is configured to process the motion video image according to the deep-neural-network-based pig ear automatic detection and posture classification model to obtain the posture information of the pig ears. The first determining unit 142 is configured to determine a standing index, a vertical ear index, and a restlessness index according to the posture information. The second determining unit 143 is configured to determine the static index according to the standing index and the vertical ear index. The third determining unit 144 is configured to determine the restlessness index according to the posture-derived restlessness index. That is, step 0141 may be implemented by the third processing unit 141, step 0142 by the first determining unit 142, step 0143 by the second determining unit 143, and step 0144 by the third determining unit 144.
Referring again to fig. 3, in some embodiments, the processor 20 is further configured to: process the motion video image according to the deep-neural-network-based pig ear automatic detection and posture classification model to obtain the posture information of the pig ears; determine a standing index, a vertical ear index, and a restlessness index according to the posture information; determine the static index according to the standing index and the vertical ear index; and determine the restlessness index according to the posture-derived restlessness index. That is, steps 0141, 0142, 0143, and 0144 may be implemented by the processor 20.
Specifically, in the example shown in fig. 20, the processor 20 processes the motion video images according to a deep-neural-network-based pig ear automatic detection and posture classification model to obtain the posture information of the pig's ears. The model can be generated from the posture information and corresponding motion information of previously detected sows. Because ear posture differs between breeds, a model can be generated for each breed from the detected sows of that breed, and the model can continuously update and learn new ear posture information from the posture information and motion video images of subsequently detected sows. In this way, a corresponding automatic detection and posture classification model is established for each breed, enabling accurate detection of the ear posture of sows of different breeds. The model identifies the posture information of the pig ears in the current motion video image. Corresponding standing, vertical ear, and restlessness indexes are preset for the different posture classes and are computed, for example, every 10 minutes. The processor 20 then generates the static index from the standing and vertical ear indexes over a particular time period (e.g., 30 minutes), and the restlessness index from the posture-derived restlessness index over the same period.
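As a concrete illustration, the Python sketch below turns per-frame ear-posture classes into preset index triples and aggregates them over a window; the class names, preset values, and mean aggregation are illustrative assumptions.

    # Sketch: per-posture preset indexes aggregated over a window. The class
    # names, preset values, and mean aggregation are illustrative assumptions.
    from statistics import mean

    # (standing_idx, vertical_ear_idx, restlessness_idx) preset per posture class
    POSTURE_PRESETS = {
        "ears_erect_standing":  (1.0, 1.0, 0.1),
        "ears_relaxed_lying":   (0.0, 0.0, 0.0),
        "ears_flapping_moving": (0.2, 0.3, 0.9),
    }

    def posture_window_indexes(posture_classes: list[str]):
        """Aggregate presets over a window (e.g., 30 min of 10-min samples)."""
        presets = [POSTURE_PRESETS[c] for c in posture_classes]
        standing = mean(p[0] for p in presets)
        vertical_ear = mean(p[1] for p in presets)
        restlessness = mean(p[2] for p in presets)
        static_index = (standing + vertical_ear) / 2   # combine the two cues
        return static_index, restlessness

    print(posture_window_indexes(["ears_erect_standing"] * 3))  # (1.0, 0.1)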
Referring to fig. 21, in some embodiments, the motion information includes ranging data, activity data, and motion video images; the static index includes a first static index, a second static index, and a third static index; the restlessness index includes a first restlessness index, a second restlessness index, and a third restlessness index; and step 013 includes:
0131: acquiring time domain characteristic values and frequency domain characteristic values of the ranging data and the activity data;
0132: generating sow state data according to the time domain characteristic values and the frequency domain characteristic values;
0145: generating a first static index and a first restlessness index according to the sow state data;
0134: processing the motion video images according to a sow action recognition algorithm to generate a motion index;
0146: generating a second static index and a second restlessness index according to the motion index;
0141: processing the motion video images according to a deep-neural-network-based pig ear automatic detection and posture classification model to obtain the posture information of the pig ears;
0142: determining a standing index, a vertical ear index, and a restlessness index according to the posture information;
0147: determining a third static index according to the standing index and the vertical ear index;
0148: determining a third restlessness index according to the posture-derived restlessness index;
0149: calculating the static index according to the first, second, and third static indexes; and
0150: calculating the restlessness index according to the first, second, and third restlessness indexes.
Referring to fig. 22, in some embodiments, the second obtaining module 13 further includes a first calculating unit 149 and a second calculating unit 150. The first obtaining unit 131 is configured to obtain the time domain characteristic values and frequency domain characteristic values of the ranging data and the activity data. The third generating unit 132 is configured to generate sow state data according to the time domain and frequency domain characteristic values. The fourth generating unit 133 is further configured to generate a first static index and a first restlessness index from the sow state data. The second processing unit 134 is configured to process the motion video images according to a sow action recognition algorithm to generate a motion index. The fifth generating unit 135 is configured to generate a second static index and a second restlessness index from the motion index. The third processing unit 141 is configured to process the motion video images according to the deep-neural-network-based pig ear automatic detection and posture classification model to obtain the posture information of the pig ears. The first determining unit 142 is configured to determine a standing index, a vertical ear index, and a restlessness index according to the posture information. The second determining unit 143 is further configured to determine a third static index according to the standing index and the vertical ear index. The third determining unit 144 is further configured to determine a third restlessness index according to the posture-derived restlessness index. The first calculating unit 149 is configured to calculate the static index according to the first, second, and third static indexes. The second calculating unit 150 is used to calculate the restlessness index according to the first, second, and third restlessness indexes.
That is, step 0131 may be implemented by the first obtaining unit 131, step 0132 by the third generating unit 132, step 0145 by the fourth generating unit 133, step 0134 by the second processing unit 134, step 0146 by the fifth generating unit 135, step 0141 by the third processing unit 141, step 0142 by the first determining unit 142, step 0147 by the second determining unit 143, step 0148 by the third determining unit 144, step 0149 by the first calculating unit 149, and step 0150 by the second calculating unit 150.
Referring again to fig. 3, in some embodiments, the estrus detection system 100 further comprises a ranging sensor 30, an activity sensor 40, and a camera 50. The ranging sensor 30 is configured to measure ranging data of the sow; the activity sensor 40 is used for collecting activity data of the sow; and the camera 50 is used for acquiring motion video images of the sow. The motion information includes the ranging data, the activity data, and the motion video images; the static index includes a first static index, a second static index, and a third static index; and the restlessness index includes a first restlessness index, a second restlessness index, and a third restlessness index. The processor 20 is further configured to: acquire the time domain characteristic values and frequency domain characteristic values of the ranging data and the activity data; generate sow state data according to the time domain and frequency domain characteristic values; generate a first static index and a first restlessness index according to the sow state data; process the motion video images according to a sow action recognition algorithm to generate a motion index; generate a second static index and a second restlessness index according to the motion index; process the motion video images according to the deep-neural-network-based pig ear automatic detection and posture classification model to obtain the posture information of the pig ears; determine a standing index, a vertical ear index, and a restlessness index according to the posture information; determine a third static index according to the standing index and the vertical ear index; determine a third restlessness index according to the posture-derived restlessness index; calculate the static index according to the first, second, and third static indexes; and calculate the restlessness index according to the first, second, and third restlessness indexes. That is, steps 0131, 0132, 0134, 0141, 0142, 0145, 0146, 0147, 0148, 0149, and 0150 may be implemented by the processor 20.
Specifically, as in the example shown in fig. 3, the processor 20 is disposed on the mounting bracket 60 and is coupled to the ranging sensor 30, the activity sensor 40, and the camera 50.
In the example shown in fig. 23, the processor 20 processes the ranging data and the activity data to obtain a time domain characteristic value and a frequency domain characteristic value of the ranging data, and a time domain characteristic value and a frequency domain characteristic value of the activity data. The processor 20 may determine whether the sow is in a standing state according to the time domain characteristic value and the frequency domain characteristic value of the ranging data. Processor 20 determines the intensity of sow activity based on the time domain feature values and frequency domain feature values of the activity data. The processor 20 can obtain sow status data including whether the sow is in a standing state and the activity intensity of the sow according to the time domain characteristic value and the frequency domain characteristic value of the ranging data and the time domain characteristic value and the frequency domain characteristic value of the activity data respectively. The processor 20 then generates a first quiet index and a second restless index based on the sow status data. The details of the specific method for generating the first stationary index and the first agitation index are similar to those described in detail above with respect to step 133, and are not described again here.
The processor 20 processes the moving video images according to a sow action recognition algorithm to recognize whether the sow is in a standing state. The activity intensity of the sow is determined by the change of the pixel values of a plurality of pixels of the continuous multi-frame motion video images. The processor 20 can determine the exercise index according to whether the sow stands and the activity intensity of the sow, and then generate a second standing index and a second agitation index according to the exercise index. The details of the specific method for generating the second stationary index and the second restless index are similar to those described in detail above with respect to step 135 and will not be repeated here.
The processor 20 identifies the posture information of the pig ears in the motion video images according to the pig ear automatic detection and posture classification model, and then determines the corresponding standing index, vertical ear index and restlessness index according to the posture information. The third static index is then determined according to the standing index and the vertical ear index, and the third agitation index is determined according to the restlessness index. The specific method for generating the third static index and the third agitation index is similar to that described in detail above with respect to step 0147 and step 0148 and is not repeated here.
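The combination rule is not spelled out at this point in the text, but a plausible reading of these two steps is a weighted blend, as in the hypothetical sketch below; the weights and the function name are assumptions.

    def third_indexes(standing_index, vertical_ear_index, restlessness_index,
                      w_standing=0.5, w_ear=0.5):
        # Assumed blend: an oestrous sow tends to stand rigidly with pricked
        # ears, so the third static index mixes the standing index and the
        # vertical ear index, while the third agitation index is taken
        # directly from the ear-derived restlessness index.
        third_static = w_standing * standing_index + w_ear * vertical_ear_index
        third_agitation = restlessness_index
        return third_static, third_agitation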
After obtaining the first static index, the second static index, the third static index, the first agitation index, the second agitation index and the third agitation index, the processor 20 calculates the static index according to the three static indexes, and calculates the agitation index according to the three agitation indexes.
In one example, the processor 20 calculates the average of the first static index, the second static index and the third static index as the static index, and the average of the first agitation index, the second agitation index and the third agitation index as the agitation index. In another example, the processor 20 assigns different weights to the three static indexes and calculates the static index as their weighted sum. For example, if the first, second and third static indexes are 0.8, 0.9 and 0.95, and their weights are 0.4, 0.3 and 0.3 respectively, the static index calculated from the indexes and the corresponding weights is 0.8 × 0.4 + 0.9 × 0.3 + 0.95 × 0.3 = 0.875. Similarly, the processor 20 assigns different weights to the three agitation indexes and calculates the agitation index accordingly. For example, if the first, second and third agitation indexes are 0.9, 0.8 and 0.9, and their weights are 0.4, 0.3 and 0.3 respectively, the agitation index is 0.9 × 0.4 + 0.8 × 0.3 + 0.9 × 0.3 = 0.87. In this way, the processor 20 obtains the first static index and first agitation index, the second static index and second agitation index, and the third static index and third agitation index through different algorithms, and fuses them into the static index and the agitation index, which improves the accuracy of the static index and the agitation index.
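The weighted fusion just described can be written compactly; the sketch below reproduces the two worked examples from the text (the function name is an assumption).

    def fuse_indexes(indexes, weights=None):
        # Weighted fusion of the three static (or agitation) indexes; with no
        # weights given this degenerates to the plain average.
        if weights is None:
            weights = [1.0 / len(indexes)] * len(indexes)
        return sum(i * w for i, w in zip(indexes, weights))

    static_index = fuse_indexes([0.8, 0.9, 0.95], [0.4, 0.3, 0.3])    # 0.875
    agitation_index = fuse_indexes([0.9, 0.8, 0.9], [0.4, 0.3, 0.3])  # 0.87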
Referring to fig. 24, in some embodiments, step 015 includes:
0151: processing the prediction index, the static index and the agitation index according to a fuzzy decision model to generate the estrus probability.
Referring to fig. 25, in some embodiments, the second generating module 15 includes a sixth generating unit 151. The sixth generating unit 151 is configured to process the prediction index, the static index and the agitation index according to a fuzzy decision model to generate the estrus probability. That is, step 0151 may be implemented by the sixth generating unit 151.
Referring again to fig. 3, in certain embodiments, the processor 20 is further configured to process the prediction index, the static index and the agitation index according to a fuzzy decision model to generate the estrus probability.
Specifically, as in the example shown in fig. 26, after the prediction index, the static index and the agitation index are obtained, they are input into the fuzzy decision model, and the processor 20 processes them according to the fuzzy decision model to generate the estrus probability. The fuzzy decision model stores mapping rules from different combinations of the prediction index, the static index and the agitation index to the estrus probability; that is, the corresponding estrus probability can be calculated for any combination of the three indexes. After obtaining the estrus probability, the processor 20 determines whether it is greater than or equal to a predetermined threshold (e.g., 0.8): when the estrus probability is greater than or equal to the predetermined threshold, the current sow is determined to be in estrus; when it is less than the predetermined threshold, the current sow is determined not to be in estrus.
In one example, the higher the prediction index, the static index and the agitation index are, the higher the estrus probability is. When all three indexes are high, a high estrus probability can be determined: for example, if the prediction index, the static index and the agitation index are 0.82, 0.94 and 0.85 respectively, and the fuzzy decision model yields an estrus probability of 0.86, the current sow is determined to be in estrus. A high estrus probability may also be determined when the prediction index is moderate but the static index and the agitation index are high: if the three indexes are 0.6, 0.8 and 0.95 respectively, and the model yields an estrus probability of 0.8, the current sow is likewise determined to be in estrus. Determining the estrus probability by jointly considering the prediction index, the static index and the agitation index therefore makes the estrus probability accurate and improves the accuracy of estrus detection.
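The mapping rules of the fuzzy decision model are not enumerated in this text, so the following toy sketch only illustrates the flavour of such a model: a "high" membership function and two rules suggested by the examples above. The membership shape, the rule discount and the function names are all assumptions.

    def high(x):
        # Assumed triangular membership for "high" on [0, 1].
        return max(0.0, min(1.0, (x - 0.5) / 0.4))

    def estrus_probability(prediction_index, static_index, agitation_index):
        # Two illustrative rules matching the examples in the text:
        #   rule 1: all three indexes high            -> estrus likely
        #   rule 2: static and agitation indexes high -> estrus likely, even
        #           with a merely moderate prediction index (discounted).
        p, s, a = (high(prediction_index), high(static_index),
                   high(agitation_index))
        rule1 = min(p, s, a)
        rule2 = 0.9 * min(s, a)
        return max(rule1, rule2)

    # estrus_probability(0.82, 0.94, 0.85) -> 0.8, at the example threshold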
Referring to fig. 27, in some embodiments, step 015 includes:
0152: calculating the oestrus probability of the sow to be detected once every second predetermined time.
Referring to fig. 28, in some embodiments, the second generation module 15 includes a third calculation unit 152. The third calculating unit 152 is configured to calculate the oestrus probability of the sow to be detected once every second predetermined time. That is, step 0152 may be implemented by the third calculation unit 152.
Referring again to fig. 3, in some embodiments, the processor 20 is also configured to calculate the oestrus probability of the sow to be detected once every second predetermined time.
Specifically, as shown in the example of fig. 26, the processor 20 may calculate the oestrus probability once every second predetermined time T2 to determine whether the sow is in oestrus. The second predetermined time T2 may be set according to the user's requirement and may be input manually. For example, if the user wants to check every 2 hours whether the sow is in oestrus, the second predetermined time T2 may be set to 2 hours, which reduces the computational load of oestrus detection. Alternatively, in order to check in real time whether the sow is in oestrus and to handle it promptly once it is, the user may set the second predetermined time T2 to 1 second.
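A minimal polling loop matching this behaviour might look as follows; the function names are assumptions, and a production system would schedule work rather than sleep in a loop.

    import time

    def run_estrus_monitor(compute_estrus_probability, t2_seconds=2 * 3600,
                           threshold=0.8):
        # Recompute the estrus probability once every second predetermined
        # time T2 and compare it with the predetermined threshold.
        while True:
            if compute_estrus_probability() >= threshold:
                print("estrus detected")
            time.sleep(t2_seconds)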
Referring to fig. 29, in some embodiments, the oestrus detection method further comprises:
017: sending out prompt information when the sow is in oestrus.
Referring to fig. 30, in some embodiments, the oestrus detection device 10 further comprises a prompting module 17. The prompting module 17 is configured to send out prompt information when the sow is in oestrus. That is, step 017 may be implemented by the prompting module 17.
Referring again to fig. 3, in some embodiments, the oestrus detection system 100 further comprises a prompter 80 configured to send out prompt information after the sow is in oestrus.
Specifically, when the processor 20 determines that the sow is in oestrus, the processor 20 may send an oestrus instruction to the prompter 80, and the prompter 80 sends out the corresponding prompt information according to the instruction. For example, the prompter 80 may be a buzzer that sounds after receiving the oestrus instruction. For another example, the prompter 80 may be a display screen disposed on the mounting bracket 60 that displays "estrus" after receiving the instruction, to prompt the user that the sow is currently in oestrus. For another example, as shown in fig. 31, the prompter 80 may be a mobile phone client in communication with the processor 20; after receiving the oestrus instruction (carrying the pigsty number, the sow number, oestrus information and the like), it pushes a notification such as "sow 002 in pigsty A2 is already in heat". In this way, the prompter 80 can prompt the user in time to breed a sow that is in oestrus.
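As a small illustration of the three prompter variants (an assumed function name and return values; the real prompter 80 is hardware or a push channel, not a string formatter):

    def prompt_message(prompter_kind, pigsty_no, sow_no):
        # Assumed dispatch over the three prompter variants in the text.
        if prompter_kind == "buzzer":
            return "BEEP"  # sound prompt
        if prompter_kind == "display":
            return "estrus"  # shown on the on-bracket screen
        if prompter_kind == "phone":
            return f"sow {sow_no} in pigsty {pigsty_no} is already in heat"
        raise ValueError(f"unknown prompter: {prompter_kind}")

    # prompt_message("phone", "A2", "002")
    # -> "sow 002 in pigsty A2 is already in heat"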
Referring to fig. 32, the oestrus detection system 100 may further comprise a server 90. The processor 20 may be disposed in the server 90, and the server 90 is communicatively connected to the ranging sensor 30, the activity sensor 40, the camera 50 and the prompter 80. In this case, the processor 20 of the server 90 may perform all the functions of any of the above embodiments: the data collected by the ranging sensor 30, the activity sensor 40 and the camera 50 are processed by the processor 20 of the server 90, and after the processing is completed, the processor 20 may control the prompter 80 to issue the prompt information according to the processing result. In other embodiments, a plurality of (e.g., two) processors 20 are provided, on the mounting bracket 60 and in the server 90 respectively, so that the data collected by the sensors and the camera can be processed partly by the processor 20 on the mounting bracket 60 and partly by the processor 20 in the server 90, which increases the processing speed.
According to the oestrus detection method, the oestrus detection device 10 and the oestrus detection system 100 described above, the prediction index is generated from the current time and the predicted oestrus time, the static index and the agitation index are generated from the motion information of the sow, and the oestrus probability is generated by jointly considering the prediction index, the static index and the agitation index. When the oestrus probability is greater than or equal to the predetermined threshold, the sow is determined to be in oestrus, so that experienced staff are no longer required to check oestrus manually, which saves labour cost and enables automatic oestrus checking to improve oestrus-checking efficiency. The method can detect oestrus in real time, promptly find out whether a sow is in oestrus and issue a prompt, preventing the sow's oestrus period from being missed and thereby improving production efficiency.
Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application, and that variations, modifications, substitutions and alterations of the above embodiments may be made by those of ordinary skill in the art within the scope of the present application, which is defined by the claims and their equivalents.

Claims (27)

1. An estrus detection method, characterized in that the estrus detection method comprises:
acquiring information of a sow to be detected to determine the predicted oestrus time of the sow, wherein the information of the sow comprises breed, weight, age in days and current physiological stage;
generating a prediction index according to the current time and the predicted estrus time;
acquiring motion information of the sow to generate a static index and a restlessness index of the sow;
generating an oestrus probability according to the prediction index, the static index and the agitation index; and
determining that the sow has estrus when the probability of estrus is greater than or equal to a predetermined threshold.
2. The oestrus detection method of claim 1 wherein the obtaining of information about a sow to be detected to determine a predicted oestrus time of the sow comprises:
generating an estrus time prediction model according to estrus information of a plurality of sows that have been in estrus; and
processing the information of the sow to be detected according to the estrus time prediction model to determine the predicted estrus time.
3. The method of claim 1, wherein generating a prediction index based on the current time and the predicted estrus time comprises: and generating the prediction index according to the time difference between the current time and the predicted estrus time.
4. The estrus detection method of claim 1 wherein the athletic information includes range data and activity data, and the obtaining the athletic information of the sow to generate a quiet index and a restless index of the sow comprises:
acquiring time domain characteristic values and frequency domain characteristic values of the ranging data and the activity data;
generating sow state data according to the time domain characteristic values and the frequency domain characteristic values; and
generating the static index and the agitation index according to the sow state data.
5. The estrus detection method of claim 1 wherein the motion information comprises motion video images, and the obtaining motion information of the sow to generate a quiet index and a restless index of the sow comprises:
processing the motion video image according to a sow motion recognition algorithm to generate a motion index; and
generating the static index and the agitation index according to the motion index.
6. The estrus detection method of claim 5 wherein said obtaining athletic information of said sow to generate a quiet index and a restless index of said sow further comprises:
detecting whether pigsty parameters exist;
initializing the pigsty parameters when the pigsty parameters do not exist;
initializing a motion model;
the processing the motion video image according to the sow motion recognition algorithm to generate a motion index comprises:
processing the motion video image according to the motion model to generate the motion index.
7. The estrus detection method according to claim 6 wherein said obtaining athletic information of said sow to generate a quiet index and a restless index of said sow further comprises:
updating the motion model according to the motion video image once every first predetermined time.
8. The estrus detection method of claim 1 wherein the motion information comprises motion video images, and the obtaining motion information of the sow to generate a quiet index and a restless index of the sow comprises:
processing the motion video image according to a pig ear automatic detection and posture classification model based on a deep neural network to obtain the posture information of the pig ear;
determining a standing index, a vertical ear index and a restlessness index according to the posture information;
determining the static index according to the standing index and the vertical ear index; and
determining the agitation index according to the restlessness index.
9. The estrus detection method according to claim 1, wherein the motion information includes ranging data, activity data and motion video images, the static index includes a first static index, a second static index and a third static index, the agitation index includes a first agitation index, a second agitation index and a third agitation index, and the obtaining the motion information of the sow to generate the static index and the agitation index of the sow comprises:
acquiring time domain characteristic values and frequency domain characteristic values of the ranging data and the activity data;
generating sow state data according to the time domain characteristic values and the frequency domain characteristic values;
generating the first static index and the first agitation index according to the sow state data;
processing the motion video image according to a sow motion recognition algorithm to generate a motion index;
calculating the second static index and the second agitation index according to the motion index;
processing the motion video image according to a pig ear automatic detection and posture classification model based on a deep neural network to obtain the posture information of the pig ear;
determining a standing index, a vertical ear index and a restlessness index according to the posture information;
determining the third static index according to the standing index and the vertical ear index;
determining the third agitation index according to the restlessness index;
calculating the static index according to the first static index, the second static index and the third static index; and
calculating the agitation index according to the first agitation index, the second agitation index and the third agitation index.
10. The oestrus detection method of claim 8 or 9 wherein the pig ear automatic detection and posture classification model is generated from the posture information and the corresponding motion information of a plurality of detected sows.
11. The method according to claim 1, wherein the generating an estrus probability from the prediction index, the static index, and the agitation index comprises:
processing the prediction index, the static index, and the agitation index according to a fuzzy decision model to generate the estrus probability.
12. The method of claim 1, wherein generating an estrus probability based on the prediction index, the static index, and the agitation index further comprises: calculating the oestrus probability of the sow to be detected once every second predetermined time.
13. The estrus detection method according to claim 1, further comprising: sending out prompt information when the sow is in estrus.
14. An estrus detection device, characterized in that said estrus detection device comprises:
the first acquisition module is used for acquiring information of a sow to be detected so as to determine the predicted oestrus time of the sow, wherein the information of the sow comprises breed, weight, age in days and current physiological stage;
the first generation module is used for generating a prediction index according to the current time and the predicted estrus time;
the second acquisition module is used for acquiring the motion information of the sow to generate a static index and a restless index of the sow;
the second generation module is used for generating an oestrus probability according to the prediction index, the static index and the agitation index; and
a determination module for determining that the sow has estrus when the estrus probability is greater than or equal to a predetermined threshold.
15. An estrus detection system, comprising a processor configured to:
acquiring information of a sow to be detected to determine the predicted oestrus time of the sow, wherein the information of the sow comprises breed, weight, age in days and current physiological stage;
generating a prediction index according to the current time and the predicted estrus time;
acquiring motion information of the sow to generate a static index and a restlessness index of the sow;
generating an oestrus probability according to the prediction index, the static index and the agitation index; and
determining that the sow has estrus when the probability of estrus is greater than or equal to a predetermined threshold.
16. The oestrus detection system of claim 15 wherein the processor is further configured to: generating an estrus time prediction model according to estrus information of a plurality of sows that have been in estrus; and processing the information of the sow to be detected according to the estrus time prediction model to determine the predicted estrus time.
17. The oestrus detection system of claim 15 wherein the processor is further configured to: generating the prediction index according to the time difference between the current time and the predicted estrus time.
18. The estrus detection system according to claim 15 further comprising a distance sensor and an activity sensor, said distance sensor for measuring distance data of said sow from the ground; the activity sensor is used for acquiring activity data of the sow; the motion information includes the ranging data and the activity data, the processor is further configured to: acquiring time domain characteristic values and frequency domain characteristic values of the ranging data and the activity data; generating sow state data according to the time domain characteristic values and the frequency domain characteristic values; and generating the static index and the agitation index according to the sow state data.
19. The estrus detection system of claim 15 further comprising a camera for capturing video images of the sow's motion; the motion information comprises the motion video image, the processor is further configured to: processing the motion video image according to a sow motion recognition algorithm to generate a motion index; and generating the static index and the agitation index according to the motion index.
20. The oestrus detection system of claim 19 wherein the processor is further configured to: detecting whether pigsty parameters exist; initializing the pigsty parameters when the pigsty parameters do not exist; initializing a motion model; and processing the motion video image according to the motion model to generate the motion index.
21. The oestrus detection system of claim 20 wherein the processor is further configured to: updating the motion model according to the motion video image once every first predetermined time.
22. The estrus detection system of claim 15 further comprising a camera for capturing video images of the sow's motion; the motion information comprises the motion video image, the processor is further configured to: processing the motion video image according to a pig ear automatic detection and posture classification model based on a deep neural network to obtain the posture information of the pig ear; determining a standing index, a vertical ear index and a restlessness index according to the posture information; determining the static index according to the standing index and the vertical ear index; and determining the agitation index according to the restlessness index.
23. The estrus detection system of claim 15 further comprising a distance sensor for measuring distance data of said sow from the ground, an activity sensor and a camera; the activity sensor is used for acquiring activity data of the sow; the camera is used for collecting a motion video image of the sow; the motion information comprises the ranging data, the activity data and the motion video image, the static index comprises a first static index, a second static index and a third static index, the agitation index comprises a first agitation index, a second agitation index and a third agitation index, and the processor is further used for: acquiring time domain characteristic values and frequency domain characteristic values of the ranging data and the activity data; generating sow state data according to the time domain characteristic values and the frequency domain characteristic values; generating the first static index and the first agitation index according to the sow state data; processing the motion video image according to a sow motion recognition algorithm to generate a motion index; calculating the second static index and the second agitation index according to the motion index; processing the motion video image according to a pig ear automatic detection and posture classification model based on a deep neural network to obtain the posture information of the pig ear; determining a standing index, a vertical ear index and a restlessness index according to the posture information; determining the third static index according to the standing index and the vertical ear index; determining the third agitation index according to the restlessness index; calculating the static index according to the first static index, the second static index and the third static index; and calculating the agitation index according to the first agitation index, the second agitation index and the third agitation index.
24. The oestrus detection system of claim 22 or 23 wherein the pig ear automatic detection and posture classification model is generated from the posture information and corresponding motion information of a plurality of detected sows.
25. The oestrus detection system of claim 15 wherein the processor is further configured to: processing the prediction index, the static index, and the agitation index according to a fuzzy decision model to generate the estrus probability.
26. The oestrus detection system of claim 15 wherein the processor is further configured to: calculating the oestrus probability of the sow to be detected once every second predetermined time.
27. The estrus detection system according to claim 15 further comprising a prompter for sending out prompt information after the sow is in estrus.
CN201910317979.8A 2019-04-19 2019-04-19 Estrus detection method, estrus detection device and estrus detection system Active CN109984054B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910317979.8A CN109984054B (en) 2019-04-19 2019-04-19 Estrus detection method, estrus detection device and estrus detection system


Publications (2)

Publication Number Publication Date
CN109984054A CN109984054A (en) 2019-07-09
CN109984054B (en) 2021-07-20





Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant