CN114403044A - Sow oestrus searching method and oestrus searching robot - Google Patents

Sow oestrus searching method and oestrus searching robot

Info

Publication number
CN114403044A
CN114403044A CN202210023753.9A
Authority
CN
China
Prior art keywords
sow
oestrus
image
chassis
controller
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210023753.9A
Other languages
Chinese (zh)
Inventor
高学宇
谢捷斌
李仲超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xiamen Zhitong Technology Co ltd
Original Assignee
Xiamen Zhitong Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xiamen Zhitong Technology Co ltd filed Critical Xiamen Zhitong Technology Co ltd
Priority to CN202210023753.9A priority Critical patent/CN114403044A/en
Publication of CN114403044A publication Critical patent/CN114403044A/en
Pending legal-status Critical Current

Classifications

    • A - HUMAN NECESSITIES
    • A01 - AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01K - ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K29/00 - Other apparatus for animal husbandry
    • A01K29/005 - Monitoring or measuring activity, e.g. detecting heat or mating
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/24 - Classification techniques
    • G06F18/241 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2411 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 - Machine learning
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/04 - Architecture, e.g. interconnection topology
    • G06N3/045 - Combinations of networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Environmental Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Biophysics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Biomedical Technology (AREA)
  • Health & Medical Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Molecular Biology (AREA)
  • Computational Linguistics (AREA)
  • Evolutionary Biology (AREA)
  • Animal Husbandry (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a sow oestrus checking method and an oestrus-checking robot. The method uses a sow oestrus-checking robot that comprises an image collector and a body temperature measurer, and comprises the following steps: collecting a sow image through the image collector, feeding the collected image into a pre-trained convolutional neural network for convolution, and identifying the image of the sow's oestrus characteristic part; directing the body temperature measurer to the position of the identified oestrus characteristic part and acquiring the temperature of that part through the body temperature measurer; predicting and identifying the key oestrus state of the sow by regression analysis of the oestrus characteristic part image to obtain the key oestrus state information of the sow; and normalizing the characteristic part temperature information and the key oestrus state information and feeding them into a classifier for classification, so as to identify the oestrus state of the sow. The invention achieves automatic sow oestrus checking and determines whether the sow is in oestrus.

Description

Sow oestrus searching method and oestrus searching robot
Technical Field
The invention relates to the technical field of sow oestrus checking methods and devices, and in particular to a sow oestrus checking method and a sow oestrus-checking robot.
Background
An oestrus cycle of a sow generally comprises an early (pro-oestrus) period, an oestrus period, a late (post-oestrus) period and a resting period. Checking and mastering the oestrus condition of each sow in a timely manner during breeding ensures that insemination is completed within a reasonable window, which effectively improves conception and farrowing efficiency and maximizes the profitability of pig raising.
At present, sow oestrus checking relies mainly on manual judgment based on experience. In large-scale breeding, many sows are usually kept in one pigsty and a stockperson must check them one by one, so mastering the oestrus condition of every sow in the pigsty involves a heavy and difficult workload.
In addition, if the sows are artificially stimulated (oestrus induction) before checking, oestrus can be promoted more effectively and the oestrus condition can be judged more reliably during checking.
Disclosure of Invention
The invention aims to provide a sow oestrus checking method and a sow oestrus-checking robot that overcome the above defects and achieve automatic oestrus checking to determine whether a sow is in oestrus.
In order to achieve the above purpose, the solution of the invention is as follows. A sow oestrus checking method uses a sow oestrus-checking robot; the robot comprises an oestrus-checking device, and the oestrus-checking device comprises an image collector and a body temperature measurer. The method comprises the following steps:
s1, collecting the sow image through the image collector, introducing the collected sow image into a pre-trained convolutional neural network for convolution operation, and identifying the oestrus characteristic part image of the sow;
s2, enabling the body temperature measurer to locate the position of the oestrus characteristic part image, and acquiring oestrus characteristic part temperature information of the sow through the body temperature measurer;
s3, predicting and identifying the key oestrus state of the sow by regression analysis on the oestrus feature position image of the sow to obtain the key oestrus state information of the sow;
and S4, carrying out normalized data processing on the characteristic part temperature information and the oestrus key state information of the sow, and introducing a classifier for classification and identification so as to identify the oestrus state of the sow.
Further, the oestrus characteristic part image is an anus and/or vulva and/or ear image of the sow, and the oestrus characteristic part temperature information is anus and/or vulva and/or ear temperature of the sow.
Further, the key oestrus state is vulvar color and/or vulvar morphology and/or binaural posture of the sow when oestrus occurs.
Further, the classifier is based on a machine learning algorithm.
Further, the sow oestrus-checking robot also comprises an oestrus-inducing device, wherein the oestrus-inducing device comprises a biomimetic actuating mechanism that simulates at least one inducing characteristic of a boar;
before step S1 is started, at least one inducing characteristic of the boar is first simulated by the biomimetic actuating mechanism so as to induce the sow to oestrus.
Further, the inducing characteristic is at least one of a boar odour characteristic and a boar sound characteristic.
Further, the sow oestrus-checking robot also comprises a walking mechanism. The walking mechanism comprises a chassis, a controller, a navigation obstacle-avoidance sensor and a walking driving mechanism; the controller is arranged on the chassis and stores map information; the controller is in communication connection with the navigation obstacle-avoidance sensor and the walking driving mechanism; and the oestrus-checking device and the oestrus-inducing device are mounted on the chassis;
before the sow is induced to oestrus, the navigation obstacle-avoidance sensor acquires the current position of the walking mechanism and the terrain information at that position and transmits them to the controller; according to this position and terrain information and the stored map information, the controller controls the walking driving mechanism to drive the chassis, carrying the oestrus-checking device and the oestrus-inducing device, to a preset pigsty position of the sow to be checked.
Further, after the chassis has carried the oestrus-checking device and the oestrus-inducing device to the preset pigsty position of the sow to be checked, the navigation obstacle-avoidance sensor collects the position information of that sow, and the controller controls the walking driving mechanism so that the chassis moves the oestrus-checking device and the oestrus-inducing device to a preset relative position with respect to the sow.
The invention also provides a sow oestrus-checking robot comprising a walking mechanism and an oestrus-checking device. The walking mechanism comprises a chassis and, arranged on the chassis, a controller, a navigation obstacle-avoidance sensor and a walking driving mechanism; the oestrus-checking device is mounted on the chassis; the controller is in communication connection with the navigation obstacle-avoidance sensor, the walking driving mechanism and the oestrus-checking device; and the oestrus-checking device is at least one of an image collector, a body temperature measurer and a radio.
Further, an oestrus-inducing device is also mounted on the chassis; the oestrus-inducing device is in communication connection with the controller and comprises a biomimetic actuating mechanism that simulates at least one inducing characteristic of a boar.
After adopting the above scheme, the invention has the following beneficial effects:
(1) a sow image is collected through the image collector, fed into a pre-trained convolutional neural network for convolution, and the image of the sow's oestrus characteristic part is identified, so that automatic identification of the oestrus characteristic part is achieved using the image collector and the convolutional neural network;
(2) the body temperature measurer is directed to the position of the identified oestrus characteristic part and collects its temperature, so that the measurer is automatically positioned at the characteristic part of the sow to be checked and the temperature of that part is measured automatically;
(3) the key oestrus state of the sow is predicted and identified by regression analysis of the oestrus characteristic part image, so that the acquired image is automatically compared with the expected appearance of the characteristic part at oestrus and a preliminary estimate of the oestrus state is obtained;
(4) the characteristic part temperature information and the key oestrus state information are normalized and fed into a classifier for classification, so that the characteristic part temperature and the key oestrus state are combined automatically to give a comprehensive, more reasonable and more accurate judgment of the sow's oestrus condition.
Drawings
FIG. 1 is a schematic perspective view of the oestrus-checking robot of the present invention;
FIG. 2 is a schematic sectional view of the oestrus-checking robot, seen from below.
Description of reference numerals: 1, walking mechanism; 2, oestrus-inducing device; 3, chassis; 4, driving motor; 5, navigation obstacle-avoidance sensor; 6, walking driving mechanism; 7, driving wheel; 8, oestrus-checking device; 9, laser radar; 10, 3D camera.
Detailed Description
The invention is described in detail below with reference to the accompanying drawings and specific embodiments.
The invention provides a sow oestrus-checking robot which, as shown in Figures 1 and 2, comprises a walking mechanism 1, an oestrus-inducing device 2 and an oestrus-checking device 8. The walking mechanism 1 comprises a chassis 3 and, arranged on the chassis 3, a controller, a navigation obstacle-avoidance sensor 5 and a walking driving mechanism 6. The controller is in communication connection with the navigation obstacle-avoidance sensor 5 and the walking driving mechanism 6 and stores map information. The navigation obstacle-avoidance sensor 5 is at least one of a laser radar 9 and a 3D camera 10 and faces the travelling direction of the walking mechanism 1; in this embodiment it comprises both the laser radar 9 and the 3D camera 10. The sensor 5 acquires the current position of the walking mechanism 1 and the terrain information at that position, which the controller processes into control signals for the walking driving mechanism 6 so that it performs the corresponding actions; the specific process is described in detail later. The walking driving mechanism 6 comprises four driving wheels 7, rotatably mounted at the four corners of the bottom of the chassis 3, and four driving motors 4 mounted on the chassis 3. The four driving motors 4 are each electrically connected to the controller, and each driving motor 4 drives one driving wheel 7; the form of the drive coupling between motor and wheel is not limited. When turning is needed, the controller rotates the four driving wheels 7 at different speeds (differential drive) to turn the chassis 3; when travelling straight, the controller rotates them synchronously. Compared with steering by means of steered wheels, the turning radius in a narrow pigsty aisle is smaller and turning is more agile.
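As an illustration of the differential steering described above, the following Python sketch computes per-side wheel speeds for a skid-steer chassis from commanded linear and angular velocities. The function name, track width and units are assumptions for illustration only; the patent does not specify the control law.

```python
# Illustrative sketch only: per-side wheel speeds for a four-wheel skid-steer
# chassis. Track width, units and the interface are assumptions.
def wheel_speeds(linear_mps: float, angular_radps: float, track_width_m: float = 0.5):
    """Return (left, right) wheel speeds in m/s.

    Turning comes from a speed difference between the left and right wheel
    pairs; straight travel uses equal speeds on all four wheels.
    """
    left = linear_mps - angular_radps * track_width_m / 2.0
    right = linear_mps + angular_radps * track_width_m / 2.0
    return left, right


if __name__ == "__main__":
    print(wheel_speeds(0.3, 0.0))   # straight ahead: (0.3, 0.3)
    print(wheel_speeds(0.0, 1.0))   # turn in place: (-0.25, 0.25), zero turning radius
```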
the estrus inducing device 2 is installed on the chassis 3, the estrus inducing device 2 is in communication connection with the controller, the controller controls the estrus inducing device 2 to act, the estrus inducing device 2 comprises a bionic execution mechanism which at least simulates one characteristic of a boar, the bionic execution mechanism is at least one of a smell device, a sound device and a vision device, specifically, the bionic execution mechanism is at least one of a sprayer, a loudspeaker and a display to respectively simulate the smell, the sound and the appearance of the boar, the bionic execution mechanism in the embodiment comprises a sprayer and a loudspeaker, gas with boar hormone or smell is sprayed out through the sprayer, the sound of the boar is played through the loudspeaker, and the sow is induced to estrus;
the estrus detecting device 8 is mounted on the chassis 3, the estrus detecting device 8 is in communication connection with the controller and is used for detecting the oestrus state of the sow, the estrus detecting device 8 is at least one of an image collector, a body temperature measurer and a radio, in the embodiment, the estrus detecting device 8 comprises an image collector and a body temperature measurer, the image collector is used for collecting characteristic part images of the sow, the body temperature measurer is used for measuring the temperature of the characteristic part of the sow, the body temperature measurer is an infrared thermometer, the controller is used for judging whether the sow is oestrous or not according to the characteristic part images of the sow and the temperature of the characteristic part of the sow, and the specific judging process is described in detail later.
The invention provides a sow oestrus checking method which comprises the following steps:
Before oestrus checking, the navigation obstacle-avoidance sensor 5 first acquires the current position of the walking mechanism 1 in the pigsty and the terrain information at that position and transmits them to the controller. According to this position and terrain information and the map information stored in the controller, the controller controls the walking driving mechanism 6 to drive the chassis 3, carrying the oestrus-checking device 8 and the oestrus-inducing device 2, to the preset pigsty position of the sow to be checked. The navigation obstacle-avoidance sensor 5 then collects the position information of that sow, and the controller controls the walking driving mechanism 6 so that the chassis 3 moves the oestrus-checking device 8 and the oestrus-inducing device 2 to a preset relative position with respect to the sow; this compensates for the sow's actual current pose and ensures that the subsequent inducing and checking proceed smoothly.
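The pose compensation mentioned above can be pictured with a small geometric sketch: given the sow's detected position and heading in the map frame, the robot's goal pose is placed at a fixed stand-off behind the sow. The stand-off distance, frame conventions and function name are hypothetical; the patent only states that the robot moves to a preset relative position.

```python
# Illustrative sketch only: place the robot at a fixed stand-off behind the sow,
# facing her rear, so the checking/inducing devices see the feature parts.
import math

def goal_pose(sow_x: float, sow_y: float, sow_heading_rad: float,
              standoff_m: float = 0.6):
    """Return (x, y, heading) for the robot in the same map frame as the sow."""
    gx = sow_x - standoff_m * math.cos(sow_heading_rad)
    gy = sow_y - standoff_m * math.sin(sow_heading_rad)
    return gx, gy, sow_heading_rad

if __name__ == "__main__":
    # Sow at (4.2, 1.5) facing +y: the robot waits 0.6 m behind her, also facing +y.
    print(goal_pose(4.2, 1.5, math.pi / 2))
```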
Oestrus is then induced. In this embodiment, gas carrying boar pheromone or boar odour is sprayed by the sprayer of the biomimetic actuating mechanism and boar sounds are played by the loudspeaker, which induces the sow to oestrus. During inducing, the controller identifies the pig currently being stimulated from the current position and the map information stored in the controller, so that the type, duration and other parameters of the inducing stimuli can be set accurately for sows of different breeds and types, and the inducing situation of each pig is recorded.
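A minimal sketch of how inducing stimuli might be selected per sow category and how each inducing session might be recorded is given below. The categories, durations, record format and file name are assumptions; the embodiment only states that stimulus type and duration are set per pig type and that the inducing situation is recorded.

```python
# Illustrative sketch only: pick inducing stimuli per sow category and log the
# session. Categories, durations and the record format are hypothetical.
from dataclasses import dataclass, asdict
import json
import time

@dataclass
class InduceParams:
    scent_seconds: float   # how long to spray boar scent
    sound_clip: str        # which boar sound to play
    sound_seconds: float   # how long to play it

PARAMS = {
    "gilt":       InduceParams(5.0, "boar_grunt.wav", 45.0),
    "weaned_sow": InduceParams(3.0, "boar_grunt.wav", 30.0),
}

def induce_and_log(pen_id: str, sow_category: str, log_path: str = "induce_log.jsonl"):
    p = PARAMS.get(sow_category, PARAMS["weaned_sow"])
    # ... drive the sprayer and loudspeaker with p here ...
    record = {"pen": pen_id, "category": sow_category, "time": time.time(), **asdict(p)}
    with open(log_path, "a") as f:
        f.write(json.dumps(record) + "\n")
    return record

if __name__ == "__main__":
    print(induce_and_log("pen_07", "gilt"))
```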
Oestrus checking is then carried out; the checking process comprises the following steps:
s1, acquiring images of sows (induced pigs) through the image acquisition device, introducing the acquired images of the sows into a pre-trained convolutional neural network for convolution operation, and identifying oestrus characteristic part images of the sows, wherein the oestrus characteristic part images are anus and/or vulva and/or ear images of the sows;
s2, positioning the position of the oestrus characteristic part image by the body temperature measurer, and acquiring the temperature information of the oestrus characteristic part of the sow by the body temperature measurer, wherein the temperature information of the oestrus characteristic part is the temperature of the anus and/or the vulva and/or the ears of the sow;
s3, predicting and identifying the key oestrus state of the sow by regression analysis on the oestrus feature position image of the sow, wherein the key oestrus state is the vulva color and/or vulva shape and/or binaural posture of the sow during oestrus, namely, the consistency degree of the currently acquired oestrus feature position image and the feature position image of the sow during oestrus is judged to obtain the key oestrus state information of the sow;
s4, carrying out normalized data processing on the feature part temperature information and the oestrus key state information of the sow, and introducing a classifier for classification and identification, wherein the classifier is a classifier based on a machine learning algorithm, an SVM classifier is preferably adopted in the embodiment, and the classifier can be other classifiers as well and is not particularly limited so as to identify the oestrus state of the sow;
Whether the oestrus-checking device 8 identifies the current pig as being in oestrus or as not being in oestrus, the controller records the checking result for that pig and controls the walking mechanism 1 to move to the next pig to be checked, until the inducing and checking of all sows to be checked in the pigsty are completed.
The above is only a preferred embodiment of the present invention and is not intended to limit its design; all equivalent changes made according to the key design points of the present invention fall within the protection scope of the present invention.

Claims (10)

1. A sow oestrus checking method, characterized in that the method uses a sow oestrus-checking robot, the sow oestrus-checking robot comprises an oestrus-checking device (8), the oestrus-checking device (8) comprises an image collector and a body temperature measurer, and the method comprises the following steps:
s1, collecting the sow image through the image collector, introducing the collected sow image into a pre-trained convolutional neural network for convolution operation, and identifying the oestrus characteristic part image of the sow;
s2, enabling the body temperature measurer to locate the position of the oestrus characteristic part image, and acquiring oestrus characteristic part temperature information of the sow through the body temperature measurer;
s3, predicting and identifying the key oestrus state of the sow by regression analysis on the oestrus feature position image of the sow to obtain the key oestrus state information of the sow;
and S4, carrying out normalized data processing on the characteristic part temperature information and the oestrus key state information of the sow, and introducing a classifier for classification and identification so as to identify the oestrus state of the sow.
2. The sow oestrus detecting method as claimed in claim 1, wherein: the oestrus characteristic part image is an anus and/or vulva and/or ear image of the sow, and the oestrus characteristic part temperature information is the anus and/or vulva and/or ear temperature of the sow.
3. The sow oestrus detecting method as claimed in claim 1, wherein: the key oestrus state is vulvar color and/or vulvar morphology and/or binaural posture of the sow when oestrus occurs.
4. The sow oestrus detecting method as claimed in claim 1, wherein: the classifier is based on a machine learning algorithm.
5. The sow oestrus detecting method as claimed in claim 1, wherein: the sow oestrus-checking robot further comprises an oestrus-inducing device (2), and the oestrus-inducing device (2) comprises a biomimetic actuating mechanism that simulates at least one inducing characteristic of a boar;
before step S1 is started, at least one inducing characteristic of the boar is first simulated by the biomimetic actuating mechanism so as to induce the sow to oestrus.
6. The sow oestrus detecting method as claimed in claim 5, wherein: the inducing characteristic is at least one of a boar odour characteristic and a boar sound characteristic.
7. The sow oestrus detecting method as claimed in claim 5, wherein: the sow oestrus-checking robot further comprises a walking mechanism (1); the walking mechanism (1) comprises a chassis (3), a controller, a navigation obstacle-avoidance sensor (5) and a walking driving mechanism (6); the controller is arranged on the chassis (3) and stores map information; the controller is in communication connection with the navigation obstacle-avoidance sensor (5) and the walking driving mechanism (6); and the oestrus-checking device (8) and the oestrus-inducing device (2) are mounted on the chassis (3);
before the sow is induced to oestrus, the navigation obstacle-avoidance sensor (5) acquires the current position of the walking mechanism (1) and the terrain information at that position and transmits them to the controller, and according to this position and terrain information and the stored map information, the controller controls the walking driving mechanism (6) to drive the chassis (3), carrying the oestrus-checking device (8) and the oestrus-inducing device (2), to a preset pigsty position of the sow to be checked.
8. The sow oestrus detecting method as claimed in claim 7, wherein: after the chassis (3) has carried the oestrus-checking device (8) and the oestrus-inducing device (2) to the preset pigsty position of the sow to be checked, the navigation obstacle-avoidance sensor (5) collects the position information of that sow, so that the controller controls the walking driving mechanism (6) to drive the chassis (3), carrying the oestrus-checking device (8) and the oestrus-inducing device (2), to a preset relative position with respect to the sow.
9. A sow oestrus-checking robot, characterized by comprising a walking mechanism (1) and an oestrus-checking device (8), wherein the walking mechanism (1) comprises a chassis (3) and, arranged on the chassis (3), a controller, a navigation obstacle-avoidance sensor (5) and a walking driving mechanism (6); the oestrus-checking device (8) is mounted on the chassis (3); the controller is in communication connection with the navigation obstacle-avoidance sensor (5), the walking driving mechanism (6) and the oestrus-checking device (8); and the oestrus-checking device (8) is at least one of an image collector, a body temperature measurer and a radio.
10. The sow oestrus-checking robot as claimed in claim 9, wherein: an oestrus-inducing device (2) is further mounted on the chassis (3), the oestrus-inducing device (2) is in communication connection with the controller, and the oestrus-inducing device (2) comprises a biomimetic actuating mechanism that simulates at least one inducing characteristic of a boar.
CN202210023753.9A 2022-01-10 2022-01-10 Sow oestrus searching method and oestrus searching robot Pending CN114403044A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210023753.9A CN114403044A (en) 2022-01-10 2022-01-10 Sow oestrus searching method and oestrus searching robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210023753.9A CN114403044A (en) 2022-01-10 2022-01-10 Sow oestrus searching method and oestrus searching robot

Publications (1)

Publication Number Publication Date
CN114403044A true CN114403044A (en) 2022-04-29

Family

ID=81271030

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210023753.9A Pending CN114403044A (en) 2022-01-10 2022-01-10 Sow oestrus searching method and oestrus searching robot

Country Status (1)

Country Link
CN (1) CN114403044A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115119766A (en) * 2022-06-16 2022-09-30 天津农学院 Sow oestrus detection method based on deep learning and infrared thermal imaging
CN115943908A (en) * 2022-12-05 2023-04-11 中国农业科学院北京畜牧兽医研究所 Sow oestrus detection method based on adaptive navigation and related equipment
WO2023235735A3 (en) * 2022-05-31 2024-02-22 The Curators Of The University Of Missouri Method and system for detecting sow estrus utilizing machine vision

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104396865A (en) * 2014-10-29 2015-03-11 中国农业大学 Sow oestrus remote automatic monitoring system and method
CN106070001A (en) * 2016-06-04 2016-11-09 永州市天顺畜牧科技有限责任公司 The cultural method of breeding sow
CN207600521U (en) * 2017-12-28 2018-07-10 重庆派森百橙汁有限公司 A kind of oestrus of sow automatic monitoring system
CN210148104U (en) * 2019-04-18 2020-03-17 上海创宏生物科技有限公司 Boar robot capable of improving emotion finding precision
KR20200117610A (en) * 2019-04-05 2020-10-14 권은희 Electronic Sow Management Apparatus
CN212071964U (en) * 2020-03-27 2020-12-04 牧原食品股份有限公司 Ground inspection robot
CN112640809A (en) * 2020-12-18 2021-04-13 中国农业大学 Sow oestrus detection method and device
CN216874489U (en) * 2022-01-10 2022-07-05 厦门智瞳科技有限公司 Sow estrus-inducing robot

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104396865A (en) * 2014-10-29 2015-03-11 中国农业大学 Sow oestrus remote automatic monitoring system and method
CN106070001A (en) * 2016-06-04 2016-11-09 永州市天顺畜牧科技有限责任公司 The cultural method of breeding sow
CN207600521U (en) * 2017-12-28 2018-07-10 重庆派森百橙汁有限公司 A kind of oestrus of sow automatic monitoring system
KR20200117610A (en) * 2019-04-05 2020-10-14 권은희 Electronic Sow Management Apparatus
CN210148104U (en) * 2019-04-18 2020-03-17 上海创宏生物科技有限公司 Boar robot capable of improving emotion finding precision
CN212071964U (en) * 2020-03-27 2020-12-04 牧原食品股份有限公司 Ground inspection robot
CN112640809A (en) * 2020-12-18 2021-04-13 中国农业大学 Sow oestrus detection method and device
CN216874489U (en) * 2022-01-10 2022-07-05 厦门智瞳科技有限公司 Sow estrus-inducing robot

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
王晨阳 (Wang Chenyang) et al., "智能感知技术在猪饲养管理中的应用研究进展" [Research progress on the application of intelligent perception technology in pig feeding management], 养猪 (Swine Production), no. 6, pages 82-88 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023235735A3 (en) * 2022-05-31 2024-02-22 The Curators Of The University Of Missouri Method and system for detecting sow estrus utilizing machine vision
CN115119766A (en) * 2022-06-16 2022-09-30 天津农学院 Sow oestrus detection method based on deep learning and infrared thermal imaging
CN115119766B (en) * 2022-06-16 2023-08-18 天津农学院 Sow oestrus detection method based on deep learning and infrared thermal imaging
CN115943908A (en) * 2022-12-05 2023-04-11 中国农业科学院北京畜牧兽医研究所 Sow oestrus detection method based on adaptive navigation and related equipment

Similar Documents

Publication Publication Date Title
CN114403044A (en) Sow oestrus searching method and oestrus searching robot
Norton Receptive-field properties of superior colliculus cells and development of visual behavior in kittens.
CN216874489U (en) Sow estrus-inducing robot
CN112469269A (en) Method for autonomously training animals to respond to oral commands
CN103390193B (en) A kind of automatic trainer of rat robot towards navigation and rat behavior recognition methods and training method
CN212071964U (en) Ground inspection robot
US7073458B2 (en) System and method for milking animals
CN114037552B (en) Method and system for polling physiological growth information of meat ducks
CN106719066A (en) A kind of bionics ewe is looked into feelings/lure feelings device and looks into feelings/lure feelings method
CN109326180A (en) Intelligent road test system and method
WO2019009857A2 (en) Method of detecting estrous period of cows by a drone
Xu et al. Detecting sow vulva size change around estrus using machine vision technology
CN210247950U (en) Intelligent detection device and intelligent detection system
CN118216493A (en) Weed removing device and method based on autonomous weed identification
CN114220478A (en) Full-digital intelligent acquisition system for animal signs in pasture
CN113869130A (en) Binocular vision intelligent detection method and system
CN206629733U (en) A kind of bionics ewe looks into feelings/lure feelings device
JP2003535604A (en) Automatic horse training system
CN117029904A (en) Intelligent cage-rearing poultry inspection system
US20230057738A1 (en) Detecting estrus in animals for insemination
Wu et al. Research on cow rumination monitoring based on new activity sensor
CN115226650B (en) Sow oestrus state automatic detection system based on interaction characteristics
CN209017618U (en) One boar toy-type automatic dung cleaning system
CN114051951A (en) Pet caring method based on pet identification and pet caring robot
CN108765577B (en) Real-time point cloud data-driven four-limb livestock animal skeleton augmented reality tracking method

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20220429