CN114550059A - Method, device and equipment for identifying health condition of chicken and storage medium - Google Patents
- Publication number
- CN114550059A (application CN202210177082.1A)
- Authority
- CN
- China
- Prior art keywords
- chicken
- target
- characteristic
- region
- picture
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J5/00—Radiation pyrometry, e.g. infrared or optical thermometry
- G01J5/0022—Radiation pyrometry, e.g. infrared or optical thermometry for sensing the radiation of moving bodies
- G01J5/0025—Living bodies
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/048—Activation functions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20084—Artificial neural networks [ANN]
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A40/00—Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
- Y02A40/70—Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in livestock or poultry
Abstract
The invention provides a method, a device, equipment and a storage medium for identifying the health condition of chickens, wherein the method comprises the following steps: acquiring a motion video, collected by a camera, that contains a target chicken; inputting the motion video into a trained target tracking network to obtain a picture of the region where the target chicken is located and the liveness characteristic of the target chicken; inputting the picture of the region where the target chicken is located into a trained chicken head detection network to obtain the cockscomb color characteristic of the target chicken and the depth characteristic of the region where the chicken head is located; acquiring the chicken head temperature of the region where the target chicken is located, through an infrared thermometer matched with the camera, as the temperature characteristic of the chicken head; and inputting the liveness characteristic, the cockscomb color characteristic, the depth characteristic of the region where the chicken head is located and the temperature characteristic of the chicken head of the target chicken into a trained classification network to obtain the health condition of the target chicken. The invention can improve the detection accuracy for sick chickens.
Description
Technical Field
The invention relates to the technical field of computer identification, in particular to a method, a device, equipment and a storage medium for identifying health conditions of chickens.
Background
With the continued growth of the breeding industry and the increasing stocking density of livestock and poultry, the number of chickens raised in a single cage has also risen rapidly. This places high demands on disease prevention and control: once livestock or poultry fall ill or enter a sub-health state and are not found and treated in time, serious safety hazards and economic losses can result.
Taking sick chickens as an example, disease spreads through the flock: if sick chickens in a cage cannot be identified and detected in time, the disease can spread rapidly through the whole caged flock, causing large-scale illness, serious economic losses for breeders, and hidden public health risks such as food-safety problems.
At present, however, livestock farms are mainly inspected by manual visual patrol, which increases the workload of inspection personnel, and sick chickens may not be found in time because of personnel negligence or an excessive number of animals. Although intelligent identification methods based on machine vision exist, their identification accuracy is low and cannot meet the needs of livestock breeding. Therefore, a more accurate method for identifying sick chickens is needed.
Disclosure of Invention
The embodiment of the invention provides a method, a device, equipment and a storage medium for identifying health conditions of chickens, and aims to solve the problem of low identification accuracy of sick chickens at present.
In a first aspect, an embodiment of the present invention provides a method for identifying a health condition of a chicken, including:
acquiring a motion video which is acquired by a camera and contains a target chicken;
inputting the motion video into a trained target tracking network to obtain a picture of an area where the target chicken is located and liveness characteristics of the target chicken;
inputting the picture of the region where the target chicken is located into a trained chicken head detection network to obtain the cockscomb color characteristic of the target chicken and the depth characteristic of the region where the chicken head is located;
acquiring the chicken head temperature of the region where the target chicken is located through an infrared thermometer matched with the camera as the temperature characteristic of the chicken head;
inputting the liveness characteristic, the cockscomb color characteristic, the depth characteristic of the region where the chicken head is located and the temperature characteristic of the chicken head of the target chicken into the trained classification network to obtain the health condition of the target chicken.
In a possible implementation manner, inputting a picture of an area where a target chicken is located into a trained chicken head detection network to obtain a cockscomb color feature of the target chicken and a depth feature of the area where the chicken head is located, including:
inputting the picture of the region where the target chicken is located into a trained chicken head detection network to obtain the color score of the cockscomb of the target chicken, and taking the color score of the cockscomb of the target chicken as the color feature of the cockscomb of the target chicken;
and acquiring the depth feature of the picture of the region where the target chicken is located through the backbone (stem) feature extraction network of the chicken head detection network, and performing dimensionality reduction on this depth feature to obtain the depth feature of the region where the head of the target chicken is located.
In one possible implementation, the color score of the cockscomb of the target chicken is:
score=color_probability×confidence;
wherein color_probability is the probability of the cockscomb color of the target chicken, with a dimensionality of 4; confidence is the confidence that an object exists in the region where the cockscomb of the target chicken is located, with a dimensionality of 1.
In one possible implementation manner, the method further includes:
acquiring a first training sample, wherein the first training sample comprises a motion video sample containing chickens, each picture containing chickens in the motion video sample corresponds to first labeling information, and the first labeling information indicates the area of the chickens on the picture;
and training the pre-constructed target tracking network based on the first training sample and the corresponding first marking information thereof to obtain the trained target tracking network.
In one possible implementation manner, the method further includes:
acquiring a second training sample, and inputting the second training sample into a trained target tracking network to obtain a picture sample, wherein the second training sample comprises a motion video sample containing chickens, and the picture sample is a picture of a region where the chickens are cut out from the second training sample;
obtaining second labeling information corresponding to the picture sample through labeling, wherein the second labeling information indicates the color of the cockscomb in the picture sample;
and training the pre-constructed chicken head detection network based on the second training sample and the corresponding second label information thereof to obtain the trained chicken head detection network.
In one possible implementation manner, the liveness of the target chicken is characterized by the displacement variation of the central point of the target chicken within a fixed time.
In a possible implementation, the temperature characteristic t_i of the chicken head of the target chicken is:
t_i = (T_i - T_min) / (T_max - T_min);
wherein T_i is the chicken head temperature of the i-th chicken measured with an infrared thermometer, T_max is the maximum value of the collected chicken head temperatures, and T_min is the minimum value of the collected chicken head temperatures.
In a second aspect, an embodiment of the present invention provides an apparatus for identifying health status of a chicken, including:
the acquisition module is used for acquiring a motion video which is acquired by the camera and contains a target chicken;
the first characteristic obtaining module is used for inputting the motion video into a trained target tracking network to obtain the picture of the area where the target chicken is located and the liveness characteristic of the target chicken;
the second third characteristic module is used for inputting the picture of the region where the target chicken is located into the trained chicken head detection network to obtain the cockscomb color characteristic of the target chicken and the depth characteristic of the region where the chicken head is located;
the fourth characteristic module is used for acquiring the chicken head temperature of the region where the target chicken is located through an infrared thermometer matched with the camera and used as the temperature characteristic of the chicken head;
and the health condition determining module is used for inputting the liveness characteristic, the cockscomb color characteristic, the depth characteristic of the region where the chicken head is located and the temperature characteristic of the chicken head of the target chicken into the trained classification network to obtain the health condition of the target chicken.
In a possible implementation manner, the second and third feature obtaining module is specifically configured to:
inputting the picture of the region where the target chicken is located into a trained chicken head detection network to obtain the color score of the cockscomb of the target chicken, and taking the color score of the cockscomb of the target chicken as the color feature of the cockscomb of the target chicken;
and acquiring the depth characteristic of the image of the region of the target chicken by a stem characteristic extraction network of the chicken head detection network, and performing dimensionality reduction processing on the depth characteristic to obtain the depth characteristic of the region of the target chicken head.
In one possible implementation, the color score of the cockscomb of the target chicken is:
score=color_probability×confidence;
wherein color_probability is the probability of the cockscomb color of the target chicken, with a dimensionality of 4; confidence is the confidence that an object exists in the region where the cockscomb of the target chicken is located, with a dimensionality of 1.
In a possible implementation manner, the first feature obtaining module is further configured to:
acquiring a first training sample, wherein the first training sample comprises a motion video sample containing chickens, each picture containing chickens in the motion video sample corresponds to first labeling information, and the first labeling information indicates the area of the chickens on the picture;
and training the pre-constructed target tracking network based on the first training sample and the corresponding first marking information thereof to obtain the trained target tracking network.
In a possible implementation manner, the second and third feature obtaining module is further configured to:
acquiring a second training sample, and inputting the second training sample into a trained target tracking network to obtain a picture sample, wherein the second training sample comprises a motion video sample containing chickens, and the picture sample is a picture of a region where the chickens are cut out from the second training sample;
obtaining second labeling information corresponding to the picture sample through labeling, wherein the second labeling information indicates the color of the cockscomb in the picture sample;
and training the pre-constructed chicken head detection network based on the second training sample and the corresponding second label information thereof to obtain the trained chicken head detection network.
In one possible implementation manner, the liveness of the target chicken is characterized by the displacement variation of the central point of the target chicken within a fixed time.
In a possible implementation, the temperature characteristic t_i of the chicken head of the target chicken is:
t_i = (T_i - T_min) / (T_max - T_min);
wherein T_i is the chicken head temperature of the i-th chicken measured with an infrared thermometer, T_max is the maximum value of the collected chicken head temperatures, and T_min is the minimum value of the collected chicken head temperatures.
In a third aspect, an embodiment of the present invention provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor executes the computer program to implement the steps of the method according to the first aspect or any one of the possible implementation manners of the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, where a computer program is stored, and the computer program, when executed by a processor, implements the steps of the method according to the first aspect or any one of the possible implementation manners of the first aspect.
The embodiments of the invention provide a method, a device, equipment and a storage medium for identifying the health condition of chickens. First, a motion video containing the target chicken is acquired by a camera and input into a trained target tracking network to obtain a picture of the region where the target chicken is located and the liveness characteristic of the target chicken. Then, the picture of the region where the target chicken is located is input into the trained chicken head detection network to obtain the cockscomb color characteristic of the target chicken and the depth characteristic of the region where the chicken head is located. Next, the chicken head temperature of the region where the target chicken is located is acquired through an infrared thermometer matched with the camera as the temperature characteristic of the chicken head. Finally, the liveness characteristic, the cockscomb color characteristic, the depth characteristic of the region where the chicken head is located and the temperature characteristic of the chicken head of the target chicken are input into the trained classification network to obtain the health condition of the target chicken. In this way, the health condition of chickens can be identified quickly and accurately from the collected motion video.
Drawings
In order to illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and other drawings can be derived from them by those skilled in the art without inventive effort.
FIG. 1 is a flowchart of an implementation of a method for identifying health status of a chicken according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of a chicken health status identification device provided in an embodiment of the present invention;
fig. 3 is a schematic diagram of an electronic device provided in an embodiment of the present invention.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
In order to make the objects, technical solutions and advantages of the present invention more apparent, the following description is made by way of specific embodiments with reference to the accompanying drawings.
As described in the background art, sick chickens have a flock effect: if they cannot be identified and detected in time, disease can spread rapidly through the flock, causing large-scale illness that not only leads to serious economic losses but also raises public health problems such as food safety. However, current caged-chicken farms are inspected mainly by manual patrol, which has a low recognition rate for sick chickens and consumes a large amount of manpower.
Therefore, replacing manual monitoring with deep-learning-based computer vision, so as to identify sick chickens in real time and monitor flock conditions in chicken farms, is of great significance for relieving the shortage of chicken-farm labor, strengthening epidemic control, improving the automation level of the breeding industry, and realizing intelligent breeding.
The comb of a healthy chicken is ruddy in color and uniform in texture, and its eyes are round and bright; the comb of a sick chicken shows abnormal color and a dry, uneven surface, and its eyes are half-closed or fully closed. Research on sick-chicken identification based on traditional machine vision has mainly used a support vector machine on comb key points and their movement to judge chickens that died of illness, or has identified sick chickens from changes in comb image color, changes in eye state, extracted features, comb counts, and chicken behavior. However, the recognition accuracy of these methods is low and cannot meet the needs of livestock breeding, so a more accurate method for identifying sick chickens is needed.
In order to solve the problems in the prior art, embodiments of the present invention provide a method, an apparatus, a device, and a storage medium for identifying a health status of a chicken. The following first describes a method for identifying the health condition of a chicken provided by an embodiment of the present invention.
The method for identifying the health condition of a chicken may be executed by an apparatus for identifying the health condition of a chicken, which may be an electronic device having a processor and a memory, such as a mobile or non-mobile electronic device; this is not specifically limited in the embodiments of the present invention.
Referring to fig. 1, it shows a flowchart of an implementation of the method for identifying a health condition of a chicken provided by an embodiment of the present invention, which is detailed as follows:
and step S110, acquiring a motion video which is acquired by a camera and contains the target chicken.
The target chicken may be one chicken, multiple chickens, a chicken with a special requirement, or all chickens in a certain area, and the user may select the target chicken according to the actual situation, which is not limited herein.
The motion video may be a motion video of multiple time periods of multiple monitoring points, or a motion video of multiple time periods of a certain monitoring point, which is not limited herein.
And S120, inputting the motion video into the trained target tracking network to obtain the picture of the region where the target chicken is located and the liveness characteristics of the target chicken.
The liveness of the target chicken is characterized by the displacement variation of the central point of the target chicken in a fixed time.
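As an illustrative sketch only (the function name and the use of per-frame center points are assumptions, not specified by the patent), the liveness feature described above can be computed by accumulating the displacement of the tracked center point across the frames of a fixed time window:

```python
# Hypothetical sketch: liveness as the accumulated displacement of the
# target chicken's center point over a fixed time window.

def liveness(centers):
    """centers: list of (x, y) center points of the target chicken,
    one per frame, as produced by the target tracking network."""
    total = 0.0
    for (x0, y0), (x1, y1) in zip(centers, centers[1:]):
        # Euclidean displacement between consecutive frames
        total += ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    return total
```

A nearly motionless chicken yields a value close to zero, which is one of the cues later combined with the comb and temperature features.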
In some embodiments, the training process of the target tracking network may be:
step 1210, obtaining a first training sample.
The first training sample comprises a motion video sample containing chickens, each picture containing the chickens in the motion video sample corresponds to first marking information, and the first marking information indicates the areas of the chickens on the picture.
Step S1220, training a pre-constructed target tracking network based on the first training sample and the corresponding first label information, so as to obtain a trained target tracking network.
And inputting the first training sample and the first marking information corresponding to the first training sample into a pre-constructed target tracking network, and training the target tracking network to obtain the trained target tracking network.
To test the trained target tracking network, a motion video containing chickens is selected as a test sample and input into the trained target tracking network, so as to evaluate its detection performance.
Specifically, a rectangular frame labeling method can be adopted to label the region where the chicken is located on each picture containing the chicken. And then inputting the first training sample and the label corresponding to the first training sample into a target tracking network for training.
And after the target tracking network training is finished, the detection can be carried out. Inputting the motion video into a trained target tracking network to obtain a picture of the region where the target chicken is located and displacement variation of the central point of the target chicken in a fixed time, and taking the displacement variation as liveness characteristics of the target chicken.
And S130, inputting the picture of the region where the target chicken is located into the trained chicken head detection network to obtain the cockscomb color characteristic of the target chicken and the depth characteristic of the region where the chicken head is located.
In some embodiments, the image of the area where the target chicken is located is input to the trained chicken head detection network, so as to obtain the color score of the comb of the target chicken, and the color score of the comb of the target chicken is used as the comb color feature of the target chicken.
Specifically, the color score of the cockscomb of the target chicken is:
score=color_probability×confidence;
wherein color_probability is the probability of the cockscomb color of the target chicken, with a dimensionality of 4; confidence is the confidence that an object exists in the region where the cockscomb of the target chicken is located, with a dimensionality of 1.
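A minimal sketch of this score computation (the function and argument names are illustrative; the 4-dimensional probability vector and scalar confidence follow the description above):

```python
# Hypothetical sketch: comb color score = per-class color probability
# scaled by the objectness confidence of the comb region.

def comb_color_score(color_probability, confidence):
    """color_probability: 4 values, one per comb color class
    (bright red, white, dark purple, yellow); confidence: scalar
    confidence that an object exists in the comb region."""
    assert len(color_probability) == 4
    return [p * confidence for p in color_probability]
```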
In some embodiments, the training process of the chicken head detection network may be:
step S1310, obtaining a second training sample, and inputting the second training sample into the trained target tracking network to obtain a picture sample.
The second training sample may be the same as or different from the first training sample, and is not limited herein. The second training sample comprises a motion video sample containing the chicken, and the picture sample is a picture of the area where the chicken is cut out from the second training sample.
Step S1320, obtaining second labeling information corresponding to the picture sample through labeling. And the second marking information indicates the color of the cockscomb in the picture sample. The color of the rooster comb may include bright red, white, dark purple and yellow.
Step S1330, training the pre-constructed chicken head detection network based on the second training sample and the second labeled information corresponding thereto to obtain a trained chicken head detection network.
And inputting the second training sample and second marking information corresponding to the second training sample into a pre-constructed chicken head detection network for training, so as to obtain the trained chicken head detection network.
The depth features of the picture of the region where the target chicken is located can be obtained through the backbone (stem) feature extraction network of the trained chicken head detection network, and a dimensionality reduction method is applied to these depth features to obtain the depth feature of the region where the head of the target chicken is located.
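The dimensionality-reduction step can be sketched as follows; the patent does not fix a particular method, so principal component analysis via SVD is an assumed choice here, and the function name is illustrative:

```python
# Hypothetical sketch: reduce high-dimensional backbone feature vectors
# to k dimensions with PCA (an assumed choice of reduction method).
import numpy as np

def reduce_depth_features(features, k):
    """features: (n_samples, d) array of backbone feature vectors;
    k: target dimensionality. Returns an (n_samples, k) projection."""
    x = features - features.mean(axis=0)       # center the data
    # SVD of the centered data yields the principal directions in vt
    _, _, vt = np.linalg.svd(x, full_matrices=False)
    return x @ vt[:k].T                        # project onto the top k
```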
And S140, acquiring, through an infrared thermometer matched with the camera, the chicken head temperature of the region where the target chicken is located as the temperature feature of the chicken head.
After the chicken head temperature is measured with the infrared thermometer matched with the camera, the acquired temperature is normalized and used as the temperature feature of the chicken head.
Wherein the temperature characteristic t_i of the chicken head of the target chicken is:

t_i = (T_i - T_min) / (T_max - T_min);

wherein T_i is the chicken head temperature of the ith chicken measured by the infrared thermometer, T_max is the maximum value of the collected chicken head temperatures, and T_min is the minimum value of the collected chicken head temperatures.
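The normalization above is ordinary min-max scaling. A minimal sketch (the helper name is illustrative, not from the patent):

```python
def normalize_temperature(temps):
    """Min-max normalize collected chicken head temperatures:
    t_i = (T_i - T_min) / (T_max - T_min), mapping each reading into [0, 1]."""
    t_min, t_max = min(temps), max(temps)
    if t_max == t_min:  # all readings equal: avoid division by zero
        return [0.0 for _ in temps]
    return [(t - t_min) / (t_max - t_min) for t in temps]

# Head temperatures (deg C) of three chickens; the middle one maps to roughly 0.5.
features = normalize_temperature([41.2, 41.5, 41.8])
```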
And S150, inputting the liveness characteristic, the cockscomb color characteristic, the depth characteristic of the region where the chicken head is located and the temperature characteristic of the chicken head of the target chicken into a trained classification network to obtain the health condition of the target chicken.
Firstly, the liveness feature, the cockscomb color feature, the depth feature of the region where the chicken head is located and the temperature feature of the chicken head of the target chicken are fused into a sick chicken identification feature, and then the sick chicken identification feature is input into the trained classification network to identify the health condition of the target chicken.
The chicken health condition identification method provided by the invention comprises the steps of firstly, acquiring a motion video which is acquired by a camera and contains a target chicken, then, inputting the motion video into a trained target tracking network, and obtaining a picture of an area where the target chicken is located and liveness characteristics of the target chicken. And then, inputting the picture of the region of the target chicken into the trained chicken head detection network to obtain the cockscomb color characteristic of the target chicken and the depth characteristic of the region of the chicken head. And then, acquiring the chicken head temperature of the region of the target chicken as the temperature characteristic of the chicken head through an infrared thermometer matched with the camera. And finally, inputting the liveness characteristic, the cockscomb color characteristic, the depth characteristic of the region where the chicken head is located and the temperature characteristic of the chicken head of the target chicken into the trained classification network to obtain the health condition of the target chicken. Therefore, the health condition of the chicken can be quickly and accurately identified by identifying the collected motion video containing the chicken.
The following describes in detail the identification method of the health status of chicken by a specific example:
before identification, a target tracking network, a chicken head detection network and a classification network need to be trained. The specific training process is as follows:
the method comprises the steps of collecting 10 sections of monitoring videos of chicken movement under different monitoring points in a chicken farm by utilizing a camera, wherein each section of video is 5min, wherein 7 sections of the monitoring videos are divided into first training samples, and the rest 3 sections of the monitoring videos are divided into second training samples.
To train the target tracking network, the region where each chicken is located in every picture containing chickens in the first training sample is marked with a rectangular frame, and the first training sample together with the corresponding rectangular-frame labels is input into a pre-constructed DeepSORT target tracking network for training.
To train the chicken head detection network, the second training sample is first input into the trained target tracking network to obtain picture samples, where a picture sample is a picture of the region where a chicken is located, cropped from the second training sample. The cockscomb color in each picture sample is then marked with labeling software to obtain the second labeling information; the cockscomb colors mainly comprise four types: bright red, white, dark purple and yellow. The pre-constructed YOLOv3 chicken head detection network is then trained on these picture samples and labels.
The classification network comprises three fully connected layers: the first has 1000 neurons, the second 500 and the third 100. An output layer with a single neuron is added after the third fully connected layer, followed by a sigmoid activation function. The fused feature formed from the liveness feature, the cockscomb color feature, the depth feature of the region where the chicken head is located and the temperature feature of the chicken head, output by the steps above, is input into the classification network for training.
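A NumPy sketch of that classifier's forward pass (the hidden-layer activation is not specified in the text, so ReLU is assumed here; the weights are random placeholders standing in for trained parameters):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def classifier_forward(x, rng=None):
    """56 -> 1000 -> 500 -> 100 -> 1 fully connected classifier with a
    sigmoid on the single output neuron, as described above."""
    if rng is None:
        rng = np.random.default_rng(0)
    h = np.asarray(x, dtype=float)
    for n_in, n_out in [(56, 1000), (1000, 500), (500, 100)]:
        w = rng.normal(0.0, 1.0 / np.sqrt(n_in), size=(n_in, n_out))
        h = np.maximum(h @ w, 0.0)          # fully connected layer + ReLU
    w_out = rng.normal(0.0, 0.1, size=(100, 1))
    return float(sigmoid(h @ w_out)[0])     # health probability in (0, 1)

p = classifier_forward(np.zeros(56))        # zero input -> sigmoid(0) = 0.5
```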
After the three networks are trained, the health condition of the chicken can be identified; the second training sample is selected as the test set for identification.
First, the second training sample is input into the trained DeepSORT target tracking network to obtain the position of the central point of the target chicken and the picture of the region where it is located; the relative displacement of the target chicken within 60 s is then calculated as the liveness feature F_vel of the target chicken.
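The patent does not give a formula for the "relative displacement", so one reasonable reading, the summed frame-to-frame movement of the tracked center point over the window, is sketched here (the function name is illustrative):

```python
import math

def liveness(centers):
    """Sum of Euclidean displacements between consecutive tracked
    center points (x, y) of one chicken over a fixed time window."""
    return sum(math.dist(a, b) for a, b in zip(centers, centers[1:]))

# A chicken that walks 3 px right then 4 px up has moved 7 px in total.
f_vel = liveness([(0, 0), (3, 0), (3, 4)])
```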
Then, the picture of the region where the target chicken is located is input into the trained YOLOv3 chicken head detection network for testing, yielding the 4-dimensional cockscomb color feature F_col of the target chicken. The color score of the cockscomb of the target chicken is:
score=color_probability×confidence;
wherein color_probability is the probability of each cockscomb color of the target chicken, with dimensionality 4; confidence is the confidence of whether an object exists in the region where the cockscomb of the target chicken is located, with dimensionality 1.
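Numerically, the score is the 4-dimensional class-probability vector scaled by the scalar objectness confidence; a small NumPy illustration with made-up detector outputs (the class order is assumed to match the four colors listed earlier):

```python
import numpy as np

# Hypothetical YOLO-style outputs for one detected cockscomb box.
color_probability = np.array([0.7, 0.1, 0.15, 0.05])  # bright red, white, dark purple, yellow
confidence = 0.9                                      # objectness of the box

score = color_probability * confidence                # 4-dimensional color feature

colors = ["bright red", "white", "dark purple", "yellow"]
predicted = colors[int(np.argmax(score))]             # highest-scoring color
```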
Meanwhile, the picture of the region where the target chicken is located is input into the trained YOLOv3 chicken head detection network, and the depth feature of the region where the chicken head is located is extracted through the backbone feature extraction network darknet53 of YOLOv3. The input image size of darknet53 is 416 × 416 × 3 and the final output feature map is 13 × 13 × 1024; the 1 × 1 × 1024-dimensional feature of the grid cell where the chicken head is located is retained, and PCA (principal component analysis) is used to reduce the 1024-dimensional depth feature to 50 dimensions, finally giving the depth feature F_deep of the region where the chicken head is located.
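The 1024-to-50 reduction can be done with PCA fitted on a batch of chicken-head depth features; an SVD-based NumPy sketch (scikit-learn's PCA would serve equally; the data here are random placeholders):

```python
import numpy as np

def pca_reduce(features, n_components=50):
    """Project row vectors onto their top principal components.
    features: (n_samples, 1024) backbone depth features.
    Returns an (n_samples, n_components) array."""
    x = features - features.mean(axis=0)           # center the data
    _, _, vt = np.linalg.svd(x, full_matrices=False)
    return x @ vt[:n_components].T                 # project onto top axes

rng = np.random.default_rng(0)
batch = rng.normal(size=(200, 1024))               # placeholder F_deep batch
reduced = pca_reduce(batch)                        # shape (200, 50)
```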
Next, the chicken head temperature of the region where the target chicken is located is acquired by the infrared thermometer matched with the camera, and the acquired chicken head temperature is normalized as the temperature feature F_temp of the chicken head.
The normalized temperature feature t_i of the chicken head of the target chicken is:

t_i = (T_i - T_min) / (T_max - T_min);

wherein T_i is the chicken head temperature of the ith chicken measured by the infrared thermometer, T_max is the maximum value of the collected chicken head temperatures, and T_min is the minimum value of the collected chicken head temperatures.
Finally, the 1-dimensional liveness feature F_vel of the target chicken obtained above, the 4-dimensional cockscomb color feature F_col, the 1-dimensional chicken head temperature feature F_temp and the 50-dimensional depth feature F_deep of the region where the chicken head is located are fused into the sick chicken identification feature F_final; the 56-dimensional F_final is input into the trained classification network to obtain the health condition of the target chicken.
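The fusion itself is plain concatenation, and the dimensions check out: 1 + 4 + 1 + 50 = 56. A sketch with placeholder feature values:

```python
import numpy as np

f_vel = np.array([0.8])                        # 1-d liveness feature
f_col = np.array([0.63, 0.09, 0.135, 0.045])   # 4-d cockscomb color feature
f_temp = np.array([0.5])                       # 1-d normalized head temperature
f_deep = np.zeros(50)                          # 50-d depth feature (placeholder)

# F_final: 56-d sick chicken identification feature fed to the classifier
f_final = np.concatenate([f_vel, f_col, f_temp, f_deep])
```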
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an execution order; the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present invention.
Based on the chicken health status identification method provided by the above embodiment, the invention correspondingly provides a specific embodiment of a chicken health status identification device applying the method. See the following embodiments.
As shown in fig. 2, an apparatus 200 for identifying health conditions of chickens is provided, which includes:
the acquisition module 210 is configured to acquire a motion video including a target chicken, which is acquired by a camera;
the acquiring first characteristic module 220 is used for inputting the motion video into a trained target tracking network to obtain the picture of the region where the target chicken is located and the liveness characteristic of the target chicken;
the acquiring second and third feature module 230 is configured to input the picture of the region where the target chicken is located into the trained chicken head detection network, so as to obtain the cockscomb color feature of the target chicken and the depth feature of the region where the chicken head is located;
the acquiring fourth characteristic module 240 is configured to acquire, through an infrared thermometer matched with the camera, a chicken head temperature of an area where the target chicken is located, as a temperature characteristic of the chicken head;
and the health condition determining module 250 is used for inputting the liveness characteristic, the cockscomb color characteristic, the depth characteristic of the region where the chicken head is located and the temperature characteristic of the chicken head of the target chicken into the trained classification network to obtain the health condition of the target chicken.
In a possible implementation manner, the second and third feature acquisition module 230 is specifically configured to:
inputting the picture of the region where the target chicken is located into a trained chicken head detection network to obtain the color score of the cockscomb of the target chicken, and taking the color score of the cockscomb of the target chicken as the color feature of the cockscomb of the target chicken;
and acquiring the depth feature of the picture of the region where the target chicken is located through the backbone feature extraction network of the chicken head detection network, and performing dimensionality reduction processing on the depth feature to obtain the depth feature of the region where the chicken head of the target chicken is located.
In one possible implementation, the color score of the cockscomb of the target chicken is:
score=color_probability×confidence;
wherein color_probability is the probability of each cockscomb color of the target chicken, with dimensionality 4; confidence is the confidence of whether an object exists in the region where the cockscomb of the target chicken is located, with dimensionality 1.
In one possible implementation, the first feature acquisition module 220 is further configured to:
acquiring a first training sample, wherein the first training sample comprises a motion video sample containing chickens, each picture containing chickens in the motion video sample corresponds to first labeling information, and the first labeling information indicates the area of the chickens on the picture;
and training the pre-constructed target tracking network based on the first training sample and the corresponding first marking information thereof to obtain the trained target tracking network.
In a possible implementation manner, the second and third feature acquisition module 230 is further configured to:
acquiring a second training sample, and inputting the second training sample into a trained target tracking network to obtain a picture sample, wherein the second training sample comprises a motion video sample containing chickens, and the picture sample is a picture of a region where the chickens are cut out from the second training sample;
obtaining second labeling information corresponding to the picture sample through labeling, wherein the second labeling information indicates the color of the cockscomb in the picture sample;
and training the pre-constructed chicken head detection network based on the second training sample and the corresponding second label information thereof to obtain the trained chicken head detection network.
In one possible implementation manner, the liveness of the target chicken is characterized by the displacement variation of the central point of the target chicken within a fixed time.
In a possible implementation, the temperature feature t_i of the chicken head of the target chicken is:

t_i = (T_i - T_min) / (T_max - T_min);

wherein T_i is the chicken head temperature of the ith chicken measured by the infrared thermometer, T_max is the maximum value of the collected chicken head temperatures, and T_min is the minimum value of the collected chicken head temperatures.
Fig. 3 is a schematic diagram of an electronic device provided in an embodiment of the present invention. As shown in fig. 3, the electronic device 3 of this embodiment includes: a processor 30, a memory 31 and a computer program 32 stored in the memory 31 and executable on the processor 30. The processor 30, when executing the computer program 32, implements the steps in each of the above embodiments of the chicken health condition identification method, such as steps 110 to 150 shown in fig. 1. Alternatively, the processor 30, when executing the computer program 32, implements the functions of the modules in the above device embodiments, such as the functions of modules 210 to 250 shown in fig. 2.
Illustratively, the computer program 32 may be partitioned into one or more modules that are stored in the memory 31 and executed by the processor 30 to implement the present invention. The one or more modules may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 32 in the electronic device 3. For example, the computer program 32 may be divided into the modules 210 to 250 shown in fig. 2.
The electronic device 3 may include, but is not limited to, a processor 30, a memory 31. It will be appreciated by those skilled in the art that fig. 3 is merely an example of the electronic device 3, and does not constitute a limitation of the electronic device 3, and may include more or less components than those shown, or combine certain components, or different components, for example, the electronic device may also include input output devices, network access devices, buses, etc.
The Processor 30 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware component, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 31 may be an internal storage unit of the electronic device 3, such as a hard disk or a memory of the electronic device 3. The memory 31 may also be an external storage device of the electronic device 3, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the electronic device 3. Further, the memory 31 may also include both an internal storage unit and an external storage device of the electronic device 3. The memory 31 is used for storing the computer program and other programs and data required by the electronic device. The memory 31 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/electronic device and method may be implemented in other ways. For example, the above-described apparatus/electronic device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may also be implemented in the form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the processes in the methods of the above embodiments may be implemented by a computer program, which may be stored in a computer-readable storage medium; when the computer program is executed by a processor, the steps of each of the above embodiments of the chicken health status identification method can be implemented. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in each jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals in accordance with legislation and patent practice.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present invention, and are intended to be included within the scope of the present invention.
Claims (10)
1. A method for identifying a health condition of a chicken, comprising:
acquiring a motion video which is acquired by a camera and contains a target chicken;
inputting the motion video into a trained target tracking network to obtain a picture of the region where the target chicken is located and liveness characteristics of the target chicken;
inputting the picture of the region where the target chicken is located into a trained chicken head detection network to obtain the cockscomb color characteristic of the target chicken and the depth characteristic of the region where the chicken head is located;
acquiring the chicken head temperature of the region of the target chicken as the temperature characteristic of the chicken head through an infrared thermometer matched with the camera;
inputting the liveness characteristic, the cockscomb color characteristic, the depth characteristic of the region where the chicken head is located and the temperature characteristic of the chicken head of the target chicken into a trained classification network to obtain the health condition of the target chicken.
2. The identification method of claim 1, wherein the inputting the picture of the region where the target chicken is located into the trained chicken head detection network to obtain the cockscomb color feature and the depth feature of the region where the chicken head is located comprises:
inputting the picture of the area where the target chicken is located into the trained chicken head detection network to obtain the color score of the cockscomb of the target chicken, and taking the color score of the cockscomb of the target chicken as the color feature of the cockscomb of the target chicken;
and acquiring the depth characteristic of the picture of the region where the target chicken is located through the backbone characteristic extraction network of the chicken head detection network, and performing dimensionality reduction processing on the depth characteristic to obtain the depth characteristic of the region where the chicken head of the target chicken is located.
3. The identification method according to claim 2, wherein the color score of the cockscomb of the target chicken is:
score=color_probability×confidence;
wherein color_probability is the probability of each cockscomb color of the target chicken, with dimensionality 4; confidence is the confidence of whether an object exists in the region where the cockscomb of the target chicken is located, with dimensionality 1.
4. The identification method according to any one of claims 1 to 3, characterized in that the method further comprises:
acquiring a first training sample, wherein the first training sample comprises a motion video sample containing chickens, each picture containing chickens in the motion video sample corresponds to first labeling information, and the first labeling information indicates the area of the chickens on the picture;
and training the pre-constructed target tracking network based on the first training sample and the corresponding first marking information thereof to obtain the trained target tracking network.
5. The identification method of claim 4, wherein the method further comprises:
acquiring a second training sample, and inputting the second training sample into a trained target tracking network to obtain a picture sample, wherein the second training sample comprises a motion video sample containing chickens, and the picture sample is a picture of an area where the chickens are cut out from the second training sample;
obtaining second labeling information corresponding to the picture sample through labeling, wherein the second labeling information indicates the color of the cockscomb in the picture sample;
and training the pre-constructed chicken head detection network based on the second training sample and the corresponding second label information thereof to obtain the trained chicken head detection network.
6. The identification method of claim 1, wherein the liveness of the target chicken is characterized by the displacement variation of the central point of the target chicken within a fixed time.
7. The identification method according to claim 1, wherein the temperature characteristic t_i of the chicken head of the target chicken is:

t_i = (T_i - T_min) / (T_max - T_min);

wherein T_i is the chicken head temperature of the ith chicken measured by the infrared thermometer, T_max is the maximum value of the collected chicken head temperatures, and T_min is the minimum value of the collected chicken head temperatures.
8. An apparatus for identifying a health condition of a chicken, comprising:
the acquisition module is used for acquiring a motion video which is acquired by the camera and contains a target chicken;
the first feature acquisition module is used for inputting the motion video into a trained target tracking network to obtain the picture of the region where the target chicken is located and the liveness characteristic of the target chicken;
the second and third feature acquisition module is used for inputting the picture of the region where the target chicken is located into the trained chicken head detection network to obtain the cockscomb color characteristic of the target chicken and the depth characteristic of the region where the chicken head is located;
the fourth feature acquisition module is used for acquiring, through an infrared thermometer matched with the camera, the chicken head temperature of the region where the target chicken is located as the temperature characteristic of the chicken head;
and the health condition determining module is used for inputting the liveness characteristic, the cockscomb color characteristic, the depth characteristic of the region where the chicken head is located and the temperature characteristic of the chicken head of the target chicken into a trained classification network to obtain the health condition of the target chicken.
9. An electronic device, comprising a memory for storing a computer program and a processor for invoking and running the computer program stored in the memory, performing the method of any one of claims 1 to 7.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210177082.1A CN114550059A (en) | 2022-02-24 | 2022-02-24 | Method, device and equipment for identifying health condition of chicken and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114550059A true CN114550059A (en) | 2022-05-27 |
Family
ID=81679213
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210177082.1A Pending CN114550059A (en) | 2022-02-24 | 2022-02-24 | Method, device and equipment for identifying health condition of chicken and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114550059A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110200598A (en) * | 2019-06-12 | 2019-09-06 | 天津大学 | A kind of large-scale plant that raises sign exception birds detection system and detection method |
KR20200105558A (en) * | 2019-02-28 | 2020-09-08 | 주식회사 에스티엔 | A Computer Vision for the Prediction System of Livestock Diseases and Their Methods |
CN113221864A (en) * | 2021-04-12 | 2021-08-06 | 蚌埠学院 | Method for constructing and applying diseased chicken visual recognition model with multi-region depth feature fusion |
US20220022427A1 (en) * | 2020-04-27 | 2022-01-27 | It Tech Co., Ltd. | Ai-based livestock management system and livestock management method thereof |
CN114022831A (en) * | 2021-09-16 | 2022-02-08 | 四川雪月天佑农牧科技有限公司 | Binocular vision-based livestock body condition monitoring method and system |
2022-02-24: Application CN202210177082.1A filed (CN); publication CN114550059A (en); status: Pending
Non-Patent Citations (1)
Title |
---|
LIU Yehong; LIU Xiulin; HOU Ruoyi; HUANG Yongkai; LU Huishan: "Design of a WSN-based laying-hen activity monitoring system" (in Chinese), Journal of Southern Agriculture, no. 07, 10 August 2018 (2018-08-10) *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Yang et al. | Feeding behavior recognition for group-housed pigs with the Faster R-CNN | |
Guo et al. | Multi-object extraction from topview group-housed pig images based on adaptive partitioning and multilevel thresholding segmentation | |
Mohamed et al. | Msr-yolo: Method to enhance fish detection and tracking in fish farms | |
CN102564964B (en) | Spectral image-based meat quality visual non-contact detection method | |
US11967069B2 (en) | Pathological section image processing method and apparatus, system, and storage medium | |
Santoni et al. | Cattle race classification using gray level co-occurrence matrix convolutional neural networks | |
CN107835654A (en) | Image processing apparatus, image processing method and image processing program | |
Huang et al. | Identification of group-housed pigs based on Gabor and Local Binary Pattern features | |
CN104854620A (en) | Image processing device, image processing system, and program | |
Li et al. | Group-housed pig detection in video surveillance of overhead views using multi-feature template matching | |
Gorokhovatskyi et al. | Explanation of CNN image classifiers with hiding parts | |
Kanjalkar et al. | Detection and classification of plant leaf diseases using ANN | |
Lamping et al. | ChickenNet-an end-to-end approach for plumage condition assessment of laying hens in commercial farms using computer vision | |
Chin et al. | Facial skin image classification system using Convolutional Neural Networks deep learning algorithm | |
CN111325181A (en) | State monitoring method and device, electronic equipment and storage medium | |
Pinto et al. | Image feature extraction via local binary patterns for marbling score classification in beef cattle using tree-based algorithms | |
Wang et al. | Pig face recognition model based on a cascaded network | |
McKenna et al. | Automated classification for visual-only postmortem inspection of porcine pathology | |
Pauzi et al. | A review on image processing for fish disease detection | |
CN110414369B (en) | Cow face training method and device | |
Witte et al. | Evaluation of deep learning instance segmentation models for pig precision livestock farming | |
CN114550059A (en) | Method, device and equipment for identifying health condition of chicken and storage medium | |
CN116416523A (en) | Machine learning-based rice growth stage identification system and method | |
Hitelman et al. | The effect of age on young sheep biometric identification | |
Khavalko et al. | Classification and Recognition of Medical Images Based on the SGTM Neuroparadigm. |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |