WO2017030448A1 - Procédé et appareil pour évaluer un animal (Method and apparatus for evaluating an animal) - Google Patents
Method and apparatus for evaluating an animal
- Publication number: WO2017030448A1
- Application number: PCT/NZ2016/050129
- Authority: WIPO (PCT)
- Prior art keywords: animal, shape information, model, evaluation, measurements
Classifications
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01K—ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
- A01K29/00—Other apparatus for animal husbandry
- A01K29/005—Monitoring or measuring activity, e.g. detecting heat or mating
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/75—Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
- G06V10/752—Contour matching
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/64—Three-dimensional objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
- G06T2207/30128—Food products
Definitions
- the present invention relates to a method and apparatus for automatically evaluating an animal based on its physical appearance, and in particular, but not exclusively, to a method and apparatus for determining a body condition score (BCS), particularly for a dairy cow.
- BCS body condition score
- the present application is also directed to methods for automatically evaluating a body condition score for cattle.
- One method of evaluating BCS is described in international publication WO2010/063527.
- a 3D camera is used to obtain a three dimensional image of the animal.
- the image is processed and a BCS score is determined based on a statistical analysis of a number of features of the data.
- the method described in WO2010/063527 requires a global analysis approach.
- the region of the animal which is of interest must be within the area captured by the image; the region must not be obscured; and the image must not contain relevant features from two or more different animals. It is also strongly preferable that the system evaluates the animal quickly enough to allow the animal to be drafted immediately after the image is collected.
- a method of calculating an evaluation of an animal comprising:
- the step of evaluating the animal comprises the step of calculating a category for the animal.
- the 3D model comprises a triangular mesh.
- the 3D model comprises a grid.
- the method comprises the step of smoothing and/or filtering the 3D shape information prior to the step of creating the 3D point cloud.
- the method comprises the step of smoothing and/or filtering the 3D point cloud prior to the step of forming the 3D model.
- the method comprises the step of identifying points or areas within the model which are representative of one or more anatomical features or regions.
- the step of receiving 3D shape information comprises receiving information from a plurality of frames of 3D shape information.
- the method comprises the step of pre-processing the 3D shape information before the 3D model is formed.
- the method comprises the step of normalizing the model prior to (or simultaneously with) the step of calculating the one or more representative measurements.
- the step of normalizing the model comprises the step of determining one or more geometries of the animal.
- the step of normalizing the model comprises the step of determining one or more of the animal's height, length or width.
- the step of determining the animal's height comprises the step of determining a position of a surface on which the animal is standing.
- the step of determining the animal's height comprises the step of averaging data from several frames or a comparison with a feature of known size.
- the step of determining a position of a surface on which the animal is standing comprises the step of receiving 3D shape information from the surface when the animal is not standing on the surface.
- the step of identifying data corresponding to one or more anatomical regions comprises comparing the data to one or more 3D feature descriptors.
- an orientation of the animal is calculated from the relative positions of the anatomical regions or from an analysis of the outline of the animal.
- the step of identifying data corresponding to one or more anatomical regions comprises the step of excluding results which result in one or more parameters of the 3D model falling outside predetermined limits.
- the one or more parameters comprise one or more of a predetermined angle, curvature, length or depth, or a relationship to one or more other anatomical regions.
- the step of identifying data corresponding to one or more anatomical regions comprises performing a global optimisation to minimise or maximise, within predetermined limits, a distance between preselected anatomical features.
- the step of identifying data corresponding to one or more anatomical regions comprises performing a global optimisation to minimise or maximise, within predetermined limits, a distance between preselected anatomical features, or to minimise or maximise, within predetermined limits, another geometric descriptor of the 3D model.
- the step of calculating one or more representative measurements from the 3D model comprises the step of modelling an intersection plane through the one or more anatomical regions.
- the step of modelling an intersection plane through the region of interest comprises the step of rotating or translating the intersection plane to maximise or minimise a selected parameter.
- the step of calculating one or more representative measurements comprises the step of fitting a curve to the two dimensional shape information.
- the one or more representative measurements comprise one or more of coefficients of the curve, the length between two points on the curve, an area defined by the curve, a depth associated with the curve, or another descriptor of the curve.
- the step of identifying data corresponding to one or more anatomical regions comprises the step of modelling an intersection plane through the region of interest to define two dimensional shape information corresponding to a two dimensional curve.
- the one or more representative measurements comprise one or more of the average height, maximum height or minimum height of each grid square.
- the step of forming a 3D model comprises defining a grid based on the 3D shape information.
- the step of receiving 3D shape information corresponding to an area occupied by the animal comprises the step of receiving information from a 3D imaging device.
- the 3D imaging device comprises one or more of LiDAR, structure from motion devices, stereo or multiview camera devices, depth cameras based on time-of-flight or any similar methodology, lightfield cameras, or any device which provides depth information for a scene being captured.
- the category relates to one or more of body condition, lameness, udder conformation or a trait other than production (TOP), for example height at shoulder.
- an apparatus for calculating an evaluation of an animal comprising a three dimensional (3D) imaging device for collecting 3D shape information corresponding to a space occupied by the animal and a processing device in communication with the 3D imaging device which is configured to:
- the processing device calculates a category for the animal.
- an apparatus for automatically evaluating animals comprising a structure comprising two spaced apart static barrier means, a spacing between the barrier means selected to allow an animal to pass between the barrier means, the apparatus further comprising a three dimensional (3D) imaging device for selectively collecting 3D shape information of an area of interest of the animal when the animal is positioned in a space between the static barrier means and processing means for selectively evaluating the animal based on the 3D shape information, if collected, the apparatus further comprising first moveable barrier means provided at a first end of the static barrier means for selectively preventing a second animal from entering the space between the static barrier means and/or second moveable barrier means provided at a second end of the static barrier means for selectively preventing the animal from moving out of the space between the static barrier means.
- 3D three dimensional
- the apparatus determines whether a selected animal is to be evaluated based, in part, on a space between the animal and an adjacent animal.
- the apparatus determines whether the (or a) selected animal is to be evaluated based, in part, on a speed at which the animal is moving. Preferably, in use, the apparatus determines whether the (or a) selected animal is to be evaluated based, in part, on an assessment of whether a new evaluation of the animal is required.
- the apparatus determines whether a selected animal is to be evaluated based solely on one or more of:
- the apparatus comprises the second moveable barrier means wherein, in use, the apparatus closes the second moveable barrier means when the apparatus determines that a new evaluation of the (or a) selected first animal is required, and a distance between the first animal and a second animal which is in front of the first animal, is greater than a predetermined distance.
- the apparatus comprises the first moveable barrier means, wherein, in use, the apparatus closes the first moveable barrier means when the apparatus determines that a new evaluation of the (or a) selected first animal is required, and a distance between the first animal and a second animal which is behind the first animal, is greater than a predetermined distance.
- the apparatus comprises the second moveable barrier means, wherein, in use, the apparatus closes the moveable barrier means when the apparatus determines that a new evaluation of the (or a) selected animal is required, and the selected animal is travelling at a speed which is greater than a predetermined speed.
- the assessment of whether a new evaluation of the selected animal is required is dependent, in part, on a length of time since the last evaluation of the animal.
- the assessment of whether a new evaluation of the selected animal is required is dependent, in part, on the result of a previous evaluation of the selected animal.
- the apparatus further comprises animal position sensing means.
- the animal position sensing means comprise a first sensor means for detecting when a fore part of the animal is in a first position which is indicative of the entire animal having moved between the static barrier means.
- the animal position sensing means comprise a second sensor means for detecting when a rear of the animal has moved beyond a second position which is adjacent the first moveable barrier means.
- the first sensor means comprises a photoelectric sensor.
- the second sensor means comprises a photoelectric sensor.
- the animal position sensing means comprises the 3D imaging device and the processing means.
- the apparatus comprises an electronic ID reader.
- the 3D imaging device comprises a 3D camera.
- the apparatus comprises a lighting means for artificially lighting an area which is within a field of view of the 3D imaging device.
- the intensity of the light inside the structure is adjustable.
- the apparatus sends a signal to an automatic drafting gate depending on the evaluation of the animal.
- the animal position sensing means comprises a drafting gate entry sensor.
- the animal position sensing means comprises a drafting gate exit sensor.
- the evaluation of the animal performed by the apparatus comprises a calculation of a body condition score.
- a method of automatically evaluating animals comprising the steps of:
- i. determining whether the animal is to be evaluated; ii. if the animal is selected to be evaluated, collecting 3D shape information of an area of interest of the animal when the animal is in a space between two spaced apart static barrier means, and processing the 3D shape information to evaluate the animal based on the 3D shape information.
- the method comprises the step of closing a first moveable barrier means to prevent a second animal from entering into the space between the static barrier means if a distance between the animal and a second animal which is behind the animal is greater than a predetermined distance.
- the method comprises the step of closing a second moveable barrier means to prevent the animal from moving out of the space between the static barrier means, if a speed of the animal is greater than a predetermined maximum speed.
- the method comprises the step of closing the (or a) second moveable barrier means to prevent the animal from moving out of the space between the static barrier means if a distance between the animal and a second animal which is in front of the animal is greater than a predetermined distance.
- the method comprises receiving a signal from a first animal position sensor means when a fore part of the animal is in a first position which is indicative of the entire animal having moved into the space between the static barrier means.
- the method comprises receiving a signal from a second animal position sensor means when a rear of the animal has passed a second position which is adjacent a first end of the static barrier means.
- the method comprises capturing the 3D shape information after receiving the signal from the second animal position sensing means.
- the method comprises the step of using a 3D camera to capture the 3D shape information.
- the method comprises the step of processing the 3D shape information to determine when the animal is in a suitable position to obtain 3D information to perform the evaluation.
- the method comprises the step of processing the 3D shape information to determine whether the animal is in a suitable stance to obtain 3D information to perform the evaluation.
- the method comprises updating a herd management system depending on the evaluation of the animal.
- the method comprises sending an automatic drafting gate a signal which is representative of the evaluation of the animal.
- the evaluation of the animal comprises a calculation of a body condition score.
- the method and/or apparatus of the preceding aspects is combined to form a system.
- the invention may be broadly said to consist in a method for calculating an evaluation of an animal comprising the method to calculate an evaluation used in the apparatus for automatically evaluating an animal.
- the combined apparatus also forms an aspect of the invention.
- the invention may also be said broadly to consist in the parts, elements and features referred to or indicated in the specification of the application, individually or collectively, in any or all combinations of two or more of said parts, elements or features, and where specific integers are mentioned herein which have known equivalents in the art to which the invention relates, such known equivalents are deemed to be incorporated herein as if individually set forth.
- a system and/or method for calculating an evaluation of an animal is substantially as herein described, with reference to any one or more of the accompanying embodiments and drawings.
- Figure 1 is a diagrammatic side view of an apparatus according to one embodiment of the present invention.
- Figure 2 is a diagrammatic top view of the apparatus of Figure 1, with the cover removed for clarity and the drafting gate positions shown in outline.
- Figure 3 is a schematic view of a system for calculating an evaluation of an animal.
- Figure 4 is a flow chart of the operation of an embodiment of the system used to calculate body condition score.
- Figure 5a, 5b are embodiments of a 3D point cloud of (a) an empty location and (b) an animal in location, as can be used in an embodiment of the method or system.
- Figure 6a, 6b show the intersection of a plane with an animal and the resulting 2D curve points measured in an embodiment of the system.
- Figure 7 shows a plurality of possible planes intersecting with an animal.
- Figure 8a, 8b are diagrams showing calculation of parameters from a 2D curve as in Figure 6b.
- a system for calculating an evaluation of an animal is generally referenced by arrow 110.
- the system estimates which class from a plurality of predetermined physical classes an animal falls into, for example by calculating a body condition score (BCS) between 1 and 5.
- BCS body condition score
- the system comprises an imaging device 11 which is capable of collecting 3D shape information corresponding to an area of interest.
- the imaging device 11 may collect information on the entire animal, or only a portion of the animal which has features which are representative of the particular parameter which is to be categorised.
- the imaging device 11 may include one or more of a LiDAR, structure from motion devices, stereo or multiview camera devices, depth cameras based on time-of-flight or any similar methodology, lightfield cameras, or any other device which provides depth information for the scene being captured.
- the imaging device may be referred to herein as a "3D camera".
- the 3D camera is capable of taking at least one photo or a series of photos.
- the series of photos may be, or may be considered as, a plurality of frames in sequence. In some embodiments there is a comparison between frames, or the evaluation of a plurality of frames is used to confirm the evaluation.
- the imaging device 11 is statically mounted, while in other embodiments it may be attached to or otherwise carried by a person.
- the imaging device is in communication with a processing device 300, typically a computer.
- the processing device receives the 3D shape information from the imaging device 11 and calculates an evaluation of the animal (e.g. by determining which class the selected animal belongs to) based on the 3D shape information.
- the class calculated may be output to a suitable visual output device such as a computer monitor, a portable device such as a tablet, or a wearable display such as Google Glass™.
- a record of the evaluation may be updated in an electronic record or database, for example MINDA™ by Livestock Improvement Corporation.
- the parameter investigated may be one of a number of possible categorisations of an animal which are based on its appearance. Examples include body condition score, lameness, udder conformation or a trait other than production (TOP), for example height at shoulder.
- TOP trait other than production
- lameness detection can be achieved using a similar approach.
- lameness is detected by analysing the shape of the spine of either a stationary cow or a cow in motion. The analysis can investigate whether the animal has an arched or flat back. Lame cows tend to stand and walk with a progressively arched back as the severity of the lameness increases, whereas healthy cows stand and walk with a flat back.
- the 2D curve can be extracted from the 3D shape information by intersecting a plane through the spine of the cow running from just in front of the hips to roughly the shoulders of the cow. Variations of the plane or location of investigation are possible.
- Measurements can then be computed which describe how flat or arched this curve is and hence predict whether this cow is lame and the degree of lameness. This can either be performed on a single frame for a stationary cow, or over a series of successive frames when analysing a cow in motion.
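As a concrete illustration, the "how flat or arched" measurement described above can be summarised by the maximum deviation of the spine curve from the chord joining its endpoints. The sketch below is a minimal NumPy example on made-up data; the function name, metric, and heights are assumptions for illustration, not the patented method.

```python
import numpy as np

def back_arch_depth(curve):
    """Maximum deviation of a 2D spine curve from the straight chord
    joining its endpoints; flat backs give values near zero."""
    curve = np.asarray(curve, dtype=float)  # shape (n, 2): (x, height)
    p0, p1 = curve[0], curve[-1]
    chord = p1 - p0
    chord /= np.linalg.norm(chord)
    # Perpendicular distance of each point from the chord.
    rel = curve - p0
    dist = np.abs(rel[:, 0] * chord[1] - rel[:, 1] * chord[0])
    return float(dist.max())

# A flat back vs an arched back (heights in metres, illustrative only).
flat = [(x, 1.40) for x in np.linspace(0, 1, 11)]
arched = [(x, 1.40 + 0.08 * np.sin(np.pi * x)) for x in np.linspace(0, 1, 11)]
print(back_arch_depth(flat))    # ~0.0
print(back_arch_depth(arched))  # ~0.08
```

Applied over successive frames, a persistently large arch depth would indicate increasing lameness severity, consistent with the analysis above.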
- Figure 4 shows a possible method for operation of an embodiment of the system which is explained in more detail below.
- the sensor, imaging device or 3D camera captures 40 an image or set of images (frames).
- the images may be smoothed or filtered 41 to improve the modelling or accuracy of the technique.
- the images are then used to form a 3D point cloud 43.
- the point cloud allows a 3D surface to be viewed without having to connect each of the points.
- the point cloud can again be smoothed or filtered 42 before the point cloud is used to create a 3D connected model 44.
- the 3D model should now be an accurate and relatively smooth replication of the animal.
- the 3D connected model is used to detect anatomical points of interest 45. This can be used to measure relative locations or to extract curves 46 between areas of interest.
- the extracted curves or model portions are preferably 2 dimensional to allow measurements of the curves to be taken 47. However this may also be possible in 3D segments.
- the curve measurements may be compared directly to a known set but preferably are provided to 48 a machine learning model 51 to determine a characteristic(s) of the animal.
- the machine learning model has been trained on a known data set 49 through a training program 50.
- the characteristic of the animal may be BCS 52, lameness, or other characteristic determinable from the geometry of the animal.
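The steps 40-52 above can be sketched end to end in a few lines. Everything in the following Python sketch (the box-filter smoothing, the unit-focal-length back-projection, and the stand-in summary features) is illustrative only and not the actual implementation; a real system would substitute its own filtering, anatomical detection and trained model.

```python
import numpy as np

def evaluate_frame(depth_frame, classifier):
    """Illustrative pipeline following steps 40-52: smooth the frame,
    build a point cloud, derive measurements, classify."""
    # Step 41: smooth the raw depth image with a simple 3x3 box filter.
    k = np.ones((3, 3)) / 9.0
    padded = np.pad(depth_frame, 1, mode="edge")
    h, w = depth_frame.shape
    smoothed = sum(padded[i:i + h, j:j + w] * k[i, j]
                   for i in range(3) for j in range(3))
    # Step 43: back-project pixels to a 3D point cloud (unit focal length assumed).
    ys, xs = np.mgrid[0:h, 0:w]
    cloud = np.stack([xs.ravel(), ys.ravel(), smoothed.ravel()], axis=1)
    # Steps 45-47: a real system detects anatomical points and extracts curve
    # measurements here; simple summary statistics stand in for them.
    features = np.array([cloud[:, 2].mean(), cloud[:, 2].std()])
    # Steps 48/51: a trained model maps the measurements to a score such as BCS.
    return classifier(features)

# A flat synthetic frame and a toy "model" standing in for the trained ML model.
score = evaluate_frame(np.full((4, 4), 2.0), lambda f: 3.0 + f[1])
print(score)  # 3.0 for a perfectly flat frame (std = 0)
```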
- the 3D shape information is used to create a 3D point cloud.
- An example point cloud 70 for a bovine animal A is shown in Figure 5b. As can be seen in the figure, the animal A is partially constrained between parallel bars, for instance in a race; however, this may not be necessary if sufficient camera views are available.
- the point cloud data is used to create a 3D surface model.
- the surface model is a triangular mesh formed using the well-known method described by Gopi & Krishnan (Gopi, M., & Krishnan, S. (2002). A Fast and Efficient Projection-Based Approach for Surface Reconstruction. SIBGRAPI '02, Proceedings of the 15th Brazilian Symposium on Computer Graphics and Image Processing (pp. 179-186). Washington, DC: IEEE Computer Society).
- Filtering and/or smoothing of the data may occur prior to the creation of the 3D point cloud and/or prior to the creation of the 3D model. This may involve removing or reducing noise or artefacts, filtering parts of the scene so that only information from a specific region of interest is analysed further, and/or other pre-processing steps. Next, one or more points or areas of interest within the model are identified. A number of options for identifying one or more areas of interest may be used, as are described further below.
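A minimal example of such a filtering step, assuming a NumPy point cloud and a simple distance-from-centroid outlier test (the actual pre-processing may differ), is:

```python
import numpy as np

def filter_cloud(cloud, z_thresh=2.5):
    """Drop points whose distance from the cloud centroid is more than
    z_thresh standard deviations above the mean (simple noise filter)."""
    cloud = np.asarray(cloud, float)
    d = np.linalg.norm(cloud - cloud.mean(axis=0), axis=1)
    keep = d <= d.mean() + z_thresh * d.std()
    return cloud[keep]

rng = np.random.default_rng(0)
cloud = rng.normal(size=(200, 3))                  # plausible surface points
cloud = np.vstack([cloud, [[50.0, 50.0, 50.0]]])   # one gross depth artefact
filtered = filter_cloud(cloud)
print(len(filtered))  # 200 — the artefact is removed, inliers are kept
```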
- Figure 6a shows how, once the 3D model has been prepared, it is possible to calculate the intersection of the model surface 70 with a plane 71.
- the intersection of the plane with the model surface forms a 2D curve 72 which has the shape of the underlying physical object captured at this location as shown in Figure 6b.
- the curve information for any part of the underlying object can be extracted for further analysis.
- a plurality of plane locations and/or orientations may be used.
- the planes 71 may be limited to a portion of the model, or they may extend across the width or depth of the model.
- the planes 71 may be sloped relative to the horizontal or vertical axis.
- the intersecting surfaces need not be planar; they may be curved, for instance passing through three areas of interest.
- the location of the planes 71 or sections of Figure 7 may be chosen to select possible areas of interest although the system is not limited to these particular planes. In fact different planes may be desirable where different characteristics are being evaluated. The number of areas of interest identified may be dependent on the quality of the images available and/or the characteristic being evaluated.
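One simple way to realise such a plane intersection on a point cloud (rather than on a connected mesh) is to keep the points lying within a small tolerance of the plane and express them in an in-plane 2D basis. The sketch below, with made-up data, assumes this tolerance-band approach; a mesh-based implementation would instead intersect the plane with each triangle.

```python
import numpy as np

def plane_section(cloud, point, normal, tol=0.01):
    """Points of a 3D cloud lying within `tol` of the plane through
    `point` with normal `normal`, projected to 2D in-plane coordinates."""
    normal = np.asarray(normal, float)
    normal /= np.linalg.norm(normal)
    rel = np.asarray(cloud, float) - np.asarray(point, float)
    near = np.abs(rel @ normal) <= tol
    # Build an orthonormal in-plane basis (u, v).
    helper = np.array([1.0, 0.0, 0.0])
    if abs(normal[0]) > 0.9:
        helper = np.array([0.0, 1.0, 0.0])
    u = np.cross(normal, helper)
    u /= np.linalg.norm(u)
    v = np.cross(normal, u)
    return np.stack([rel[near] @ u, rel[near] @ v], axis=1)

# A vertical plane x = 0 slicing a tiny synthetic cloud.
cloud = np.array([[0.0, 1.0, 2.0], [0.5, 1.0, 2.0], [0.0, 3.0, 4.0]])
curve = plane_section(cloud, point=[0, 0, 0], normal=[1, 0, 0])
print(curve.shape)  # (2, 2): two of the three points lie on the plane
```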
- the information from these curves is then passed to a machine learning (ML) framework which has been trained to evaluate the animal (for example by calculating a body condition score) based on the curve data for each region extracted from the 3D model of the animal.
- ML machine learning
- the information the ML model uses to calculate the evaluation may be the raw curve points 90 provided by intersecting the model with the intersection plane.
- a curve described by a mathematical function can be fit to the raw curve points and the coefficients of the mathematical function can be provided as features to the ML system for predicting the evaluation or class.
- An example curve 91 is shown in Figure 8a where a curve has been fit to the upper surface of an animal and the distance between intersection points has been measured.
- measurements can also be computed from a 2D curve which has been fit to the raw points and provided as features to the ML framework.
- Figure 8b shows a radius 92 being fit to a curve 90 of the top surface of an animal.
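For example, the coefficient and radius features described above might be computed as follows. This is a NumPy sketch on synthetic section points; the quadratic fit and the least-squares (Kasa) circle fit stand in for whatever fitting methods an implementation actually uses.

```python
import numpy as np

# Hypothetical 2D section points (x, height) with a parabolic ridge profile.
x = np.linspace(-0.2, 0.2, 21)
z = 1.35 - 2.0 * x**2

# Feature set 1: coefficients of a quadratic fitted to the raw curve points.
coeffs = np.polyfit(x, z, deg=2)

# Feature set 2: radius of a least-squares (Kasa) circle fit, as in Figure 8b.
def fit_circle_radius(px, pz):
    # Solve x^2 + z^2 = 2*cx*x + 2*cz*z + c in the least-squares sense.
    A = np.column_stack([2 * px, 2 * pz, np.ones(len(px))])
    b = px**2 + pz**2
    (cx, cz, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    return float(np.sqrt(c + cx**2 + cz**2))

radius = fit_circle_radius(x, z)
features = np.concatenate([coeffs, [radius]])  # combined ML feature vector
print(np.round(coeffs, 3))  # ≈ [-2.0, 0.0, 1.35]
```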
- all of the aforementioned features can be supplied when training the system and the ML system can determine which set of complementary features can best be used to evaluate the parameter which is under evaluation.
- Other metadata about each animal can also be provided such as animal breed breakdown and other properties that may be relevant to the evaluation.
- a number of techniques may be used to detect features or regions from which the curve information is to be extracted.
- 3D feature descriptors which describe the shape of a region of the object.
- 3D descriptor type is described by Rusu et al (Persistent Point Feature Histograms for 3D Point Clouds (Radu Bogdan Rusu, Zoltan Csaba Marton, Nico Blodow, Michael Beetz), Proceedings of the 10th International Conference on Intelligent Autonomous Systems (IAS-10), Baden-Baden, Germany, 2008.) although many others exist.
- an anatomical feature or region of the animal can be identified. If the feature or region is distinctive enough from the rest of the input data, the system can locate the same region in another similar view (e.g., another "frame") of the same underlying animal.
- the detection of anatomical points of interest may use any one or more of seam carving, plane sweeping or conforming the model to a known structure. By extracting several descriptors around the vicinity of the 'centre' of the anatomical part the orientation of the part can also be established.
- the system may exploit known or fixed constraints, as well as knowledge of the environment in which the input data was captured, to reduce the search space when looking for certain anatomical regions of interest.
- the known anatomy of the animal may be exploited to reduce the search space further or eliminate false positive matches.
- the model may include predetermined limits for anatomical data
- the orientation of a given plane may also be set relative to other parts of the model, that is, it may be set to be parallel to or perpendicular to other parts.
- its orientation may be set relative to the pin bones right at the back of the animal.
- correct plane orientation can also be determined by rotating or translating the positioned plane and maximizing or minimizing an angle, curvature, length or depth appropriately for the region.
- Various possible plane positions are shown in Figure 7.
- intersection plane described above may be swept across the model of the animal and anatomical parts of interest identified based on their distinctive shape profile. For instance when identifying the correct plane placement to extract a vertical cut across the tailhead region the plane may be swept across the broad area where this region is expected to be, and then the plane position that maximises the depth between the pin bones and the tail can be selected as the point that represents this region.
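The sweep-and-select step might be sketched as follows, assuming a hypothetical `depth_profile` function that returns the pin-bone-to-tail depth of the cross-section at a given plane position (all names here are illustrative):

```python
import numpy as np

def sweep_for_tailhead(depth_profile, search_range):
    """Sweep a candidate intersection plane across a region and pick
    the position maximising a depth metric, per the plane-sweep idea:
    depth_profile(pos) returns the cross-section depth at pos."""
    positions = np.asarray(search_range, dtype=float)
    depths = np.array([depth_profile(p) for p in positions])
    best = positions[np.argmax(depths)]
    return best, depths.max()

# Hypothetical depth metric peaking at position 0.4 along the animal
profile = lambda p: np.exp(-((p - 0.4) ** 2) / 0.01)
pos, depth = sweep_for_tailhead(profile, np.linspace(0.0, 1.0, 101))
```

The selected position is then used to place the intersection plane for extracting the tailhead curve.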
- a global optimization is applied which minimizes the descriptor distance between candidate locations on anatomical regions while maintaining a feasible object pose.
- This optimization simultaneously minimizes geometric measurements at the proposed anatomical feature locations. For instance, when applying the method to the problem of body condition scoring and determination of the precise location of the backbone, the height of the backbone ridge is maximised and/or the point which maximises or minimizes curvature (for example the radius of an osculating circle fit to the backbone ridge curve) is selected. Curvature of other regions such as the hips may also be maximised or minimized, or other geometric measurements or measurements associated with the hip to hip curve may be used, where the area between a straight line connecting the proposed hip points and the curve of the cow's surface may be maximised. Other similar properties of each anatomical region can be exploited.
- the size of the particular animal needs to be calculated and the curve data adjusted based on this size.
- Several geometries or measures of the size of a given animal can be used for this purpose, for example the length of the animal, its width, or its height.
- the position of the sensor from which the 3D shape information is captured e.g., the 3D camera
- the geometries or measures are calculated by comparing multiple frames or images of the animal, or by comparison with a known feature. For instance a ruler or known length could be included in the image field.
- Calculation of the animal's height can be established through knowledge of how far from the ground a certain part of the animal is. Often when measuring the stature of an animal such as a dairy cow the height at the shoulders is used. If the ground is visible from the perspective of the sensor (for instance a 3D camera) then a ground plane can be fit to 3D points on the ground, and thus the height above the ground for any point on the animal model can be easily calculated. If the depth sensor's position is static then a 3D capture of the area in which the animal stands when the 3D image is taken may also be used to pre-calculate the ground plane for later use in determining the height of any point on the animal model.
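A minimal sketch of the ground-plane approach (illustrative only): fit a plane z = ax + by + c to visible ground points by least squares, then measure the perpendicular distance of any point on the animal model from that plane:

```python
import numpy as np

def fit_ground_plane(ground_points):
    """Least-squares plane z = ax + by + c through visible ground points."""
    pts = np.asarray(ground_points, dtype=float)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    coeffs, *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    return coeffs  # (a, b, c)

def height_above_ground(point, plane):
    """Signed perpendicular distance from a 3D point to the plane."""
    a, b, c = plane
    x, y, z = point
    return (z - (a * x + b * y + c)) / np.sqrt(a * a + b * b + 1.0)

# Four points on a level floor, then a shoulder point 1.4 m above it
ground = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)]
plane = fit_ground_plane(ground)
h = height_above_ground((0.5, 0.5, 1.4), plane)
```

With a static sensor, `plane` can be computed once from an empty-race capture and reused, as the text notes.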
- the 3D shape information of the surface S on which the animals are imaged is taken when the animal is not standing on the surface as shown in Figure 5a.
- a point cloud may be formed from this image for use in the method. This provides an initial position from which a height can be extrapolated if required.
- Height may be computed at a consistent location on the animal (just forward of the hips) or an average over the entire length of the backbone from the tailhead forward may be used.
- a single view containing both the relevant part of the animal and the ground can be used to compute the animal's height.
- several points along the backbone of the animal can be used.
- multiple height measurements of the same animal taken over time from successive captures (images) could be aggregated to ensure that any single erroneous measurement does not unduly affect the result.
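For instance, aggregating with the median rather than the mean keeps one bad frame from skewing the height estimate (a sketch of one possible aggregation, not the patent's specified method):

```python
import numpy as np

def aggregate_heights(measurements):
    """Combine height estimates from successive captures; the median
    resists a single outlier frame better than the mean."""
    m = np.asarray(measurements, dtype=float)
    return float(np.median(m))

# One outlier frame (e.g. the animal's head dipped) barely moves the result
h = aggregate_heights([1.41, 1.40, 1.42, 1.39, 0.95])
```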
- Calculation of the animal's width may be preferred as it does not rely on knowledge of where the ground is.
- the length of the animal may be used as it does not require knowledge of the position of the ground. However, this does require the sensor to be far enough away from the animal to see its entire length.
- the evaluation of the animal may be based on absolute measurements, such that normalisation of some or all of the data is not required.
- the image capture, point cloud formation and 3D model generation steps are the same as those described above. However, in this embodiment simpler features are extracted and the process of accurate anatomical point detection and plane placement is avoided.
- This method involves finding the rear-most point of the animal.
- the point cloud surface is then divided into a grid and the height from the ground of each individual point in each grid square is computed and then normalized by the height of the particular animal. Measures such as the average height, maximum height, minimum height, and standard deviation of the height of the points, may be computed for each grid square.
- the measurements for all grid squares are then entered into the ML framework to calculate the evaluation of the animal, for example by determining the category of the animal.
- the size of each grid square needs to be large enough to ensure that precise localisation of the individual squares on the surface of the animal does not significantly affect the measurements of the grid. Conversely the squares must not be so large that the discriminative power of the measurements that describe the region and its depressions (or lack thereof) is lost.
- Normalizing the region under analysis may be achieved by ascertaining the animal's height, as is described above. Any curve data calculated may be normalized by a factor derived from the animal's actual height relative to a standardised height.
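The grid-statistics feature extraction described above might look like the following sketch (the cell count, synthetic point cloud and height value are placeholders, not values from the patent):

```python
import numpy as np

def grid_height_features(points, animal_height, n_cells=8):
    """Divide the top-surface point cloud into an n_cells x n_cells grid
    over its x-y extent and compute per-cell height statistics (mean,
    max, min, std), normalised by the animal's height."""
    pts = np.asarray(points, dtype=float)
    xy = pts[:, :2]
    z = pts[:, 2] / animal_height            # normalise by animal size
    mins, maxs = xy.min(axis=0), xy.max(axis=0)
    idx = np.clip(((xy - mins) / (maxs - mins) * n_cells).astype(int),
                  0, n_cells - 1)
    features = []
    for i in range(n_cells):
        for j in range(n_cells):
            cell = z[(idx[:, 0] == i) & (idx[:, 1] == j)]
            if len(cell) == 0:
                features += [0.0, 0.0, 0.0, 0.0]
            else:
                features += [cell.mean(), cell.max(), cell.min(), cell.std()]
    return np.array(features)

rng = np.random.default_rng(1)
cloud = rng.uniform(0, 1, (5000, 3))         # stand-in for a 3D capture
feats = grid_height_features(cloud, animal_height=1.4, n_cells=4)
```

The resulting fixed-length vector (4 statistics per cell) is what would be fed to the ML framework.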
- imaging device 11 and the processing device may be integrated into a single unit, or may be separate.
- the function of the processing device 2 may be distributed between two or more processors.
- the processor may be a computer or microprocessor or logic device.
- the system may use a matrix or grid which is superimposed on the 3D model of the animal.
- the method may measure the volume in each matrix grid position, or a shape of the volume in each matrix grid position to provide an input to the machine learning module 51.
- the size of the grid may be adjusted depending on the animal or accuracy required, or a plurality of grid sizes may be used.
- the grid sizing may be adaptive dependent on the curvature or other aspect of the model.
- an apparatus for automatically evaluating an animal is generally referenced by arrow 100.
- the animal is a bovine.
- the apparatus 100 comprises two spaced apart static barrier means 1.
- the static barrier means 1 are typically substantially parallel, as shown in Figure 1.
- the static barrier means 1 may comprise a prior art cattle race, and are spaced apart sufficiently widely to allow an animal A to comfortably walk between them, but not so widely as to allow the animal A to turn around.
- At least one automatically moveable barrier means 2 is provided, typically configured as a pair of pneumatically operated doors.
- the barrier means 2 may be provided as first moveable barrier means at the entrance end of the race (that is, a first end of the static barrier means 1) and/or as second moveable barrier means at the opposite, exit end of the race (at the second, opposite end of the static barrier means 1).
- the moveable barrier means 2a can be opened to allow animals A to proceed into the space between the static barrier means 1, or closed to prevent animals behind the barrier means 2a from proceeding forward and to prevent animals A in front of the barrier means 2a from moving backward.
- the moveable barrier means 2b can be opened to allow the animal A to proceed out of the space between the static barrier means 1, or can be closed to bring the animal A to a halt within the space between the static barrier means 1, and to prevent an animal in front of the moveable barrier means 2b from moving backward into that space.
- a structure 3 comprising a cover 4 may be provided.
- the cover 4, if provided, must be sufficiently high that the animal is comfortable walking through the structure 3, but is preferably sufficiently low that some or all of the animal inside the structure is in shadow.
- the cover 4 may extend partially or fully down the sides of the structure 3.
- the apparatus 100 may be provided with a walk-over weigh platform (not shown).
- the apparatus 100 is provided with animal position sensing means for sensing the position of the animal A.
- the animal position sensing means comprise a photoelectric sensor 6 located at a first position 7 for sensing when a required portion of the animal has moved through the first moveable barrier means 2a.
- the first position 7 is spaced apart from the first moveable barrier means 2a, or the first end of the static barrier means 1, by a distance which is approximately equal to the length of the animal, for example around 150 cm.
- the animal position sensing means may also comprise a second photoelectric sensor 9 located at a second position 10 which is substantially adjacent the first moveable barrier means 2a, or if that is not present, is adjacent the first end of the static barrier means 1.
- the apparatus 100 comprises a 3D imaging device 11.
- the 3D camera 11 is positioned such that one or more portions of the animal which are relevant to the evaluation of the animal can be brought within the field of view of the 3D camera 11. These portions of the animal are described herein as the "area of interest". In some embodiments not all of the areas of interest will be within the field of view of the 3D camera simultaneously, but rather, information about each of the areas of interest may be captured at different times.
- an artificial lighting source 12 may be provided.
- the lighting source 12, if provided, is preferably adjustable (preferably automatically) to provide at least a minimum light level required by the 3D camera 11.
- When measuring characteristics of the animal with the 3D camera it is often preferable that the animal be stationary for a short time in order to improve the accuracy of the measurements taken.
- When assessing animal characteristics such as BCS or lameness it is preferable for the animal's pose to be such that it is standing with even weight distribution and with its joints in a consistent position, in order to get an accurate sense of the animal's body structure and shape without the changes in body shape introduced by the animal being in motion. In addition, it may be preferable to stop the animal in order to allow some further interaction with the animal.
- the apparatus may allow a certain animal to pass through the apparatus without taking any steps to evaluate it, if certain conditions are present, one of those conditions being whether a new evaluation of the animal is "required".
- an evaluation of the animal is said to be "required" if more than a threshold period of time has elapsed since the last evaluation.
- the threshold period may be changed depending on the result of the last evaluation (for example, a cow which was last assessed as lame may be monitored more frequently than other cows which were not last assessed as lame). If an evaluation is "required" then an evaluation will be performed at the next convenient occasion. However, this does not mean that the apparatus will perform an evaluation of the animal the very next time it passes through the apparatus, if certain other conditions (as described further below) mean that it is not possible or not convenient to do so.
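This "required" logic can be sketched as below; the interval values and the `"lame"` label are illustrative placeholders, not values from the patent:

```python
from datetime import datetime, timedelta

def evaluation_required(last_eval_time, last_result, now,
                        normal_interval=timedelta(days=7),
                        lame_interval=timedelta(days=2)):
    """Decide whether a new evaluation is 'required': the threshold
    period shortens when the previous result flagged the animal
    (e.g. as lame), so flagged animals are monitored more often."""
    if last_eval_time is None:
        return True  # never evaluated before
    threshold = lame_interval if last_result == "lame" else normal_interval
    return now - last_eval_time >= threshold

now = datetime(2016, 8, 17, 9, 0)
r1 = evaluation_required(datetime(2016, 8, 14, 9, 0), "lame", now)
r2 = evaluation_required(datetime(2016, 8, 14, 9, 0), "normal", now)
```

In practice `last_eval_time` and `last_result` would come from the database the apparatus is in communication with.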
- the apparatus 100 may be in communication with a database to record the evaluation, when performed, and to receive information on when the last evaluation was performed and what its result was.
- Other conditions which may be used in making the decision on whether or not to evaluate the animal may include the speed at which the animal is moving, and the distance between the animal and any other animals in front or behind. Animals which are too close to other animals may not be evaluated, as the presence of two animals in the field of view of the 3D camera may result in an incorrect evaluation. In addition, the fact that animals are closely spaced together can be an indication that the animals are becoming bottlenecked in the race (i.e. the animals waiting to proceed through the system are being crowded together) perhaps because one animal has not proceeded through the system as quickly as expected or because animals are coming out of the milking shed faster than anticipated.
- the system may not evaluate any animals (or at least, may not close any of the moveable barriers 2a, 2b) until it detects that the space between the animal currently between the static barriers and the next animal waiting to enter the space between the barriers is at least equal to a predetermined minimum distance. Missing some evaluations during a single milking is not a problem, as properties such as BCS or other metrics change slowly; obtaining a measurement once every few days is sufficient.
- gate 2a will close whenever required to ensure separation and valid heat detection results, as timely assessment of oestrus is critical to the farmer. Damage to an adjacent animal from closing a moveable barrier 2a, 2b on it is avoided, as is closing a barrier 2a, 2b at a time which might startle an adjacent animal.
- animals which are moving rapidly though the apparatus may not be evaluated as they may be moving too fast for accurate information to be collected from the 3D camera, and too fast to safely bring them to a halt by closing the second moveable barrier means 2b.
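Taken together, the gating conditions above (evaluation due, animal speed, spacing to the following animal) might be combined as in this sketch; the numeric limits are placeholders, not values from the patent:

```python
def should_evaluate(required, speed_m_s, gap_to_next_m,
                    max_speed=1.5, min_gap=1.0):
    """Gate the capture: skip animals that are not due for evaluation,
    are moving too fast to halt safely, or are too close to the
    following animal (risking two animals in the camera's view)."""
    if not required:
        return False          # no evaluation due yet
    if speed_m_s > max_speed:
        return False          # too fast to stop or image accurately
    if gap_to_next_m < min_gap:
        return False          # animals bottlenecked / too close together
    return True

ok = should_evaluate(True, speed_m_s=0.8, gap_to_next_m=2.0)
crowded = should_evaluate(True, speed_m_s=0.8, gap_to_next_m=0.4)
```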
- the apparatus may close the second movable barrier means 2b to bring the animal to a complete halt while 3D information of the area(s) of interest is captured.
- the start of the signal from the animal presence sensor 16 is allowed to initiate the command to close moveable barrier 2b.
- the presence of an animal at drafting gate entrance sensor 13 inhibits this command, preventing moveable barrier 2b from closing in the case where there is insufficient distance between animals.
- the apparatus may collect the 3D information without closing the second moveable barrier means 2b. This may occur in particular when the system is used to evaluate cows which are waiting to be milked in a rotary milking shed. Operation of a preferred embodiment of the apparatus 100 is as follows:
- the moveable barrier means doors 2a, 2b are normally in the open position so that an animal A can move past the moveable barrier means 2a and into the field of view of the 3D camera 11.
- the first photoelectric sensor 6 detects the presence of the head or chest of the animal A. Triggering of the first photoelectric sensor 6 may cause the moveable barrier means 2a to close behind the animal A, preventing the animal from moving backwards, and preventing the head of another animal from entering the field of view of the 3D camera.
- the second moveable barrier means 2b may be closed, or both barrier means 2a, 2b may be closed.
- the 3D camera 11 may be triggered to record one or more images. Alternatively the 3D camera may be triggered a predetermined time after the first photoelectric sensor 6 detects the presence of the animal, or when a video analysis of the animal shows that the animal is in a suitable position and pose for information to be captured.
- the 3D camera 11 records a plurality of images, for example three images. Each image may have a different exposure setting.
- the system may analyse a video signal to determine when the animal is moving the least and capture an image at that time.
- the position sensor may comprise any applicable type of position sensor, for example an acoustic range sensor, a motion sensor, a camera based system performing video analysis, a thermal camera, a depth camera, a laser based device or the like. These may replace one or more of the photo eyes 6, 9.
- the position sensor may be capable of assessing the speed at which the animal is moving through the apparatus 100.
- This may comprise a plurality of photoeye sensors or a video system.
- the apparatus 100 will be used in conjunction with an automated drafting gate system 200.
- the automated drafting gate system 200 is provided with a sensor 13 (for example a photoelectric sensor) to indicate that the animal has passed through the drafting gate entrance 14; a signal from this sensor 13 may be used to indicate that the moveable barrier means 2a, if closed, can be opened.
- the apparatus may utilize moveable barrier means which are part of an existing automated drafting gate system 200 as the second moveable barrier means 2b.
- the system comprises an EID reader 15.
- a further sensor 16 may be positioned to indicate that the animal is in position for the EID sensor to take a reading of an EID tag associated with the animal. If the EID reader 15 has not obtained a reading within a predetermined time of the sensor 16 indicating that the animal is in position, then the moveable barrier means 2a may be kept closed until the animal has moved past another sensor 17 positioned at an exit of the drafting gates (if provided).
- when the moveable barrier means 2a open to allow access to a second animal, the animal A which has just been processed by the apparatus 100 will be motivated to move away.
- further means for motivating the first animal A to move away from the moveable barrier means 2 may be provided, for example a nozzle configured to squirt compressed air towards the animal, or means (possibly pneumatic) for making a suitable noise.
- the 3D camera 11 is in communication with a processing means 300 which performs an analysis of the images taken from the 3D camera 11 to calculate an evaluation of the animal A, and to determine whether an evaluation of the animal is required, and if one is required, whether the correct conditions (e.g. speed, proximity to other animals) are present for an evaluation to occur.
- the evaluation comprises categorising of the animal, for example by calculating a body condition score (BCS) between 1 and 5 for the animal.
- the processing means 300 may send a control signal to the automated drafting gates 200 depending on the result of the evaluation. For example, in one embodiment cows with a normal BCS may be drafted into one area (for example an entrance to a milking shed), while cows with a low BCS may be drafted into an area where additional feed is available. Cows which have failed to be identified by the electronic ID reader may be drafted into a third area.
- an animal may not be drafted into a special area as soon as the result of the evaluation indicates that this may be necessary. Instead, a record may be kept that the animal must be drafted at a later time.
- the position sensing means may comprise the 3D camera 11 and processing means 300.
- the first and second sensors 6, 9 may not be required, as the apparatus may be capable of determining when the animal is in the correct position to capture an image of the area of interest without the use of additional sensors.
- the 3D camera 11 may operate substantially continuously while the apparatus is in use.
- the position sensing means are operable to determine whether a second animal is within a predetermined distance, for example 100 cm, of the moveable barrier means 2a.
- the position sensing means may comprise a further photoelectric sensor located substantially 100 cm in front of the moveable barrier means 2a.
- This additional sensor may be used to determine whether another animal is within a predetermined distance of the moveable barrier means 2a.
- This information may be used to determine if the moveable barrier means 2a, 2b are to be held open (e.g., if the animals are bottlenecked in the area leading to the apparatus), and may also be used to determine that the moveable barrier means 2a, 2b must be closed to ensure separation of the animals (e.g. if the evaluation includes oestrus detection).
- the decision on when to open the moveable barrier may be based on further criteria, sensors or characteristics.
- the system may use:
- the milking cycle (i.e. a.m. or p.m.)
- Processing of the information from the 3D camera to calculate the evaluation of the animal may be performed using any of the embodiments described herein.
- the processor may use the output from the 3D camera and/or an additional 2D camera to detect the presence of an oestrus indicator on the animal, and may include an analysis of the indicator in the calculation of the evaluation, or as part of a separate evaluation calculation.
- the oestrus indicator may be a pressure sensitive heat detection patch, or any suitable alternative oestrus indicator.
- the oestrus indicator may comprise a tail paint marking.
- the oestrus indicator may comprise a patch which has different infra-red or human visible characteristics when activated.
- the present invention provides an apparatus and method for automatically evaluating an animal which can be operated independently of a rotary milking shed and which creates a minimal disruption to the movement of the animals through the race.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Life Sciences & Earth Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Environmental Sciences (AREA)
- Software Systems (AREA)
- Evolutionary Computation (AREA)
- Human Computer Interaction (AREA)
- Health & Medical Sciences (AREA)
- Artificial Intelligence (AREA)
- Computing Systems (AREA)
- Databases & Information Systems (AREA)
- Animal Husbandry (AREA)
- Medical Informatics (AREA)
- Biodiversity & Conservation Biology (AREA)
- Biophysics (AREA)
- General Health & Medical Sciences (AREA)
- Quality & Reliability (AREA)
- Computer Graphics (AREA)
- Geometry (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The invention concerns a method and apparatus for automatically evaluating an animal based on its physical appearance, and in particular, but not exclusively, a method and apparatus for determining a body condition score (BCS) or lameness, particularly for a dairy cow. The present invention also concerns methods for automatically evaluating a body condition score for cattle. The method preferably comprises receiving three-dimensional (3D) shape information corresponding to a space occupied by the animal; creating a 3D point cloud from the 3D shape information; creating a 3D model based on the shape information; calculating one or more representative measurements from the 3D model; and evaluating the animal based on the measurements.
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
NZ71109215 | 2015-08-17 | ||
NZ711098 | 2015-08-17 | ||
NZ711098A NZ711098A (en) | 2015-08-17 | Apparatus and method for automatically evaluating an animal | |
NZ711092 | 2015-08-17 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017030448A1 true WO2017030448A1 (fr) | 2017-02-23 |
Family
ID=58051110
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/NZ2016/050129 WO2017030448A1 (fr) | 2015-08-17 | 2016-08-17 | Procédé et appareil pour évaluer un animal |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2017030448A1 (fr) |
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108416260A (zh) * | 2018-01-25 | 2018-08-17 | 北京农业信息技术研究中心 | 一种三维图像监测装置及方法 |
WO2019003015A1 (fr) * | 2017-06-29 | 2019-01-03 | The Gsi Group Llc | Estimation de poids d'animal basée sur la régression |
CN109238264A (zh) * | 2018-07-06 | 2019-01-18 | 中国农业大学 | 一种家畜位置姿势归一化方法及装置 |
WO2019036532A1 (fr) * | 2017-08-16 | 2019-02-21 | Adams Land & Cattle Co. | Installation de tri de bétail |
JP2019187277A (ja) * | 2018-04-24 | 2019-10-31 | 国立大学法人 宮崎大学 | 牛のボディコンディションスコアの評価装置、評価方法及び評価プログラム |
CN111353416A (zh) * | 2020-02-26 | 2020-06-30 | 广东温氏种猪科技有限公司 | 基于牲畜三维测量的姿态检测方法、系统及存储介质 |
CN112036364A (zh) * | 2020-09-14 | 2020-12-04 | 北京海益同展信息科技有限公司 | 跛行识别方法及装置、电子设备和计算机可读存储介质 |
CN112825791A (zh) * | 2020-12-25 | 2021-05-25 | 河南科技大学 | 一种基于深度学习与点云凸包化特征的奶牛体况评分方法 |
WO2021259886A1 (fr) * | 2020-06-25 | 2021-12-30 | Signify Holding B.V. | Système de détection pour déterminer un paramètre d'un groupe d'animaux |
CN114266811A (zh) * | 2021-11-26 | 2022-04-01 | 河南讯飞智元信息科技有限公司 | 牲畜体况评分方法、装置、电子设备及存储介质 |
US11425892B1 (en) | 2021-08-18 | 2022-08-30 | Barel Ip, Inc. | Systems, methods, and user interfaces for a domestic animal identification service |
WO2023031759A1 (fr) | 2021-09-02 | 2023-03-09 | Lely Patent N.V. | Système d'élevage d'animaux |
US20230260327A1 (en) * | 2020-06-30 | 2023-08-17 | Cattle Eye Ltd | Autonomous livestock monitoring |
CN116883328A (zh) * | 2023-06-21 | 2023-10-13 | 查维斯机械制造(北京)有限公司 | 基于计算机视觉的牛胴体脊柱区域快速提取方法 |
US20230342902A1 (en) * | 2022-04-22 | 2023-10-26 | Iclassifier Inc. | Method and system for automated evaluation of animals |
US11910784B2 (en) | 2020-10-14 | 2024-02-27 | One Cup Productions Ltd. | Animal visual identification, tracking, monitoring and assessment systems and methods thereof |
PL442537A1 (pl) * | 2022-10-17 | 2024-04-22 | Szkoła Główna Gospodarstwa Wiejskiego w Warszawie | Sposób oceny kondycji bydła domowego |
WO2024088479A1 (fr) | 2022-10-28 | 2024-05-02 | Technische Universität München, Körperschaft des öffentlichen Rechts | Procédé et dispositif d'acquisition et d'analyse automatiques de la représentation de la démarche d'un animal |
US12121007B2 (en) | 2021-09-07 | 2024-10-22 | Hill's Pet Nutrition, Inc. | Method for determining biometric data relating to an animal based on image data |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050136819A1 (en) * | 2002-08-02 | 2005-06-23 | Kriesel Marshall S. | Apparatus and methods for the volumetric and dimensional measurement of livestock |
WO2010127023A1 (fr) * | 2009-05-01 | 2010-11-04 | Spicola Tool, Llc | Système stéréoscopique d'estimation de la masse sans contact et à distance |
US20110279650A1 (en) * | 2008-12-03 | 2011-11-17 | Bohao Liao | Arrangement and method for determining a body condition score of an animal |
US20140029808A1 (en) * | 2012-07-23 | 2014-01-30 | Clicrweight, LLC | Body Condition Score Determination for an Animal |
WO2016023075A1 (fr) * | 2014-08-13 | 2016-02-18 | Meat & Livestock Australia Limited | Imagerie 3d |
-
2016
- 2016-08-17 WO PCT/NZ2016/050129 patent/WO2017030448A1/fr active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050136819A1 (en) * | 2002-08-02 | 2005-06-23 | Kriesel Marshall S. | Apparatus and methods for the volumetric and dimensional measurement of livestock |
US20110279650A1 (en) * | 2008-12-03 | 2011-11-17 | Bohao Liao | Arrangement and method for determining a body condition score of an animal |
WO2010127023A1 (fr) * | 2009-05-01 | 2010-11-04 | Spicola Tool, Llc | Système stéréoscopique d'estimation de la masse sans contact et à distance |
US20140029808A1 (en) * | 2012-07-23 | 2014-01-30 | Clicrweight, LLC | Body Condition Score Determination for an Animal |
WO2016023075A1 (fr) * | 2014-08-13 | 2016-02-18 | Meat & Livestock Australia Limited | Imagerie 3d |
Cited By (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10816387B2 (en) | 2017-06-29 | 2020-10-27 | The Gsi Group Llc | Regression-based animal weight estimation |
WO2019003015A1 (fr) * | 2017-06-29 | 2019-01-03 | The Gsi Group Llc | Estimation de poids d'animal basée sur la régression |
US11819008B2 (en) | 2017-08-16 | 2023-11-21 | Adams Land & Cattle Co. | Livestock sorting facility |
WO2019036532A1 (fr) * | 2017-08-16 | 2019-02-21 | Adams Land & Cattle Co. | Installation de tri de bétail |
US11324203B2 (en) | 2017-08-16 | 2022-05-10 | Adams Land & Cattle Co. | Livestock sorting facility |
CN108416260A (zh) * | 2018-01-25 | 2018-08-17 | 北京农业信息技术研究中心 | 一种三维图像监测装置及方法 |
JP7062277B2 (ja) | 2018-04-24 | 2022-05-06 | 国立大学法人 宮崎大学 | 牛のボディコンディションスコアの評価装置、評価方法及び評価プログラム |
JP2019187277A (ja) * | 2018-04-24 | 2019-10-31 | 国立大学法人 宮崎大学 | 牛のボディコンディションスコアの評価装置、評価方法及び評価プログラム |
CN109238264B (zh) * | 2018-07-06 | 2020-09-01 | 中国农业大学 | 一种家畜位置姿势归一化方法及装置 |
CN109238264A (zh) * | 2018-07-06 | 2019-01-18 | 中国农业大学 | 一种家畜位置姿势归一化方法及装置 |
CN111353416A (zh) * | 2020-02-26 | 2020-06-30 | 广东温氏种猪科技有限公司 | 基于牲畜三维测量的姿态检测方法、系统及存储介质 |
CN111353416B (zh) * | 2020-02-26 | 2023-07-07 | 广东温氏种猪科技有限公司 | 基于牲畜三维测量的姿态检测方法、系统及存储介质 |
WO2021259886A1 (fr) * | 2020-06-25 | 2021-12-30 | Signify Holding B.V. | Système de détection pour déterminer un paramètre d'un groupe d'animaux |
US12290051B2 (en) | 2020-06-25 | 2025-05-06 | Signify Holding, B.V. | Sensing system for determining a parameter of a set of animals |
US20230260327A1 (en) * | 2020-06-30 | 2023-08-17 | Cattle Eye Ltd | Autonomous livestock monitoring |
CN112036364A (zh) * | 2020-09-14 | 2020-12-04 | 北京海益同展信息科技有限公司 | 跛行识别方法及装置、电子设备和计算机可读存储介质 |
CN112036364B (zh) * | 2020-09-14 | 2024-04-16 | 京东科技信息技术有限公司 | 跛行识别方法及装置、电子设备和计算机可读存储介质 |
US12193414B2 (en) | 2020-10-14 | 2025-01-14 | One Cup Productions Ltd. | Animal visual identification, tracking, monitoring and assessment systems and methods thereof |
US11910784B2 (en) | 2020-10-14 | 2024-02-27 | One Cup Productions Ltd. | Animal visual identification, tracking, monitoring and assessment systems and methods thereof |
CN112825791A (zh) * | 2020-12-25 | 2021-05-25 | 河南科技大学 | 一种基于深度学习与点云凸包化特征的奶牛体况评分方法 |
CN112825791B (zh) * | 2020-12-25 | 2023-02-10 | 河南科技大学 | 一种基于深度学习与点云凸包化特征的奶牛体况评分方法 |
US11425892B1 (en) | 2021-08-18 | 2022-08-30 | Barel Ip, Inc. | Systems, methods, and user interfaces for a domestic animal identification service |
WO2023031759A1 (fr) | 2021-09-02 | 2023-03-09 | Lely Patent N.V. | Système d'élevage d'animaux |
US12121007B2 (en) | 2021-09-07 | 2024-10-22 | Hill's Pet Nutrition, Inc. | Method for determining biometric data relating to an animal based on image data |
CN114266811A (zh) * | 2021-11-26 | 2022-04-01 | 河南讯飞智元信息科技有限公司 | 牲畜体况评分方法、装置、电子设备及存储介质 |
US20230342902A1 (en) * | 2022-04-22 | 2023-10-26 | Iclassifier Inc. | Method and system for automated evaluation of animals |
PL442537A1 (pl) * | 2022-10-17 | 2024-04-22 | Szkoła Główna Gospodarstwa Wiejskiego w Warszawie | Method for assessing the body condition of domestic cattle |
WO2024088479A1 (fr) | 2022-10-28 | 2024-05-02 | Technische Universität München, Körperschaft des öffentlichen Rechts | Method and device for automatic acquisition and analysis of the gait pattern of an animal |
DE102022128733A1 (de) | 2022-10-28 | 2024-05-08 | Technische Universität München, Körperschaft des öffentlichen Rechts | Method for automated acquisition and analysis of the gait pattern of an animal |
CN116883328A (zh) * | 2023-06-21 | 2023-10-13 | 查维斯机械制造(北京)有限公司 | Rapid extraction method for the spine region of a cattle carcass based on computer vision |
CN116883328B (zh) * | 2023-06-21 | 2024-01-05 | 查维斯机械制造(北京)有限公司 | Rapid extraction method for the spine region of a cattle carcass based on computer vision |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017030448A1 (fr) | Method and apparatus for evaluating an animal | |
CN108717523B (zh) | Sow estrus behavior detection method based on machine vision | |
CA2744146C (fr) | Arrangement and method for determining the body condition score of an animal | |
EP2925121B1 (fr) | System and method for predicting the health outcome of a subject | |
US9737040B2 (en) | System and method for analyzing data captured by a three-dimensional camera | |
US10373306B2 (en) | System and method for filtering data captured by a 3D camera | |
US20240382109A1 (en) | Systems and methods for the automated monitoring of animal physiological conditions and for the prediction of animal phenotypes and health outcomes | |
US10303939B2 (en) | System and method for filtering data captured by a 2D camera | |
Huang et al. | Cow tail detection method for body condition score using Faster R-CNN | |
WO2016023075A1 (fr) | 3D imaging | |
CN113598081B (zh) | Automatic health monitoring system for beef cattle | |
CA2775395C (fr) | Vision system for robotic attacher | |
US20250212845A1 (en) | System and method for detecting lameness in cattle | |
US9681634B2 (en) | System and method to determine a teat position using edge detection in rear images of a livestock from two cameras | |
US9171208B2 (en) | System and method for filtering data captured by a 2D camera | |
NZ711098A (en) | Apparatus and method for automatically evaluating an animal | |
Yuan et al. | Stress-free detection technologies for pig growth based on welfare farming: A review | |
CA2849212C (fr) | Vision system for robotic attacher | |
CN119359783B (zh) | Posture detection method based on three-dimensional livestock measurement | |
Zhao et al. | Real-time automatic classification of lameness in dairy cattle based on movement analysis with image processing technique | |
WO2025068550A1 (fr) | System and method for determining at least one body parameter of an individual bovine by remote imaging | |
IL305217A (en) | Identifying individual pigs in a pen during visual tracking using proximity sensors | |
CA2924285C (fr) | Vision system for robotic attacher |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | EP: the EPO has been informed by WIPO that EP was designated in this application |
Ref document number: 16837377 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | EP: PCT application non-entry into the European phase |
Ref document number: 16837377 Country of ref document: EP Kind code of ref document: A1 |