NZ711098A - Apparatus and method for automatically evaluating an animal - Google Patents

Apparatus and method for automatically evaluating an animal

Info

Publication number
NZ711098A
NZ711098A
Authority
NZ
New Zealand
Prior art keywords
animal
barrier means
evaluation
space
static
Prior art date
Application number
NZ711098A
Inventor
Lyal Harris Bevin
James Catchpole Jason
Amy Hempstalk Kathryn
Stephen Cross Peter
Original Assignee
Livestock Improvement Corporation Limited
Publication of NZ711098A
Application filed by Livestock Improvement Corporation Limited
Priority to PCT/NZ2016/050129 (WO2017030448A1)

Abstract

An apparatus (100) for automatically evaluating animals (A) is provided. The apparatus (100) has a structure comprising two spaced apart static barrier means (1), where a spacing between the barrier means (1) is selected to allow an animal (A) to pass between them. The apparatus (100) further comprises a three dimensional (3D) imaging device (11) for selectively collecting 3D shape information of an area of interest of the animal (A) when the animal (A) is positioned in the space between the static barrier means (1). Processing means (300) are provided for selectively evaluating the animal (A) based on the 3D shape information collected. The apparatus (100) further comprises first moveable barrier means (2a) at a first end of the static barrier means (1) for selectively preventing a second animal from entering the space between the static barrier means (1), and/or second moveable barrier means (2b) provided at a second end of the static barrier means (1) for selectively preventing the animal (A) from moving out of the space between the static barrier means (1).

Description

APPARATUS AND METHOD FOR AUTOMATICALLY EVALUATING AN ANIMAL

The present invention relates to apparatus and methods for automatically evaluating an animal, and in particular, but not exclusively, to apparatus and methods for automatically evaluating a body condition score (BCS) for cows.
Background to the Invention

Profitability from farmed animals, in particular dairy cows, is enhanced by regular evaluation of a number of parameters, for example oestrus status, body condition, lameness/locomotion, teat conformation and the like. Often such assessments result in the categorisation of the animal according to a recognised scale, for example 1-5.
Attempts have been made in the past to automate the evaluation of some of these parameters, in particular oestrus status. However, many of these parameters, for example body condition, are still evaluated manually in most cases. This can lead to the information obtained being unreliable through inconsistencies between evaluations performed by different people, or through the same person evaluating different animals inconsistently.
A number of conditions must be satisfied for the automated systems of the prior art to function correctly: the region of the animal which is of interest must be within the area captured by the image; the region must not be obscured; and the image must not contain relevant features from two or more different animals. It is also strongly preferable that the system evaluates the animal quickly enough to allow the animal to be drafted immediately after the image is collected.
It would be advantageous to develop a system which was not reliant on integration with a rotary milking shed in order to function properly, and which allows a high throughput of animals.
Throughout the description and claims the term “2D” is understood to mean “two dimensional”, and the term “3D” is understood to mean “three dimensional”.
The reference to any prior art in the specification is not, and should not be taken as, an acknowledgement or any form of suggestion that the prior art forms part of the common general knowledge in any country.
Object of the Invention

It is an object of the present invention to provide a method and/or apparatus for automatically evaluating an animal which will overcome and/or ameliorate problems with such apparatus and methods at present, or which will at least provide a useful choice.
Other objects of the present invention may become apparent from the following description, which is given by way of example only.
Brief Summary of the Invention

According to one aspect of the present invention there is provided an apparatus for automatically evaluating animals, the apparatus comprising a structure comprising two spaced apart static barrier means, a spacing between the barrier means selected to allow an animal to pass between the barrier means, the apparatus further comprising a three dimensional (3D) imaging device for selectively collecting 3D shape information of an area of interest of the animal when the animal is positioned in a space between the static barrier means and processing means for selectively evaluating the animal based on the 3D shape information, if collected, the apparatus further comprising first moveable barrier means provided at a first end of the static barrier means for selectively preventing a second animal from entering the space between the static barrier means and/or second moveable barrier means provided at a second end of the static barrier means for selectively preventing the animal from moving out of the space between the static barrier means.
Preferably, in use, the apparatus determines whether a selected animal is to be evaluated based, in part, on a space between the animal and an adjacent animal.
Preferably, in use, the apparatus determines whether the (or a) selected animal is to be evaluated based, in part, on a speed at which the animal is moving.
Preferably, in use, the apparatus determines whether the (or a) selected animal is to be evaluated based, in part, on an assessment of whether a new evaluation of the animal is required.
Preferably, in use, the apparatus determines whether a selected animal is to be evaluated based solely on one or more of: a space between the animal and an adjacent animal; a speed at which the animal is moving; and an assessment of whether a new evaluation of the animal is required.
Preferably the apparatus comprises the second moveable barrier means wherein, in use, the apparatus closes the second moveable barrier means when the apparatus determines that a new evaluation of the (or a) selected first animal is required, and a distance between the first animal and a second animal which is in front of the first animal, is greater than a predetermined distance.
Preferably the apparatus comprises the first moveable barrier means, wherein, in use, the apparatus closes the first moveable barrier means when the apparatus determines that a new evaluation of the (or a) selected first animal is required, and a distance between the first animal and a second animal which is behind the first animal, is greater than a predetermined distance.
Preferably the apparatus comprises the second moveable barrier means, wherein, in use, the apparatus closes the moveable barrier means when the apparatus determines that a new evaluation of the (or a) selected animal is required, and the selected animal is travelling at a speed which is greater than a predetermined speed.
Preferably, the assessment of whether an evaluation of the selected animal is required is dependent, in part, on a length of time since the last evaluation of the animal.
Preferably, the assessment of whether an evaluation of the selected animal is required is dependent, in part, on the result of a previous evaluation of the selected animal.
Preferably, the apparatus further comprises animal position sensing means.
Preferably, the animal position sensing means comprise a first sensor means for detecting when a fore part of the animal is in a first position which is indicative of the entire animal having moved between the static barrier means.
Preferably, the animal position sensing means comprise a second sensor means for detecting when a rear of the animal has moved beyond a second position which is adjacent the first moveable barrier means.
Preferably, the first sensor means comprises a photoelectric sensor.
Preferably, the second sensor means comprises a photoelectric sensor.
Preferably, the animal position sensing means comprises the 3D imaging device and the processing means.
Preferably, the apparatus comprises an electronic ID reader.
Preferably, the 3D imaging device comprises a 3D camera.
Preferably, the apparatus comprises a lighting means for artificially lighting an area which is within a field of view of the 3D imaging device.
Preferably, the intensity of the light inside the structure is adjustable.
Preferably, the apparatus sends a signal to an automatic drafting gate depending on the evaluation of the animal.
Preferably, the animal position sensing means comprises a drafting gate entry sensor.
Preferably, the animal position sensing means comprises a drafting gate exit sensor.
Preferably the evaluation of the animal performed by the apparatus comprises a calculation of a body condition score.
According to a second aspect of the present invention there is provided a method of automatically evaluating animals, the method comprising the steps of: i. determining whether a selected animal is to be evaluated based on one or more of: a space between the animal and an adjacent animal; a speed at which the animal is moving; and an assessment of whether a new evaluation of the animal is required; ii. if the animal is selected to be evaluated, collecting 3D shape information of an area of interest of the animal when the animal is in a space between two spaced apart static barrier means, and processing the 3D shape information to evaluate the animal based on the 3D shape information.
Preferably, the method comprises the step of closing a first moveable barrier means to prevent a second animal from entering into the space between the static barrier means if a distance between the animal and a second animal which is behind the animal is greater than a predetermined distance.
Preferably, the method comprises the step of closing a second moveable barrier means to prevent the animal from moving out of the space between the static barrier means, if a speed of the animal is greater than a predetermined maximum speed.
Preferably, the method comprises the step of closing the (or a) second moveable barrier means to prevent the animal from moving out of the space between the static barrier means if a distance between the animal and a second animal which is in front of the animal is greater than a predetermined distance.
Preferably, the method comprises receiving a signal from a first animal position sensor means when a fore part of the animal is in a first position which is indicative of the entire animal having moved into the space between the static barrier means.
Preferably, the method comprises receiving a signal from a second animal position sensor means when a rear of the animal has passed a second position which is adjacent a first end of the static barrier means.
Preferably, the method comprises capturing the 3D shape information after receiving the signal from the second animal position sensing means.
Preferably, the method comprises the step of using a 3D camera to capture the 3D shape information.
Preferably, the method comprises the step of processing the 3D shape information to determine when the animal is in a suitable position to obtain 3D information to perform the evaluation.
Preferably, the method comprises the step of processing the 3D shape information to determine whether the animal is in a suitable stance to obtain 3D information to perform the evaluation.
Preferably, the method comprises updating a herd management system depending on the evaluation of the animal.
Preferably, the method comprises sending an automatic drafting gate a signal which is representative of the evaluation of the animal.
Preferably the evaluation of the animal comprises a calculation of a body condition score.
The invention may also be said broadly to consist in the parts, elements and features referred to or indicated in the specification of the application, individually or collectively, in any or all combinations of two or more of said parts, elements or features, and where specific integers are mentioned herein which have known equivalents in the art to which the invention relates, such known equivalents are deemed to be incorporated herein as if individually set forth.
According to a still further aspect of the present invention, an apparatus and/or method for automatically evaluating animals is substantially as herein described, with reference to the accompanying drawings.
Further aspects of the invention, which should be considered in all its novel aspects, will become apparent from the following description given by way of example of possible embodiments of the invention.
Brief Description of the Drawings

Figure 1 is a diagrammatic side view of an apparatus according to one embodiment of the present invention.
Figure 2 is a diagrammatic top view of the apparatus of Figure 1, with the cover removed for clarity and the drafting gate positions shown in outline.
Best Mode For Performing the Invention

Referring first to Figures 1 and 2, an apparatus for automatically evaluating an animal according to one embodiment of the present invention is generally referenced by arrow 100. In preferred embodiments the animal is a bovine.
The apparatus 100 comprises two spaced apart static barrier means 1. The static barrier means 1 are typically substantially parallel, as shown in Figure 1.
The static barrier means 1 may comprise a prior art cattle race, and are spaced apart sufficiently widely to allow an animal A to comfortably walk between them, but not so widely as to allow the animal A to turn around.
At least one automatically moveable barrier means 2 is provided, typically configured as a pair of pneumatically operated doors. The barrier means 2 may be provided as first moveable barrier means 2a at the entrance end of the race (that is, a first end of the static barrier means 1) and/or as second moveable barrier means 2b at the opposite, exit end of the race (at the second, opposite end of the static barrier means 1). The moveable barrier means 2a can be opened to allow animals A to proceed into the space between the static barrier means 1, or closed to prevent animals behind the barrier means 2a from proceeding forward and to prevent animals A in front of the barrier means 2a from moving backward. The moveable barrier means 2b can be opened to allow the animal A to proceed out of the space between the static barrier means 1, or can be closed to bring the animal A to a halt within the space between the static barrier means 1, and to prevent an animal in front of the moveable barrier means 2b from moving backward into that space.
In some embodiments a structure 3 comprising a cover 4 may be provided. The cover 4, if provided, must be sufficiently high that the animal is comfortable walking through the structure 3, but is preferably sufficiently low that some or all of the animal inside the structure is in shadow. In some embodiments the cover 4 may extend partially or fully down the sides of the structure 3. In some embodiments the apparatus 100 may be provided with a walk-over weigh platform (not shown).
The apparatus 100 is provided with animal position sensing means for sensing the position of the animal A. In one embodiment the animal position sensing means comprise a photoelectric sensor 6 located at a first position 7 for sensing when a required portion of the animal has moved through the first moveable barrier means 2a. In one embodiment the first position 7 is spaced apart from the first moveable barrier means 2a or the first end of the static barrier means 1 by a distance which is approximately equal to the length of the animal, for example around 150 cm. The animal position sensing means may also comprise a second photoelectric sensor 9 located at a second position 10 which is substantially adjacent the first moveable barrier means 2a or, if that is not present, adjacent the first end of the static barrier means 1.
The apparatus 100 comprises a 3D imaging device 11. The imaging device may include one or more of a LiDAR, structure from motion devices, stereo or multiview camera devices, depth cameras based on time-of-flight or any similar methodology, lightfield cameras, or any other device which provides depth information for the scene being captured, and is referred to herein as a “3D camera”.
The 3D camera 11 is positioned such that one or more portions of the animal which are relevant to the evaluation of the animal can be brought within the field of view of the 3D camera 11. These portions of the animal are described herein as the “area of interest”. In some embodiments not all of the areas of interest will be within the field of view of the 3D camera simultaneously, but rather, information about each of the areas of interest may be captured at different times.
In some embodiments an artificial lighting source 12 may be provided. The lighting source 12, if provided, is preferably adjustable (preferably automatically) to provide at least a minimum light level required by the 3D camera 11.
When measuring characteristics of the animal with the 3D camera it is often preferable that the animal be stationary for a small amount of time in order to improve the accuracy of the measurements taken. Some 3D cameras, for example those based on time-of-flight technology, produce the best results when the object being measured is moving as little as possible.
Furthermore, when assessing animal characteristics such as BCS or lameness it is preferable for the animal’s pose to be such that they are standing with even weight distribution and with their joints in a consistent position, in order to get an accurate sense of the animal’s body structure and shape without the changes in body shape introduced through the animal being in motion. In addition, it may be preferable to stop the animal in order to allow some further interaction with the animal.
Many of the evaluations which can be performed by the apparatus do not need to be performed every time the animal passes through the apparatus, as the evaluation is not likely to change rapidly. For example, in the case of Body Condition Score (BCS) it is sufficient for the animal to be evaluated only once every 3-4 days. Accordingly, the apparatus may allow a certain animal to pass through the apparatus without taking any steps to evaluate it, if certain conditions are present, one of those conditions being whether a new evaluation of the animal is “required”. In this context an evaluation of the animal is said to be “required” if more than a threshold period of time has elapsed since the last evaluation. The threshold period may be changed depending on the result of the last evaluation (for example, a cow which was last assessed as lame may be monitored more frequently than other cows which were not last assessed as lame). If an evaluation is “required” then an evaluation will be performed at the next convenient occasion.
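By way of illustration only, the “required” test described above might be sketched as follows; the interval values and the record fields (last evaluation time and result) are assumptions, not values given in this specification:

```python
from datetime import datetime, timedelta

# Illustrative sketch of the "required" test. The interval values are
# assumptions (the specification suggests BCS need only be evaluated
# every 3-4 days, and that an animal last assessed as lame may be
# monitored more frequently).
DEFAULT_INTERVAL = timedelta(days=3)
LAME_INTERVAL = timedelta(days=1)

def evaluation_required(last_eval_time, last_result, now=None):
    """Return True if a new evaluation of the animal is 'required'."""
    if last_eval_time is None:
        return True  # the animal has never been evaluated
    now = now or datetime.now()
    # The threshold period depends on the result of the last evaluation.
    interval = LAME_INTERVAL if last_result == "lame" else DEFAULT_INTERVAL
    return (now - last_eval_time) > interval
```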
However, this does not mean that the apparatus will perform an evaluation of the animal the very next time it passes through the apparatus, if certain other conditions (as described further below) mean that it is not possible or not convenient to do so.
The apparatus 100 may be in communication with a database to record the evaluation, when performed, and to receive information on when the last evaluation was performed and what its result was.
Other conditions which may be used in making the decision on whether or not to evaluate the animal may include the speed at which the animal is moving, and the distance between the animal and any other animals in front or behind. Animals which are too close to other animals may not be evaluated, as the presence of two animals in the field of view of the 3D camera may result in an incorrect evaluation. In addition, the fact that animals are closely spaced together can be an indication that the animals are becoming bottlenecked in the race (i.e. the animals waiting to proceed through the system are being crowded together), perhaps because one animal has not proceeded through the system as quickly as expected, or because animals are coming out of the milking shed faster than anticipated. In this circumstance it is desirable to allow the animals to proceed through the system without delaying them, to avoid causing a backlog of animals which would eventually adversely impact operations in the shed and the farmer. Accordingly, the system may not evaluate any animals (or at least, may not close any of the moveable barriers 2a, 2b) until it detects that a space between the animal currently between the static barriers and the next animal waiting to enter the space between the barriers is at least equal to a predetermined minimum distance. Missing some evaluations during a single milking is not a problem, as properties such as BCS or other metrics change slowly, so obtaining a measurement once every few days is sufficient.
However in some cases it may be necessary to evaluate every animal if at all possible (for example if the evaluation to be performed includes oestrus detection). In that case gate 2a will close whenever required to ensure separation and valid heat detection results, as timely assessment of oestrus is critical to the farmer.
The apparatus thereby avoids injuring an adjacent animal by closing a moveable barrier 2a, 2b on it, and avoids closing a barrier 2a, 2b at a time which might startle an adjacent animal.
Similarly, animals which are moving rapidly through the apparatus may not be evaluated as they may be moving too fast for accurate information to be collected from the 3D camera, and too fast to safely bring them to a halt by closing the second moveable barrier means 2b.
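Combining the spacing and speed conditions above, the decision of whether to attempt an evaluation might be gated as in the following sketch; the threshold values are hypothetical, as the specification only calls them “predetermined”:

```python
# Hypothetical gating of the evaluation decision. The threshold values
# below are illustrative assumptions, not values from the specification.
MIN_SPACING_CM = 150   # assumed minimum gap to the adjacent animals
MAX_SPEED_CM_S = 100   # assumed maximum speed at which capture is safe

def should_evaluate(required, gap_ahead_cm, gap_behind_cm, speed_cm_s):
    """Decide whether to attempt an evaluation of the current animal."""
    if not required:
        return False  # no new evaluation is needed yet
    if gap_ahead_cm < MIN_SPACING_CM or gap_behind_cm < MIN_SPACING_CM:
        return False  # animals too close: risk of two animals in view
    if speed_cm_s > MAX_SPEED_CM_S:
        return False  # too fast to image accurately or to halt safely
    return True
```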
If the animal is moving relatively slowly, and there is a sufficient distance between the animal and any animals in front of the animal, then the apparatus may close the second movable barrier means 2b to bring the animal to a complete halt while 3D information of the area(s) of interest is captured. With a sufficient distance between animals, the start of the signal from the animal presence sensor 16 is allowed to initiate the command to close moveable barrier 2b.
However, the presence of an animal at the drafting gate entrance sensor 13 inhibits this command, preventing the moveable barrier 2b from closing where there is insufficient distance between animals. If, however, the animal is moving sufficiently slowly, or voluntarily stops in a convenient position, the apparatus may collect the 3D information without closing the second moveable barrier means 2b. This may occur in particular when the system is used to evaluate cows which are waiting to be milked in a rotary milking shed.

Operation of a preferred embodiment of the apparatus 100 is as follows. The moveable barrier means doors 2a, 2b are normally in the open position so that an animal A can move past the moveable barrier means 2a and into the field of view of the 3D camera 11.
When the animal A has moved past the barrier means 2a the first photoelectric sensor 6 detects the presence of the head or chest of the animal A. Triggering of the first photoelectric sensor 6 may cause the moveable barrier means 2a to close behind the animal A, preventing the animal from moving backwards, and preventing the head of another animal from entering the field of view of the 3D camera. Alternatively the second moveable barrier means 2b may be closed, or both barrier means 2a, 2b may be closed.
Continued forward motion by the animal A moves the rear of the animal beyond the second position 10. When the second photoelectric sensor 9 detects that this has occurred, the 3D camera 11 may be triggered to record one or more images. Alternatively the 3D camera may be triggered a predetermined time after the first photoelectric sensor 6 detects the presence of the animal, or when a video analysis of the animal shows that the animal is in a suitable position and pose for information to be captured.
In one embodiment the 3D camera 11 records a plurality of images, for example three images.
Each image may have a different exposure setting. In another embodiment the system may analyse a video signal to determine when the animal is moving the least and capture an image at that time.
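The least-motion frame selection mentioned above might, for example, estimate motion from the difference between consecutive frames. This is an illustrative sketch only; the frame representation (2D intensity arrays) is an assumption:

```python
import numpy as np

# Illustrative sketch: pick the video frame at which the animal is
# moving the least, estimated by the mean absolute difference between
# consecutive frames. Frames are assumed to be 2D float arrays.

def stillest_frame_index(frames):
    """Return the index of the frame with the smallest motion estimate."""
    diffs = [np.abs(frames[i] - frames[i - 1]).mean()
             for i in range(1, len(frames))]
    return 1 + int(np.argmin(diffs))
```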
In another embodiment the position sensor may comprise any type of applicable position sensor, for example an acoustic range sensor, a motion sensor, a camera-based system performing video analysis, a thermal camera, a depth camera, a laser-based device or the like. These may replace one or more of the photoelectric sensors 6, 9.
In another embodiment the position sensor may be capable of assessing the speed at which the animal is moving through the apparatus 100. This may comprise a plurality of photoeye sensors or a video system.
In many embodiments the apparatus 100 will be used in conjunction with an automated drafting gate system 200. If the automated drafting gate system 200 is provided with a sensor 13 (for example a photoelectric sensor) to indicate that the animal has passed through the drafting gate entrance 14, a signal from this sensor 13 may be used to indicate that the moveable barrier means 2a, if closed, can be opened. In one embodiment the apparatus may utilize moveable barrier means which are part of an existing automated drafting gate system 200 as the second moveable barrier means 2b.
In preferred embodiments the system comprises an EID reader 15. A further sensor 16 may be positioned to indicate that the animal is in position for the EID sensor to take a reading of an EID tag associated with the animal. If the EID reader 15 has not obtained a reading within a predetermined time of the sensor 16 indicating that the animal is in position, then the moveable barrier means 2a may be kept closed until the animal has moved past another sensor 17 positioned at an exit of the drafting gates (if provided).
Those skilled in the art will appreciate that when the moveable barrier means 2a open to allow access to a second animal, the animal A which has just been processed by the apparatus 100 will be motivated to move away. In some embodiments further means for motivating the first animal A to move away from the moveable barrier means 2 may be provided, for example a nozzle configured to squirt compressed air towards the animal, or means (possibly pneumatic) for making a suitable noise.
The 3D camera 11 is in communication with a processing means 300 which performs an analysis of the images taken from the 3D camera 11 to calculate an evaluation of the animal A, and to determine whether an evaluation of the animal is required, and if one is required, whether the correct conditions (e.g. speed, proximity to other animals) are present for an evaluation to occur. In one embodiment the evaluation comprises categorising of the animal, for example by calculating a body condition score (BCS) between 1 and 5 for the animal.
In preferred embodiments only the results from certain animals are processed for evaluation.
For example, some evaluations may only be performed on animals which have previously been flagged as requiring ongoing monitoring.
In preferred embodiments the processing means 300 may send a control signal to the automated drafting gates 200 depending on the result of the evaluation. For example, in one embodiment cows with a normal BCS may be drafted into one area (for example an entrance to a milking shed), while cows with a low BCS may be drafted into an area where additional feed is available. Cows which have failed to be identified by the electronic ID reader may be drafted into a third area.
In some embodiments an animal may not be drafted into a special area as soon as the result of the evaluation indicates that this may be necessary. Instead, a record may be kept that the animal must be drafted at a later time.
In some embodiments the position sensing means may comprise the 3D camera 11 and processing means 300. In these embodiments the first and second sensors 6, 9 may not be required, as the apparatus may be capable of determining when the animal is in the correct position to capture an image of the area of interest without the use of additional sensors. In these embodiments the 3D camera 11 may operate substantially continuously while the apparatus is in use.
In another embodiment (not shown) the position sensing means are operable to determine whether a second animal is within a predetermined distance, for example 100 cm, of the moveable barrier means 2a. In one embodiment the position sensing means may comprise a further photoelectric sensor located substantially 100 cm in front of the moveable barrier means 2a. This additional sensor may be used to determine whether another animal is within a predetermined distance of the first moveable barrier means 2a. This information may be used to determine if the moveable barrier means 2a, 2b are to be held open (e.g., if the animals are bottlenecked in the area leading to the apparatus), and may also be used to determine that the moveable barrier means 2a, 2b must be closed to ensure separation of the animals (e.g., if the evaluation includes oestrus detection).
Processing of the information from the 3D camera to calculate the evaluation of the animal may be performed as follows.
In one embodiment the 3D shape information from the 3D camera is used to create a 3D point cloud. Next the point cloud data is used to create a 3D surface model. In a preferred embodiment the surface model is a triangular mesh formed using the well-known method described by Gopi & Krishnan (Gopi, M., & Krishnan, S. (2002). A Fast and Efficient Projection-Based Approach for Surface Reconstruction. 15th Brazilian Symposium on Computer Graphics and Image Processing (pp. 179-186). Washington, DC: IEEE Computer Society).
Filtering and/or smoothing of the data may occur prior to the creation of the 3D point cloud and/or prior to the creation of the 3D model. This may involve removing or reducing noise or artefacts, filtering parts of the scene so that only information from a specific region of interest is analysed further, and/or other pre-processing steps.
Next, one or more points or areas of interest within the model are identified. A number of options for identifying one or more areas of interest may be used, as are described further below.
Once the 3D model has been prepared it is possible to calculate the intersection of the model surface with a plane. The intersection of the plane with the model surface forms a 2D curve which has the shape of the underlying physical object captured at this location. By positioning, orientating and sizing the intersection plane, the curve information for any part of the underlying object can be extracted for further analysis.
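The intersection of the surface model with a plane can be computed edge by edge: each mesh edge whose endpoints lie on opposite sides of the plane contributes one point of the 2D curve. A minimal sketch, assuming the mesh is given as vertex and triangle index arrays:

```python
import numpy as np

# Minimal sketch of extracting a cross-section curve by intersecting a
# triangular mesh with a plane defined by a point p0 and unit normal n.
# The mesh representation (vertex array, triangle index list) is an
# assumption; vertices lying exactly on the plane are ignored here.

def plane_mesh_intersection(vertices, triangles, p0, n):
    """Return the points where mesh edges cross the plane."""
    points = []
    d = (vertices - p0) @ n  # signed distance of each vertex to the plane
    for tri in triangles:
        for i, j in ((tri[0], tri[1]), (tri[1], tri[2]), (tri[2], tri[0])):
            if d[i] * d[j] < 0:  # the edge straddles the plane
                t = d[i] / (d[i] - d[j])
                points.append(vertices[i] + t * (vertices[j] - vertices[i]))
    return np.array(points)
```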
The information from these curves is then passed to a machine learning (ML) framework which has been trained to evaluate the animal (for example by calculating a body condition score) based on the curve data for each region extracted from the 3D model of the animal.
The information the ML model uses to calculate the evaluation may be the raw curve points provided by intersecting the model with the intersection plane. Alternatively, a curve described by a mathematical function can be fit to the raw curve points and the coefficients of the mathematical function can be provided as features to the ML system for predicting the evaluation or class.
In an alternate embodiment, measurements such as lengths, curvature information, depths, areas, or any other descriptive measure can be calculated from a 2D curve which has been fit to the points, and provided as features to the ML framework.
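Both feature styles can be sketched together: fit a polynomial to the raw curve points, then derive simple measures (chord length, and maximum depth below the chord) from the fitted curve. The function name and the particular measures chosen here are illustrative, not the claimed feature set.

```python
import numpy as np

def curve_features(xs, zs, degree=4):
    """Fit a polynomial to a 2D profile curve and derive candidate ML
    features: the raw polynomial coefficients plus simple geometric
    measures (chord length and maximum depth below the chord)."""
    coeffs = np.polyfit(xs, zs, degree)
    fitted = np.polyval(coeffs, xs)
    chord_length = float(np.hypot(xs[-1] - xs[0], fitted[-1] - fitted[0]))
    # Straight line joining the curve's endpoints.
    chord_line = np.interp(xs, [xs[0], xs[-1]], [fitted[0], fitted[-1]])
    depth = float(np.max(chord_line - fitted))   # deepest depression
    return {"coefficients": coeffs,
            "chord_length": chord_length,
            "depth": depth}

# Example: a parabolic dip, loosely standing in for the depression
# between the pin bones and the tail.
xs = np.linspace(-1.0, 1.0, 21)
feats = curve_features(xs, xs ** 2, degree=2)
```

In practice either the coefficients, the derived measures, or both would be passed to the ML framework as a feature vector.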
Alternatively, all of the aforementioned features can be supplied when training the system, and the ML system can determine which set of complementary features best evaluates the parameter under consideration. Other metadata about each animal may also be provided, such as the animal's breed breakdown and other properties relevant to the evaluation.
Detecting points of interest

A number of techniques may be used to detect features or regions from which the curve information is to be extracted.
One option for identifying an area of interest is the use of 3D feature descriptors which describe the shape of a region of the object. One example of a 3D descriptor type is described by Rusu et al. (Rusu, R. B., Marton, Z. C., Blodow, N., & Beetz, M. (2008). Persistent Point Feature Histograms for 3D Point Clouds. Proceedings of the 10th International Conference on Intelligent Autonomous Systems (IAS-10), Baden-Baden, Germany), although many others exist.
By comparing a reference shape descriptor to a descriptor derived from the 3D model an anatomical feature or region of the animal can be identified. If the feature or region is distinctive enough from the rest of the input data, the system can locate the same region in another similar view (e.g., another “frame”) of the same underlying animal.
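As an illustrative sketch (a toy stand-in, not the Rusu et al. descriptor), a descriptor built from a histogram of neighbour height differences can be matched across views of the same object in the same way:

```python
import numpy as np

def point_descriptor(cloud, index, radius=0.3, bins=8):
    """Toy shape descriptor: a normalised histogram of height differences
    between a point and its neighbours within `radius`."""
    p = cloud[index]
    dists = np.linalg.norm(cloud - p, axis=1)
    nbrs = cloud[(dists > 0) & (dists < radius)]
    hist, _ = np.histogram(nbrs[:, 2] - p[2], bins=bins,
                           range=(-radius, radius))
    total = hist.sum()
    return hist / total if total else hist.astype(float)

def locate(reference, cloud, **kw):
    """Index of the point in `cloud` whose descriptor best matches
    `reference` (smallest Euclidean descriptor distance)."""
    errs = [np.linalg.norm(point_descriptor(cloud, i, **kw) - reference)
            for i in range(len(cloud))]
    return int(np.argmin(errs))

# Example: a flat cloud with one distinctive raised point at index 0.
rng = np.random.default_rng(0)
cloud = np.column_stack([rng.uniform(0, 2, size=(200, 2)), np.zeros(200)])
cloud[0] = [1.0, 1.0, 0.2]
reference = point_descriptor(cloud, 0)
match = locate(reference, cloud)   # re-locates the distinctive point
```

Real descriptors such as point feature histograms are richer (they encode surface normal relationships rather than raw heights), but the matching principle — nearest descriptor wins — is the same.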
By extracting several descriptors around the vicinity of the ‘centre’ of the anatomical part the orientation of the part can also be established.
In some embodiments the system may exploit known or fixed constraints, as well as knowledge of the environment in which the input data was captured, to reduce the search space when looking for certain anatomical regions of interest. In the case of animals, the known anatomy of the animal may be exploited to reduce the search space further or eliminate false positive matches.
Once several anatomical points or regions in the input data have been located, this information, combined with the known anatomy of the animal, allows the orientation of the animal to be determined and consequently the orientation of a given intersection plane.
The fact that the animal's orientation is constrained by the barrier means allows false positives to be reduced, since assigning an orientation to a plane becomes simply the refinement of an initial estimate.
The orientation of a given plane may also be set relative to other parts of the model, that is, it may be set to be parallel to or perpendicular to other parts. For example, in the case of the tailhead plane, its orientation may be set relative to the pin bones right at the back of the animal.
Like the plane positioning process, correct plane orientation can also be determined by rotating the positioned plane and maximizing or minimizing an angle, curvature, length or depth appropriately for the region.
In another embodiment the intersection plane described above may be swept across the model of the animal and anatomical parts of interest identified based on their distinctive shape profile.
For instance when identifying the correct plane placement to extract a vertical cut across the tailhead region the plane may be swept across the broad area where this region is expected to be, and then the plane position that maximises the depth between the pin bones and the tail can be selected as the point that represents this region.
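The sweep reduces to a one-dimensional search: evaluate the depth measure at each candidate plane position and keep the maximiser. A minimal sketch follows, with a synthetic depth function standing in for the real curve extraction and measurement; all names are illustrative.

```python
import numpy as np

def best_plane_position(depth_of, candidates):
    """Sweep candidate plane positions and return the one whose extracted
    profile curve has the greatest depth (e.g. between the pin bones and
    the tail).  `depth_of` maps a plane position to the measured depth."""
    depths = np.array([depth_of(p) for p in candidates])
    return float(candidates[int(np.argmax(depths))])

# Synthetic stand-in for "extract the curve at position p and measure its
# depth": here the depth simply peaks at the true tailhead position 0.6.
depth_of = lambda p: np.exp(-((p - 0.6) ** 2) / 0.01)
best = best_plane_position(depth_of, np.linspace(0.0, 1.0, 101))
```

In the apparatus, `depth_of` would internally intersect the plane with the 3D model (as described earlier) and measure the depth of the resulting curve.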
In yet another embodiment a global optimization is applied which minimizes the descriptor distance between candidate locations on anatomical regions while maintaining a feasible object pose. This optimization simultaneously optimizes geometric measurements at the proposed anatomical feature locations. For instance, when applying the method to the problem of body condition scoring and determination of the precise location of the backbone, the height of the backbone ridge is maximised and/or the point which maximises or minimizes curvature (for example the radius of an osculating circle fit to the backbone ridge curve) is selected.
Curvature of other regions such as the hips may also be maximised or minimized, or other geometric measurements or measurements associated with the hip to hip curve may be used, where the area between a straight line connecting the proposed hip points and the curve of the cow’s surface may be maximised. Other similar properties of each anatomical region can be exploited.
In one embodiment all of these factors are simultaneously optimized in order to obtain the globally optimal location of the anatomical points of interest while compensating for the deficiencies and limitations of any one approach.
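The osculating-circle curvature measure mentioned above can be sketched via the circumradius of three consecutive curve samples: the radius of the unique circle through them approximates the local radius of curvature (small radius means high curvature).

```python
import numpy as np

def osculating_radius(p1, p2, p3):
    """Radius of the circle through three 2D points, an estimate of the
    local radius of curvature of the curve they are sampled from."""
    a = np.linalg.norm(p2 - p1)
    b = np.linalg.norm(p3 - p2)
    c = np.linalg.norm(p3 - p1)
    # Twice the triangle area, via the 2D cross product.
    cross = ((p2[0] - p1[0]) * (p3[1] - p1[1])
             - (p2[1] - p1[1]) * (p3[0] - p1[0]))
    if abs(cross) < 1e-12:
        return float("inf")      # collinear samples: zero curvature
    return float(a * b * c / (2.0 * abs(cross)))

# Three samples from the unit circle have an osculating radius of 1.
r = osculating_radius(np.array([1.0, 0.0]), np.array([0.0, 1.0]),
                      np.array([-1.0, 0.0]))
```

This is the classical circumradius formula R = abc / (4 × area); a global optimizer can then select the curve point that maximises or minimises this radius along the backbone ridge.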
Normalisation

Some animal properties vary between animals due to the size of the animal. However, certain calculations must be independent of the animal's absolute size. For example, absolute measures such as lengths, depths or distances between parts of the animal will vary with the animal's size as well as with the characteristic being evaluated. Thus, in many embodiments, the animal's size must be standardised prior to analysis.
To ensure that a curve analysis or measurements taken from curves are independent of the effect of animal size, the size of the particular animal needs to be calculated and the curve data adjusted based on this size. Several measures of the size of a given animal can be used for this purpose, for example the length of the animal, its width, or its height. The position of the 3D camera may dictate which of these measurements are available and are sufficiently accurate.
Calculation of the animal’s height (sometimes known as the animal’s stature) can be established through knowledge of how far from the ground a certain part of the animal is. Often when measuring the stature of an animal such as a dairy cow the height at the shoulders is used. If the ground is visible from the perspective of the 3D camera then a ground plane can be fit to 3D points on the ground, and thus the height above the ground for any point on the animal model can be easily calculated. A 3D capture of the area in which the animal stands when the 3D image is taken may also be used to pre-calculate the ground plane for later use in determining the height of any point on the animal model.
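Ground-plane fitting and the height calculation can be sketched with a least-squares plane fit: the plane normal is the right singular vector associated with the smallest singular value of the centred ground points. The function names here are illustrative.

```python
import numpy as np

def fit_ground_plane(ground_points):
    """Fit a plane to 3D ground points.  Returns (centroid, unit normal)."""
    centroid = ground_points.mean(axis=0)
    # The normal is the singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(ground_points - centroid)
    normal = vt[-1]
    if normal[2] < 0:            # orient the normal upwards
        normal = -normal
    return centroid, normal

def height_above_ground(point, centroid, normal):
    """Perpendicular distance of a point above the fitted ground plane."""
    return float((point - centroid) @ normal)

# Example: a perfectly flat patch of ground, and a point 1.4 units above it.
ground = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 0], [2, 1, 0]])
centroid, normal = fit_ground_plane(ground)
h = height_above_ground(np.array([0.5, 0.5, 1.4]), centroid, normal)
```

With the plane pre-calculated from a 3D capture of the empty race, the same `height_above_ground` computation gives the height of any point on the animal model.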
Height may be computed at a consistent location on the animal (just forward of the hips) or an average over the entire length of the backbone from the tailhead forward may be used.
While normalisation is required in many embodiments, particularly those which result in a categorisation of the animal, in other embodiments the evaluation of the animal may be based on absolute measurements, such that normalisation of some or all of the data is not required.
In another embodiment the image capture, point cloud formation and 3D model generation steps are the same as those described above. However, in this embodiment simpler features are extracted and the process of accurate anatomical point detection and plane placement is avoided.
This method involves finding the rear-most point of the animal. The point cloud surface is then divided into a grid and the height from the ground of each individual point in each grid square is computed and then normalized by the height of the particular animal.
Measures such as the average height, maximum height, minimum height, and standard deviation of the height of the points, may be computed for each grid square.
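A minimal sketch of the grid statistics follows, assuming the point cloud is expressed as (x, y, height) triples and the animal's height is already known from the normalisation step described earlier; the grid size and all names are illustrative.

```python
import numpy as np

def grid_height_features(points, animal_height, n_cells=4):
    """Divide the point cloud (x, y, height) into an n_cells x n_cells grid
    and compute per-cell height statistics, normalised by animal height."""
    xy = points[:, :2]
    lo, hi = xy.min(axis=0), xy.max(axis=0)
    # Grid cell of each point (clip so points on the far edge stay in range).
    idx = np.clip(((xy - lo) / (hi - lo) * n_cells).astype(int),
                  0, n_cells - 1)
    feats = {}
    for gx in range(n_cells):
        for gy in range(n_cells):
            h = points[(idx[:, 0] == gx) & (idx[:, 1] == gy), 2]
            h = h / animal_height     # remove the effect of animal size
            if len(h):
                feats[(gx, gy)] = (h.mean(), h.max(), h.min(), h.std())
    return feats

# Example: a uniform patch of points all at height 1.5 on a 1.5-unit animal.
gx, gy = np.meshgrid(np.linspace(0, 1, 8), np.linspace(0, 1, 8))
pts = np.column_stack([gx.ravel(), gy.ravel(), np.full(64, 1.5)])
f = grid_height_features(pts, animal_height=1.5, n_cells=2)
```

The per-cell tuples (mean, max, min, standard deviation) would then be concatenated into the feature vector supplied to the ML framework.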
The measurements for all grid squares are then entered into the ML framework to calculate the evaluation of the animal, for example by determining the category of the animal. The size of each grid square needs to be large enough to ensure that precise localisation of the individual squares on the surface of the animal does not significantly affect the measurements of the grid.
Conversely, the squares must not be so large that the discriminative power of the measurements that describe the region and its depressions (or lack thereof) is lost.
If required, further invariance to the precise localisation of the grid can be achieved by interpolating point values between adjacent squares, or by weighting the values near the boundaries of the squares, which are increasingly subject to the effects of grid localisation. Depending on the exact camera placement, points near the edges of the model may be omitted, as these points may be more susceptible to noise or missing data due to the extreme angle between the animal's surface normal and the camera.
Normalizing the region under analysis (so as to remove the effect of the size of the animal) may be achieved by ascertaining the animal’s height, as is described above. Any curve data calculated may be normalized by a factor derived from the animal’s actual height relative to a standardised height.
While many of the embodiments of the invention have been described above with reference to the calculation of the body condition score of a dairy cow, the methods described may also be used for evaluation of other characteristics of a dairy cow and/or for evaluation of various characteristics of other animals. In some embodiments the processor may use the output from the 3D camera and/or an additional 2D camera to detect the presence of an oestrus indicator on the animal, and may include an analysis of the indicator in the calculation of the evaluation, or as part of a separate evaluation calculation. The oestrus indicator may be a pressure sensitive heat detection patch, or any suitable alternative oestrus indicator. For example, in one embodiment the oestrus indicator may comprise a tail paint marking. In another embodiment the oestrus indicator may comprise a patch which has different infra-red or human visible characteristics when activated.
Those skilled in the art will appreciate that the present invention provides an apparatus and method for automatically evaluating an animal which can be operated independently of a rotary milking shed and with minimal disruption to the movement of the animals through the race.
Unless the context clearly requires otherwise, throughout the description and the claims, the words “comprise”, “comprising”, and the like, are to be construed in an inclusive sense as opposed to an exclusive or exhaustive sense, that is to say, in the sense of “including, but not limited to”.
Where in the foregoing description, reference has been made to specific components or integers of the invention having known equivalents, then such equivalents are herein incorporated as if individually set forth.
Although this invention has been described by way of example and with reference to possible embodiments thereof, it is to be understood that modifications or improvements may be made thereto without departing from the spirit or scope of the accompanying claims.

Claims (37)

WHAT WE CLAIM IS:
1. An apparatus for automatically evaluating animals, the apparatus comprising a structure comprising two spaced apart static barrier means, a spacing between the barrier means selected to allow an animal to pass between the barrier means, the apparatus further comprising a three dimensional (3D) imaging device for selectively collecting 3D shape information of an area of interest of the animal when the animal is positioned in a space between the static barrier means and processing means for selectively evaluating the animal based on the 3D shape information, if collected, the apparatus further comprising first moveable barrier means provided at a first end of the static barrier means for selectively preventing a second animal from entering the space between the static barrier means and/or second moveable barrier means provided at a second end of the static barrier means for selectively preventing the animal from moving out of the space between the static barrier means.
2. The apparatus of claim 1 wherein, in use, the apparatus determines whether a selected animal is to be evaluated based, in part, on a space between the animal and an adjacent animal.
3. The apparatus of claim 1 or 2 wherein, in use, the apparatus determines whether the (or a) selected animal is to be evaluated based, in part, on a speed at which the animal is moving.
4. The apparatus of claim 1, 2 or 3 wherein, in use, the apparatus determines whether the (or a) selected animal is to be evaluated based, in part, on an assessment of whether a new evaluation of the animal is required.
5. The apparatus of claim 1 wherein, in use, the apparatus determines whether a selected animal is to be evaluated based solely on one or more of: a. a space between the animal and an adjacent animal; b. a speed at which the animal is moving; and c. an assessment of whether a new evaluation of the animal is required.
6. The apparatus of any one of the preceding claims comprising the second moveable barrier means wherein, in use, the apparatus closes the second moveable barrier means when the apparatus determines that a new evaluation of the (or a) selected first animal is required, and a distance between the first animal and a second animal which is in front of the first animal, is greater than a predetermined distance.
7. The apparatus of any one of claims 1 to 5 comprising the first moveable barrier means, wherein, in use, the apparatus closes the first moveable barrier means when the apparatus determines that a new evaluation of the (or a) selected first animal is required, and a distance between the first animal and a second animal which is behind the first animal, is greater than a predetermined distance.
8. The apparatus of any one of claims 1 to 5 comprising the second moveable barrier means, wherein, in use, the apparatus closes the second moveable barrier means when the apparatus determines that a new evaluation of the (or a) selected animal is required, and the selected animal is travelling at a speed which is greater than a predetermined speed.
9. The apparatus of any one of claims 4 to 8 wherein the assessment of whether a new evaluation of the selected animal is required is dependent, in part, on a length of time since the last evaluation of the animal.
10. The apparatus of any one of claims 4 to 8 wherein the assessment of whether a new evaluation of the selected animal is required is dependent, in part, on the result of a previous evaluation of the selected animal.
11. The apparatus of any one of the preceding claims further comprising animal position sensing means.
12. The apparatus of claim 11 wherein the animal position sensing means comprise a first sensor means for detecting when a fore part of the animal is in a first position which is indicative of the entire animal having moved between the static barrier means.
13. The apparatus of claim 11 or 12 wherein the animal position sensing means comprise a second sensor means for detecting when a rear of the animal has moved beyond a second position which is adjacent the first moveable barrier means.
14. The apparatus of claim 11, 12 or 13 wherein the first sensor means comprises a photoelectric sensor.
15. The apparatus of claim 11, 12, 13 or 14 wherein the second sensor means comprises a photoelectric sensor.
16. The apparatus of any one of claims 11 to 15 wherein the animal position sensing means comprises the 3D imaging device and the processing means.
17. The apparatus of any one of the preceding claims comprising an electronic ID reader.
18. The apparatus of any one of the preceding claims wherein the 3D imaging device comprises a 3D camera.
19. The apparatus of any one of the preceding claims comprising a lighting means for artificially lighting an area which is within a field of view of the 3D imaging device.
20. The apparatus of claim 19 wherein the intensity of the light inside the structure is adjustable.
21. The apparatus of any one of the preceding claims wherein the apparatus sends a signal to an automatic drafting gate depending on the evaluation of the animal.
22. The apparatus of claim 21 wherein the animal position sensing means comprises a drafting gate entry sensor.
23. The apparatus of claim 21 or 22 wherein the animal position sensing means comprises a drafting gate exit sensor.
24. A method of automatically evaluating animals, the method comprising the steps of: i. determining whether a selected animal is to be evaluated based on one or more of: a. a space between the animal and an adjacent animal; b. a speed at which the animal is moving; and c. an assessment of whether a new evaluation of the animal is required; ii. if the animal is selected to be evaluated, collecting 3D shape information of an area of interest of the animal when the animal is in a space between two spaced apart static barrier means, and processing the 3D shape information to evaluate the animal based on the 3D shape information.
25. The method of claim 24 comprising the step of closing a first moveable barrier means to prevent a second animal from entering into the space between the static barrier means if a distance between the animal and a second animal which is behind the animal is greater than a predetermined distance.
26. The method of claim 24 or 25 comprising the step of closing a second moveable barrier means to prevent the animal from moving out of the space between the static barrier means, if a speed of the animal is greater than a predetermined maximum speed.
27. The method of claim 24, 25 or 26 comprising the step of closing the (or a) second moveable barrier means to prevent the animal from moving out of the space between the static barrier means if a distance between the animal and a second animal which is in front of the animal is greater than a predetermined distance.
28. The method of any one of claims 24-27 wherein the method comprises receiving a signal from a first animal position sensor means when a fore part of the animal is in a first position which is indicative of the entire animal having moved into the space between the static barrier means.
29. The method of any one of claims 24-28 comprising receiving a signal from a second animal position sensor means when a rear of the animal has passed a second position which is adjacent a first end of the static barrier means.
30. The method of claim 29 comprising capturing the 3D shape information after receiving the signal from the second animal position sensor means.
31. The method of any one of claims 24 - 30 comprising the step of using a 3D camera to capture the 3D shape information.
32. The method of any one of claims 24-31 comprising the step of processing the 3D shape information to determine when the animal is in a suitable position to obtain 3D information to perform the evaluation.
33. The method of any one of claims 24-32 comprising the step of processing the 3D shape information to determine whether the animal is in a suitable stance to obtain 3D information to perform the evaluation.
34. The method of any one of claims 24-33 comprising updating a herd management system depending on the evaluation of the animal.
35. The method of any one of claims 24-34 comprising sending an automatic drafting gate a signal which is representative of the evaluation of the animal.
36. The apparatus of any one of claims 1 to 23 wherein the evaluation of the animal performed by the apparatus comprises a calculation of a body condition score.
37. The method of any one of claims 24-34 wherein the evaluation of the animal comprises a calculation of a body condition score.
NZ711098A 2015-08-17 2015-08-17 Apparatus and method for automatically evaluating an animal NZ711098A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/NZ2016/050129 WO2017030448A1 (en) 2015-08-17 2016-08-17 Method and apparatus for evaluating an animal

Publications (1)

Publication Number Publication Date
NZ711098A true NZ711098A (en)
