WO2001052633A1 - A method and an apparatus for locating the teats of an animal - Google Patents


Info

Publication number
WO2001052633A1
Authority
WO
WIPO (PCT)
Prior art keywords
teat
image
light
images
support
Application number
PCT/SE2001/000083
Other languages
French (fr)
Inventor
Martin SJÖLUND
Sven Åke SVENSSON
Original Assignee
Delaval Holding Ab
Application filed by Delaval Holding Ab
Priority to AU2001228983A1
Publication of WO2001052633A1

Classifications

    • A: HUMAN NECESSITIES
    • A01: AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01J: MANUFACTURE OF DAIRY PRODUCTS
    • A01J5/00: Milking machines or devices
    • A01J5/017: Automatic attaching or detaching of clusters
    • A01J5/0175: Attaching of clusters
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/50: Depth or shape recovery
    • G06T7/55: Depth or shape recovery from multiple images
    • G06T7/586: Depth or shape recovery from multiple images from multiple light sources, e.g. photometric stereo
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/30: Subject of image; Context of image processing
    • G06T2207/30004: Biomedical image processing
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/30: Subject of image; Context of image processing
    • G06T2207/30108: Industrial image inspection
    • G06T2207/30128: Food products

Definitions

  • Figure 3 is a flow chart of the method of guiding a milking apparatus support towards a teat of a milk animal according to the present invention.
  • the method begins at block 31.
  • the method continues with moving the support (11, Fig. 1) to a fixed start position, with or without reference to the animal. This fixed start position can be any suitable position.
  • the method continues at block 33 with illuminating, by turning on one of the diodes, a region expected to contain an udder.
  • the next step, at block 34, is capturing a set of images from the camera (14, Fig. 1; 22, Fig. 2), at least one image being obtained while the camera is directed towards the illuminated region.
  • the captured images from the blocks 34 and 36 are compared to each other in block 37 for eliminating disturbing reflections existing in each of the captured images as described in more detail below with reference to Figures 4 and 5.
  • the resulting difference image obtained at block 37 (shown in figure 6) is then analysed in block 38 in order to identify possible teat candidates.
  • the next step, at block 39, consists of determining if any teat candidates have been found. If the answer is negative, i.e. no teat candidates have been found, the method continues at block 40 with moving the support and repeating the steps starting with block 33.
  • If the answer is affirmative, the method continues at block 41, which consists of selecting one of the teat candidates as a target teat.
  • the method continues at block 42 with determining the position of the target teat; this step is explained with reference to figures 7 and 8.
  • the next step, at block 43, is homing said support and any supported apparatus in on the target teat.
  • the next step, at block 44 consists of determining if all four teats of a milk animal have been selected as a target teat and the support and any supported milking apparatus have been homed in to each one of the four teats. If the answer is negative, the steps starting with block 33 are repeated. If, on the other hand, the answer is affirmative the method continues at block 45, where the method is completed.
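The control activity of blocks 31 to 45 can be sketched as a simple loop. The sketch below is only an illustration of the flow chart's structure: the callables (`move`, `illuminate_and_capture`, and so on) are hypothetical stand-ins to be supplied by the caller, not functions named in the patent.

```python
def guide_support(move, illuminate_and_capture, difference, find_candidates,
                  select_target, locate, home, max_moves=50):
    """One possible organisation of blocks 31-45 of Figure 3.

    All arguments are caller-supplied callables (hypothetical stand-ins):
    the loop alternates the diodes, builds a difference image, looks for
    teat candidates, and homes in on a target teat until all four teats
    have been served or the move budget is exhausted.
    """
    served = 0
    moves = 0
    while served < 4 and moves < max_moves:      # block 44 exit condition
        left = illuminate_and_capture("left")    # blocks 33-34
        right = illuminate_and_capture("right")  # second diode, second set
        diff = difference(left, right)           # block 37: difference image
        candidates = find_candidates(diff)       # block 38
        if not candidates:                       # block 39: none found?
            move()                               # block 40: move and retry
            moves += 1
            continue
        target = select_target(candidates)       # block 41
        position = locate(target)                # block 42
        home(position)                           # block 43
        served += 1
    return served                                # 4 when the method completes
```

With trivial stand-ins that always report one candidate, the loop runs exactly four times and returns 4.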
  • Figures 4 and 5 show captured images as forwarded by the camera 14 (fig. 1) or 22 (fig. 2) when the diodes 12, 13 or 21 (fig. 2) are alternately turned on and off.
  • in figure 4 the diode 12 is off and the diode 13 on the left side of the camera is on, so the light is directed onto the udder of an animal and the teats are illuminated mostly on their left sides.
  • Figure 5 shows the opposite situation, in which the diode 13 on the left side of the camera is off and the diode 12 on the right side is on, whose light is directed onto the udder so that the teats are illuminated on their right sides.
  • Some bright image pixels 50, 51, 52, 53 and 54 are observed by the camera in both pictures where the light hits different parts of the animal: in fig. 4 the left-illuminated left parts 50, 51 of the teats, and in fig. 5 the right-illuminated right parts 52, 53 of the same teats. Other details, such as posts of a stall (not shown) hit by the sheet of light, may generate similar image pixels.
  • the difference image 61 is examined, primarily for identification of the different objects represented in the image, by comparing the differences in the sets of pictures obtained by the diodes and scanning the image pixels in columns between the top and bottom (or vice versa) of the image, across the image.
  • the diodes and camera are arranged stationary on said robot arm.
  • the brightest group of pixels of each column in each picture is noted.
  • a group 51/53 of such pixels, side by side in adjacent columns and maintained over successive alternate illuminations, together forms the shape of a teat.
  • the group 51/53 would be identified as a possible teat candidate, for instance by comparison with a stored list of different objects, while the groups 55, 56 would be rejected as unchanging objects, since the illumination of the diodes is too weak to affect them; they arise for instance from a leg far away, illuminated by external sources of light (not shown).
  • the group 54, being for instance a leg positioned very near, is excluded as having too large a dimension.
  • the group 51, 53 would then be selected as a target teat, as being the nearest teat, while the group 50/52, representing another teat, is selected to be treated later. Other pixels representing reflections on hoses, parts of the stall, etc. could be rejected in a similar manner or by other software techniques known in the art.
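The column-by-column scan and the size-based rejection described above might look roughly as follows. This is a minimal sketch under simplifying assumptions: the difference image is a plain grid of signed luminance values, and the function name and thresholds are illustrative, not from the patent.

```python
def teat_candidates(diff, min_mag=30, max_width=6):
    """Scan a signed difference image column by column.

    For each column the pixel of greatest magnitude is noted; runs of
    adjacent columns whose strongest response exceeds min_mag are merged
    into candidate groups.  Runs wider than max_width columns (e.g. a
    nearby leg, like group 54) are rejected, as are columns too weak to
    have been lit by the diodes (e.g. a distant leg, like groups 55, 56).
    Returns (first_column, last_column) pairs of surviving candidates.
    """
    cols = list(zip(*diff))                     # column-major view
    strong = [max(abs(v) for v in col) >= min_mag for col in cols]
    groups, start = [], None
    for i, s in enumerate(strong + [False]):    # sentinel closes a final run
        if s and start is None:
            start = i
        elif not s and start is not None:
            if i - start <= max_width:          # plausible teat width
                groups.append((start, i - 1))
            start = None
    return groups

# Illustrative frame: columns 1-2 hold a narrow bright group (a teat
# candidate); columns 4-8 hold a group too wide to be a teat.
diff = [
    [0, 40, 35, 0, 50, 50, 50, 50, 50, 0],
    [0, 38, 36, 0, 50, 50, 50, 50, 50, 0],
]
candidates = teat_candidates(diff, min_mag=30, max_width=3)  # [(1, 2)]
```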
  • once a suitable group 51/53 of bright pixels, being the sum of the variations of both pictures, has been identified as representing a teat of interest, a target teat, a sub image 71 as shown in fig. 7, enclosing the identified pixel group 51/53, is defined, and only this difference image is then scanned and processed to provide guidance information to the robot to move the supported teat-cup or other apparatus towards the deduced position of the pixel group.
  • the pixels of the difference image are those which are variations from figure 4 where one diode is on and those variations from figure 5 where the other diode is on. Here they are shown together contributing to a full picture of a target teat.
  • the pixels from figure 4 are here the right part 53 and the pixels from figure 5 are here the left part 51.
  • two sources of light generate two sheets of light that overlap each other, whereby a reliable generation of captured images is obtained.
  • the method and apparatus according to the invention will improve the process of teat locating by using a reliable robot guiding system that is safe for an animal.
  • the method for determining the location of a teat will now be further explained by reference to figure 8, where the target to be homed in on is teat 51/53.
  • the homing is made by determining the distance x from the camera to the teat.
  • the homing according to the invention is made by moving the support, measuring the distance travelled, and simultaneously measuring the angle occupied by the target teat on at least two occasions.
  • the support starts at a first camera position and closes in on the target teat to the second position.
  • the angle α occupied by the teat is measured as, for instance, the width y at a predetermined height h from the tip of the teat, as shown in fig 6.
  • the angle α is measured at the first position, and at the second position the second angle α' can be measured. Meanwhile, as the camera approaches the target teat, the distance Δx travelled by the support is measured by a distance travelling means (not shown). By calculating the differences in angle size and the distance travelled by the support, the distance to the teat can be calculated. The calculation is made repeatedly as the support closes in on the target teat.
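One way to evaluate this relation numerically: under a pinhole-camera, small-angle approximation, the apparent width of the teat in pixels is inversely proportional to its distance, so two width measurements plus the distance travelled between them fix the absolute distance. The function name and the numbers are illustrative assumptions, not the patent's own equation.

```python
def distance_to_teat(w1, w2, dx):
    """Estimate the distance (at the first camera position) to a teat.

    w1, w2 are its apparent widths in pixels measured before and after the
    support moves dx closer; under the pinhole model, apparent width is
    inversely proportional to distance, so
        w1 * x = w2 * (x - dx)   =>   x = dx * w2 / (w2 - w1)
    """
    if w2 <= w1:
        raise ValueError("apparent width must grow as the support closes in")
    return dx * w2 / (w2 - w1)

# Worked example (hypothetical numbers): a teat 25 px wide at the first
# position grows to 50 px after the support advances 200 mm, so the teat
# was 400 mm away at the first position (and is now 200 mm away).
x = distance_to_teat(25.0, 50.0, 200.0)   # 400.0
```

Repeating the calculation as the support closes in, as the passage describes, simply means feeding in each fresh width measurement together with the newly travelled distance.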
  • the object is to approach the teat so that a centerpoint of the teat is in the centre of the picture.
  • the teat now occupies a predetermined part of the picture.
  • the homing in on the teat is made without calculating the distance according to the present invention.
  • the centre of the camera is directed towards the identified target teat.
  • the support moves towards the teat and in the difference image the number of pixels is counted.
  • when the number exceeds a predetermined number, the speed is lowered, and when the number of pixels reaches a further predetermined number, the camera is halted. If, after that, the target moves, the camera also moves and holds the distance, following the increase and decrease in pixels representing the movements of the target teat.
  • the centre point of the teat is maintained in the centre of the picture.
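The two-threshold speed rule just described can be sketched as a tiny lookup. The threshold values and the function name are illustrative assumptions; the patent only states that two predetermined pixel counts exist.

```python
def approach_speed(pixel_count, slow_at=800, stop_at=2000,
                   fast=1.0, slow=0.2):
    """Map the number of bright pixels the target teat occupies in the
    difference image to a support speed: full speed while far away,
    reduced speed once the count exceeds slow_at, and a halt once it
    reaches stop_at.  All numeric values are illustrative."""
    if pixel_count >= stop_at:
        return 0.0          # teat fills enough of the frame: halt
    if pixel_count > slow_at:
        return slow         # getting close: creep in
    return fast             # still far: approach at full speed
```

Because the rule is re-evaluated on every frame, a target that later moves away (pixel count falling below `stop_at`) automatically sets the support in motion again, which matches the distance-holding behaviour described above.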
  • teat cups can be applied using a three-dimensional image as a reference for the control of the support.
  • the measuring is made by measuring the number of pixels in the images captured by the capturing device.
  • a middle image is added, where none of the diodes is activated. In other words, first the left diode is activated, then none and then the right. Hereby an image with no illumination related to the support is achieved.
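With such a middle (unlit) frame available, the support-related illumination can be isolated by subtracting the ambient frame from each lit frame. A minimal sketch, assuming frames are plain luminance grids of equal size; the function name and values are illustrative.

```python
def ambient_free_pair(left_frame, ambient_frame, right_frame):
    """Subtract the ambient (no-diode) frame from each diode-lit frame,
    pixel by pixel, leaving only illumination that originates from the
    support's own diodes in each returned frame."""
    def sub(f, g):
        return [[a - b for a, b in zip(ra, rb)] for ra, rb in zip(f, g)]
    return sub(left_frame, ambient_frame), sub(right_frame, ambient_frame)

# Tiny 1x2 example: ambient level 10 everywhere; each lit frame adds the
# diode's contribution on top of it.
left_clean, right_clean = ambient_free_pair([[12, 50]], [[10, 10]], [[40, 12]])
# left_clean == [[2, 40]], right_clean == [[30, 2]]
```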
  • the illuminating means is a cluster of diodes, connected to form an illuminating means having the same illuminating effect as a regular incandescent lamp.

Abstract

The present invention relates to an apparatus support guide arrangement, where a support (16) carries sources of light (12, 13), adapted to illuminate at least two areas on at least one teat, and is associated with an image capturing device (14) arranged to view the teat and to capture images thereof and to provide image signals to an image interpreting means. According to the invention a control device is arranged to turn on and off said sources of light (12, 13; 21) alternatingly, the sources of light (12, 13) being placed so that the teats are illuminated from different angles in relation to the image capturing device, and the image interpreting means being arranged to use the differences created by the illumination of said sources of light (12, 13) to produce a difference image. The present invention further relates to a method for the operation of the arrangement.

Description

A method and an apparatus for locating the teats of an animal
TECHNICAL FIELD OF THE INVENTION
The present invention relates to a method of guiding an arrangement according to the preamble of claim 1. It also relates to an apparatus support guide arrangement according to the preamble of claim 18.
BACKGROUND OF THE INVENTION
Such a method and such an apparatus are known from WO 97/15900.
Numerous proposals have been made for techniques by which e.g. a milking apparatus, including a teat cup connected to a milking machine, a teat cleaning device and a teat after treatment device, can be applied to a teat of a milk animal, such as a cow, by automatic means so as to avoid the need for attendance by an operator.
As automatic techniques for the rest of the milking procedure have been available for some time, automation of the teat cup application stage has become the main obstacle to development of a fully-automatic milking procedure which does not require continuous attendance and enables the so called "milking on demand" regime.
For many reasons, e.g. animal safety and comfort, milk hygiene and economic efficiency, the application stage has to be extremely reliable. Therefore a teat cup must be quickly and correctly applied to a teat on every occasion an animal is present for milking. Also, the equipment to carry out the application stage has to work during difficult conditions and must be durable while not being expensive.
A favourable solution to the problems, mentioned above, of obtaining a fully-automatic milking procedure is described in the above-mentioned WO 97/15900. However, it has been found that when using image analysis of the captured image of the area defined by the possible teat candidates of a milking animal, reflections from sources of light other than the one specially used for illuminating said area, such as lamps and windows, and from illuminated equipment in a stall, such as frames and gates, might cause problems.
OBJECT OF THE INVENTION
It is an object of the invention to provide an improved teat locating technique for guiding an apparatus towards at least one teat of a milk animal.
SUMMARY OF THE INVENTION
The object of the invention has been achieved by a method of the initially defined kind, which is characterized by
- activating a first source of light and capturing images with the image capturing device,
- deactivating the first source of light,
- activating a second source of light placed at a different location than the first source of light and capturing images with the image capturing device after or at the same time as deactivating the first source of light,
- deactivating the second source of light,
- moving the support from a first position to a second position, after obtaining a first set of images,
- repeating the steps of activating and deactivating the sources of light and capturing a second set of images at the second position,
- establishing the differences between the captured images in previous steps and using said differences for calculating the position of said teat.
Hereby disturbing reflections are eliminated from the captured images and an effective homing process is achieved.
Preferably, after capturing an image when the first source of light is activated and a second image when the second source of light is activated, these two images are combined to provide a two-dimensional difference image, and thereafter the teats are identified. When this step has been repeated at least once, said establishing of differences is made by determining the relative differences in teat positions in at least two provided two-dimensional images. Hereby the positioning of the teats can be made continuously as the support approaches the teats.
Suitably said differences are used for calculating a three dimensional image and thereby determining the positions of said teats. Hereby a distance to any specific object in the image can be determined.
A preferred method includes the steps of
- creating a two-dimensional sub image of said at least one teat to be analysed,
- measuring a first section of at least one of the teats in the first set of images, thus obtaining a first section value at the first position,
- moving said support to the second position,
- measuring a second section of at least one teat in the second set of images, thus obtaining a second section value,
- measuring, simultaneously with or shortly after the measuring of the sections, the distance travelled by the support from the first to the second position,
- determining the absolute distance from said support to the teat on the basis of the measured sections and the distance travelled.
Hereby a quick method to analyse the entire frame, using a minimum of two pictures, is achieved.
The object of the invention has further been achieved by an arrangement of the initially defined kind, which is characterised by comprising a control device arranged to turn on and off said sources of light alternatingly, the sources of light being placed so that the teats are illuminated from different angles in relation to the image capturing device, and the image interpreting means being arranged to use the differences created by the illumination of said sources of light to produce a two-dimensional difference image. It is hereby possible to use a simpler design than before. For instance, simple image interpreting means and diodes can be used, resulting in a more cost-efficient design.
Suitably said image interpreting means further includes identifying means arranged to identify possible teat candidates, and to select a target teat, the image interpreting means being arranged to determine the position of said target teat, and to send signals for homing said support and any supported apparatus to said target teat.
Preferably there is a first and a second source of light, placed at the left and right sides, respectively, of the image capturing device, for generating a first image and a second image. Hereby an optimum illumination difference angle is achieved.
Suitably said image interpreting means is arranged to superimpose said types of captured images for blanking out details occurring on a plurality of said images. Hereby the shapes of the teat are isolated, and thereafter identified.
Suitably said image interpreting means is arranged to create a two-dimensional sub image and comprises electronic means arranged to process electric signals in said sub image, said electric signals representing captured parts of images illuminated by the first source of light subtracted from the signal representing other captured parts of images illuminated by the second source of light. Hereby a compact image to analyse is achieved, allowing a quick analysis. Objects lit up in both images will thus be cancelled out, resulting in an image that is insensitive to background illumination.
Preferably the image interpreting means further comprises positioning means, arranged to measure the distance travelled by the support, and is arranged to use a predetermined equation, which comprises factors representing the measured distance and a measured section of the sub image, measured on at least two occasions, thereby determining the distance to at least one teat. Hereby a quick analysing method for positioning the teats is made possible. By switching the sources of light on and off between different captured images, it is possible to filter out disturbing reflections and to assimilate the reflections that differ between the pictures.
An image capturing device, favourably a video camera, is directed towards the teats. Conveniently, the camera looks at the teats which are illuminated by the sources of light. The sources of light and the camera are fixedly related to each other and move as a unit, being mounted on a milking apparatus support.
The camera receives an image similar to that seen with the eye. For ease of understanding, a black and white camera is considered. When the image is processed to identify the teats, for example, a first image is taken when a left source of light is activated and a second when a right source of light is activated. A two-dimensional difference image is then produced by comparing, pixel by pixel, the difference in luminance between the two images. (Note: the term image is used although in operation no image is displayed.) One teat from the image is chosen, conveniently the front right teat when a milking robot approaches the udder from the right front of the animal, and the sources of light and camera are moved, under control of information derived from the image processing, to approach the chosen teat.
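The pixel-by-pixel comparison can be sketched in a few lines. This is only an illustration with made-up luminance values on tiny 2×3 frames; the function name is an assumption, not from the patent.

```python
def difference_image(left_lit, right_lit):
    """Subtract, pixel by pixel, the luminance of the frame captured under
    the right light from the frame captured under the left light.
    Reflections present equally in both frames (ambient light, stall
    fittings) cancel to zero; only the alternately lit teat surfaces
    leave a signed residue."""
    return [[a - b for a, b in zip(row_l, row_r)]
            for row_l, row_r in zip(left_lit, right_lit)]

left = [[10, 90, 20],
        [10, 85, 20]]   # teat edge bright under the left diode
right = [[10, 20, 88],
         [10, 20, 86]]  # same scene, bright under the right diode

diff = difference_image(left, right)
# The shared background (10, 20) cancels to 0; the positive values mark
# the left-lit edge and the negative values the right-lit edge:
# diff == [[0, 70, -68], [0, 65, -66]]
```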
This technique reliably distinguishes one teat from another by relative positions in the image on known positions. Similarly it can distinguish teats from background objects such as a leg or a tail or part of a stall or even debris, such as straw or other vegetation that may be in the region of the teats, by size or shape or unrealistic positions.
By alternately turning the light sources on and off according to the present invention, the technique is quite immune to stray light and to different levels of ambient illumination generating disturbing reflections in the captured images, because all the disturbing reflections are filtered out and only the reflections from the alternatingly illuminated teat are left and supplied to the identifying means.
By repeated use of the technique on each remaining teat, a respective cup can be applied to each teat. Otherwise the position of the remaining teat or teats can be estimated from the established position of the chosen teat and stored information for the specific animal, including the milking history, for teat-cup application without use of the technique once the first teat-cup has been successfully applied.
The use of the term milk animal in this application is intended to indicate any kind of milk animals, such as cows, sheep, goats, buffaloes, and horses.
DRAWING SUMMARY
Embodiments of the invention will now be described in more detail with reference to the accompanying drawings, in which
Figure 1 illustrates a schematic diagram of an apparatus to form a teat- distinguishing image according to the invention,
Figure 2 illustrates a block diagram of the image processing hardware,
Figure 3 illustrates a flow chart of the control activity,
Figures 4 and 5 illustrate representations of images captured by a camera for respectively turned on and turned off diodes,
Figure 6 illustrates a processed difference image,
Figure 7 illustrates a processed sub-image, and
Figure 8 illustrates a method for calculating the distance to the teat.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS

A particular embodiment of the present invention will now be described in more detail.
In installations for automatic milking there is provided a milking apparatus support, by means of which specific equipment, such as teat-cups, for performing animal related operations, can be carried to a proper operating position. In fig. 1 such a milking apparatus support 11 according to the invention is shown, incorporated in an installation for automatic milking. Sources of light 12, 13 are positioned at both sides of an image capturing device 14 to direct light towards a teat (not shown), above and around the mouth 15 of a teat-cup 16 when in a carrier 17 of a robot arm 18. The illuminating means and image capturing means are not a severe additional load for a teat-cup carrier robot arm 18, even when mounted at the outer end of the robot arm.
In the preferred embodiment the sources of light 12, 13 are a pair of diodes emitting visible light in the red frequency range. The image capturing device 14 is for instance a compact solid state camera sensitive to the red frequency range. The camera and diodes are mounted on the carrier 17. The camera used is a ½ inch (12 mm) charge coupled device camera fitted with a lens to give a 90° angle of viewing in the horizontal plane. The camera is positioned so that it is directed towards the mouth 15 of the teat-cup 16 to capture the light emitted by the diodes. This positioning of the camera view angle assists in distinguishing between objects at different distances from the camera. The video performance of the camera preferably provides a resolution of approximately 1 millimetre at a distance of 200 millimetres with a 450 millimetre field of view. The above mentioned specifications are used in the preferred embodiment but could, as readily understood by a person skilled in the art, be modified in different ways. In Figure 2 two diodes are represented as 21a and 21b, and a switch for alternately turning the diodes on and off is represented as 21c. An image capturing device, such as a camera, to be directed towards areas illuminated by the diodes, is represented as 22. A power supply unit 23 energizes electronic circuit units 24 and 25 and the camera 22. The image interpreting means 25 processes the image information from the camera 22 and supplies image position information to the unit 24, which provides control information at output 27 for the robot (not shown).
The power supply unit 23 also energizes the diodes 21. This energizing is, however, time controlled by a timer 26 for obtaining captured images from the camera 22 when parts of an animal (not shown) are illuminated at different angles by the diodes 21, switching the energizing alternately from one diode to the other.
Figure 3 is a flow chart of the method of guiding a milking apparatus support towards a teat of a milk animal according to the present invention. The method begins at block 31. At block 32 the method continues with moving the support (11, Fig. 1) to a fixed start position, with or without reference to the animal. This fixed start position can be any suitable position. The method continues at block 33 with illuminating, by turning on one of the diodes, a region expected to contain an udder. The next step, at block 34, is capturing from the camera (14, Fig. 1; 22, Fig. 2) a set of images, being at least one, obtained when the camera is directed towards the illuminated region.
The activity then continues at block 35: the first diode is turned off and the other diode turned on. At block 36 another set of images, being at least one, is captured by the camera of the region now illuminated by the second diode.
The captured images from the blocks 34 and 36 are compared to each other in block 37 for eliminating disturbing reflections existing in each of the captured images as described in more detail below with reference to Figures 4 and 5. The resulting difference image obtained at block 37 (shown in figure 6) is then analysed in block 38 in order to identify possible teat candidates. The next step, at block 39, consists of determining if any teat candidates have been found. If the answer is negative, i.e. no teat candidates have been found, the method continues at block 40 with moving the support and repeating the steps starting with block 33.
If on the other hand, the answer is affirmative the method continues at block 41. This step consists of selecting one of the teat candidates as a target teat. The method continues at block 42 with determining the position of the target teat, this step is explained with reference to figure 7 and 8. The next step, at block 43, is homing in said support and any supported apparatus to the target teat. The next step, at block 44, consists of determining if all four teats of a milk animal have been selected as a target teat and the support and any supported milking apparatus have been homed in to each one of the four teats. If the answer is negative, the steps starting with block 33 are repeated. If, on the other hand, the answer is affirmative the method continues at block 45, where the method is completed.
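The control flow of blocks 31-45 can be sketched as the loop below. All the callables (capture, find_candidates, move_support, home_in, select_target) are hypothetical stand-ins for the camera, image-processing and robot interfaces described in the text; they are injected as parameters so the sketch stays self-contained.

```python
def locate_and_attach(capture, find_candidates, move_support, home_in,
                      select_target=lambda c: c[0], n_teats=4, max_moves=50):
    """Sketch of the Figure 3 control loop.  All arguments are
    hypothetical stand-ins; max_moves is an assumed safety limit
    so the loop cannot run forever if no teat is ever found."""
    attached, moves = 0, 0
    while attached < n_teats and moves < max_moves:
        left = capture("left")                     # blocks 33-34: left diode lit
        right = capture("right")                   # blocks 35-36: right diode lit
        candidates = find_candidates(left, right)  # blocks 37-38: difference image
        if not candidates:                         # block 39: no teat candidate
            move_support()                         # block 40: reposition and retry
            moves += 1
            continue
        target = select_target(candidates)         # block 41: choose target teat
        home_in(target)                            # blocks 42-43: home in on it
        attached += 1                              # block 44: count attached teats
    return attached                                # block 45: done
```

The loop repeats the illuminate-capture-compare cycle until all four teats have been targeted, exactly as the flow chart returns to block 33 after each negative test.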
Figures 4 and 5 show captured images as forwarded by the camera 14 (fig. 1) or 22 (fig. 2) when the diodes 12, 13 or 21 (fig. 2) are alternately turned on and off. In Figure 4 the diode 12 is off and the diode 13, on the left side of the camera, is on, whereby the light is directed onto the udder of an animal so that the teats are illuminated mostly on their left sides. Figure 5 shows the reverse situation: the diode 13 on the left side of the camera is off and the diode 12 is on, whose light is directed onto the udder so that the teats are illuminated on their right sides. Some bright image pixels 50, 51, 52, 53 and 54 are observed by the camera in both pictures where the light hits different parts of the animal, such as the left-illuminated left parts 50, 51 of the teats in fig. 4 and the right-illuminated right parts 52, 53 of said teats in fig. 5. Other details, such as posts of a stall (not shown) hit by the sheet of light, may generate similar image pixels.
Furthermore, other sources of light, like windows and the general stall illumination lamps, may give some unwanted reflections in the captured images, as indicated by the references 55 in fig. 4 and 56 in fig. 5. These unwanted reflections 55, 56 are the same in both pictures and are referred to below as disturbing reflections, contrary to the true reflections from the teats 50/52, 51/53 or the leg 54 generated by the light emanating from the diodes.
As is made clear by Figure 5 the disturbing reflections 55, 56 also exist when the diode 13 is off and diode 12 is on.
In fig. 4 and 5 another pair of teats 57, 58 is shown but is not discussed in detail, since only two teats are necessary to explain the method.
By making use of the method according to the present invention, i.e. turning on and off the diodes alternatingly, one can obtain the two different captured images as shown in Figures 4 and 5. Processing the two captured images according to the inventive method shown by the flow chart of Figure 3 and described above, the disturbing reflections 55, 56 are eliminated or cancelled. In such a way a difference image 61 is obtained as shown in figure 6 comprising the sorted out pixels which are to be processed for evaluation of the presence of a target teat, e.g. the teats 51, 53, and to control the guidance of the robot and its carrier 17 (fig. 1).
Thus, the difference image 61 is examined, primarily to identify the different objects represented in the image, by comparing the differences in the sets of pictures obtained under the two diodes and scanning the image pixels in columns between the top and bottom (or vice versa) of the image, across the image. The diodes and camera are arranged stationary on said robot arm. The brightest group of pixels of each column in each picture is noted. A group 51/53 of such pixels side by side in adjacent columns, maintained over successive alternate illuminations, together forms the shape of a teat. The group 51/53 would be identified as a possible teat candidate, for instance by comparison with a stored list of different objects, while the groups 55, 56 would be rejected as unchanging objects, since the illumination of the diodes is too weak to affect them; such a group may for instance arise from a leg far away illuminated by outer sources of light (not shown). The group 54, being for instance a leg positioned very near, is excluded as having too large a dimension. The group 51/53 would then be selected as the target teat, being the nearest teat, while the group 50/52, representing another teat, is left to be treated later. Other pixels representing reflections on hoses, parts of the stall, etc. can be rejected in a similar manner or by other software techniques known in the art.
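The column-wise scan and width-based rejection described above might be sketched as follows. The brightness threshold and width limits are illustrative assumptions (the patent gives no numeric values), and a fuller implementation would also check that the bright rows of adjacent columns line up and compare candidate shapes against the stored object list.

```python
import numpy as np

def teat_candidates(diff, min_width=3, max_width=40, brightness=50):
    """Scan a difference image column by column: note whether each
    column contains a bright reflection, group adjacent bright
    columns into objects, and keep only groups whose width (in
    columns) is plausible for a teat.  Too-narrow groups are noise;
    too-wide groups are large objects such as a nearby leg.
    All numeric parameters are illustrative assumptions."""
    bright = diff.max(axis=0) >= brightness  # columns with a real reflection
    groups, start = [], None
    for col, ok in enumerate(bright):
        if ok and start is None:             # a bright run begins
            start = col
        elif not ok and start is not None:   # the run ends
            groups.append((start, col))
            start = None
    if start is not None:                    # run reaching the image edge
        groups.append((start, len(bright)))
    # keep only groups with a plausible teat width, as (start, end) columns
    return [(s, e) for s, e in groups if min_width <= e - s <= max_width]
```

A candidate surviving this filter corresponds to a pixel group like 51/53; oversized groups like the near leg 54 and isolated noisy columns are rejected on width alone.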
Once a suitable group 51/53 of bright pixels, combining the variations of both pictures, has been identified as representing a teat of interest, a target teat, a sub-image 71 as shown in fig. 7, enclosing the identified pixel group 51/53, is defined, and only this part of the difference image is then scanned and processed to provide guidance information to the robot to move the supported teat-cup or other apparatus towards the deduced position of the pixel group. The pixels of the difference image are those which vary between figure 4, where one diode is on, and figure 5, where the other diode is on. Here they are shown together, contributing to a full picture of the target teat: the pixels from figure 4 form the left part 51 and the pixels from figure 5 form the right part 53.
In a further embodiment (not shown) two sources of light generate two sheets of light that cover each other, whereby it is possible to obtain a safe generation of captured images.
The method and apparatus according to the invention will improve the process of teat locating by using a reliable robot guiding system that is safe for an animal.
The method for determining the location of a teat will now be further explained with reference to figure 8, where the target to be homed in on is the teat 51/53. The homing is made by determining the distance x from the camera to the teat. The homing according to the invention is made by moving the support, measuring the distance travelled, and simultaneously measuring the angle occupied by the target teat on at least two occasions. The support starts at a first camera position and closes in on the target teat towards the second position. In a computerised difference image, as explained with reference to fig. 6, the teat grows; the angle φ occupied by the teat is measured, for instance at the width y at a predetermined height h from the tip of the teat, as shown in fig. 6. At the first position the angle φ is measured, and at the second position the second angle φ' can be measured. Meanwhile, as the camera approaches the target teat, the distance Δx travelled by the support is measured by a distance travelling means (not shown). By calculating the difference in angle size and the distance travelled by the support, the distance to the teat can be calculated. The calculation is made repeatedly as the support closes in on the target teat.
Any disturbance movement of the target teat is considered, but the measurements are taken at such short intervals and in such numbers that disturbance movements have little influence. The object is to approach the teat so that a centre point of the teat is in the centre of the picture. When the teat is at the predestined distance for the operation to be performed, the support stops. The teat now occupies a predetermined part of the picture. The calculation in relation to fig. 8 uses the following factors:
x = distance between the initial camera position and the teat, to be calculated,
φ = angle occupied by the teat at the initial camera position,
φ' = angle occupied by the teat at the second camera position,
Δx = distance between the camera positions,
y = width of the teat.
The calculation for the operation is as follows:

tan φ = y / x    (1)

tan φ' = y / (x − Δx)    (2)

For small angles φ and φ' the equations (1) and (2), using the Taylor series expansion of the function tan φ, can be written

φ = y / x    (1')

φ' = y / (x − Δx)    (2')

By eliminating y and solving for x one obtains

x = φ' · Δx / (φ' − φ)    (3)

Hereby the distance to the teat can be calculated using only three factors: the angles φ and φ' and the distance Δx travelled towards the teat. Such a simple equation, to be calculated by the processing means, results in a fast homing process, since corrections in the positioning are rapidly made.
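Equation (3) is straightforward to evaluate; a minimal sketch, assuming the angles are given in radians and all distances are in consistent units:

```python
def distance_to_teat(phi, phi_prime, dx):
    """Equation (3): x = phi' * dx / (phi' - phi).

    phi, phi_prime: small angles (radians) occupied by the teat at
    the first and second camera positions; dx: distance travelled
    by the support between the two positions.  Returns the distance
    from the first camera position to the teat."""
    return phi_prime * dx / (phi_prime - phi)
```

A quick self-check: substituting φ = y/x and φ' = y/(x − Δx) back into the function recovers x exactly, confirming the elimination of y.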
In another embodiment of the invention the homing in on the teat is made without calculating the distance. Instead the centre of the camera is directed towards the identified target teat. The support moves towards the teat and the number of pixels in the difference image is counted. When the number exceeds a predetermined number the speed is lowered, and when the number of pixels reaches a further predetermined number the camera is halted. If, after that, the target moves, the camera also moves and holds the distance, following the increase and decrease in the pixels representing the movements of the target teat. Thus the centre point of the teat is maintained in the centre of the picture.
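The pixel-count homing of this embodiment might be sketched as a simple speed schedule; the thresholds and speed values below are illustrative assumptions, not values from the patent.

```python
def approach_speed(pixel_count, slow_at=500, stop_at=2000,
                   fast=1.0, slow=0.2):
    """Speed schedule for pixel-count homing: as the support nears
    the teat its pixel group grows, so above a first pixel count the
    speed is lowered and above a second the support halts at the
    operating distance.  All numeric values are assumptions."""
    if pixel_count >= stop_at:
        return 0.0     # at the operating distance: halt
    if pixel_count >= slow_at:
        return slow    # close: creep forward
    return fast        # far away: full approach speed
```

Evaluating the same schedule as the teat moves away (pixel count falling below stop_at again) makes the support follow, holding the distance as the text describes.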
It can also be suitable to produce a three-dimensional image using images from at least two positions. For instance, teat cups can be applied using a three-dimensional image as a reference for the control of the support.
In a further embodiment of guiding an apparatus support the measuring is made by measuring the number of pixels in the images captured by the capturing device.
In a still further embodiment a middle image is added, for which neither of the diodes is activated. In other words, first the left diode is activated, then none, and then the right. In this way an image with no illumination originating from the support is obtained.
In a further embodiment the illuminating means is a cluster of diodes, connected to form an illuminating means having the same illuminating effect as a regular incandescent lamp.
It is noted that only two teats are mentioned in the description. The same procedure can be performed on any number of teats but here the simplest choice is explained, between two teats. Four teats are visible in the fig. 4 and 5.

Claims

1. A method of guiding an apparatus support (11) towards at least one teat (51/53) of a milk animal, the method comprising the following steps: - moving said support (11) to a position where an udder is expected to be viewed;
- illuminating with at least two sources of light (12, 13) from the support (11) a region expected to contain at least one udder;
- capturing images of at least one teat by means of an image capturing device (14; 22) arranged on said support (11);
- calculating a position of said teat, said method being characterized by:
- activating a first source of light (12) and capturing images with the image capturing device (14, 22),
- deactivating the first source of light (12),
- activating a second source of light (13) placed at a different location than the first source of light (12) and capturing images with the image capturing device (14, 22) after or at the same time as deactivating the first source of light,
- deactivating the second source of light,
- moving the support from a first position to a second position, after obtaining a first set of images, - repeating the steps of activating and deactivating the sources of light and capturing a second set of images at the second position,
-establishing the differences between the captured images in previous steps and using said differences for calculating the position of said teat.
2. A method according to claim 1, wherein;
- after capturing a first image when the first source of light (12) is activated and a second image when the second source of light (13) is activated, combining these two images to provide a two-dimensional difference image (61), and thereafter identifying the teats (50/52, 51/53), - repeating said step at least once, and then - establishing differences by determining the relative differences in teat positions in at least two provided two-dimensional images.
3. A method according to claim 1 or 2, wherein said differences are used for calculating a three-dimensional image and thereby determining the positions of said teats (50/52, 51/53).
4. A method according to any of the claims 1-3, having the sources of light (12, 13) on each side of the image capturing device.
5. A method according to any of the claims 1-4, wherein the establishing of the differences is made by processing said captured images from the first set of images by superimposing the captured images in such a way that details (55, 56) occurring on both are blanked out, resulting in a difference image (61), said difference image (61) being analysed to identify possible teat candidates (50/52, 51/53).
6. A method of guiding an apparatus support according to any of the previous claims, characterized by the steps of:
- creating a sub image (71) of said at least one teat to be analysed, - measuring a first section of at least one of the teats in the first set of images, thus obtaining a first section value at the first position,
- moving said support to the second position,
- measuring a second section of at least one teat in the second set of images thus obtaining a second section value, - measuring simultaneously or close after the measuring of the sections, the distance travelled by the support (11) from the first to the second position,
- determining the absolute distance from said support to the teat on basis of the measured sections and the distance travelled.
7. A method of guiding an apparatus support according to claim 6, wherein the distance is determined continuously by measuring the sections and the distance travelled as the support approaches the teat.
8. A method of guiding an apparatus support according to claim 6 or 7, wherein the section measured is represented by an angle (φ) occupied by the teat in the image captured by the image capturing means (14, 22).
9. A method of guiding an apparatus support according to claim 6 or 7, wherein the number of pixels in the images captured by the capturing device is measured.
10. A method according to any of the claims 5-9, wherein the superimposing of said two captured images is made by adding pixel by pixel in the captured images, where the difference established between the images is the pixels representing at least said teat (51/53).
11. A method according to claim 10, wherein the superimposing of said two captured images is made electronically by processing electric signals representing each pixel of said two captured images, subtracting the signals generated by the first captured image from the signals generated by the second illuminated captured image type.
12. A method according to any of the previous claims, characterized in that an appropriate distance in relation to a teat is maintained, even when the animal is moving, enabling performing of specific predestined procedures.
13. A method according to claim 12, characterized in that the animal related operation is performed as the appropriate distance is established.
14. A method according to claim 13, characterized in that the animal related operation is to apply a teat-cup (16) to the teat.
15. A method according to any of the preceding claims, wherein at least one teat- cup (16) carried by said support (11) is attached to at least one individual target teat (50/52, 51/53).
16. A method according to any of the preceding claims, wherein said target teat (51/53) is cleaned by having a fluid sprayed towards said teat candidates.
17. A method according to any of the preceding claims, wherein said target teat (51/53) is given a follow up treatment.
18. A method according to any of the preceding claims, wherein the first and second source of light (12, 13) are used to illuminate at least two areas on said teat candidates (50/52, 51/53), the light being such that at a limited distance at least a substantial part of the region viewed by the image capturing device is illuminated.
19. A method according to any one of the previous claims, wherein as sources of light (12, 13), two diodes are used, mounted at each side of the camera to illuminate two areas on said teat candidates (50/52, 51/53).
20. An apparatus support guide arrangement, where a support (11) is arranged with sources of light (12, 13; 21), adapted to illuminate at least two areas on at least one teat (50/52, 51/53), and is associated with an image capturing device (14; 22) arranged to view a teat (50/52, 51/53) and to capture images thereof and to provide image signals to an image interpreting means, characterized by comprising a control device (26) arranged to turn on and off said sources of light (12, 13; 21) alternatingly, the sources of light (12, 13; 21) being placed so that the teats are illuminated from different angles in relation to the image capturing device, and the image interpreting means (25) being arranged to use the differences created by the illumination of said sources of light (12, 13; 21) to produce a difference image (61).
21. An arrangement according to claim 20, wherein said image interpreting means (25) further includes identifying means arranged to identify possible teat candidates (50/52, 51/53), and to select a target teat (51/53), the image interpreting means being arranged to determine the position of said target teat (51/53), and to send signals for homing said support (11) and any supported apparatus (16) to said target teat (51/53).
22. An arrangement according to claim 20 or 21, wherein there is a first and a second source of light, placed at the left respectively right side of the image capturing device (14), for generating a first image and a second image.
23. An arrangement according to any of the claims 20-22, wherein said image interpreting means (25) is arranged to superimpose said captured images for blanking out details occurring on a plurality of said images.
24. An arrangement according to any of the claims 20-23, wherein said image interpreting means (25) is arranged to create a sub image (71) and comprises electronic means arranged to process electric signals in said sub image, said electric signals representing captured parts of images illuminated by the first source of light (12) subtracted from the signals representing other captured parts of images illuminated by the second source of light (13).
25. An arrangement according to any of the claims 20- 24, wherein the image interpreting means (25) further comprises positioning means, arranged to measure the distance travelled by the support, and is arranged to use a predetermined equation, which comprises factors representing the measured distance and a measured section of the sub image, measured on at least two occasions, and thereby determining the distance to at least one teat.
26. An arrangement according to claim 25, wherein the section the image interpreting means is arranged to measure is represented by the angle occupied by the target teat (51/53) in the sub image (71).
27. An arrangement according to claim 25 or 26, wherein the image interpreting means is arranged to measure the target teat (51/53) by counting the number of pixels in the sub image (71).
28. An arrangement according to any of the claims 24-27, wherein the positioning means in association with the image interpreting means are arranged to measure the distance to the target teat (51/53) also when the target teat (51/53) is moving.
29. An arrangement according to anyone of the claims 20-28, wherein said support (11) is arranged to bring at least one teat-cup (16) to said target teat (51/53).
30. An arrangement according to anyone of the claims 20-29, wherein said support (11) is provided with a cleaning device arranged to spray said target teat (51/53) with a cleaning fluid.
31. An arrangement according to anyone of the claims 20-30, wherein the support (11) comprises an apparatus arranged for following up treatment of the target teat (51/53).
32. An arrangement according to any one of the claims 20-31, wherein said sources of light (12, 13; 21) are of diode type and are arranged to generate at least one visible illuminated area on said teat candidates (50/52, 51/53).
33. An arrangement according to anyone of the claims 20-32, wherein said source of light (12,13; 21) is of IR diode type and is arranged to generate at least one invisible illuminated area on said teat candidates (50/52, 51/53).
PCT/SE2001/000083 2000-01-19 2001-01-18 A method and an apparatus for locating the teats of an animal WO2001052633A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2001228983A AU2001228983A1 (en) 2000-01-19 2001-01-18 A method and an apparatus for locating the teats of an animal

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SE0000153A SE0000153L (en) 2000-01-19 2000-01-19 A method and apparatus for locating the teats of a dairy animal
SE0000153-7 2000-01-19

Publications (1)

Publication Number Publication Date
WO2001052633A1 true WO2001052633A1 (en) 2001-07-26

Family

ID=20278152

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SE2001/000083 WO2001052633A1 (en) 2000-01-19 2001-01-18 A method and an apparatus for locating the teats of an animal

Country Status (3)

Country Link
AU (1) AU2001228983A1 (en)
SE (1) SE0000153L (en)
WO (1) WO2001052633A1 (en)

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003055297A1 (en) * 2001-12-28 2003-07-10 Idento Electronics B.V. Method and apparatus for detection of teats
US7490576B2 (en) 2006-03-15 2009-02-17 Lmi Technologies Ltd Time of flight teat location system
US8210122B2 (en) 2004-03-30 2012-07-03 Delaval Holding Ab Arrangement and method for determining positions of the teats of a milking animal
US8393296B2 (en) 2011-04-28 2013-03-12 Technologies Holdings Corp. Milking box with robotic attacher including rotatable gripping portion and nozzle
US8590488B2 (en) 2010-08-31 2013-11-26 Technologies Holdings Corp. Vision system for facilitating the automated application of disinfectant to the teats of dairy livestock
US8671885B2 (en) 2011-04-28 2014-03-18 Technologies Holdings Corp. Vision system for robotic attacher
US8683946B2 (en) 2011-04-28 2014-04-01 Technologies Holdings Corp. System and method of attaching cups to a dairy animal
US8746176B2 (en) 2011-04-28 2014-06-10 Technologies Holdings Corp. System and method of attaching a cup to a dairy animal according to a sequence
US8800487B2 (en) 2010-08-31 2014-08-12 Technologies Holdings Corp. System and method for controlling the position of a robot carriage based on the position of a milking stall of an adjacent rotary milking platform
US8885891B2 (en) 2011-04-28 2014-11-11 Technologies Holdings Corp. System and method for analyzing data captured by a three-dimensional camera
US8903129B2 (en) 2011-04-28 2014-12-02 Technologies Holdings Corp. System and method for filtering data captured by a 2D camera
US9043988B2 (en) 2011-04-28 2015-06-02 Technologies Holdings Corp. Milking box with storage area for teat cups
US9049843B2 (en) 2011-04-28 2015-06-09 Technologies Holdings Corp. Milking box with a robotic attacher having a three-dimensional range of motion
US9058657B2 (en) 2011-04-28 2015-06-16 Technologies Holdings Corp. System and method for filtering data captured by a 3D camera
US9107379B2 (en) 2011-04-28 2015-08-18 Technologies Holdings Corp. Arrangement of milking box stalls
US9149018B2 (en) 2010-08-31 2015-10-06 Technologies Holdings Corp. System and method for determining whether to operate a robot in conjunction with a rotary milking platform based on detection of a milking claw
US9161511B2 (en) 2010-07-06 2015-10-20 Technologies Holdings Corp. Automated rotary milking system
US9161512B2 (en) 2011-04-28 2015-10-20 Technologies Holdings Corp. Milking box with robotic attacher comprising an arm that pivots, rotates, and grips
US9215861B2 (en) 2011-04-28 2015-12-22 Technologies Holdings Corp. Milking box with robotic attacher and backplane for tracking movements of a dairy animal
US9258975B2 (en) 2011-04-28 2016-02-16 Technologies Holdings Corp. Milking box with robotic attacher and vision system
US9265227B2 (en) 2011-04-28 2016-02-23 Technologies Holdings Corp. System and method for improved attachment of a cup to a dairy animal
US9357744B2 (en) 2011-04-28 2016-06-07 Technologies Holdings Corp. Cleaning system for a milking box stall
US9681634B2 (en) 2011-04-28 2017-06-20 Technologies Holdings Corp. System and method to determine a teat position using edge detection in rear images of a livestock from two cameras
US10111401B2 (en) 2010-08-31 2018-10-30 Technologies Holdings Corp. System and method for determining whether to operate a robot in conjunction with a rotary parlor
US10127446B2 (en) 2011-04-28 2018-11-13 Technologies Holdings Corp. System and method for filtering data captured by a 2D camera
US10357015B2 (en) 2011-04-28 2019-07-23 Technologies Holdings Corp. Robotic arm with double grabber and method of operation

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1997015900A1 (en) * 1995-10-27 1997-05-01 Alfa Laval Agri Ab Teat location for milking
US5631976A (en) * 1994-04-29 1997-05-20 International Business Machines Corporation Object imaging system
WO2000011935A1 (en) * 1998-08-31 2000-03-09 Alfa Laval Agri Ab A method and an apparatus for locating the teats of an animal

Cited By (113)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003055297A1 (en) * 2001-12-28 2003-07-10 Idento Electronics B.V. Method and apparatus for detection of teats
US8210122B2 (en) 2004-03-30 2012-07-03 Delaval Holding Ab Arrangement and method for determining positions of the teats of a milking animal
US7490576B2 (en) 2006-03-15 2009-02-17 Lmi Technologies Ltd Time of flight teat location system
US9161511B2 (en) 2010-07-06 2015-10-20 Technologies Holdings Corp. Automated rotary milking system
US8720382B2 (en) 2010-08-31 2014-05-13 Technologies Holdings Corp. Vision system for facilitating the automated application of disinfectant to the teats of dairy livestock
US9737043B2 (en) 2010-08-31 2017-08-22 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US10477828B2 (en) 2010-08-31 2019-11-19 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US10327414B2 (en) 2010-08-31 2019-06-25 Technologies Holdings Corp. System and method for controlling the position of a robot carriage based on the position of a milking stall of an adjacent rotary milking platform
US8707905B2 (en) 2010-08-31 2014-04-29 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US9433184B2 (en) 2010-08-31 2016-09-06 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US8720383B2 (en) 2010-08-31 2014-05-13 Technologies Holdings Corp. Vision system for facilitating the automated application of disinfectant to the teats of dairy livestock
US8726843B2 (en) 2010-08-31 2014-05-20 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US10111401B2 (en) 2010-08-31 2018-10-30 Technologies Holdings Corp. System and method for determining whether to operate a robot in conjunction with a rotary parlor
US8800487B2 (en) 2010-08-31 2014-08-12 Technologies Holdings Corp. System and method for controlling the position of a robot carriage based on the position of a milking stall of an adjacent rotary milking platform
US8807085B2 (en) 2010-08-31 2014-08-19 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US8807086B2 (en) 2010-08-31 2014-08-19 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US9980458B2 (en) 2010-08-31 2018-05-29 Technologies Holdings Corp. System and method for controlling the position of a robot carriage based on the position of a milking stall of an adjacent rotary milking platform
US9894876B2 (en) 2010-08-31 2018-02-20 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US9888664B2 (en) 2010-08-31 2018-02-13 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US9775325B2 (en) 2010-08-31 2017-10-03 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US9763424B1 (en) 2010-08-31 2017-09-19 Technologies Holdings Corp. Vision system for facilitating the automated application of disinfectant to the teats of dairy livestock
US10595501B2 (en) 2010-08-31 2020-03-24 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US9706747B2 (en) 2010-08-31 2017-07-18 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US9686962B2 (en) 2010-08-31 2017-06-27 Technologies Holdings Corp. Vision system for facilitating the automated application of disinfectant to the teats of dairy livestock
US9686961B2 (en) 2010-08-31 2017-06-27 Technologies Holdings Corp. Automated system for moving a robotic arm along a rotary milking platform
US9126335B2 (en) 2010-08-31 2015-09-08 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US9149018B2 (en) 2010-08-31 2015-10-06 Technologies Holdings Corp. System and method for determining whether to operate a robot in conjunction with a rotary milking platform based on detection of a milking claw
US8590488B2 (en) 2010-08-31 2013-11-26 Technologies Holdings Corp. Vision system for facilitating the automated application of disinfectant to the teats of dairy livestock
US9648839B2 (en) 2010-08-31 2017-05-16 Technologies Holdings Corp. System and method for determining whether to operate a robot in conjunction with a rotary milking platform based on detection of a milking claw
US9648843B2 (en) 2010-08-31 2017-05-16 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US9560832B2 (en) 2010-08-31 2017-02-07 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US9549531B2 (en) 2010-08-31 2017-01-24 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US9247709B2 (en) 2010-08-31 2016-02-02 Technologies Holdings Corp. System and method for controlling the position of a robot carriage based on the position of a milking stall of an adjacent rotary milking platform
US9516854B2 (en) 2010-08-31 2016-12-13 Technologies Holdings Corp. Vision system for facilitating the automated application of disinfectant to the teats of dairy livestock
US10595500B2 (en) 2010-08-31 2020-03-24 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US9480238B2 (en) 2010-08-31 2016-11-01 Technologies Holdings Corp. Vision system for facilitating the automated application of disinfectant to the teats of dairy livestock
US9474248B2 (en) 2010-08-31 2016-10-25 Technologies Holdings Corp. Automated system for applying disinfectant to the teats of dairy livestock
US9462781B2 (en) 2010-08-31 2016-10-11 Technologies Holdings Corp. Automated system for moving a robotic arm along a rotary milking platform
US9462782B2 (en) 2010-08-31 2016-10-11 Technologies Holdings Corp. System and method for controlling the position of a robot carriage based on the position of a milking stall of an adjacent rotary milking platform
US9485955B2 (en) 2011-04-28 2016-11-08 Technologies Holdings Corp. System and method of attaching cups to a dairy animal
US9706745B2 (en) 2011-04-28 2017-07-18 Technologies Holdings Corp. Vision system for robotic attacher
US9374976B2 (en) 2011-04-28 2016-06-28 Technologies Holdings Corp. Milking box with robotic attacher, vision system, and vision system cleaning device
US9374979B2 (en) 2011-04-28 2016-06-28 Technologies Holdings Corp. Milking box with backplane and robotic attacher
US9374974B2 (en) 2011-04-28 2016-06-28 Technologies Holdings Corp. Milking box with robotic attacher
US9374975B2 (en) 2011-04-28 2016-06-28 Technologies Holdings Corp. System and method of attaching cups to a dairy animal
US9402365B2 (en) 2011-04-28 2016-08-02 Technologies Holdings Corp. Milking box with robotic attacher
US9326480B2 (en) 2011-04-28 2016-05-03 Technologies Holdings Corp. Milking box with robotic attacher
US9439390B2 (en) 2011-04-28 2016-09-13 Technologies Holdings Corp. System and method of attaching cups to a dairy animal
US9282718B2 (en) 2011-04-28 2016-03-15 Technologies Holdings Corp. Milking box with robotic attacher
US9282720B2 (en) 2011-04-28 2016-03-15 Technologies Holdings Corp. Arrangement of milking box stalls
US9462780B2 (en) 2011-04-28 2016-10-11 Technologies Holdings Corp. Vision system for robotic attacher
US9468188B2 (en) 2011-04-28 2016-10-18 Technologies Holdings Corp. System and method of attaching cups to a dairy animal
US9271471B2 (en) 2011-04-28 2016-03-01 Technologies Holdings Corp. System and method for analyzing data captured by a three-dimensional camera
US9474246B2 (en) 2011-04-28 2016-10-25 Technologies Holdings Corp. Milking box with robotic attacher
US9480236B2 (en) 2011-04-28 2016-11-01 Technologies Holdings Corp. System and method of attaching a cup to a dairy animal according to a sequence
US9265227B2 (en) 2011-04-28 2016-02-23 Technologies Holdings Corp. System and method for improved attachment of a cup to a dairy animal
US9258975B2 (en) 2011-04-28 2016-02-16 Technologies Holdings Corp. Milking box with robotic attacher and vision system
US9491924B2 (en) 2011-04-28 2016-11-15 Technologies Holdings Corp. Milking box with robotic attacher comprising an arm that pivots, rotates, and grips
US9504224B2 (en) 2011-04-28 2016-11-29 Technologies Holdings Corp. Milking box with robotic attacher
US9510554B2 (en) 2011-04-28 2016-12-06 Technologies Holdings Corp. System and method for improved attachment of a cup to a dairy animal
US9253959B2 (en) 2011-04-28 2016-02-09 Technologies Holdings Corp. System and method of attaching cups to a dairy animal
US9215861B2 (en) 2011-04-28 2015-12-22 Technologies Holdings Corp. Milking box with robotic attacher and backplane for tracking movements of a dairy animal
US9549529B2 (en) 2011-04-28 2017-01-24 Technologies Holdings Corp. Robotic attacher and method of operation
US9183623B2 (en) 2011-04-28 2015-11-10 Technologies Holdings Corp. System and method for filtering data captured by a 3D camera
US9582871B2 (en) 2011-04-28 2017-02-28 Technologies Holdings Corp. System and method for filtering data captured by a 3D camera
US9615537B2 (en) 2011-04-28 2017-04-11 Technologies Holdings Corp. Milking box with backplane responsive robotic attacher
US9171208B2 (en) 2011-04-28 2015-10-27 Technologies Holdings Corp. System and method for filtering data captured by a 2D camera
US9161512B2 (en) 2011-04-28 2015-10-20 Technologies Holdings Corp. Milking box with robotic attacher comprising an arm that pivots, rotates, and grips
US9648840B2 (en) 2011-04-28 2017-05-16 Technologies Holdings Corp. Milking robot with robotic arm, vision system, and vision system cleaning device
US9681635B2 (en) 2011-04-28 2017-06-20 Technologies Holdings Corp. Milking box with robotic attacher
US9681634B2 (en) 2011-04-28 2017-06-20 Technologies Holdings Corp. System and method to determine a teat position using edge detection in rear images of a livestock from two cameras
US9107379B2 (en) 2011-04-28 2015-08-18 Technologies Holdings Corp. Arrangement of milking box stalls
US9107378B2 (en) 2011-04-28 2015-08-18 Technologies Holdings Corp. Milking box with robotic attacher
US9686960B2 (en) 2011-04-28 2017-06-27 Technologies Holdings Corp. Milking box with robotic attacher
US9686959B2 (en) 2011-04-28 2017-06-27 Technologies Holdings Corp. Milking box with robotic attacher
US9058657B2 (en) 2011-04-28 2015-06-16 Technologies Holdings Corp. System and method for filtering data captured by a 3D camera
US9357744B2 (en) 2011-04-28 2016-06-07 Technologies Holdings Corp. Cleaning system for a milking box stall
US9737040B2 (en) 2011-04-28 2017-08-22 Technologies Holdings Corp. System and method for analyzing data captured by a three-dimensional camera
US9049843B2 (en) 2011-04-28 2015-06-09 Technologies Holdings Corp. Milking box with a robotic attacher having a three-dimensional range of motion
US9737039B2 (en) 2011-04-28 2017-08-22 Technologies Holdings Corp. Robotic attacher and method of operation
US9737042B2 (en) 2011-04-28 2017-08-22 Technologies Holdings Corp. System and method of attaching cups to a dairy animal
US9737041B2 (en) 2011-04-28 2017-08-22 Technologies Holdings Corp. System and method of attaching cups to a dairy animal
US9737048B2 (en) 2011-04-28 2017-08-22 Technologies Holdings Corp. Arrangement of milking box stalls
US9743635B2 (en) 2011-04-28 2017-08-29 Technologies Holdings Corp. System and method of attaching cups to a dairy animal
US9756830B2 (en) 2011-04-28 2017-09-12 Technologies Holdings Corp. Milking box with robotic attacher
US9043988B2 (en) 2011-04-28 2015-06-02 Technologies Holdings Corp. Milking box with storage area for teat cups
US9763422B2 (en) 2011-04-28 2017-09-19 Technologies Holdings Corp. Milking box with robotic attacher
US8903129B2 (en) 2011-04-28 2014-12-02 Technologies Holdings Corp. System and method for filtering data captured by a 2D camera
US9883654B2 (en) 2011-04-28 2018-02-06 Technologies Holdings Corp. Arrangement of milking box stalls
US8885891B2 (en) 2011-04-28 2014-11-11 Technologies Holdings Corp. System and method for analyzing data captured by a three-dimensional camera
US8826858B2 (en) 2011-04-28 2014-09-09 Technologies Holdings Corp. Milking box with robotic attacher
US9901067B2 (en) 2011-04-28 2018-02-27 Technologies Holdings Corp. Robotic attacher and method of operation
US9930861B2 (en) 2011-04-28 2018-04-03 Technologies Holdings Corp. Milking box with robotic attacher
US9980459B2 (en) 2011-04-28 2018-05-29 Technologies Holdings Corp. Milking box with robotic attacher comprising an arm that pivots, rotates, and grips
US8813680B2 (en) 2011-04-28 2014-08-26 Technologies Holdings Corp. Milking box with robotic attacher
US9980460B2 (en) 2011-04-28 2018-05-29 Technologies Holdings Corp. System and method for improved attachment of a cup to a dairy animal
US8746176B2 (en) 2011-04-28 2014-06-10 Technologies Holdings Corp. System and method of attaching a cup to a dairy animal according to a sequence
US10127446B2 (en) 2011-04-28 2018-11-13 Technologies Holdings Corp. System and method for filtering data captured by a 2D camera
US10143179B2 (en) 2011-04-28 2018-12-04 Technologies Holdings Corp. Milking box with a robotic attacher having a three-dimensional range of motion
US10172320B2 (en) 2011-04-28 2019-01-08 Technologies Holdings Corp. System and method of attaching a cup to a dairy animal according to a sequence
US10303939B2 (en) 2011-04-28 2019-05-28 Technologies Holdings Corp. System and method for filtering data captured by a 2D camera
US10327415B2 (en) 2011-04-28 2019-06-25 Technologies Holdings Corp. System and method for improved attachment of a cup to a dairy animal
US8683946B2 (en) 2011-04-28 2014-04-01 Technologies Holdings Corp. System and method of attaching cups to a dairy animal
US10349618B2 (en) 2011-04-28 2019-07-16 Technologies Holdings Corp. System and method of attaching a cup to a dairy animal according to a sequence
US10357015B2 (en) 2011-04-28 2019-07-23 Technologies Holdings Corp. Robotic arm with double grabber and method of operation
US10362759B2 (en) 2011-04-28 2019-07-30 Technologies Holdings Corp. Milking box with robotic attacher
US10373306B2 (en) 2011-04-28 2019-08-06 Technologies Holdings Corp. System and method for filtering data captured by a 3D camera
US8671885B2 (en) 2011-04-28 2014-03-18 Technologies Holdings Corp. Vision system for robotic attacher
US10477826B2 (en) 2011-04-28 2019-11-19 Technologies Holdings Corp. Milking box with robotic attacher
US8651051B2 (en) 2011-04-28 2014-02-18 Technologies Holdings Corp. Milking box with robotic attacher
US8393296B2 (en) 2011-04-28 2013-03-12 Technologies Holdings Corp. Milking box with robotic attacher including rotatable gripping portion and nozzle
US10602712B2 (en) 2011-04-28 2020-03-31 Technologies Holdings Corp. Milking box with storage area for teat cups
US11096370B2 (en) 2011-04-28 2021-08-24 Technologies Holdings Corp. Milking box with robotic attacher comprising an arm that pivots, rotates, and grips

Also Published As

Publication number Publication date
AU2001228983A1 (en) 2001-07-31
SE0000153D0 (en) 2000-01-19
SE0000153L (en) 2001-07-20

Similar Documents

Publication Publication Date Title
WO2001052633A1 (en) A method and an apparatus for locating the teats of an animal
US5934220A (en) Teat location for milking
EP2685811B1 (en) System and method for three dimensional teat modeling for use with a milking system
WO2000011935A1 (en) A method and an apparatus for locating the teats of an animal
EP0989800B1 (en) Apparatus and method for recognising and determining the position of a part of an animal
EP3520605B1 (en) Arrangement and method for determining a weight of an animal
EP1460892B1 (en) Method and apparatus for detection of teats
EP1656014B1 (en) Improvements in or relating to milking machines
EP0993655B1 (en) An animal related apparatus
EP0975211B2 (en) Apparatus and method for recognising and determining the position of a part of an animal
SE9901385L (en) Method and apparatus for recognizing and determining a position and a robot including such a device
WO2000011940A1 (en) An apparatus and a method for monitoring an animal related area
WO2010023122A2 (en) Arrangement and method for determining positions of the teats of a milking animal
WO2010020457A1 (en) Arrangement and method for controlling a movable robot arm
US20180035679A1 (en) Device for optically identifying the sex of a slaughter pig
WO2014014341A1 (en) Milking arrangement
SE516384C2 (en) Apparatus for locating the teats of a milk animal using light sources to illuminate two areas on a teat at different angles and an image capture device to supply information to a processor
WO2010023121A2 (en) Arrangement and method for determining positions of the teats of a milking animal
French et al. An image analysis system to guide a sensor placement robot onto a feeding pig

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP