WO2014014341A1 - Milking arrangement - Google Patents

Publication number
WO2014014341A1
Authority
WO
WIPO (PCT)
Application number
PCT/NL2013/050481
Other languages
French (fr)
Inventor
Assaf WISE
Gideon Yadin
Original Assignee
Lely Patent N.V.
Priority
EP12177270.1 (priority date 2012-07-20)
Application filed by Lely Patent N.V.
Publication of WO2014014341A1

Classifications

    • A: HUMAN NECESSITIES
    • A01: AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01J: MANUFACTURE OF DAIRY PRODUCTS
    • A01J5/00: Milking machines or devices
    • A01J5/017: Automatic attaching or detaching of clusters
    • A01J5/0175: Attaching of clusters

Abstract

Milking arrangement (1) comprising:
- a milking parlour (2) with teat cups (8),
- a robot arm (7) for connecting the teat cups to the teats (5) of a milking animal (3),
- a control means (6, 9) for controlling the robot arm,
wherein said control means comprise:
- a coherent light source (12) arranged to emit a coherent beam (13) of optical radiation,
- a speckle pattern generator (14) for forming a speckled beam (16),
- a camera (9) for imaging a reflected part of said beam, and
- a control unit (6) arranged to form a three-dimensional image from said image by comparing said image with a reference image of said beam, and statistically cross-correlating said speckle pattern with the speckle pattern in said reference image, to identify said teats and said teat cups and to detect the location thereof with respect to the camera, on the basis of said three-dimensional image.

Description

Milking arrangement

The present invention relates to a milking arrangement. More in particular, the invention relates to a milking arrangement comprising a milking parlour with teat cups, a robot arm for connecting the teat cups to the teats of a milking animal, and a control means for controlling the robot arm.

Such milking arrangements are known as milking robots, available on the market today. Milking robots are able to milk cows in a completely automated way. Thereto, a dairy animal such as a cow enters the milking parlour. The control means is arranged to help guide the robot arm to the teats for connecting the teat cups to the teats of the milking animal. Various types of control means are known in the prior art, such as laser detection systems, stereoscopic cameras and 3D time-of-flight cameras.

It turns out that in practice, the known control means do not always function sufficiently reliably and fast, sometimes to the point that they are unable to connect all teat cups to the teats. This is of course undesirable, as it not only reduces the capacity of the milking arrangement, but might also reduce milk production for the dairy animal, as one or more quarters will not be milked out. Even if the dairy farmer were warned and milked the cow by connecting the teat cups manually, this would reduce the overall capacity of the milking arrangement, and would furthermore lead to more work for the farmer and to more stress for the dairy animal. It is therefore an object of the present invention to provide a milking arrangement of the kind mentioned above that is more reliable and/or faster, or at least provides the public with a reasonable alternative.

The invention achieves this object by means of a milking arrangement according to claim 1, in particular comprising a milking parlour with teat cups, a robot arm for connecting the teat cups to the teats of a milking animal, and a control means for controlling the robot arm, wherein said control means comprise a coherent light source arranged to emit a coherent beam of optical radiation, a speckle pattern generator arranged to impart a speckle pattern to said beam, thereby forming a speckled beam, a camera for repeatedly obtaining an image of a reflected part of said beam, and a control unit arranged to form a three-dimensional image from said image by comparing said image with at least one reference reflection image of said beam taken with said camera, and statistically cross-correlating said speckle pattern in said image with said speckle pattern in said at least one reference reflection image, wherein the control unit is further arranged to identify said teats and to detect the location thereof with respect to the camera, on the basis of said three-dimensional image. In such a milking arrangement, and with such a control means, it turns out to be possible to detect teats, and thus connect the teat cups, reliably and fast. A fast detection is particularly advantageous because milking animals are live creatures that can move in an unpredictable way, hindering easy detection and connection. Therefore, if the detection is performed swiftly, connection becomes more reliable. Special embodiments of the invention are described in the dependent claims, as well as in what follows.

It is noted that the technology of creating a speckle pattern, such as caused by interference among different components of a diffused beam, and of using the speckle pattern to establish position, orientation and/or movement, is itself known from, among others, the Kinect game controller and its software developer PrimeSense, see e.g. http://www.joystiq.com/2010/06/19/kinect-how-it-works-from-the-company-behind-the-tech/. Therefore, for specific technical details reference is made to corresponding publications such as WO2007/043036. In the present application, the presence of a memory for one or more reference images of the speckles at known distances, and the other parts necessary to perform the method from the cited WO'036 document, are deemed implicitly included in the milking arrangement, in particular in the control means. Herein, various steps regarding the cross-correlation of an image to a reference image are given in an exemplary embodiment on pages 15 and further, hereby included by reference.
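
The cross-correlation referred to above can be made concrete with a small sketch in Python (the patent contains no code; the function and its window parameters are invented for illustration and do not reproduce the actual method of the cited WO'036 document). A patch of the acquired speckle image is compared against laterally shifted patches of the stored reference image, and the shift with the highest normalised correlation is taken as the local displacement of the pattern:

```python
import numpy as np

def patch_disparity(image, reference, row, col, size=8, max_shift=16):
    # Standardise the patch of the acquired image (zero mean, unit variance).
    patch = image[row:row + size, col:col + size].astype(float)
    patch = (patch - patch.mean()) / (patch.std() + 1e-9)
    best_shift, best_score = 0, -np.inf
    # Compare against laterally shifted patches of the reference image and
    # keep the shift with the highest normalised cross-correlation.
    for s in range(-max_shift, max_shift + 1):
        c = col + s
        if c < 0 or c + size > reference.shape[1]:
            continue
        ref = reference[row:row + size, c:c + size].astype(float)
        ref = (ref - ref.mean()) / (ref.std() + 1e-9)
        score = float((patch * ref).mean())
        if score > best_score:
            best_shift, best_score = s, score
    return best_shift, best_score
```

Repeating such a search per pixel or per block yields the displacement field from which the three-dimensional image is built.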

It is furthermore noted that detection of structures and objects in a three-dimensional image is also known in the art. Depending on the objects to be detected, a number of criteria may be applied to the image. For example, if a teat has to be detected, one can look for a more or less cylindrical object with a diameter of about 2-3 cm and a length roughly between 2 and 8 cm, with a rounded tip at the lower side, connected at the upper side to a much bigger spherical structure, and moreover provided in fourfold in a trapezoidal symmetry. Furthermore, as finger detection has already been contemplated for Kinect and similar systems, with only computing power determining the required resolution, and as teats and fingers are geometrically similar objects, the presently contemplated system is well suited for teat detection. Of course, if other objects need to be detected, suitable criteria can be provided, based on knowledge of the geometry of those objects.
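
Such geometric criteria can be expressed as a trivial plausibility check (a Python sketch using the dimensions quoted above; the function itself is not part of the patent):

```python
def is_teat_candidate(diameter_cm, length_cm):
    # Criteria from the text: a more or less cylindrical object with a
    # diameter of about 2-3 cm and a length roughly between 2 and 8 cm.
    return 2.0 <= diameter_cm <= 3.0 and 2.0 <= length_cm <= 8.0
```

In a real system such a check would be one of several criteria, applied after the roughly cylindrical shape itself has been extracted from the three-dimensional image.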

In advantageous embodiments, the control unit is further arranged to detect at least one of said teat cups and the position thereof with respect to the camera, on the basis of said three-dimensional image. By not only identifying a teat and detecting its location, but also detecting the position of a teat cup in the image, the control unit can determine the mutual distance between the teat cup and the teat to which it is to be connected. By guiding the teat cup so as to minimise that distance, the efficiency will be improved. It also ensures that any mispositioning of the teat cup on the robot arm can be corrected. Failing to do so and using only a basic position might cause unsuccessful attempts to connect, which decreases system efficiency.
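
Guiding by minimising the mutual distance can be pictured as a simple proportional control step (an illustrative Python sketch, not the patented control law; positions are in metres and the gain and step limit are assumed values):

```python
def guidance_step(cup_pos, teat_pos, gain=0.5, max_step=0.02):
    # Vector from the detected teat cup to the detected teat, both taken
    # from the three-dimensional image in camera coordinates.
    dx = [t - c for c, t in zip(cup_pos, teat_pos)]
    dist = sum(d * d for d in dx) ** 0.5
    if dist == 0.0:
        return list(cup_pos), 0.0
    # Move a fraction of the remaining distance, capped at max_step.
    scale = min(gain * dist, max_step) / dist
    new_pos = [c + d * scale for c, d in zip(cup_pos, dx)]
    return new_pos, dist
```

Iterating this step with fresh detections shrinks the distance each cycle, and any mispositioning of the cup on the robot arm is corrected automatically, since the cup position is re-detected rather than assumed.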

A particular advantage of the system with respect to, for example, triangulation systems using two or more cameras, is that a single camera suffices to do measurements. An actual image by the camera is compared and cross-correlated with a stored reference image taken before by the same camera at a known distance. This makes the system less vulnerable to malfunctions of a camera (as it has only one), but also faster, as only one image at a time has to be acquired. This increased speed helps in obtaining the increased detection speed, in particular for unpredictably moving animals. Furthermore, it also helps that, contrary to time-of-flight measurements, no sampling of a modulated signal is necessary for each pixel separately, nor integration of said signal for averaging to improve poor signal-to-noise ratios. Inevitably, such sampling and integration take time, while the arrangement according to the invention only requires the grabbing of a single image to detect momentary position, since the processing can be done externally, with a high-power computer, which thus does not need to be present very near the camera or even on the robot arm. Therefore, in a particular embodiment, the control means has a single camera that is arranged to repeatedly obtain an image of a reflected part of said beam, from which image the control unit forms a three-dimensional image. It is stressed here that this single camera relates to the grabbing of the image for making the three-dimensional image. Other cameras may be present for other purposes. The forming of the three-dimensional image still takes place by cross-correlating the speckle pattern in it with the speckle pattern of one or more reference images, as stored in the control means.

Advantageously, the speckle pattern generator is arranged to generate a constant and random speckled pattern. This allows easier cross-correlation of the image with the reference image, as each part of the pattern is in principle unique, and can be traced in the distorted received image more easily. However, it is also possible to use a speckle pattern generator that is arranged to generate a speckle pattern having some degree of regularity, up to a completely regular pattern.

In embodiments, the milking arrangement according to the invention further comprises an additional sensor different from said camera, said sensor being arranged to obtain at least one additional image of a reflected part of said beam in a way that differs from the way of said camera, and wherein the control unit is arranged to use said additional image in identifying at least said teats. With such embodiments, an additional image is available to improve the detection capabilities of the control means. In use, the detection of teats will often depend on e.g. edge detection and the like; but when a surface is inclined sharply, the reflected speckle pattern will be locally weak and/or much distorted. It is then relatively difficult to determine whether the signal is just weak but real, or whether there is some noise or other signal disturbance. In other words, many likelihood criteria used to detect edges and the like will have difficulties determining such edges with a high accuracy. Now, by using an additional image, this accuracy and likelihood may be improved, because it becomes better possible to determine whether a structure in the acquired image is a real edge or similar structure by comparing it with the corresponding part of the additional image. If the latter for example shows a similar discontinuity, the likelihood of there being a real edge is higher, while a smooth surface in the additional image helps to conclude that there is not. In embodiments, the additional sensor comprises a thermal image camera or an ultrasound sensor. Generally, such sensors are less susceptible to dirt, as a layer of dirt will often still reflect ultrasound in much the same way as clean tissue does, and dirt will assume a temperature that is often much the same as that of the underlying tissue. Therefore, using such additional images improves the reliability of the original, three-dimensional image.
E.g., such a thermal image camera produces a thermal image of the region of interest. Since an udder and the teats, but also most other body parts, are at more than 30 °C (such as at about 33-35 °C), which is almost always much more than ambient temperature, these structures are easily visible. Also, since such a sensor measures a completely different, but relevant, parameter, this additional image is also available for other purposes, such as health monitoring and general animal management. For example, if a teat or udder quarter has an inflammation, this may show up in the thermographic image as a rise in temperature. Additionally or alternatively, the additional sensor comprises a visual camera. Similarly, any perceived matching (dis)continuities in the visual image as compared with the original image may support the finding of an edge or other structure, while non-matching images support the absence of such structures. Furthermore, very cheap, compact, versatile and reliable visual cameras are available.
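
One way to use such an additional thermal image is sketched below in Python: a suspected edge from the speckle image is confirmed only if the co-registered thermal image shows a temperature discontinuity in a small neighbourhood around it. The fusion rule, window size and contrast threshold are assumptions for the example, not taken from the patent:

```python
import numpy as np

def edge_supported(thermal, row, col, min_contrast=2.0):
    # Temperature contrast (degrees C) in a 5x5 neighbourhood of the
    # suspected edge; body tissue at about 33-35 degrees C against a cooler
    # background gives a clear step, while a smooth warm surface gives
    # almost no contrast.
    window = np.asarray(thermal, dtype=float)[max(row - 2, 0):row + 3,
                                              max(col - 2, 0):col + 3]
    return float(window.max() - window.min()) >= min_contrast
```

A matching discontinuity raises the likelihood of a real edge; a flat thermal neighbourhood suggests the speckle-image structure was noise.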

In particular embodiments, the control unit is arranged to determine a movement for at least a part of said at least one additional image. Preferably, the control unit is furthermore arranged to use said movement in detecting the location of at least the teats with respect to the control means. Specifically, the movement relates to speed, direction of movement or, preferably, both. Furthermore, the movement is, whenever possible, determined for a structure already recognised in either the additional image or the (original) image.

As mentioned above, animals are live creatures that can move unexpectedly. More in particular, any movement of the milking animal as a whole will have a relatively large effect on the position of the teats. Therefore, fast imaging is important in order to obtain an arrangement that can reliably connect teat cups to teats. But being able to monitor and track such movements while processing the image for detecting the teats also helps in improving the reliability of the milking arrangement. By using movement recognition for the additional sensor, use can be made of the greater ease with which such movements can be determined there, compared with the image detection based on speckle patterns.

In particular, the additional sensor comprises a visual camera arranged to repeatedly obtain an additional image, wherein the control unit comprises visual image recognition software and is arranged to determine a rate of movement for one or more structures as detected in the additional image by the image recognition software. Such motion detection is much more easily done with visual techniques, and its results can be used in correcting any measurements and determinations for the original image and its subsequent processing into a three-dimensional image. For example, when the direction and rate of movement of a structure recognised as a teat tip are determined by means of the additional visual images, it becomes easier to predict where to find matching parts of the speckle pattern in a subsequent image in the cross-correlating step. After all, if a movement is swift, the relative displacement of e.g. a teat tip in the image can be such that the speckle pattern is distorted with respect to the previous image to a relatively high degree. With knowledge of the movement, which can be determined in parallel with the acquiring of the original image, and of course with knowledge of the previously obtained three-dimensional image, the new three-dimensional image, and thus the speckle pattern to be expected, can be predicted to a higher degree, so that the actual determination of the new three-dimensional image can be performed quicker.
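
The prediction described above amounts to shifting the starting point of the cross-correlation search by the visually estimated motion. A minimal Python sketch (names and units are assumptions for the example; pixel positions and velocities would come from the image recognition software):

```python
def predicted_search_centre(prev_px, velocity_px_s, dt):
    # Predicted pixel position of a tracked structure (e.g. a teat tip)
    # in the next speckle image, extrapolated from the motion estimated
    # in the parallel visual-image stream; the cross-correlation then
    # only has to search a small window around this point.
    return (prev_px[0] + velocity_px_s[0] * dt,
            prev_px[1] + velocity_px_s[1] * dt)
```

Searching around the predicted centre instead of the previous position keeps the correlation window small even for a fast-moving animal.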

The invention will now be described in more detail by means of some non-limiting embodiments and drawings, in which:

- Figure 1 very diagrammatically shows a milking arrangement 1 according to the invention, in a perspective side elevational view;

- Figure 2 very diagrammatically shows a part of the milking arrangement 1 in more detail; and

- Figures 3A and 3B diagrammatically show a part of a reference image, and of an actual image, respectively.

Figure 1 very diagrammatically shows a milking arrangement 1 according to the invention, in a perspective side elevational view. The milking arrangement 1 comprises a milking parlour 2, that is presently occupied by a milking animal 3 having an udder 4 with teats 5. The arrangement further comprises a control unit 6 with a robot arm 7, here carrying a teat cup 8, and with a camera 9 having a field of view 10. Herein, the camera 9 is comprised in the control means for controlling movement of the robot arm 7.

The milking parlour 2 may be a spatially fixed milking parlour, or may be a parlour on a rotary platform. The parlour may comprise a single milking box, or may be a part of a multi-parlour arrangement. The robot arm 7 may be a dedicated robot arm for just a single milking parlour 2, or may be a displaceable robot arm to operate in a plurality of milking parlours 2, in particular in a herringbone set-up or for a rotary platform.

The control unit 6 is arranged to control, among other things, the milking process, with settings for milking, quality control and so on, but is in particular also arranged to control operation of the robot arm 7 with the help of information from the camera 9. Camera 9 has a field of view 10 that is arranged to be suitable to acquire a view of a relevant part of the milking animal 3 and/or the milking parlour 2. In particular, the field of view 10 is arranged to be able to comprise, when in use, a view of at least a part of the udder 4, teats 5 and at least one teat cup 8. The camera 9 may be positioned on the robot arm 7, on the control unit 6, directly connected to the milking parlour 2, or at any other suitable position, as long as in use a suitable field of view can be arranged. In each case, the three-dimensional image then provides coordinates in a respective coordinate frame, such as with respect to the robot arm, the control unit or the milking parlour, respectively. In particular, positioning the camera on the robot arm is very suitable for stationary milking parlours. For milking arrangements with a rotary platform on which a plurality of milking parlours has been provided, it may be advantageous to provide the camera in a fixed position with respect to the ground.

Figure 2 very diagrammatically shows a part of the milking arrangement 1 in more detail, in particular the camera 9 and the control unit 6.

The camera 9 comprises an illumination unit 11 and an imaging unit 17. The illumination unit 11 comprises a laser 12, emitting a laser beam 13. A diffuser 14 and a diffractive element 15 turn the beam 13 into an emitted speckled beam 16. The imaging unit 17, having the field of view 10 of the camera 9, comprises imaging optics 18 that create an image of the field of view onto the sensor 19, which provides images to the control unit 6, which in turn comprises an image processor 20 and a memory 21. An additional rgb camera has been indicated by numeral 22, and has a field-of-view 23. It is explicitly noted here that the rgb camera 22 is not a part of the camera 9 that is arranged to acquire images for generating the three-dimensional image. In use, the laser 12 emits a laser beam 13 of a useful wavelength, between 600 and 1200 nm, preferably between 700 and 1000 nm. Near-infrared (NIR) light has the advantage that ambient levels are relatively low and the sensitivity of most eyes is also low. Therefore, inconvenience for e.g. cows is relatively low. The laser beam is then sent through a diffuser 14, such as a piece of ground glass, that generates a speckle pattern in the beam, or a speckled beam, by interference effects within the propagated beam. The diffractive element 15 helps to control the brightness level in a direction transverse to the beam propagation direction. It is noted that the diffractive element 15 may also be positioned between laser 12 and diffuser 14. The diffuser 14 may also be a different kind of speckle pattern generator, such as a holographic plate, or a transparency with an imprinted pattern. For more information regarding the diffuser 14 and the diffractive element 15, and regarding the technical background of this technique, reference is made to WO2007/043036, incorporated by reference, and in particular to pages 5-8.

The speckled beam is emitted and hits an object, in this case an udder 4 with teats 5. A part of the radiation is reflected towards the imaging unit 17, in which the imaging optics 18 form an image of the reflected radiation onto the sensor 19. The sensor 19 could be a ccd sensor, a cmos sensor or the like. Advantageously, the imaging unit comprises a filter transmitting substantially only radiation with the wavelength of the laser source 12, in order to filter out as much ambient light as possible. The image formed on sensor 19 is then sent to the image processor 20, in which it is processed and, among other things, compared to one or more reference images stored in memory 21. The memory 21 could also serve to store, temporarily or permanently, the image from sensor 19, as well as any subsequent image from said sensor. It is noted that Figure 2 is not to scale, and that the illumination unit 11 and the imaging unit 17 are preferably positioned very close to each other. Furthermore, it is not relevant where, that is in which part of the arrangement, the processing takes place. Alternatively, the processing could take place within the imaging unit 17 or in a device physically separate from the control unit 6. Of course, the control unit 6 should be connected to the image processor 20 in such a way that the results and information of the latter can be used by the former in performing its controlling tasks.

With reference numeral 22, an additional sensor in the form of an rgb camera has been indicated. Its field-of-view 23 should overlap with the field-of-view 10 of the camera/imaging unit 17 as much as possible. The rgb camera 22 serves to provide a visual image for supporting the formation of the three-dimensional image and the image and object recognition in said three-dimensional image. Thereto, the rgb camera is operatively connected to the image processor 20. The latter can compare an image from the rgb camera 22 to the actual image from the imaging unit 17 and/or to its subsequent three-dimensional image. If a structure, edge or object is apparently detected in the (actual or) three-dimensional image but something is determined to be wrong, such as a wrong number of detected teats, then the rgb image may be taken into account. For example, assume five teats have been determined in the three-dimensional image, but the rgb image shows a continuous colour and/or intensity at the position (i.e. spatial angle) of one of the teats and its immediate surroundings; then it is safe to conclude that that particular position does not contain a teat. In a similar fashion, it is possible to use the rgb camera 22 as a means to determine movement of objects in the image. It is relatively easy to determine movement by means of an optical (rgb) camera and image. Then, if a structure in the three-dimensional image has been coupled (cross-correlated) to the rgb image, in other words its position and distance are known, a displacement in the rgb image can be coupled to a displacement in the three-dimensional image. Then, the new and distorted speckle pattern can be predicted to some degree. This makes the cross-correlating of the new speckle pattern in the subsequent image from imaging unit 17 easier.

Figure 3A diagrammatically shows a part of a reference image, and Figure 3B diagrammatically shows a part of an actual image taken by the imaging unit 17.
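
The five-teats example can be sketched as a plausibility filter in Python: a candidate from the three-dimensional image is kept only if the rgb image (here reduced to grayscale intensities) shows some local structure at its position. The contrast threshold and window size are invented for the illustration:

```python
import numpy as np

def confirmed_teats(candidates, rgb_gray, min_local_contrast=10.0, window=5):
    # Keep only candidates whose neighbourhood in the visual image shows
    # an intensity discontinuity; a continuous colour/intensity around a
    # candidate position indicates that no teat is present there.
    img = np.asarray(rgb_gray, dtype=float)
    h = window // 2
    kept = []
    for (r, c) in candidates:
        patch = img[max(r - h, 0):r + h + 1, max(c - h, 0):c + h + 1]
        if float(patch.max() - patch.min()) >= min_local_contrast:
            kept.append((r, c))
    return kept
```

A candidate sitting in a visually featureless region is thus rejected, reducing the detected count back to a plausible number.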

The reference image 3A is an image of the speckled beam, taken at a known distance. The image shows the pattern of the speckles as present in space at said distance. Just for convenience, the pattern is shown as completely regular. This greatly simplifies the following discussion. However, it should be noted that a random, non-repetitive pattern is much more convenient in practice, as it allows a part of the actual image to be identified much more easily and with much more certainty. Furthermore, although dots have been indicated, this does not mean that there are only bright spots while all the rest is dark. Rather, the dots indicate brighter parts in the image, while the parts around and in between the dots are darker, but not necessarily completely dark, even apart from ambient light. The actual image 3B, highly idealised in this case, shows how the emitted speckled beam 16 would be imaged when illuminating a part of a milking animal 3. One can see a pattern of the speckles. Some parts are lacking dots, obviously the parts where no reflective object is present. Furthermore, some parts do show a pattern that, however, has been deformed with respect to the original. The deformation of the pattern, in particular the distance between neighbouring speckles, and also the (average) size of the speckles, is an indication of the orientation and the distance, with respect to the camera (or imaging unit), of the surface reflecting the speckle pattern, and can also be compared with the distance at which the reference image 3A was taken. For example, a partial pattern slightly above the centre of Figure 3B shows speckles at about the same distance as in Figure 3A, and also in about a square pattern. This indicates that the reflecting surface is oriented substantially transversely with respect to the camera and at about the same distance as for image 3A. To the left and right thereof, the speckles are closer and closer together, and run off to the top of the page.
This indicates that the surface bends further away, i.e. bends to the back, and furthermore is slightly inclined so as to face the ground. In all, the central part of the image seems to resemble roughly a semi-circle or, better, a half-sphere. Looking more closely, four structures can be found having a more or less cylindrical shape with a rounded tip. These are obviously the teats. At the extreme left and right edges of Figure 3B, similar cylindrical structures can be seen, which can be recognised as the legs, while the large structure at the top of the Figure will be the belly. Note that in this case the image analysis is a kind of two-step analysis. First, a three-dimensional image is created by determining, for as many points or speckles as possible, the spatial coordinates thereof. Then, the three-dimensional image is further analysed in order to extract surfaces and shapes therefrom, by means of image and shape recognition techniques. These are per se deemed known to the skilled person.
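
The first of these two steps, assigning spatial coordinates to the speckles, rests on converting the measured pattern displacement into a depth. The patent gives no formula for this; under the standard structured-light assumption of a pinhole camera at a small baseline from the illumination unit, the conversion can be sketched as follows (focal length and baseline are assumed calibration parameters):

```python
def depth_from_shift(shift_px, f_px, baseline_m, ref_distance_m):
    # Triangulation: the lateral shift of a speckle relative to the
    # reference image taken at ref_distance_m is proportional to the
    # difference of inverse depths, shift = f * b * (1/z - 1/z_ref),
    # so the depth z follows by inverting that relation.
    return 1.0 / (shift_px / (f_px * baseline_m) + 1.0 / ref_distance_m)
```

A zero shift reproduces the reference distance; under this sign convention, positive shifts correspond to surfaces closer than the reference plane.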

It will be clear that in the above Figure 3B, the picture is much clearer than will be the case in practice. For example, there is no dirt, noise or background signal present, and the structures (legs, udder, teats) are all separated and easily recognisable. On the other hand, it is more difficult to actually position the structures in space (i.e. determine the right orientation and distance) with a completely regular pattern, as it is not possible to distinguish parts in a repetitive pattern. Thereto, an irregular pattern is used, for example a random pattern, or a regular though non-repetitive one. With such patterns, it is easier to cross-correlate parts of the collected image with similar parts of the reference image, as no so-called wrapping problem occurs. Reference is again made to WO2007/043036. The above embodiments and drawings are not intended to limit the invention, the scope of which is determined by the appended claims.

Claims

1. Milking arrangement (1) comprising:
- a milking parlour (2) with teat cups (8),
- a robot arm (7) for connecting the teat cups to the teats (5) of a milking animal (3),
- a control means (6, 9; 11, 17) for controlling the robot arm,
wherein said control means comprise:
- a coherent light source (12) arranged to emit a coherent beam (13) of optical radiation,
- a speckle pattern generator (14) arranged to impart a speckle pattern to said beam, thereby forming a speckled beam (16),
- a camera (9) for repeatedly obtaining an image of a reflected part of said beam, and
- a control unit (6) arranged to form a three-dimensional image from said image by comparing said image with at least one reference reflection image of said beam and taken with said camera, and statistically cross-correlating said speckle pattern in said image with said speckle pattern in said at least one reference reflection image, wherein the control unit is further arranged to identify said teats and to detect the location thereof with respect to the camera, on the basis of said three-dimensional image.
2. Milking arrangement according to claim 1, wherein the control unit is further arranged to detect at least one of said teat cups and the position thereof with respect to the camera, on the basis of said three-dimensional image.
3. Milking arrangement according to any preceding claim, wherein the control means has a single camera that is arranged to repeatedly obtain an image of a reflected part of said beam, from which image the control unit forms a three-dimensional image.
4. Milking arrangement according to any preceding claim, wherein the speckle pattern generator is arranged to generate a constant and random speckled pattern.
5. Milking arrangement according to any preceding claim, further comprising an additional sensor different from said camera, said sensor being arranged to obtain at least one additional image of a reflected part of said beam in a way that differs from the way said camera obtains said image, and wherein the control unit is arranged to use said additional image in identifying at least said teats.
6. Milking arrangement according to claim 5, wherein the additional sensor comprises a thermal image camera.
7. Milking arrangement according to claim 5 or 6, wherein the additional sensor comprises a visual camera.
8. Milking arrangement according to any of claims 5-7, wherein the control unit is arranged to determine a movement for at least a part of said at least one additional image, and preferably to use said movement in detecting the location of at least the teats with respect to the control means.
PCT/NL2013/050481, priority date 2012-07-20, filing date 2013-07-02: Milking arrangement, WO2014014341A1 (en)

Priority Applications (2)

EP12177270.1, priority date 2012-07-20
EP12177270, priority date 2012-07-20

Publications (1)

WO2014014341A1 (en), published 2014-01-23

Family

ID=48794166


Country Status (3)

Country Link
DE (1) DE112013003612T5 (en)
SE (1) SE1550114A1 (en)
WO (1) WO2014014341A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2000011939A2 (en) * 1998-08-31 2000-03-09 Delaval Holding Ab An apparatus and a method for monitoring the motion activity of an animal
WO2005094565A1 (en) * 2004-03-30 2005-10-13 Delaval Holding Ab Arrangement and method for determining positions of the teats of a milking animal
WO2007043036A1 (en) 2005-10-11 2007-04-19 Prime Sense Ltd. Method and system for object reconstruction
WO2009093965A1 (en) * 2008-01-22 2009-07-30 Delaval Holding Ab Arrangement and method for determining positions of the teats of a milking animal

Also Published As

Publication number Publication date
SE1550114A1 (en) 2015-02-04
DE112013003612T5 (en) 2015-06-03

Legal Events

121 Ep: the epo has been informed by wipo that ep was designated in this application (ref document number: 13737690, country of ref document: EP, kind code: A1)
WWE Wipo information: entry into national phase (ref document numbers: 1120130036126 and 112013003612, country of ref document: DE)
122 Ep: pct application non-entry in european phase (ref document number: 13737690, country of ref document: EP, kind code: A1)