EP1537531A1 - Bildaufnahmesystem und -verfahren zur auswertung der körperlichen verfassung - Google Patents

Bildaufnahmesystem und -verfahren zur auswertung der körperlichen verfassung

Info

Publication number
EP1537531A1
EP1537531A1 (application EP03741046A)
Authority
EP
European Patent Office
Prior art keywords
interest
region
cow
indicative
illuminated
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP03741046A
Other languages
English (en)
French (fr)
Other versions
EP1537531B1 (de)
Inventor
David Sharony
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vet-Tech Ltd
Vet Tech Ltd
Original Assignee
Vet-Tech Ltd
Vet Tech Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vet-Tech Ltd, Vet Tech Ltd filed Critical Vet-Tech Ltd
Publication of EP1537531A1 publication Critical patent/EP1537531A1/de
Application granted granted Critical
Publication of EP1537531B1 publication Critical patent/EP1537531B1/de
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • AHUMAN NECESSITIES
    • A01AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01KANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K29/00Other apparatus for animal husbandry
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing

Definitions

  • a method for monitoring the body condition of an animal comprising: providing reference data representative of the body condition scales and corresponding values of a predetermined measurable parameter indicative of the curvature of a region of interest on the body; imaging the region of interest by illuminating an array of spaced-apart locations on the body within the region of interest, collecting light returned from the illuminated locations, and generating data indicative thereof; processing said generated data to obtain a three-dimensional representation of the region of interest and calculate a value of the predetermined measurable parameter from said three-dimensional representation; and utilizing said reference data to determine the body condition scale corresponding to the calculated value of said predetermined measurable parameter.
  • the shift is indicative of the height of the respective point in the map (3D representation), and is determined utilizing data indicative of the detectors' location with respect to each other and relative to the region of interest, or the single detector's location relative to the region of interest and to the trajectories of the incident light components.
  • a method for monitoring the condition of an animal comprising: imaging the cow while it marches along a predetermined path and generating data indicative of the acquired images; and analyzing said data to identify the existence of a certain pattern of locomotion or in-coordination in the cow's marching.
  • the waist part of the cow's body is of interest because it is indicative of the transverse processes of the lumbar vertebrae and the spinous processes of the lumbar vertebrae.
  • the BCS values of 3.2 and 3.1, measured respectively at the first region of interest (the transverse and spinous processes of the lumbar vertebrae) and the second region of interest (the tail part of the cow), indicate that the cow is accumulating a surplus in its energy balance; conversely, if BCS values of 3.1 and 3.2 are measured at, respectively, the first and second regions of interest, this indicates that the cow is accumulating a deficit in its energy balance.
  • a system for monitoring the body condition of an animal comprising:
  • an optical device including an illuminating assembly operable to produce structured light in the form of an array of spatially separated light components to thereby illuminate an array of locations within a predetermined region of interest on the body part, and a light detection assembly operable for acquiring at least one image of the illuminated body part by collecting light scattered therefrom and generating data indicative of the acquired image;
  • a control unit connectable to the optical device, the control unit comprising a memory for storing reference data representative of the body condition scales and corresponding values of a predetermined measurable parameter that is indicative of the surface relief of the predetermined region of interest of the body part; and a data processing and analyzing utility preprogrammed for processing the data indicative of the acquired image to obtain a three-dimensional representation of the imaged region, calculate a value of the measurable parameter for the imaged body part, and analyze the calculated value with respect to the reference data to thereby determine the body condition scale of the specific animal.
  • Figs. 2A to 2C illustrate three different examples, respectively, of an illuminating assembly suitable to be used in the system of Fig. 1;
  • Fig. 3A schematically illustrates the region of interest on the body part of the dairy cow illuminated by the illuminating assembly of either one of Figs. 2B and 2C;
  • Fig. 3B illustrates the experiment of imaging the region of interest with the illuminating assembly of Fig. 2A;
  • Figs. 5A-5C exemplify the principles of calculation of a specific measurable parameter indicative of the curvature of the region of interest; and
  • Fig. 5D schematically illustrates the spots' pattern on the dorsal and/or the rear part of a cow.
  • Referring to Fig. 1, there is schematically illustrated an imaging system 100 according to the invention for determining the BCS of a dairy cow DC.
  • the cow is typically provided with an identification code carrying tag (not shown) attached, for example, to its leg (e.g., front leg), neck, ear or any other place.
  • the imaging system 100 includes an optical device 102 and a control unit 110.
  • the optical device 102 includes an illuminating assembly 104 and a detection assembly 105.
  • the detection assembly 105 includes a pixel-array detector 106, which is preferably a video camera, and may optionally include an additional pixel-array detector 108 (shown in dashed lines).
  • the illuminating assembly 104 is constructed and operated to produce structured light in the form of an array (one- or two-dimensional array) of light components, as will be described more specifically further below with reference to Figs. 2A-2C.
  • the camera(s) 106 is accommodated and oriented with respect to the body (a site where the body is to be located during the monitoring procedure) so as to acquire images of a predetermined region of interest.
  • Figs. 2A - 2C illustrate three non-limiting examples of the implementation of the illuminating assembly 104.
  • the illuminating assembly 104 includes a one-dimensional array of light-emitting elements (e.g., lasers) operable to produce an array of spatially separated light beams, generally at 112A, to thereby illuminate an array of spaced-apart locations (e.g., spots) within a region of interest.
  • the illuminating assembly is a so-called "laser beam box", for example producing 40 laser beams with the beam axes spaced, for example, 2cm from each other.
  • the box 104 at its front surface (by which it faces the cow) is formed with an optical window 107 (e.g., glass window).
  • a power supply of 4.5V and 500mA can be used.
  • the illuminating assembly 104 is composed of a two-dimensional array of light-emitting elements, e.g. lasers, generally at 104A, each operable to produce a light beam 112A, to thereby illuminate a matrix (two-dimensional array) of spaced-apart locations (e.g., spots) within a region of interest.
  • the illuminating assembly 104 comprises a single light emitting element 104B generating a beam of light 113, and a mask (grid) 116 in the form of an array (one-dimensional array, or two-dimensional array as shown in the present example) of transmitting regions (e.g., holes) 118 spaced by non-transparent (blocking) regions.
  • the mask (grid) 116 splits the emitted light beam 113 into a two-dimensional array of spatially separated light components 112B to thereby produce a two-dimensional array of spaced-apart illuminated locations within the region of interest.
  • the light components produced either by the array of light-emitting elements 104A or by the array of holes 118 are arranged in a pre-defined pattern, e.g., a matrix shape consisting of straight lines and rows.
  • Fig. 3A schematically illustrates the rear part of the cow (one example of the suitable region of interest) with a matrix of illuminated locations (spots) 120 covering the entire imaged area.
  • Fig. 3B illustrates the experiment of imaging the region of interest with one-dimensional array of light beams: shift of the illuminated spots' locations from those obtainable on a flat surface is associated with the curvature of the imaged surface.
  • the main relevant parts for the scoring of the cow are the pin bones (Tuber ischii), the hook bones (Tuber coxae), the thurl and the sacral ligament.
  • the central area of the illuminated body part along the tail head is typically convex, while the regions at both sides of the tail head may be convex, flat, or concave.
  • the degree of concavity of these regions is correlated with the body condition of a cow.
  • a skilled person inspects the rear part of the cow visually, and sometimes by manual assessment, to estimate the degree of concavity and/or convexity of the relevant areas and determine the BCS of the cow, which reflects the subcutaneous amount of adipose tissue and thus indicates the energy balance of the cow.
  • the present invention uses the imaging system of Fig. 1 (with one or two cameras) and a specific image processing technique.
  • the detection assembly preferably includes a video camera, operable to acquire 25 frames per second.
  • the camera may be a conventional industrial color video camera (e.g., 5x5x10cm) equipped with appropriate focusing optics, aperture, shutter speed and white balance means.
  • the camera is preferably positioned at a 90-degree elevation and at a 145cm distance from the cow's back to achieve a focused image (the laser line being in the middle of the image frame).
  • the camera is calibrated once during the installation.
  • the focus is set to an average distance of the cow back.
  • the exposure is set to maintain contrast between the laser spots and the brightest possible areas of the otherwise dark scene.
  • the camera calibration stage includes the following steps: acquiring a set of images at predetermined distances and storing them (in the memory utility of the control unit); processing the images to detect the laser spots; and calibrating the processing (software model) to create a depth of focus that gives the desired precision of measurement of the surface profile depth (with about 1mm tolerance).
  • the determined depth of the spots' locations is then translated to a three-dimensional curve.
  • the optical device (illuminating assembly 104 and detection assembly 105 shown in Fig. 1) is operated to obtain data indicative of one or more images of the region of interest (step A). This is implemented by illuminating an array (one- or two-dimensional array) of locations within the region of interest on the cow's body, collecting light returned (scattered) from the region of interest by one or two cameras, and generating output indicative of the so-acquired images.
  • This output data is received at the control unit 110 and processed by the data processing and analyzing utility (software) appropriately preprogrammed (e.g., with real time grabbing software with an event indicator whenever a new frame is grabbed) to control the imaging process and obtain three-dimensional representation of the body surface within the region of interest (step B), and further process the 3D representation to determine the BCS of the imaged cow (step C).
  • This is implemented by calculating a value of a predetermined measurable parameter indicative of the curvature (surface relief) of the region of interest and utilizing previously prepared (and stored in the memory of the control unit) reference data in the form of BCS scales and corresponding values of the predetermined measurable parameter to analyze the calculated value and determine the corresponding BCS scale.
  • a shift, caused by the curvature of the surface, is determined between the two images for each of the illuminated locations (spots) on the surface of the cow's body. If a single image is acquired, such a shift is the distance between the actual position of the illuminated location with respect to the other locations and the "theoretical" position the corresponding location would have if the surface were substantially flat (i.e., the corresponding location on the body surface along the trajectory of the corresponding light component).
  • this shift is determined as a distance between the two illuminated locations of a matching pair in the two images.
  • the shift is indicative of the height of the respective point in the topography map (three-dimensional representation of the region of interest), which is determined by utilizing data indicative of the detectors' location with respect to each other and relative to the region of interest, or the single detector's location relative to the region of interest and to the illuminator trajectory.
  • the measurable parameter is indicative of the curvature of the region of interest, i.e., of a topographic map (surface relief) of the region of interest, and is actually representative of the volume (depth) of the region of interest.
  • This curvature is determined with respect to a predefined reference plane (RP in Fig. 1), which in the present example is perpendicular to the line between the camera and the cow's rear part and is tangential to the region of interest, namely, to the dorsal and/or the rear part of the cow at the pin bones and tail head.
  • the reference plane is selected as a plane in the vicinity of the tail head.
  • the system of the present invention is installed at a predetermined location for monitoring the dairy cow's condition as it passes along a predetermined (substantially straight) path.
  • the cow is assigned a unique identification code that is presented on a tag attached to the cow.
  • the ID tag may be an optical tag (barcode), an RF tag (carrying an RF resonance circuit), a magnetic tag, or use any other automatic ID technique.
  • the construction and operation of the tag do not form part of the invention.
  • Such a tag may be of any known suitable type, and therefore need not be specifically described, except to note that it is preferably attached to a predetermined location on the cow, for example the front leg of the cow.
  • When the cow moves along the predetermined path and its front leg arrives at the location of the ID reader, the cow is identified and a respective signal is generated (by the ID reader) to actuate the imaging system either immediately or a certain time thereafter, depending on the distance between the ID location and the region of interest and a prediction of the cow's movement speed.
  • the imaging system or at least its optics is movable with respect to the cow location.
  • the system of the present invention may utilize a remote control unit that may be wirelessly connected to the optical device. The system may operate to grab the required number of images and transmit data indicative of the images, as well as data indicative of the cow's ID, to a remote (central) location where the data is appropriately processed.
  • a sequence of images of the moving cow is acquired so as to thereby identify an in-coordination or locomotion problem of the cow marching.
  • the existence of a certain pattern of this locomotion and/or in-coordination problem is indicative of a certain problem (disease), for example limping, or a neurological disorder associated with nervous system diseases, such as "mad cow" disease, CCN (cerebrocortical necrosis), tumors, parasitoses, meningitis, rabies, trauma, compression and intoxicoses of the animal.
  • Fig. 5A shows a 3D topography map TP reconstructed from the map(s) of points grabbed by one or two cameras from the region of interest (back of the cow). These points correspond to the scattering of the illuminated locations produced by the structured incident light.
  • a matrix of 120 laser sources can be used as the illuminating assembly.
  • the measurable parameter may be a distance h (height) between the reference plane RP and the point P1 in the topography map most distant from the reference plane, or a cross-sectional area CA of a cross-sectioned segment ES of the map in a plane P' perpendicular to the reference plane and defined by a pair of most distant points P1 and P2 at opposite sides of the central plane of the image.
  • this parameter is the distance between the reference plane RP and the point on the line-segment ES maximally distant from the reference plane RP.
  • this parameter is the cross-sectional area CA confined between the envelope segment ES and its projection on the reference plane AC.
  • the measurable parameter is the volume of at least a part of the topography map, confined between the relevant part of the map and its projection onto the reference plane, wherein the relevant part of the topography map is defined by a rectangle of a certain area (the entire structured light grid LG or a part of it) around the line segment ES.
  • This calculation practically comprises the integration of the height values of the illuminated locations (centers of these locations) over the image within the certain area, or, equivalently, summing the distances between the envelope segment and the reference plane over all the center points of the illuminated locations within the certain area (a numerical sketch of this integration is given after this list).
  • the CFM value may be given in volume units, e.g. cubic centimeters (cm³).
  • the accuracy in calculating the centers of the illuminated locations is estimated to be about a tenth of a pixel.
  • the accuracy in height is estimated to be about 1/2000 of the field-of-view of the camera, which for a measured volume of 600mm × 600mm × 600mm is about 0.5mm, less than the expected error due to the fur of a cow.
  • the maximal expected depth of concavity is about 150mm. If the BCS value should be calculated with accuracy of 0.25, there are 16 different values between BCS 1 and 5. Thus, accuracy in height of 150mm/16 or 9.375mm is sufficient for the purposes of the present invention.
  • the estimated accuracy is 5 times better.
  • the system today deals with the following 13 BCS scale values: 1.00, 1.50, 2.00, 2.25, 2.50, 2.75, 3.00, 3.25, 3.50, 3.75, 4.00, 4.50, and 5.00.
  • With a resolution of one decimal (a score step of 0.1), 50 BCS scale values can be achieved (1.00, 1.10, 1.20, ..., 4.80, 4.90, 5.00).
  • Fig. 5D schematically illustrates the spots' pattern on the rear part of a cow. It should be understood that, actually, imaging and analyzing the surface relief at one side of the cow's tail is sufficient for the purposes of the present invention.
  • the rear part of a cow may comprise deep valleys (for thin cows of low BCS) or shallow valleys, or may have no valleys at all (for normal cows with moderate BCS). Accordingly, the CFM can be positive, zero or negative.
  • the inventors have found that for cows with BCS of about 3.50, the value of CFM is about zero. Thus, for cows of the BCS below 3.50 the value of CFM is negative, while cows of BCS above 3.50 have a positive value of CFM.
  • Scanning cows with known BCS values makes it possible to calculate the CFM value resulting from the image processing for each scan and to match it with the given BCS value.
  • a correlation between CFM and BCS values can be determined.
  • This correlation is not necessarily a linear transformation and might not be representable by any mathematical formula at all; it is simply matched experimentally.
  • the match is represented in a table of corresponding pairs, where one entry of each pair is a CFM value and the other is the matching BCS value (a lookup sketch is given after this list).
  • Fig. 6 exemplifies a flow diagram of the operational stages of the image processing, considering image acquisition with either two cameras or a single video camera.
  • the data to be processed is indicative of two images of the cow's rear part concurrently acquired by two cameras with two different angles of light collection.
  • a calibration stage is carried out, in order to compensate for internal variables of the camera(s) and for external variables of the camera(s) (step I).
  • Such internal variables include magnification, calibration and distortion, while external variables of the cameras include, for example, the relative angles of the cameras in space.
  • calculation of the centers of all illuminated locations is carried out by suitable image processing (e.g., including pattern recognition) - step II.
  • the spot's center can be identified as the brightest point or the center of gravity of the spot (a minimal centroid sketch is given after this list).
  • processing of data received from the two cameras includes finding the matching pairs of illuminated locations, i.e., two images of the same illuminated location.
  • the processing starts with finding the center of each of the illuminated locations in each of the two images.
  • the center is simply the geometrical center of each spot.
  • the finding of the center can be done by any known suitable method, for example by detecting extreme locations of the illuminated region along two mutually perpendicular axes and calculating the center of the so-formed geometrical structure.
  • the central point is defined as the point of maximal light intensity along the illuminated line.
  • In step III, matching pairs between the central points of the illuminated locations in the two images are determined, i.e., the location of each illuminated spot in the first image is matched with the corresponding location in the second image.
  • imaging of a flat surface (e.g., a flat wall) using a matrix of illuminated spots would result in an image composed of a matrix of spots with the same pattern as that of the matrix of incident light components, e.g., a two-dimensional linear array.
  • when imaging the cow's body, the locations of some of the spots are shifted from those in the flat-surface image, due to the surface curvature.
  • In step IV, the 3D coordinates of each illuminated location on the cow's surface are calculated, utilizing the data representative of the spots' center locations (or, in the case of two cameras, data representative of the two-dimensional coordinates of the pairs of central points of the matched illuminated locations) and data representative of the three-dimensional (3D) location of the camera(s).
  • the triangulation technique is based on measuring a shift (called parallax) of an imaged object in two images. More specifically, when the object is imaged by two cameras distanced from each other (or by one camera taking two different shots at two places far apart), the relative place of the object is shifted between the two images, i.e., the object's image changes its location relative to the background. Measuring this shift allows for determining the distance of the object from the background or from the camera(s), or for finding its location in space.
  • the 3D locations of all the illuminated spots within the region of interest are calculated, assuming that, in a first approximation, the peripheral illuminated spots may serve as the background for the calculation. If the peripheral illuminated spots are not shifted due to the surface curvature, or if such a shift is negligible, the use of a single camera is sufficient to calculate the deviation of each of the interior illuminated spots from its assumed position in the absence of surface curvature. This shift, together with the locations of the camera and the light source relative to the region of interest, enables calculation of the "height" of each illuminated spot with respect to an imaging plane (a sketch of such a height-from-shift calculation is given after this list).
  • each point's depth is calculated by measuring the distance between this point and a straight reference line between the two pin bones (Tuber ischii), as shown in Fig. 5D (see also the point-to-line sketch after this list). To this end, measuring one side is sufficient; when data from both sides is available, the higher side is taken into account.
  • the so-obtained 3D coordinates are then used to calculate Curvature Factor Measure (CFM) indicative of the cow's body condition score - step V.
  • a two-dimensional (2D) representation of the cow's 3D rear part is obtained, for example in the following manner.
  • the reference plane RP is selected.
  • the points calculated in the previous stage are drawn on the selected reference plane.
  • their coordinates may need to be transformed according to the coordinate system of the chosen plane.
  • the first two coordinates of each point (Xi, Yi) represent its place within the reference plane RP, while the third coordinate (Zi) denotes the height of the point above/beneath the reference plane RP.
  • This data indicative of the points' heights is used to calculate the CFM.
  • the 3D virtual surface is thus the basis for the CFM measurements.
  • the BCS is then obtained from the calculated CFM value and the reference data, as described above (step VI).
  • for a single camera, the image processing is generally similar to that of two cameras, and differs therefrom in the following two modifications: (a) finding of matching pairs of points (centers of the illuminated spots) is eliminated, since there is a single image; and (b) calculation of the 3D location of a given illuminated spot on the cow's body part is carried out by taking into account the location of the light source and the direction of the light beam from the light source to the relevant light spot along the trajectory of the beam.
  • These modifications actually affect the determination of the shift to calculate the 3D location of each illuminated spot, i.e. to reconstruct the topographic map.
  • the shift between the actual location of the illuminated spot and the "theoretical" location the corresponding area would have if the surface were substantially flat is determined in order to calculate the CFM.
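
The sketches below (in Python, using numpy) are illustrative additions, not part of the patent text; they outline, under explicitly stated assumptions, how the processing steps described above could be realized. First, a minimal sketch of step II: locating a spot's center as the intensity-weighted centroid ("center of gravity") of the pixels inside a bounding box. The bounding box and the synthetic frame are hypothetical, and the spot-detection step itself is not shown.

    # Minimal sketch (not the patent's code): a spot center located as the
    # intensity-weighted centroid ("center of gravity") of a bounded pixel patch.
    import numpy as np

    def spot_centroid(image, region):
        # image  -- 2D array of grayscale pixel intensities
        # region -- (r0, r1, c0, c1) hypothetical bounding box around one spot
        r0, r1, c0, c1 = region
        patch = image[r0:r1, c0:c1].astype(float)
        total = patch.sum()
        if total == 0.0:
            # fall back to the geometric center of the bounding box
            return ((r0 + r1 - 1) / 2.0, (c0 + c1 - 1) / 2.0)
        rows, cols = np.mgrid[r0:r1, c0:c1]
        return ((rows * patch).sum() / total, (cols * patch).sum() / total)

    # Synthetic 9x9 frame with one spot whose brightest pixel is at row 4, column 5
    frame = np.zeros((9, 9))
    frame[3:6, 4:7] = [[1, 2, 1], [2, 8, 2], [1, 2, 1]]
    print(spot_centroid(frame, (0, 9, 0, 9)))   # ~(4.0, 5.0), sub-pixel precision

The weighted centroid is one way to reach the sub-pixel precision (about a tenth of a pixel) mentioned in the description; taking the brightest pixel alone would be limited to whole-pixel precision.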
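
Next, a hedged sketch of the single-camera height-from-shift calculation used in step IV. The geometry here is an assumption made for illustration (laser beam perpendicular to the reference plane RP, camera at a known baseline from the laser and at a known distance from RP, and the pixel shift already converted to millimetres measured on RP); the patent text does not spell out a formula.

    # Hedged sketch (assumed geometry): height of a spot above the reference
    # plane RP from the shift of its image relative to the "theoretical"
    # flat-surface position.

    def height_from_shift(shift_mm, camera_distance_mm, baseline_mm):
        # Similar triangles: shift = baseline * h / (distance - h)
        #                =>  h = shift * distance / (baseline + shift)
        return shift_mm * camera_distance_mm / (baseline_mm + shift_mm)

    # Illustrative numbers: 145 cm working distance (as in the text), an assumed
    # 30 cm camera-laser baseline, and a 25 mm shift measured on the reference plane.
    print(height_from_shift(25.0, 1450.0, 300.0))   # ~111.5 mm above RP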
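
A small sketch of the depth measure described for Fig. 5D: each reconstructed point's depth taken as its perpendicular distance from the straight reference line between the two pin bones (Tuber ischii). Millimetre coordinates and the example points are assumptions.

    # Sketch: perpendicular distance from a reconstructed 3D point to the line
    # through the two pin bones.
    import numpy as np

    def depth_from_pin_bone_line(point, pin_left, pin_right):
        p = np.asarray(point, dtype=float)
        a = np.asarray(pin_left, dtype=float)
        b = np.asarray(pin_right, dtype=float)
        line = b - a
        # distance = |(p - a) x (b - a)| / |b - a|
        return float(np.linalg.norm(np.cross(p - a, line)) / np.linalg.norm(line))

    # Pin bones 250 mm apart along the x-axis; a point lying 40 mm below their line
    print(depth_from_pin_bone_line((120.0, 0.0, -40.0),
                                   (0.0, 0.0, 0.0),
                                   (250.0, 0.0, 0.0)))   # 40.0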
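
A hedged sketch of the CFM integration (step V): the signed heights of the spot centers relative to the reference plane are summed over the chosen area and weighted by the grid cell area, giving a volume in cubic centimetres. The 2cm spot spacing follows the example in the text; the sample height grid is invented, and the sign convention (valleys negative, so that CFM is negative for cows below a BCS of about 3.50) follows the description above.

    # Hedged sketch of the CFM ("Curvature Factor Measure") volume integration.
    import numpy as np

    def cfm_cm3(heights_mm, spacing_mm=20.0):
        # heights_mm -- 2D array of spot heights relative to RP (negative = valley)
        # spacing_mm -- distance between neighbouring spots (2 cm in the example)
        cell_area_mm2 = spacing_mm * spacing_mm
        volume_mm3 = float(np.sum(heights_mm)) * cell_area_mm2
        return volume_mm3 / 1000.0   # mm^3 -> cm^3

    # Invented 5x8 grid of heights with a deeper valley around the tail head
    grid = -5.0 * np.ones((5, 8))
    grid[:, 3:5] = -30.0
    print(cfm_cm3(grid))   # -180.0 cm^3: negative CFM, i.e. BCS below about 3.50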
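
Finally, a sketch of the experimentally matched CFM-to-BCS lookup (step VI). The table entries are placeholders invented for illustration; only the anchor of CFM about zero at BCS about 3.50 is taken from the text. Linear interpolation between entries is one possible design choice; the description only requires that measured CFM values be matched against the stored table.

    # Sketch of the CFM-to-BCS table lookup with placeholder calibration entries.
    import numpy as np

    CFM_BCS_TABLE = [          # (CFM in cm^3, BCS), sorted by CFM -- placeholders
        (-400.0, 2.00),
        (-200.0, 2.75),
        (0.0, 3.50),
        (150.0, 4.00),
        (300.0, 4.50),
    ]

    def bcs_from_cfm(cfm_value):
        cfms = np.array([c for c, _ in CFM_BCS_TABLE])
        scores = np.array([b for _, b in CFM_BCS_TABLE])
        # linear interpolation between table entries; clamped outside the range
        return float(np.interp(cfm_value, cfms, scores))

    print(bcs_from_cfm(-180.0))   # ~2.83 with these placeholder entries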

Landscapes

  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Environmental Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Animal Husbandry (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Image Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)
  • Optical Recording Or Reproduction (AREA)
EP03741046A 2002-07-25 2003-07-27 Bildaufnahmesystem und -verfahren zur auswertung der körperlichen verfassung Expired - Lifetime EP1537531B1 (de)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
IL15091502A IL150915A0 (en) 2002-07-25 2002-07-25 Imaging system and method for body condition evaluation
IL15091502 2002-07-25
PCT/IL2003/000622 WO2004012146A1 (en) 2002-07-25 2003-07-27 Imaging system and method for body condition evaluation

Publications (2)

Publication Number Publication Date
EP1537531A1 true EP1537531A1 (de) 2005-06-08
EP1537531B1 EP1537531B1 (de) 2007-03-28

Family

ID=29596369

Family Applications (1)

Application Number Title Priority Date Filing Date
EP03741046A Expired - Lifetime EP1537531B1 (de) 2002-07-25 2003-07-27 Bildaufnahmesystem und -verfahren zur auswertung der körperlichen verfassung

Country Status (7)

Country Link
US (1) US7853046B2 (de)
EP (1) EP1537531B1 (de)
AT (1) ATE358304T1 (de)
AU (1) AU2003281732A1 (de)
DE (1) DE60312869T2 (de)
IL (1) IL150915A0 (de)
WO (1) WO2004012146A1 (de)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2027770A2 (de) 2007-08-22 2009-02-25 Icerobotics Limited Verfahren und Vorrichtung zur automatischen Einstufung des Viehzustands
NL1034292C2 (nl) 2007-08-27 2009-03-02 Maasland Nv Systeem en werkwijze voor het beheren van een groep dieren.
WO2010063527A1 (en) * 2008-12-03 2010-06-10 Delaval Holding Ab Arrangement and method for determining a body condition score of an animal
EP3574751A1 (de) * 2018-05-28 2019-12-04 Bayer Animal Health GmbH Vorrichtung zur fliegenbekämpfung
US10639014B2 (en) 2015-02-27 2020-05-05 Biondi Engineering Sa Method and relevant apparatus for the determination of the body condition score, body weight and state of fertility

Families Citing this family (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8565490B2 (en) * 2004-07-01 2013-10-22 David A. Krien Computerized imaging of sporting trophies and method of providing a replica
US20110028212A1 (en) * 2004-07-01 2011-02-03 David Krien Computerized Imaging of Sporting Trophies and Method of Providing a Replica
KR100685517B1 (ko) 2005-06-09 2007-03-09 대한민국 신체충실지수 판정 보조기
US20070008408A1 (en) * 2005-06-22 2007-01-11 Ron Zehavi Wide area security system and method
NL1032435C2 (nl) 2006-09-05 2008-03-06 Maasland Nv Inrichting voor het automatisch melken van een melkdier.
DE102007036294A1 (de) 2007-07-31 2009-02-05 Gea Westfaliasurge Gmbh Vorrichtung und ein Verfahren zum Bereitstellen von Informationen über Tiere beim Durchlaufen eines Tierdurchganges
FI123049B (fi) * 2007-09-03 2012-10-15 Mapvision Ltd Oy Tallentava konenäköjärjestelmä
DE102007050857A1 (de) * 2007-10-24 2009-04-30 Weber Maschinenbau Gmbh Breidenbach Fettauflagenvermessungseinrichtung
CA2711388C (en) * 2008-01-22 2016-08-30 Delaval Holding Ab Arrangement and method for determining the position of an animal
BRPI1008771A2 (pt) 2009-02-27 2019-07-02 Body Surface Translations Inc estimativa de parâmetros físicos usando representações tridimensionais
WO2010127023A1 (en) * 2009-05-01 2010-11-04 Spicola Tool, Llc Remote contactless stereoscopic mass estimation system
US8369566B2 (en) * 2009-05-01 2013-02-05 Texas Tech University System Remote contactless stereoscopic mass estimation system
AU2010268755B2 (en) * 2009-06-29 2014-10-16 Australian Pork Limited Assessment of animals and carcasses
DE102010049310A1 (de) * 2010-10-22 2012-04-26 Weber Maschinenbau Gmbh Breidenbach Abtastvorrichtung und Verfahren zum Ermitteln der Kontur eines Objekts
KR20120052769A (ko) * 2010-11-16 2012-05-24 한국전자통신연구원 가상머신을 동기화시키기 위한 장치 및 방법
WO2012138290A1 (en) * 2011-04-05 2012-10-11 Delaval Holding Ab Animal handling arrangement and method
CN102626306A (zh) * 2012-03-30 2012-08-08 徐�明 用于自动评定奶牛体况分的方法
US9167800B2 (en) 2012-06-04 2015-10-27 Clicrweight, LLC Systems for determining animal metrics and related devices and methods
US8787621B2 (en) * 2012-06-04 2014-07-22 Clicrweight, LLC Methods and systems for determining and displaying animal metrics
NL2009923C2 (en) * 2012-07-20 2014-01-23 Lely Patent Nv System for detecting and determining positions of animal parts.
US20140029808A1 (en) * 2012-07-23 2014-01-30 Clicrweight, LLC Body Condition Score Determination for an Animal
EP2698763A1 (de) * 2012-08-14 2014-02-19 Hölscher & Leuschner GmbH & Co. Verfahren zur Analyse eines lebenden Nutztieres
DE102012217089A1 (de) * 2012-09-21 2014-03-27 Siemens Aktiengesellschaft Schichtdarstellung von Volumendaten
DE202013002483U1 (de) 2013-03-15 2014-06-16 Csb-System Ag Vorrichtung zur Vermessung einer Schlachttierkörperhälfte
US9501719B1 (en) * 2013-10-28 2016-11-22 Eyecue Vision Technologies Ltd. System and method for verification of three-dimensional (3D) object
NL2011952C2 (en) * 2013-12-12 2015-06-15 Lely Patent Nv Method and device to monitor growth of an animal, in particular a calf.
JP6362068B2 (ja) * 2014-02-17 2018-07-25 キヤノン株式会社 距離計測装置、撮像装置、距離計測方法、およびプログラム
NL2012540B1 (en) * 2014-04-01 2016-02-15 Lely Patent Nv An arrangement and method to determine a body condition score of an animal.
EP2957861A1 (de) * 2014-06-17 2015-12-23 Expert Ymaging, SL Vorrichtung und Verfahren zur automatischen Berechnung von Parametern eines Objekts
CN106604641A (zh) * 2014-08-29 2017-04-26 Csb-系统公司 用于在肉畜上评价动物保护遵守的装置和方法
ITUB20150404A1 (it) * 2015-02-27 2016-08-27 Ingenera Sa Metodo e relativo apparato per la determinazione senza contatto del punteggio di condizione corporea e del peso di un animale.
JP6357140B2 (ja) * 2015-09-18 2018-07-11 Psソリューションズ株式会社 画像判定方法
US10477827B2 (en) 2016-08-17 2019-11-19 Technologies Holdings Corp. Vision system for teat detection
US10817970B2 (en) * 2016-08-17 2020-10-27 Technologies Holdings Corp. Vision system with teat detection
US9807972B1 (en) * 2016-08-17 2017-11-07 Technologies Holdings Corp. Vision system with leg detection
US9807971B1 (en) 2016-08-17 2017-11-07 Technologies Holdings Corp. Vision system with automatic teat detection
NL2017365B1 (en) 2016-08-25 2018-03-01 Lely Patent Nv Method and device to detect lameness of a cow
PT3531373T (pt) * 2018-02-26 2022-08-16 Expert Ymaging Sl Método e dispositivo para a caracterização de espécimes vivos à distância
EP3787500A4 (de) 2018-05-02 2021-09-29 Geissler Companies, LLC System und verfahren zur bestimmung des oberflächenareals eines tierischen körpers und folgenden bestimmung des gesundheitszustandes eines tieres
CN110505412B (zh) * 2018-05-18 2021-01-29 杭州海康威视数字技术股份有限公司 一种感兴趣区域亮度值的计算方法及装置
CA3115612A1 (en) * 2018-10-10 2020-04-16 Delaval Holding Ab Animal identification using vision techniques
NL2023104B1 (en) * 2019-05-10 2020-11-30 Lely Patent Nv Ruminant animal monitoring system
CN111145240A (zh) * 2019-11-18 2020-05-12 西宁市动物疫病预防控制中心(挂西宁市畜牧兽医站牌子) 一种基于3d相机的活体西门塔尔牛体尺在线测量方法
CN111310595B (zh) * 2020-01-20 2023-08-25 北京百度网讯科技有限公司 用于生成信息的方法和装置
US11657498B2 (en) 2020-04-10 2023-05-23 X Development Llc Multi-chamber lighting controller for aquaculture

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5576949A (en) 1991-08-30 1996-11-19 C-Scan, Llc System for animal evaluation through image acquisition
US5483441A (en) * 1991-08-30 1996-01-09 Uniform Scanner Data Analysis, Inc. System for animal evaluation through image acquisition
US5412420A (en) * 1992-10-26 1995-05-02 Pheno Imaging, Inc. Three-dimensional phenotypic measuring system for animals
US5398290A (en) * 1993-05-03 1995-03-14 Kansas State University Research Foundation System for measurement of intramuscular fat in cattle
US5578949A (en) * 1995-02-16 1996-11-26 Analogic Corporation Single stage voltage to current conversion switching system
SE9701547D0 (sv) * 1997-04-23 1997-04-23 Alfa Laval Agri Ab Apparatus and method for recognising and determining the positon of a part of an animal
US6377353B1 (en) * 2000-03-07 2002-04-23 Pheno Imaging, Inc. Three-dimensional measuring system for animals using structured light
US6974373B2 (en) * 2002-08-02 2005-12-13 Geissler Technologies, Llc Apparatus and methods for the volumetric and dimensional measurement of livestock
US7039220B2 (en) * 2002-08-14 2006-05-02 C-Scan, L.L.P. Methods and apparatus for the dimensional measurement of livestock using a single camera

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2004012146A1 *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2027770A2 (de) 2007-08-22 2009-02-25 Icerobotics Limited Verfahren und Vorrichtung zur automatischen Einstufung des Viehzustands
EP2027770A3 (de) * 2007-08-22 2012-12-19 Icerobotics Limited Verfahren und Vorrichtung zur automatischen Einstufung des Viehzustands
US8538126B2 (en) 2007-08-22 2013-09-17 Icerobotics, Ltd. Method and apparatus for the automatic grading of condition of livestock
NL1034292C2 (nl) 2007-08-27 2009-03-02 Maasland Nv Systeem en werkwijze voor het beheren van een groep dieren.
WO2010063527A1 (en) * 2008-12-03 2010-06-10 Delaval Holding Ab Arrangement and method for determining a body condition score of an animal
US9684956B2 (en) 2008-12-03 2017-06-20 Delaval Holding Ab Arrangement and method for determining a body condition score of an animal
US10639014B2 (en) 2015-02-27 2020-05-05 Biondi Engineering Sa Method and relevant apparatus for the determination of the body condition score, body weight and state of fertility
EP3574751A1 (de) * 2018-05-28 2019-12-04 Bayer Animal Health GmbH Vorrichtung zur fliegenbekämpfung
WO2019228861A1 (en) * 2018-05-28 2019-12-05 Bayer Animal Health Gmbh Apparatus for fly management

Also Published As

Publication number Publication date
ATE358304T1 (de) 2007-04-15
US20060126903A1 (en) 2006-06-15
WO2004012146A1 (en) 2004-02-05
EP1537531B1 (de) 2007-03-28
US7853046B2 (en) 2010-12-14
AU2003281732A1 (en) 2004-02-16
IL150915A0 (en) 2003-02-12
DE60312869T2 (de) 2008-01-24
DE60312869D1 (de) 2007-05-10

Similar Documents

Publication Publication Date Title
EP1537531B1 (de) Bildaufnahmesystem und -verfahren zur auswertung der körperlichen verfassung
RU2543948C2 (ru) Устройство и способ определения количественного показателя состояния тела животного
US6974373B2 (en) Apparatus and methods for the volumetric and dimensional measurement of livestock
US7583391B2 (en) Three-dimensional measuring apparatus, three-dimensional measuring method, and three-dimensional measuring program
US6567682B1 (en) Apparatus and method for lesion feature identification and characterization
US20080273760A1 (en) Method and apparatus for livestock assessment
US6377353B1 (en) Three-dimensional measuring system for animals using structured light
US5576949A (en) System for animal evaluation through image acquisition
CN102246205B (zh) 用于测量植物的叶盘生长的方法
US20200077667A1 (en) 3d imaging system and method of imaging carcasses
Liu et al. Automatic estimation of dairy cattle body condition score from depth image using ensemble model
JPH10500207A (ja) 動物の三次元表現型測定装置
CN108628306B (zh) 机器人行走障碍检测方法、装置、计算机设备和存储介质
EP3769036B1 (de) Verfahren und system zur extraktion einer statistischen stichprobe von beweglichen fischen
CN104586404A (zh) 体质监测的姿态识别方法及系统
CN112508890A (zh) 一种基于二级评测模型的奶牛体脂率检测方法
Li et al. Automatic dairy cow body condition scoring using depth images and 3D surface fitting
IL166480A (en) Imaging system and method for body condition evaluation
CN115294181B (zh) 基于两阶段关键点定位的奶牛体型评定指标测量方法
RU2785149C2 (ru) Способ и устройство для получения характеристик живых особей на расстоянии
Zong et al. Comparisons of non-contact methods for pig weight estimation
KR20230071813A (ko) 한우의 성장검사 자료 수집 장치 및 그 방법
Cheng Measuring antler lengths using low-cost ToF cameras
Alempijevic Prototype on-farm 3D camera system to assess traits in Angus cattle
CN114403023A (zh) 基于太赫兹测量膘厚的猪只饲喂方法、装置及系统

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20050223

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL LT LV MK

DAX Request for extension of the european patent (deleted)
GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PT RO SE SI SK TR

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20070328

Ref country code: BE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20070328

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20070328

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20070328

Ref country code: CH

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20070328

Ref country code: LI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20070328

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REF Corresponds to:

Ref document number: 60312869

Country of ref document: DE

Date of ref document: 20070510

Kind code of ref document: P

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20070628

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20070709

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20070828

ET Fr: translation filed
REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20070328

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20070328

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20070328

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20070328

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20080102

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20070731

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20070629

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20070727

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20070328

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20070328

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20070727

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20070628

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20070328

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20070929

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20090727

PGRI Patent reinstated in contracting state [announced from national office to epo]

Ref country code: IT

Effective date: 20110616

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 14

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 15

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 16

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: NL

Payment date: 20190724

Year of fee payment: 17

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20190723

Year of fee payment: 17

Ref country code: DE

Payment date: 20190723

Year of fee payment: 17

Ref country code: IT

Payment date: 20190723

Year of fee payment: 17

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20190723

Year of fee payment: 17

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 60312869

Country of ref document: DE

REG Reference to a national code

Ref country code: NL

Ref legal event code: MM

Effective date: 20200801

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20200727

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200727

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200731

Ref country code: NL

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200801

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210202

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200727