WO2019004902A1 - Control of a milking station - Google Patents

Control of a milking station

Info

Publication number
WO2019004902A1
Authority
WO
WIPO (PCT)
Prior art keywords
milking station
sensor
control unit
milking
sign
Prior art date
Application number
PCT/SE2018/050661
Other languages
French (fr)
Inventor
Martin SJÖLUND
Original Assignee
Delaval Holding Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Delaval Holding Ab filed Critical Delaval Holding Ab
Publication of WO2019004902A1 publication Critical patent/WO2019004902A1/en

Classifications

    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01J MANUFACTURE OF DAIRY PRODUCTS
    • A01J5/00 Milking machines or devices
    • A01J5/017 Automatic attaching or detaching of clusters
    • A01J5/0175 Attaching of clusters
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01J MANUFACTURE OF DAIRY PRODUCTS
    • A01J5/00 Milking machines or devices
    • A01J5/007 Monitoring milking processes; Control or regulation of milking machines
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01K ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K1/00 Housing animals; Equipment therefor
    • A01K1/12 Milking stations

Definitions

  • the present invention generally relates to milking equipment, and in particular the invention relates to control of a milking station.
  • Today, many large dairy farms are equipped with automatic milking systems, AMSs, involving robot assisted milking of dairy animals.
  • For example, there are fully automated rotary milking systems, where dairy animals are milked automatically, i.e. robot assisted, while standing on a rotating platform. In such systems, one or more robots are typically located either on the outside of the outer circumference of the platform, or inside of the inner circumference of the platform, depending on how the animals are oriented on the platform during milking.
  • There are also other types of AMSs, where dairy animals may enter a stationary milking station equipped with a milking robot.
  • FIGS 1a-1d show an exemplifying milking station from different angles, and details thereof (figure 1c).
  • the milking station is equipped with a multi-purpose robot arm 101, a teat locating camera unit 102, and a magazine 103 for storing teat cups (not visible). It also comprises a teat preparation module 104 for e.g. pre-cleaning and post-treatment of animal teats; a feeding module 105 for providing feed to animals during milking, and a cleaning unit 106 for cleaning of teat cups and other equipment.
  • the milking station further comprises a hydraulic pump unit 107, a power box 108, an electrical box 109, and a service switch 110. The animals enter and exit the milking station through gates 11.
  • the milking station can operate in an automatic mode, in which all parts of the milking procedure are handled automatically by the milking station; everything from letting the animal into the station to that it is let out from the station after milking.
  • the milking station can also be operated in other modes, in which all or some of the actions related to milking are performed manually or semi-automatically.
  • the movements of the multi-purpose arm may, at least partly, be controlled by a human operator. This is typically used for placing a robot arm equipped with a teat locating sensor in a suitable start position when training a milking station to find the teats of a specific animal.
  • the multi-purpose arm may be placed and kept in a parked position, and the operation e.g. of attaching the teat cups could be performed manually.
  • Automatic milking systems have facilitated the work on dairy farms, which historically has been associated with hard physical labour. But even today, when many of the dairy farm chores have been automated, the ergonomic working conditions for manual labour on a dairy farm could still be improved.
  • automatic milking stations may also have a manual mode, in which a human operator can interact with the milking station and, for example, take part in the milking procedure. It is desired to make human interaction with the milking station as easy as possible.
  • An object of the invention is to improve the control of a milking station for a human operator.
  • a milking station, a control unit, a method and a computer program are provided for achieving this object, as defined by the attached set of claims.
  • a sensor of a milking station which is configured for detection of a non-human animal part, such as a teat, is also utilized for providing a convenient, contact-less man-machine interface between a human operator and a milking station.
  • a method for control of a milking station.
  • the method is to be performed by a control unit operable to control the milking station.
  • the milking station comprises a sensor configured for detection of a non-human animal part, such as a teat or an udder.
  • the method comprises, during a time period when the sensor is not used for detection of a non-human animal part: obtaining at least one image from the sensor; identifying a sign or gesture made by a human body part within said at least one image; and triggering an action to be performed by the milking station based on the identified sign or gesture.
  • the method thus provides a man-machine interface by use of said sensor.
  • a control unit for control of a milking station comprising a sensor configured for detection of a non-human animal part.
  • the control unit is operable to control the milking station, and the control unit is configured to, during a time period when the sensor is not used for detection of a non-human animal part: obtain at least one image from the sensor, and to identify a sign or gesture made by a human body part within said at least one image.
  • the control unit is further configured to trigger an action to be performed by the milking station based on the identified sign or gesture.
  • the control unit thereby provides a man-machine interface utilizing said sensor.
  • a milking station is provided, comprising (and being controlled by) the control unit according to the second aspect.
  • a computer program which when executed by at least one processor, causes the at least one processor to carry out the method according to the first aspect.
  • a carrier which contains a computer program according to the fourth aspect.
  • Figures 1a-1d illustrate different parts of a milking station.
  • Figures 2-3 are flow charts illustrating a method according to exemplifying embodiments.
  • Figure 4 is an example of signs made by a human hand, which could be identified when captured in an image by a sensor, such as a camera.
  • FIGS. 5a-5c are schematic block diagrams illustrating different implementations of a control unit, CU, according to exemplifying embodiments.
  • When an automatic milking station is set in a manual mode of operation, actions of the milking station are controlled by a human operator.
  • milking related actions such as cleaning teats or attaching teat cups to teats may be performed by the human operator.
  • Running an automatic milking station in a manual mode could be considered e.g. when animals are new to the system or are milked for the first time. Animals, such as cows, may need time to become accustomed to the automated milking procedures, such as e.g. teat preparation.
  • Manual procedures such as manual teat cup attachment may then be an option to overcome a cow's initial reluctance. Manual procedures might also be considered e.g.
  • a manual mode could also be used when an operator wants to check, and/or closely inspect, certain functionality of the milking station, such as the release of teat cups from a storing location or the onset of a vacuum.
  • Manual control of the milking station can be provided, by a human operator, by use of buttons associated with different actions.
  • a human operator wants to manually attach one or more teat cups
  • s/he can push a button for releasing the teat cups from the magazine where they are stored, such that they may be removed from the magazine and be attached to the teats of a cow.
  • buttons may be implemented by use of a touch screen.
  • Alternative actions can be presented on the touch screen e.g. as virtual buttons, and a human operator can select the desired alternative by touching the screen.
  • the touch screen used for manual interaction with the milking station is also used for displaying information associated with the milking station. Since the milking station, typically, is operated in automatic mode most of the time, and only seldom in manual mode, the location of such a screen should preferably be convenient for a human monitoring the milking station when operated in automatic mode.
  • In automatic mode, moving parts of a milking station, such as a robot arm, perform actions such as cleaning teats and attaching teat cups to teats, etc.
  • a human monitoring the milking station in automatic mode would need to stay away from such moving parts.
  • a screen where information is displayed should be located such that a person monitoring the screen is kept out of the way of any moving parts of the milking station.
  • some actions, such as attaching teat cups, require that a human operator moves into the area or zone where the moving parts operate in automatic mode.
  • milking stations are typically equipped with a sensor for detection of a non-human animal part, e.g. for locating the teats to which the teat cups are to be attached, and this sensor could be utilized for providing an interface to the milking station for a human operator that is to interact with the milking station.
  • the embodiments of the invention described herein are intended for a situation where the human operator is located and/or is to perform actions in a position or operation space where normally, when the milking station is set in automatic mode, a (moving) robot is operating.
  • the embodiments of the invention are not intended or suitable for manual control of operations entailing robot movements that risk injuring the operator, such as guiding or training a robot arm to find a suitable start position for teat treatment or teat detection.
  • the embodiments are intended to be performed by a control unit which is associated with a milking station.
  • the term "associated with” is here intended to cover e.g. the control unit being comprised in the milking station; being operatively connected to the milking station and/or constituting a part of the milking station.
  • the control unit will be described in more detail further below in association with figures 5 and 6.
  • the milking station comprises different components, which may be referred to as milking station equipment. An example of a milking station has been described above in association with figure 1.
  • the milking station is suitable for automatic milking of an animal, such as a cow, buffalo, pig, goat, sheep, camel or horse.
  • Figure 2 shows an exemplifying method embodiment for control of a milking station, i.e. milking station equipment.
  • the method illustrated in figure 2 comprises an action 201, of obtaining at least one image from a sensor being configured for detection of a non-human animal part.
  • the method in figure 2 further comprises an action 202, of identifying a sign or gesture made by a human body part within said at least one image.
  • the human body part is preferably a hand, but it could alternatively, or in addition, be e.g. an arm or a face.
  • the method in figure 2 further comprises an action 203, of triggering an action to be performed by the milking station based on the identified sign or gesture.
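The three core actions 201-203 (obtain an image, identify a sign, trigger an action) can be sketched as follows. This is a minimal illustration under stated assumptions, not the patented implementation: the `ControlUnit` class, the `identify` callback and the action registry are all hypothetical names introduced here.

```python
from dataclasses import dataclass, field
from typing import Callable, Optional


@dataclass
class ControlUnit:
    """Hypothetical control unit mapping identified signs to station actions."""
    # Maps a sign name to a callable performing the milking-station action.
    actions: dict[str, Callable[[], None]] = field(default_factory=dict)

    def handle_image(self, image: object,
                     identify: Callable[[object], Optional[str]]) -> Optional[str]:
        """Actions 201-203: take an obtained image, identify a sign, trigger an action."""
        sign = identify(image)     # action 202: image recognition (out of scope here)
        if sign in self.actions:   # action 203: trigger only known, predefined signs
            self.actions[sign]()
            return sign
        return None
```

An unrecognized or unregistered sign simply results in no action, which matches the idea that only a predefined set of signs is assigned a meaning.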
  • the sensor from which the at least one image is obtained 201 is also, i.e. in addition to being used for the herein described method, used for detection of a non-human animal part in the milking station.
  • the animal part is preferably a teat or an udder, and the sensor is thus preferably the one used for detecting, or locating, the teats of an animal to be milked in the milking station, when this is to be performed automatically.
  • the same sensor is used for the different tasks, but not simultaneously.
  • a sensor which is typically comprised in a standard automatic milking station, is utilized for supporting and/or improving human interaction with the milking station.
  • the sensor is preferably operable to capture two- or three-dimensional images, and may e.g. be an optical sensor, such as a time-of-flight camera, or a sensor using e.g. ultrasound to obtain images.
  • the image/s is/are processed by adequate signal processing functionality comprised in, or otherwise accessible to the control unit. For example, image recognition by matching could be used to identify a sign, and/or, some set of parameters could be derived from the image/s and be compared to reference material indicative of a certain sign or gesture.
  • the specific technology used for identifying the sign or gesture based on one or more images is outside the scope of this invention.
  • Embodiments of the invention are intended for use when the milking station is set in a specific mode allowing this type of manual interaction with the milking station.
  • One important aspect of such a mode could be that, according to the configuration of said mode, moving parts of the milking station are prevented from moving in or into an operation zone where the human operator is or may be located.
  • the identified sign or gesture made by the human body part would preferably be one out of a set of predefined signs or gestures, which are assigned a specific meaning e.g. action, and stored e.g. in a database or register.
  • the signs or gestures should preferably be selected such that they are distinct and easily separable, e.g. by a suitable image recognition processing unit.
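The text suggests identification either by image matching or by deriving a set of parameters from the image and comparing them to reference material. One possible sketch of the latter, with entirely illustrative feature vectors and sign names (none of which come from the patent), is a nearest-reference lookup with a distance threshold so that unclear input is rejected rather than misclassified:

```python
import math

# Placeholder reference material: one feature vector per predefined sign.
# In practice these would be derived from training images.
REFERENCE_SIGNS = {
    "release_teat_cup": (1.0, 0.0, 0.0),
    "vacuum_on":        (0.0, 1.0, 0.0),
    "open_gate":        (0.0, 0.0, 1.0),
}

def identify_sign(features, threshold=0.5):
    """Return the closest predefined sign, or None if nothing is close enough."""
    best_name, best_dist = None, float("inf")
    for name, ref in REFERENCE_SIGNS.items():
        dist = math.dist(features, ref)  # Euclidean distance to the reference
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None
```

The threshold reflects the requirement that signs be distinct and easily separable: ambiguous feature vectors fall outside every sign's acceptance radius and yield no action.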
  • the action triggered 203 to be performed by the milking station based on the identified sign or gesture would be an action that is desired by a human operator when standing/bending or squatting in a position for interacting with the milking station. Examples of such actions are e.g. release of a teat cup from a stored position; activation or deactivation of a vacuum associated with one or more teat cups; opening or closing of a gate of the milking station; starting of a cleaning procedure and provision of feed in a feed station associated with the milking station.
  • the triggering 203 of an action to be performed may be implemented in different ways.
  • the control unit could indicate the desired action, e.g. in a message, to another entity controlling a specific part or function of the milking station concerned by the desired action.
  • By "desired action" is here meant the action associated with the identified 202 sign or gesture.
  • the control unit is in control of the specific part or function of the milking station concerned by the desired action, and could then send an instruction or other execution signal to the part in question.
  • a milking station typically comprises parts which move during an automatic milking procedure, e.g. a robot arm or similar.
  • the movement of these parts could be restricted during the time the milking station is set in a specific mode, such as a specific manual operation mode.
  • the milking station may be configured to park and keep all moving parts in a parked position when being set in such a specific mode.
  • the milking station may be configured to refrain from actions which imply that the moving parts enter a certain region, space or zone where the human operator would be located when e.g. manually attaching teat cups.
  • Such a region, space or zone could be defined in a suitable coordinate system known to a unit controlling the moving parts, and the definition could be stored such that it is accessible by said unit.
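A keep-out region of this kind can be checked very simply once it is expressed in a coordinate system known to the unit controlling the moving parts. The sketch below assumes an axis-aligned box and metric coordinates; the zone bounds, the function name and the mode flag are all illustrative assumptions, not taken from the patent:

```python
# Illustrative operator zone: an axis-aligned box (min, max) per axis, in metres,
# in the milking station's coordinate system.
OPERATOR_ZONE = {"x": (0.0, 1.2), "y": (0.0, 0.8), "z": (0.0, 2.0)}

def target_allowed(target, manual_mode):
    """Reject robot-arm targets inside the operator zone while in manual mode."""
    if not manual_mode:
        return True  # no restriction outside the special manual mode
    inside = all(lo <= coord <= hi
                 for coord, (lo, hi) in zip(target, OPERATOR_ZONE.values()))
    return not inside
```

A real system would likely also check the whole motion path, not just the end target, but the stored-zone lookup is the same idea.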
  • Figure 3 shows an exemplifying method embodiment for control of a milking station.
  • the method illustrated in figure 3 comprises an action 301 for determining whether or not the milking station is in a specific mode, i.e. a mode allowing human-machine interaction according to embodiments of the invention.
  • an indication may be obtained which indicates that at least part of the milking station is set in a specific mode, in which the movement of moving parts is restricted or prevented.
  • the action 301 is optional, e.g. since this could be solved in other ways, or not be needed for certain milking stations due to their design.
  • the method in figure 3 also comprises an action 308, "regular operation" which is to be performed when the milking station is found not to be set in the specific mode.
  • This "regular operation” would be the actions and configuration associated with whichever mode the milking station is in.
  • the method in figure 3 also comprises an action 302 of triggering adjustment of the direction of the sensor, such that it is directed towards a specified region, space or zone.
  • the specified region, space or zone would be a region where a human operator is expected to interact with the milking station by means of a body part.
  • This action 302 is optional, since the sensor may be directed in an adequate direction by default e.g. when a moving robot arm of the milking station on which the sensor is fastened is in a parked position.
  • a suitable direction for a sensor would be, e.g. having a field of view comprising an area where a human operator could easily move a hand when being in a working position.
  • the moving robot arm could be constrained to remaining in a parked position during the period when the milking station is set in a specific manual mode.
  • the direction of the sensor could be adjusted such that it is directed towards a specific area of interest. This adjustment could be triggered by that the milking station, or at least part thereof, is set in a specific mode 301. This redirection may be needed, since the sensor is mounted on the milking station in order to also fulfil another purpose, namely identifying a non-human animal part.
  • Actions 303-306 in figure 3 correspond to actions 201-203 in figure 2.
  • an action 304 of detecting a human body part in the image is explicitly illustrated. This action may be part of an implementation of action 305 of identifying a sign or gesture made by a human body part based on an obtained image.
  • the exemplifying method illustrated in figure 3 also comprises an action 307 of verifying the identified sign or gesture. If it is desired to have an extra level of control before triggering any action, it could be verified in action 307 that the right sign or gesture has been identified, and thereby, also verify that the right action is triggered.
  • the action 307 could be implemented in that a second sign or gesture is identified, based on e.g. one or more further images obtained after the first sign or gesture was identified.
  • two correlated signs or gestures are identified.
  • the two signs or gestures could be identical, or be recognised as a series of signs/gestures that together are associated with a certain action.
  • the verification could be a feature that is selected by the user, or that e.g. is set by the manufacturer.
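The verification of action 307, where two correlated (here: identical) signs must be identified before an action is triggered, can be sketched as a small state machine. The class and method names are assumptions for illustration only:

```python
class SignVerifier:
    """Confirms a sign only when it is identified twice in a row (action 307)."""

    def __init__(self):
        self._pending = None  # the sign awaiting confirmation, if any

    def observe(self, sign):
        """Feed one identified sign; return it once verified, else None."""
        if sign is not None and sign == self._pending:
            self._pending = None
            return sign           # second, matching identification: verified
        self._pending = sign      # first identification: wait for confirmation
        return None
```

A series of distinct signs that together map to one action could be handled the same way, by matching against a stored sequence instead of a single pending sign.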
  • some variant of feedback could be used; for example, a diode could be lit.
  • Another possibility is to configure the interpretation of a sign or gesture to depend on a current state or position of an item, such as whether a teat cup is in a released or retracted position.
  • For example, when a teat cup is in a retracted position, a sign is interpreted as "release teat cup", and when the teat cup is in a released position, the same sign is interpreted as "retract teat cup".
  • This principle is applicable also for e.g. vacuum on/off; gates open/close.
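This state-dependent interpretation amounts to a small toggle table: the identified sign plus the current state of the item selects the action. The sign names, state names and action strings below are illustrative, not from the patent:

```python
# Illustrative toggle table: sign -> current item state -> resulting action.
TOGGLES = {
    "teat_cup_sign": {"retracted": "release teat cup", "released": "retract teat cup"},
    "vacuum_sign":   {"off": "vacuum on", "on": "vacuum off"},
    "gate_sign":     {"closed": "open gate", "open": "close gate"},
}

def interpret(sign, state):
    """Map one identified sign plus the item's current state to an action."""
    return TOGGLES[sign][state]
```

This halves the number of distinct signs the operator must learn, since one sign covers both directions of each operation.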
  • An example of different signs made by a human hand is shown in figure 4. Naturally, more and/or other signs could be used than the ones exemplified in figure 4. Below is a table showing an exemplifying association between the signs illustrated in figure 4 and actions, which may be triggered by a control unit at identification of the sign in question.
  • control unit is operable to control a milking station and may be assumed to be operable to obtain information related to the milking station and to trigger certain operations or actions to be performed by the milking station, which have been described above, and which will be briefly described below.
  • the milking station which the control unit is operable to control may be assumed to comprise, i.e. be equipped with, a sensor configured for detection of a non-human animal part.
  • the control unit is preferably comprised in the milking station, e.g. as a module or a part of the milking station, but could alternatively be external to the milking station.
  • control unit could be a part of a central system or arrangement for controlling a plurality of milking stations.
  • the control unit may alternatively be denoted e.g. "control device".
  • the communication between the control unit and parts of the milking station may be performed over a state of the art wireless and/or wired interface.
  • the control unit 500 is configured to perform the actions of at least one of the method embodiments described above with reference to any of figures 2-4.
  • the control unit 500 is associated with the same technical features, objects and advantages as the previously described method
  • control unit will be described in brief in order to avoid unnecessary repetition.
  • the control unit may be implemented and/or described as follows:
  • the control unit 500 comprises processing circuitry 501 and a communication interface 502.
  • the processing circuitry 501 is configured to cause the control unit 500 to obtain at least one image from a sensor configured for detection of a non-human animal part in a milking station.
  • the processing circuitry 501 is further configured to cause the control unit 500 to identify a sign or gesture made by a human body part within said at least one image; and further to trigger an action to be performed by the milking station based on the identified sign or gesture.
  • the control unit is configured to provide a man-machine interface between a human operator and the milking station by use of said sensor.
  • the communication interface 502, which may also be denoted e.g. I/O interface, includes a wired and/or a wireless interface for sending data, such as commands, to other nodes or entities, e.g. of the milking station; and for receiving information from other nodes or entities, such as a sensor of the milking station.
  • Figure 5b shows an embodiment of the processing circuitry 501 which comprises a processing device 503, such as a general-purpose microprocessor, e.g. a CPU, and a memory 504, in communication with the processing device, that stores or holds instruction code readable and executable by the processing device.
  • the instruction code stored or held in the memory may be in the form of a computer program 505, which when executed by the processing device 503 causes the control unit 500 to perform the actions in the manner described above.
  • An alternative implementation of the processing circuitry 501 is shown in figure 5c.
  • the processing circuitry here comprises an obtaining unit 507 for causing the control unit to obtain at least one image from a sensor configured for detection of a non-human animal part in a milking station.
  • the processing circuitry further comprises an identifying unit 509 for causing the control unit to identify a sign or gesture made by a human body part within said at least one image.
  • the processing circuitry further comprises a triggering unit 510, for causing the control unit 500 to trigger an action to be performed by the milking station based on the identified sign or gesture.
  • the processing circuitry 501 could comprise more units configured to cause the control unit to perform actions associated with one or more of the method embodiments described herein.
  • units 506 and 508 are provided, having dashed outlines.
  • any of the units 507, 509-510 could be configured to also cause the control unit to perform such other actions.
  • the control unit 500 could, for example, comprise a determining unit 506 for determining whether the milking station is set in a specific mode, allowing for human interaction via the sensor.
  • the control unit 500 could further comprise a detection unit 508, for detecting a sign or gesture in the at least one obtained image, e.g. before a sign or gesture from within a set of predefined gestures is identified by the identifying unit 509. This, and other tasks, could alternatively be performed by one of the other units.
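The unit-based implementation of figure 5c can be sketched as a control unit composed of small single-purpose units: determining 506, obtaining 507, detecting 508, identifying 509 and triggering 510. Everything here except the unit numbers is an assumed name; the units are modelled simply as injected callables:

```python
class ControlUnit500:
    """Hypothetical composition of the units 506-510 from figure 5c."""

    def __init__(self, in_manual_mode, get_image, detect, identify, trigger):
        self.in_manual_mode = in_manual_mode  # determining unit 506
        self.get_image = get_image            # obtaining unit 507
        self.detect = detect                  # detection unit 508
        self.identify = identify              # identifying unit 509
        self.trigger = trigger                # triggering unit 510

    def step(self):
        """One pass through actions 301 and 303-306 of figure 3."""
        if not self.in_manual_mode():
            return None                 # action 308: regular operation instead
        image = self.get_image()        # action 303: obtain image
        if not self.detect(image):      # action 304: any human body part present?
            return None
        sign = self.identify(image)     # action 305: identify sign or gesture
        if sign is not None:
            self.trigger(sign)          # action 306: trigger the associated action
        return sign
```

Injecting the units as callables keeps each one independently replaceable, mirroring the remark that tasks could alternatively be performed by one of the other units.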
  • the control unit 500 may comprise further functionality, for carrying out control unit functions not specifically mentioned herein, related e.g. to standard operation of the milking station.
  • the foregoing description of a control unit 500 is not intended to be limiting.
  • the processing circuitry may also be implemented by other techniques known in the art, such as, e.g., hardwired transistor logic or application-specific integrated circuits arranged in a manner sufficient to carry out the actions of the control unit 500 as described above.
  • Exemplifying embodiments of arrangement, figure 6
  • Figure 6 illustrates an arrangement 600 for feeding management of an animal 603 (animal not comprised in arrangement).
  • the arrangement comprises a control unit 601, such as the ones described above, and an identification unit 606 configured to provide a unique identification of the animal 603, e.g. to the control unit.
  • the arrangement also comprises a BCS device 604, configured to automatically derive a BCS estimate of the animal based e.g. on a two- or three-dimensional image, and to provide the BCS estimate to the control unit 601.
  • the identification unit 606 could alternatively be denoted e.g. "ID reader", and may derive the identity of the animal e.g. by reading an RFID tag 602 attached to the animal, or by some other known method for identifying animals.
  • the identification unit then provides the unique identification of the animal to the control unit or to the BCS device, depending on the implementation.
  • the BCS device 604 preferably comprises a sensor for capturing a two- or three-dimensional image of an animal, based on which the BCS estimate may be derived.
  • the sensor may be optical, using reflection of light of suitable frequency, such as a time of flight camera (3D), but could alternatively use e.g. ultrasound to obtain images.
  • the BCS device may comprise more than one sensor (not illustrated). For certain animal types, such as goats, more than one sensor may be used, since the BCS of goats typically includes evaluation of both the lower back (as for cows) and the sternum area.
  • the BCS device should be located and mounted such that images can be obtained of the relevant parts of the animals in question, e.g. at a milking stall where animals stand while being milked.
  • the arrangement 600 may further comprise other devices, such as feeding equipment 605. This could be a complete feeding system or an automatic feed dispenser.
  • the feeding equipment 605 could then obtain indications or instructions from the control unit 601 and execute a change of the feed composition provided to the animal.
  • the feeding equipment 605 may obtain information from a database 608 concerning the change of feed composition, where the database 608 comprises information or instructions provided by the control unit 601.
  • the arrangement 600 could also comprise or be operatively connected to entities such as a PC 607 or to a network 609, such as the Internet and/or radio access systems, with all the possibilities associated with such networks in terms of connecting to mobile and remote devices 510.
  • the steps, functions, procedures, modules, units and/or blocks described herein may be implemented in hardware using any conventional technology, such as discrete circuit or integrated circuit technology, including both general-purpose electronic circuitry and application-specific circuitry.
  • at least some of the steps, functions, procedures, modules, units and/or blocks described above may be implemented in software such as a computer program for execution by suitable processing circuitry including one or more processing units.
  • the software could be carried by a carrier, such as an electronic signal, an optical signal, a radio signal, or a computer readable storage medium before and/or during the use of the computer program in the nodes.
  • the flow diagram or diagrams presented herein may be regarded as a computer flow diagram or diagrams, when performed by one or more processors.
  • a corresponding apparatus may be defined as a group of function modules, where each step performed by the processor corresponds to a function module.
  • the function modules are implemented as a computer program running on the processor.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Environmental Sciences (AREA)
  • Animal Husbandry (AREA)
  • Zoology (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to control of a milking station comprising a sensor for detection of a non-human animal part. A method, a control unit and a milking station are provided for this purpose. The method, which is to be performed by the control unit, comprises obtaining of at least one image from the sensor (303); identifying a sign or gesture made by a human body part within said at least one image (305); and then triggering an action to be performed by the milking station based on the identified sign or gesture (306). Thereby, a man-machine interface is provided by use of said sensor, via which interface a human operator can control the milking station.

Description

CONTROL OF A MILKING STATION
TECHNICAL FIELD
The present invention generally relates to milking equipment, and in particular the invention relates to control of a milking station.
BACKGROUND
Today, many large dairy farms are equipped with automatic milking systems, AMSs, involving robot assisted milking of dairy animals. There are different types of AMS solutions. For example, there are fully automated rotary milking systems, where dairy animals are milked automatically, i.e. robot assisted, while standing on a rotating platform. In such systems, one or more robots are typically located either on the outside of the outer circumference of the platform, or inside of the inner circumference of the platform, depending on how the animals are oriented on the platform during milking. There are also other types of AMSs, where dairy animals may enter a stationary milking station equipped with a milking robot. On dairy farms with AMS systems, detailed information about the milking sessions can typically be seen in real time on displays connected to the highly advanced milking equipment, and data can also be collected by the same highly advanced milking equipment and be supplied to a herd management system on a computer. Automatic milking systems are typically used together with so-called "loose-housing", and the dairy animals are brought to, or come voluntarily to the so-called milking parlor or milking station.
Figures 1a-1d show an exemplifying milking station from different angles, and details thereof (figure 1c). The milking station is equipped with a multi-purpose robot arm 101 , a teat locating camera unit 102, and a magazine 103 for storing teat cups (not visible). It also comprises a teat preparation module 104 for e.g. pre-cleaning and post-treatment of animal teats; a feeding module 105 for providing feed to animals during milking, and a cleaning unit 106 for cleaning of teat cups and other equipment. The milking station further comprises a hydraulic pump unit 107, a power box 108, an electrical box 109, and a service switch 110. The animals enter and exit the milking station through gates 11.
The milking station can operate in an automatic mode, in which all parts of the milking procedure are handled automatically by the milking station; everything from letting the animal into the station to letting it out again after milking. The milking station can also be operated in other modes, in which all or some of the actions related to milking are performed manually or semi-automatically. In one such exemplifying mode, the movements of the multi-purpose arm may, at least partly, be controlled by a human operator. This is typically used for placing a robot arm equipped with a teat locating sensor in a suitable start position when training a milking station to find the teats of a specific animal. In another exemplifying mode, the multi-purpose arm may be placed and kept in a parked position, and operations such as attaching the teat cups could be performed manually. Automatic milking systems have facilitated the work on dairy farms, which historically has been associated with hard physical labour. But even today, when many of the dairy farm chores have been automated, the ergonomic working conditions for manual labour on a dairy farm could still be improved.
SUMMARY
In addition to fully automatic modes of operation, automatic milking stations may also have a manual mode, in which a human operator can interact with the milking station and, for example, take part in the milking procedure. It is desired to make human interaction with the milking station as easy as possible. An object of the invention is to improve the control of a milking station for a human operator. A milking station, a control unit, a method and a computer program are provided for achieving this object, as defined by the attached set of claims.

According to embodiments of the invention, a sensor of a milking station, which is configured for detection of a non-human animal part, such as a teat, is also utilized for providing a convenient, contact-less man-machine interface between a human operator and a milking station.

According to a first aspect, a method is provided for control of a milking station. The method is to be performed by a control unit operable to control the milking station. The milking station comprises a sensor configured for detection of a non-human animal part, such as a teat or an udder. The method comprises, during a time period when the sensor is not used for detection of a non-human animal part: obtaining at least one image from the sensor; identifying a sign or gesture made by a human body part within said at least one image; and then triggering an action to be performed by the milking station based on the identified sign or gesture. The method thus provides a man-machine interface by use of said sensor.
According to a second aspect, a control unit is provided, for control of a milking station comprising a sensor configured for detection of a non-human animal part. The control unit is operable to control the milking station, and the control unit is configured to, during a time period when the sensor is not used for detection of a non-human animal part: obtain at least one image from the sensor, and to identify a sign or gesture made by a human body part within said at least one image. The control unit is further configured to trigger an action to be performed by the milking station based on the identified sign or gesture. The control unit thereby provides a man-machine interface utilizing said sensor.

According to a third aspect, a milking station is provided, comprising (and being controlled by) the control unit according to the second aspect.
According to a fourth aspect, a computer program is provided, which when executed by at least one processor, causes the at least one processor to carry out the method according to the first aspect.
According to a fifth aspect, a carrier is provided, which contains a computer program according to the fourth aspect.
BRIEF DESCRIPTION OF THE DRAWINGS
The foregoing and other objects, features, and advantages of the technology disclosed herein will be apparent from the following more particular description of embodiments as illustrated in the accompanying drawings. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the technology disclosed herein.
Figures 1a-1d illustrate different parts of a milking station.
Figures 2-3 are flow charts illustrating a method according to exemplifying embodiments.
Figure 4 shows examples of signs made by a human hand, which could be identified when captured in an image by a sensor, such as a camera.
Figures 5a-5c are schematic block diagrams illustrating different implementations of a control unit, CU, according to exemplifying embodiments.
DETAILED DESCRIPTION
When an automatic milking station is set in a manual mode of operation, actions of the milking station are controlled by a human operator. In the type of manual mode referred to herein, milking related actions, such as cleaning teats or attaching teat cups to teats may be performed by the human operator. Running an automatic milking station in a manual mode could be considered e.g. when animals are new to the system or are milked for the first time. Animals, such as cows, may need time to become accustomed to the automated milking procedures, such as e.g. teat preparation. Manual procedures, such as manual teat cup attachment may then be an option to overcome a cow's initial reluctance. Manual procedures might also be considered e.g. when cows with a teat anatomy unsuitable for automatic milking need to be milked; when milking colostrum from newly calved cows, or when some teats of a cow should not be milked to tank. A manual mode could also be used when an operator wants to check, and/or closely inspect, certain functionality of the milking station, such as the release of teat cups from a storing location or the onset of a vacuum.
The instructions to a milking station in manual mode can be provided, by a human operator, by use of buttons associated with different actions. For example, when the human operator wants to manually attach one or more teat cups, s/he can push a button for releasing the teat cups from the magazine where they are stored, such that they may be removed from the magazine and be attached to the teats of a cow. In modern milking stations, such "buttons" may be implemented by use of a touch screen. Alternative actions can be presented on the touch screen e.g. as virtual buttons, and a human operator can select the desired alternative by touching the screen. The touch screen used for manual interaction with the milking station is also used for displaying information associated with the milking station. Since the milking station, typically, is operated in automatic mode most of the time, and only seldom in manual mode, the location of such a screen should preferably be convenient for a human monitoring the milking station when operated in automatic mode.
In automatic mode, moving parts of a milking station, such as a robot arm, perform actions such as cleaning teats and attaching teat cups to teats, etc. In order to avoid accidents or damage to the milking station, a human monitoring the milking station in automatic mode would need to stay away from such moving parts. Thus, a screen where information is displayed should be located such that a person monitoring the screen is kept out of the way of any moving parts of the milking station. However, when the milking station is operated in manual mode, some actions, such as attaching teat cups, require that a human operator moves into the area or zone where the moving parts operate in automatic mode. This means that the human operator must move from a first position, where s/he can give commands on the screen, to a second position, where s/he can reach the teat cups and/or the udder of an animal positioned in the milking station, and then back again to the first position to give new commands, etc. This is identified as inconvenient and uncomfortable.
The inventor has further realized that milking stations are typically equipped with a sensor for detection of a non-human animal part, e.g. for locating the teats to which the teat cups are to be attached, and that this sensor could be utilized for providing an interface to the milking station for a human operator that is to interact with the milking station.
It should be noted that the embodiments of the invention described herein are intended for a situation where the human operator is located and/or is to perform actions in a position or operation space where normally, when the milking station is set in automatic mode, a (moving) robot is operating. In other words, the embodiments of the invention are not intended or suitable for manual control of operations entailing robot movements that risk injuring the operator, such as guiding or training a robot arm to find a suitable start position for teat treatment or teat detection.
Exemplifying embodiments of method, figures 2-4
Below, exemplifying embodiments of a method will be described with reference to figures 2-4. The embodiments are intended to be performed by a control unit which is associated with a milking station. The term "associated with" is here intended to cover e.g. the control unit being comprised in the milking station; being operatively connected to the milking station and/or constituting a part of the milking station. The control unit will be described in more detail further below in association with figures 5 and 6. The milking station comprises different components, which may be referred to as milking station equipment. An example of a milking station has been described above in association with figure 1. The milking station is suitable for automatic milking of an animal, such as a cow, buffalo, pig, goat, sheep, camel or horse.
Figure 2 shows an exemplifying method embodiment for control of a milking station, i.e. milking station equipment. The method illustrated in figure 2 comprises an action 201, of obtaining at least one image from a sensor being configured for detection of a non-human animal part. The method in figure 2 further comprises an action 202, of identifying a sign or gesture made by a human body part within said at least one image. The human body part is preferably a hand, but it could alternatively, or in addition, be e.g. an arm or a face. The method in figure 2 further comprises an action 203, of triggering an action to be performed by the milking station based on the identified sign or gesture. In other words, the sensor from which the at least one image is obtained 201 is also, i.e. in addition to being used for the herein described method, used for detection of a non-human animal part in the milking station. The animal part is preferably a teat or an udder, and the sensor is thus preferably the sensor used for detecting, or locating, the teats of an animal to be milked in the milking station, when this is to be performed automatically. The same sensor is used for the different tasks, but not simultaneously. Thereby, a sensor, which is typically comprised in a standard automatic milking station, is utilized for supporting and/or improving human interaction with the milking station. The sensor is preferably operable to capture two- or three-dimensional images, and may e.g. be a time-of-flight camera or some other type of camera. In order to identify a sign or gesture based on the one or more images obtained from the sensor, the image/s is/are processed by adequate signal processing functionality comprised in, or otherwise accessible to, the control unit. For example, image recognition by matching could be used to identify a sign, and/or some set of parameters could be derived from the image/s and be compared to reference material indicative of a certain sign or gesture.
The specific technology used for identifying the sign or gesture based on one or more images is outside the scope of this invention.
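As a purely illustrative sketch (not part of the claimed subject matter, and with all names such as `match_sign` and `control_step` invented for this example), the obtain-identify-trigger sequence of figure 2 could look as follows; a trivial pixel-difference score stands in for whatever image-recognition technology is actually used:

```python
# Minimal sketch of the obtain (201) -> identify (202) -> trigger (203) loop.
# The recognition step is deliberately simplistic; the description leaves
# the concrete identification technology open.

def match_sign(image, templates):
    """Return the key of the best-matching template, or None.

    Stands in for any image-recognition step; here a plain
    pixel-difference score is used purely for illustration.
    """
    best_key, best_score = None, float("inf")
    for key, template in templates.items():
        score = sum(abs(a - b) for a, b in zip(image, template))
        if score < best_score:
            best_key, best_score = key, score
    # Reject matches that are too poor to be a known sign or gesture.
    return best_key if best_score < len(image) * 10 else None

def control_step(image, templates, trigger):
    """One iteration: identify a sign in the image and trigger its action."""
    sign = match_sign(image, templates)
    if sign is not None:
        trigger(sign)  # action 203: hand the identified sign to the station
    return sign
```

The rejection threshold illustrates the point made above that the predefined signs should be distinct and easily separable: an image that resembles no stored reference triggers nothing.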
Embodiments of the invention are intended for use when the milking station is set in a specific mode allowing this type of manual interaction with the milking station. An important property of such a mode could be that, according to its configuration, moving parts of the milking station are prevented from moving in or into an operation zone where the human operator is or may be located.
The identified sign or gesture made by the human body part would preferably be one out of a set of predefined signs or gestures, which are each assigned a specific meaning, e.g. an action, and stored e.g. in a database or register. The signs or gestures should preferably be selected such that they are distinct and easily separable, e.g. by a suitable image recognition processing unit.
The action triggered 203 to be performed by the milking station based on the identified sign or gesture would be an action that is desired by a human operator when standing/bending or squatting in a position for interacting with the milking station. Examples of such actions are release of a teat cup from a stored position; activation or deactivation of a vacuum associated with one or more teat cups; opening or closing of a gate of the milking station; starting of a cleaning procedure, and provision of feed in a feed station associated with the milking station. The triggering 203 of an action to be performed may be implemented in different ways. For example, the control unit could indicate, e.g. in a message, the desired action to another entity controlling a specific part or function of the milking station concerned by the desired action. By desired action is here meant the action associated with the identified 202 sign or gesture. Alternatively, the control unit is in control of the specific part or function of the milking station concerned by the desired action, and could then send an instruction or other execution signal to the part in question.
As previously mentioned, a milking station typically comprises parts which move during an automatic milking procedure, e.g. a robot arm or similar. In order to make sure that these parts do not injure a human operator interacting with the milking station, the movement of these parts could be restricted during the time the milking station is set in a specific mode, such as a specific manual operation mode. For example, the milking station may be configured to park and keep all moving parts in a parked position when being set in such a specific mode. Alternatively, the milking station may be configured to refrain from actions which imply that the moving parts enter a certain region, space or zone where the human operator would be located when e.g. manually attaching teat cups. Such a region, space or zone could be defined in a suitable coordinate system known to a unit controlling the moving parts, and the definition could be stored such that it is accessible by said unit.
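The restricted-zone principle could, under the simplifying assumption of an axis-aligned coordinate box (the box dimensions, names and helper functions below are all invented for this sketch), be expressed as:

```python
# Illustrative check that a planned robot-arm target position stays outside
# a protected operator zone while the station is in a specific manual mode.
# The zone definition is hypothetical; a real station would store this in
# whatever coordinate system the motion controller uses.

OPERATOR_ZONE = {"x": (0.0, 1.2), "y": (0.0, 0.8), "z": (0.0, 2.0)}  # metres

def inside_zone(point, zone=OPERATOR_ZONE):
    """True if the point lies within the protected region on all axes."""
    return all(zone[axis][0] <= point[axis] <= zone[axis][1]
               for axis in ("x", "y", "z"))

def movement_allowed(target, manual_mode):
    """In the specific manual mode, refuse any target inside the zone."""
    if manual_mode and inside_zone(target):
        return False
    return True
```

In automatic mode the same target would be accepted, which mirrors the text above: the restriction applies only while the specific mode is active.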
Figure 3 shows an exemplifying method embodiment for control of a milking station. The method illustrated in figure 3 comprises an action 301 of determining whether or not the milking station is in a specific mode, i.e. a mode allowing human-machine interaction according to embodiments of the invention. In order to be able to do this, an indication may be obtained which indicates that at least part of the milking station is set in a specific mode, in which the movement of moving parts is restricted or prevented. The action 301 is optional, e.g. since this could be solved in other ways, or not be needed for certain milking stations due to their design. For completeness, the method in figure 3 also comprises an action 308, "regular operation", which is to be performed when the milking station is found not to be set in the specific mode. This "regular operation" would be the actions and configuration associated with whichever mode the milking station is in. The method in figure 3 also comprises an action 302 of triggering adjustment of the direction of the sensor, such that it is directed towards a specified region, space or zone. The specified region, space or zone would be a region where a human operator is expected to interact with the milking station by means of a body part. This action 302 is optional, since the sensor may be directed in an adequate direction by default, e.g. when a moving robot arm of the milking station on which the sensor is fastened is in a parked position. A suitable direction for a sensor would e.g. be one giving a field of view comprising an area where a human operator could easily move a hand when in a working position. The moving robot arm could be constrained to remaining in a parked position during the period when the milking station is set in a specific manual mode. However, when desired or required, the direction of the sensor could be adjusted such that it is directed towards a specific area of interest.
This adjustment could be triggered by the milking station, or at least a part thereof, being set in a specific mode 301. This redirection may be needed, since the sensor is mounted on the milking station in order to also fulfil another purpose, namely identifying a non-human animal part. Actions 303-306 in figure 3 correspond to actions 201-203 in figure 2. However, in figure 3, an action 304 of detecting a human body part in the image is explicitly illustrated. This action may be part of an implementation of action 305 of identifying a sign or gesture made by a human body part based on an obtained image. The exemplifying method illustrated in figure 3 also comprises an action 307 of verifying the identified sign or gesture. If it is desired to have an extra level of control before triggering any action, it could be verified in action 307 that the right sign or gesture has been identified, and thereby also that the right action is triggered. The action 307 could be implemented by identifying a second sign or gesture based on, e.g. in, another image or series of images than the one/s based on (e.g. in) which the first sign or gesture was identified. In other words, instead of only identifying one sign or gesture, two correlated signs or gestures are identified. The two signs or gestures could be identical, or be recognised as a series of signs/gestures that together are associated with a certain action. The verification could be a feature that is selected by the user, or that is set e.g. by the manufacturer. In order for a human interacting with the milking station to know when a sign/gesture has been identified and needs to be verified, some variant of feedback could be used. For example, a diode could be lit. Another possibility is to configure the interpretation of a sign or gesture to depend on a current state or position of an item, such as whether a teat cup is in a released or retracted position.
When the teat cup is in a retracted position, a sign is interpreted as "release teat cup", and when the teat cup is in a released position, the same sign is interpreted as "retract teat cup". This principle is applicable also for e.g. vacuum on/off and gates open/close.
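The state-dependent interpretation described above can be sketched as follows; the `TeatCup` class, its state names and the sign identifier are all hypothetical, introduced only to illustrate how one and the same sign toggles between two actions:

```python
# Sketch of state-dependent interpretation of a single sign: the meaning
# of the sign depends on the current teat-cup state, so one sign suffices
# for both "release" and "retract".

class TeatCup:
    def __init__(self):
        self.state = "retracted"

    def interpret(self, sign):
        """Map a 'toggle' sign onto release/retract based on current state."""
        if sign != "toggle_cup":
            return None  # not a sign this item reacts to
        if self.state == "retracted":
            self.state = "released"
            return "release teat cup"
        self.state = "retracted"
        return "retract teat cup"
```

The same pattern would apply to a vacuum valve (on/off) or a gate (open/close), halving the number of distinct signs the operator must learn.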
Examples of different signs made by a human hand are shown in figure 4. Naturally, more and/or other signs could be used than the ones exemplified in figure 4. Below is a table showing an exemplifying association between the signs illustrated in figure 4 and actions which may be triggered by a control unit upon identification of the sign in question.
401 Release teat cup #1 from stored position
402 Release teat cup #2 from stored position
403 Release teat cup #3 from stored position
404 Release teat cup #4 from stored position
405 Release all teat cups from stored position
406 Activate vacuum (could be combined with 401-405)
407 Deactivate vacuum (could be combined with 401-405)
408 Open gate A
409 Close gate A
410 Start cleaning procedure
Table 1
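The association of Table 1 could, as a hedged illustration, be held in a simple lookup table; the sign identifiers 401-410 follow figure 4, the action strings paraphrase the table, and the `dispatch` helper is invented for this sketch:

```python
# Table 1 rendered as a lookup table, mapping identified sign IDs
# (figure 4) to the actions a control unit could trigger.

SIGN_ACTIONS = {
    401: "release teat cup #1 from stored position",
    402: "release teat cup #2 from stored position",
    403: "release teat cup #3 from stored position",
    404: "release teat cup #4 from stored position",
    405: "release all teat cups from stored position",
    406: "activate vacuum",
    407: "deactivate vacuum",
    408: "open gate A",
    409: "close gate A",
    410: "start cleaning procedure",
}

def dispatch(sign_id):
    """Return the action associated with an identified sign, else None."""
    return SIGN_ACTIONS.get(sign_id)
```

An unrecognised sign identifier yields no action, which matches the safety-oriented behaviour implied above: the station does nothing unless a predefined sign is positively identified.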
Exemplifying embodiments of control unit, figures 5a-5c
An exemplifying embodiment of a control unit is illustrated in a general manner in figure 5a. The control unit is operable to control a milking station and may be assumed to be operable to obtain information related to the milking station and to trigger certain operations or actions to be performed by the milking station, which have been described above, and which will be briefly described below. The milking station which the control unit is operable to control may be assumed to comprise, i.e. be equipped with, a sensor configured for detection of a non-human animal part. The control unit is preferably comprised in the milking station, e.g. as a module or a part of the milking station, but could alternatively be external to the milking station. For example, the control unit could be a part of a central system or arrangement for controlling a plurality of milking stations. The control unit may alternatively be denoted e.g. "control device". The communication between the control unit and parts of the milking station may be performed over a state of the art wireless and/or wired interface. The control unit 500 is configured to perform the actions of at least one of the method embodiments described above with reference to any of figures 2-4. The control unit 500 is associated with the same technical features, objects and advantages as the previously described method embodiments. The control unit will be described in brief in order to avoid unnecessary repetition.
The control unit may be implemented and/or described as follows:
The control unit 500 comprises processing circuitry 501 and a communication interface 502. The processing circuitry 501 is configured to cause the control unit 500 to obtain at least one image from a sensor configured for detection of a non-human animal part in a milking station. The processing circuitry 501 is further configured to cause the control unit 500 to identify a sign or gesture made by a human body part within said at least one image; and further to trigger an action to be performed by the milking station based on the identified sign or gesture. Thereby, the control unit is configured to provide a man-machine interface between a human operator and the milking station by use of said sensor. The communication interface 502, which may also be denoted e.g. Input/Output (I/O) interface, includes a wired and/or a wireless interface for sending data, such as commands, to other nodes or entities, e.g. of the milking station; and for receiving information from other nodes or entities, such as a sensor of the milking station.
Figure 5b shows an embodiment of the processing circuitry 501 which comprises a processing device 503, such as a general-purpose microprocessor, e.g. a CPU, and a memory 504, in communication with the processing device, that stores or holds instruction code readable and executable by the processing device. The instruction code stored or held in the memory may be in the form of a computer program 505, which when executed by the processing device 503 causes the control unit 500 to perform the actions in the manner described above. An alternative implementation of the processing circuitry 501 is shown in figure 5c. The processing circuitry here comprises an obtaining unit 507 for causing the control unit to obtain at least one image from a sensor configured for detection of a non-human animal part in a milking station. The processing circuitry further comprises an identifying unit 509 for causing the control unit to identify a sign or gesture made by a human body part within said at least one image. The processing circuitry further comprises a triggering unit 510, for causing the control unit 500 to trigger an action to be performed by the milking station based on the identified sign or gesture.
The processing circuitry 501 could comprise more units configured to cause the control unit to perform actions associated with one or more of the method embodiments described herein. As examples, units 506 and 508 are provided, having dashed outlines. Alternatively, any of the units 507, 509-510 could be configured to also cause the control unit to perform such other actions. The control unit 500 could, for example, comprise a determining unit 506 for determining whether the milking station is set in a specific mode, allowing for human interaction via the sensor. The control unit 500 could further comprise a detection unit 508, for detecting a sign or gesture in the at least one obtained image, e.g. before a sign or gesture from within a set of predefined gestures is identified by the identifying unit 509. This, and other tasks, could alternatively be performed by one of the other units.
The control unit 500 may comprise further functionality, for carrying out control unit functions not specifically mentioned herein, related e.g. to standard operation of the milking station. The foregoing description of a control unit 500 is not intended to be limiting. The processing circuitry may also be implemented by other techniques known in the art, such as, e.g., hardwired transistor logic or application-specific integrated circuits arranged in a manner sufficient to carry out the actions of the control unit 500 as described above.
Exemplifying embodiments of arrangement, figure 6
Figure 6 illustrates an arrangement 600 for feeding management of an animal 603 (the animal not comprised in the arrangement). The arrangement comprises a control unit 601, as the ones described above, and an identification unit 606 configured to provide a unique identification of the animal 603, e.g. to the control unit. The arrangement also comprises a BCS device 604, configured to automatically derive a BCS estimate of the animal based e.g. on a two- or three-dimensional image, and to provide the BCS estimate to the control unit 601. The identification unit 606 could alternatively be denoted e.g. "ID reader", and may derive the identity of the animal e.g. by reading an RFID tag 602 attached to the animal, or by some other known method for identifying animals. The identification unit then provides the unique identification of the animal to the control unit or to the BCS device, depending on implementation.
The BCS device 604 preferably comprises a sensor for capturing a two- or three-dimensional image of an animal, based on which the BCS estimate may be derived. The sensor may be optical, using reflection of light of suitable frequency, such as a time of flight camera (3D), but could alternatively use e.g. ultrasound to obtain images. The BCS device may comprise more than one sensor (not illustrated). For certain animal types, such as goats, more than one sensor may be used, since the BCS of goats typically includes evaluation of both the lower back (as for cows) and the sternum area. The BCS device should be located and mounted such that images can be obtained of the relevant parts of the animals in question, e.g. at a milking stall where animals stand while being milked.
The arrangement 600 may further comprise other devices, such as feeding equipment 605. This could be a complete feeding system or an automatic feed dispenser. The feeding equipment 605 could then obtain indications or instructions from the control unit 601 and execute a change of the feed composition provided to the animal. Alternatively, the feeding equipment 605 may obtain information from a database 608 concerning the change of feed composition, where the database 608 comprises information or instructions provided by the control unit 601.
The arrangement 600 could also comprise or be operatively connected to entities such as a PC 607 or to a network 609, such as the Internet and/or radio access systems, with all the possibilities associated with such networks in terms of connecting to mobile and remote devices 510.
To summarize, the steps, functions, procedures, modules, units and/or blocks described herein may be implemented in hardware using any conventional technology, such as discrete circuit or integrated circuit technology, including both general-purpose electronic circuitry and application-specific circuitry. Alternatively, at least some of the steps, functions, procedures, modules, units and/or blocks described above may be implemented in software such as a computer program for execution by suitable processing circuitry including one or more processing units. The software could be carried by a carrier, such as an electronic signal, an optical signal, a radio signal, or a computer readable storage medium before and/or during the use of the computer program in the nodes.
The flow diagram or diagrams presented herein may be regarded as a computer flow diagram or diagrams, when performed by one or more processors. A corresponding apparatus may be defined as a group of function modules, where each step performed by the processor corresponds to a function module. In this case, the function modules are implemented as a computer program running on the processor.
It should also be understood that it may be possible to re-use the general processing capabilities of any conventional device or unit in which the proposed technology is implemented. It may also be possible to re-use existing software, e.g. by reprogramming of the existing software or by adding new software components.
The embodiments described above are merely given as examples, and it should be understood that the proposed technology is not limited thereto. It will be understood by those skilled in the art that various modifications, combinations and changes may be made to the embodiments without departing from the present scope. In particular, different part solutions in the different embodiments can be combined in other configurations, where technically possible.
When using the word "comprise" or "comprising" it shall be interpreted as non-limiting, i.e. meaning "consist at least of". It should also be noted that in some alternate implementations, the functions/acts noted in the blocks may occur out of the order noted in the flowcharts. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved. Moreover, the functionality of a given block of the flowcharts and/or block diagrams may be separated into multiple blocks and/or the functionality of two or more blocks of the flowcharts and/or block diagrams may be at least partially integrated. Finally, other blocks may be added/inserted between the blocks that are illustrated, and/or blocks/operations may be omitted without departing from the scope of inventive concepts. It is to be understood that the choice of interacting units, as well as the naming of the units within this disclosure are only for exemplifying purpose, and nodes suitable to execute any of the methods described above may be configured in a plurality of alternative ways in order to be able to execute the suggested procedure actions. It should also be noted that the units described in this disclosure are to be regarded as logical entities and not with necessity as separate physical entities.

Claims

1. A method to control a milking station comprising a sensor configured for detection of a non-human animal part, the method comprising: during a time period when the sensor is not used for detection of a non-human animal part:
-obtaining at least one image from the sensor;
-identifying a sign or gesture made by a human body part within said at least one image;
-triggering an action to be performed by the milking station based on the identified sign or gesture;
thus providing a man-machine interface by use of said sensor.
2. Method according to claim 1, wherein the sensor is configured for teat detection, i.e., the non-human animal part is an udder or a teat.
3. Method according to claim 1 or 2, wherein the sensor is a camera.
4. Method according to any of claims 1-3, wherein the time period when the sensor is not used for detection of a non-human animal part corresponds to a time period when the milking station is set in a specific mode.
5. Method according to any of the preceding claims, wherein the triggered action is one of:
-release of a teat cup from a stored position;
-activation of a vacuum;
-deactivation of a vacuum;
-opening of a gate of the milking station;
-closing of a gate of the milking station;
-starting of a cleaning procedure;
-provision of feed in a feed station comprised in the milking station.
6. Method according to any of the preceding claims, further comprising:
-determining that at least part of the milking station is set in a specific mode.
7. Method according to any of the preceding claims, further comprising:
-triggering adjustment of the direction of the sensor, such that it is directed towards a specified region, space or zone.
8. Method according to any of the preceding claims, wherein the milking station comprises moving parts, and wherein the method further comprises:
-obtaining an indication that at least part of the milking station is set in a specific mode, in which the moving parts are prevented from moving i) at all, or ii) in a specified region, space or zone.
9. Method according to any of the preceding claims, wherein a plurality of images are obtained from the sensor, and the method further comprises:
before triggering the action to be performed:
-verifying the identified sign or gesture by identifying a consecutive, second sign or gesture made by the human body part within said obtained images.
10. Method according to any of the preceding claims, wherein the sensor is located on a robot arm comprised in the milking station.
11. A control unit (500) operable to control a milking station, said milking station comprising a sensor configured for detection of a non-human animal part, the control unit being configured to: during a time period when the sensor is not used for detection of a non-human animal part:
-obtain at least one image from the sensor;
-identify a sign or gesture made by a human body part within said at least one image;
-trigger an action to be performed by the milking station based on the identified sign or gesture;
thus providing a man-machine interface by use of said sensor.
12. The control unit according to claim 11, wherein the non-human animal part which the sensor is configured to detect is an udder or a teat.
13. The control unit according to claim 11 or 12, wherein the sensor is a camera.
14. The control unit according to any of claims 11-13, wherein the time period when the sensor is not used for detection of a non-human animal part corresponds to a time period when the milking station is set in a specific mode.
15. The control unit according to any of claims 11-14, being configured to trigger one of the following actions to be performed by the milking station based on the identified sign or gesture:
-release of a teat cup from a stored position;
-activation of a vacuum;
-deactivation of a vacuum;
-opening of a gate of the milking station;
-closing of a gate of the milking station;
-starting of a cleaning procedure;
-provision of feed in a feed station comprised in the milking station.
16. The control unit according to any of claims 11-15, being further configured to:
-determine that at least part of the milking station is set in a specific mode.
17. The control unit according to any of claims 11-16, being further configured to:
-trigger an adjustment of the direction of the sensor, such that it is directed towards a specified region, space or zone.
18. The control unit according to any of claims 11-17, wherein the milking station comprises moving parts, and wherein the control unit is further configured to:
-obtain an indication that at least part of the milking station is set in a specific mode, according to which the moving parts are prevented from moving i) at all, or ii) in a specified region, space or zone.
19. The control unit according to any of claims 11-18, being configured to obtain a plurality of images from the sensor, and being further configured to:
before triggering the action to be performed:
-verify the identified sign or gesture by identifying a consecutive, second sign or gesture made by the human body part within said obtained images.
20. Milking station comprising a sensor configured for detection of a non-human animal part, and further comprising a control unit according to any of claims 11-19.
21. Computer program (505) comprising instructions which, when executed on at least one processor, cause the at least one processor to carry out the method according to any of claims 1-10.
22. A carrier containing the computer program of claim 21, wherein the carrier is one of an electronic signal, optical signal, radio signal, or computer readable storage medium.
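The control flow recited in claims 1, 5 and 9 — obtaining an image while the sensor is not needed for animal-part detection, identifying a sign or gesture, verifying it with a consecutive second sign, and triggering a station action — can be sketched as follows. This is a minimal illustrative sketch only: every name (`control_step`, `ACTIONS`, the gesture labels, and the `sensor`/`recognizer`/`station` interfaces) is a hypothetical assumption, not something prescribed by the claims.

```python
# Illustrative sketch of the claimed control loop (claims 1, 5 and 9).
# All names and the gesture-to-action mapping are hypothetical.

# Assumed mapping from recognized signs to milking-station actions
# (the action list follows claim 5).
ACTIONS = {
    "open_palm": "open_gate",
    "closed_fist": "close_gate",
    "thumbs_up": "start_cleaning",
}

def control_step(sensor, recognizer, station):
    """One pass of the man-machine interface loop.

    Runs only while the sensor is not used for detection of a
    non-human animal part (claim 1), and verifies the identified
    sign with a consecutive second sign before triggering the
    action (claim 9). Returns the triggered action, or None.
    """
    if station.in_teat_detection_mode():
        return None  # sensor is reserved for teat detection

    first = recognizer.identify(sensor.capture_image())
    if first not in ACTIONS:
        return None

    # Verification step: require the same sign in a second image.
    second = recognizer.identify(sensor.capture_image())
    if second != first:
        return None

    action = ACTIONS[first]
    station.perform(action)  # e.g. open a gate, start cleaning
    return action
```

In this sketch the verification step simply requires the same sign twice in succession; the claims equally cover a distinct second sign, which would only change the comparison in the middle of the function.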
PCT/SE2018/050661, filed 2018-06-20 (priority date 2017-06-27), Control of a milking station, published as WO2019004902A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SE1750825-0 2017-06-27
SE1750825 2017-06-27

Publications (1)

Publication Number Publication Date
WO2019004902A1 (en) 2019-01-03

Family

ID=62778975

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SE2018/050661 WO2019004902A1 (en) 2017-06-27 2018-06-20 Control of a milking station

Country Status (1)

Country Link
WO (1) WO2019004902A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2716059C1 (en) * 2019-06-14 2020-03-05 Открытое акционерное общество "Авангард" System for intelligent control and monitoring of parameters and operating modes of machines and equipment of milk production farms

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009057996A1 (en) * 2007-10-30 2009-05-07 Lely Patent N.V. Method of and device for connecting a teat cup
US20150020739A1 (en) * 2012-03-14 2015-01-22 Gea Farm Technologies Gmbh Milking parlor arrangement with an inner robot device
WO2017051403A1 (en) * 2015-09-21 2017-03-30 Dairymaster A milking system and a method for operating a milking system


Legal Events

Code  Description
121   EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 18734987; Country of ref document: EP; Kind code of ref document: A1)
NENP  Non-entry into the national phase (Ref country code: DE)
122   EP: PCT application non-entry in European phase (Ref document number: 18734987; Country of ref document: EP; Kind code of ref document: A1)