EP3426020A1 - Control system for automatic milking installation and method of controlling automatic milking installation - Google Patents

Control system for automatic milking installation and method of controlling automatic milking installation

Info

Publication number
EP3426020A1
Authority
EP
European Patent Office
Prior art keywords
transparent cover
cover surface
dirt
amount
graphical
Legal status
Granted
Application number
EP17712260.3A
Other languages
German (de)
French (fr)
Other versions
EP3426020B1 (en)
Inventor
Anders HALLSTRÖM
Current Assignee
DeLaval Holding AB
Original Assignee
DeLaval Holding AB
Application filed by DeLaval Holding AB
Publication of EP3426020A1
Application granted
Publication of EP3426020B1
Status: Active

Classifications

    • A: HUMAN NECESSITIES
    • A01: AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01J: MANUFACTURE OF DAIRY PRODUCTS
    • A01J 5/00: Milking machines or devices
    • A01J 5/017: Automatic attaching or detaching of clusters
    • A01J 5/0175: Attaching of clusters
    • A01J 5/007: Monitoring milking processes; Control or regulation of milking machines
    • A01J 7/00: Accessories for milking machines or devices
    • A01J 7/02: Accessories for milking machines or devices for cleaning or sanitising milking machines or devices
    • A01J 7/04: Accessories for milking machines or devices for treatment of udders or teats, e.g. for cleaning
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]


Abstract

An automatic milking installation is controlled by a system including: a graphical user interface (135) enabling user interaction with the automatic milking installation via at least one display view; an imaging unit (110) recording image data (Dimg) representing at least one portion of a dairy animal (A), the imaging unit (110) having a transparent cover surface (117) configured to protect an optics section (115) of the imaging unit (110) and through which transparent cover surface (117) the image data (Dimg) are recorded; and a control unit (120) receiving the recorded image data (Dimg), and based thereon, producing a control signal (S-Ctrl) for controlling functions of the automatic milking installation. The control unit (120) further processes the recorded image data (Dimg) to determine a parameter indicating an amount of dirt on the transparent cover surface (117), and generates graphics data (Dgr) for presentation to the user (U) via the graphical user interface (135), which graphics data (Dgr) contain at least one graphical element reflecting the amount of dirt on the transparent cover surface (117).

Description

Control System for Automatic Milking Installation and Method of Controlling Automatic Milking Installation
THE BACKGROUND OF THE INVENTION AND PRIOR ART
The present invention relates generally to solutions for controlling automatic milking installations. More particularly the invention relates to a system according to the preamble of claim 1 and a corresponding method. The invention also relates to a computer program and a processor-readable medium.
In modern milk production, an increasing number of functions rely on image data registered via one or more cameras of a vision system. Although this is generally an efficient and reliable approach, the image registration is also associated with problems. The farm environment is relatively dirty, and over time the cameras' optical systems risk being obstructed by dirt particles. This degrades performance, and if left unattended, a dirty lens will eventually prevent an intended function that relies on image data from being carried out.
WO 2010/031632 shows a solution for testing a camera arranged in an automated milking system. In particular, a camera test arrangement is provided for determining the performance of a camera arranged for use in an automated milking system. The camera test arrangement includes a camera test device with means for providing a repeatable test environment for the camera. The camera test arrangement further includes image processing means for processing data obtained from the camera when arranged in the camera test device and for determining the performance of the camera. This arrangement provides reliable yet inexpensive means for ensuring proper functioning of the camera.
PROBLEMS ASSOCIATED WITH THE PRIOR ART
Hence, a solution is known for detecting if the vision system of a milking installation needs cleaning. However, it is still a challenge to ensure that the operator remembers to actually perform the cleaning. Further, since many of the functions in a milking installation relying on image data do not forward any image data to the operator, there is no natural means for the operator to notice, or recall, that the vision system needs cleaning if, for some reason, this has been neglected.
SUMMARY OF THE INVENTION
The object of the present invention is therefore to offer an improved solution for prompting the operator of a milking installation to keep the vision system sufficiently clean to ensure the functionality of the system. According to one aspect of the invention, the object is achieved by the initially described system, wherein the control unit is configured to process the recorded image data to determine a parameter indicating an amount of dirt on the transparent cover surface. The control unit is further configured to generate graphics data for presentation to the user via the graphical user interface, which graphics data contain at least one graphical element reflecting the amount of dirt on the transparent cover surface.
This system is advantageous because it renders it impossible for the operator to interact with the system without being reminded to clean the cover surface of the imaging unit, should this be necessary.
According to one embodiment of this aspect of the invention, the at least one graphical element is configured to express the amount of dirt on the transparent cover surface proportionally, such that a relatively small amount of dirt is expressed as one or more graphical elements of comparatively small size, number and/or high transparency, and a relatively large amount of dirt is expressed as one or more graphical elements of comparatively large size, number and/or low transparency. Thereby, the user is informed about the gravity of the contamination in an intuitive and comprehensible manner. According to another embodiment of this aspect of the invention, if the control unit determines that the amount of dirt on the transparent cover surface is below a first threshold level, the graphics data reflect that the transparent cover surface is clean. Inevitably, there will always be some dirt on the transparent cover surface; however, as long as this does not influence the quality of the image data, there is no need to inform the user about it.
According to yet another embodiment of this aspect of the invention, if the control unit determines that the amount of dirt on the transparent cover surface is above a second threshold level, the graphics data reflect that the transparent cover surface is highly obstructed with dirt by the control unit being configured to cause the at least one graphical element to be presented in a cyclic manner alternating between first and second phases. Here, in the first phase, the at least one graphical element blocks a relatively large proportion of information in at least one of the at least one display view; and in the second phase, the at least one graphical element blocks a relatively small proportion of the information in said at least one display view. Consequently, it becomes evident to the user that the transparent cover surface is heavily soiled; and still, it is possible to interact via the user interface.
The at least one graphical element may be presented alternatingly in either the first or the second phase. However, preferably, the control unit is configured to generate the graphics data such that the at least one graphical element gradually transitions between the first and second phases, either in a stepwise manner or continuously.
According to a further embodiment of this aspect of the invention, the control unit is configured to generate the graphics data such that the at least one graphical element is superimposed on at least one first display view of the at least one display view. As a result, information in the at least one first display view is at least partially blocked from being visually inspected by the user. Hence, the impression of a dirty transparent cover surface is conveyed to the user in a highly intuitive manner. Preferably, to further enhance this impression, the at least one graphical element is arranged to mimic a number of dirt particles.
According to another embodiment of this aspect of the invention, the control unit is configured to repeatedly update the determining of the parameter indicating the amount of dirt on the transparent cover surface; and repeatedly generate updatings of the at least one graphical element based on the updated determining of the parameter indicating the amount of dirt on the transparent cover surface. In other words, if the degree of dirtiness increases, the at least one graphical element reflects this, e.g. by increasing in size and/or number and/or with decreasing transparency. Conversely, if the amount of dirt on the transparent cover surface is determined to be below the first threshold level (i.e. the typical result of manual cleaning), the graphics data reflect that the transparent cover surface is clean, e.g. by not presenting any graphical elements. This is advantageous because the user is provided with immediate feedback on the quality of any cleaning performed.
According to yet another embodiment of this aspect of the invention, if the system comprises two or more imaging units, the control unit is configured to generate the graphics data such that the at least one graphical element is presented in the graphical user interface in a manner associating the determined amount of dirt to the specific imaging unit of said imaging units on whose transparent cover surface the amount of dirt has been determined. For example, if an imaging unit employed at a particular milking point has a dirty transparent cover surface, the at least one graphical element is presented in a graphical user interface associated with that milking point. Naturally, this facilitates identification of the imaging unit in need of cleaning. According to one embodiment of this aspect of the invention, the system includes a user-manipulable input member configured to receive a user command, which, when received by the control unit, temporarily prevents the at least one graphical element from being presented via the graphical user interface. Thus, for example as long as the user holds down a button or a key, or activates an on-screen button on a touchscreen, he/she can obtain an unobstructed view of any payload data shown via the graphical user interface, even if the transparent cover surface has been determined to be relatively dirty.
According to another aspect of the invention, the object is achieved by the method described initially, wherein the recorded image data are processed to determine a parameter indicating an amount of dirt on the transparent cover surface. Furthermore, the graphics data are generated for presentation to the user via the graphical user interface, which graphics data contain at least one graphical element reflecting the amount of dirt on the transparent cover surface. The advantages of this method, as well as the preferred embodiments thereof, are apparent from the discussion above with reference to the proposed system.
According to a further aspect of the invention, the object is achieved by a computer program loadable into the memory of at least one processor, which includes software adapted to implement the method proposed above when the program is run on the at least one processor.
According to another aspect of the invention, the object is achieved by a processor-readable medium having a program recorded thereon, where the program is to control at least one processor to perform the method proposed above when the program is loaded into the at least one processor.
Further advantages, beneficial features and applications of the present invention will be apparent from the following description and the dependent claims.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention is now to be explained more closely by means of preferred embodiments, which are disclosed as examples, and with reference to the attached drawings.
Figure 1 illustrates how an imaging unit records image data containing data representing a portion of an animal;
Figure 2 shows a user interacting with a graphical user interface of the proposed system;
Figures 3-7 illustrate the graphical user interface according to embodiments of the invention; and
Figure 8 illustrates, by means of a flow diagram, the general method according to the invention.
DESCRIPTION OF PREFERRED EMBODIMENTS OF THE INVENTION
Figure 1 illustrates how an imaging unit 110 records image data Dimg containing data representing a portion of an animal A. Naturally, the recorded image data Dimg inevitably contain a relatively large amount of data in addition to said data representing the portion of the animal A. For example, the recorded image data Dimg also include data reflecting the presence of any dirt particles located within a view field of the imaging unit 110.
Figure 2 shows a user U interacting with a graphical user interface 135 of the proposed system for controlling an automatic milking installation.
The graphical user interface 135 is preferably implemented by a display unit 130. The graphical user interface 135 is configured to enable the user U to interact with the automatic milking installation via at least one display view DV1, DV2 and/or DV3. In addition to the graphical user interface 135, the system includes an imaging unit 110 and a control unit 120. The imaging unit 110 is configured to record image data Dimg containing data representing at least one portion of a dairy animal A, say the animal's A udder and its teats. The imaging unit 110 has a transparent cover surface 117 configured to protect an optics section 115 of the imaging unit 110. The transparent cover surface 117 is transparent with respect to a range of wavelengths within which the imaging unit 110 is operative to record the image data Dimg. For example, if the imaging unit 110 is configured to record the image data Dimg from a first wavelength (say in the infrared part of the light spectrum) to a second wavelength (say in the ultraviolet part of the light spectrum), the transparent cover surface 117 is transparent at least with respect to a spectrum of electromagnetic energy ranging between the first and second wavelengths. The control unit 120 is configured to receive the recorded image data Dimg, and based thereon, produce at least one control signal S-Ctrl arranged to control at least one function of the automatic milking installation. Further, the control unit 120 is configured to process the recorded image data Dimg to determine a parameter indicating an amount of dirt on the transparent cover surface 117. For example, the technique described in WO 2010/031632 (hereby incorporated by reference) can be used to determine the parameter in question. Additionally, the control unit 120 is configured to generate graphics data Dgr for presentation to the user U via the graphical user interface 135, which graphics data Dgr contain at least one graphical element reflecting the amount of dirt on the transparent cover surface 117.
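By way of illustration only (this sketch is not part of the published application), the following Python fragment indicates how a control unit of the kind described above might derive both a control signal and a dirt parameter from the recorded image data Dimg. The dirt estimate used here, the fraction of pixels that stay dark over several recent frames, is merely an assumed stand-in; the patent defers the actual detection technique to WO 2010/031632, and all function and field names are invented for this example.

```python
# Minimal sketch of the control-unit data flow: recorded image data Dimg in,
# a control signal S-ctrl and a dirt parameter out. The dirt estimate is a
# stand-in heuristic, not the technique of WO 2010/031632; names are illustrative.
from dataclasses import dataclass

import numpy as np


@dataclass
class ControlSignal:
    """S-ctrl: here simply a target point for the teat-cup attachment arm."""
    target_x: float
    target_y: float


def estimate_dirt_parameter(recent_frames: np.ndarray, dark_level: float = 0.15) -> float:
    """Return a dirt parameter in [0, 1] for the transparent cover surface.

    Assumption: pixels that stay dark in every recent frame are blocked by dirt
    sitting on the cover, whereas the moving animal does not darken the same
    pixels permanently.
    """
    persistently_dark = np.all(recent_frames < dark_level, axis=0)
    return float(persistently_dark.mean())


def produce_control_signal(frame: np.ndarray) -> ControlSignal:
    """Derive a (toy) control signal from the brightest region of the frame."""
    y, x = np.unravel_index(np.argmax(frame), frame.shape)
    return ControlSignal(target_x=float(x), target_y=float(y))


if __name__ == "__main__":
    frames = np.random.rand(10, 120, 160)      # ten synthetic frames in [0, 1)
    frames[:, 40:60, 50:70] = 0.05             # a persistently dark patch: "dirt"
    print("dirt parameter:", estimate_dirt_parameter(frames))
    print("control signal:", produce_control_signal(frames[-1]))
```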
Figures 3-7 illustrate the graphical user interface 135 according to embodiments of the invention. Specifically, Figure 3 shows one example of how the graphical user interface 135 can be implemented to show display views DV1, DV2 and DV3 respectively, each of which represents a particular function of the automatic milking installation. Figure 3 illustrates a situation where no dirt has been determined to be located on the transparent cover surface 117, or at least an amount of dirt below or equal to a first threshold level has been determined. Namely, in such a case the graphics data Dgr reflect that the transparent cover surface 117 is clean, e.g. by leaving a display window DV1 associated with an imaging unit 110 completely unaffected (300). In other words, the display window DV1 shows no signs of the transparent cover surface 117 being dirty.
According to one embodiment of the invention, if an amount of dirt above the first threshold level has been determined on the transparent cover surface 117, the control unit 120 is configured to generate the graphics data Dgr such that the graphics data Dgr reflect that the transparent cover surface 117 is dirty. Preferably, at least one graphical element in the graphics data Dgr is configured to express the amount of dirt on the transparent cover surface 117 proportionally. This means that a relatively small amount of dirt can be expressed as one or more graphical elements of comparatively small size, number and/or high transparency, and a relatively large amount of dirt is expressed as one or more graphical elements of comparatively large size, number and/or low transparency. Figure 4 shows an example where a comparatively large number of graphical elements 400 are included in the graphics data Dgr in a display window DV1 to reflect the fact that the transparent cover surface 117 is relatively dirty.
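A minimal sketch of the proportional mapping just described, assuming illustrative thresholds and scaling factors: a small dirt parameter yields few, small, highly transparent mimic-dirt elements, a large parameter yields many, larger, nearly opaque ones, and below the first threshold no elements are generated at all. The names and numbers are assumptions of this example, not values taken from the patent.

```python
# Sketch of the proportional mapping: dirt parameter -> number, size and
# transparency of mimic-dirt overlay elements. Thresholds and factors are assumed.
from dataclasses import dataclass
import random

FIRST_THRESHOLD = 0.02   # below this the cover is presented as clean


@dataclass
class OverlayElement:
    x: float        # position within the display view, 0..1
    y: float
    radius: float   # size of the mimic-dirt particle
    opacity: float  # 0 = fully transparent, 1 = opaque


def dirt_overlay(dirt: float, seed: int = 0) -> list[OverlayElement]:
    """Map the dirt parameter (0..1) onto a list of mimic-dirt particles."""
    if dirt <= FIRST_THRESHOLD:
        return []                                  # clean: no graphical elements at all
    rng = random.Random(seed)
    count = max(1, int(dirt * 150))                # more dirt -> more particles
    size = 0.01 + 0.04 * dirt                      # more dirt -> larger particles
    opacity = min(1.0, 0.2 + 0.8 * dirt)           # more dirt -> less transparent
    return [OverlayElement(rng.random(), rng.random(), size, opacity) for _ in range(count)]


print(len(dirt_overlay(0.01)), len(dirt_overlay(0.1)), len(dirt_overlay(0.6)))  # 0 15 90
```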
Preferably, if the automatic milking system includes two or more imaging units 110, the control unit 120 is configured to generate the graphics data Dgr such that the graphical elements are presented in the graphical user interface 135 in a manner associating the determined amount of dirt to the specific imaging unit 110 of said imaging units 110 on whose transparent cover surface 117 the amount of dirt has been determined. For example, the graphical elements 400 presented in display view DV1 of the graphical user interface 135 may indicate that the transparent cover surface 117 of a first imaging unit has a medium degree of contamination, whereas the fact that display views DV2 and DV3 do not include any obstructing graphical elements at all may indicate that any associated imaging units have clean transparent cover surfaces 117, or at least that the degree of contamination is below the first threshold level.
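The association between a determined amount of dirt and a specific imaging unit can be sketched as a simple lookup from imaging-unit identifiers to display views, so that the warning overlay appears only in the view of the milking point whose cover is dirty. The identifiers and view names below are assumptions made for this example.

```python
# Sketch of associating the determined amount of dirt with a specific imaging unit.
FIRST_THRESHOLD = 0.02

# hypothetical mapping: imaging unit id -> display view in the graphical user interface
VIEW_OF_UNIT = {"unit-1": "DV1", "unit-2": "DV2", "unit-3": "DV3"}


def overlays_per_view(dirt_by_unit: dict[str, float]) -> dict[str, float]:
    """Return, per display view, the dirt parameter to visualise (0.0 = show nothing)."""
    return {
        VIEW_OF_UNIT[unit]: (dirt if dirt > FIRST_THRESHOLD else 0.0)
        for unit, dirt in dirt_by_unit.items()
    }


# unit-1 has a moderately dirty cover; the others are effectively clean
print(overlays_per_view({"unit-1": 0.35, "unit-2": 0.01, "unit-3": 0.0}))
# {'DV1': 0.35, 'DV2': 0.0, 'DV3': 0.0}
```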
Figure 5 shows an example where somewhat fewer graphical elements 500 are included in the graphics data Dgr to reflect the fact that the transparent cover surface 117 is somewhat less dirty. Here, the graphical elements 500 are spread out over the entire graphical user interface 135, inter alia covering display views DV1, DV2 and DV3.
In Figure 6, we see another example, where a very large number of graphical elements 600 are included in the graphics data Dgr in the display view DV1 to reflect the fact that the transparent cover surface 117 is highly obstructed with dirt. Such a high density of graphical elements 600 may render it impossible for a user to extract any information from the display view DV1. Therefore, according to one embodiment of the invention, if the control unit 120 determines that the amount of dirt on the transparent cover surface 117 is above a second threshold level, the control unit 120 is configured to cause graphical elements 600 to be presented in a cyclic manner alternating between first and second phases. In the first phase the graphical elements 600 block a relatively large proportion of information in at least one display view, say DV1; and in the second phase, the graphical elements block a relatively small proportion of the information in the display view DV1, for example as 400 in Figure 4. Thereby, the graphics data Dgr can reflect that the transparent cover surface 117 is highly obstructed with dirt, and at the same time, the user U can extract information from the display view DV1 in question.
The control unit 120 may be configured to generate the graphics data Dgr such that the at least one graphical element is presented in either the first or the second phase in an alternating manner. Preferably, however, the control unit 120 is configured to generate the graphics data Dgr such that the at least one graphical element gradually transitions between the first and second phases, either stepwise, or in a continuous manner.
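A possible realisation of the cyclic first/second-phase presentation, with a continuous (raised-cosine) transition between a high-blocking and a low-blocking state once the dirt parameter exceeds the second threshold, could look as follows. The period, threshold and blocking levels are illustrative assumptions, not values from the patent.

```python
# Sketch of the cyclic first/second-phase presentation used above the second
# threshold: the proportion of the display view blocked by the overlay oscillates
# continuously, so heavy soiling stays obvious while the view remains readable
# part of the time. Period and levels are assumed.
import math

SECOND_THRESHOLD = 0.5
PERIOD_S = 4.0          # one full first -> second -> first cycle every four seconds
HIGH_BLOCK = 0.9        # first phase: overlay blocks most of the view
LOW_BLOCK = 0.2         # second phase: overlay blocks only a small proportion


def overlay_opacity(dirt: float, t_seconds: float) -> float:
    """Opacity of the dirt overlay at time t (0 = invisible, 1 = opaque)."""
    if dirt <= SECOND_THRESHOLD:
        return min(1.0, dirt)                      # steady, proportional presentation
    # raised-cosine modulation gives a smooth, continuous phase transition
    phase = 0.5 * (1.0 + math.cos(2.0 * math.pi * t_seconds / PERIOD_S))
    return LOW_BLOCK + (HIGH_BLOCK - LOW_BLOCK) * phase


for t in (0.0, 1.0, 2.0, 3.0):
    print(t, round(overlay_opacity(0.8, t), 2))    # 0.9, 0.55, 0.2, 0.55
```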
Especially if the transparent cover surface 117 is highly obstructed with dirt, it may be difficult for the user U to extract information via the graphical user interface 135. Preferably, therefore, the system may include a user-manipulable input member 210 for inhibiting presentation of the at least one graphical element. More precisely, the user-manipulable input member 210 is configured to forward a user command CMD (e.g. generated in response to activation of an on-screen button, a physical button or a key on a keyboard) to the control unit 120. In response to the user command CMD, in turn, the control unit 120 is configured to temporarily prevent the at least one graphical element 400, 500 or 600 from being presented via the graphical user interface 135. Consequently, for example while the user U keeps an on-screen button activated, no graphical elements are shown via the graphical user interface 135.
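The user-manipulable input member can be sketched as a small helper that hides the overlay for a short interval after each received command CMD, for instance while an on-screen button is held. The class name and the hold duration are assumptions of this example.

```python
# Sketch of the user command CMD temporarily suppressing the dirt overlay.
import time


class OverlaySuppressor:
    """Hides the dirt overlay for a short while after each received CMD."""

    def __init__(self, hold_seconds: float = 2.0):
        self.hold_seconds = hold_seconds
        self._suppressed_until = 0.0

    def on_user_command(self) -> None:
        """Called whenever CMD is received (button held, key pressed, ...)."""
        self._suppressed_until = time.monotonic() + self.hold_seconds

    def overlay_visible(self, dirt: float, first_threshold: float = 0.02) -> bool:
        """The overlay is shown only if the cover is dirty and not suppressed."""
        if dirt <= first_threshold:
            return False
        return time.monotonic() >= self._suppressed_until


suppressor = OverlaySuppressor()
print(suppressor.overlay_visible(0.4))   # True: dirty cover, no command received yet
suppressor.on_user_command()             # user activates the on-screen button
print(suppressor.overlay_visible(0.4))   # False: temporarily hidden
```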
Preferably, to enhance the user's U intuitive understanding of the degree of contamination on the transparent cover surface 117, the control unit 120 is configured to generate the graphics data Dgr such that the at least one graphical element is superimposed on at least one display view, e.g. DV1, whereby the information therein is at least partially blocked from being visually inspected by the user U. The at least one graphical element may either be completely opaque, or more or less transparent depending on the degree of visual obstruction desired.
Further preferably, the graphical elements 400, 500 and 600 respectively are arranged to mimic a number of dirt particles. This number is typically not equal to the number of dirt particles on the transparent cover surface 117; however, it is proportional thereto, as discussed above.
Figure 7 illustrates an alternative way of indicating the amount of dirt on the transparent cover surface 117 via the graphics data Dgr. Here, the graphical user interface 135 includes an indicator bar 700, which designates how dirty the control unit 120 has determined the transparent cover surface 117 to be. For example, a first field 710 of the indicator bar 700 may correspond to a relatively low degree of contamination, a second field 720 of the indicator bar 700 may correspond to a medium degree of contamination, and a third field 730 of the indicator bar 700 may correspond to a relatively high degree of contamination. Moreover, the intensity of a field may express more detailed information about the degree of contamination. For instance, in Figure 7, the fact that the second field 720 has a somewhat lower intensity than the first field 710 is preferably interpreted to indicate that the transparent cover surface 117 has not yet reached the medium degree of contamination.
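The indicator bar of Figure 7 can be sketched as a mapping from the dirt parameter onto three fields whose fill intensities show how far the contamination has progressed; the field boundaries below are assumed values, not taken from the patent.

```python
# Sketch of the indicator-bar alternative: dirt parameter -> fill intensity of the
# low / medium / high contamination fields. Field boundaries are assumed.
FIELD_BOUNDS = (0.0, 0.33, 0.66, 1.0)   # low: 0-0.33, medium: 0.33-0.66, high: 0.66-1.0


def indicator_bar(dirt: float) -> list[float]:
    """Return the fill intensity (0..1) of the low, medium and high fields."""
    intensities = []
    for lower, upper in zip(FIELD_BOUNDS[:-1], FIELD_BOUNDS[1:]):
        if dirt >= upper:
            intensities.append(1.0)                              # field completely reached
        elif dirt <= lower:
            intensities.append(0.0)                              # field not reached at all
        else:
            intensities.append((dirt - lower) / (upper - lower)) # partially reached
    return intensities


# as in Figure 7: the first field is full, the second only partially lit,
# i.e. the medium degree of contamination has not yet been reached
print([round(v, 2) for v in indicator_bar(0.45)])   # [1.0, 0.36, 0.0]
```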
According to one embodiment of the invention, if the control unit 120 determines that the amount of dirt on the transparent cover surface 117 is below the first threshold level, the control unit 120 is configured to generate the graphics data Dgr to reflect that the transparent cover surface 117 is clean, for example as in Figure 3, i.e. where no obstructing graphical elements are included in the graphics data Dgr. This is beneficial, since it provides the user U with feedback as to whether or not a cleaning of the transparent cover surface 117 has been successful.
Since the contamination of the transparent cover surface 117 varies over time, or more precisely gradually increases during use of the milking installation and is then reduced substantially in connection with cleaning, the control unit 120 is preferably configured to repeatedly update the determining of the parameter indicating the amount of dirt on the transparent cover surface 117.
Further, based on the updated determining of the parameter indicating the amount of dirt on the transparent cover surface 117, the control unit 120 is configured to repeatedly generate updatings of the graphical elements 400, 500, 600 and 700 in the graphics data Dgr. Thus, if the amount of dirt on the transparent cover surface 117 is determined to be below the first threshold level, the control unit 120 is configured to generate the graphics data Dgr such that the graphics data Dgr reflect that the transparent cover surface 117 is clean.
It is generally advantageous if the control unit 120 is configured to effect the above-mentioned procedure in a fully automatic manner, for instance by executing a computer program. Therefore, the control unit 120 may be communicatively connected to a memory unit 125 storing a computer program product, which, in turn, contains software for making at least one processor in the control unit 120 execute the above-described actions when the computer program product is run on the control unit 120.
In order to sum up, and with reference to the flow diagram in Figure 8, we will now describe the general method according to the invention for controlling an automatic milking installation. In a first step 810, image data are recorded by means of an imaging unit. The recorded image data contain data that represent at least one portion of a dairy animal. It is further presumed that the imaging unit has a transparent cover surface that is configured to protect an optics section of the imaging unit, and through which transparent cover surface the image data are recorded.
Then, in a step 820, at least one control signal is produced based on the recorded image data. The at least one control signal is arranged to control at least one function of the automatic milking installation, for example attaching and detaching teatcups. In a step 830, subsequent to step 810 and preferably parallel with step 820, a user is enabled to interact with the automatic milking installation via at least one display view in a graphical user interface.
After steps 820 and 830 the procedure loops back to step 810 for updated recording of the image data. However, after step 810 and preferably parallel with steps 820 and 830, in a step 840 the recorded image data are processed to determine a parameter indicating an amount of dirt on the transparent cover surface.
Thereafter, in a step 850, graphics data are generated for presentation to the user via the graphical user interface. The graphics data contain at least one graphical element reflecting the amount of dirt on the transparent cover surface, thus informing the user of any dirt on the transparent cover surface protecting the optics section of the imaging unit. Then, the procedure loops back to step 810.
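The overall loop of Figure 8 might be sketched as follows, with placeholder camera and user-interface objects standing in for the real imaging unit 110 and graphical user interface 135; the dirt estimate is again an assumed heuristic rather than the technique referenced in the description, and all names are invented for this example.

```python
# Compact sketch of the method of Figure 8: record image data (810), produce a
# control signal (820) and serve the user interface (830), then determine the dirt
# parameter (840) and generate the graphics data (850) before looping back.
import numpy as np

FIRST_THRESHOLD = 0.02


class _StubCamera:
    """Placeholder imaging unit producing synthetic frames with a 'dirt' patch."""
    def record(self) -> np.ndarray:
        frame = np.random.rand(120, 160)
        frame[40:60, 50:70] = 0.05
        return frame


class _StubGui:
    """Placeholder graphical user interface."""
    def refresh(self) -> None: ...
    def show_dirt_overlay(self, dirt) -> None:
        print("clean" if dirt is None else f"dirt overlay, parameter {dirt:.2f}")


def estimate_dirt_parameter(frames: np.ndarray, dark_level: float = 0.15) -> float:
    """Stand-in dirt estimate: fraction of pixels that stay dark in every frame."""
    return float(np.all(frames < dark_level, axis=0).mean())


def milking_control_loop(camera, gui, iterations: int = 3) -> None:
    recent: list[np.ndarray] = []
    for _ in range(iterations):
        frame = camera.record()                           # step 810: record image data
        # step 820 would produce the control signal S-ctrl from `frame` here
        gui.refresh()                                     # step 830: display views for the user
        recent = (recent + [frame])[-10:]                 # short history for the dirt estimate
        dirt = estimate_dirt_parameter(np.stack(recent))  # step 840: dirt parameter
        gui.show_dirt_overlay(None if dirt <= FIRST_THRESHOLD else dirt)  # step 850


milking_control_loop(_StubCamera(), _StubGui())
```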
All of the process steps, as well as any sub-sequence of steps, described with reference to Figure 8 above may be controlled by means of a programmed processor. Moreover, although the embodiments of the invention described above with reference to the drawings comprise a processor and processes performed in at least one processor, the invention also extends to computer programs, particularly computer programs on or in a carrier, adapted for putting the invention into practice. The program may be in the form of source code, object code, a code intermediate source and object code such as in partially compiled form, or in any other form suitable for use in the implementation of the process according to the invention. The program may either be a part of an operating system, or be a separate application. The carrier may be any entity or device capable of carrying the program. For example, the carrier may comprise a storage medium, such as a Flash memory, a ROM (Read Only Memory), for example a DVD (Digital Video/Versatile Disk), a CD (Compact Disc) or a semiconductor ROM, an EPROM (Erasable Programmable Read-Only Memory), an EEPROM (Electrically Erasable Programmable Read-Only Memory), or a magnetic recording medium, for example a floppy disc or hard disc. Further, the carrier may be a transmissible carrier such as an electrical or optical signal which may be conveyed via electrical or optical cable or by radio or by other means. When the program is embodied in a signal which may be conveyed directly by a cable or other device or means, the carrier may be constituted by such cable or device or means. Alternatively, the carrier may be an integrated circuit in which the program is embedded, the integrated circuit being adapted for performing, or for use in the performance of, the relevant processes.
Although the invention is advantageous in connection with cow milking, the invention is equally well adapted for implementation in milking machines for any other kind of mammals, such as goats, sheep or buffaloes.
The term "comprises/comprising" when used in this specification is taken to specify the presence of stated features, integers, steps or components. However, the term does not preclude the presence or addition of one or more additional features, integers, steps or components or groups thereof.
The invention is not restricted to the described embodiments in the figures, but may be varied freely within the scope of the claims.

Claims

1. A system for controlling an automatic milking installation, comprising:
a graphical user interface (135) configured to enable a user (U) to interact with the automatic milking installation via at least one display view (DV1, DV2, DV3);
an imaging unit (110) configured to record image data (Dimg) comprising data representing at least one portion of a dairy animal (A), the imaging unit (110) having a transparent cover surface (117) configured to protect an optics section (115) of the imaging unit (110) and through which transparent cover surface (117) the image data (Dimg) are recorded; and
a control unit (120) configured to receive the recorded image data (Dimg), and based thereon, produce at least one control signal (S-Ctrl) arranged to control at least one function of the automatic milking installation, characterized in that the control unit (120) is further configured to:
process the recorded image data (Dimg) to determine a parameter indicating an amount of dirt on the transparent cover surface (117), and
generate graphics data (Dgr) for presentation to the user (U) via the graphical user interface (135), which graphics data (Dgr) contain at least one graphical element (400, 500, 600, 700) reflecting the amount of dirt on the transparent cover surface (117).
2. The system according to claim 1, wherein the at least one graphical element is configured to express the amount of dirt on the transparent cover surface (117) proportionally, such that a relatively small amount of dirt is expressed as one or more graphical elements (500, 710) of comparatively small size, number and/or high transparency, and a relatively large amount of dirt is expressed as one or more graphical elements (400, 600, 720) of comparatively large size, number and/or low transparency.
3. The system according to any one of claims 1 or 2, wherein if the control unit (120) determines that the amount of dirt on the transparent cover surface (117) is below a first threshold level, the graphics data (Dgr) reflect (300) that the transparent cover surface (117) is clean.
4. The system according to any one of the preceding claims, wherein if the control unit (120) determines that the amount of dirt on the transparent cover surface (117) is above a second threshold level, the graphics data (Dgr) reflect that the transparent cover surface (117) is highly obstructed with dirt by the control unit (120) being configured to cause the at least one graphical element to be presented in a cyclic manner alternating between first and second phases, such that in the first phase the at least one graphical element (600) blocks a relatively large proportion of information in at least one of the at least one display view (DV1), and in the second phase the at least one graphical element (400) blocks a relatively small proportion of the information in said at least one display view (DV1).
5. The system according to claim 4, wherein the control unit (120) is configured to generate the graphics data (Dgr) such that the at least one graphical element gradually transitions between the first and second phases.
6. The system according to any one of the preceding claims, wherein the control unit (120) is configured to generate the graphics data (Dgr) such that the at least one graphical element is superimposed on at least one first display view (DV1) of the at least one display view (DV1, DV2, DV3) and information in the at least one first display view (DV1) is thereby at least partially blocked from being visually inspected by the user (U).
7. The system according to claim 6, wherein the at least one graphical element (400, 500, 600) is arranged to mimic a number of dirt particles.
8. The system according to any one of claims 3 to 7, wherein the control unit (120) is configured to:
repeatedly update the determining of the parameter indicating the amount of dirt on the transparent cover surface (117), and
repeatedly generate updatings of the at least one graphical element (400, 500, 600, 700) based on the updated determining of the parameter indicating the amount of dirt on the transparent cover surface (117); and if the amount of dirt on the transparent cover surface (117) is determined to be below the first threshold level, the graphics data (Dgr) reflecting (300) that the transparent cover surface (117) is clean.
9. The system according to any one of the preceding claims, wherein, if the system comprises two or more imaging units (110), the control unit (120) is configured to generate the graphics data (Dgr) such that the at least one graphical element (400) is presented in the graphical user interface (135) in a manner associating the determined amount of dirt to a specific imaging unit (110) of said imaging units (110) on whose transparent cover surface (117) the amount of dirt has been determined.
10. The system according to any one of the preceding claims, further comprising a user-manipulable input member (210) configured to forward a user command (CMD) to the control unit (120), which user command (CMD), when received by the control unit (120), is configured to cause the control unit (120) to temporarily prevent the at least one graphical element (400, 500, 600) from being presented via the graphical user interface (135).
11. A method of controlling an automatic milking installation, the method comprising:
enabling, via at least one display view (DV1, DV2, DV3) in a graphical user interface (135), a user (U) to interact with the automatic milking installation;
recording image data (Dimg), by means of an imaging unit (110), the recorded image data (Dimg) comprising data representing at least one portion of a dairy animal (A), and the imaging unit (110) having a transparent cover surface (117) configured to protect an optics section (115) of the imaging unit (110) and through which transparent cover surface (117) the image data (Dimg) are recorded; and
producing at least one control signal (S-Ctrl) based on the recorded image data (Dimg), the at least one control signal (S-Ctrl) being arranged to control at least one function of the automatic milking installation, characterized by:
processing the recorded image data (Dimg) to determine a parameter indicating an amount of dirt on the transparent cover surface (117), and
generating graphics data (Dgr) for presentation to the user (U) via the graphical user interface (135), which graphics data (Dgr) contains at least one graphical element (400, 500, 600, 700) reflecting the amount of dirt on the transparent cover surface (117).
12. The method according to claim 11, wherein the at least one graphical element is configured to express the amount of dirt on the transparent cover surface (117) proportionally, such that a relatively small amount of dirt is expressed as one or more graphical elements (500, 710) of comparatively small size, number and/or high transparency, and a relatively large amount of dirt is expressed as one or more graphical elements (400, 600, 720) of comparatively large size, number and/or low transparency.
13. The method according to any one of claims 11 or 12, wherein if it is determined that the amount of dirt on the transparent cover surface (117) is below a first threshold level, the method comprises generating the graphics data (Dgr) to reflect (300) that the transparent cover surface (117) is clean.
14. The method according to any one of claims 11 to 13, wherein if it is determined that the amount of dirt on the transparent cover surface (117) is above a second threshold level, the method comprises generating the graphics data (Dgr) to reflect (600) that the transparent cover surface (117) is highly obstructed with dirt by presenting the at least one graphical element in a cyclic manner alternating between first and second phases, such that in the first phase the at least one graphical element (600) blocks a relatively large proportion of information in at least one of the at least one display view (DV1), and in the second phase the at least one graphical element (400) blocks a relatively small proportion of the information in said at least one display view (DV1).
15. The method according to any one of claims 11 to 14, comprising generating the graphics data (Dgr) such that the at least one graphical element is superimposed on at least one first display view (DV1) of the at least one display view (DV1, DV2, DV3) and information in the at least one first display view (DV1) is thereby at least partially blocked from being visually inspected by the user (U).
16. The method according to claim 15, wherein the at least one graphical element (400, 500, 600) is arranged to mimic a number of dirt particles.
17. The method according to any one of claims 13 to 16, comprising:
updating, repeatedly, the determining of the parameter indicating the amount of dirt on the transparent cover surface (117), and
generating repeated updates of the at least one graphical element (400, 500, 600, 700) based on the updated determining of the parameter indicating the amount of dirt on the transparent cover surface (117); and if the amount of dirt on the transparent cover surface (117) is determined to be below the first threshold level, the graphics data (Dgr) reflecting (300) that the transparent cover surface (117) is clean.
18. The method according to any one of claims 11 to 17, wherein, if the system comprises two or more imaging units (110), the method comprises generating the graphics data (Dgr) such that the at least one graphical element (400) is presented in the graphical user interface (135) in a manner associating the determined amount of dirt with a specific imaging unit (110) of said imaging units (110) on whose transparent cover surface (117) the amount of dirt has been determined.
19. The method according to any one of claims 11 to 18, further comprising:
checking if a user command (CMD) has been received via a user-manipulable input member (210), and if so
temporarily preventing the at least one graphical element (400, 500, 600) from being presented via the graphical user interface (135).
20. A computer program (SW) loadable into the memory (125) of at least one processor (120), comprising software for controlling the steps of any of the claims 11 to 18 when the program is run on the at least one processor (120).
21. A processor-readable medium (125), having a program recorded thereon, where the program is to make at least one processor control the steps of any of the claims 11 to 18 when the program is loaded into the at least one processor.
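The claims above can be read as a small control loop: a dirt parameter is determined from the recorded image data (Dimg) and mapped to overlay graphics data (Dgr) for the graphical user interface (135). The following Python sketch illustrates one possible reading of that logic under stated assumptions; it is not taken from the patent specification, and the function and class names (generate_graphics, GraphicalElement), the threshold values and the blink period are invented for illustration only.

import math
import time
from dataclasses import dataclass

# Assumed threshold values and cycle time -- purely illustrative.
FIRST_THRESHOLD = 0.05     # below this the cover surface is reported as clean
SECOND_THRESHOLD = 0.60    # above this the surface is treated as highly obstructed
BLINK_PERIOD_S = 2.0       # cycle time for the first/second-phase alternation


@dataclass
class GraphicalElement:
    """One overlay element mimicking dirt particles (cf. elements 400/500/600)."""
    camera_id: str        # ties the element to a specific imaging unit (110)
    particle_count: int   # more dirt -> more particles
    particle_size: float  # more dirt -> larger particles (relative units)
    opacity: float        # more dirt -> lower transparency (0 = invisible, 1 = opaque)


def generate_graphics(camera_id, dirt_amount, now, suppressed=False):
    """Map the dirt parameter to graphics data for one imaging unit.

    dirt_amount is the parameter determined by processing the recorded image
    data; 'suppressed' models the user command (CMD) that temporarily prevents
    the overlay from being presented (claims 10 and 19).
    """
    if suppressed or dirt_amount < FIRST_THRESHOLD:
        return []  # clean surface: no dirt overlay is shown (claims 3 and 13)

    # Proportional mapping (claims 2 and 12): number, size and opacity
    # all grow with the determined amount of dirt.
    level = min(dirt_amount, 1.0)
    element = GraphicalElement(
        camera_id=camera_id,
        particle_count=max(1, int(20 * level)),
        particle_size=4.0 + 16.0 * level,
        opacity=0.2 + 0.8 * level,
    )

    if dirt_amount > SECOND_THRESHOLD:
        # Cyclic presentation (claims 4, 5 and 14): gradually alternate between
        # a phase blocking a large proportion of the display view and a phase
        # blocking a small proportion, so the view stays readable.
        phase = 0.5 * (1.0 + math.sin(2.0 * math.pi * now / BLINK_PERIOD_S))
        element.opacity *= 0.2 + 0.8 * phase
        element.particle_size *= 0.5 + 0.5 * phase

    return [element]


# Repeatedly updated determination (claims 8 and 17), here with synthetic readings.
for reading in (0.02, 0.30, 0.75):
    print(reading, generate_graphics("imaging-unit-1", reading, now=time.time()))

Modelling the overlay as data rather than drawing it directly mirrors the claimed division of work: the control unit generates the graphics data, while the graphical user interface (135) is responsible for presenting it in the display views.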
EP17712260.3A 2016-03-11 2017-03-08 Control system for an automatic milking installation and method of controlling an automatic milking installation Active EP3426020B1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SE1650333 2016-03-11
PCT/SE2017/050222 WO2017155457A1 (en) 2016-03-11 2017-03-08 Control system for automatic milking installation and method of controlling automatic milking installation

Publications (2)

Publication Number Publication Date
EP3426020A1 true EP3426020A1 (en) 2019-01-16
EP3426020B1 EP3426020B1 (en) 2020-11-18

Family

ID=58361063

Family Applications (1)

Application Number Title Priority Date Filing Date
EP17712260.3A Active EP3426020B1 (en) 2016-03-11 2017-03-08 Control system for an automatic milking installation and method of controlling an automatic milking installation

Country Status (3)

Country Link
US (1) US20200267924A1 (en)
EP (1) EP3426020B1 (en)
WO (1) WO2017155457A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4171205A1 (en) * 2020-06-29 2023-05-03 DeLaval Holding AB System and computer-implemented method for image data quality assurance in an installation arranged to perform animal-related actions, computer program and non-volatile data carrier

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8208043B2 (en) * 2008-05-09 2012-06-26 Aptina Imaging Corporation Lens cleaning warning system and method
WO2010031632A1 (en) 2008-09-19 2010-03-25 Delaval Holding Ab Camera test arrangement, device and method in an automated milking system
US9848575B2 (en) * 2011-03-17 2017-12-26 Mirobot Ltd. Human assisted milking robot and method
US9253375B2 (en) * 2013-04-02 2016-02-02 Google Inc. Camera obstruction detection
EP3164831A4 (en) * 2014-07-04 2018-02-14 Light Labs Inc. Methods and apparatus relating to detection and/or indicating a dirty lens condition

Also Published As

Publication number Publication date
WO2017155457A1 (en) 2017-09-14
US20200267924A1 (en) 2020-08-27
EP3426020B1 (en) 2020-11-18

Similar Documents

Publication Publication Date Title
JP6205172B2 (en) Near-field camera obstacle detection
KR102101916B1 (en) Taking photos through visual obstructions
US20130120536A1 (en) Optical Self-Diagnosis of a Stereoscopic Camera System
CN105554437B (en) Method and apparatus for visualizing loitering objects
EP3375282B1 (en) Method, information processing apparatus and program for determining activity quantities of animals
WO2010076882A1 (en) Image classification standard update method, program, and image classification device
JP2018191087A (en) Adhesive matter detection device and adhesive matter detection method
JP2006238802A (en) Cell observation apparatus, cell observation method, microscope system, and cell observation program
US20160165129A1 (en) Image Processing Method
CN108322637B (en) Equipment protection method and device under strong light
JP2009034517A (en) Abnormal tissue pattern detection apparatus, method and program
KR101941585B1 (en) Embedded system for examination based on artificial intelligence thereof
CN103959332A (en) Image processing
EP3426020B1 (en) Control system for an automatic milking installation and method of controlling an automatic milking installation
AU2013240637A1 (en) System and method for grooming-related farm decision support
US20150116543A1 (en) Information processing apparatus, information processing method, and storage medium
JPWO2019244944A1 (en) 3D reconstruction method and 3D reconstruction device
WO2021133090A3 (en) Method for investigating optical element embedded in intraoral scanner, and system using same
US20170091585A1 (en) Methods and systems for tank level monitoring and alerting
WO2020039606A1 (en) Gas detection device, information processing device, and program
CA2860014A1 (en) Video based indoor leak detection
WO2008093344A1 (en) System for detecting particles in a dairy fluid such as milk
KR101553707B1 (en) Image stain detection method for camera module defect determination
EP2417902B1 (en) Pupil covering state detection device, and in-vehicle camera provided therewith
JP2017055262A (en) Data missing pixel detector

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20180911

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
REG Reference to a national code

Ref country code: DE

Ref legal event code: R079

Ref document number: 602017027728

Country of ref document: DE

Free format text: PREVIOUS MAIN CLASS: A01J0005007000

Ipc: A01J0007020000

RIC1 Information provided on ipc code assigned before grant

Ipc: A01J 7/04 20060101ALI20200320BHEP

Ipc: A01J 5/007 20060101ALI20200320BHEP

Ipc: A01J 5/017 20060101ALI20200320BHEP

Ipc: A01J 7/02 20060101AFI20200320BHEP

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20200630

RIN1 Information on inventor provided before grant (corrected)

Inventor name: HALLSTROEM, ANDERS

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602017027728

Country of ref document: DE

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1334740

Country of ref document: AT

Kind code of ref document: T

Effective date: 20201215

REG Reference to a national code

Ref country code: NL

Ref legal event code: FP

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1334740

Country of ref document: AT

Kind code of ref document: T

Effective date: 20201118

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210218

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201118

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210318

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201118

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210219

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201118

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201118

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210218

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201118

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210318

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201118

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG9D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201118

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201118

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201118

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201118

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201118

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201118

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201118

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602017027728

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201118

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20210819

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201118

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201118

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201118

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201118

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20210331

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201118

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210331

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210331

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210308

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210308

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210318

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: NL

Payment date: 20220215

Year of fee payment: 6

Ref country code: FR

Payment date: 20220221

Year of fee payment: 6

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20210331

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20230202

Year of fee payment: 7

Ref country code: DE

Payment date: 20230131

Year of fee payment: 7

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20201118

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20170308

REG Reference to a national code

Ref country code: NL

Ref legal event code: MM

Effective date: 20230401

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20230401

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20230331