WO2018128542A1 - Method and system for providing information of an animal - Google Patents


Info

Publication number: WO2018128542A1
Authority: WIPO (PCT)
Prior art keywords: animal, terminal, display, information, tag
Application number: PCT/NL2018/050006
Other languages: French (fr)
Inventor: Roxie Sabri Romero MULLER
Original Assignee: N.V. Nederlandsche Apparatenfabriek Nedap
Application filed by N.V. Nederlandsche Apparatenfabriek Nedap filed Critical N.V. Nederlandsche Apparatenfabriek Nedap
Publication of WO2018128542A1 publication Critical patent/WO2018128542A1/en

Classifications

    • A - HUMAN NECESSITIES
    • A01 - AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01K - ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K11/00 - Marking of animals
    • A01K11/006 - Automatic identification systems for animals, e.g. electronic devices, transponders for animals
    • A01K29/00 - Other apparatus for animal husbandry

Definitions

  • the invention relates to a method and a system for providing information of an animal, in particular readily providing information of an animal to a farmer in a barn environment.
  • Information of an animal is very important in situations of inspection and routine examination by a farmer. Often, such information is used by a farmer in support of safeguarding animal welfare. For example, information about health or relevant medical history is important when an animal is being inspected.
  • a problem is that the farmer often lacks ready information. For example, it happens that a farmer has to look up information in the absence of an animal. Readiness of information in the presence of an animal is of great benefit to the farmer, since it may shorten the handling time of taking care of or inspecting an animal.
  • a problem is, further, that the animal can move freely in an environment such as a barn. The animal thus has the freedom to mingle with other animals in a group. This is why it happens that a farmer has to identify an animal repeatedly, or has to deprive the animal of its freedom of movement during retrieval of information about the animal. This is time-intensive during inspection of an animal and disadvantageous to the animal itself. It is also at the expense of valuable response time for the farmer to react in case of an acute disease.
  • the invention provides a method for providing information of an animal.
  • the animal is here provided with a tag, such as a smart tag, the position of which can be determined and an identification code of which can be read out.
  • An example is a smart tag with a UHF transmitter which transmits to a plurality of beacons within a space, such as a barn, so that the physical location of the tag in the space can be determined with multilateration.
  • the smart tag then also emits a unique identification code, by which the animal wearing the tag can be identified.
  • Another example of determining the position of the tag concerns a position determining system with a plurality of beacons each continuously emitting a unique electromagnetic signal which is received by a smart tag.
  • By the tag, its position relative to the beacons can be calculated on the basis of the received signal strengths of the beacons.
  • the tag may wirelessly transmit information about the calculated position to, for example, a computer for further processing or storage.
  • the smart tag wirelessly transmits information about the received signal strengths of the beacons to a computer which then calculates the position of the tag relative to the beacons on the basis of the signal strengths.
  • Other possibilities of determining the position of a tag are also possible, such as a smart tag which includes a GPS receiver, the smart tag in a known manner wirelessly transmitting information about the position of the smart tag determined with the GPS receiver.
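As an illustration of the signal-strength variants above, a received signal strength can be converted to an approximate beacon distance with a log-distance path-loss model. This is a minimal sketch, not the patent's implementation; the reference power and path-loss exponent are assumed values.

```python
def rssi_to_distance(rssi_dbm, tx_power_dbm=-40.0, path_loss_exp=2.0):
    """Estimate the distance (in metres) to a beacon from a received
    signal strength.

    tx_power_dbm is the RSSI expected at 1 m distance and path_loss_exp
    models the environment; both defaults are illustrative assumptions,
    not values taken from the patent.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exp))
```

With the assumed parameters, an RSSI of -40 dBm maps to roughly 1 m and -60 dBm to roughly 10 m; in a real barn the exponent would have to be calibrated.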
  • the method comprises step a., determining the position of the tag with the aid of a first position determining system.
  • the method comprises step b., reading out the identification code of the tag with the aid of an information readout system. Reading out may be done in a variety of manners; for example, the tag may also comprise a visible code which can be optically read out by a camera on a portable terminal.
  • the method comprises step c., determining the position and orientation of a portable terminal, such as a tablet or a mobile phone, with the aid of a second position determining system.
  • the method comprises step d., with the aid of a camera of the terminal, displaying a recording of an environment of the terminal on a first display of the terminal.
  • the method comprises step e. of, on the basis of the information obtained in steps a., b. and c., retrieving information from a database of a computer system about at least an animal that is visible in the recording.
  • the method comprises step f., wirelessly sending the information from step e. to the terminal. This may be done, for example, via a wireless telecommunication network or Wi-Fi.
  • the method comprises step g., of the terminal adding received information about the animal to the picture on the first display. Individual animal information is thereby projected at the respective animal on the basis of position determination.
  • the adding of information about the animal to the picture on the first display involves placing a digital layer (Augmented Reality) over a reality, namely the image that the user himself sees.
  • the digital layer can be added to reality on any display with access to an image of reality.
  • the animal is indicated as a sign that the identity of the animal has been established in steps a. and b.
  • the indicating may then involve displaying at least a part of the information about the animal in step g.
  • the indication is implemented in a manner other than with characters or signs, for example, by a coloration of a part of the picture, such as the part on which the animal can be seen.
  • displaying the information about the animal on the first display in step g. comprises the indicating of the animal by highlighting the animal as an indication that the position and identity of the animal have been established in steps a. and b.
  • the user himself has to make the link between those animals that he/she sees and the position on a map.
  • In step h., by clicking on the animal that is indicated in step g., additional information is displayed on the first display with the aid of text.
  • In step h., the same may also be displayed on the second display when the animal has been clicked on in step g. This may be done, for example, through mutual communication between the terminal and the second display via a computer system. Possibly, clicking on the animal may lead the user, such as the farmer, to an animal page for that specific animal.
  • Information about the animal being displayed, for example a photograph or video stream of the animal made with the terminal, is supplied to a computer system, in particular to the database.
  • Information supply to the computer system or database thus becomes dependent on user-observation.
  • the information that is supplied to the computer system is entered at the terminal by a user of the terminal.
  • Notes on the animal can then be supplied to the database by a farmer or veterinarian with the terminal.
  • An advantage is that the database can be supplemented with current information while the farmer is with the animal in the barn.
  • the information that is supplied to the computer system, in particular the database is supplied by a sensor to the terminal or measured by a sensor of the terminal.
  • Such a sensor can be, for example, a thermometer, sphygmomanometer, heart rhythm monitor, or acoustic measuring instrument.
  • a heartbeat, blood pressure and/or temperature of the animal is determined.
  • the sensor may also be present in or on the tag of the animal and may be connected to the terminal for reading it out.
  • the picture of the animal obtained with the camera is supplied to a computer system and/or possibly the database.
  • the computer system comprises a second display which may be viewed by a third party such as a veterinarian.
  • the computer system comprises means for audio and/or visual communication so that the veterinarian or other third party is able not only to co-view, but also to communicate with the farmer or user of the terminal.
  • Audio and visual communication means can be a microphone and webcam.
  • a marking may be placed on the picture on the first display, for example, a circle or full line around a particular part of the animal. This might be done, for example, by touching the second display, for instance when the second display is a touchscreen. This simplifies diagnosing remotely, and enables the third party to draw the user's attention to a specific part of the animal.
  • the picture on the first display preferably in combination with the information from step e., is also displayed on a second and/or a third display of a third party.
  • a third display may, for example, be supported on, or be part of, a further portable terminal which communicates with the computer system.
  • the third party such as the veterinarian, could be mobile.
  • the second and third display is the same display.
  • the second display is both mobile and part of the computer system.
  • In step i., the identity of an animal to be found is entered in the computer system and, if the entered identity of the animal corresponds to the identification code of the tag read out in step b., step f. is carried out.
  • An advantage is that in a picture of a group of animals, the animal whose identity has been entered can be found directly.
  • an indicator could be shown on the first display when the animal is not in the picture, so that the user can be led to the animal on the basis of an orientation of the terminal.
  • entering the identity of the animal in step i. is carried out by a user with the aid of the terminal.
  • displaying in step d. involves displaying a video stream.
  • the invention provides a system for providing information of an animal.
  • the system comprises: a position-determinable tag provided with a readable identification code, an animal being provided with the tag; a first position determining system, the first position determining system being configured to determine the position of the tag; an information readout system for reading out the identification code of the tag; a second position determining system for determining the position and orientation of a portable terminal; the portable terminal provided with a camera and a first display, the portable terminal being configured to display a recording with the camera of an environment of the terminal on the first display; and a computer system.
  • the computer system comprises a database for keeping information about the animal, the system being configured to retrieve the information about the animal that is visible in the recording from the database on the basis of the determined position of the tag and the identification code read out.
  • the computer system is furthermore configured for wirelessly sending the information to the terminal, the terminal being configured to add the received information about the animal to the picture on the first display.
  • the computer system may, for example, be connected to a telecommunication network, or include a router unit.
  • the second position determining system comprises a GPS and orientation sensor which are each arranged at the portable terminal.
  • the position determining system could be built into the portable terminal.
  • the computer system comprises a second display which can be viewed by a third party such as a veterinarian.
  • the system may then be configured to show on the second display at least the recording of the first display.
  • the system is configured to add information about the animal to the picture on the second display.
  • the second display is a part of a further terminal, the further terminal, for example, itself forming part of the computer system.
  • the system may be regarded as a network of at least one portable terminal and a central computer with access to a database, wherein a further portable terminal optionally also forms part of the network and wherein the further portable terminal comprises the second or third display.
  • Communicative connection between each portable terminal and the central computer (also: the computer system) may proceed via a wireless telecommunication network.
  • the system is configured to communicate markings made on the second display by a user of the second display to the first portable terminal, and wherein the first terminal is configured to represent the markings on the first display, and wherein the marking is, for example, a circle or full line for indication of a particular part of the displayed animal, for example, in that the circle or full line is around a particular part of the displayed animal.
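Such a marking could, for example, be relayed from the second display to the first terminal as a small serialized message. The sketch below is illustrative only; the field names and JSON encoding are hypothetical and not defined by the patent.

```python
import json


def marking_message(animal_id, shape, points):
    """Serialize a marking made on the second display for relay to the
    first terminal.

    The field names are illustrative, not taken from the patent. The
    points are given in normalized (0..1) picture coordinates so the
    marking can be redrawn at any display resolution.
    """
    return json.dumps({
        "type": "marking",
        "animal_id": animal_id,
        "shape": shape,      # e.g. "circle" or "line" around a body part
        "points": points,    # normalized picture coordinates
    })
```

The receiving terminal would decode the message and draw the shape over the corresponding region of the picture on the first display.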
  • Fig. 1 schematically shows a system for providing information of an animal;
  • Fig. 2 schematically shows a method for providing information of an animal;
  • Fig. 3 shows a special embodiment of (a part of) a terminal of the system of Fig. 1;
  • Fig. 4 shows three angles which reflect an orientation of a terminal.
  • Figure 1 shows schematically a system 1 for providing information of an animal such as a cow 3.
  • the animal 3 is provided with a smart tag 5.
  • the smart tag 5 contains an energy source and transmitting means (not represented, but conventional) for periodically emitting, at a predetermined UHF frequency, a signal 6 to a first position determining system 7, 11.
  • the UHF frequency here contains an identification code which can be read out by the first position determining system 7, 11.
  • the signal 6 may thus comprise, for example, a carrier having the predetermined UHF frequency, with the identification code modulated onto the carrier.
  • the first position determining system 7, 11 has at least three beacons 7, also referred to as receivers (not represented, but conventional).
  • a computer system 11 is communicatively connected with the beacons 7 and is configured to determine the location of the smart tag 5 on the basis of triangulation of the signal 6.
  • the computer system 11 receives from each of the beacons the received signal 6, whereupon, on the basis of the at least three received signals 6, based on a known method, the position of the smart tag 5 is determined by the computer system 11.
  • the computer system 11 further includes a software-based information readout system (not represented, but conventional) which, on the basis of the predetermined UHF frequency (300 MHz to 3 GHz) of the signal 6, reads out the identification code of the smart tag 5.
  • the computer system 11 is configured to extract the identification code from at least one of the received signals 6.
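The position determination performed by the computer system 11 from at least three beacon measurements can be sketched as a least-squares estimate from beacon distances. This is an illustrative 2D simplification under the assumption that distances have already been derived from the signals; the function name is hypothetical and this is not the patent's actual implementation.

```python
def trilaterate(beacons, distances):
    """Least-squares 2D position estimate from >= 3 beacon distances.

    beacons is a list of (x, y) beacon coordinates and distances the
    list of measured distances to the tag. The circle equations are
    linearized by subtracting the first one, then the 2x2 normal
    equations are solved directly.
    """
    (x1, y1), d1 = beacons[0], distances[0]
    rows, rhs = [], []
    for (xi, yi), di in zip(beacons[1:], distances[1:]):
        rows.append((2 * (xi - x1), 2 * (yi - y1)))
        rhs.append(d1 ** 2 - di ** 2 + xi ** 2 - x1 ** 2 + yi ** 2 - y1 ** 2)
    # Normal equations A^T A p = A^T b, solved by hand for the 2x2 case.
    a11 = sum(r[0] * r[0] for r in rows)
    a12 = sum(r[0] * r[1] for r in rows)
    a22 = sum(r[1] * r[1] for r in rows)
    b1 = sum(r[0] * v for r, v in zip(rows, rhs))
    b2 = sum(r[1] * v for r, v in zip(rows, rhs))
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)
```

With exact distances the estimate is exact; with noisy RSSI-derived distances the least-squares form averages the error over the beacons.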
  • the beacons 7 each emit a unique signal which is received by the smart tag 5. Based on the signal strengths of the received signals, the smart tag 5 can then compute its position relative to the beacons in a known manner. A signal 6 transmitted by the smart tag can then comprise an identification code of the smart tag 5 as well as the position calculated by the smart tag, which can be received by the first position determining system 7, 11.
  • the beacons each emit a unique signal which is received by the smart tag 5.
  • the smart tag determines the signal strengths of the received signals.
  • the smart tag 5 emits a signal 6 which comprises the identification code of the smart tag 5 and information about the determined signal strengths.
  • the position determining system 7, 11 receives the information of the signal 6, whereupon the computer system 11 determines the position of the smart tag relative to the beacons in a known manner on the basis of the information about the signal strengths.
  • the smart tag includes a GPS receiver to determine its position.
  • the smart tag 5 then comprises the first position determining system and in that case emits a signal 6 which comprises the identification code as well as information about the determined position.
  • the identification code of the smart tag 5 and the information about the position of the smart tag 5 is received by, for example, a receiver of the computer system 11. In that case, the beacons 7 can be omitted.
  • the system further includes a second position determining system 12.
  • This second position determining system is provided as part of a terminal 13.
  • the terminal 13 in this example comprises a tablet computer 13.1.
  • the second position determining system 12 is included in the tablet computer 13.1.
  • the tablet computer 13.1 is carried by a user 14 such as a farmer 14 in the barn 9.
  • the tablet computer 13.1 has a camera 16.
  • the second position determining system 12 has an orientation sensor and a GPS unit.
  • the tablet computer 13.1 determines with the second position determining system the location and orientation of the tablet computer 13.1 and in particular the direction of the camera, for example, of a viewing direction of the camera.
  • On a first display 15 of the tablet computer 13.1, a camera recording of an environment of tablet computer 13.1 is represented, in this example this recording is a video stream.
  • the camera recording is here renewed with a frequency of at least 30 Hz.
  • the tablet computer 13.1 is configured to forward the information about orientation and location of the tablet computer 13.1 to the computer system 11 via a wireless telecommunication network.
  • the computer system 11 is configured to determine whether any and what animal is where in the camera picture.
  • the computer system 11 is hence configured, for example, to determine whether there is an animal in the camera picture and, if so, what animal is where in the camera picture. All of this can also be done for a plurality of animals.
  • the computer system 11 is configured to determine this on the basis of the information about the identification code of the smart tag 5 and the location of the smart tag 5.
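The decision whether, and where, a tag falls within the camera picture could be sketched as follows. This is a hypothetical 2D simplification (flat barn floor, pinhole camera); the field-of-view value and the function name are illustrative assumptions, not details from the patent.

```python
import math


def tag_screen_x(terminal_xy, camera_heading_deg, tag_xy, fov_deg=60.0):
    """Return the normalized horizontal screen position (0..1) of a tag,
    or None when the tag lies outside the camera's field of view.

    The heading is measured clockwise from north, with x = east and
    y = north. The 2D flat-floor and pinhole-camera model is a
    simplifying assumption for illustration.
    """
    dx = tag_xy[0] - terminal_xy[0]
    dy = tag_xy[1] - terminal_xy[1]
    bearing = math.degrees(math.atan2(dx, dy))  # clockwise from north
    # Wrap the angular offset into the range [-180, 180).
    offset = (bearing - camera_heading_deg + 180.0) % 360.0 - 180.0
    if abs(offset) > fov_deg / 2.0:
        return None  # tag is outside the camera picture
    return 0.5 + offset / fov_deg
```

A tag straight ahead maps to 0.5 (picture centre); a `None` result corresponds to the case where an indicator could lead the user toward the animal by reorienting the terminal.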
  • the tablet computer 13.1 is configured to retrieve this information about the animal 3 from the computer system 11 and to represent it in the picture on the first display 15.
  • the tablet computer 13.1 is configured, for example, to highlight the animal 3 in the picture on the first display 15 with a bright color, such as yellow.
  • the tablet computer 13.1 is configured so as, when the animal 3 is clicked on in the picture on the first display 15, to query the computer system 11 for further information about the animal 3.
  • the computer system is configured to retrieve this information from a database 17.
  • the computer system 11 includes a second display 19 on which the video stream of the camera is shared.
  • a veterinarian (not represented, but conventional) can co-view via the second display 19.
  • the computer system 11 is configured for entering digital shadings on the second display 19. This can be carried out, for example, by implementing the display 19 as a touchscreen and entering a shading in the picture by touching of the display 19.
  • the second display 19 may be set up fixedly. It is also possible that the computer system sends the same picture that is displayed on the first display to a mobile third display (not shown) to be displayed on the third display. This third display too may be carried by a third party, such as a veterinarian.
  • the computer system 11 is configured to send information about the shadings via the telecommunication network to the tablet computer 13.1.
  • the tablet computer 13.1 is configured to show the shadings on the first display 15.
  • Figure 2 schematically shows a method 100 for providing information of a cow 3 which is provided with the smart tag 5.
  • In step a., the position of the smart tag 5 in the barn 9 is determined with the aid of the first position determining system 7.
  • the first step 101 leads to a second step 102.
  • In step b., the identification code of the smart tag 5 is read out with the aid of the information readout system.
  • step 102 leads to a third step 103.
  • In the third step 103, step c., the position and orientation of the tablet computer 13.1 in the barn 9 are determined with the aid of the second position determining system.
  • step 103 leads to a fourth step 104.
  • In the fourth step 104, step d., with the aid of the camera of the tablet computer 13.1, the recording of the environment of the terminal is displayed on the first display 15 of the tablet computer 13.1.
  • the fourth step 104 leads to a fifth step 105.
  • In the fifth step 105, step e., on the basis of the information obtained in steps a., b. and c., information about the animal 3 is retrieved from the database 17 of the computer system 11, in this example medical information.
  • the fifth step 105 leads to a sixth step 106.
  • In step f., the information from the fifth step 105 is sent by the computer system 11 to the tablet computer 13.1 via the wireless telecommunication network.
  • the sixth step 106 leads to a seventh step 107.
  • In step g., the tablet computer highlights a part of the picture on the first display in which the animal is present. This provides to the farmer 14 the information that the animal has been identified.
  • the seventh step 107 leads to an eighth step 108.
  • In the eighth step 108, it is determined whether the farmer 14, or user, has pressed or clicked on the highlighted part of the picture on the first display. If so, the eighth step 108 leads to a ninth step 109; if not, the eighth step 108 leads to a tenth step 110. This tenth step, however, may also be omitted.
  • the received information about the animal 3 is added to the picture on the first display 15.
  • the received information is acute medical information.
  • the ninth step 109 leads to the tenth step 110, which, as said, may be omitted.
  • In the tenth step 110, a veterinarian makes a shading on the second display 19.
  • the shading is then also represented on the first display 15. Such shading may be clicked away on the first or second display so that it does not appear on the first or second display anymore.
  • the tenth step 110 leads to an eleventh step 111.
  • In the eleventh step 111, the farmer takes the temperature of the animal 3 with a thermometer connected to the tablet computer 13.1. This information is then sent by the tablet computer 13.1 to the computer system 11 and stored in the database 17.
  • the user removes, except for the highlight displayed over the video stream, all other information about the animal from the picture, for example by clicking outside this information or outside the highlight.
  • the information returns when the user clicks on the highlighted area on the first display again.
  • the method comprises step a. of determining the position of the tag with the aid of a first position determining system.
  • the method comprises step b. of reading out the identification code of the tag with the aid of an information readout system.
  • the method comprises step c. of determining the position and directional orientation of a portable terminal, such as a tablet or a mobile phone, with the aid of a second position determining system.
  • the method comprises step d. of displaying with the aid of a camera of the terminal a recording of an environment of the terminal on a first display of the terminal.
  • the method comprises step e. of, on the basis of the information obtained in steps a., b. and c., retrieving information from a database of a computer system about at least an animal that is visible in the recording.
  • the method comprises step f. of wirelessly sending the information from step e. to the terminal.
  • the method comprises step g. of the terminal adding received information about the animal to the picture on the first display.
  • A recognized animal was highlighted in the picture on the display 15. Also, after selecting of the highlighted animal, further information about the animal was displayed on the display 15. This information is, for example, displayed above the animal. Here, preferably, the highlighted part of the picture and the further information are coupled to the animal. If, for example, the position of the animal changes because the animal is walking, a position change of the tag will be detected by the first position determining system. As a result, it can be determined anew what part of the picture on the display is to be highlighted and where in the picture the additional information is to be displayed.
  • Highlighting may be, for example, a circle around the position in the picture where the tag of the animal is present. So if the animal is walking, the position of the animal in the picture on the display 15 will change. The highlighted part and the further information that is displayed in the picture on the display will then move along with the animal and is in effect locked to the position of the animal. If the position of the animal in the picture on the display 15 is known, however, the tablet computer may also determine on the basis of pattern recognition software known per se what the contours of the animal are and specifically highlight only that part of the picture in which the animal is present. In this manner, therefore, the whole animal or, for example, just its head, is highlighted. This, then, is an alternative to the highlighted circle. Also if the position of the animal changes in that the animal is walking, in this case the highlighted (part of the) animal will move along with the animal in the picture on the display 15.
  • If the position and/or orientation of the tablet computer 13.1 changes, this will be detected by the second position determining means. This will generally also result in a new position of the animal in the picture on the display 15. Then too, entirely analogously to what has been discussed for the case where the animal walks, the highlighted part of the animal and the further information about the animal will move along with the animal. To put it differently, in the picture on the display 15, the position of the highlighted part of the animal relative to the animal and the position of the further information relative to the animal do not change if the position and/or orientation of the tablet computer 13.1 changes and/or if the position of the animal changes. This also applies to the picture on the display 19, because this is the same picture as on the display 15.
  • the orientation of the tablet computer 13.1 can be expressed in three angle coordinates.
  • the orientation of the tablet computer 13.1 can be defined by a first angle al in the horizontal plane H that represents a direction of a perpendicular projection P of the length axis on the horizontal plane (for example, relative to the north N), a second angle a2 between the length axis and a vertical V, and a third rotation angle a3 of, for example, a (fictitious) flat plane Q of the tablet computer 13.1 around the length axis of the tablet computer 13.1 relative to a fictitious predetermined zero position (this zero position effectively being a predetermined position of the plane Q around the axis L) (see also Figure 4 in which also the west W is indicated).
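For illustration, the first two angles a1 and a2 already fix the direction of the length axis L as a unit vector; the roll angle a3 only spins the housing around that axis. The sketch below uses an east/north/up axis convention, which is an assumption for illustration and not specified by the patent.

```python
import math


def axis_direction(a1_deg, a2_deg):
    """Unit vector of the terminal's length axis L from the two angles
    a1 (azimuth of its horizontal projection, clockwise from north) and
    a2 (tilt away from the vertical).

    Axis convention (an illustrative assumption): x = east, y = north,
    z = up. The roll angle a3 rotates the housing around L and therefore
    does not change this vector.
    """
    a1, a2 = math.radians(a1_deg), math.radians(a2_deg)
    return (math.sin(a2) * math.sin(a1),   # east component
            math.sin(a2) * math.cos(a1),   # north component
            math.cos(a2))                  # up component
```

For example, a2 = 0 gives a vertically held axis, while a1 = 0, a2 = 90 gives an axis pointing horizontally north.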
  • That the position of the further information in the picture on the display does not change may be realized as follows.
  • It may hold that the first position determining system 7, 11 determines coordinates of the position of the tag in a first coordinate system and that the second position determining system 12 determines the coordinates of the position of the terminal 13 in a second coordinate system, wherein the second position determining system 12 is included in the terminal and wherein the terminal, in use, converts the coordinates of the position of the tag in the first coordinate system to coordinates of the position of the tag in the second coordinate system for carrying out step g.
  • Both coordinate systems can utilize, for example, Cartesian coordinates. Other coordinates, however, such as spherical coordinates or polar coordinates, are also possible.
  • conversion is carried out with a conversion algorithm which is executed by a processor of the terminal. For the purpose of conversion, for example, a calibration is carried out, wherein:
  • the terminal is brought to the second position, while at the second position, coordinates of the second position are determined in the second coordinate system with the terminal;
  • parameters are determined to enable conversion of the coordinates of the position of the tag in the first coordinate system to the coordinates of the position of the tag in the second coordinate system for carrying out step g.
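Such a calibration-based conversion could, under the simplifying assumption of a 2D rigid transform (rotation plus translation, equal length units in both systems), be sketched as follows. The function and parameter names are hypothetical; the patent does not prescribe this particular algorithm.

```python
import math


def fit_transform(p1_sys1, p1_sys2, p2_sys1, p2_sys2):
    """Fit a 2D rotation + translation mapping first-coordinate-system
    points to the second coordinate system, from two calibration points
    known in both systems. Returns a conversion function.

    Assumes both systems use the same length unit (an illustrative
    simplification).
    """
    # Rotation angle from the direction of the segment between the
    # two calibration points in each system.
    v1 = (p2_sys1[0] - p1_sys1[0], p2_sys1[1] - p1_sys1[1])
    v2 = (p2_sys2[0] - p1_sys2[0], p2_sys2[1] - p1_sys2[1])
    theta = math.atan2(v2[1], v2[0]) - math.atan2(v1[1], v1[0])
    c, s = math.cos(theta), math.sin(theta)
    # Translation that makes the first calibration point line up.
    tx = p1_sys2[0] - (c * p1_sys1[0] - s * p1_sys1[1])
    ty = p1_sys2[1] - (s * p1_sys1[0] + c * p1_sys1[1])

    def convert(p):
        """Convert a tag position from system 1 to system 2."""
        return (c * p[0] - s * p[1] + tx, s * p[0] + c * p[1] + ty)

    return convert
```

Once fitted, `convert` can map every tag position reported in the first coordinate system into the terminal's coordinate system for carrying out step g.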
  • the terminal 13 comprises an augmented reality device 13.2 (see Figure 3) such as a pair of reality glasses.
  • An example of a pair of reality glasses is Microsoft's HoloLens.
  • An augmented reality device has for its aim to partly bridge the gap between the digital and the analog world, inter alia by enabling digital 3D models to be displayed in the real space such as it is perceived by a user wearing the device. To this end, the illusion is created that the digital models effectively are at a particular distance from the user. This functionality is reminiscent of what are popularly called holograms.
  • Reality glasses such as the HoloLens look like good-sized sunglasses and are in effect a head-mounted display into which a computer has been incorporated.
  • Like the Kinect, the device features a camera system which in addition to length and width also perceives depth in view.
  • This camera has a field of view of, for example, 120° by 120°.
  • the device can respond to gestures, spoken commands and/or eye movements.
  • the computer of the device 13.2 is typically equipped with three types of processors: in addition to the traditional CPU and GPU, there is also a so-called 'Holographic Processing Unit' (HPU).
  • the device 13.2 may be communicatively connected with the tablet 13.1, the display 15 of the tablet being functionally replaced with the device 13.2.
  • the first display may hence also be formed by an augmented reality device which generates a 3D picture to be perceived by the user 14.
  • the first display 15 is hence a broad term.
  • the second display 19 may be replaced with an augmented reality device which generates the same picture as the device 13.2.
  • the term display 19 hence also comprises an augmented reality device.
  • the second position determining system 12 is also functionally included in the device 13.2 instead of in the tablet computer.
  • the position determining system of the tablet computer is not used then.
  • the device 13.2 is worn by a user 14 such as a farmer 14 in the barn
  • the device 13.2 functionally comprises the camera 16.
  • the camera of the tablet computer is not used then.
  • the second position determining system 12 has an orientation sensor and a GPS unit. With the second position determining system, the device 13.2 determines the location and orientation of the device.
  • the location can be expressed in Cartesian coordinates and the orientation in three angles.
  • An example of the three angle coordinates is the following (and the same as described above for the tablet 13.1).
  • the orientation of the device can be defined by a first angle a1 in the horizontal plane H which represents a direction of the perpendicular projection P of the axis L onto the horizontal plane (for example, relative to the north N), a second angle a2 between the axis L and a vertical V, and a third rotation angle a3 of, for example, a (fictitious) flat plane Q of the device around the axis L of the device relative to a predetermined zero position of the flat plane Q (see Figure 4 in which also the west W is indicated).
  • the tablet 13.1 then communicates with the computer system 11 as discussed before and sends information to be displayed, such as the earlier-mentioned highlighting of the animal, to the device 13.2.
  • the user 14 can then point out the highlighted animal with a gesture so that the earlier-discussed further information is displayed in the real world which the user perceives with the device.
  • the information may be displayed in 3D, for example, above the selected animal.
  • a hologram is then displayed above the animal. If the animal starts to move, this is noticed by the first position determining system. The result is that the position of the animal expressed in the second position determining system will change along and that the hologram moves along with the animal. Also when a user 14 moves the second device so that the animal moves in the view of the user, the hologram will move along.
  • The user 14 can likewise make the information invisible again with gestures. If a veterinarian shades a part of an animal on the display 19, such shading is also shown to the user by display in the picture of the real world he perceives. The position of the shading, too, is locked to the animal in the picture generated by the device 13.2 and will hence move along with the animal if the position of the animal in the picture changes, because the position of the animal in the real world changes and/or because the position and/or orientation of the device 13.2 changes.
  • the received information about the animal is virtually displayed in the real environment such as it is perceived by the user 14.
  • the first coordinate system is a two-dimensional system which is in a horizontal plane
  • the second coordinate system is a three-dimensional system, two coordinates of which set up a horizontal plane, while the coordinates which indicate the position of the tag in the second coordinate system correspond to the coordinates that set up the horizontal plane in the second coordinate system.
  • the second algorithm is implemented as an app, which, in use, is executed by the terminal, in this example in the tablet computer 13.1.
  • the terminal only comprises the augmented reality device 13.2 which is also provided with the above-mentioned software (app) to perform the necessary
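The relation between the two coordinate systems mentioned above (a two-dimensional first system in a horizontal plane for the tag, and a three-dimensional second system whose first two coordinates span a horizontal plane) can be illustrated with a minimal sketch. This is not part of the disclosure: the translation offset between the origins and the fixed collar height used for the third coordinate are hypothetical assumptions.

```python
def tag_to_second_system(tag_xy, origin_offset=(0.0, 0.0), tag_height=0.9):
    """Lift a tag position from the first (2D, horizontal) coordinate
    system into the second (3D) coordinate system.

    Assumes (hypothetically) that the horizontal axes of the two
    systems coincide up to a translation, and that the tag hangs at a
    fixed height above the floor, which supplies the third coordinate.
    """
    x, y = tag_xy
    ox, oy = origin_offset
    return (x - ox, y - oy, tag_height)

# A tag at (12.0, 7.5) in the first system, with the second system's
# origin located at (2.0, 0.5) of the first system:
print(tag_to_second_system((12.0, 7.5), origin_offset=(2.0, 0.5)))  # → (10.0, 7.0, 0.9)
```

The two horizontal coordinates pass through unchanged up to the translation, matching the stated correspondence between the horizontal-plane coordinates of the two systems.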


Abstract

Method and system for providing information of an animal, wherein the animal is provided with a tag, the position of which can be determined and an identification code of which can be read out. The method comprises step a. of determining the position of the tag with the aid of a first position determining system. The method comprises step b. of reading out the identification code of the tag with the aid of an information readout system. The method comprises step c. of determining the position and orientation of a portable terminal such as a tablet or a mobile phone with the aid of a second position determining system. The method comprises step d. of, with the aid of a camera of the terminal, displaying a recording of an environment of the terminal on a first display of the terminal. The method comprises step e. of, on the basis of the information obtained in steps a., b. and c., retrieving information from a database of a computer system about at least an animal that is visible in the recording. The method comprises step f. of wirelessly sending the information from step e. to the terminal. The method comprises step g. of the terminal adding received information about the animal to the picture on the first display.

Description

Method and system for providing information of an animal.
The invention relates to a method and a system for providing information of an animal, in particular readily providing information of an animal to a farmer in a barn environment.
Information of an animal is very important in situations of inspection and routine examination by a farmer. Often, such information is used by a farmer in support of safeguarding animal welfare. For example, information about health or relevant medical history is important when an animal is being inspected.
Various methods and systems have been proposed for providing information of an animal to a farmer. In current methods, a farmer looks up information about an animal in a database which is updated by the farmer himself.
A problem is that the farmer often lacks ready information. For example, it happens that a farmer has to look up information in the absence of an animal. Readiness of information in the presence of an animal is of great benefit to the farmer, since it may shorten the handling time of taking care of or inspecting an animal. A problem is, further, that the animal can move freely in an environment such as a barn. The animal thus has the freedom to mingle with other animals in a group. This is why it happens that a farmer has to identify an animal repeatedly, or has to deprive the animal of its freedom of movement during retrieval of information about the animal. This is time-intensive during inspection of an animal and disadvantageous to the animal itself. It is also at the expense of valuable response time for the farmer to react in case of an acute disease.
In line with these circumstances, there is a need to support the farmer in his inspections of his animals. Further, there is a need to provide ready information to the farmer in the presence of an animal. It is an object of the invention to provide a method and system for providing ready information of an animal. It is thus an object of the invention to obviate at least one of the existing disadvantages of the current method or mitigate the consequences thereof. It is also an object of the present invention to provide improved or alternative solutions that can be implemented in a simpler manner and, moreover, can be produced comparatively inexpensively. Alternatively, it is an object of the invention to provide the public with an at least useful option.
To this end, the invention provides a method for providing information of an animal. The animal is here provided with a tag, such as a smart tag, the position of which can be determined and an identification code of which can be read out. An example is a smart tag with a UHF transmitter whose signal reaches a plurality of beacons within a space, such as a barn, so that it can be determined with multilateration what the physical location of the tag in the space is. The smart tag then also emits a unique identification code, by which the animal wearing the tag can be identified. Another example of determining the position of the tag concerns a position determining system with a plurality of beacons each continuously emitting a unique electromagnetic signal which is received by a smart tag. The tag can calculate its position relative to the beacons on the basis of the received signal strengths of the beacons. The tag may wirelessly transmit information about the calculated position to, for example, a computer for further processing or storage. Also, it is possible that the smart tag wirelessly transmits information about the received signal strengths of the beacons to a computer which then calculates the position of the tag relative to the beacons on the basis of the signal strengths. Other possibilities of determining the position of a tag are also possible, such as a smart tag which includes a GPS receiver, the smart tag in a known manner wirelessly transmitting information about the position of the smart tag determined with the GPS receiver. The method comprises step a., determining the position of the tag with the aid of a first position determining system. The method comprises step b., reading out the identification code of the tag with the aid of an information readout system. Reading out may be done in a variety of manners; for example, the tag may also comprise a visible code which can be optically read out by a camera on a portable terminal.
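Locating the tag from distance estimates to at least three beacons, as in the multilateration example above, can be sketched in Python as follows. This is an illustrative sketch only: the function name trilaterate, the beacon layout and the assumption that tag-beacon distances have already been derived from the received signals (for example from signal strengths or time of flight) are not part of the disclosure.

```python
import math

def trilaterate(b0, b1, b2, d0, d1, d2):
    """Estimate the 2D tag position from distances to three beacons.

    Subtracting the first circle equation |p - b0|^2 = d0^2 from the
    other two gives two linear equations in the tag coordinates
    (x, y), solved here with Cramer's rule. b0..b2 are beacon (x, y)
    tuples; d0..d2 are the measured tag-beacon distances.
    """
    # 2*(bi - b0) . (x, y) = |bi|^2 - |b0|^2 + d0^2 - di^2  for i = 1, 2
    a11, a12 = 2 * (b1[0] - b0[0]), 2 * (b1[1] - b0[1])
    a21, a22 = 2 * (b2[0] - b0[0]), 2 * (b2[1] - b0[1])
    r1 = b1[0]**2 + b1[1]**2 - b0[0]**2 - b0[1]**2 + d0**2 - d1**2
    r2 = b2[0]**2 + b2[1]**2 - b0[0]**2 - b0[1]**2 + d0**2 - d2**2
    det = a11 * a22 - a12 * a21
    x = (r1 * a22 - r2 * a12) / det
    y = (a11 * r2 - a21 * r1) / det
    return x, y

# Beacons at three corners of the barn, tag actually at (4, 3):
beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
dists = [math.hypot(4.0 - bx, 3.0 - by) for bx, by in beacons]
x, y = trilaterate(*beacons, *dists)
print(round(x, 6), round(y, 6))  # → 4.0 3.0
```

With more than three beacons, the same linearized equations can be stacked and solved by least squares, which makes the estimate more robust against measurement noise.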
The method comprises step c., determining the position and orientation of a portable terminal such as a tablet or a mobile phone with the aid of a second position determining system. The method comprises step d., with the aid of a camera of the terminal, displaying a recording of an environment of the terminal on a first display of the terminal. The method comprises step e. of, on the basis of the information obtained in steps a., b. and c., retrieving further information from a database of a computer system about at least an animal that is visible in the recording. The method comprises step f., wirelessly sending the information from step e. to the terminal. This may be done, for example, via a wireless telecommunication network or Wi-Fi. The method comprises step g., of the terminal adding received information about the animal to the picture on the first display. Individual animal information is thereby projected at the respective animal on the basis of position determination. The adding of information about the animal to the picture on the first display involves placing a digital layer (Augmented Reality) over reality, namely the image that the user himself sees. The received information can then be a digital context which is coupled to the respective animal and remains 'tagged' to the animal regardless of the orientation of the viewer. In this way, the digital layer can be added to reality on any display with access to an image of reality.
Optionally, the animal is indicated as a sign that the identity of the animal has been established in steps a. and b. The indicating may then involve displaying at least a part of the information about the animal in step g. An advantage is that in this way incorrect information provision is avoided. Further, this provides confirmation of identity without repeated identification being necessary.
Optionally, the indication is implemented in a manner other than with characters or signs, for example, by a coloration of a part of the picture, such as the part on which the animal can be seen. The advantage is that no part of the visible picture is obscured. As a result, the overview of the environment on the first and possibly any other display remains intact.
Optionally, displaying the information about the animal on the first display in step g. comprises the indicating of the animal by highlighting the animal as an indication that the position and identity of the animal have been established in steps a. and b. In the current situation, the user himself has to make the link between those animals that he/she sees and the position on a map. An advantage is that the animal in this way clearly stands out relative to the environment, and any decrease of awareness of the environment is prevented in that no part of the picture is ousted by text, clouds, signs, features.
Optionally, in a step h., by clicking on the animal that is indicated in step g., additional information is displayed in the first display with the aid of text. When there is a second display, in a step h. the same may also be displayed on the second display when the animal has been clicked on in step g. This may be done, for example, through mutual communication between the terminal and the second display via a computer system. Possibly, clicking on the animal may lead the user, such as the farmer, to an animal page for that specific animal.
Optionally, with the aid of the terminal, information about the animal being displayed, for example a photograph or video stream of the animal made with the terminal, is supplied to a computer system, in particular to the database. Information supply to the computer system or database thus becomes dependent on user observation. An advantage is that in this way information can be gathered about any specific observation period of a specific animal.
Optionally, the information that is supplied to the computer system is entered at the terminal by a user of the terminal. Notes on the animal can then be supplied to the database by a farmer or veterinarian with the terminal. An advantage is that the database can be supplemented with current information while the farmer is with the animal in the barn.
Optionally, the information that is supplied to the computer system, in particular the database, is supplied by a sensor to the terminal or measured by a sensor of the terminal. Thus, for example, a thermometer, sphygmomanometer, heart rhythm monitor, or acoustic measuring instrument can be such a sensor. An advantage is that measurements are coupled to moments of observation and are immediately available via the computer system or database. As a result, a measurement can be coupled directly to a specific animal and the right time. For an animal, thus a coherent (medical) file is created which is automatically built up.
Optionally, with the aid of the sensor, a heartbeat, blood pressure and/or temperature of the animal is determined. The sensor may also be present in or on the tag of the animal and may be connected to the terminal for reading it out.
Optionally, with the aid of the terminal, the picture of the animal obtained with the camera is supplied to a computer system and/or possibly the database.
Optionally, the computer system comprises a second display which may be viewed by a third party such as a veterinarian. An advantage is that the picture on the first display which the user sees may be remotely shared so that a third party such as a veterinarian with access to video can arrive at decisions remotely.
Optionally, the computer system comprises means for audio and/or visual communication so that the veterinarian or other third party is able not only to co-view, but also to communicate with the farmer or user of the terminal. Audio and visual communication means can be a microphone and webcam.
Optionally, from the second display, by the third party, such as the veterinarian, a marking may be placed on the picture on the first display, for example, a circle or full line around a particular part of the animal. This might be done, for example, by touching the second display, for instance when the second display is a touchscreen. This simplifies diagnosing remotely, and enables the third party to draw the user's attention to a specific part of the animal.
Optionally, the picture on the first display, preferably in combination with the information from step e., is also displayed on a second and/or a third display of a third party. An advantage is that the picture on the first display which the user sees may be shared remotely, so that a third party, such as a veterinarian, with access to video in combination with the information, can arrive at decisions remotely. A third display may, for example, be supported on, or be part of, a further portable terminal which communicates with the computer system. As a result, the third party, such as the veterinarian, could be mobile.
Optionally, the second and the third display are the same display. An advantage of this is that the second display is both mobile and part of the computer system.
Optionally, in a step i., the identity of an animal to be found is entered in the computer system and if the entered identity of the animal corresponds to the identification code of the tag read out in step b., step f. is carried out. An advantage is that in a picture of a group of animals, the animal whose identity has been entered can be found directly. Optionally, an indicator could be shown on the first display when the animal is not in the picture, so that the user can be led to the animal on the basis of an orientation of the terminal. Optionally, entering the identity of the animal in step i. is carried out by a user with the aid of the terminal.
Optionally, displaying in step d. involves displaying a video stream.

Further, the invention provides a system for providing information of an animal. The system comprises: a position-determinable tag provided with a readable identification code, an animal being provided with the tag; a first position determining system, the first position determining system being configured to determine the position of the tag; an information readout system for reading out the identification code of the tag; a second position determining system for determining the position and orientation of a portable terminal; the portable terminal provided with a camera and a first display, the portable terminal being configured to display a recording with the camera of an environment of the terminal on the first display; and a computer system. The computer system comprises a database for keeping information about the animal, the system being configured to retrieve the information about the animal that is visible in the recording from the database on the basis of the determined position of the tag and the identification code read out. The computer system is furthermore configured for wirelessly sending the information to the terminal, the terminal being configured to add the received information about the animal to the picture on the first display. For this purpose, the computer system may, for example, be connected to a telecommunication network, or include a router unit.
Optionally, the second position determining system comprises a GPS and orientation sensor which are each arranged at the portable terminal. In an example, the position determining system could be built into the portable terminal.
Optionally, the computer system comprises a second display which can be viewed by a third party such as a veterinarian. The system may then be configured to show on the second display at least the recording of the first display. Preferably, the system is configured to add information about the animal to the picture on the second display.
Optionally, the second display is a part of a further terminal, the further terminal, for example, itself forming part of the computer system. Optionally, the system may be regarded as a network of at least one portable terminal and a central computer with access to a database, wherein a further portable terminal optionally also forms part of the network and wherein the further portable terminal comprises the second or third display. Communicative connection between each portable terminal and the central computer, also: the computer system, may proceed via wireless communication over, for example, a telephone network or Wi-Fi.
Optionally, the system is configured to communicate markings made on the second display by a user of the second display to the first portable terminal, and wherein the first terminal is configured to represent the markings on the first display, and wherein the marking is, for example, a circle or full line for indication of a particular part of the displayed animal, for example, in that the circle or full line is around a particular part of the displayed animal.
The invention will be further clarified by a description of a few specific embodiments. For this purpose, use is made of references to the appended drawings. The detailed description provides examples of possible implementations of the invention. These implementations are not to be regarded as the only possible embodiments that fall within the scope of the invention. The scope of the invention is defined in the claims, and the description is to be regarded as illustrative without being restrictive on the invention.
Fig. 1 schematically shows a system for providing information of an animal;
Fig. 2 schematically shows a method for providing information of an animal;
Fig. 3 shows a special embodiment of a device that may or may not form part of a terminal of the system of Fig. 1; and
Fig. 4 shows three angles which reflect an orientation of a terminal.

Figure 1 schematically shows a system 1 for providing information of an animal such as a cow 3. The animal 3 is provided with a smart tag 5. The smart tag 5 contains an energy source and transmitting means (not represented, but conventional) for periodically emitting, at a predetermined UHF frequency, a signal 6 to a first position determining system 7, 11. The UHF frequency here contains an identification code which can be read out by the first position determining system 7, 11. The signal 6 may thus comprise, for example, a carrier having the predetermined UHF frequency, with the identification code modulated onto the carrier. The first position determining system 7, 11 has at least three beacons 7, also: receivers (not represented, but conventional). A computer system 11 is communicatively connected with the beacons 7 and is configured to determine the location of the smart tag 5 on the basis of triangulation of the signal 6.
To put it differently, the computer system 11 receives from each of the beacons the received signal 6, whereupon, on the basis of the at least three received signals 6, based on a known method, the position of the smart tag 5 is determined by the computer system 11. The computer system 11 further includes a software-based information readout system (not represented, but conventional) which, on the basis of the predetermined UHF frequency (in the range of 300 MHz to 3 GHz) of the signal 6, reads out the identification code of the smart tag 5. To put it differently, the computer system 11 is configured to extract the identification code from at least one of the received signals 6.
Other techniques of determining the position of the smart tag 5, however, are also possible. For example, it is possible that the beacons 7 each emit a unique signal which is received by the smart tag 5. Based on the signal strengths of the received signals, the smart tag 5 can then compute its position relative to the beacons in a known manner. A signal 6 transmitted by the smart tag can then comprise an identification code of the smart tag 5 as well as the position calculated by the smart tag, which can be received by the first position determining system 7, 11.
It is also possible that the beacons each emit a unique signal which is received by the smart tag 5. The smart tag then determines the signal strengths of the received signals. Then, the smart tag 5 emits a signal 6 which comprises the identification code of the smart tag 5 and information about the determined signal strengths. The position determining system 7, 11 receives the information of the signal 6, whereupon the computer system 11 determines the position of the smart tag relative to the beacons in a known manner on the basis of the information about the signal strengths.
It is also possible that the smart tag includes a GPS receiver to determine its position. The smart tag 5 then comprises the first position determining system and in that case emits a signal 6 which comprises the identification code as well as information about the determined position. The identification code of the smart tag 5 and the information about the position of the smart tag 5 is received by, for example, a receiver of the computer system 11. In that case, the beacons 7 can be omitted.
The system further includes a second position determining system 12. This second position determining system is provided as part of a terminal 13. The terminal 13 in this example comprises a tablet computer 13.1. In this example, the second position determining system 12 is included in the tablet computer 13.1. The tablet computer 13.1 is carried by a user 14 such as a farmer 14 in the barn 9. The tablet computer 13.1 has a camera 16. The second position determining system 12 has an orientation sensor and a GPS unit. The tablet computer 13.1 determines with the second position determining system the location and orientation of the tablet computer 13.1 and in particular the direction of the camera, for example, a viewing direction of the camera. On a first display 15 of the tablet computer 13.1, a camera recording of an environment of the tablet computer 13.1 is represented; in this example, this recording is a video stream. The camera recording is here renewed with a frequency of at least 30 Hz. The tablet computer 13.1 is configured to forward the information about orientation and location of the tablet computer 13.1 to the computer system 11 via a wireless telecommunication network (not represented, but conventional). The computer system 11 is configured to determine whether any and what animal is where in the camera picture. The computer system 11 is hence configured, for example, to determine whether there is an animal in the camera picture and, if so, what animal is where in the camera picture. All of this can also be done for a plurality of animals. The computer system 11 is configured to determine this on the basis of the information about the identification code of the smart tag 5 and the location of the smart tag 5. The tablet computer 13.1 is configured to retrieve this information about the animal 3 from the computer system 11 and to represent it in the picture on the first display 15. The tablet computer 13.1 is configured, for example, to highlight the animal 3 in the picture on the first display 15 with a bright color, such as yellow. The tablet computer 13.1 is configured so as, when the animal 3 is clicked on in the picture on the first display 15, to query the computer system 11 for further information about the animal 3. The computer system is configured to retrieve this information from a database 17. Further, the computer system 11 includes a second display 19 on which the video stream of the camera is shared. A veterinarian (not represented) can co-view via the second display 19. The computer system 11 is configured for entering digital shadings on the second display 19. This can be carried out, for example, by implementing the display 19 as a touchscreen and entering a shading in the picture by touching of the display 19. The second display 19 may be set up fixedly. It is also possible that the computer system sends the same picture that is displayed on the first display to a mobile third display (not shown) to be displayed on the third display. This third display too may be carried by a third party, such as a veterinarian.
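One way the computer system 11 could decide whether a tag falls inside the camera picture, and where, given the terminal's location and the compass heading of the camera, is a simple pinhole-style projection in the horizontal plane. This is a hypothetical sketch, not the disclosed implementation: the function name, the 60° field of view and the 1920-pixel image width are assumptions, and a real implementation would also account for the other two orientation angles.

```python
import math

def tag_pixel_x(terminal_pos, heading_deg, tag_pos, fov_deg=60.0, image_width=1920):
    """Map a tag's 2D world position to a horizontal pixel column.

    Uses only the terminal's location and the compass heading of the
    camera (comparable to angle a1 in the description) with a pinhole
    camera model; returns None if the tag lies outside the field of view.
    """
    dx = tag_pos[0] - terminal_pos[0]
    dy = tag_pos[1] - terminal_pos[1]
    bearing = math.degrees(math.atan2(dx, dy))            # compass bearing to the tag
    off = (bearing - heading_deg + 180.0) % 360.0 - 180.0  # signed angle off-axis
    half = fov_deg / 2.0
    if abs(off) > half:
        return None                                        # animal not in the picture
    # Pinhole projection: pixel offset is proportional to tan of the angle.
    f = (image_width / 2.0) / math.tan(math.radians(half))
    return image_width / 2.0 + f * math.tan(math.radians(off))

# A tag straight ahead of the camera lands in the image centre:
print(tag_pixel_x((0, 0), 0.0, (0, 5)))  # → 960.0
```

The returned column would then anchor the highlight or the further information at the animal in the picture; as the tag or the terminal moves, recomputing this projection makes the overlay move along with the animal.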
The computer system 11 is configured to send information about the shadings via the telecommunication network to the tablet computer 13.1. The tablet computer 13.1 is configured to show the shadings on the first display 15.
Figure 2 schematically shows a method 100 for providing information of a cow 3 which is provided with the smart tag 5.
In a first step 101, step a., the position of the smart tag 5 in the barn 9 is determined with the aid of the first position determining system 7. The first step 101 leads to a second step 102.
In the second step 102, step b., the identification code of the smart tag 5 is read out with the aid of the information readout system. The second step 102 leads to a third step 103.
In the third step 103, step c, the position and orientation of the tablet computer 13.1 in the barn 9 is determined with the aid of the second position determining system. The third step 103 leads to a fourth step 104.
In the fourth step 104, step d., with the aid of the camera of the tablet computer 13.1, the recording of the environment of the terminal is displayed on the first display 15 of the tablet computer 13.1. The fourth step 104 leads to a fifth step 105.
In the fifth step 105, step e., on the basis of the information obtained in steps a., b. and c., information is retrieved from the database 17 of the computer system 11 about the animal 3, in this example medical information about the animal 3, that is visible in the recording. The fifth step 105 leads to a sixth step 106.
In the sixth step 106, step f., the information from the fifth step 105 is sent by the computer system 11 to the tablet computer 13.1 via the wireless telecommunication network. The sixth step 106 leads to a seventh step 107. In the seventh step 107, step g., by the tablet computer, a part of the picture on the first display in which the animal is present is highlighted. This provides to the farmer 14 the information that the animal has been identified. The seventh step 107 leads to an eighth step 108.
In the eighth step 108, it is determined whether the farmer 14, or user, has pressed or clicked on the highlighted part of the picture on the first display. If so, the eighth step 108 leads to a ninth step 109; if not, the eighth step 108 leads to a tenth step 110. This tenth step, however, may also be omitted.
In the ninth step 109, by the tablet computer 13.1, the received information about the animal 3 is added to the picture on the first display 15. In this example, the received information is acute medical information. The ninth step 109 leads to the tenth step 110, which, as said, may be omitted.
In the tenth step 110, a veterinarian makes a shading on the second display 19. The shading is then also represented on the first display 15. Such shading may be clicked away on the first or second display so that it does not appear on the first or second display anymore. The tenth step 110 leads to an eleventh step 111.
In the eleventh step 111, the farmer takes the temperature of the animal 3 with a thermometer on the tablet computer 13.1. This information is then sent by the tablet computer 13.1 to the computer system 11 and stored in the database 17.
In an optional twelfth step 112, the user removes, except for the highlight displayed over the video stream, all other information about the animal from the picture, for example by clicking outside this information or outside the highlight. The information returns when the user clicks on the highlighted area on the first display again.
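Taken together, steps a. through g. amount to: gather tag positions and identities, check which tags fall inside the terminal's camera picture, and fetch the corresponding database records for the overlay. A minimal illustrative stand-in follows; every name and data structure here is hypothetical and only mirrors the described flow.

```python
def provide_animal_info(tag_positions, tag_ids, terminal_pose, database, in_view):
    """Steps a.-g. as a single pass (illustrative stand-in names).

    tag_positions: {tag: (x, y)} from the first position determining
    system (step a.); tag_ids: {tag: animal_id} from the readout system
    (step b.); terminal_pose: position and orientation of the terminal
    (step c.); database: {animal_id: info} (step e.); in_view: a
    predicate deciding whether a world position falls inside the
    current camera picture given the terminal pose (steps c./d.).
    Returns the overlay to add to the picture (steps f./g.).
    """
    overlay = {}
    for tag, pos in tag_positions.items():
        if in_view(terminal_pose, pos):
            animal = tag_ids[tag]
            overlay[animal] = {"position": pos,
                               "info": database.get(animal, {})}
    return overlay

# A single cow whose tag is in view of the terminal:
overlay = provide_animal_info(
    tag_positions={"tag5": (4.0, 3.0)},
    tag_ids={"tag5": "cow3"},
    terminal_pose={"pos": (0.0, 0.0), "heading": 0.0},
    database={"cow3": {"temperature": "38.6 C"}},
    in_view=lambda pose, pos: True,   # stub: everything is in view
)
print(overlay["cow3"]["info"]["temperature"])  # → 38.6 C
```

In the described system the same pass would run continuously, so that the overlay tracks walking animals and a moving terminal.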
Thus, there has been described a method and system for providing information of an animal, the animal being provided with a tag the position of which can be determined and an identification code of which can be read out. The method comprises step a. of determining the position of the tag with the aid of a first position determining system. The method comprises step b. of reading out the identification code of the tag with the aid of an information readout system. The method comprises step c. of determining the position and directional orientation of a portable terminal such as a tablet or a mobile phone with the aid of a second position determining system. The method comprises step d. of displaying with the aid of a camera of the terminal a recording of an environment of the terminal on a first display of the terminal. The method comprises step e. of, on the basis of the information obtained in steps a., b. and c., retrieving information from a database of a computer system about at least an animal that is visible in the recording. The method comprises step f. of wirelessly sending the information from step e. to the terminal. The method comprises step g. of the terminal adding received information about the animal to the picture on the first display.
In the foregoing, for example, a recognized animal was highlighted in the picture on the display 15. Also, after selection of the highlighted animal, further information about the animal was displayed on the display 15. This information is, for example, displayed above the animal. Here, preferably, the highlighted part of the picture and the further information are coupled to the animal. If, for example, the position of the animal changes because the animal is walking, a position change of the tag will be detected by the first position determining system. As a result, an updated position can be determined, indicating what part of the picture on the display is to be highlighted and where in the picture the additional information is to be displayed.
Highlighting may be, for example, a circle around the position in the picture where the tag of the animal is present. So if the animal is walking, the position of the animal in the picture on the display 15 will change. The highlighted part and the further information displayed in the picture will then move along with the animal and are in effect locked to the position of the animal. If the position of the animal in the picture on the display 15 is known, however, the tablet computer may also determine, on the basis of pattern recognition software known per se, what the contours of the animal are and specifically highlight only that part of the picture in which the animal is present. In this manner, the whole animal or, for example, just its head, is highlighted. This is an alternative to the highlighted circle. If the position of the animal changes because the animal is walking, in this case too the highlighted (part of the) animal will move along with the animal in the picture on the display 15.
If the position and/or orientation of the tablet computer 13.1 changes, this will be detected by the second position determining means. This will generally also result in a new position of the animal in the picture on the display 15. Then too, entirely analogously to what has been discussed for the case where the animal walks, the highlighted part of the animal and the further information about the animal will move along with the animal. To put it differently, in the picture on the display 15, the position of the highlighted part of the animal relative to the animal and the position of the further information relative to the animal do not change if the position and/or orientation of the tablet computer 13.1 changes and/or if the position of the animal changes. This also applies to the picture on the display 19, because this is the same picture as on the display 15. It is noted that the orientation of the tablet computer 13.1 can be expressed in three angle coordinates. An example of this is the following. Suppose that the tablet computer 13.1 has a length axis L; then the orientation of the tablet computer 13.1 can be defined by a first angle a1 in the horizontal plane H that represents a direction of a perpendicular projection P of the length axis on the horizontal plane (for example, relative to the north N), a second angle a2 between the length axis and a vertical V, and a third rotation angle a3 of, for example, a (fictitious) flat plane Q of the tablet computer 13.1 around the length axis of the tablet computer 13.1 relative to a fictitious predetermined zero position (this zero position effectively being a predetermined position of the plane Q around the axis L) (see also Figure 4, in which also the west W is indicated).
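The geometry described above — determining, from the terminal position, its three orientation angles a1, a2, a3, and the tag position in the real world, where in the picture the highlight and the further information are to be drawn — can be sketched as follows. This is an illustrative sketch only, not the patented implementation: the rotation order, the pinhole camera model and the intrinsic parameters (f, cx, cy) are all assumptions introduced for the example.

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def project_tag(tag_pos, term_pos, a1, a2, a3, f=800.0, cx=512.0, cy=384.0):
    """Where in the picture on the first display the tag lies, given the
    terminal position and its orientation angles a1 (azimuth), a2 (tilt)
    and a3 (roll).  Returns pixel coordinates, or None when the tag is
    behind the camera.  Rotation order and camera intrinsics (f, cx, cy)
    are illustrative assumptions, not taken from the patent."""
    R = rot_z(a1) @ rot_y(a2) @ rot_x(a3)          # terminal orientation
    # Express the tag position in the camera frame of the terminal.
    p = R.T @ (np.asarray(tag_pos, float) - np.asarray(term_pos, float))
    if p[2] <= 0:                                  # tag behind the camera
        return None
    # Pinhole projection onto the display; the highlight (e.g. a circle)
    # and the further information are then drawn around this point.
    return (float(cx + f * p[0] / p[2]), float(cy + f * p[1] / p[2]))

# Un-rotated terminal, tag 5 m straight ahead: highlight at image centre.
print(project_tag((0, 0, 5), (0, 0, 0), 0, 0, 0))  # → (512.0, 384.0)
```

When either the tag moves (detected by the first position determining system) or the terminal moves or rotates (detected by the second), re-evaluating this projection yields the new screen position, which is why the highlight stays locked to the animal.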
What has been said above for the highlighted part of the animal and the display of the further information also applies to the above-mentioned shading applied at the animal. To put it differently, the position of the shading relative to the animal does not change in the picture on the display 15 and on the display 19 if the position and/or orientation of the tablet computer 13.1 changes and/or if the position of the tag changes in the real world. If the animal moves, the shading moves along with it.
That, for example, the position of the further information in the picture on the display does not change may be realized as follows.
In particular, it holds that the first position determining system 7, 11 determines coordinates of the position of the tag in a first coordinate system and that the second position determining system 12 determines the coordinates of the position of the terminal 13 in a second coordinate system, wherein the second position determining system 12 is included in the terminal and wherein the terminal, in use, converts the coordinates of the position of the tag in the first coordinate system to coordinates of the position of the tag in the second coordinate system for carrying out step g. Both coordinate systems can utilize, for example, Cartesian coordinates. Other coordinates, however, such as spherical coordinates or polar coordinates, are also possible. In particular, it holds here that conversion is carried out with a conversion algorithm which is executed by a processor of the terminal. For the purpose of conversion, for example, a calibration is carried out, wherein:
a. in the first coordinate system at least two positions are defined;
b. the coordinates of the two positions expressed in the first coordinate system are sent to the terminal;
c. the terminal is brought to the first position, while at the first position, coordinates of the first position are determined in the second coordinate system with the terminal;
d. the terminal is brought to the second position, while at the second position, coordinates of the second position are determined in the second coordinate system with the terminal;
e. with the conversion algorithm, on the basis of the first position expressed in the first coordinate system, the first position expressed in the second coordinate system, the second position expressed in the first coordinate system, and the second position expressed in the second coordinate system, parameters are determined to enable conversion of the coordinates of the position of the tag in the first coordinate system to the coordinates of the position of the tag in the second coordinate system for carrying out step g.
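The calibration of steps a.-e. can be illustrated with a small sketch. The patent does not specify the form of the conversion algorithm; assuming the first coordinate system is two-dimensional and the two systems differ by a rotation, a uniform scale and a translation (a 2D similarity transform), its four parameters can be determined from exactly the two calibration positions, here written with complex numbers. All function names are illustrative.

```python
def calibrate(p1_sys1, p1_sys2, p2_sys1, p2_sys2):
    """Steps a.-e.: from two calibration positions, each known in both
    coordinate systems, determine the parameters (a, b) of the assumed
    2D similarity transform z' = a*z + b (rotation + uniform scale +
    translation, expressed in complex arithmetic)."""
    z1, z2 = complex(*p1_sys1), complex(*p2_sys1)   # first coordinate system
    w1, w2 = complex(*p1_sys2), complex(*p2_sys2)   # second coordinate system
    a = (w2 - w1) / (z2 - z1)   # rotation and scale in one complex factor
    b = w1 - a * z1             # translation
    return a, b

def convert(tag_sys1, a, b):
    """Step g.: map a tag position from the first into the second system."""
    w = a * complex(*tag_sys1) + b
    return (w.real, w.imag)

# Example: the second system equals the first one shifted by (10, 5).
a, b = calibrate((0, 0), (10, 5), (1, 0), (11, 5))
print(convert((3, 4), a, b))  # → (13.0, 9.0)
```

With a rotated second system the same two correspondences suffice: mapping (0, 0)→(0, 0) and (1, 0)→(0, 1) yields a = 1j, i.e. a pure 90° rotation.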
In particular, it further holds that the terminal 13 comprises an augmented reality device 13.2 (see Figure 3) such as a pair of reality glasses. An example of a pair of reality glasses is Microsoft's HoloLens. An augmented reality device aims partly to bridge the gap between the digital and the analog world, inter alia by enabling digital 3D models to be displayed in the real space as it is perceived by a user wearing the device. To this end, the illusion is created that the digital models effectively are at a particular distance from the user. This functionality is reminiscent of what are popularly called holograms. Reality glasses such as the
HoloLens look like good-sized sunglasses and are in effect a head-mounted display into which a computer has been incorporated. Such a device, like the Kinect, features a camera system which perceives depth in addition to length and width. This camera has a field of view of, for example, 120° by 120°. In addition, the device can respond to gestures, spoken commands and/or eye movements.
For the purpose, among others, of processing all this input, and for the calculations that are needed for creating the holographic illusion, the computer of the device 13.2 is typically equipped with three types of processors: in addition to the traditional CPU and GPU, there is also a so-called 'Holographic Processing Unit' (HPU).
In this example, the device 13.2 may be communicatively connected with the tablet 13.1, the display 15 of the tablet being functionally replaced with the device 13.2. Within the context of this application, the first display may hence also be formed by an augmented reality device which generates a 3D picture to be perceived by the user 14. The first display 15 is hence a broad term. Also the second display 19 may be replaced with an augmented reality device which generates the same picture as the device 13.2. The term display 19 hence also comprises an augmented reality device.
In this example, the second position determining system 12 is also functionally included in the device 13.2 instead of in the tablet computer. The position determining system of the tablet computer is not used then.
The device 13.2 is worn by a user 14, such as a farmer 14, in the barn 9. The device 13.2 functionally comprises the camera 16. The camera of the tablet computer is not used then. The second position determining system 12 has an orientation sensor and a GPS unit. With the second position determining system, the device 13.2 determines the location and orientation of the device. The location can be expressed in Cartesian coordinates and the orientation in three angles. An example of the three angle coordinates is the following (the same as described above for the tablet 13.1). Suppose that the device has a fictitious length axis L; then the orientation of the device can be defined by a first angle a1 in the horizontal plane H which represents a direction of a perpendicular projection P of the axis L on the horizontal plane (for example, relative to the north N), a second angle a2 between the axis L and a vertical V, and a third rotation angle a3 of, for example, a (fictitious) flat plane Q of the device around the axis L of the device relative to a predetermined zero position of the flat plane Q (see Figure 4, in which also the west W is indicated). The tablet 13.1 then communicates with the computer system 11 as discussed before and sends information to be displayed, such as the earlier-mentioned highlighting of the animal, to the device 13.2. The user 14 can then point out the highlighted animal with a gesture so that the earlier-discussed further information is displayed in the real world which the user perceives with the device. This further
information may be displayed in 3D, for example, above the selected animal. A hologram is then displayed above the animal. If the animal starts to move, this is noticed by the first position determining system. The result is that the position of the animal expressed in the second coordinate system will change accordingly and that the hologram moves along with the animal. Also when the user 14 moves the device 13.2 so that the animal moves in the user's view, the hologram will move along.
Making the information invisible again can likewise be initiated by the user 14 with gestures. If a veterinarian shades a part of an animal on the display 19, such shading is also shown to the user by displaying it in the picture of the real world he perceives. The position of the shading too is locked to the animal in the picture generated by the device 13.2 and will hence move along with the animal if the position of the animal in the picture changes because the position of the animal in the real world changes and/or because the position and/or orientation of the device 13.2 changes.
With the device 13.2, therefore, the received information about the animal is virtually displayed in the real environment such as it is perceived by the user 14. In this case, it holds in particular that the first coordinate system is a two-dimensional system which is in a horizontal plane and that the second coordinate system is a three-dimensional system, two coordinates of which set up a horizontal plane, while the coordinates which indicate the position of the tag in the second coordinate system correspond to the coordinates that set up the horizontal plane in the second coordinate system. Here, it holds further, in particular, that the second algorithm is implemented as an app, which, in use, is executed by the terminal, in this example in the tablet computer 13.1.
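The coordinate correspondence just described — a two-dimensional first system lying in a horizontal plane, embedded in a three-dimensional second system whose first two coordinates set up that same horizontal plane — amounts to carrying the tag's two horizontal coordinates over unchanged and appending a vertical coordinate. A minimal sketch; the tag height is an assumed extra parameter, not taken from the description:

```python
def tag_to_3d(tag_xy, height=0.0):
    """Embed a tag position from the two-dimensional first coordinate
    system (a horizontal plane) into the three-dimensional second
    system: the two horizontal coordinates carry over unchanged; the
    vertical coordinate (a tag height) is an assumed extra parameter."""
    x, y = tag_xy
    return (x, y, height)

print(tag_to_3d((2.5, 7.0), height=1.2))  # → (2.5, 7.0, 1.2)
```

The resulting 3D point is then the anchor at which an augmented reality device such as the device 13.2 would place the hologram above the animal.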
For the sake of clarity and conciseness of the description, features have been described here as part of the same or of separate embodiments. It will be clear to those skilled in the art that embodiments comprising combinations of any or all of the described features also fall within the scope of protection of the invention. It is possible, for example, that the terminal only comprises the augmented reality device 13.2 which is also provided with the above-mentioned software (app) to perform the necessary
calculations which in the foregoing example were performed by the tablet computer 13.1. The device 13.2 is then in direct communicative connection with the computer system 11.
Within the purview of those skilled in the art, modifications are possible which are considered to be within the scope of protection. Also, all kinematic inversions are understood to be within the scope of protection of the present invention. Expressions such as "consisting of", when used in this description or the appended claims, should be construed not as an exhaustive enumeration, but rather in an inclusive sense of "at least consisting of". Indications such as "a" or "one" must not be construed as a limitation to "only one", but have the meaning of "at least one" and do not exclude plurality. Expressions such as "means for ..." should be read as "component configured for ..." or "member constructed to ..." and should be construed to encompass all equivalents of the structures disclosed. The use of expressions such as "critical", "advantageous", "preferred", "desired", etc. is not intended to limit the invention. Furthermore, features that are not specifically or explicitly described or claimed in the construction according to the invention, but are within the purview of the skilled person, may also be encompassed without departing from the scope of the invention, as determined by the claims.

Claims

1. A method for providing information of an animal, wherein the animal is provided with a tag the position of which can be determined and an identification code of which can be read out, the method comprising the following steps:
a. determining the position of the tag with the aid of a first position determining system;
b. reading out the identification code of the tag with the aid of an information readout system;
c. determining the position and orientation of a portable terminal such as a tablet and a mobile phone with the aid of a second position determining system;
d. with the aid of a camera of the terminal, displaying a recording of an environment of the terminal on a first display of the terminal;
e. on the basis of the information obtained in steps a., b. and c., retrieving information from a database of a computer system about at least an animal that is visible in the recording;
f. wirelessly sending the information from step e. to the terminal;
g. the terminal adding received information about the animal to the picture on the first display.
2. The method according to claim 1, characterized in that the
information which is displayed about the animal in step g. concerns the indicating of the animal as a sign that the position and identity of the animal has been established in steps a. and b.
3. The method according to claim 2, characterized in that the indication is executed in a manner other than with characters or signs.
4. The method according to claim 2 or 3, characterized in that the information which is displayed about the animal in step g. on the first display concerns the indicating of the animal by highlighting the animal as an indication that the position and identity of the animal has been
established in steps a. and b.
5. The method according to claim 2, 3 or 4, characterized in that in a step h., by clicking on the animal that is indicated in step g., additional information is displayed in the first display with the aid of text.
6. The method according to any one of the preceding claims,
characterized in that, with the aid of the terminal, information about the animal that is displayed is supplied to a computer system.
7. The method according to claim 6, characterized in that the
information which is supplied to the computer system is entered by a user of the terminal at the terminal.
8. The method according to claim 6 or 7, characterized in that the information which is supplied to the computer system is supplied by a sensor to the terminal or is measured by a sensor of the terminal.
9. The method according to claim 8, characterized in that with the aid of the sensor, a heartbeat, blood pressure and/or temperature of the animal is determined.
10. The method according to any one of the preceding claims,
characterized in that with the aid of the terminal, the picture of the animal obtained with the camera is supplied to a computer system.
11. The method according to claim 10, characterized in that the computer system comprises a second display which can be viewed by a third party such as a veterinarian.
12. The method according to claim 11, characterized in that by touching and/or with the aid of the second display, by the third party a marking can be placed on the picture on the first display, for example a circle or full line around a particular part of the animal.
13. The method according to any one of the preceding claims,
characterized in that the picture on the first display, preferably in
combination with the information from step e., is displayed also on a second and/or a third display of a third party, while in particular the second display is a fixedly set-up display and/or the third display is a mobile display.
14. The method according to claim 11 or 12 and according to claim 13, characterized in that the second and third display are the same display.
15. The method according to any one of the preceding claims,
characterized in that in a step i. the identity of an animal to be found is entered in the computer system and that step f. is carried out if the entered identity of the animal corresponds to the identification code of the tag read out in step b.
16. The method according to claim 15, characterized in that the entering of the identity of the animal in step i. is carried out by a user with the aid of the terminal.
17. The method according to any one of the preceding claims,
characterized in that displaying in step d. concerns displaying a video stream.
18. The method according to any one of the preceding claims,
characterized in that the first position determining system determines coordinates of the position of the tag in a first coordinate system and that the second position determining system determines the coordinates of the position of the terminal in a second coordinate system, wherein the second position determining system is included in the terminal and wherein the terminal, in use, converts the coordinates of the position of the tag in the first coordinate system to coordinates of the position of the tag in the second coordinate system for carrying out step g.
19. The method according to claim 18, characterized in that converting is carried out with a conversion algorithm which is executed by a processor of the terminal.
20. The method according to claim 18 or 19, characterized in that beforehand and for the purpose of converting, a calibration is carried out, wherein:
a. in the first coordinate system at least two positions are defined; b. the coordinates of the two positions expressed in the first coordinate system are sent to the terminal;
c. the terminal is brought to the first position, while at the first position, coordinates of the first position are determined in the second coordinate system with the terminal;
d. the terminal is brought to the second position, while at the second position, coordinates of the second position are determined in the second coordinate system with the terminal; e. with the conversion algorithm, on the basis of the first position expressed in the first coordinate system, the first position expressed in the second coordinate system, the second position expressed in the first coordinate system, the second position expressed in the second coordinate system, parameters are determined for enabling conversion of the
coordinates of the position of the tag in the first coordinate system to coordinates of the position of the tag in the second coordinate system for carrying out step g.
21. The method according to any one of claims 18-20, characterized in that the first coordinate system is a two-dimensional system which is in a horizontal plane and that the second coordinate system is a three- dimensional system of which two coordinates set up a horizontal plane, while the coordinates which indicate the position of the tag in the second coordinate system correspond to the coordinates which set up the horizontal plane in the second coordinate system.
22. The method according to at least claim 19, characterized in that the second algorithm is implemented as an app which, in use, is executed by the terminal.
23. The method according to any one of the preceding claims,
characterized in that in step g. the information about the animal is displayed at a predetermined position of the animal in the picture so that in the picture on the first display the position of the information relative to the animal does not change if the position of the tag in the real world changes and/or if the position and/or orientation of the terminal in the real world changes.
24. The method according to any one of the preceding claims,
characterized in that for carrying out step g. the second position
determining system includes an orientation sensor, while on the basis of the position and orientation of the terminal as determined by the second position determining system in combination with the position of the tag in the real world, it is determined where in the picture on the first display of the terminal the information is positioned in the picture on the first display relative to the position of the tag.
25. The method according to claim 24, characterized in that the terminal includes an augmented reality device such as a pair of reality glasses, the augmented reality device comprising the first display and the second position determining system.
26. The method according to claim 25, characterized in that information about the animal is displayed in 3D in the picture.
27. A system for providing information of an animal, the system
including:
a position-determinable tag provided with a readable identification code, an animal being provided with the tag;
a first position determining system, the first position determining system being configured to determine the position of the tag;
an information readout system for reading out the identification code of the tag;
a second position determining system for determining the position and orientation of a portable terminal;
the portable terminal provided with a camera and a first display, the portable terminal being configured to display on the first display a recording, made with the camera, of an environment of the terminal; and a computer system including a database for keeping information about the animal, wherein the system is configured to retrieve the
information about the animal that is visible in the recording from the database on the basis of the determined position of the tag and the identification code read out and for wirelessly sending the information to the terminal, and wherein the terminal is configured to add the received information about the animal to the picture on the first display.
28. The system according to claim 27, characterized in that the second position determining system comprises a GPS and orientation sensor which are each arranged at the portable terminal.
29. The system according to any one of claims 27-28, characterized in that the computer system comprises a second display which can be viewed by a third party such as a veterinarian, wherein the system is configured to show on the second display at least the recording of the first display, and wherein preferably the system is configured to add information about the animal to the picture on the second display.
30. The system according to claim 29, characterized in that the second display is part of a further terminal.
31. The system according to any one of claims 27-30, characterized in that the system is configured to communicate markings made on the second display by a user of the second display to the first portable terminal and wherein the first terminal is configured to represent the markings on the first display, the marking comprising, for example, a circle or full line around a particular part of the displayed animal.
32. The system according to any one of the preceding claims 27-31, characterized in that the system is configured such that, in use, the information about the animal is displayed at a predetermined position of the animal in the picture so that in the picture the position of the information relative to the animal does not change if the position of the tag in the real world changes and/or if the position and/or orientation of the terminal in the real world changes.
33. The system according to any one of the preceding claims 27-32, characterized in that the terminal comprises an augmented reality device which generates the picture for the user, while in the picture the
information about the animal is displayed in 3D, in particular as a hologram.
PCT/NL2018/050006 2017-01-04 2018-01-04 Method and system for providing information of an animal WO2018128542A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
NL2018122 2017-01-04
NL2018122A NL2018122B1 (en) 2017-01-04 2017-01-04 Method and system for providing information from an animal.

Publications (1)

Publication Number Publication Date
WO2018128542A1 true WO2018128542A1 (en) 2018-07-12

Family

ID=57906958

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/NL2018/050006 WO2018128542A1 (en) 2017-01-04 2018-01-04 Method and system for providing information of an animal

Country Status (2)

Country Link
NL (1) NL2018122B1 (en)
WO (1) WO2018128542A1 (en)


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012079107A2 (en) * 2010-12-15 2012-06-21 Mkw Electronics Gmbh Method for displaying a piece of information associated with an animal
NL2012882B1 (en) * 2014-05-23 2016-03-15 N V Nederlandsche Apparatenfabriek Nedap Farm system equipped with a portable unit.

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014030156A1 (en) * 2012-08-18 2014-02-27 Scr Engineers Ltd Cow retrieval system
WO2014197631A1 (en) * 2013-06-04 2014-12-11 Clicrweight, LLC Methods and systems for marking animals
WO2015034377A1 (en) * 2013-09-04 2015-03-12 Livestock Improvement Corporation Limited Farm management system and method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
JOHANNES KARLSSON ET AL: "Augmented reality to enhance visitors experience in a digital zoo", MUM ;10 0 PROCEEDINGS OF THE 9TH INTERNATIONAL CONFERENCE ON MOBILE AND UBIQUITOUS MULTIMEDIA, ARTICLE NO. 20, ACM, LIMASSOL, CYPRUS, 1 December 2010 (2010-12-01), pages 1 - 4, XP058173292, ISBN: 978-1-4503-0424-5, DOI: 10.1145/1899475.1899482 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CZ309188B6 (en) * 2018-07-20 2022-04-27 Výzkumný Ústav Živočišné Výroby V.V.I. Locator of housed animals
KR20200025809A (en) * 2018-08-31 2020-03-10 노명진 Access control system of dog park using augmented reality and controling dog park access using the same
KR102142500B1 (en) * 2018-08-31 2020-08-11 주식회사 마이크젠바이옴 Access control system of dog park using augmented reality and controling dog park access using the same
NL2026683B1 (en) * 2020-10-15 2022-06-08 Nedap Nv Method of retrieving animal data from an animal management system, and computer program product therefore.

Also Published As

Publication number Publication date
NL2018122B1 (en) 2018-07-25

Similar Documents

Publication Publication Date Title
EP3165939B1 (en) Dynamically created and updated indoor positioning map
US10488659B2 (en) Apparatus, systems and methods for providing motion tracking using a personal viewing device
RU2740259C2 (en) Ultrasonic imaging sensor positioning
US20200380178A1 (en) Tracking safety conditions of an area
US20170193679A1 (en) Information processing apparatus and information processing method
JP7439844B2 (en) Terminal device, information processing device, information output method, information processing method, customer service support method and program
EP3716220B1 (en) Information processing device, information processing method, and information processing program
US20180315246A1 (en) Information processing device, information processing method, and program
RU2731306C2 (en) Baby tracking device
KR20110071210A (en) Apparatus and method for mixed reality content operation based on indoor and outdoor context awareness
WO2016098457A1 (en) Information processing device, information processing method, and program
JPWO2014016986A1 (en) 3D environment sharing system and 3D environment sharing method
WO2018128542A1 (en) Method and system for providing information of an animal
CN109117684A (en) System and method for the selective scanning in binocular augmented reality equipment
EP4235382A3 (en) System and method for placement of augmented reality information for users based on their activity
US10380468B2 (en) Proactive transmission of measured values to mobile devices
US11139070B2 (en) Medical information virtual reality server system, medical information virtual reality program, medical information virtual reality system, method of creating medical information virtual reality data, and medical information virtual reality data
JP2018068478A (en) Biological monitoring system, portable electronic apparatus, biological monitoring program, computer-readable recording medium, and biological monitoring method
KR20140046652A (en) Learning monitering device and method for monitering of learning
US20240047046A1 (en) Virtual augmentation of clinical care environments
KR20230084446A (en) System and Method for improving measurement accuracy of the momentum in a health care system
Shukla et al. Enhancing user navigation experience, object identification and surface depth detection for" low vision" with proposed electronic cane
EP3407168A1 (en) Determining full-body pose for a virtual reality environment
US20180082119A1 (en) System and method for remotely assisted user-orientation
JP2013016110A (en) Information providing system and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18700254

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18700254

Country of ref document: EP

Kind code of ref document: A1