WO2011154949A2 - Optical tracking system and method for herd management therewith - Google Patents


Info

Publication number
WO2011154949A2
Authority
WO
WIPO (PCT)
Prior art keywords
optical
tracking system
herd
optical signal
optical tracking
Application number
PCT/IL2011/000454
Other languages
French (fr)
Other versions
WO2011154949A3 (en)
Inventor
Tomer Fruchtman
Yotam Raz
Original Assignee
Audhumbla Ltd.
Application filed by Audhumbla Ltd. filed Critical Audhumbla Ltd.
Publication of WO2011154949A2 publication Critical patent/WO2011154949A2/en
Publication of WO2011154949A3 publication Critical patent/WO2011154949A3/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01V GEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
    • G01V8/00 Prospecting or detecting by optical means
    • G01V8/10 Detecting, e.g. by using light barriers
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01K ANIMAL HUSBANDRY; CARE OF BIRDS, FISHES, INSECTS; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K29/00 Other apparatus for animal husbandry
    • A01K29/005 Monitoring or measuring activity, e.g. detecting heat or mating
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/74 Systems using reradiation of electromagnetic waves other than radio waves, e.g. IFF, i.e. identification of friend or foe
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/16 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/254 Analysis of motion involving subtraction of images

Definitions

  • the present invention, in some embodiments thereof, relates to a system and method for optically tracking moving objects within a defined space and, more particularly but not exclusively, to an optical tracking system and method for herd management.
  • RTLS (real-time locating systems) typically use badges or tags attached to or embedded in the objects, and devices (readers) that receive wireless signals from these tags to determine their locations.
  • One known RTLS is based on barcodes attached to objects that can be read with a barcode reader or scanner. Barcode scanning typically provides close-range tracking at specific locations, e.g. at a checkpoint through which an object passes and where a reader is installed.
  • RFID (radio frequency identification)
  • RF (radio frequency)
  • Although RFID interrogation can potentially be performed over large ranges, electromagnetic interference inherently present in the surrounding environment and clutter in the frequency bands typically used for transmission limit the tracking range that can be practically implemented.
  • Although RF interrogation can potentially be performed over large ranges, electromagnetic interference inherently present in the surrounding environment and clutter in the frequency bands typically used for transmission limit the tracking range that can be practically implemented.
  • Known RTLS based on RF interrogation are also typically used at specific checkpoints where a reader is installed.
  • An image sensor having a plurality of pixels generates both video data responsive to light incident on a pixel from a respective portion of an image formed on the image sensor and a communication data signal responsive to an optical data signal incident on the pixel and emitted from an optical tag present in a corresponding part of the image.
  • the optical tag includes an optical modulator such as a LED or a modulated reflector.
  • the communication data signal is generated and sampled at a much higher frequency than the video data.
  • a digital video camera or an analog video camera can be used for lower bandwidth communication of a communication data signal.
  • an additional software component is used to separate video frame data into a video stream and a data stream. The data stream is determined by comparing the intensity value from each pixel to a threshold value to determine whether an optical bit is present during a video frame.
  • Imager, the contents of which are incorporated by reference in its entirety, describes a method and an optical communication processor for receiving communications data in an image formed on a plurality of pixels.
  • the method includes receiving active pixel information, identifying which of the pixels are receiving communications data and retrieving communication data from each of the pixels identified by the active pixel information.
  • An active source present in the field of view of the detector generates an intensity that is greater than a known intensity resulting from the background image.
  • the data threshold module samples the electrical signal generated by the detector according to a frequency of a data clock, e.g. at around 1 GHz or more. If the sampled electrical signal exceeds a predetermined threshold value, a logic "1" bit is generated by the data threshold module and stored in the pixel buffer; otherwise a logic "0" bit is stored.
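This prior-art per-pixel thresholding can be sketched as follows (a minimal illustration; the function and threshold names are ours, not from the referenced patent):

```python
THRESHOLD = 0.5  # assumed normalized intensity threshold (illustrative)

def sample_pixel(samples, threshold=THRESHOLD):
    """Convert detector samples (one per data-clock tick) into bits:
    logic '1' when a sample exceeds the threshold, else logic '0'."""
    return [1 if s > threshold else 0 for s in samples]

print(sample_pixel([0.1, 0.7, 0.9, 0.2]))  # [0, 1, 1, 0]
```

As the detailed description notes, such absolute thresholding is fragile under varying outdoor light, which motivates the trend-based decoding of the present invention.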
  • Provided is an asynchronous optical tracking system and method for simultaneously tracking and identifying a plurality of objects within a defined area subjected to variable lighting conditions. According to some embodiments of the present invention, one or more optical tags positioned on the objects communicate modulated optical signals to one or more image sensors in outdoor lighting conditions.
  • the optical tracking system and method additionally provide for monitoring behavior aspects of individuals in the herd as compared to determined behavior aspects of the herd and/or expected behavior of the individual.
  • any individuals straying from the determined behavior aspects of the herd are identified and reported.
  • An aspect of some embodiments of the present invention provides an optical tracking system comprising: at least one optical emitter adapted to emit an optical signal including a modulated pattern of pulsed light; at least one camera with a field of view directed toward a pre-defined area operable to capture radiation from the optical signal over a stream of captured image frames; and a processing unit adapted to receive input from the at least one camera and to identify an area on the image frames where input was received from the optical signal based on differences in intensities between contiguous frames in the stream that receive input from the optical signal.
  • the at least one optical emitter and the at least one camera are asynchronous.
  • the at least one optical emitter is adapted to emit light in a near infrared wavelength.
  • the near infrared wavelength is between 800-1000 nanometers.
  • the at least one optical emitter includes a filter adapted to transmit light over a selected bandwidth.
  • the modulated pattern of pulsed light provides an information code.
  • the information code includes at least one bit of information, the bit defined by a trend in intensity over a pre-defined number of image frames, wherein the pre-defined number is greater than 2 image frames.
  • the bit is defined based on a trend in intensity of pixels neighboring pixels with saturated output.
  • each trend is determined over at least four contiguous frames.
  • the information code is a digital code.
  • the trend in intensity is selected from a group including: an increasing trend in intensity and a decreasing trend in intensity.
  • the trend is defined by a slope of increasing or decreasing intensity.
  • the processing unit is adapted to generate a new stream of frames constructed by subtracting pixel values from pairs of contiguous frames, wherein pixel values of image frames in the new stream include both positive and negative pixel values.
  • the modulated pattern of pulsed light provides an information code, wherein the information code includes at least one bit of information, the bit defined by pixel values that have a same sign (positive or negative) over a pre-defined number of image frames.
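The signed frame-subtraction and same-sign bit definition in the claims above can be sketched as follows (an illustrative Python fragment; names and sample values are ours, not the patent's):

```python
import numpy as np

def difference_stream(frames):
    """Build a new stream by subtracting corresponding pixel values of
    contiguous frames, using a signed dtype so negative values survive."""
    frames = np.asarray(frames, dtype=np.int32)
    return frames[1:] - frames[:-1]

def same_sign_bit(diffs):
    """One bit per window of differences: 1 if all are positive
    (rising intensity), 0 if all are negative (falling), else None."""
    if np.all(diffs > 0):
        return 1
    if np.all(diffs < 0):
        return 0
    return None

# Captured intensity of one pixel over four contiguous frames:
print(same_sign_bit(difference_stream([10, 40, 90, 160])))  # 1
```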
  • the information code is an identification code.
  • the information code is information received from a sensor in communication with the at least one optical emitter.
  • the optical tracking system provides a bit rate of 15-30 bits per second.
  • the pattern of pulses is modulated by at least one of pulse width modulation and frequency modulation.
  • a frequency of the pulsing is at least four times faster than a frame capture rate of the camera.
  • the exposure period for capturing frames in the stream is between 10 μsec and 10 msec.
  • the pre-defined area spans 100-2500 square meters.
  • the optical tracking system is adapted to track herd animals housed in the pre-defined area.
  • the herd animal is a cow.
  • An aspect of some embodiments of the present invention provides a method for optical tracking, the method comprising: emitting an optical signal including a modulated pattern of pulsed light from an optical tag; capturing radiation from the optical signal over a stream of captured image frames; and identifying an area on the image frames where input was received from the optical signal based on differences in intensities between contiguous frames in the stream that receive input from the optical signal.
  • the emitting and the capturing are asynchronous.
  • the optical signal is emitted in the near infrared range.
  • the modulated pattern of pulsed light provides an information code, wherein the information code includes at least one bit of information, the bit defined by a trend in intensity over a pre-defined number of image frames, wherein the pre-defined number is greater than 2 image frames.
  • the optical signal is adapted to be emitted and received in an area exposed to outdoor lighting and temperature conditions.
  • An aspect of some embodiments of the present invention provides an optical tracking system comprising: at least one optical tag adapted to transmit an optical signal in a range between 800-1000 nanometers; at least one camera adapted to capture radiation from the optical signal at a distance between 3-50 meters from the optical tag over a stream of image frames, wherein at least a portion of the frames are captured over an exposure period ranging between 1-10 msec; and a processing unit adapted to receive input from the at least one camera and identify an area on the image frames where input was received from the optical signal and to determine a location of the optical tag in the pre-defined area.
  • the at least one optical tag transmits an optical signal in a selected sub-range of the range, wherein a bandwidth of the sub-range is 5-20 nanometers wide.
  • the at least one camera includes a plurality of filters, each filter adapted to filter radiation received from the optical signal over a different sub-range of the range.
  • the plurality of filters is applied sequentially over different image frames.
  • the plurality of filters is applied to different pixels in the same image frame.
  • the processing unit is adapted to determine the sub-range of the input and to identify the optical tag based on the sub-range.
  • the optical tag includes a reflector and wherein the transmitted optical signal is an optical signal reflected from a light source.
  • the reflector is a retro-reflector.
  • the reflector includes a filter adapted to filter light received from the light source over a sub-range of the range.
  • the filter is a passive element.
  • the camera is adapted to periodically capture an image frame using an exposure period longer than 50 msec.
  • the at least one optical tag includes at least one optical emitter adapted to emit pulsed light.
  • the processing unit is adapted to identify an area on the image frames where input was received from the optical signal based on differences in intensities between contiguous frames in the stream.
  • the at least one optical tag and the at least one camera are asynchronous.
  • modulation of pulsing of the pulsed light provides an information code.
  • the information code includes a plurality of bits of information, wherein each bit is defined by a trend in intensity defined over a pre-defined number of frames.
  • An aspect of some embodiments of the present invention provides a method for optical tracking, the method comprising: transmitting an optical signal in a range between 800-1000 nanometers with an optical tag; capturing radiation from the optical signal at a distance between 3-50 meters from the optical tag over a stream of image frames over an exposure period ranging between 1-10 msec; and identifying coordinates on the image frames where input was received from the optical signal; and determining a location of the optical tag based on the identified coordinates.
  • the optical signal is emitted in a selected sub-range of the range, wherein a bandwidth of the sub-range is 5-20 nanometers wide.
  • the optical tracking system is adapted for tracking herd animals housed in an enclosure exposed to outdoor lighting and temperature conditions.
  • An aspect of some embodiments of the present invention provides a herd management system for managing a herd housed in an enclosure comprising: a real time locating system operable to track position and changes in position of a plurality of herd animals housed in an enclosure; a group analysis unit adapted to identify one or more groups of the herd animals within the enclosure based on input received from the real time locating system, wherein the one or more groups is defined based on a proximity between the herd animals tracked by the real time locating system, and to track positions and movement of the one or more groups, wherein the group analysis unit is adapted to determine a behavioral pattern of the herd animals based on input received from the real time locating system and the position and movement of the one or more groups; and an output device adapted to report a behavioral pattern of the herd animals.
  • the real time locating system is adapted to determine identity of each of the herd animals that are tracked.
  • the group analysis unit is adapted to track a direction of movement of the one or more groups and to identify a herd animal leading the one or more groups in the direction of movement.
  • the group analysis unit is adapted to identify straying of a herd animal from a group.
  • the group analysis unit is adapted to identify a socially rejected cow.
  • the group analysis unit is adapted to identify gathering of the herd animals in specific locations in the enclosure.
  • the group analysis unit is adapted to identify position of individual cows in the herd.
  • the group analysis unit is adapted to learn a typical behavioral pattern of the one or more groups and to identify an atypical behavioral pattern.
  • the real time locating system comprises:
  • at least one optical tag adapted to transmit an optical signal in a range between 850-950 nanometers; at least one camera adapted to capture radiation from the optical signal at a distance between 3-50 meters from the optical tag over a stream of image frames, wherein at least a portion of the frames are captured over an exposure period ranging between 1-10 msec; and a processing unit adapted to receive input from the at least one camera and identify an area on the image frames where input was received from the optical signal and to determine a location of the optical tag in the pre-defined area.
  • the real time locating system comprises: at least one optical emitter adapted to emit an optical signal including a modulated pattern of pulsed light; at least one camera with a field of view directed toward a pre-defined area operable to capture radiation from the optical signal over a stream of captured image frames; and a processing unit adapted to receive input from the at least one camera and to identify an area on the image frames where input was received from the optical signal based on differences in intensities between contiguous frames in the stream that receive input from the optical signal.
  • FIG. 1 is an exemplary schematic illustration of an optical tracking system tracking herd animals confined within a defined area in accordance with some embodiments of the present invention.
  • FIG. 2 is an exemplary block diagram of an optical tracking system in accordance with some embodiments of the present invention.
  • FIGs. 3A, 3B and 3C are exemplary block diagrams of tags including an optical emitter in accordance with some embodiments of the present invention.
  • FIGs. 4A and 4B are simplified waveform diagrams of light pulses emitted by two exemplary optical tags over three exposure periods of a camera in accordance with some embodiments of the present invention.
  • FIGs. 4C and 4D are simplified graphs of actual and captured intensity levels as provided by an optical tag over three exposure periods in accordance with some embodiments of the present invention.
  • FIGs. 5A, 5B and 5C are exemplary simplified encoded patterns of outputs obtained from image streams in accordance with some embodiments of the present invention.
  • FIGs. 6A and 6B are exemplary simplified encoded patterns of outputs obtained from an original image stream and from an image stream constructed by subtracting pixel values from contiguous image frames in accordance with some embodiments of the present invention.
  • FIG. 7 is a simplified flow chart of an exemplary method for tracking an optical tag and decoding information received from an optical tag in accordance with some embodiments of the present invention.
  • FIG. 8 is a simplified schematic illustration of identified data sites in a portion of an image frame in accordance with some embodiments of the present invention.
  • FIG. 9 is a simplified flow chart of an exemplary method for decoding information received from an optical tag by analyzing output from pixels surrounding saturated pixels in accordance with some embodiments of the present invention.
  • FIG. 10 is an exemplary schematic illustration of an alternate optical tracking system tracking herd animals confined within a defined area in accordance with some embodiments of the present invention.
  • FIGs. 11A, 11B and 11C are exemplary schematic diagrams of tags including a reflector in accordance with some embodiments of the present invention.
  • FIG. 12 is a simplified flow chart of an exemplary method for tracking and identifying an optical tag reflecting received light in accordance with some embodiments of the present invention.
  • FIGs. 13A and 13B are exemplary schematic illustrations of distribution patterns of a herd identified by a herd management system in accordance with some embodiments of the present invention.
  • FIG. 14 is an exemplary schematic illustration of atypical behavior of a herd animal identified by a herd management system in accordance with some embodiments of the present invention.
  • FIG. 15 is an exemplary schematic illustration of a behavior pattern of specific animals of a herd animal identified by a herd management system in accordance with some embodiments of the present invention.
  • FIG. 16 is an exemplary schematic illustration of a behavior pattern of herd animals at a specific location identified by a herd management system in accordance with some embodiments of the present invention.
  • FIG. 17 is a simplified flow chart of an exemplary method for identifying atypical behavior of animals in a herd in accordance with some embodiments of the present invention.
  • FIG. 18 is a simplified block diagram of an exemplary herd management system in accordance with some embodiments of the present invention.
  • the present invention, in some embodiments thereof, relates to a system and method for optically tracking moving objects within a defined space and, more particularly but not exclusively, to an optical tracking system and method for herd management.
  • An aspect of some embodiments of the present invention provides for an optical tracking system that simultaneously tracks a plurality of asynchronous optical tags positioned on objects that are housed in a defined space subjected to outdoor lighting conditions.
  • the optical tracking system is an RTLS.
  • the optical tracking system includes a plurality of optical tags, each tag operable to emit and/or transmit a modulated optical signal, one or more video cameras for capturing a video stream of the optical signal transmitted by the tags, and a processor for decoding and processing the modulated optical signal captured by the video camera.
  • the optical tracking system is operable to track herd animals, e.g. cows housed in a shed, in an enclosure e.g. a roofed cow shed or other enclosure.
  • an optical tag is positioned on each animal and one or more video cameras are positioned over the shed, e.g. under the roof of the shed, to track positioning of the animals in the shed.
  • the modulated signal transmitted by the tag encodes information used to identify the tagged animal and/or provide information with regard to the tagged animal.
  • a video camera is positioned at a distance between 3-10 meters above or up to 50 meters away from the optical tags, e.g. to cover a space defined by 10 to 60 meter diameter.
  • optical tags can be detected in a shed covering about 30 meter diameter with a resolution of about 10-15 cm using a single video camera.
  • the video camera captures images of the defined space at a rate ranging between 40-80 frames per second, e.g. 60 frames per second.
  • the camera captures a frame over a short exposure period, e.g. between 15-50 μsec or up to 1 msec.
  • the camera captures a frame over an exposure period between 10 μsec and 200 msec.
  • shorter exposure periods are used to reduce the effect of sunlight or other sources of constant light on the output of the camera.
  • exposure periods are varied over different parts of the day, e.g. longer exposure periods at night and shorter exposure periods during daylight.
  • the optical tags include one or more LEDs (light emitting diodes) that provide an optical signal in the NIR (near infrared) range, e.g. between 850-950 nm wavelength, 850-980 nm wavelength and/or 850-1000 nm wavelength, and are controlled by a tag controller integrated in the tag.
  • NIR near infrared
  • the present inventors have found that it is advantageous to provide optical signals in a range between 850-950 nm when tracking optical signals with a video camera in outdoor lighting conditions.
  • the present inventors have found that, when using optical filtering, optical signals ranging between 850-950 nm can be detected in strong sunlight conditions at distances ranging between 3-50 meters using an off-the-shelf camera.
  • the tag controller pulses the one or more LEDs at a defined rate and fixed intensity and the modulation is provided by ON / OFF pulsing of the LEDs over a defined period, e.g. pulse width modulation or frequency modulation.
  • the rate of pulsing is defined to be significantly higher than the frame rate of the camera, and the intensity level of the optical signal captured in each frame is controllably modulated by altering the number of ON pulses per frame capture period.
  • the pulsing frequency ranges between 0.2-66 kHz, e.g. 33 kHz.
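The relationship between pulsing and frame capture can be sketched numerically (an illustrative fragment; the pulse patterns and window sizes are ours; at 33 kHz pulsing and 60 frames per second, roughly 550 pulse periods fall inside each frame period):

```python
def captured_intensity(pulse_pattern, pulses_per_frame):
    """Integrate ON pulses over each frame capture period: the camera
    cannot resolve individual pulses, so the per-frame intensity is
    proportional to the number of ON pulses during that frame."""
    return [sum(pulse_pattern[start:start + pulses_per_frame])
            for start in range(0, len(pulse_pattern), pulses_per_frame)]

# The tag raises its ON count each frame to produce a rising trend:
pattern = [1, 0, 0, 0] * 2 + [1, 1, 0, 0] * 2 + [1, 1, 1, 0] * 2
print(captured_intensity(pattern, 8))  # [2, 4, 6]
```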
  • an optical tracking system processing unit receives a stream of images, e.g. image frames from the video camera, determines a location on the image frames of each of the optical signals captured, and decodes information provided by each of the optical signals by detecting trends in intensity levels at each of the determined locations as detected over a plurality of frames.
  • information is provided by a plurality of bits where each bit is defined by a trend.
  • binary encoding is used.
  • each digit of the binary code, e.g. logical '0' and logical '1', is defined by a specified trend so that obstruction of the optical signal is not confused with input provided by the signal.
  • a downward trend of intensity over a pre-defined number of frames is defined as a logic '0' bit while an upward trend of intensity over the pre-defined number of frames is defined as a logic '1' bit.
  • additional trends are defined and used to represent additional symbols.
  • at least three data points are used to define a trend, e.g. to define each bit of information.
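One way to realize such a trend test over at least three data points is the sign of a least-squares slope (our interpretation; the patent does not specify how the trend is computed):

```python
def decode_bit(intensities):
    """Decode one bit from per-frame intensities at a tag's location:
    a rising trend yields logic '1', a falling trend logic '0'."""
    n = len(intensities)
    if n < 3:
        raise ValueError("a trend needs at least three data points")
    mean_x = (n - 1) / 2            # mean of frame indices 0..n-1
    mean_y = sum(intensities) / n
    # The slope's sign equals the sign of this covariance-like numerator.
    slope = sum((x - mean_x) * (y - mean_y)
                for x, y in enumerate(intensities))
    return 1 if slope > 0 else 0

print(decode_bit([50, 80, 120, 170]))  # 1 (upward trend)
print(decode_bit([200, 150, 90, 40]))  # 0 (downward trend)
```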
  • an optical tracking system processing unit receives a stream of images, e.g. image frames from the video camera, determines a location and/or area on the image frames where emission from the optical signals is captured, and determines whether any of the optical signals provide a predefined code and/or predefined trend.
  • the present inventors have found that, when operating in variable lighting conditions, e.g. outdoor lighting conditions, it is advantageous to decode information based on trends and/or patterns of intensities defined in relation to intensities in neighboring frames, as opposed to traditional methods of detecting and decoding the optical signal based on an absolute threshold level of intensity.
  • the present inventors have found that traditional methods of detecting and decoding the optical signal based on an absolute threshold level of intensity can be problematic in outdoor lighting conditions, since the background light varies significantly over the course of the day and in response to different weather conditions or other disturbances.
  • when decoding is based on relative intensities in neighboring frames, changes in the surrounding lighting conditions are typically insignificant over the time period for detecting each bit or trend, and inaccuracies related to long-term changes in lighting conditions, e.g. occurring over a span of a few hours, can be avoided.
  • decoding information based on trends and/or patterns of intensities defined in relation to intensities in neighboring frames avoids inaccuracies occurring due to variations in LED performance in response to changes in temperature which can be significant in outdoor conditions.
  • when decoding is based on relative intensities in neighboring frames, changes in the outdoor temperature are typically insignificant over the time period for detecting each bit.
  • One other advantage in decoding information based on trends and/or patterns of intensities defined in relation to intensities in neighboring frames is that since the trend is identified over a plurality of frames (as opposed to a single frame), if one of the data points in the trend is missed due to temporary obstruction in the field of view, e.g. due to a bird flying past the video camera, the trend may still be recognized.
  • the optical tracking system processing unit receives and processes a video image stream constructed by subtracting corresponding pixel values of consecutive pairs of neighboring image frames in an original video stream captured by the video camera.
  • the subtracted images eliminate much of the noise originating from light sources other than the optical signals and increase the SNR of the optical signal.
  • much of the input from sources other than the optical signals, e.g. from the cows in the shed and/or from sunlight, is eliminated and/or reduced when subtracting neighboring pairs of images, since its variation per frame capture, e.g. due to changes in lighting conditions or due to movement of the cows, is relatively low.
  • when a frame rate of 60 frames per second is used, an upward and/or downward trend can be detected over four or more frames, providing a communication rate of about 5-15 bits per second depending on how many samples are required to identify the bit.
  • a cow may move during transmission of an identification code so that the location of the pixels providing the identification code of the tag changes during transmission.
  • an interest region surrounding the location of pixels providing an identification code is defined to cover expected movement during transmission and output of all the pixels in the interest region is examined during decoding.
  • an average value of pixels in the interest region is examined for decoding.
  • high/low values within the interest regions are detected and used for decoding.
  • history tracking is used to track location of the pixels receiving the identification code.
  • predetermined information regarding the possible rate of movement and/or minimum distances between optical tags is used for history tracking.
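A simplified sketch of the interest-region and history-tracking ideas above (illustrative only; the patent gives no algorithm, and re-centering on the brightest pixel is our assumption):

```python
import numpy as np

def region_mean(frame, center, radius):
    """Average intensity over a square interest region around the last
    known tag location; radius is sized to cover expected movement."""
    r, c = center
    window = frame[max(0, r - radius):r + radius + 1,
                   max(0, c - radius):c + radius + 1]
    return float(window.mean())

def update_location(frame, center, radius):
    """History tracking, crudely: re-centre the interest region on the
    brightest pixel inside it."""
    r0, c0 = max(0, center[0] - radius), max(0, center[1] - radius)
    window = frame[r0:r0 + 2 * radius + 1, c0:c0 + 2 * radius + 1]
    dr, dc = np.unravel_index(np.argmax(window), window.shape)
    return (int(r0 + dr), int(c0 + dc))

frame = np.zeros((20, 20))
frame[11, 9] = 255.0                        # the tag moved since the last frame
print(update_location(frame, (10, 10), 3))  # (11, 9)
```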
  • one or more optical tags include a narrow band filter and modulation of the optical signal is provided by wavelength modulation.
  • optical filters with a 10-40 nm passband within the defined range between 850-950 nm are used for encoding.
  • each optical tag includes a plurality of LEDs, with different filters for increasing the number of combinations possible. The LEDs with the different filters in each tag may be synchronized to emit simultaneously so as to be detected in a single frame and/or consecutively for detection over a plurality of frames.
  • optical tags may include both wavelength modulation capability and intensity modulation capability, e.g. based on pulse width and/or pulse frequency modulation, and decoding may be based on both.
  • wavelength modulation is detected with a plurality of filters associated with the camera, e.g. similar to RGB filters in a color video camera.
  • one or more optical tags include a reflector, e.g. retro-reflectors adapted to reflect light to one or more video cameras.
  • a light source associated with one or more video cameras and/or positioned in the vicinity of the video camera emits light toward the reflectors.
  • the light source emits light in the NIR range, e.g. between 850-1000 nm wavelength.
  • the optical tags include optical filters and only a wavelength band specified by the optical filter is reflected off the reflectors.
  • the filters are narrow band filters.
  • the video camera is equipped with corresponding filters for identifying the wavelength band(s) of the reflected light.
  • a liquid crystal shutter is used to pulse the reflected light.
  • An aspect of some embodiments of the present invention provides for a herd management system and method for tracking behavioral aspects of a herd moving within a defined space and for identifying and reporting pre-defined behaviors of the herd and/or pre-defined behavioral changes in the herd.
  • the present inventors have found that tracking behavior of the herd as a group can provide valuable information regarding the conditions in a shed as well as the well-being of the animals.
  • gathering of the herd in a particular part of the shed over an extended period of time e.g. a few hours and/or evacuation of a particular part of the shed over an extended period of time is identified and reported.
  • the present inventors have found that information regarding atypical positioning of the herd in particular parts of the shed may indicate that parts of the shed lack suitable conditions, e.g. not enough shade, too much wind, noise or mud, or alternatively that favorable conditions exist in particular parts of the shed. By reporting such occurrences, the farmer can identify problematic conditions and introduce changes. The effect of the changes introduced can be monitored by the herd management system.
  • changes in the general activity level of the herd are monitored by the herd management system and can be used to indicate a change in the comfort level of the animals.
  • any changes from a defined norm are tracked and reported. For example, a sudden movement of the herd may indicate an abnormal event.
  • the herd management system and method is operable to track behavior of individuals as compared to the group.
  • the optical tracking system is operable to track positions of animals within the space and a herd management unit defines one or more groups based on proximity between the animals.
  • individual animals that stray from the rest of the herd are identified and reported. Straying of an animal may indicate a poor health condition that requires medical attention and/or may indicate that a cow is socially rejected.
  • animals within the group that exhibit a particular pre-defined behavior are identified and reported.
  • animals that tend to lead the herd are identified and reported.
  • animals that refrain from approaching eating area and/or are obstructed from approaching the food when food is served are identified and reported.
  • the farmer is interested in identifying the leaders of the herd for general herd management.
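The straying test described above can be sketched as a distance check against the herd centroid computed from the tracked tag positions. This is a minimal illustration, not the disclosed implementation; the function name, the coordinate-pair input, and the `stray_factor` cutoff are assumptions:

```python
import numpy as np

def find_strays(positions, stray_factor=2.5):
    """Flag animals whose distance from the herd centroid exceeds
    stray_factor times the mean distance of all animals from it."""
    pts = np.asarray(positions, dtype=float)
    centroid = pts.mean(axis=0)                       # herd center of mass
    dists = np.linalg.norm(pts - centroid, axis=1)    # distance of each animal
    cutoff = stray_factor * dists.mean()
    return [i for i, d in enumerate(dists) if d > cutoff]
```

An analogous test run over successive time windows could flag animals that persistently keep away from the group or from the feeding area.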
  • optical tracking system 100 includes one or more optical tags 50 adapted to be fixed on animals 20.
  • each optical tag 50 emits an optical signal captured by one or more cameras 80 viewing the defined area.
  • cameras 80 include a CCD or CMOS image sensor.
  • output from each of cameras 80 is transmitted to a local processing unit 76 and then to a central processing unit 70, both of which process information emitted by optical tags 50 and captured by cameras 80.
  • local processing unit 76 includes a local pre-processing unit and a local tracking unit for processing and tracking input to each of cameras 80.
  • optical tracking system 100 tracks between 1-500, e.g. 5-50, 5-100, or 5-200 optical tags in one defined area.
  • central processing unit 70 and/or local processing unit 76 identifies and tracks position and movement of tags 50 within the defined area based on input received from cameras 80 and provides related output to one or more output devices 72 and/or a controller 73 operative to control a device associated with the animals, e.g. a controller for operating a gateway, fans, and an automatic feeder.
  • Communication between central processing unit 70, local processing units 76, cameras 80, output device 72 and controller 73 may be tethered and/or wireless communication.
  • locations of cameras 80 are registered and calibrated, so that local processing units 76 and/or central processing unit 70 are able to associate pixels in the images produced by each of the cameras with specific location coordinates in the pre-defined area defined by enclosure 85.
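The registration step maps image pixels to floor coordinates. A minimal sketch for a single downward-looking camera, assuming a flat floor, a pinhole model, and (for simplicity) the same field of view applied to both image axes; the function name and parameters are illustrative, not part of the disclosure:

```python
import math

def pixel_to_ground(px, py, img_w, img_h, cam_height_m, fov_deg,
                    cam_x=0.0, cam_y=0.0):
    """Convert a pixel position to approximate ground coordinates (meters)
    for a camera mounted at cam_height_m looking straight down."""
    # Half-width of the ground patch seen by the camera at the given height.
    half = math.tan(math.radians(fov_deg) / 2.0) * cam_height_m
    gx = cam_x + (px - img_w / 2.0) / (img_w / 2.0) * half
    gy = cam_y + (py - img_h / 2.0) / (img_h / 2.0) * half
    return gx, gy
```

For a camera at 6 m with a 90 degree angle of view, the image center maps to the point directly below the camera and the image edge maps roughly 6 m away, consistent with the tracking resolutions discussed below.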
  • identification and tracking of all tags 50 within the defined area is performed simultaneously.
  • optical tracking system 100 is adapted and used to track animals and/or other objects in a defined area confined by an enclosure 85.
  • the defined area is roofed, partially roofed and/or sheltered with a covering 60.
  • optical tracking system 100 is adapted to be used in outdoor lighting and temperature conditions.
  • the defined area is not enclosed and/or roofed.
  • tags 50 are positioned on animals 20, e.g. on the animal's back or head, so that output from tags 50 can be captured by camera(s) 80 generally positioned over the defined area, e.g. the area enclosed by enclosure 85.
  • cameras 80 are positioned under covering 60, e.g. directly under covering 60 and/or associated support beams at a height of about 3-10 meters above the ground and/or tag 50, e.g. 5-8 meters over the ground with a field of view that is generally directed downwards.
  • cameras 80 have an angle of view of between 60-150 degrees, e.g. 110-135 degrees or 135 degrees, and each views a defined area spanning 8-50 meters in width, e.g. 64-2500 square meters or 100 square meters.
  • an area of about 2500 square meters houses approximately 50- 150 animals, e.g. 100 animals.
  • cameras 80 are video cameras, e.g. digital video cameras.
  • cameras 80 include an image sensor having a resolution of 640x480 pixels.
  • cameras 80 have a resolution matched for locating and tracking optical tags 50 with a resolution of about 5-15 cm, e.g. 10 cm. Higher resolution can be achieved by reducing the area covered by the cameras' field of view and/or by using a higher resolution camera.
  • the present inventors have found that a resolution of about 10 cm is typically adequate for tracking and identifying herd animals, e.g. cows, sheep and goats.
  • a black and white video camera e.g. a video camera without RGB filters is used.
  • a video camera with an NIR filter is used.
  • cameras 80 including an NIR filter are operated using short exposure periods ranging between 1-10 msec, defined to provide enough time to capture the optical signal from tags 50 but not enough time to capture background details.
  • cameras 80 are operated using much shorter exposure periods, e.g. 10-50 μsec or up to 1 msec.
  • the camera captures a frame over an exposure period between 10 μsec and 200 msec.
  • the cameras are periodically operated at a longer exposure period, e.g. to capture still images of the defined area.
  • the still images are used for security surveillance and/or for verification of output provided by optical data.
  • optical tags 50 include at least one optical emitter, e.g. at least one Light Emitting Diode (LED) that pulses light in an NIR range, e.g. light between 700-1000 nanometers, 850-940 nanometers or 880-950 nanometers.
  • the optical emitter is pulsed with a period significantly shorter than the exposure period of the camera.
  • the optical emitter is pulsed at a rate of 1-10 kHz, e.g. 10 kHz.
  • tag(s) 50 emits a modulated signal by controlling ON/OFF pulsing pattern of its optical emitter(s).
  • ON/OFF pulsing is used to modulate intensity captured by cameras 80 over a plurality of exposure periods.
  • the optical signal carries information such as identification information and/or information received from one or more sensors.
  • the optical signal is transmitted over a time period within which animals 20 can be in motion, e.g. over a time frame of few seconds and is repeated and/or updated periodically.
  • the optical signal e.g. the pulsed signal is repeated and/or updated continuously.
  • local processing units 76 receive a stream of image frames from cameras 80, identify outputs from tag(s) 50 in the image frames, track position and movement of the tags identified in the image stream, and decode information transmitted by the tags over time.
  • decoding is performed in central processing unit 70.
  • optical tracking system 100 includes light sources 99 associated with cameras 80 adapted to emit light in a direction of a field of view of the associated camera 80.
  • light emitted by light sources 99 is picked up by one or more light sensors included in optical tags 50.
  • input received by optical tags 50 from light sources 99 is used as a signal for operating and/or triggering emission of optical tags 50.
  • light source 99 emits light at a wavelength other than the wavelength emitted by optical tags 50 and/or captured by cameras 80.
  • light picked up by optical tags 50 is used by optical tag 50 to determine a relative positioning of camera 80 and thereby select its direction of illumination.
  • one or more cameras 80 capture a stream of image frames of a field of view in which one or more tag(s) 50 are transmitting optical signals.
  • tags transmit optical signals in a field of view of each camera.
  • cameras 80 include a wide angle lens 82.
  • cameras 80 include one or more filters 83 for selecting one or more NIR bands.
  • filters selecting specific bands of NIR light are used in place of color filters typically used in video cameras.
  • output from cameras 80 is transmitted to local processing unit 76 typically including at least image pre-processing units 74 and local tracking units 75 for processing output from each camera 80.
  • Output from local processing unit 76 is transmitted to a central processing unit 70.
  • central processing unit 70 includes a processor 71 and a system controller 77. Information received by processing unit 70 is used by system controller 77 to control operation of the optical tracking system 100.
  • pre-processing unit 74 performs image processing on the captured video stream prior to analyzing information captured.
  • pre-processing units 74 transform the sampled video stream from each camera into a new video stream representing differences between contiguous frames. For example, if each frame in a video stream is represented by a matrix of pixel values, a third frame in the new video stream is obtained by subtracting a matrix of pixel values of a third frame (from the original video stream) from a matrix of pixel values of a fourth frame (from the original video stream).
  • the new video stream only includes differences between contiguous frames above a defined threshold.
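The frame-subtraction step described above can be sketched as follows. This is a minimal illustration assuming frames are grayscale arrays; the function name and the zeroing of sub-threshold differences are assumptions:

```python
import numpy as np

def difference_stream(frames, threshold=0):
    """Return a stream where each frame is the pixel-wise difference between
    contiguous captured frames; differences at or below `threshold` are zeroed."""
    diffs = []
    for prev, nxt in zip(frames[:-1], frames[1:]):
        d = nxt.astype(np.int32) - prev.astype(np.int32)  # signed differences
        d[np.abs(d) <= threshold] = 0                     # suppress small changes
        diffs.append(d)
    return diffs
```

Keeping the sign of each difference matters, since increasing and decreasing intensity trends are decoded from positive and negative values respectively.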
  • pre-processing unit 74 identifies locations on frames where an optical signal was received and crops non-relevant regions, e.g. regions with no optical signal.
  • pre-processing unit 74 compresses information received by cameras 80 prior to transmitting it to a central processing unit.
  • central processing unit 70 integrates information from the plurality of cameras 80.
  • each local tracking unit 75 operates to identify location and track movement of each optical tag transmitting within the field of view of a camera 80.
  • each local tracking unit 75 is also operative to decode optical signal received from tags 50, e.g. to decode an identification code transmitted by tag 50 and to associate each tag 50 with the information included in its optical signal.
  • decoding is performed in central processing unit 70, e.g. with processor 71.
  • output from the central processing unit 70 is reported to a reporting device (an output device) 72, e.g. a printer or a screen and/or recorded in a storage device 78.
  • reporting and storing is controlled by system controller 77 and by input received from a user via an input device 79.
  • system controller 77 sends commands to one or more controllers 73 controlling one or more devices used for managing the tagged animals.
  • controller 73 controls operation of cameras 80.
  • based on information received about the presence of a pre-defined number of optical tags in a specific area and/or the presence of a specific tag in a specific location, controller 73 operates (or controls operation of) a gate, a feeder, a fan, a sprinkler, a station in a milking parlor, and/or an alarm.
  • controller 73 provides information to system controller 77 regarding a status of one or more devices and/or receives input from one or more sensors sensing a condition affecting management of the tagged animals, e.g. an environmental condition.
  • information and/or tasks required from optical tracking system 100 are selected by user input 79.
  • system controller 77 controls operation of local processing unit 76 and/or cameras 80.
  • although local tracking unit 75 and pre-processing unit 74 are shown as discrete units, the functionality of both can be provided within a single processing unit.
  • more than one camera is associated with a single pre-processing unit and/or local tracking unit.
  • each of tags 50, 50' and 50" includes a power source 52 such as a battery, at least one optical emitter 53, e.g. an LED, and a controller 54.
  • controller 54 controls pulsing, e.g. ON/OFF pulsing of optical emitter 53.
  • controller 54 additionally (or alternatively) controls selection of the optical emitters 53 to be operated.
  • controller 54 is associated with memory for storing parameters of operation and/or coding of ON/OFF pulsing.
  • each tag 50' includes more than one optical emitter 53, each operating at a different wavelength, that transmit simultaneously or consecutively, e.g. based on a defined pattern.
  • controller 54 of tag 50' receives input from one or more sensors associated with the object or animal being tracked by tag 50'.
  • the receiver is a tethered receiver.
  • input is received by wireless communication with an antenna 58 and receiver 56.
  • antenna 58 provides for short distance reception, e.g. using Bluetooth communication.
  • input received by receiver 56 is used to define operation of tag 50' and/or output of one or more optical emitters 53.
  • input received by receiver 56 is coded by controller 54 and outputted by one or more optical emitters 53.
  • receiver 56 receives input from a user input device to control operation of tag 50'.
  • each optical tag 50" includes a plurality of light emitting elements 53, e.g. LEDs each directed at different angles to provide a wider range of illumination.
  • optical tag 50" additionally includes a light emitter 59 that is centrally positioned and illuminates at a wide angle, e.g. 120 degrees.
  • each emitter 53 has a relatively narrow angle of illumination, e.g. an LED with a 20 degree angle of illumination, and is angled at a 20-50 degree angle off surface 555, e.g. 40 degrees from surface 555.
  • optical tag 50" includes one or more light sensors 57, e.g. photodiodes for sensing a light source directed from a vicinity of camera 80 and for determining location of camera 80 with respect to optical tag 50".
  • controller 54, in response to detecting light on one or more light sensors 57, selectively operates emitters 53 and/or 59 illuminating in the direction of the sensed light.
  • FIG. 4A-4B showing simplified waveform diagrams of light pulses emitted by an optical tag over three exposure periods of a camera and FIG. 4C showing a simplified graph of actual and captured intensity levels as provided by a tag over the three exposure periods, both in accordance with some embodiments of the present invention.
  • encoding by intensity modulation is provided by modulating a frequency of ON pulses 150 over different exposure periods, e.g. exposure periods 181, 182 and 183 of the camera.
  • the tag is ON/OFF pulsed at a much higher rate than the frame capture rate and the intensity level captured by the camera is related to the number of pulses 150 of the tag.
  • the tag is pulsed every 0.05-0.5 msec, e.g. 0.03 msec or 0.1 msec, while duration of exposure for each frame is between 1-5 msec, e.g. 2 msec as shown in FIG. 4A. Exposure is repeated approximately every 17 msec for a camera capturing at a rate of 60 frames per second.
  • encoding by intensity modulation is provided by Pulse Width Modulation (PWM) of pulses 155 and/or by modulating duration of ON pulsing of pulses 150.
  • a width of an ON pulse (or a number of contiguous ON pulses 150) is modulated over a plurality of exposure periods.
  • a cycle duration of 1-5 msec, e.g. 1 msec, is used.
  • the actual intensity level 180 provided by ON/OFF pulsing as well as the intensity level captured per frame, e.g. over exposure periods 181, 182, 183 is shown to increase as the number of ON pulses 150 increases.
  • the actual intensity level 180 is modulated gradually over a plurality of exposure periods and typically at a higher frequency than the frame capture frequency.
  • a trend of increasing intensity 180 (or alternatively a trend of decreasing intensity) may be typically recognized over 2 or more frames, e.g. 3-4 frames.
  • optical tracking system is an asynchronous system so that pulsing of optical tags 50 is not synchronized with exposure periods, e.g. exposure periods 181, 182 and 183 of the camera, e.g. camera 80. Due to lack of synchronization, the number of ON pulses 150 that will occur over a specific exposure period (and thereby the intensity level over that exposure period) may not be pre-defined and will be a function of alignment between pulsing and exposure.
  • a point in time over which frames 1, 2, and 3 are captured may shift to the left (or right) depending on the start-up condition of the system, e.g. the alignment between pulsing of the optical tags and exposure of the camera.
  • although shifting may affect the intensity levels captured during each of exposure periods 181, 182 and 183, the trend in intensity level over the three capture periods can still be detected.
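The relation between pulse rate and captured intensity, including its robustness to the unknown alignment of an asynchronous system, can be illustrated with a small counting sketch (the function name, the millisecond units, and the offset parameter are illustrative assumptions):

```python
def pulses_in_exposure(pulse_period_ms, exposure_ms, start_offset_ms=0.0):
    """Count ON pulses whose start falls inside one exposure window.
    start_offset_ms models the unknown pulse/exposure alignment of an
    asynchronous (unsynchronized) system."""
    count = 0
    t = start_offset_ms
    while t < exposure_ms:
        count += 1
        t += pulse_period_ms
    return count
```

With a 2 msec exposure, a 0.5 msec pulse period contributes about 4 pulses per frame and a 0.25 msec period about 8, so halving the pulse period roughly doubles the captured intensity regardless of the exact offset; only the exact count, not the trend, depends on alignment.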
  • the exposure periods of cameras 80 are synchronized, e.g. cameras 80 are synchronized to simultaneously capture images of the defined area or synchronized to capture images of a same field of view in a sequential manner to increase the bit rate achievable.
  • cameras 80 are not synchronized.
  • encoding is provided by gradually modulating a frequency or width of ON pulsing so that a trend in intensity levels can be identified over a plurality of frames.
  • the present inventors have found that encoding by trends as opposed to defined levels in intensity can be used to overcome ambiguity (or reduce noise) due to changes in outdoor light conditions and/or background lighting conditions. Additionally, the present inventors have found that encoding by trends can overcome ambiguity due to lack of synchronization between the cameras and the optical tags.
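As a sketch of the encoding side, a bit string can be mapped to a per-exposure-period pulse-count schedule, ramping the count up for each '1' and down for each '0'. The function name, the count range, and the use of three frames per bit are assumptions for illustration:

```python
def bits_to_pulse_counts(bits, frames_per_bit=3, low=4, high=16):
    """Map bits to a pulse-count schedule, one count per exposure period:
    each 1 ramps counts upward across its frames, each 0 ramps downward."""
    schedule = []
    for b in bits:
        for k in range(frames_per_bit):
            frac = k / (frames_per_bit - 1)  # 0.0 at start of bit, 1.0 at end
            level = low + frac * (high - low) if b else high - frac * (high - low)
            schedule.append(round(level))
    return schedule
```

Because only the direction of the ramp carries information, the decoder does not need to know the absolute intensity produced by any particular count.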
  • FIGS. 5A, 5B and 5C showing exemplary simplified encoded patterns of outputs obtained from image streams in accordance with some embodiments of the present invention.
  • digital encoding is used to transmit information from optical tag 50 to camera 80.
  • a trend of increasing intensity 310 over a plurality of exposure periods encodes a logical '1' digit while a trend in decreasing intensity 320 over a plurality of exposure periods (frames) encodes a logical '0' digit.
  • any upward trend defining a slope above a pre-defined first threshold is identified as a logical '1' while any downward trend with a slope below a pre-defined second threshold is defined as a logical '0'.
  • each bit is defined over three exposure periods, e.g. as shown in FIGS. 5A and 5C.
  • since the optical tracking system is not synchronized, defining the trend over at least three exposure periods avoids ambiguity in situations where the camera samples the output from tag 50 while the tag is changing trends.
  • each bit can be defined over two exposure periods.
  • a trend in intensity levels is defined over four exposure periods (FIG. 5B).
  • a trend of increasing intensity 315 over four exposure periods is used to encode a logical '1' bit while a trend of decreasing intensity 325 over four exposure periods is used to encode a logical '0' bit.
  • more than four exposure periods are used to define a trend, e.g. 5-10 exposure periods.
  • increasing the number of exposure periods used to identify each bit reduces the achievable bit rate but increases the accuracy in which the trend can be identified.
  • when using a frame capture rate of 60 frames per second, the optical tracking system provides a bit rate between 15-30 bits per second, e.g. 20 bits per second.
  • an optical tag 50 transmits an 8 bit identification code within 0.25-0.6 seconds, e.g. 0.4 seconds.
  • shorter or longer codes can be transmitted by optical tag 50, e.g. 4-16 bit codes for identifying optical tags 50 and/or for transmitting additional information, e.g. sensor readings measuring temperature of animal or surrounding environment or activity level of animal.
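The trend-based decoding can be sketched by slope-testing fixed windows of per-frame intensities. With 60 frames per second and three frames per bit, the achievable rate is 60/3 = 20 bits per second, i.e. an 8 bit identification code in about 0.4 seconds. The function name and the slope threshold are illustrative assumptions:

```python
def decode_trend_bits(intensities, frames_per_bit=3, min_slope=1.0):
    """Decode bits from per-frame intensity samples: a sufficiently steep
    upward trend over a window is a '1', a downward trend is a '0'."""
    bits = []
    for i in range(0, len(intensities) - frames_per_bit + 1, frames_per_bit):
        window = intensities[i:i + frames_per_bit]
        # Average slope across the window; absolute levels are irrelevant.
        slope = (window[-1] - window[0]) / (frames_per_bit - 1)
        if slope >= min_slope:
            bits.append(1)
        elif slope <= -min_slope:
            bits.append(0)
    return bits
```

A real decoder would also need to find bit boundaries, since the camera is not synchronized with the tag; the fixed-stride windows here assume alignment is already known.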
  • encoding other than digital encoding is used to transmit information from tags 50 to cameras 80.
  • a trend of steady intensity levels 330 over a plurality of exposure periods is used to encode a bit 'A',
  • a trend of increasing intensity 340 over a plurality of exposure periods is used to encode a bit 'B', and
  • a trend of decreasing intensity 350 over a plurality of exposure periods is used to encode a bit 'C'.
  • the actual values of intensity per frame do not define the bit. This is clearly exemplified, for example, with respect to the two instances of bit 'A' in FIG. 5C, each of which is defined using different intensity levels.
  • different rates of increasing (or decreasing) intensities e.g. different slopes are used to encode different bits.
  • a sharp increase in intensity represents one bit value while a shallow increase in intensity represents another bit value.
  • Similar encoding may be provided for decreasing intensities.
  • threshold values are defined to distinguish between different rates of changes in intensities and thereby to decode the information.
  • FIG. 6A and 6B showing an exemplary simplified encoded pattern of outputs obtained from an original image stream and from an image stream constructed by subtracting pixel values from contiguous image frames in accordance with some embodiments of the present invention.
  • analysis of output from the optical tags is performed on a stream of images constructed by subtracting pixel values from contiguous image frames of an image stream captured by cameras 80.
  • performing analysis on a stream of differences between images is advantageous since it reduces the noise and increases the signal to noise ratio of input obtained from the optical tags.
  • input due to steady light conditions and/or slow changes in light intensities, e.g. background lighting, is substantially removed in the difference stream.
  • trends of increasing intensities 315 over four exposure periods are identified by three consecutive positive values 415 in a difference image stream.
  • trends of decreasing intensities 325 over four exposure periods are identified by three consecutive negative values 425 in a difference image stream.
  • one or more threshold values 400 are used to distinguish between data contributing to an increasing or decreasing trend, e.g. values 415 and 425 and transition values 450 between bits.
  • different levels, e.g. positive and negative magnitudes, of values are used to identify additional bits represented by different rates of increasing (or decreasing) intensities.
  • one or more optical tags e.g. 0-2000 tags transmit an optical signal within a defined area.
  • the optical signal is a pulsed signal that provides encoded information based on pulse width modulation and/or pulse frequency modulation.
  • one or more cameras 80 capture a video stream of the defined area (block 710).
  • the optical tracking system is an asynchronous system and the optical signals emitted by the tags are not synchronized with frame capturing of cameras 80.
  • up to 100 tags concurrently transmit an optical signal within a field of view of a camera capturing the video stream and the optical signal is captured by the camera over a plurality of frames.
  • one or more cameras 80 capture image data over a plurality of distinct wavelengths, e.g. using a plurality of filters similar to RGB filters used in colored cameras.
  • the distinct wavelengths are narrow band wavelengths in the NIR range.
  • captured video streams are transformed into a difference video stream where each frame in the difference video stream represents differences in light intensities between two contiguous frames in the captured video stream (block 720).
  • the difference video stream is used to increase the signal to noise ratio of the emitted optical signal.
  • the optical signal is modulated at a frequency that is higher than a frame capture rate of the camera as compared with other sources of light captured by the cameras that are typically modulated at a frequency that is lower than a frame capture rate of the camera.
  • frames in the video stream are analyzed to identify data sites, e.g. pixels receiving an optical signal from the optical tags (block 730).
  • data sites are identified as areas in an image, e.g. difference image associated with a magnitude greater than a threshold magnitude.
  • each data site includes a plurality of pixels.
  • an average of all pixels included in the data site is used to determine an output of the data site and/or to determine the optical signal.
  • both positive values above a threshold e.g. representing a significant increase in intensity and negative values below a threshold, e.g. representing a significant decrease in intensity are identified as data sites.
  • a plurality of pixels is identified for each data site.
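Identifying data sites by thresholding a difference frame and grouping neighbouring pixels can be sketched with a small flood fill. The 8-connected grouping and the function name are assumptions; the disclosure only requires that nearby above-threshold pixels form one site:

```python
import numpy as np
from collections import deque

def find_data_sites(diff_frame, threshold):
    """Group pixels whose difference magnitude exceeds `threshold`
    into data sites using an 8-connected flood fill."""
    mask = np.abs(diff_frame) > threshold   # both strong increases and decreases
    seen = np.zeros(mask.shape, dtype=bool)
    h, w = mask.shape
    sites = []
    for y in range(h):
        for x in range(w):
            if mask[y, x] and not seen[y, x]:
                group, queue = [], deque([(y, x)])
                seen[y, x] = True
                while queue:
                    cy, cx = queue.popleft()
                    group.append((cy, cx))
                    for ny in range(max(cy - 1, 0), min(cy + 2, h)):
                        for nx in range(max(cx - 1, 0), min(cx + 2, w)):
                            if mask[ny, nx] and not seen[ny, nx]:
                                seen[ny, nx] = True
                                queue.append((ny, nx))
                sites.append(group)
    return sites
```

Averaging the values of the pixels in each returned group then gives the per-frame output of that data site.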
  • analysis is performed on-line as the frames are being captured.
  • movement e.g. displacement of each data site is tracked through the image stream (block 740).
  • tracking of tag positioning and movement is based on predetermined knowledge regarding movement pattern or minimum distances between the tags and/or based on parameters of optical signal transmitted by the tag.
  • the tags move (and thereby the data sites on which the optical signal is identified change) over a period during which an identification code of the tag (or other code) is in the process of being transmitted.
  • tracking is performed based on a known moving pattern and/or known minimum distances between the tags.
  • position (coordinates) of each tag within the defined area is determined and recorded from tracking of the data sites in the stream.
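Nearest-neighbour association under a known minimum spacing between tags can be sketched as follows. The function name, the half-spacing acceptance gate, and the coordinate-pair representation are assumptions for illustration:

```python
import math

def match_sites(prev_positions, new_positions, min_tag_spacing):
    """Match each tracked data site to its nearest site in the next frame;
    only matches closer than half the known minimum spacing between tags
    are accepted, so two distinct tags cannot be confused."""
    matches = {}
    for i, p in enumerate(prev_positions):
        best_j, best_d = None, min_tag_spacing / 2.0
        for j, q in enumerate(new_positions):
            d = math.dist(p, q)
            if d < best_d:
                best_j, best_d = j, d
        matches[i] = best_j  # None if the tag moved too far or disappeared
    return matches
```

Carrying the match index forward frame by frame lets a partially transmitted identification code be accumulated for a moving tag.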
  • trends in output of each optical tag (or at least one optical tag) as captured over a plurality of frames are identified (block 750).
  • the output is an intensity level.
  • trends in intensity levels over specific wavelengths are identified.
  • trends include an increase and/or a decrease in an intensity level of a data site over a pre-defined number of frames.
  • a trend is defined by a slope of an increase and/or decrease in intensity.
  • image frames include a plurality of data sites obtained from a plurality of tags.
  • output from a plurality of cameras is integrated to track objects over a larger area and/or from different angles.
  • the optical tracking system decodes identified trends and determines information encoded by the trends (block 760).
  • the trends encode identification information and decoding of the identified trends is used to identify the tracked objects (block 770).
  • the decoded information provides information regarding the tracked animal, e.g. body temperature and activity level.
  • coordinates of each tag are determined and tracked over time (block 780).
  • FIG. 8 showing a simplified schematic illustration of identified data sites in a portion of an image frame in accordance with some embodiments of the present invention.
  • one or more groups of pixels, e.g. pixel groups 801, 802, 803 and 804, are identified as data sites in an image frame 800 (only a portion of image frame 800 is shown).
  • the optical signal for each data site is captured on one or more pixels 810 and additionally on one or more surrounding pixels 820 at a relatively lower intensity as compared to pixels 810.
  • pixels 810 represent input received from light emitted directly toward camera 80 and pixels 820 represent input due to leakage from pixels 810, e.g. a saturated pixel.
  • pixels 820 additionally represent input received from scattered light from emission of optical tags 50.
  • pixels 810 and pixels 820 are tracked and analyzed when identifying trends in output of the pixels in a video stream. It is noted that although in FIG. 8 only one pixel 810 is shown for each data site, in practice more than one pixel 810 can be identified for each data site.
  • FIG. 9 showing a simplified flow chart of an exemplary method for decoding information received from an optical tag by analyzing output from pixels surrounding saturated pixels in accordance with some embodiments of the present invention.
  • captured data from contiguous frames are subtracted to detect changes in illumination levels (block 905).
  • a new video stream is constructed from the subtracted data and each pixel in the new video stream represents a difference between corresponding pixel values in the contiguous frames.
  • pixels (in the new video stream) having an absolute value above a defined threshold level are used to identify data sites (or potential data sites) (block 910).
  • One or more data sites can be identified in each difference frame.
  • data sites are identified from frames of an original video stream, and/or pre-processed video stream before subtraction between contiguous frames is performed.
  • a plurality of pixels belonging to a single data site are grouped based on proximity between the pixels and/or based on a known area occupied by each optical signal on a frame.
  • an area on a frame occupied by an optical signal is a function of the distance between the tag and the camera and the size of the optical emitter.
  • when pixels 810 are saturated, only pixels 820 will be identified in the new video stream in response to thresholding and pixels 810 will be missed.
  • pixels surrounding and/or neighboring identified pixels are checked, e.g. based on data from an original video stream, to locate saturated pixels that are part of the data site but were missed in the thresholding step (block 920).
  • identified saturated pixels are also included as part of the data site.
  • a location of an optical tag is tracked based on output from a saturated and/or near-saturated pixel, e.g. pixels 810 (block 930), while trends in changing intensity levels are determined based on identified pixels with an absolute value above the defined threshold, e.g. pixels 820 (block 940).
  • saturated pixels are ignored and tracking is based only on pixels identified in response to thresholding.
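Recovering saturated pixels that the difference threshold misses can be sketched by checking the neighbours of already-identified site pixels in the original frame. The `sat_level` value and the function name are illustrative assumptions:

```python
import numpy as np

def augment_with_saturated(frame, site_pixels, sat_level=250):
    """Add saturated neighbours of thresholded pixels to a data site;
    a fully saturated pixel barely changes between frames, so it is
    missed when thresholding the difference stream."""
    h, w = frame.shape
    site = set(site_pixels)
    for (y, x) in list(site):
        for ny in range(max(y - 1, 0), min(y + 2, h)):
            for nx in range(max(x - 1, 0), min(x + 2, w)):
                if frame[ny, nx] >= sat_level:
                    site.add((ny, nx))
    return site
```

The saturated pixels found this way can then anchor position tracking, while the surrounding unsaturated pixels carry the intensity trend used for decoding.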
  • output from the new video stream is used to decode output from one or more optical tags (block 950).
  • decoding of the tags provides information for identifying the tracked objects on which the tags are positioned (block 960).
  • additional or other information is provided by the coded signals.
  • FIG. 10 showing an exemplary schematic illustration of an alternate optical tracking system tracking herd animals confined within a defined area in accordance with some embodiments of the present invention.
  • optical tracking system 110 includes optical tags 55 fixed on an animal 20 that reflect an optical signal in response to emission by one or more light sources 99.
  • light reflected from optical tags 55 is captured by one or more cameras 80 viewing the defined area.
  • Output from camera(s) 80 is transmitted to a local processing unit 76 that processes information received from optical tags 55.
  • local processing unit 76 identifies and tracks position and movement of tags 55 and provides output to a central processing unit 70 for further processing.
  • Output from optical tracking system 110 is reported to a user with one or more output devices 72 and/or transmitted to a controller 73 operative to control a device associated with the animals, e.g. a controller for operating a gateway, fans, and an automatic feeder.
  • Communication between local processing unit 76, cameras 80, central processing unit 70 and output device 72 and controller 73 may be tethered and/or wireless communication.
  • optical tracking system 110 is adapted to be used in outdoor lighting and temperature conditions as described in reference to FIGS. 1-2.
  • a black and white video camera e.g. a video camera without RGB optical filters is used.
  • optical filters typically used to provide color images are replaced by narrow band filters selecting bandwidths spanning 5-50 nm in width within a range 850-950 nm.
  • camera(s) 80 including narrow bandwidth filters are operated using short exposure periods, e.g. 1-10 msec or 2 msec, as described in reference to FIGS. 1 and 2.
  • optical tags 55 include at least one retro-reflector (typically known to reflect light generally back to its source).
  • optical tag 55 includes at least one filter, e.g. narrow band filter in NIR range that reflects light in a narrow bandwidth defined by the filter.
  • optical tags 55 are identified by the optical tracking unit 110 based on the wavelength of light reflected from them and captured by camera(s) 80.
  • optical tag 55 includes a liquid crystal shutter that is programmed to pulse reflection of optical tag 55.
  • liquid crystal shutter is pulsed with an encoded pulse.
  • one or more light sources 99 emit light in the NIR range, e.g. over a band of 850-1000 nm.
  • light sources 99 are positioned proximal to camera 80 so that light reaching tags 55 can be reflected back toward camera 80, e.g. in the case that retro-reflectors are used.
  • camera 80 is protected so that light from light source 99 is not directly received by camera 80.
  • a partially reflecting surface is used so that light from light source 99 can be transmitted from effectively the same point at which camera 80 captures the reflected light.
  • local processing unit 76 receives captured video streams from camera(s) 80, identifies the wavelength received from tag(s) 55, tracks position and movement of the tags and decodes information transmitted by the tags over time.
  • cameras 80 include a flywheel filter for sequentially filtering narrow band wavelengths within an NIR range.
  • camera 80 includes filters similar to RGB filters in a colored video for filtering narrow band wavelengths in the NIR range.
  • different cameras with a same field of view include dedicated filters for filtering at different narrow band frequencies.
  • each of tags 55, 55' and 55" include at least a reflector 510.
  • the reflector is a retro-reflector.
  • different sizes and shapes of reflector 510 are used, e.g., to distinguish between different tags 55'.
  • tags 55' and 55 are passive tags, e.g. do not require a power source.
  • a tag 55 includes one or more narrow band filters 515 covering an upper surface of the reflector 510 that operate to reflect a specific band of light in the NIR range.
  • a plurality of different filters is positioned side by side over reflector 510.
  • identification is based on one or more identified wavelength of reflected light corresponding to one or more filters (or no filters) positioned over reflector 510.
  • tag 55" is an active tag that includes a liquid crystal shutter that alternately covers and exposes reflector 510 in response to an electric signal provided by a controller 520.
  • tag 55" includes a power supply 52 for activating active filter 516 and/or controller 520.
  • one or more light emitters emit light over a predefined area (block 610).
  • the light emitters emit light within the NIR range.
  • one or more optical tags including reflectors, e.g. retro-reflector, reflect light toward one or more cameras with a field of view of the defined area (block 620).
  • one or more optical tags include one or more optical filters and reflect light within one or more specified narrow band frequencies.
  • light reflected off of one or more optical tags is captured by cameras 80 over a video stream of frames (block 630).
  • areas in captured frames receiving light from one or more optical tags are identified (block 640).
  • the wavelengths of the received light are used to identify the optical tag associated with each identified area (block 650).
  • pulsing of the reflected signal is decoded and used to identify the optical tag.
  • a location of an identified optical tag is tracked over the captured video stream (block 660).
  • the tagged object is tracked over the video stream (block 660).
  • tag 55 is identified based on output from a single frame.
  • output over a few frames is used to identify tag 55.
  • each animal of the herd is individually identified by a dedicated code transmitted by an optical tag 50 (or tag 55).
  • specific types of cows e.g. sharing a particular characteristic are identified with a same code.
  • all cows within a specified age range, a particular breed, and/or having a particular health condition may be tagged with the same code and tracked as a group.
  • objects other than the herd animal e.g. a gate are tagged and information based on positioning of these objects is used for herd management.
  • the tagged object is a sensor, e.g. temperature sensor and output from the sensor is optically transmitted to cameras 80 for decoding.
  • optical tracking system is used for herd management.
  • one or more parameters are defined to determine and track a general well being of a herd and/or a comfort level of the herd housed in an enclosure.
  • FIGS. 13A and 13B show exemplary schematic illustrations of distribution patterns of a herd identified by a herd management system in accordance with some embodiments of the present invention.
  • one or more cameras 80 capture an image stream of tags 50 positioned on herd animals 20.
  • output from cameras 80 is analyzed, e.g. by optical tracking system 100 or 110 to identify atypical grouping of the herd.
  • an assumption is made that herd animals 20 typically distribute themselves substantially evenly across an enclosure 85 as shown in FIG. 13A, and any atypical distribution of herd animals 20, e.g. distribution other than substantially even distribution within enclosure 85, is reported.
  • cluster analysis is used to characterize distribution of herd animals 20.
  • an absence of herd animals in a specific location triggers reporting.
  • output from cameras 80 is analyzed, e.g. by optical tracking system 100 or 110 to locate position of individual animals of herd.
  • grouping of herd animals 20 in a specific area of an enclosure 85, as shown in FIG. 13B is identified by optical tracking system 100 (or optical tracking system 110) and reported.
  • grouping of herd animals 20 in a specific location may indicate that the specific location has favorable conditions for herd animals 20 and/or that a vacated portion of the enclosure 87 has unfavorable conditions for herd animals 20, e.g. exposure to sun, noise, and/or wind.
  • Knowledge of atypical behavior of the herd may alert the farmer to changes required in housing of the herd animals.
  • an image is captured in response to determining such an event.
  • FIGS. 14A-14B show exemplary schematic illustrations of typical and atypical behavior of a herd animal identified by a herd management system in accordance with some embodiments of the present invention.
  • a herd management system identifies an individual animal 21 displaying atypical behavior as compared to the rest of a herd 29.
  • atypical behavior of an animal 21 as compared to a group 29 may give indication that animal 21 is discomforted by an ailment.
  • the herd management system identifies groups of animals, e.g. group 29 (FIG. 14A), and uses the grouping to identify individual animals 21 that are disassociated from a group 29 (FIG. 14B).
  • a full image is captured of animal 21 and stored for reporting.
  • an identity of animal 21 is determined by the optical tracking system.
  • a user identifies animal 21 based on the captured image.
  • FIG. 15 shows an exemplary schematic illustration of a behavior pattern of specific animals of a herd identified by a herd management system in accordance with some embodiments of the present invention.
  • a herd management system identifies a leader 22 in a group 29.
  • an animal 22 that heads the movement is identified and reported.
  • knowledge regarding leaders of a herd is known to be useful in herd management.
  • FIG. 16 shows an exemplary schematic illustration of a behavior pattern of herd animals at a specific location identified by a herd management system in accordance with some embodiments of the present invention.
  • a behavior pattern of one or more animals 20 at a specific location 88 is tracked.
  • a portion of an image frame is associated with a specific location 88, and optical signals received from location 88 are captured by a camera 80 and analyzed.
  • a number of times an animal 20 approaches a water and/or food stall 11 is tracked.
  • duration of time spent in a particular stall 11, e.g. food or water is determined.
  • an optical tracking system learns typical behavior patterns of a herd housed in an enclosure, e.g. enclosure 85 (block 360).
  • learning is provided by machine learning of an on-site herd or of a herd with the same animals during a controlled study.
  • typical behavior of the herd is learned based on input from a user.
  • cows associated with a herd are identified and tracked (block 365).
  • optical tracking system identifies any straying from the learned typical behavior.
  • the optical tracking system only identifies pre-selected types of atypical events, e.g. gathering of the herd in one specific location or straying of an individual animal from the herd.
  • animals associated with an atypical event, e.g. an animal straying from the herd, are identified and recorded (block 370).
  • a full and/or still image is captured in response to identifying an atypical event (block 375).
  • the atypical event e.g. straying from identified typical behavior is reported (block 380).
  • herd management system 120 is similar to optical tracking system 100 (or optical tracking system 110) and has an additional unit for detecting and analyzing grouping of animals 20 tracked by optical tags 50.
  • output from local processing unit 76 is used by group analysis unit 111 to define or detect behavior patterns of groups of animals and/or individual animals as compared to a group.
  • group analysis unit 111 is used to identify any atypical behavior of an animal 20 tracked by optical tag 50.
  • group analysis unit 111 is used to identify socially rejected animals 20.
  • herd management unit may similarly operate with input provided by optical tracking system 110 or other RTLS.
  • although the optical tracking systems described herein have been mostly described in reference to tracking and managing cows, persons skilled in the art will appreciate that a same and/or similar system can be applied for tracking and managing animals other than cows and for tracking and managing objects other than animals. It is additionally noted that although the optical tracking system is adapted for use in outdoor lighting and temperature conditions, persons skilled in the art will appreciate that a same system can also be used indoors.
  • the description of ranges in a range format is merely for convenience and brevity and should not be construed as an inflexible limitation on the scope of the invention. Accordingly, the description of a range should be considered to have specifically disclosed all the possible subranges as well as individual numerical values within that range. For example, description of a range such as from 1 to 6 should be considered to have specifically disclosed subranges such as from 1 to 3, from 1 to 4, from 1 to 5, from 2 to 4, from 2 to 6, from 3 to 6 etc., as well as individual numbers within that range, for example, 1, 2, 3, 4, 5, and 6. This applies regardless of the breadth of the range.
  • the term "consisting essentially of" means that compositions, methods or structures may include additional ingredients, steps and/or parts, but only if the additional ingredients, steps and/or parts do not materially alter the basic and novel characteristics of the claimed composition, method or structure.
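Returning to the distribution analysis of FIGS. 13A-13B above, a rough, hypothetical sketch of the even-distribution check might look like this (the grid size, enclosure dimensions and the 50% share threshold are illustrative assumptions, not values from the patent text):

```python
from collections import Counter

def distribution_report(positions, enclosure=(40.0, 40.0), cells=4, max_share=0.5):
    """Classify a herd distribution as 'even' or 'atypical'.

    positions: list of (x, y) animal locations inside the enclosure.
    enclosure: (width, depth) of a hypothetical 40 m x 40 m pen.
    cells:     the enclosure is divided into cells x cells equal regions.
    max_share: if a single region holds more than this fraction of the herd,
               the grouping is reported as atypical (threshold is assumed).
    """
    w, d = enclosure
    # Count how many animals fall into each grid cell.
    counts = Counter(
        (min(int(x / w * cells), cells - 1), min(int(y / d * cells), cells - 1))
        for x, y in positions
    )
    share = max(counts.values()) / len(positions)
    return "atypical" if share > max_share else "even"
```

A cluster-analysis library could replace the grid; the point is only that an unusually concentrated occupancy pattern, or an empty region, triggers a report.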

Abstract

An optical tracking system includes at least one optical emitter adapted to emit an optical signal including a modulated pattern of pulsed light, at least one camera with a field of view directed toward a pre-defined area operable to capture radiation from the optical signal over a stream of captured image frames, and a processing unit adapted to receive input from the at least one camera and to identify an area on the image frames where input was received from the optical signal based on differences in intensities between contiguous frames in the stream that receive input from the optical signal.

Description

OPTICAL TRACKING SYSTEM AND METHOD FOR HERD MANAGEMENT THEREWITH
FIELD OF THE INVENTION
The present invention, in some embodiments thereof, relates to a system and method for optically tracking moving objects within a defined space and more particularly, but not exclusively to an optical tracking system and method for herd management.
BACKGROUND OF THE INVENTION
Real-time locating systems (RTLS) are used to track and identify the location of objects in real time using badges or tags attached to or embedded in the objects and devices (readers) that receive wireless signals from these tags to determine their locations.
One common type of RTLS is based on barcodes attached to objects that can be read with a barcode reader or scanner. Barcode scanning typically provides close range tracking at specific locations, e.g. at checkpoint through which an object passes and where a reader is installed. Another known type of RTLS operates using radio frequency identification (RFID) tags attached to objects that can be read using a radio frequency (RF) interrogation device. Although RF interrogation can potentially be performed over large ranges, electromagnetic interference inherently present in the surrounding environment and cluttering in the frequency bands typically used for transmission limit the tracking range that can be practically implemented. Known RTLS based on RF interrogation is also typically used at specific checkpoints where a reader is installed.
International Patent Publication No. WO 2004/102462, entitled "Tracking System using Optical Tags," the contents of which are incorporated by reference in their entirety, describes a method for identifying objects, e.g. cows, with a camera capturing sequences of electronic images of a defined area. A tag including an optical emitter is attached to each object and each tag is driven to emit a specified color during a time slot specified for that tag. The tags are synchronized with a central unit so that each tag emits at its specified time slot and so that no more than one of the tags emits any one of the colors during any one of the time slots. The electronic images in the sequence are processed to identify, responsive to the colors of the emitted light and the time slots in which the light was emitted, the objects to which the tags are fixed.
U.S. Patent Publication No. US2005/0116821, entitled "Optical Asset Tracking System," the contents of which are incorporated by reference in their entirety, describes a system and method for optically tracking assets with attached optical tags. An image sensor having a plurality of pixels generates both video data responsive to light incident on a pixel from a respective portion of an image formed on the image sensor and a communication data signal responsive to an optical data signal incident on the pixel and emitted from an optical tag present in a corresponding part of the image. The optical tag includes an optical modulator such as an LED or a modulated reflector. Typically, the communication data signal is generated and sampled at a much higher frequency than the video data. Alternatively, it is suggested that a digital video camera or an analog video camera can be used for lower bandwidth communication of a communication data signal. When using a video camera, an additional software component is used to separate video frame data into a video stream and a data stream. The data stream is determined by comparing the intensity value from each pixel to a threshold value to determine whether an optical bit is present during a video frame.
U.S. Patent Publication No. US2004/0101308, entitled "Optical Communications Imager," the contents of which are incorporated by reference in their entirety, describes a method and an optical communication processor for receiving communications data in an image formed on a plurality of pixels. The method includes receiving active pixel information, identifying which of the pixels are receiving communications data and retrieving communication data from each of the pixels identified by the active pixel information. An active source present in the field of view of the detector generates an intensity that is greater than a known intensity resulting from the background image. The data threshold module samples the electrical signal generated by the detector according to a frequency of a data clock, e.g. at around 1 GHz or more. If the sampled electrical signal exceeds a predetermined threshold value, a logic "1" bit is generated by the data threshold module and stored in the pixel buffer; otherwise a logic "0" bit is stored.
SUMMARY OF THE INVENTION
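The per-pixel thresholding scheme described in these publications can be sketched roughly as follows (a hypothetical reconstruction for illustration, not code from any of the cited documents):

```python
def pixel_bits(samples, threshold):
    """Convert raw intensity samples from one pixel into logic bits by
    simple thresholding: a sample above the threshold yields a '1' bit,
    otherwise a '0' bit, as in the prior-art detection schemes."""
    return [1 if s > threshold else 0 for s in samples]
```

This is the baseline against which the invention's differential, trend-based decoding (below in the Summary) is an improvement: a fixed threshold is fragile under variable outdoor lighting, while frame-to-frame differences are not.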
According to an aspect of some embodiments of the present invention there is provided an asynchronous optical tracking system and method for simultaneously tracking and identifying a plurality of objects within a defined area subjected to variable lighting conditions. According to some embodiments of the present invention, one or more optical tags positioned on the objects communicate modulated optical signals to one or more image sensors in outdoor lighting conditions.
According to an aspect of some embodiments of the present invention there is provided an optical tracking system and method for monitoring behavioral aspects of a herd as a group based on identifying groups in the herd and tracking location and movement of the groups. In some exemplary embodiments, the optical tracking system and method additionally provide for monitoring behavioral aspects of individuals in the herd as compared to determined behavioral aspects of the herd and/or expected behavior of the individual. Optionally, any individuals straying from the determined behavioral aspects of the herd are identified and reported.
An aspect of some embodiments of the present invention provides an optical tracking system comprising: at least one optical emitter adapted to emit an optical signal including a modulated pattern of pulsed light; at least one camera with a field of view directed toward a pre-defined area operable to capture radiation from the optical signal over a stream of captured image frames; and a processing unit adapted to receive input from the at least one camera and to identify an area on the image frames where input was received from the optical signal based on differences in intensities between contiguous frames in the stream that receive input from the optical signal.
Optionally, the at least one optical emitter and the at least one camera are asynchronous.
Optionally, the at least one optical emitter is adapted to emit light in a near infrared wavelength.
Optionally, the near infrared wavelength is between 800-1000 nanometers.
Optionally, the at least one optical emitter includes a filter adapted to transmit light over a selected bandwidth.
Optionally, the modulated pattern of pulsed light provides an information code.
Optionally, the information code includes at least one bit of information, the bit defined by a trend in intensity over a pre-defined number of image frames, wherein the pre-defined number is greater than 2 image frames.
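A minimal sketch of this trend-based decoding, assuming four frames per bit and mapping an increasing trend to 1 and a decreasing trend to 0 (the window length and the mapping are illustrative assumptions, not values fixed by the text):

```python
def decode_trend_bits(intensities, frames_per_bit=4):
    """Decode bits from one pixel's intensity sequence, one bit per
    `frames_per_bit` contiguous frames: an overall increasing trend is
    read as 1, a decreasing trend as 0 (assumed mapping)."""
    bits = []
    for i in range(0, len(intensities) - frames_per_bit + 1, frames_per_bit):
        window = intensities[i:i + frames_per_bit]
        # Compare the window's endpoints to classify the trend.
        bits.append(1 if window[-1] > window[0] else 0)
    return bits
```

A slope fit over the window could replace the endpoint comparison; either way the bit depends only on relative change, not on an absolute threshold.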
Optionally, the bit is defined based on a trend in intensity of pixels neighboring pixels with saturated output.
Optionally, each trend is determined over at least four contiguous frames.
Optionally, the information code is a digital code.
Optionally, the trend in intensity is selected from a group including: an increasing trend in intensity and a decreasing trend in intensity.
Optionally, the trend is defined by a slope of increasing or decreasing intensity.
Optionally, the processing unit is adapted to generate a new stream of frames constructed by subtracting pixel values from pairs of contiguous frames, wherein pixel values of image frames in the new stream include both positive and negative pixel values.
Optionally, the modulated pattern of pulsed light provides an information code, wherein the information code includes at least one bit of information, the bit defined by pixel values that have a same sign (positive or negative) over a pre-defined number of image frames.
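Treating a single tracked pixel as a scalar sequence, the subtraction step and the same-sign bit rule might be sketched as follows (the run length of three and the sign-to-bit mapping are illustrative assumptions):

```python
def difference_stream(frames):
    """Construct the new stream by subtracting each frame's pixel value
    from the next one, keeping both positive and negative values."""
    return [b - a for a, b in zip(frames, frames[1:])]

def sign_bits(diffs, run=3):
    """Emit a bit for every `run` contiguous differences sharing a sign:
    all positive -> 1, all negative -> 0; mixed-sign runs are skipped
    (run length and mapping are assumed for illustration)."""
    bits = []
    for i in range(0, len(diffs) - run + 1, run):
        window = diffs[i:i + run]
        if all(d > 0 for d in window):
            bits.append(1)
        elif all(d < 0 for d in window):
            bits.append(0)
    return bits
```

Because each difference compares contiguous frames, a constant background contributes zero and only the pulsed tag produces sustained same-sign runs.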
Optionally, the information code is an identification code.
Optionally, the information code is information received from a sensor in communication with the at least one optical emitter.
Optionally, the optical tracking system provides a bit rate of 15-30 bits per second.
Optionally, the pattern of pulses is modulated by at least one of pulse width modulation and frequency modulation.
Optionally, a frequency of the pulsing is at least four times faster than a frame capture rate of the camera.
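As a toy model of why asynchronous pulsed emission produces frame-to-frame intensity variation, the captured intensity per frame can be taken as the emitter on-time overlapping each exposure window (rectangular pulses and the specific timings below are assumptions, not parameters from the text):

```python
def captured_intensity(pulses, frame_period, exposure, n_frames):
    """Integrate emitter on-time falling inside each frame's exposure
    window. `pulses` is a list of (start, end) on-intervals in seconds;
    frame k opens at k * frame_period and stays open for `exposure`."""
    out = []
    for k in range(n_frames):
        open_t, close_t = k * frame_period, k * frame_period + exposure
        on = sum(max(0.0, min(close_t, e) - max(open_t, s)) for s, e in pulses)
        out.append(on)
    return out
```

With a short exposure and modulated pulse widths, contiguous frames capture different on-time fractions, which is what the differential decoding above exploits.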
Optionally, the exposure period for capturing frames in the stream is between 10 μsec and 10 msec.
Optionally, the pre-defined area spans 100-2500 square meters.
Optionally, the optical tracking system is adapted to track herd animals housed in the pre-defined area.
Optionally, the herd animal is a cow.
An aspect of some embodiments of the present invention provides a method for optical tracking, the method comprising: emitting an optical signal including a modulated pattern of pulsed light from an optical tag; capturing radiation from the optical signal over a stream of captured image frames; and identifying an area on the image frames where input was received from the optical signal based on differences in intensities between contiguous frames in the stream that receive input from the optical signal.
Optionally, the emitting and the capturing are asynchronous.
Optionally, the optical signal is emitted in the near infrared range.
Optionally, the modulated pattern of pulsed light provides an information code, wherein the information code includes at least one bit of information, the bit defined by a trend in intensity over a pre-defined number of image frames, wherein the pre-defined number is greater than 2 image frames.
Optionally, the optical signal is adapted to be emitted and received in an area exposed to outdoor lighting and temperature conditions.
An aspect of some embodiments of the present invention provides an optical tracking system comprising: at least one optical tag adapted to transmit an optical signal in a range between 800-1000 nanometers; at least one camera adapted to capture radiation from the optical signal at a distance between 3-50 meters from the optical tag over a stream of image frames, wherein at least a portion of the frames are captured over an exposure period ranging between 1-10 msec; and a processing unit adapted to receive input from the at least one camera and identify an area on the image frames where input was received from the optical signal and to determine a location of the optical tag in the pre-defined area.
Optionally, the at least one optical tag transmits an optical signal in a selected sub-range of the range, wherein a bandwidth of the sub-range is 5-20 nanometers wide.
Optionally, the at least one camera includes a plurality of filters, each filter adapted to filter radiation received from the optical signal over a different sub-range of the range.
Optionally, the plurality of filters is applied sequentially over different image frames.
Optionally, the plurality of filters is applied to different pixels in the same image frame.
Optionally, the processing unit is adapted to determine the sub-range of the input and to identify the optical tag based on the sub-range.
Optionally, the optical tag includes a reflector and wherein the transmitted optical signal is an optical signal reflected from a light source.
Optionally, the reflector is a retro-reflector.
Optionally, the reflector includes a filter adapted to filter light received from the light source over a sub-range of the range.
Optionally, the filter is a passive element.
Optionally, the camera is adapted to periodically capture an image frame using an exposure period longer than 50 msec.
Optionally, the at least one optical tag includes at least one optical emitter adapted to emit pulsed light.
Optionally, the processing unit is adapted to identify an area on the image frames where input was received from the optical signal based on differences in intensities between contiguous frames in the stream.
Optionally, the at least one optical tag and the at least one camera are asynchronous.
Optionally, modulation of pulsing of the pulsed light provides an information code.
Optionally, the information code includes a plurality of bits of information, wherein each bit is defined by a trend in intensity defined over a pre-defined number of frames.
An aspect of some embodiments of the present invention provides a method for optical tracking, the method comprising: transmitting an optical signal in a range between 800-1000 nanometers with an optical tag; capturing radiation from the optical signal at a distance between 3-50 meters from the optical tag over a stream of image frames over an exposure period ranging between 1-10 msec; identifying coordinates on the image frames where input was received from the optical signal; and determining a location of the optical tag based on the identified coordinates.
Optionally, the optical signal is emitted in a selected sub-range of the range, wherein a bandwidth of the sub-range is 5-20 nanometers wide.
Optionally, the optical tracking system is adapted for tracking herd animals housed in an enclosure exposed to outdoor lighting and temperature conditions.
An aspect of some embodiments of the present invention provides a herd management system for managing a herd housed in an enclosure comprising: a real time locating system operable to track position and changes in position of a plurality of herd animals housed in an enclosure; a group analysis unit adapted to identify one or more groups of the herd animals within the enclosure based on input received from the real time locating system, wherein the one or more groups are defined based on a proximity between the herd animals tracked by the real time locating system, and to track positions and movement of the one or more groups, wherein the group analysis unit is adapted to determine a behavioral pattern of the herd animals based on input received from the real time locating system and on position and movement of the one or more groups; and an output device adapted to report a behavioral pattern of the herd animals.
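The proximity-based grouping performed by the group analysis unit could be sketched as chain-linkage clustering over tag positions (the 3-meter radius and the union-find approach are assumptions for illustration, not part of the claimed system):

```python
def find_groups(positions, radius=3.0):
    """Cluster animal positions into groups: two animals belong to the
    same group if a chain of pairwise distances <= radius connects them
    (3 m is an assumed proximity threshold)."""
    n = len(positions)
    parent = list(range(n))

    def root(i):
        # Find the set representative with path halving.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    # Union every pair of animals within the proximity radius.
    for i in range(n):
        for j in range(i + 1, n):
            (x1, y1), (x2, y2) = positions[i], positions[j]
            if (x1 - x2) ** 2 + (y1 - y2) ** 2 <= radius ** 2:
                parent[root(i)] = root(j)

    groups = {}
    for i in range(n):
        groups.setdefault(root(i), []).append(i)
    return list(groups.values())
```

A singleton group far from the rest would then flag a straying or socially rejected animal, as in FIGS. 14A-14B.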
Optionally, the real time locating system is adapted to determine identity of each of the herd animals that are tracked.
Optionally, the group analysis unit is adapted to track a direction of movement of the one or more groups and to identify a herd animal leading the one or more groups in the direction of movement.
Optionally, the group analysis unit is adapted to identify straying of a herd animal from a group.
Optionally, the group analysis unit is adapted to identify a socially rejected cow.
Optionally, the group analysis unit is adapted to identify gathering of the herd animals in specific locations in the enclosure.
Optionally, the group analysis unit is adapted to identify position of individual cows in the herd.
Optionally, the group analysis unit is adapted to learn a typical behavioral pattern of the one or more groups and to identify an atypical behavioral pattern.
Optionally, the real time locating system comprises:
at least one optical tag adapted to transmit an optical signal in a range between 850-950 nanometers; at least one camera adapted to capture radiation from the optical signal at a distance between 3-50 meters from the optical tag over a stream of image frames, wherein at least a portion of the frames are captured over an exposure period ranging between 1-10 msec; and a processing unit adapted to receive input from the at least one camera and identify an area on the image frames where input was received from the optical signal and to determine a location of the optical tag in the pre-defined area.
Optionally, the real time locating system comprises: at least one optical emitter adapted to emit an optical signal including a modulated pattern of pulsed light; at least one camera with a field of view directed toward a pre-defined area operable to capture radiation from the optical signal over a stream of captured image frames; and a processing unit adapted to receive input from the at least one camera and to identify an area on the image frames where input was received from the optical signal based on differences in intensities between contiguous frames in the stream that receive input from the optical signal.
Unless otherwise defined, all technical and/or scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the invention pertains. Although methods and materials similar or equivalent to those described herein can be used in the practice or testing of embodiments of the invention, exemplary methods and/or materials are described below. In case of conflict, the patent specification, including definitions, will control. In addition, the materials, methods, and examples are illustrative only and are not intended to be necessarily limiting.
BRIEF DESCRIPTION OF THE DRAWINGS
Some embodiments of the invention are herein described, by way of example only, with reference to the accompanying drawings. With specific reference now to the drawings in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of embodiments of the invention. In this regard, the description taken with the drawings makes apparent to those skilled in the art how embodiments of the invention may be practiced. In the drawings:
FIG. 1 is an exemplary schematic illustration of an optical tracking system tracking herd animals confined within a defined area in accordance with some embodiments of the present invention;
FIG. 2 is an exemplary block diagram of an optical tracking system in accordance with some embodiments of the present invention;
FIGs. 3A, 3B and 3C are exemplary block diagrams of tags including an optical emitter in accordance with some embodiments of the present invention;
FIGs. 4A and 4B are simplified waveform diagrams of light pulses emitted by two exemplary optical tags over three exposure periods of a camera in accordance with some embodiments of the present invention;
FIGs. 4C and 4D are simplified graphs of actual and captured intensity levels as provided by an optical tag over three exposure periods in accordance with some embodiments of the present invention;
FIGs. 5A, 5B and 5C are exemplary simplified encoded patterns of outputs obtained from image streams in accordance with some embodiments of the present invention;
FIGs. 6A and 6B are exemplary simplified encoded patterns of outputs obtained from an original image stream and from an image stream constructed by subtracting pixel values from contiguous image frames in accordance with some embodiments of the present invention;
FIG. 7 is a simplified flow chart of an exemplary method for tracking an optical tag and decoding information received from an optical tag in accordance with some embodiments of the present invention;
FIG. 8 is a simplified schematic illustration of identified data sites in a portion of an image frame in accordance with some embodiments of the present invention;
FIG. 9 is a simplified flow chart of an exemplary method for decoding information received from an optical tag by analyzing output from pixels surrounding saturated pixels in accordance with some embodiments of the present invention;
FIG. 10 is an exemplary schematic illustration of an alternate optical tracking system tracking herd animals confined within a defined area in accordance with some embodiments of the present invention;
FIGs. 11A, 11B and 11C are exemplary schematic diagrams of tags including a reflector in accordance with some embodiments of the present invention;
FIG. 12 is a simplified flow chart of an exemplary method for tracking and identifying an optical tag reflecting received light in accordance with some embodiments of the present invention;
FIGs. 13A and 13B are exemplary schematic illustrations of distribution patterns of a herd identified by a herd management system in accordance with some embodiments of the present invention;
FIG. 14 is an exemplary schematic illustration of atypical behavior of a herd animal identified by a herd management system in accordance with some embodiments of the present invention;
FIG. 15 is an exemplary schematic illustration of a behavior pattern of specific animals of a herd animal identified by a herd management system in accordance with some embodiments of the present invention;
FIG. 16 is an exemplary schematic illustration of a behavior pattern of herd animals at a specific location identified by a herd management system in accordance with some embodiments of the present invention;
FIG. 17 is a simplified flow chart of an exemplary method for identifying atypical behavior of animals in a herd in accordance with some embodiments of the present invention; and
FIG. 18 is a simplified block diagram of an exemplary herd management system in accordance with some embodiments of the present invention.
DESCRIPTION OF SPECIFIC EMBODIMENTS OF THE INVENTION
The present invention, in some embodiments thereof, relates to a system and method for optically tracking moving objects within a defined space and more particularly, but not exclusively to an optical tracking system and method for herd management.
An aspect of some embodiments of the present invention provides for an optical tracking system that simultaneously tracks a plurality of asynchronous optical tags positioned on objects that are housed in a defined space subjected to outdoor lighting conditions. Typically, the optical tracking system is an RTLS (real-time locating system). According to some embodiments of the present invention, the optical tracking system includes a plurality of optical tags, each tag operable to emit and/or transmit a modulated optical signal, one or more video cameras for capturing a video stream of the optical signal transmitted by the tags, and a processor for decoding and processing the modulated optical signal captured by the video camera.
According to some embodiments of the present invention, the optical tracking system is operable to track herd animals, e.g. cows, in an enclosure, e.g. a roofed cow shed or other enclosure. Optionally, an optical tag is positioned on each animal and one or more video cameras are positioned over the shed, e.g. under the roof of the shed, to track positioning of the animals in the shed. Optionally, the modulated signal transmitted by the tag encodes information used to identify the tagged animal and/or provides information with regard to the tagged animal. According to some embodiments of the present invention, a video camera is positioned at a distance of between 3-10 meters above or up to 50 meters away from the optical tags, e.g. to cover a space defined by a 10 to 60 meter diameter. In some exemplary embodiments, optical tags can be detected in a shed covering about a 30 meter diameter with a resolution of about 10-15 cm using a single video camera. Typically, the video camera captures images of the defined space at a rate ranging between 40-80 frames per second, e.g. 60 frames per second. In some exemplary embodiments, the camera captures a frame over a short exposure period, e.g. between 15-50 μsec or up to 1 msec. Optionally, the camera captures a frame over an exposure period between 10 μsec and 200 msec. Typically, shorter exposure periods are used to reduce the effect of sunlight or other sources of constant light on the output of the camera. Optionally, exposure periods are varied over different parts of the day, e.g. longer exposure periods at night and shorter exposure periods during daylight.
According to some embodiments of the present invention, the optical tags include one or more LEDs (light emitting diodes) that provide an optical signal in an NIR (near infrared) range, e.g. between 850-950 nm wavelength, 850-980 nm wavelength and/or 850-1000 nm wavelength, and controlled by a tag controller integrated in the tag. The present inventors have found that it is advantageous to provide optical signals in a range between 850-950 nm when tracking optical signals with a video camera in outdoor lighting conditions. The present inventors have found that, when using optical filtering, optical signals ranging between 850-950 nm can be detected in strong sunlight conditions at distances ranging between 3-50 meters, using an off-the-shelf camera.
According to some embodiments of the present invention, the tag controller pulses the one or more LEDs at a defined rate and fixed intensity, and the modulation is provided by ON/OFF pulsing of the LEDs over a defined period, e.g. pulse width modulation or frequency modulation. Typically, the rate of pulsing is defined to be significantly higher than the frame rate of the camera, and the intensity level of the optical signal captured in each frame is controllably modulated by altering the number of ON pulses per frame capture period. Optionally, the pulsing frequency ranges between 0.2-66 KHz, e.g. 33 KHz.
According to some embodiments of the present invention, an optical tracking system processing unit receives a stream of images, e.g. image frames from the video camera, determines a location on the image frames of each of the optical signals captured and decodes information provided by each of the optical signals by detecting trends in intensity levels in each of the determined locations as detected over a plurality of frames. In some exemplary embodiments, information is provided by a plurality of bits where each bit is defined by a trend. In some exemplary embodiments, binary encoding is used. Optionally, each digit of the binary code, e.g. logical '0' and logical '1', is defined by a specified trend so that obstruction of the optical signal is not confused with input provided by the signal. In some exemplary embodiments, a downward trend of intensity over a pre-defined number of frames is defined as a logic '0' bit while an upward trend of intensity over the pre-defined number of frames is defined as a logic '1' bit. Optionally, additional trends are defined and used to represent additional symbols. In some exemplary embodiments, at least three data points are used to define a trend, e.g. to define each bit of information.
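The trend-based decoding described above can be sketched as follows. This is a non-limiting illustration in Python; the function name, window size and the tie-handling choice are hypothetical and not taken from the embodiments:

```python
def decode_trend_bits(intensities, samples_per_bit=4):
    """Decode bits from per-frame intensity samples at a tag's image location.

    An upward trend over a group of frames is read as a logic '1' and a
    downward trend as a logic '0'. Trends are judged from differences
    between neighboring samples rather than against an absolute
    threshold, so slow ambient-light drift cancels out.
    (Illustrative sketch; names and defaults are hypothetical.)
    """
    bits = []
    for i in range(0, len(intensities) - samples_per_bit + 1, samples_per_bit):
        window = intensities[i:i + samples_per_bit]
        diffs = [b - a for a, b in zip(window, window[1:])]
        rising = sum(1 for d in diffs if d > 0)
        falling = sum(1 for d in diffs if d < 0)
        if rising > falling:
            bits.append(1)
        elif falling > rising:
            bits.append(0)
        else:
            bits.append(None)  # ambiguous window, e.g. temporary obstruction
    return bits
```

Because each bit is defined by a majority of per-frame differences, a single missed data point, e.g. due to a bird passing through the field of view, need not destroy the trend.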
According to some embodiments of the present invention, an optical tracking system processing unit receives a stream of images, e.g. image frames from the video camera, determines a location and/or area on the image frames where emission from the optical signals is captured and determines whether any of the optical signals provide a predefined code and/or predefined trend.
The present inventors have found that when operating in variable lighting conditions, e.g. outdoor lighting conditions, it is advantageous to decode information based on trends and/or patterns of intensities defined in relation to intensities in neighboring frames, as opposed to traditional methods of detecting and decoding the optical signal based on an absolute threshold level of intensity. The present inventors have found that traditional methods of detecting and decoding the optical signal based on an absolute threshold level of intensity can be problematic in outdoor lighting conditions, since the background light varies significantly over the course of the day and in response to different weather conditions or other disturbances. When decoding based on relative intensities in neighboring frames, changes in the surrounding lighting conditions are typically insignificant over the time period for detecting each bit or trend, and inaccuracies related to long term changes in lighting conditions, e.g. occurring over a span of a few hours, can be avoided.
Additionally, the present inventors have found that decoding information based on trends and/or patterns of intensities defined in relation to intensities in neighboring frames avoids inaccuracies occurring due to variations in LED performance in response to changes in temperature, which can be significant in outdoor conditions. When decoding based on relative intensities in neighboring frames, changes in the outdoor temperature are typically insignificant over the time period for detecting each bit. Another advantage of decoding information based on trends and/or patterns of intensities defined in relation to intensities in neighboring frames is that, since the trend is identified over a plurality of frames (as opposed to a single frame), if one of the data points in the trend is missed due to a temporary obstruction in the field of view, e.g. due to a bird flying past the video camera, the trend may still be recognized.
According to some embodiments of the present invention, the optical tracking system processing unit receives and processes a video image stream constructed by subtracting corresponding pixel values of consecutive pairs of neighboring image frames in an original video stream captured by the video camera. The subtracted images provide for eliminating much of the noise originating from light sources other than the optical signals and increase the SNR of the optical signal. Typically, much of the input from sources other than the optical signals, e.g. from the cows in the shed and/or from sunlight, is eliminated and/or reduced when subtracting neighboring pairs of images, since its variation per frame capture, e.g. due to changes in lighting conditions or due to movement of the cows, is relatively low. In some exemplary embodiments, if a frame rate of 60 frames per second is used, an upward and/or downward trend can be detected over four or more frames, providing a communication rate of about 5-15 bits per second depending on how many samples are required to identify the bit.
Typically, it is assumed that a cow may move during transmission of the identification code, so that the location of the pixels providing the identification code of the tag changes during transmission. In some exemplary embodiments, an interest region surrounding the location of the pixels providing the identification code is defined to cover expected movement during transmission, and the output of all the pixels in the interest region is examined during decoding. In some exemplary embodiments, an average value of pixels in the interest region is examined for decoding. Optionally, high/low values within the interest region are detected and used for decoding.
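The interest-region averaging described above may be sketched as follows. This is a non-limiting illustration in Python using NumPy; the function name and default radius are hypothetical choices:

```python
import numpy as np

def region_intensity(frame, center, radius=5):
    """Average pixel value in an interest region around a tag's last
    known image location. The radius is sized to cover expected
    movement of the animal during transmission of the code.
    (Illustrative sketch; names and the default radius are hypothetical.)
    """
    r, c = center
    patch = frame[max(r - radius, 0):r + radius + 1,
                  max(c - radius, 0):c + radius + 1]
    return float(patch.mean())
```

The per-frame values returned for a given tag can then be fed to the trend-based decoding, so that moderate movement within the interest region does not interrupt the bit stream.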
Alternatively, in some exemplary embodiments, history tracking is used to track the location of the pixels receiving the identification code. Typically, predetermined information regarding the possible rate of movement and/or minimum distances between optical tags is used for history tracking.
Alternatively, according to some embodiments of the present invention, one or more optical tags include a narrow band filter and modulation of the optical signal is provided by wavelength modulation. In some exemplary embodiments, optical filters with a 10-40 nm pass band within the defined range between 850-950 nm are used for encoding. Optionally, each optical tag includes a plurality of LEDs with different filters for increasing the number of possible combinations. The LEDs with the different filters in each tag may be synchronized to emit simultaneously so as to be detected in a single frame and/or consecutively for detection over a plurality of frames. Optionally, optical tags may include both wavelength modulation capability and intensity modulation capability, e.g. based on pulse width and/or pulse frequency modulation, and decoding may be based on both. Typically, wavelength modulation is detected with a plurality of filters associated with the camera, e.g. similar to RGB filters in a color video camera.
According to some embodiments of the present invention, one or more optical tags include a reflector, e.g. retro-reflectors adapted to reflect light to one or more video cameras. Typically, a light source associated with one or more video cameras and/or positioned in the vicinity of the video camera emits light toward the reflectors. Optionally the light source emits light in the NIR range, e.g. between 850-1000 nm wavelength. In some exemplary embodiments, the optical tags include optical filters and only a wavelength band specified by the optical filter is reflected off the reflectors. Optionally, the filters are narrow band filters. In some exemplary embodiments, the video camera is equipped with corresponding filters for identifying the wavelength band(s) of the reflected light. Optionally, a liquid crystal shutter is used to pulse the reflected light.
An aspect of some embodiments of the present invention provides for a herd management system and method for tracking behavioral aspects of a herd moving within a defined space and for identifying and reporting pre-defined behaviors of the herd and/or pre-defined behavioral changes in the herd. The present inventors have found that tracking the behavior of the herd as a group can provide valuable information regarding the conditions in a shed as well as the well-being of the animals. In some exemplary embodiments, gathering of the herd in a particular part of the shed over an extended period of time, e.g. a few hours, and/or evacuation of a particular part of the shed over an extended period of time is identified and reported. The present inventors have found that information regarding atypical positioning of the herd in particular parts of the shed may provide an indication that parts of the shed lack suitable conditions, e.g. not enough shade, too much wind, noisy or muddy conditions, or alternatively that favorable conditions exist in particular parts of the shed. By reporting such occurrences, the farmer can identify problematic conditions and introduce changes. The effect of the changes introduced can be monitored by the herd management system.
Optionally, changes in the general activity level of the herd are monitored by the herd management system and can be used to indicate a change in the comfort level of the animals. In some exemplary embodiments, any changes from a defined norm are tracked and reported. For example, a sudden movement of the herd may indicate an abnormal event.
According to some embodiments of the present invention, the herd management system and method is operable to track the behavior of individuals as compared to the group. According to some embodiments of the present invention, the optical tracking system is operable to track positions of animals within the space and a herd management unit defines one or more groups based on proximity between the animals. In some exemplary embodiments, individual animals that stray from the rest of the herd are identified and reported. Straying of an animal may indicate a poor health condition that requires medical attention and/or may indicate that a cow is socially rejected. In some exemplary embodiments, animals within the group that exhibit a particular pre-defined behavior are identified and reported. In some exemplary embodiments, animals that tend to lead the herd are identified and reported. In some exemplary embodiments, animals that refrain from approaching the eating area and/or are obstructed from approaching the food when food is served are identified and reported. Typically, the farmer is interested in identifying the leaders of the herd for general herd management.
Reference is now made to FIG. 1 showing an exemplary schematic illustration of an optical tracking system tracking herd animals confined within a defined area in accordance with some embodiments of the present invention. According to some embodiments of the present invention, optical tracking system 100 includes one or more optical tags 50 adapted to be fixed on animals 20. According to some embodiments of the present invention, each optical tag 50 emits an optical signal captured by one or more cameras 80 viewing the defined area. Typically, cameras 80 include a CCD or CMOS image sensor. In some exemplary embodiments, output from each of cameras 80 is transmitted to a local processing unit 76 and then to a central processing unit 70, both of which process information emitted by optical tags 50 and captured by cameras 80. Typically, local processing unit 76 includes a local pre-processing unit and a local tracking unit for processing and tracking input to each of cameras 80. Optionally, optical tracking system 100 tracks between 1-500, e.g. 5-50, 5-100, or 5-200 optical tags in one defined area. Typically, central processing unit 70 and/or local processing unit 76 identifies and tracks position and movement of tags 50 within the defined area based on input received from cameras 80 and provides related output to one or more output devices 72 and/or a controller 73 operative to control a device associated with the animals, e.g. a controller for operating a gateway, fans, and an automatic feeder. Communication between central processing unit 70, local processing units 76, cameras 80, output device 72 and controller 73 may be tethered and/or wireless. Typically, the locations of cameras 80 are registered and calibrated, so that local processing units 76 and/or central processing unit 70 are able to associate pixels in the images produced by each of the cameras with specific location coordinates in the pre-defined area defined by enclosure 85.
According to some embodiments of the present invention, identification and tracking of all tags 50 within the defined area is performed simultaneously.
According to some embodiments of the present invention, optical tracking system 100 is adapted and used to track animals and/or other objects in a defined area confined by an enclosure 85. Optionally, the defined area is roofed, partially roofed and/or sheltered with a covering 60. Typically, optical tracking system 100 is adapted to be used in outdoor lighting and temperature conditions. Optionally, the defined area is not enclosed and/or roofed.
In some exemplary embodiments, tags 50 are positioned on animals 20, e.g. on the animal's back or head, so that output from tags 50 can be captured by camera(s) 80 generally positioned over the defined area, e.g. the area enclosed by enclosure 85. Optionally, cameras 80 are positioned under covering 60, e.g. directly under covering 60 and/or associated support beams, at a height of about 3-10 meters above the ground and/or tag 50, e.g. 5-8 meters over the ground, with a field of view that is generally directed downwards. According to some embodiments of the present invention, cameras 80 have an angle of view of between 60-150 degrees, e.g. 110-135 degrees or 135 degrees, and each views a defined area spanning 8-50 meters wide, e.g. 64-2500 square meters or 100 square meters. Typically, an area of about 2500 square meters houses approximately 50-150 animals, e.g. 100 animals.
In some exemplary embodiments of the present invention, cameras 80 are video cameras, e.g. digital video cameras. Optionally, off-the-shelf video cameras that operate at 60 frames per second are used. Optionally, cameras 80 include an image sensor having a resolution of 640x480 pixels. Optionally, cameras 80 have a resolution matched for locating and tracking optical tags 50 with a resolution of about 5-15 cm, e.g. 10 cm. Higher resolution can be achieved by reducing the area covered by the camera's field of view and/or by using a higher resolution camera. The present inventors have found that a resolution of about 10 cm is typically adequate for tracking and identifying herd animals, e.g. cows, sheep and goats.
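As a rough sanity check of the exemplary figures quoted above (not a formula appearing in the embodiments), the ground coverage per pixel follows from dividing the field width by the horizontal sensor resolution:

```python
# Rough ground-sampling estimate from the exemplary figures above;
# a sanity check only, not a formula from the embodiments.
field_width_m = 30.0   # exemplary shed diameter covered by one camera
sensor_px = 640        # exemplary horizontal sensor resolution

cm_per_pixel = field_width_m * 100 / sensor_px  # ~4.7 cm of ground per pixel
```

A single pixel thus corresponds to roughly 5 cm of ground under these assumptions, so the stated tracking resolution of about 10-15 cm corresponds to approximately two to three pixels.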
In some exemplary embodiments, a black and white video camera, e.g. a video camera without RGB filters, is used. Typically, a video camera with an NIR filter is used. According to some embodiments of the present invention, cameras 80 including an NIR filter are operated using short exposure periods ranging between 1-10 msec, defined to provide enough time to capture the optical signal from tags 50 but not enough time to capture background details. In some exemplary embodiments, when NIR filtering is not used, cameras 80 are operated using much shorter exposure periods, e.g. 10-50 μsec or up to 1 msec. Optionally, the camera captures a frame over an exposure period between 10 μsec and 200 msec. Optionally, the cameras are periodically operated at a longer exposure period, e.g. once every two seconds, to capture background details, e.g. of animals 20 and shed 85, that can be recorded. Optionally, when NIR filtering is used, longer exposure periods include exposure periods longer than 50 msec. Optionally, the still images are used for security surveillance and/or for verification of output provided by optical data.
According to some embodiments of the present invention, optical tags 50 include at least one optical emitter, e.g. at least one Light Emitting Diode (LED), that pulses light in an NIR range, e.g. light between 700-1000 nanometers, 850-940 nanometers or 880-950 nanometers.
In some exemplary embodiments, the optical emitter is pulsed at a rate significantly higher than the frame rate of the camera. Optionally, the optical emitter is pulsed at a rate of 1-10 KHz, e.g. 10 KHz. According to some embodiments of the present invention, tag(s) 50 emit a modulated signal by controlling the ON/OFF pulsing pattern of their optical emitter(s). Typically, ON/OFF pulsing is used to modulate the intensity captured by cameras 80 over a plurality of exposure periods. According to some embodiments of the present invention, the optical signal carries information such as identification information and/or information received from one or more sensors. Typically, the optical signal is transmitted over a time period within which animals 20 can be in motion, e.g. over a time frame of a few seconds, and is repeated and/or updated periodically. Optionally, the optical signal, e.g. the pulsed signal, is repeated and/or updated continuously.
According to some embodiments of the present invention, local processing units 76 receive a stream of image frames from cameras 80, identify outputs received from tag(s) 50 on the image frames, track position and movement of the tags identified in the image stream and decode information transmitted by the tags over time. Optionally, decoding is performed in central processing unit 70. Optionally, optical tracking system 100 includes light sources 99 associated with cameras 80 adapted to emit light in the direction of the field of view of the associated camera 80. In some exemplary embodiments, light emitted by light sources 99 is picked up by one or more light sensors included in optical tags 50. Optionally, input received by optical tags 50 from light sources 99 is used as a signal for operating and/or triggering emission of optical tags 50. Optionally, light source 99 emits light at a wavelength other than the wavelength emitted by optical tags 50 and/or captured by cameras 80. Optionally, light picked up by an optical tag 50 is used by the optical tag 50 to determine a relative positioning of camera 80 and thereby select its direction of illumination.
Reference is now made to FIG. 2 showing an exemplary block diagram of an optical tracking system in accordance with some embodiments of the present invention. According to some embodiments of the present invention, one or more cameras 80 capture a stream of image frames of a field of view in which one or more tag(s) 50 are transmitting optical signals. Typically, 0-100 tags transmit optical signals in the field of view of each camera. Optionally, cameras 80 include a wide angle lens 82. Optionally, cameras 80 include one or more filters 83 for selecting one or more NIR bands. Optionally, filters selecting specific bands of NIR light are used in place of the color filters typically used in video cameras.
In some exemplary embodiments, output from cameras 80 is transmitted to local processing unit 76, typically including at least image pre-processing units 74 and local tracking units 75 for processing output from each camera 80. Output from local processing unit 76 is transmitted to a central processing unit 70. Typically, the central processing unit includes a processor 71 and a system controller 77. Information received by processing unit 70 is used by system controller 77 to control operation of the optical tracking system 100.
According to some embodiments of the present invention, pre-processing unit 74 performs image processing on the captured video stream prior to analyzing the information captured. In some exemplary embodiments, pre-processing units 74 transform the sampled video stream from each camera into a new video stream representing the differences between contiguous frames. For example, if each frame in a video stream is represented by a matrix of pixel values, a third frame in the new video stream is obtained by subtracting the matrix of pixel values of a third frame (from the original video stream) from the matrix of pixel values of a fourth frame (from the original video stream). Optionally, the new video stream only includes differences between contiguous frames above a defined threshold. In some exemplary embodiments, pre-processing unit 74 identifies the locations on frames where an optical signal was received and crops non-relevant regions, e.g. regions with no optical signal. Optionally, the pre-processing unit compresses information received by cameras 80 prior to transmitting it to the central processing unit. In some exemplary embodiments, central processing unit 70 integrates information from the plurality of cameras 80.
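The frame-subtraction pre-processing described above may be sketched as follows. This is a non-limiting illustration in Python using NumPy; the function name and the optional threshold parameter are hypothetical:

```python
import numpy as np

def difference_stream(frames, threshold=0):
    """Transform a stream of frames into a stream of per-pixel
    differences between contiguous frames. Near-static background
    (sunlight, slowly moving animals) largely cancels in the
    differences, leaving the modulated optical tags with improved SNR.
    The optional threshold suppresses differences at or below it.
    (Illustrative sketch; names are hypothetical.)
    """
    out = []
    for prev, curr in zip(frames, frames[1:]):
        # Cast to a signed type so negative differences are preserved.
        diff = curr.astype(np.int32) - prev.astype(np.int32)
        if threshold:
            diff = np.where(np.abs(diff) > threshold, diff, 0)
        out.append(diff)
    return out
```

The resulting stream has one fewer frame than the original and can be fed directly to the location and trend-decoding steps.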
According to some embodiments of the present invention, each local tracking unit 75 operates to identify location and track movement of each optical tag transmitting within the field of view of a camera 80. Optionally, each local tracking unit 75 is also operative to decode optical signal received from tags 50, e.g. to decode an identification code transmitted by tag 50 and to associate each tag 50 with the information included in its optical signal. Optionally, decoding is performed in central processing unit 70, e.g. with processor 71.
According to some embodiments of the present invention, output from the central processing unit 70 is reported to a reporting device (an output device) 72, e.g. a printer or a screen and/or recorded in a storage device 78. Typically, reporting and storing is controlled by system controller 77 and by input received from a user via an input device 79.
Optionally, system controller 77 sends commands to one or more controllers 73 controlling one or more devices used for managing the tagged animals. Optionally, controller 73 controls operation of cameras 80. In some exemplary embodiments, based on information received about the presence of a pre-defined number of optical tags in a specific area and/or the presence of a specific tag in a specific location, controller 73 operates (or controls operation of) a gate, a feeder, a fan, a sprinkler, a station in a milking parlor, and/or an alarm. Optionally, controller 73 provides information to system controller 77 regarding a status of one or more devices and/or receives input from one or more sensors sensing a condition affecting management of the tagged animals, e.g. an environmental condition. Optionally, information and/or tasks required from optical tracking system 100 are selected by a user via input device 79. In some exemplary embodiments, system controller 77 controls operation of local processing unit 76 and/or cameras 80.
It is noted that although local tracking unit 75 and pre-processing unit 74 are shown as discrete units, functionality of both local tracking unit 75 and pre-processing unit 74 can be provided within a single processing unit. Optionally more than one camera is associated with a single pre-processing unit and/or local tracking unit.
Reference is now made to FIGs. 3A, 3B and 3C showing exemplary block diagrams of tags including an optical emitter in accordance with some embodiments of the present invention. According to some embodiments of the present invention, each of tags 50, 50' and 50" includes a power source 52 such as a battery, at least one optical emitter 53, e.g. an LED, and a controller 54. Typically, controller 54 controls pulsing, e.g. ON/OFF pulsing, of optical emitter 53. Optionally, controller 54 additionally (or alternatively) controls selection of the optical emitters 53 to be operated. Typically, controller 54 is associated with memory for storing parameters of operation and/or coding of ON/OFF pulsing.
Referring now to FIG. 3B, in some exemplary embodiments, each tag 50' includes more than one optical emitter 53, each operating at a different wavelength, that transmit simultaneously or consecutively, e.g. based on a defined pattern. Optionally, controller 54 of tag 50' receives input from one or more sensors associated with the object or animal being tracked by tag 50'. Optionally, the receiver is a tethered receiver. In some exemplary embodiments, input is received by wireless communication with an antenna 58 and receiver 56. Typically, antenna 58 provides for short distance reception, e.g. using Bluetooth communication. Optionally, input received by receiver 56 is used to define operation of tag 50' and/or output of one or more optical emitters 53. Optionally, input received by receiver 56 is coded by controller 54 and outputted by one or more optical emitters 53. Optionally, receiver 56 receives input from a user input device to control operation of tag 50'.
Referring now to FIG. 3C showing a top view of an exemplary optical tag 50". In some exemplary embodiments, each optical tag 50" includes a plurality of light emitting elements 53, e.g. LEDs, each directed at a different angle to provide a wider range of illumination. In some exemplary embodiments, optical tag 50" additionally includes a light emitter 59 that is centrally positioned and illuminates at a wide angle, e.g. 120 degrees. Optionally, each emitter 53 has a relatively narrow angle of illumination, e.g. an LED with a 20 degree angle of illumination, and is angled at a 20-50 degree angle off surface 555, e.g. 40 degrees from surface 555.
In some exemplary embodiments, optical tag 50" includes one or more light sensors 57, e.g. photodiodes for sensing a light source directed from a vicinity of camera 80 and for determining location of camera 80 with respect to optical tag 50". In some exemplary embodiments, in response to detecting light on one or more light sensors 57, controller 54 (not shown in FIG. 3C) selectively operates emitters 53 and/or 59 illuminating in the direction of the sensed light.
Reference is now made to FIGs. 4A-4B showing simplified waveform diagrams of light pulses emitted by an optical tag over three exposure periods of a camera and to FIG. 4C showing a simplified graph of actual and captured intensity levels as provided by a tag over the three exposure periods, both in accordance with some embodiments of the present invention. Referring now to FIG. 4A, according to some embodiments of the present invention, encoding by intensity modulation is provided by modulating the frequency of ON pulses 150 over different exposure periods, e.g. exposure periods 181, 182 and 183 of the camera. Typically, the tag is ON/OFF pulsed at a much higher rate than the frame capture rate, and the intensity level captured by the camera is related to the number of pulses 150 of the tag. In some exemplary embodiments, the tag is pulsed every 0.05-0.5 msec, e.g. 0.03 msec or 0.1 msec, while the duration of exposure for each frame is between 1-5 msec, e.g. 2 msec as shown in FIG. 4A. Exposure is repeated approximately every 17 msec for a camera capturing at a rate of 60 frames per second.
Referring now to FIG. 4B, alternatively, according to some embodiments of the present invention, encoding by intensity modulation is provided by Pulse Width Modulation (PWM) of pulses 155 and/or by modulating the duration of ON pulsing of pulses 150. In some exemplary embodiments, the width of an ON pulse (or a number of contiguous ON pulses 150) is modulated over a plurality of exposure periods. Optionally, for PWM, a cycle duration of 1-5 msec, e.g. 1 msec, is used.
Referring now to FIG. 4C, the actual intensity level 180 provided by ON/OFF pulsing as well as the intensity level captured per frame, e.g. over exposure periods 181, 182, 183 is shown to increase as the number of ON pulses 150 increases. In some exemplary embodiments and as shown in FIG. 4C, the actual intensity level 180 is modulated gradually over a plurality of exposure periods and typically at a higher frequency than the frame capture frequency. In some exemplary embodiments, a trend of increasing intensity 180 (or alternatively a trend of decreasing intensity) may be typically recognized over 2 or more frames, e.g. 3-4 frames.
According to some embodiments of the present invention, the optical tracking system is an asynchronous system, so that pulsing of optical tags 50 is not synchronized with exposure periods, e.g. exposure periods 181, 182 and 183 of the camera, e.g. camera 80. Due to the lack of synchronization, the number of ON pulses 150 that will occur over a specific exposure period (and thereby the intensity level over that exposure period) is not pre-defined and is a function of the alignment between pulsing and exposure.
Referring now to FIG. 4D, for an asynchronous system, a point in time over which frames 1, 2, and 3 are captured may shift to the left (or right) depending on the start-up condition of the system, e.g. the alignment between pulsing of the optical tags and exposure of the camera. Although shifting may affect intensity levels captured during each of exposure periods 181, 182 and 183, the trend in intensity level over the three capture periods can still be detected. Optionally, when more than one camera 80 is used to capture a video stream of the defined area, the exposure periods of cameras 80 are synchronized, e.g. cameras 80 are synchronized to simultaneously capture images of the defined area or synchronized to capture images of a same field of view in a sequential manner to increase the achievable bit rate. Optionally, cameras 80 are not synchronized.
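Under the stated assumption that captured intensity is proportional to the number of ON pulses per exposure, a short simulation suggests why the trend survives an arbitrary start-up offset. The pulse periods, frame interval and offsets below are illustrative values, not from the patent:

```python
import math

def captured_intensity(pulse_period_ms, offset_ms, exposure_ms=2.0):
    """Number of ON pulses landing in one exposure window (proxy for intensity)."""
    first = math.ceil(offset_ms / pulse_period_ms)
    last = math.floor((offset_ms + exposure_ms) / pulse_period_ms)
    return max(0, last - first + 1)

def captured_trend(periods_ms, offset_ms, frame_interval_ms=17.0):
    """Per-frame intensity for a tag that shortens its pulse period each frame."""
    return [captured_intensity(p, offset_ms + i * frame_interval_ms)
            for i, p in enumerate(periods_ms)]

# The same rising trend is captured whether or not the camera's exposure
# windows happen to be aligned with the tag's pulse train at start-up.
aligned = captured_trend([0.4, 0.2, 0.1], offset_ms=0.0)
shifted = captured_trend([0.4, 0.2, 0.1], offset_ms=0.73)
```

The absolute per-frame values differ with the offset, but the direction of the trend over the three frames does not.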
According to some embodiments of the present invention, encoding is provided by gradually modulating a frequency or width of ON pulsing so that a trend in intensity levels can be identified over a plurality of frames. The present inventors have found that encoding by trends, as opposed to defined levels in intensity, can be used to overcome ambiguity (or reduce noise) due to changes in outdoor light conditions and/or background lighting conditions. Additionally, the present inventors have found that encoding by trends can overcome ambiguity due to lack of synchronization between the cameras and the optical tags. Reference is now made to FIGS. 5A, 5B and 5C showing exemplary simplified encoded patterns of outputs obtained from image streams in accordance with some embodiments of the present invention. Referring now to FIG. 5A, in some exemplary embodiments, digital encoding is used to transmit information from optical tag 50 to camera 80. Optionally, a trend of increasing intensity 310 over a plurality of exposure periods encodes a logical '1' digit while a trend of decreasing intensity 320 over a plurality of exposure periods (frames) encodes a logical '0' digit. In some exemplary embodiments, any upward trend defining a slope above a pre-defined first threshold is identified as a logical '1' while any downward trend below a pre-defined second threshold is defined as a logical '0'. In some exemplary embodiments, each bit is defined over three exposure periods, e.g. as shown in FIGS. 5A and 5C. Since, according to some embodiments, the optical tracking system is not synchronized, defining the trend over at least three exposure periods (or more) avoids ambiguity in situations where the camera samples the output from tag 50 while the tag is changing trends. Optionally, each bit can be defined over two exposure periods.
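The slope-threshold decoding idea above can be sketched as follows. This is a hedged illustration: the threshold values and the three-frames-per-bit choice are example parameters, not values specified by the patent.

```python
def decode_bits(intensities, frames_per_bit=3, up_thresh=1.0, down_thresh=-1.0):
    """Decode per-frame intensity levels into bits by the slope of each window."""
    bits = []
    for i in range(0, len(intensities) - frames_per_bit + 1, frames_per_bit):
        window = intensities[i:i + frames_per_bit]
        slope = (window[-1] - window[0]) / (frames_per_bit - 1)
        if slope > up_thresh:
            bits.append('1')       # upward trend above the first threshold
        elif slope < down_thresh:
            bits.append('0')       # downward trend below the second threshold
        else:
            bits.append('?')       # ambiguous, e.g. sampled mid trend-change
    return ''.join(bits)

# Three rising frames followed by three falling frames decode to "10".
message = decode_bits([10, 20, 30, 30, 20, 10])
```

Because only the trend matters, the same message decodes correctly at any overall brightness level.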
According to some embodiments of the present invention, a trend in intensity levels is defined over four exposure periods (FIG. 5B). Optionally, a trend of increasing intensity 315 over four exposure periods is used to encode a logical '1' bit while a trend of decreasing intensity 325 over four exposure periods is used to encode a logical '0' bit. Optionally, more than four exposure periods are used to define a trend, e.g. 5-10 exposure periods. Typically, increasing the number of exposure periods used to identify each bit reduces the achievable bit rate but increases the accuracy with which the trend can be identified. In some exemplary embodiments, when using a frame capture rate of 60 frames per second, the optical tracking system provides a bit rate between 10-30 bits per second, e.g. 30 bits per second for a trend defined over two exposure periods and 10 bits per second for a trend defined over six exposure periods. Optionally, an additional start bit and/or stop bit is used before and after transmission of codes. Optionally, a break in transmission (emission) is used as a start bit and/or stop bit. Optionally, a parity bit is used for error detection. In some exemplary embodiments, an optical tag 50 transmits an 8-bit identification code within 0.25-0.6 seconds, e.g. 0.4 seconds. Optionally, shorter or longer codes can be transmitted by optical tag 50, e.g. 4-16 bit codes for identifying optical tags 50 and/or for transmitting additional information, e.g. sensor readings measuring temperature of the animal or surrounding environment, or activity level of the animal.
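The bit-rate arithmetic above follows directly from the frame rate. A quick check, where the three overhead bits (start, stop, parity) are an illustrative assumption rather than a count fixed by the patent:

```python
frame_rate = 60.0  # frames per second, per the example in the text

def bit_rate(frames_per_bit):
    """Achievable bits per second when each bit spans frames_per_bit exposures."""
    return frame_rate / frames_per_bit

def code_duration(payload_bits, frames_per_bit, overhead_bits=3):
    """Seconds to send a code plus (hypothetical) start, stop and parity bits."""
    return (payload_bits + overhead_bits) / bit_rate(frames_per_bit)
```

With two frames per bit, an 8-bit identification code plus three overhead bits takes about 0.37 seconds, consistent with the 0.25-0.6 second range stated above.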
Referring now to FIG. 5C, in some exemplary embodiments, encoding other than digital encoding is used to transmit information from tags 50 to cameras 80. In some exemplary embodiments, a trend of steady intensity levels 330 over a plurality of exposure periods is used to encode a bit 'A', a trend of increasing intensity 340 over a plurality of exposure periods is used to encode a bit 'B' while a trend of decreasing intensity 350 over a plurality of exposure periods is used to encode a bit 'C'.
Typically, when encoding based on trends, the actual values of intensity per frame do not define the bit. This is clearly exemplified, for example, with respect to the two instances of bit 'A' in FIG. 5C, each of which is defined using different intensity levels. Optionally, different rates of increasing (or decreasing) intensities, e.g. different slopes, are used to encode different bits. In some exemplary embodiments, a sharp increase in intensity represents a bit 'E' while a shallow increase in intensity represents a bit 'F'. Similar encoding may be provided for decreasing intensities. Optionally, threshold values are defined to distinguish between different rates of change in intensities and thereby to decode the information.
Reference is now made to FIGS. 6A and 6B showing an exemplary simplified encoded pattern of outputs obtained from an original image stream and from an image stream constructed by subtracting pixel values from contiguous image frames in accordance with some embodiments of the present invention. According to some embodiments of the present invention, analysis of output from the optical tags is performed on a stream of images constructed by subtracting pixel values from contiguous image frames of an image stream captured by cameras 80. In some exemplary embodiments, performing analysis on a stream of differences between images is advantageous since it reduces the noise and increases the signal to noise ratio of input obtained from the optical tags. Typically, when subtracting contiguous images, input due to steady light conditions and/or slow changes in light intensities, e.g. due to changes in background light conditions and/or movement of objects, is decreased while input due to high frequency changes in intensity, e.g. as obtained from the optical tags, is increased (emphasized). Additionally, the dynamic range of data obtained by subtracting images is effectively doubled, e.g. from 8 bits (0 to 255) to 9 bits (-255 to 255). According to some embodiments of the present invention, trends of increasing intensities 315 over four exposure periods are identified by three consecutive positive values 415 in a difference image stream. In a similar fashion, in accordance with some embodiments of the present invention, trends of decreasing intensities 325 over four exposure periods are identified by three consecutive negative values 425 in a difference image stream. Optionally, one or more threshold values 400 are used to distinguish between data contributing to an increasing or decreasing trend, e.g. values 415 and 425, and transition values 450 between bits. Optionally, different levels, e.g. positive and negative magnitudes, of values are used to identify additional bits represented by different rates of increasing (or decreasing) intensities.
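The difference-stream construction can be sketched in a few lines. The threshold value below is an illustrative assumption; the structure (a four-frame trend becoming three consecutive same-sign differences) follows the description above.

```python
def difference_stream(intensities):
    """Subtract contiguous frames: frame[i+1] - frame[i]."""
    return [b - a for a, b in zip(intensities, intensities[1:])]

def classify_window(diffs, threshold=2):
    """Map three consecutive differences to a bit; None for transition values."""
    if all(d > threshold for d in diffs):
        return '1'
    if all(d < -threshold for d in diffs):
        return '0'
    return None

rising = difference_stream([10, 20, 30, 40])    # four rising frames
falling = difference_stream([40, 30, 20, 10])   # four falling frames
```

Steady background light contributes near-zero differences, which is why this construction suppresses it while emphasizing the tag's modulated signal.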
Reference is now made to FIG. 7 showing a simplified flow chart of an exemplary method for tracking an optical tag and decoding information received from an optical tag in accordance with some embodiments of the present invention. According to some embodiments of the present invention, one or more optical tags, e.g. 0-2000 tags, transmit an optical signal within a defined area. In some exemplary embodiments, the optical signal is a pulsed signal that provides encoded information based on pulse width modulation and/or pulse frequency modulation.
According to some embodiments of the present invention, one or more cameras 80 capture a video stream of the defined area (block 710). Typically, the optical tracking system is an asynchronous system and the optical signals emitted by the tags are not synchronized with frame capturing of cameras 80. Optionally, up to 100 tags concurrently transmit an optical signal within a field of view of a camera capturing the video stream and the optical signal is captured by the camera over a plurality of frames.
According to some embodiments of the present invention, one or more cameras 80 capture image data over a plurality of distinct wavelengths, e.g. using a plurality of filters similar to RGB filters used in colored cameras. Optionally, the distinct wavelengths are narrow band wavelengths in the NIR range.
Optionally, captured video streams are transformed into a difference video stream where each frame in the difference video stream represents differences in light intensities between two contiguous frames in the captured video stream (block 720). Typically, the difference video stream is used to increase the signal to noise ratio of the emitted optical signal. Typically, the optical signal is modulated at a frequency higher than the frame capture rate of the camera, whereas other sources of light captured by the cameras are typically modulated at a frequency lower than the frame capture rate.
According to some embodiments of the present invention, frames in the video stream are analyzed to identify data sites, e.g. pixels receiving an optical signal from the optical tags (block 730). Typically, data sites are identified as areas in an image, e.g. a difference image, associated with a magnitude greater than a threshold magnitude. Typically, each data site includes a plurality of pixels. Optionally, an average of all pixels included in the data site is used to determine an output of the data site and/or to determine the optical signal. Optionally, when difference images are used, both positive values above a threshold, e.g. representing a significant increase in intensity, and negative values below a threshold, e.g. representing a significant decrease in intensity, are identified as data sites. Optionally, a plurality of pixels is identified for each data site. Optionally, analysis is performed on-line as the frames are being captured.
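The data-site identification step above can be illustrated with a small NumPy sketch. The threshold value and frame size are assumptions for the example, not values from the patent:

```python
import numpy as np

def find_data_sites(diff_frame, threshold=50):
    """Flag pixels whose absolute difference value exceeds the threshold."""
    return np.abs(diff_frame) > threshold

def site_output(diff_frame, mask):
    """Average over all pixels of the data site, used as the site's output."""
    return float(diff_frame[mask].mean())

# A 2x2 group of bright pixels in an otherwise quiet difference frame.
diff = np.zeros((6, 6), dtype=np.int16)
diff[2:4, 2:4] = 120
mask = find_data_sites(diff)
```

Using the absolute value means both strong increases and strong decreases in intensity are flagged, as the paragraph above describes.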
According to some embodiments of the present invention, movement, e.g. displacement, of each data site is tracked through the image stream (block 740). Typically, tracking of tag positioning and movement is based on predetermined knowledge regarding movement patterns or minimum distances between the tags and/or based on parameters of the optical signal transmitted by the tag. In some exemplary embodiments, the tags move (and thereby the data sites on which the optical signal is identified change) over a period during which an identification code of the tag (or other code) is in the process of being transmitted. Typically, over transmission periods of information, tracking is performed based on a known movement pattern and/or known minimum distances between the tags. Optionally, the position (coordinates) of each tag within the defined area is determined and recorded from tracking of the data sites in the stream.
According to some embodiments of the present invention, trends in output of each optical tag (or at least one optical tag) as captured over a plurality of frames, e.g. difference frames, are identified (block 750). Typically, the output is an intensity level. Optionally, trends in intensity levels over specific wavelengths are identified. In some exemplary embodiments, trends include an increase and/or a decrease in an intensity level of a data site over a pre-defined number of frames. Optionally, a trend is defined by a slope of an increase and/or decrease in intensity. Typically, image frames include a plurality of data sites obtained from a plurality of tags. Optionally, output from a plurality of cameras is integrated to track objects over a larger area and/or from different angles.
According to some embodiments of the present invention, the optical tracking system decodes identified trends and determines information encoded by the trends (block 760). According to some embodiments of the present invention, the trends encode identification information and decoding of the identified trends is used to identify the tracked objects (block 770). Optionally, the decoded information provides information regarding the tracked animal, e.g. body temperature and activity level. According to some embodiments of the present invention, coordinates of each tag are determined and tracked over time (block 780).
Reference is now made to FIG. 8 showing a simplified schematic illustration of identified data sites in a portion of an image frame in accordance with some embodiments of the present invention. According to some embodiments of the present invention, one or more groups of pixels, e.g. pixel groups 801, 802, 803 and 804, are identified as data sites in an image frame 800 (only a portion of image frame 800 is shown). According to some embodiments of the present invention, the optical signal for each data site is captured on one or more pixels 810 and additionally on one or more surrounding pixels 820 at a relatively lower intensity as compared to pixels 810. Optionally, pixels 810 represent input received from light emitted directly toward camera 80 and pixels 820 represent input due to leakage from pixels 810, e.g. a saturated pixel. Optionally, pixels 820 additionally represent input received from scattered light from emission of optical tags 50. According to some embodiments of the present invention, pixels 810 and pixels 820 are tracked and analyzed when identifying trends in output of the pixels in a video stream. It is noted that although in FIG. 8 only one pixel 810 is shown for each data site, in practice more than one pixel 810 can be identified for each data site.
Reference is now made to FIG. 9 showing a simplified flow chart of an exemplary method for decoding information received from an optical tag by analyzing output from pixels surrounding saturated pixels in accordance with some embodiments of the present invention. According to some embodiments of the present invention, captured data from contiguous frames are subtracted to detect changes in illumination levels (block 905). Optionally, a new video stream is constructed from the subtracted data and each pixel in the new video stream represents a difference between corresponding pixel values in the contiguous frames. According to some embodiments of the present invention, pixels (in the new video stream) having an absolute value above a defined threshold level are used to identify data sites (or potential data sites) (block 910). One or more data sites can be identified in each difference frame. Alternatively, data sites are identified from frames of an original video stream and/or a pre-processed video stream before subtraction between contiguous frames is performed.
Optionally, a plurality of pixels belonging to a single data site are grouped based on proximity between the pixels and/or based on a known area occupied by each optical signal on a frame. Typically, an area on a frame occupied by an optical signal is a function of the distance between the tag and the camera and the size of the optical emitter.
Typically, in cases when pixels 810 are saturated, only pixels 820 will be identified in the new video stream in response to thresholding and pixels 810 will be missed. According to some embodiments of the present invention, pixels surrounding and/or neighboring identified pixels are checked, e.g. based on data from an original video stream, to locate saturated pixels that are part of the data site but were missed in the thresholding step (block 920). In some exemplary embodiments, identified saturated pixels are also included as part of the data site.
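The neighbor check of block 920 can be sketched as follows. This is an illustrative implementation under stated assumptions: 8-bit frames where 255 marks saturation, and a 3x3 neighborhood; a saturated pixel changes little between frames, so its difference value falls below the threshold and it is recovered from the original frame instead.

```python
import numpy as np

def recover_saturated(original, flagged_mask, saturation=255):
    """Add saturated neighbors of flagged pixels to the data-site mask."""
    out = flagged_mask.copy()
    h, w = original.shape
    for y, x in zip(*np.nonzero(flagged_mask)):
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w and original[ny, nx] >= saturation:
                    out[ny, nx] = True
    return out

original = np.full((5, 5), 10, dtype=np.uint8)
original[2, 2] = 255                 # saturated pixel: its frame-to-frame
                                     # difference is ~0, so thresholding misses it
flagged = np.zeros((5, 5), dtype=bool)
flagged[1, 2] = True                 # surrounding pixel caught by thresholding
recovered = recover_saturated(original, flagged)
```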
According to some embodiments of the present invention, a location of an optical tag is tracked based on output from a saturated and/or near-saturated pixel, e.g. pixels 810 (block 930), while trends in changing intensity levels are determined based on identified pixels with an absolute value above the defined threshold, e.g. pixels 820 (block 940). Optionally, saturated pixels are ignored and tracking is based only on pixels identified in response to thresholding.
According to some embodiments of the present invention, output from the new video stream is used to decode output from one or more optical tags (block 950). In some exemplary embodiments, decoding of the tags provides information for identifying the tracked objects on which the tags are positioned (block 960). Optionally, additional or other information is provided by the coded signals. Reference is now made to FIG. 10 showing an exemplary schematic illustration of an alternate optical tracking system tracking herd animals confined within a defined area in accordance with some embodiments of the present invention. According to some embodiments of the present invention, optical tracking system 110 includes optical tags 55 fixed on an animal 20 that reflect an optical signal in response to emission by one or more light sources 99. According to some embodiments of the present invention, light reflected from optical tags 55 is captured by one or more cameras 80 viewing the defined area. Output from camera(s) 80 is transmitted to a local processing unit 76 that processes information received from optical tags 55. Typically, local processing unit 76 identifies and tracks position and movement of tags 55 and provides output to a central processing unit 70 for further processing. Output from optical tracking system 110 is reported to a user with one or more output devices 72 and/or transmitted to a controller 73 operative to control a device associated with the animals, e.g. a controller for operating a gateway, fans, and an automatic feeder. Communication between local processing unit 76, cameras 80, central processing unit 70, output device 72 and controller 73 may be tethered and/or wireless.
Typically, optical tracking system 110 is adapted to be used in outdoor lighting and temperature conditions as described in reference to FIGS. 1-2.
In some exemplary embodiments, a black and white video camera, e.g. a video camera without RGB optical filters, is used. Optionally, optical filters typically used to provide color images are replaced by narrow band filters selecting bandwidths spanning 5-50 nm in width within a range of 850-950 nm. In some exemplary embodiments, camera(s) 80 including narrow bandwidth filters are operated using short exposure periods, e.g. 1-10 msec, e.g. 2 msec, as described in reference to FIGS. 1 and 2.
According to some embodiments of the present invention, optical tags 55 include at least one retro-reflector (typically known to reflect light generally back to its source). In some exemplary embodiments, optical tag 55 includes at least one filter, e.g. a narrow band filter in the NIR range, that reflects light in a narrow bandwidth defined by the filter. In some exemplary embodiments, optical tags 55 are identified by optical tracking system 110 based on the wavelength of light reflected from them and captured by camera(s) 80. Optionally, optical tag 55 includes a liquid crystal shutter that is programmed to pulse reflection of optical tag 55. Optionally, the liquid crystal shutter is pulsed with an encoded pulse.
According to some embodiments of the present invention, one or more light sources 99 emit light in the NIR range, e.g. over a band of 850-1000 nm. Optionally, light sources 99 are positioned proximal to camera 80 so that light reaching tags 55 can be reflected back toward camera 80, e.g. in the case that retro-reflectors are used. Typically, camera 80 is protected so that light from light source 99 is not directly received by camera 80. Optionally, a partially reflecting surface is used so that light from light source 99 can effectively be emitted from the point at which camera 80 captures the reflected light.
According to some embodiments of the present invention, local processing unit 76 receives captured video streams from camera(s) 80, identifies wavelengths received from tag(s) 55, tracks position and movement of the tags and decodes information transmitted by the tags over time. According to some embodiments of the present invention, cameras 80 include a flywheel filter for sequentially filtering narrow band wavelengths within an NIR range. Optionally, camera 80 includes filters similar to RGB filters in a color video camera for filtering narrow band wavelengths in the NIR range. Optionally, different cameras with a same field of view include dedicated filters for filtering at different narrow band wavelengths.
Reference is now made to FIGS. 11A, 11B, and 11C showing exemplary schematic diagrams of tags including a reflector in accordance with some embodiments of the present invention. According to some embodiments of the present invention, each of tags 55, 55' and 55" includes at least a reflector 510. Optionally, the reflector is a retro-reflector. Optionally, different sizes and shapes of reflector 510 are used to distinguish between different tags 55'. Optionally, tags 55 and 55' are passive tags, e.g. they do not require a power source.
Referring now to FIG. 11B, in some exemplary embodiments, a tag 55 includes one or more narrow band filters 515 covering an upper surface of the reflector 510 that operate to reflect a specific band of light in the NIR range. Optionally, a plurality of different filters is positioned side by side over reflector 510. Typically, for tag 55, identification is based on one or more identified wavelengths of reflected light corresponding to one or more filters (or no filters) positioned over reflector 510. Referring now to FIG. 11C, in some exemplary embodiments, tag 55" is an active tag that includes a liquid crystal shutter that alternately covers and exposes reflector 510 in response to an electric signal provided by a controller 520. Optionally, tag 55" includes a power supply 52 for activating active filter 516 and/or controller 520.
Reference is now made to FIG. 12 showing a simplified flow chart of an exemplary method for tracking and identifying an optical tag reflecting received light in accordance with some embodiments of the present invention. According to some embodiments of the present invention, one or more light emitters emit light over a predefined area (block 610). Optionally, the light emitters emit light within the NIR range. According to some embodiments of the present invention, one or more optical tags including reflectors, e.g. retro-reflectors, reflect light toward one or more cameras with a field of view of the defined area (block 620). Optionally, one or more optical tags include one or more optical filters and reflect light within one or more specified narrow band wavelengths. According to some embodiments of the present invention, light reflected off of one or more optical tags is captured by cameras 80 over a video stream of frames (block 630). According to some embodiments of the present invention, areas in captured frames receiving light from one or more optical tags are identified (block 640). In some exemplary embodiments, the wavelengths of the received light are used to identify the optical tag associated with each identified area (block 650). Optionally, pulsing of the reflected signal is decoded and used to identify the optical tag. Typically, a location of an identified optical tag is tracked over the captured video stream (block 660). Optionally, the tagged object is tracked over the video stream (block 660). According to some embodiments of the present invention, tag 55 is identified based on output from a single frame. Optionally, output over a few frames is used to identify tag 55.
According to some embodiments of the present invention, each animal of the herd is individually identified by a dedicated code transmitted by an optical tag 50 (or tag 55). Optionally, specific types of cows, e.g. sharing a particular characteristic, are identified with a same code. For example, all cows within a specified age range, of a particular breed, and/or having a particular health condition may be tagged with the same code and tracked as a group. Optionally, objects other than the herd animals, e.g. a gate, are tagged and information based on positioning of these objects is used for herd management. Optionally, the tagged object is a sensor, e.g. a temperature sensor, and output from the sensor is optically transmitted to cameras 80 for decoding.
Herd Management Systems and Exemplary Applications
According to some embodiments of the present invention, the optical tracking system is used for herd management. In some exemplary embodiments, one or more parameters are defined to determine and track a general well-being of a herd and/or a comfort level of the herd housed in an enclosure.
Reference is now made to FIGS. 13A and 13B showing an exemplary schematic illustration of a distribution pattern of a herd identified by a herd management system in accordance with some embodiments of the present invention. According to some embodiments of the present invention, one or more cameras 80 capture an image stream of tags 50 positioned on herd animals 20. According to some embodiments of the present invention, output from cameras 80 is analyzed, e.g. by optical tracking system 100 or 110, to identify atypical grouping of the herd. In some exemplary embodiments, an assumption is made that typically herd animals 20 distribute themselves substantially evenly across an enclosure 85 as shown in FIG. 13A and any atypical distribution of herd animals 20, e.g. distribution other than substantially even distribution within enclosure 85, is reported. Optionally, cluster analysis is used to characterize distribution of herd animals 20. Optionally, an absence of herd animals in a specific location triggers reporting. Optionally, output from cameras 80 is analyzed, e.g. by optical tracking system 100 or 110, to locate positions of individual animals of the herd.
In some exemplary embodiments, grouping of herd animals 20 in a specific area of an enclosure 85, as shown in FIG. 13B, is identified by optical tracking system 100 (or optical tracking system 110) and reported. Optionally, grouping of herd animals 20 in a specific location, e.g. at a specific part of the day, may indicate that that specific location has favorable conditions for herd animals 20 and/or that a vacated portion of the enclosure 87 has unfavorable conditions for herd animals 20, e.g. exposure to sun, noise, and/or wind. Knowledge of atypical behavior of the herd may alert the farmer to changes required in housing of the herd animals. Optionally, an image is captured in response to determining such an event.
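One simple way to flag atypical grouping of the kind described above is to compare grid-cell occupancy against a uniform expectation. This is a hedged sketch, not the patent's method: the grid size and the deviation threshold are illustrative assumptions.

```python
def occupancy(positions, bounds, grid=(2, 2)):
    """Count animals per grid cell; positions are (x, y), bounds is (width, height)."""
    counts = [[0] * grid[0] for _ in range(grid[1])]
    for x, y in positions:
        cx = min(int(x / bounds[0] * grid[0]), grid[0] - 1)
        cy = min(int(y / bounds[1] * grid[1]), grid[1] - 1)
        counts[cy][cx] += 1
    return counts

def is_atypical(counts, max_ratio=0.6):
    """Report when one cell holds a disproportionate share of the herd."""
    flat = [c for row in counts for c in row]
    total = sum(flat)
    return total > 0 and max(flat) / total > max_ratio

even = occupancy([(1, 1), (9, 1), (1, 9), (9, 9)], bounds=(10, 10))
packed = occupancy([(1, 1), (2, 1), (1, 2), (2, 2)], bounds=(10, 10))
```

A fuller implementation might use cluster analysis, as the text mentions; the grid test is a minimal stand-in for the same idea.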
Reference is now made to FIGS. 14A-14B showing an exemplary schematic illustration of typical and atypical behavior of a herd animal identified by a herd management system in accordance with some embodiments of the present invention. According to some embodiments of the present invention, a herd management system identifies an individual animal 21 displaying atypical behavior as compared to the rest of a herd 29. Optionally, atypical behavior of an animal 21 as compared to a group 29 may give an indication that animal 21 is discomforted by an ailment. In some exemplary embodiments, the herd management system identifies groups of animals, e.g. group 29 (FIG. 14A), uses the grouping to identify individual animals 21 that are disassociated from a group 29 (FIG. 14B) and reports the event to a user. Optionally, in response to identifying an atypical behavior of animal 21, a full image is captured of animal 21 and stored for reporting. In some exemplary embodiments, an identity of animal 21 is determined by the optical tracking system. Optionally, a user identifies animal 21 based on the captured image.
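Disassociation from the group, as described above, can be sketched as distance from the herd centroid relative to the herd's typical spread. The spread multiple `k` is a hypothetical parameter for illustration:

```python
import math

def stray_animals(positions, k=2.0):
    """Return indices of animals much farther from the centroid than average."""
    n = len(positions)
    cx = sum(x for x, _ in positions) / n
    cy = sum(y for _, y in positions) / n
    dists = [math.hypot(x - cx, y - cy) for x, y in positions]
    mean_d = sum(dists) / n
    return [i for i, d in enumerate(dists) if mean_d > 0 and d > k * mean_d]

# Four clustered animals plus one far away: the distant animal is flagged.
herd = [(0, 0), (1, 0), (0, 1), (1, 1), (20, 20)]
```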
Reference is now made to FIG. 15 showing an exemplary schematic illustration of a behavior pattern of specific animals of a herd identified by a herd management system in accordance with some embodiments of the present invention. According to some embodiments of the present invention, a herd management system identifies a leader 22 in a group 29. In some exemplary embodiments, during movement of a group of animals 29, an animal 22 that heads the movement is identified and reported. Typically, knowledge regarding leaders of a herd is useful in herd management.
Reference is now made to FIG. 16 showing an exemplary schematic illustration of a behavior pattern of herd animals at a specific location identified by a herd management system in accordance with some embodiments of the present invention. In some exemplary embodiments, a behavior pattern of one or more animals 20 at a specific location 88 is tracked. According to some embodiments of the present invention, a portion of an image frame is associated with a specific location 88 and optical signals received from location 88 are captured by a camera 80 and analyzed. In some exemplary embodiments, the number of times an animal 20 approaches a water and/or food stall 11 is tracked. Optionally, the duration of time spent in a particular stall 11, e.g. food or water, is determined.
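The visit-counting and dwell-time logic described above can be sketched on a tag's per-frame track. The region coordinates and 60 Hz frame interval are illustrative assumptions:

```python
def stall_stats(track, region, frame_interval_s=1 / 60):
    """track: per-frame (x, y); region: (x0, y0, x1, y1). Returns (visits, seconds)."""
    x0, y0, x1, y1 = region
    visits, frames_in, inside_prev = 0, 0, False
    for x, y in track:
        inside = x0 <= x <= x1 and y0 <= y <= y1
        if inside:
            frames_in += 1
            if not inside_prev:
                visits += 1      # a new approach to the stall
        inside_prev = inside
    return visits, frames_in * frame_interval_s

# A tag enters the stall region twice across five frames.
track = [(0, 0), (5, 5), (5, 5), (0, 0), (5, 5)]
visits, seconds = stall_stats(track, region=(4, 4, 6, 6))
```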
Reference is now made to FIG. 17 showing a simplified flow chart of an exemplary method for identifying atypical behavior of animals in a herd in accordance with some embodiments of the present invention. According to some embodiments of the present invention, an optical tracking system learns typical behavior patterns of a herd housed in an enclosure, e.g. enclosure 85 (block 360). Optionally, learning is provided by machine learning applied to the on-site herd or to a herd of the same animals during a controlled study. Optionally, typical behavior of the herd is learned based on input from a user.
According to some embodiments of the present invention, cows associated with a herd are identified and tracked (block 365). In some exemplary embodiments, the optical tracking system identifies any straying from the learned typical behavior. Optionally, the optical tracking system only identifies pre-selected types of atypical events, e.g. gathering of the herd in one specific location or straying of an individual animal from the herd. Optionally, animals associated with an atypical event, e.g. an animal straying from the herd, are identified and recorded (block 370). Optionally, a full and/or still image is captured in response to identifying an atypical event (block 375). According to some embodiments of the present invention, the atypical event, e.g. straying from identified typical behavior, is reported (block 380).
Reference is now made to FIG. 18 showing a simplified block diagram of an exemplary herd management system according to some embodiments of the present invention. According to some embodiments of the present invention, herd management system 120 is similar to optical tracking system 100 (or optical tracking system 110) and has an additional unit for detecting and analyzing grouping of animals 20 tracked by optical tags 50. According to some embodiments of the present invention, output from local processing unit 76 is used by group analysis unit 111 to define or detect behavior patterns of groups of animals and/or of individual animals as compared to a group. Optionally, group analysis unit 111 is used to identify any atypical behavior of an animal 20 tracked by optical tag 50. In some exemplary embodiments, group analysis unit 111 is used to identify socially rejected animals 20, e.g. based on identifying a cow that gets near an eating area but is obstructed by other animals from getting to the food. It is noted that although the herd management unit has been described mostly as receiving input from optical tracking system 100, the herd management unit may similarly operate with input provided by optical tracking system 110 or another RTLS.
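The proximity-based grouping performed by group analysis unit 111 can be sketched as connected components over a "closer than some radius" relation between tracked tags. The radius and positions below are illustrative assumptions:

```python
import math

# Sketch: groups defined by proximity -- animals closer than an assumed
# radius are linked, and each group is a connected component (union-find).
# The radius and the sample positions are hypothetical.

def groups_by_proximity(positions, radius=5.0):
    tags = list(positions)
    parent = {t: t for t in tags}

    def find(t):
        while parent[t] != t:
            parent[t] = parent[parent[t]]  # path compression
            t = parent[t]
        return t

    for i, a in enumerate(tags):
        for b in tags[i + 1:]:
            (ax, ay), (bx, by) = positions[a], positions[b]
            if math.hypot(ax - bx, ay - by) <= radius:
                parent[find(a)] = find(b)  # merge the two components

    groups = {}
    for t in tags:
        groups.setdefault(find(t), []).append(t)
    return sorted(groups.values(), key=len, reverse=True)

positions = {"cow_1": (0, 0), "cow_2": (3, 0),
             "cow_3": (6, 0), "cow_4": (30, 30)}
print(groups_by_proximity(positions))
# -> [['cow_1', 'cow_2', 'cow_3'], ['cow_4']]
```

Group membership tracked per frame in this way supports the downstream analyses described above, e.g. a persistently singleton group near the eating area as one possible cue for a socially rejected animal.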
It is noted that although the optical tracking systems described herein have been described mostly in reference to tracking and managing cows, persons skilled in the art will appreciate that the same and/or a similar system can be applied to tracking and managing animals other than cows, or to tracking and managing objects other than animals. It is additionally noted that although the optical tracking system is adapted for use in outdoor lighting and temperature conditions, persons skilled in the art will appreciate that the same system can also be used indoors.
Throughout this application, various embodiments of this invention may be presented in a range format. It should be understood that the description in range format is merely for convenience and brevity and should not be construed as an inflexible limitation on the scope of the invention. Accordingly, the description of a range should be considered to have specifically disclosed all the possible subranges as well as individual numerical values within that range. For example, description of a range such as from 1 to 6 should be considered to have specifically disclosed subranges such as from 1 to 3, from 1 to 4, from 1 to 5, from 2 to 4, from 2 to 6, from 3 to 6 etc., as well as individual numbers within that range, for example, 1, 2, 3, 4, 5, and 6. This applies regardless of the breadth of the range.
The terms "comprises", "comprising", "includes", "including", "having" and their conjugates mean "including but not limited to".
The term "consisting of" means "including and limited to".
The term "consisting essentially of" means that the composition, method or structure may include additional ingredients, steps and/or parts, but only if the additional ingredients, steps and/or parts do not materially alter the basic and novel characteristics of the claimed composition, method or structure.
It is appreciated that certain features of the invention, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the invention, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable sub-combination or as suitable in any other described embodiment of the invention. Certain features described in the context of various embodiments are not to be considered essential features of those embodiments, unless the embodiment is inoperative without those elements.

Claims

WHAT IS CLAIMED IS:
1. An optical tracking system comprising:
at least one optical emitter adapted to emit an optical signal including a modulated pattern of pulsed light;
at least one camera with a field of view directed toward a pre-defined area operable to capture radiation from the optical signal over a stream of captured image frames; and
a processing unit adapted to receive input from the at least one camera and to identify an area on the image frames where input was received from the optical signal based on differences in intensities between contiguous frames in the stream that receive input from the optical signal.
2. The optical tracking system of claim 1, wherein the at least one optical emitter and the at least one camera are asynchronous.
3. The optical tracking system of claim 1 or claim 2, wherein the at least one optical emitter is adapted to emit light in a near infrared wavelength.
4. The optical tracking system of claim 3, wherein the near infrared wavelength is between 800-1000 nanometers.
5. The optical tracking system of claim 3 or claim 4, wherein the at least one optical emitter includes a filter adapted to transmit light over a selected bandwidth.
6. The optical tracking system of any of claims 1-5, wherein the modulated pattern of pulsed light provides an information code.
7. The optical tracking system of claim 6, wherein the information code includes at least one bit of information, the bit defined by a trend in intensity over a pre-defined number of image frames, wherein the pre-defined number is greater than 2 image frames.
8. The optical tracking system of claim 7, wherein the bit is defined based on a trend in intensity of pixels neighboring pixels with saturated output.
9. The optical tracking system of claim 6 or claim 7, wherein each trend is determined over at least four contiguous frames.
10. The optical tracking system of any of claims 6-9, wherein the information code is a digital code.
11. The optical tracking system of any of claims 7-10, wherein the trend in intensity is selected from a group including: an increasing trend in intensity and a decreasing trend in intensity.
12. The optical tracking system of claim 11, wherein the trend is defined by a slope of increasing or decreasing intensity.
13. The optical tracking system of any of claims 1-5, wherein the processing unit is adapted to generate a new stream of frames constructed by subtracting pixel values from pairs of contiguous frames, wherein pixel values of image frames in the new stream include both positive and negative pixel values.
14. The optical tracking system of claim 13, wherein the modulated pattern of pulsed light provides an information code, wherein the information code includes at least one bit of information, the bit defined by pixel values that have a same sign (positive or negative) over a pre-defined number of image frames.
15. The optical tracking system of any of claims 6-14, wherein the information code is an identification code.
16. The optical tracking system of any of claims 6-15, wherein the information code is information received from a sensor in communication with the at least one optical emitter.
17. The optical tracking system of any of claims 1-16, wherein the optical tracking system provides a bit rate of 15-30 bits per second.
18. The optical tracking system of any of claims 1-17, wherein the pattern of pulses is modulated by at least one of pulse width modulation and frequency modulation.
19. The optical tracking system of any of claims 1-18, wherein a frequency of the pulsing is at least four times faster than a frame capture rate of the camera.
20. The optical tracking system of any of claims 1-19, wherein the exposure period for capturing frames in the stream is between 10 μsec and 10 msec.
21. The optical tracking system of any of claims 1-20, wherein the pre-defined area spans 100-2500 square meters.
22. The optical tracking system of any of claims 1-21, wherein the optical tracking system is adapted to track herd animals housed in the pre-defined area.
23. The optical tracking system of claim 22, wherein the herd animal is a cow.
24. A method for optical tracking, the method comprising:
emitting an optical signal including a modulated pattern of pulsed light from an optical tag;
capturing radiation from the optical signal over a stream of captured image frames; and
identifying an area on the image frames where input was received from the optical signal based on differences in intensities between contiguous frames in the stream that receive input from the optical signal.
25. The method of claim 24, wherein the emitting and the capturing is asynchronous.
26. The method of claim 24 or claim 25, wherein the optical signal is emitted in the near infrared range.
27. The method of any of claims 24-26, wherein the modulated pattern of pulsed light provides an information code, wherein the information code includes at least one bit of information, the bit defined by a trend in intensity over a pre-defined number of image frames, wherein the pre-defined number is greater than 2 image frames.
28. The method of claim 24 or claim 27, wherein the optical signal is adapted to be emitted and received in an area exposed to outdoor lighting and temperature conditions.
29. An optical tracking system comprising:
at least one optical tag adapted to transmit an optical signal in a range between 800-1000 nanometers;
at least one camera adapted to capture radiation from the optical signal at a distance between 3-50 meters from the optical tag over a stream of image frames, wherein at least a portion of the frames are captured over an exposure period ranging between 1-10 msec; and
a processing unit adapted to receive input from the at least one camera and identify an area on the image frames where input was received from the optical signal and to determine a location of the optical tag in the pre-defined area.
30. The optical tracking system of claim 29, wherein the at least one optical tag transmits an optical signal in a selected sub-range of the range, wherein a bandwidth of the sub-range is 5-20 nanometers wide.
31. The optical tracking system of claim 30, wherein the at least one camera includes a plurality of filters, each filter adapted to filter radiation received from the optical signal over a different sub-range of the range.
32. The optical tracking system of claim 31, wherein the plurality of filters is applied sequentially over different image frames.
33. The optical tracking system of claim 31, wherein the plurality of filters is applied to different pixels in the same image frame.
34. The optical tracking system of any of claims 30-33, wherein the processing unit is adapted to determine the sub-range of the input and to identify the optical tag based on the sub-range.
35. The optical tracking system of any of claims 29-34, wherein the optical tag includes a reflector and wherein the transmitted optical signal is an optical signal reflected from a light source.
36. The optical tracking system of claim 35, wherein the reflector is a retro-reflector.
37. The optical tracking system of claim 35 or claim 36, wherein the reflector includes a filter adapted to filter light received from the light source over a sub-range of the range.
38. The optical tracking system of claim 37, wherein the filter is a passive element.
39. The optical tracking system of any of claims 29-38, wherein the camera is adapted to periodically capture an image frame using an exposure period longer than 50 msec.
40. The optical tracking system of any of claims 29-34, wherein the at least one optical tag includes at least one optical emitter adapted to emit pulsed light.
41. The optical tracking system of claim 40, wherein the processing unit is adapted to identify an area on the image frames where input was received from the optical signal based on differences in intensities between contiguous frames in the stream.
42. The optical tracking system of claim 40 or claim 41, wherein the at least one optical tag and the at least one camera are asynchronous.
43. The optical tracking system of any of claims 40-42, wherein modulation of pulsing of the pulsed light provides an information code.
44. The optical tracking system of claim 43, wherein the information code includes a plurality of bits of information, wherein each bit is defined by a trend in intensity defined over a pre-defined number of frames.
45. A method for optical tracking, the method comprising:
transmitting an optical signal in a range between 800-1000 nanometers with an optical tag;
capturing radiation from the optical signal at a distance between 3-50 meters from the optical tag over a stream of image frames over an exposure period ranging between 1-10 msec; and
identifying coordinates on the image frames where input was received from the optical signal; and
determining a location of the optical tag based on the identified coordinates.
46. The method of claim 45, wherein the optical signal is emitted in a selected subrange of the range, wherein a bandwidth of the sub-range is 5-20 nanometers wide.
47. The method of claim 45 or claim 46 adapted for tracking herd animals housed in an enclosure exposed to outdoor lighting and temperature conditions.
48. A herd management system for managing a herd housed in an enclosure comprising:
a real time locating system operable to track position and changes in position of a plurality of herd animals housed in an enclosure;
a group analysis unit adapted to identify one or more groups of the herd animals within the enclosure based on input received from the real time locating system, wherein the one or more groups is defined based on a proximity between the herd animals tracked by the real time locating system and to track positions and movement of the one or more groups; and
wherein the group analysis unit is adapted to determine a behavioral pattern of the herd animals based on input received from the real time locating system and on the position and movement of the one or more groups; and
an output device adapted to report a behavioral pattern of the herd animals.
49. The herd management system of claim 48, wherein the real time locating system is adapted to determine identity of each of the herd animals that are tracked.
50. The herd management system of claim 49, wherein the group analysis unit is adapted to track a direction of movement of the one or more groups and to identify a herd animal leading the one or more groups in the direction of movement.
51. The herd management system of claim 49 or claim 50, wherein the group analysis unit is adapted to identify straying of a herd animal from a group.
52. The herd management system of any of claims 49-51, wherein the group analysis unit is adapted to identify a socially rejected cow.
53. The herd management system of any of claims 48-52, wherein the group analysis unit is adapted to identify gathering of the herd animals in specific locations in the enclosure.
54. The herd management system of any of claims 48-53, wherein the group analysis unit is adapted to identify position of individual cows in the herd.
55. The herd management system of any of claims 48-54, wherein the group analysis unit is adapted to learn a typical behavioral pattern of the one or more groups and to identify an atypical behavioral pattern.
56. The herd management system of any of claims 48-55, wherein the real time locating system comprises:
at least one optical tag adapted to transmit an optical signal in a range between 850-950 nanometers;
at least one camera adapted to capture radiation from the optical signal at a distance between 3-50 meters from the optical tag over a stream of image frames, wherein at least a portion of the frames are captured over an exposure period ranging between 1-10 msec; and
a processing unit adapted to receive input from the at least one camera and identify an area on the image frames where input was received from the optical signal and to determine a location of the optical tag in the pre-defined area.
57. The herd management system of any of claims 48-55, wherein the real time locating system comprises:
at least one optical emitter adapted to emit an optical signal including a modulated pattern of pulsed light;
at least one camera with a field of view directed toward a pre-defined area operable to capture radiation from the optical signal over a stream of captured image frames; and
a processing unit adapted to receive input from the at least one camera and to identify an area on the image frames where input was received from the optical signal based on differences in intensities between contiguous frames in the stream that receive input from the optical signal.
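The frame-differencing detection recited in claims 1 and 13 can be illustrated with a minimal sketch: subtract contiguous frames, keeping signed pixel values, and locate pixels whose intensity changes strongly, which is where a pulsed emitter would appear. The frame contents and the change threshold below are hypothetical:

```python
# Sketch of the frame-differencing idea of claims 1 and 13: the "new
# stream" is the signed per-pixel difference of contiguous frames, and
# the emitter area is where that difference is large. Frame values and
# the threshold are illustrative assumptions.

def diff_frames(f0, f1):
    """Signed per-pixel difference f1 - f0 between contiguous frames."""
    return [[b - a for a, b in zip(r0, r1)] for r0, r1 in zip(f0, f1)]

def signal_pixels(diff, threshold=50):
    """(row, col) of pixels whose intensity changed strongly."""
    return [(y, x) for y, row in enumerate(diff)
            for x, v in enumerate(row) if abs(v) >= threshold]

# Two tiny 3x3 frames: a pulsed source at (1, 1) turns on in the second.
frame_a = [[10, 10, 10], [10, 10, 10], [10, 10, 10]]
frame_b = [[10, 12, 10], [10, 200, 10], [10, 10, 11]]

diff = diff_frames(frame_a, frame_b)
print(signal_pixels(diff))  # -> [(1, 1)]
```

Because the difference is signed, the sign of consecutive differences can also carry the bit information described in claim 14 (same sign over a pre-defined number of frames).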
PCT/IL2011/000454 2010-06-10 2011-06-09 Optical tracking system and method for herd management therewith WO2011154949A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US35324010P 2010-06-10 2010-06-10
US61/353,240 2010-06-10

Publications (2)

Publication Number Publication Date
WO2011154949A2 true WO2011154949A2 (en) 2011-12-15
WO2011154949A3 WO2011154949A3 (en) 2012-04-12

Family

ID=44511133

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2011/000454 WO2011154949A2 (en) 2010-06-10 2011-06-09 Optical tracking system and method for herd management therewith

Country Status (1)

Country Link
WO (1) WO2011154949A2 (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040101308A1 (en) 2002-11-27 2004-05-27 Beyette Fred R. Optical communications imager
WO2004102462A2 (en) 2003-05-14 2004-11-25 Precision Location Systems Ltd. Tracking system using optical tags
US20050116821A1 (en) 2003-12-01 2005-06-02 Clifton Labs, Inc. Optical asset tracking system

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0813040A3 (en) * 1996-06-14 1999-05-26 Xerox Corporation Precision spatial mapping with combined video and infrared signals
JPH1074249A (en) * 1996-08-30 1998-03-17 Japan Radio Co Ltd Motion capture system
JP3678404B2 (en) * 2000-05-12 2005-08-03 株式会社東芝 Video information processing device
EP1989926B1 (en) * 2006-03-01 2020-07-08 Lancaster University Business Enterprises Limited Method and apparatus for signal presentation


Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014030156A1 (en) * 2012-08-18 2014-02-27 Scr Engineers Ltd Cow retrieval system
US10986816B2 (en) 2014-03-26 2021-04-27 Scr Engineers Ltd. Livestock location system
US11071279B2 (en) 2014-09-05 2021-07-27 Intervet Inc. Method and system for tracking health in animal populations
US10986817B2 (en) 2014-09-05 2021-04-27 Intervet Inc. Method and system for tracking health in animal populations
US10507063B2 (en) 2014-11-21 2019-12-17 Think Surgical, Inc. Visible light communication system for transmitting data between visual tracking systems and tracking markers
EP3479055A4 (en) * 2016-06-30 2020-02-26 Magicom KFT. Method for identifying and locating a movable object
US10475211B2 (en) 2017-03-13 2019-11-12 Fujitsu Limited Method, information processing apparatus and non-transitory computer-readable storage medium
EP3375282A1 (en) * 2017-03-13 2018-09-19 Fujitsu Limited Method, information processing apparatus and program
NL2019186B1 (en) * 2017-07-05 2019-01-16 N V Nederlandsche Apparatenfabriek Nedap a farm system with position determination for animals.
WO2019009719A1 (en) * 2017-07-05 2019-01-10 N.V. Nederlandsche Apparatenfabriek Nedap A farm system with position determination for animals
US11832584B2 (en) 2018-04-22 2023-12-05 Vence, Corp. Livestock management system and method
US11864529B2 (en) 2018-10-10 2024-01-09 S.C.R. (Engineers) Limited Livestock dry off method and device
GB2589080A (en) * 2019-11-08 2021-05-26 Ethersec Ind Ltd Surveillance system
GB2589080B (en) * 2019-11-08 2022-01-19 Ethersec Ind Ltd Surveillance system
USD990062S1 (en) 2020-06-18 2023-06-20 S.C.R. (Engineers) Limited Animal ear tag
USD990063S1 (en) 2020-06-18 2023-06-20 S.C.R. (Engineers) Limited Animal ear tag
US11832587B2 (en) 2020-06-18 2023-12-05 S.C.R. (Engineers) Limited Animal tag
NL2025886B1 (en) * 2020-06-23 2022-02-21 Nedap Nv Method for reading a label identification mark from a label, as well as label.
US11960957B2 (en) 2020-11-25 2024-04-16 Identigen Limited System and method for tracing members of an animal population
GB2603131B (en) * 2021-01-26 2023-06-07 Ethersec Ind Ltd Surveillance system
WO2022161900A1 (en) * 2021-01-26 2022-08-04 Ethersec Industries Ltd Surveillance system
GB2603131A (en) * 2021-01-26 2022-08-03 Ethersec Ind Ltd Surveillance system
US11963515B2 (en) 2021-02-24 2024-04-23 S.C.R. (Engineers) Limited Livestock location system

Also Published As

Publication number Publication date
WO2011154949A3 (en) 2012-04-12

Similar Documents

Publication Publication Date Title
WO2011154949A2 (en) Optical tracking system and method for herd management therewith
US9277878B2 (en) Image processing sensor systems
US7395966B2 (en) Tracking system using optical tags
US7411497B2 (en) System and method for intruder detection
US9740921B2 (en) Image processing sensor systems
US20050116821A1 (en) Optical asset tracking system
US7504956B2 (en) System and method for pest detection
KR101716365B1 (en) Module-based intelligent video surveillance system and antitheft method for real-time detection of livestock theft
US11938614B2 (en) Control device for robot to tease pet and mobile robot
US9235795B2 (en) Optical system and method for monitoring and locating objects
CN109479746A (en) A kind of robot is funny to dote on control method and chip
CN113273179B (en) Device for managing farm environment
JP5042177B2 (en) Image sensor
KR20190143518A (en) Apparatus and method for determining abnormal object
WO2018082931A1 (en) Method and apparatus for free-space optical transmission
US20230011926A1 (en) Wearable Badge and Method for Determining a Type of an Identification Card Removably in said Badge
González et al. Real-time monitoring of poultry activity in breeding farms
Ko et al. Embedded imagers: Detecting, localizing, and recognizing objects and events in natural habitats
CA3110492A1 (en) Autonomous monitoring system
WO2012115558A1 (en) Apparatus and method for tracking a stabled animal
EP3336783A1 (en) Method and light assembly for tracking livestock inside a building
KR20200009335A (en) Apparatus and method for detecting abnormal object and imaging device comprising the same
KR102295989B1 (en) Bio-information transmitter, bio-information receiver, and bio-information communication system
KR102307609B1 (en) Apparatus and method for detecting abnormal object and imaging device comprising the same
WO2016178752A1 (en) Arrangement for and method of differently illuminating targets to be electro-optically read in different day and night modes of operation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11748748

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase in:

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 24.04.2013)

122 Ep: pct application non-entry in european phase

Ref document number: 11748748

Country of ref document: EP

Kind code of ref document: A2