WO2022266705A1 - A system and apparatus for animal management - Google Patents

A system and apparatus for animal management

Info

Publication number
WO2022266705A1
Authority
WO
WIPO (PCT)
Prior art keywords
animal
target animal
power mode
processor
environment
Application number
PCT/AU2022/050627
Other languages
French (fr)
Inventor
John Llewellyn READ
Timothy James Henry Edwards
Original Assignee
Thylation R&D Pty Ltd
Priority claimed from AU2021901865A0
Application filed by Thylation R&D Pty Ltd
Priority to AU2022297009A1
Publication of WO2022266705A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/26 Power supply means, e.g. regulation thereof
    • G06F1/32 Means for saving power
    • G06F1/3203 Power management, i.e. event-based initiation of a power-saving mode
    • G06F1/3206 Monitoring of events, devices or parameters that trigger a change in power modality
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01K ANIMAL HUSBANDRY; CARE OF BIRDS, FISHES, INSECTS; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K29/00 Other apparatus for animal husbandry
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M31/00 Hunting appliances
    • A01M31/002 Detecting animals in a given area
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/26 Power supply means, e.g. regulation thereof
    • G06F1/32 Means for saving power
    • G06F1/3203 Power management, i.e. event-based initiation of a power-saving mode
    • G06F1/3206 Monitoring of events, devices or parameters that trigger a change in power modality
    • G06F1/3231 Monitoring the presence, absence or movement of users
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/26 Power supply means, e.g. regulation thereof
    • G06F1/32 Means for saving power
    • G06F1/3203 Power management, i.e. event-based initiation of a power-saving mode
    • G06F1/3234 Power saving characterised by the action undertaken
    • G06F1/325 Power saving in peripheral device
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/96 Management of image or video recognition tasks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands

Definitions

  • the present application relates to an apparatus adapted to detect and identify animals and/or animal species.
  • Embodiments of the present invention are particularly adapted for the detection, identification, monitoring and management of target animal populations and detection of individual animals within an environment being monitored.
  • the invention is applicable in broader contexts and other applications.
  • systems for detecting animal species include passive infrared (PIR) cameras, which detect heat signatures but are not well adapted to distinguish animals from other non-animal sources of heat such as moving shadows.
  • Other systems for detecting animals involve physical tags or physical mechanisms such as trip-wires or pressure plates and other traps.
  • These systems suffer from a number of deficiencies including but not limited to: the need to capture and tag animals; the inability to precisely and quickly identify a target animal species; and, in cases where the animal is identified, unwanted delays in identifying the target animal species, resulting in the target animal moving out of the area in which the detection occurs. This can be problematic if, for instance, an action such as administering a compound onto the target animal is required.
  • Other problems associated with present systems include a generally low level of accuracy in distinguishing a target animal species, potentially resulting in an action being taken on the wrong species of animal.
  • sensors such as passive infrared sensors and optical sensors that have been employed to monitor animal species have the advantage of remote sensing over wide areas.
  • these systems suffer from a large number of false triggers due to moving objects like blowing vegetation and ambient temperatures approaching animal core temperatures.
  • false triggers, or a lack of triggers when the ambient temperature is close to body temperature, give rise to incorrect tracking and administration of control actions, as well as increasing the overall power consumption of the monitoring devices.
  • the present inventors have identified a need for improved monitoring devices able to quickly and accurately identify target animals, potentially down to the individual animal, while being sufficiently robust and energy efficient to be situated in the field for long periods of time.
  • an apparatus configured to identify a target animal in an environment in real-time, the apparatus including: a housing; one or more image sensors mounted on or within the housing and configured to capture images of a region of the environment; and a processor contained within the housing and configured to process the images to identify a target animal in real-time; wherein the apparatus is configured to operate in a low power mode and a higher power mode; wherein, in the low power mode, the one or more image sensors are controlled to capture images in a low power mode and the processor is configured to detect a change in the environment from the images and, in response, trigger the higher power mode; and wherein, in the higher power mode, the one or more image sensors are controlled to capture images in a higher power mode than in the low power mode and the processor is configured to identify a target animal in the environment.
  • in the low power mode, the one or more image sensors are configured to capture images at a lower resolution and/or a lower frame rate than in the higher power mode.
  • the processor in the higher power mode, is configured to classify a detected animal as a target animal.
  • the apparatus is configured to operate in a second higher power mode in which the processor performs a classification to determine if a detected animal is a target animal.
  • the processor is preferably configured to classify the animal as an individual target animal.
  • the apparatus includes a communications module configured to transmit one or more of the captured images to a remote server and wherein the remote server performs a classification to identify if a detected animal is a target animal.
  • switching between the low power mode and higher power mode occurs within a period of less than 100 milliseconds.
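For illustration only, the following Python sketch models the claimed two-mode behaviour: capture cheaply in the low power mode, escalate when a change in the environment is detected, and identify the target animal in the higher power mode. The class, resolutions, frame rates and sensor interfaces are illustrative assumptions, not elements of the disclosed apparatus.

```python
# Minimal sketch (not the patented implementation) of the two power modes.
# Resolutions, frame rates and the sensor/detector/identifier interfaces
# are illustrative assumptions.

LOW_RES, HIGH_RES = (160, 120), (1920, 1080)  # assumed example resolutions
LOW_FPS, HIGH_FPS = 1, 30                     # assumed example frame rates

class AnimalMonitor:
    """Low power watching; higher power identification on a trigger."""

    def __init__(self, sensor, change_detector, identifier):
        self.sensor = sensor                    # image sensor abstraction
        self.change_detector = change_detector  # cheap environment-change test
        self.identifier = identifier            # expensive target-animal model
        self.mode = "low"

    def step(self):
        if self.mode == "low":
            frame = self.sensor.capture(LOW_RES, LOW_FPS)
            if self.change_detector(frame):
                self.mode = "high"    # switch intended to take < 100 ms
            return None
        frame = self.sensor.capture(HIGH_RES, HIGH_FPS)
        target = self.identifier(frame)         # identify the target animal
        if target is None:
            self.mode = "low"                   # fall back to save power
        return target
```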
  • an apparatus configured to perform real-time identification of an individual target animal in an environment, the apparatus including: a housing; one or more image sensors mounted on or within the housing and configured to capture one or more 2D or 3D images of the environment; a processor contained within the housing and configured to process the one or more images to identify the individual target animal in real time based on one or more detected visual characteristics in the one or more images and, in response, generate an identification data packet; and a communications module configured to communicate the identification data packet to a remote server which is in communication with other apparatus.
  • the apparatus includes a GPS device configured to determine a location of the apparatus.
  • At least one of the one or more image sensors is a thermal image sensor capable of imaging in the infrared wavelength range to detect thermal characteristics.
  • the thermal image sensor may be calibrated to detect a temperature of the target animal such as an average temperature.
  • the processor is configured to use the temperature of the target animal and background to determine a health status of the target animal.
  • the apparatus includes an acoustic sensor to detect acoustic characteristics within the environment.
  • the processor is configured to detect a presence of or identify the target animal at least in part by one or more acoustic characteristics that match or closely match an acoustic characteristic indicative of the target animal.
  • the apparatus includes a particle detector or chemical analyser configured to detect scent characteristics within the environment.
  • the processor is preferably configured to detect a presence of or identify the target animal at least in part by determining one or more scent signatures of the target animal based on the detected scent characteristics.
  • the apparatus includes a battery and the one or more image sensors and the processor are powered locally by the battery.
  • the processor is implemented on a system-on-chip (SoC) device.
  • SoC system-on-chip
  • the processor and/or other hardware are incorporated into an embedded hardware system.
  • the one or more image sensors are also implemented on an SoC device.
  • the processor includes a clock to determine the current time at the location. In some embodiments, the processor is configured to execute a machine learned classifier to identify the target animal. In some embodiments, the processor is configured to execute a neural network classifier algorithm.
  • the detected visual characteristics include a shape of the target animal. In some embodiments, the detected visual characteristics include one or more predefined movements or behavioural characteristics of the target animal over a plurality of images. In some embodiments, the detected visual characteristics include thermal characteristics of the target animal. In some embodiments, the detected visual characteristics include a brightness or reflectivity or absorbance characteristic of the target animal. In some embodiments, the detected visual characteristics include a distinct colour and/or marking of the target animal.
  • the target animal includes a predefined animal species.
  • the target animal is an individual animal.
  • the individual animal is a unique individual that can be distinguished from all other animals.
  • the apparatus includes a wireless identifier configured to detect pre-stored data from a wireless tag associated with one or more animals.
  • the apparatus is configured to operate in one of a low power mode or a high power mode.
  • the one or more image sensors are configured to capture images at a lower frame rate and/or a lower resolution.
  • the high power mode is preferably activated by a trigger signal.
  • the trigger signal may be based on a mechanical or electromechanical trigger associated with the apparatus.
  • the trigger signal is based on detection of one or more trigger visual characteristics in the one or more images.
  • the one or more trigger visual characteristics include movement characteristics in the images.
  • the one or more trigger visual characteristics include detection of a predefined shape or shapes in the images.
  • the one or more trigger visual characteristics include detection of a predefined temperature or brightness in the images.
  • the trigger signal is based on detection of one or more trigger acoustic characteristics detected by an acoustic sensor.
  • the one or more image sensors are deactivated.
  • an acoustic sensor is configured to sense acoustic signals and the processor is configured to process the acoustic signals to identify one or more acoustic sounds indicative of an animal or a target animal.
  • the apparatus includes an illumination device configured to selectively illuminate at least a part of the environment.
  • the apparatus includes one or more of a light level sensor, humidity sensor and/or a temperature sensor.
  • the apparatus includes an actuation device responsive to a sensor signal generated from the processor for initiating an action in response to identification of the target animal.
  • the actuation device includes a dispenser for dispensing a compound onto the target animal.
  • the actuation device includes a visual or acoustic stimulus actuated in response to the identification of the target animal.
  • the apparatus includes a sound generator adapted to generate sounds to lure the target animal.
  • the apparatus includes a visual lure to lure the target animal toward the apparatus.
  • the communications module is further adapted to communicate the characteristics and/or the presence of a target animal to other nearby apparatus for detecting or identifying animals.
  • the communications device is adapted to send and receive data to a remote database at predetermined time periods.
  • the apparatus is incorporated into a drone or UAV which is controllably moveable around the environment.
  • an actuation device including one or more actuators responsive to a sensor signal generated from an apparatus of the first or second aspect.
  • a system configured to detect a presence of a target animal in an environment, the system including a plurality of apparatuses of the second aspect, wherein the communications module is adapted to communicate the identification of a target animal to one or more others of the apparatuses either directly or via a remote server.
  • the communications module is adapted to communicate the identification of a target animal to one or more others of the apparatuses either directly or via a remote server.
  • one or more other apparatus are controlled from a lower power mode into one or more high power modes.
  • an apparatus configured to detect a presence of a target animal in an environment in real-time, the apparatus including: a housing; one or more sensors mounted on or within the housing and configured to sense one or more characteristics of the environment; a processor contained within the housing and configured to process the one or more images and environment characteristics to detect a presence of a target animal in real time; and a communication device adapted to communicate the environment characteristics and/or the presence of a target animal to other nearby apparatus to switch the other nearby apparatus into a monitoring mode.
  • the one or more sensors include one or more of image sensors, acoustic sensors, temperature sensors, particle sensors and/or chemical analysers.
  • an animal monitoring system for monitoring animals within an environment, the system including a plurality of animal monitoring apparatus positioned at spatially separated locations within the environment, each animal monitoring apparatus including: one or more image sensors mounted on or within the housing and configured to capture images of a region of the environment; and a processor contained within the housing and configured to process the images to identify a target animal in real-time; and a communications module for communicating the identification of a target animal to one or more others of the animal monitoring apparatus.
  • the system of the sixth aspect may include a server configured to communicate with the communications modules of the animal monitoring apparatus to receive the identification of a target animal from one of the animal monitoring apparatus.
  • the server upon identification of a target animal from one of the animal monitoring apparatus, is configured to receive images from the animal monitoring apparatus and process those images to classify the identified target animal as a subgroup of the target animals or an individual target animal.
  • upon identification of a target animal by one of the animal monitoring apparatus, the communications module of that apparatus sends a signal to one or more other animal monitoring apparatus to switch that apparatus from a low power mode into a higher power mode.
  • one or more of the animal monitoring apparatus are incorporated onto a drone or UAV device.
  • an apparatus configured to detect a presence of a target animal in an environment in real-time, the apparatus including: a housing; one or more image sensors mounted on or within the housing and configured to capture images of a region of the environment; a processor contained within the housing and configured to process the images to detect a presence of an animal and/or classify an animal in real-time; and a communications device configured to transmit at least a subset of the captured images to a remote server upon detection of an animal by the processor, wherein the remote server is configured to process the images to classify the animal as a target animal.
  • an apparatus configured to identify a target animal in an environment in real-time, the apparatus including: a housing; one or more image sensors mounted on or within the housing and configured to capture images of a region of the environment; an acoustic sensor configured to sense acoustic signals; and a processor contained within the housing and configured to process the acoustic signals to detect acoustic signals indicative of an animal and process the images to identify a target animal in real-time; wherein the apparatus is configured to operate in a low power mode and a higher power mode; wherein, in the low power mode, the one or more image sensors are deactivated and the processor is configured to detect acoustic signals indicative of an animal and, in response, trigger the higher power mode; and wherein, in the higher power mode, the one or more image sensors are activated and controlled to capture images and the processor is configured to identify a target animal in the environment.
  • Figure 1 shows a schematic view of an apparatus configured to detect a presence of a target animal in an environment
  • Figure 2 shows the apparatus in use in the field
  • Figure 3 shows a system of the apparatus of Figure 1 and Figure 2 in use
  • Figure 4 shows a dispensing device in accordance with an embodiment of the invention
  • Figure 5 exemplifies a neural network for detecting the target animal
  • Figure 6 shows a flow chart depicting the process of detecting a target animal in accordance with an embodiment of the invention.
  • Referring to FIG. 1, there is illustrated schematically an apparatus 1000 configured to detect the presence of a target animal (shown in Figure 2 as 3000) in an environment 2000.
  • the apparatus 1000 allows for local and/or remote processing of data inputs such as image and acoustic data with the use of sensors 102, 103, 105.
  • the primary sensors include visible image sensors 102 and infrared image sensors 103 to image a region of an environment 2000 proximal to apparatus 1000.
  • additional sensors 105 may be used to augment the primary sensors and these additional sensors include particle detectors/chemical analyser, acoustic sensors, optical or ultrasonic rangefinder sensors and temperature sensors.
  • Image sensors 102 and 103 may include conventional two dimensional image sensors such as CMOS or CCD arrays, or may include more sophisticated sensors such as phase detect pixel sensors, stereo imaging systems, LIDAR systems, hyperspectral imagers, time of flight cameras, structured light systems and other systems capable of imaging a scene in three dimensions. These more sophisticated imaging devices are capable of extracting depth information from an imaged scene. Depth information allows the determination of distance to an object which, in turn can more accurately allow determination of an animal’s size and speed of movement.
  • the terms individual animal and individual target animal will be used. These terms are intended to refer to a single unique individual animal that can be distinguished from all other animals via distinct characteristics such as physical markings, tags, gait and/or behaviour.
  • the ability to process the data locally allows for real time or near real time processing and evaluations of the data which may otherwise not be possible if the processing was performed remotely.
  • the local processing may also minimize the use of network bandwidth for communications which tends to be limited (and expensive) in remote areas in which the apparatus 1000 is likely to be used.
  • communication with a cloud server or other remote device to perform remote processing may be performed in some instances where higher processing power is required.
  • Referring to FIG. 3, there is illustrated a system 3500 of apparatus 1000 which communicate wirelessly via a central remote server 306. Based on the outputs from the sensors (102, 103, 105), and in particular the image sensors 102, one or more images are processed to identify the target animal 3000 and, in response to that identification, an identification data packet is generated and sent via a communications module 303 to the remote server 306.
  • Server 306 may be a physical server or a cloud-based server in communication with one, many or all of the apparatus 1000 within system 3500.
  • the server 306 is in communications with a network of other apparatus 1000, which may be distributed at spatially separate locations across a geographic region being monitored. In this manner, a network of apparatus 1000 are able to communicate with each other via server 306.
  • the identification data packet is a small set of data capable of alerting server 306 of a potential detection of a target animal. The small data size minimises the bandwidth and power demands on communications module 303.
  • the identification data packet may contain information such as:
  • Device specific data such as a unique identifier, location and current battery power level
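As a rough illustration, an identification data packet of this kind could be serialised as a small JSON payload. Only the device-specific fields named above (unique identifier, location, battery level) come from the text; the timestamp and detection fields are assumptions.

```python
# Hedged sketch of a compact identification data packet. Field names other
# than device id, location and battery level are illustrative assumptions.
import json
import time

def build_identification_packet(device_id, lat, lon, battery_pct,
                                species_guess, confidence):
    packet = {
        "device_id": device_id,                # unique identifier
        "location": {"lat": lat, "lon": lon},  # from the GPS device
        "battery_pct": battery_pct,            # current battery power level
        "timestamp": time.time(),              # assumed: time of detection
        "detection": {                         # assumed: what was detected
            "species": species_guess,
            "confidence": confidence,
        },
    }
    # compact encoding keeps bandwidth and power usage to a minimum
    return json.dumps(packet, separators=(",", ":")).encode("utf-8")
```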
  • apparatus 1000 is capable of detecting a target animal 3000 in the form of an individual animal, an animal species, sub species, cohort, or genus or group of animals or species, sub-species or genus.
  • references to a target animal herein are intended to cover these different options.
  • the apparatus 1000 includes a protective housing 100 where the one or more sensors 102, 103, 105 are mounted on or within the housing 100.
  • Each of the sensors 102, 103, 105 are operably connected to a memory 107 and processor 104 for processing of the data acquired from the sensors 102, 103, 105.
  • memory 107 may include random access memory (RAM), read-only memory (ROM) and/or electrically erasable programmable read-only memory (EEPROM).
  • other equivalent memory or storage systems may be used as should be readily apparent to those skilled in the art.
  • Each sensor 102, 103, 105 is adapted to detect various characteristics of the target animal 3000 such as physical appearance and/or behaviour, the size of the target animal, walking patterns, gait, movement speed and characteristics such as the texture, skin/fur patterns and colour of the target animal 3000.
  • audio sensors are used, audio patterns, animal call sounds and signatures may be captured.
  • Each of the image sensors 102 and 103 are configured to capture one or more images (in either 2D or 3D) of a region of the environment 2000.
  • Each image sensor 102 and 103 may take a variety of forms ranging from a conventional CCD or CMOS visual spectrum image sensor, IR sensor, 3D camera or LIDAR sensor.
  • a near-infrared 2D image sensor 103 with a global shutter paired with a high intensity pulsed near-infrared illumination source may be used.
  • the illumination source is then "flashed" at the same time as the image sensor's global shutter in order to illuminate and capture images from a broad area of the local environment. This allows for a low power draw (around 10 mW or less), minimizing battery usage.
  • a passive system is used in which no illumination device is implemented to further reduce power consumption.
  • each image sensor (102, 103) is adapted to capture two or three dimensional images which, when processed by processor 104, can locally discriminate living creatures from non-living objects.
  • the housing 100 contains a processor 104 which is operably connected to a memory 107 for storage of instructions and data.
  • the processor 104 is configured, inter alia, to process one or more images and inputs from the sensors (102, 103) to detect the presence of the target animal 3000 in real time.
  • Figure 2 exemplifies the use of a single apparatus 1000 in use with a target animal 3000. However, it will be understood that more than one apparatus 1000 would be typically used to aid in the detection of a target animal 3000. A system of such apparatus 1000 is described below.
  • Processor 104 may be implemented as any form of computer processing device or portion of a device that processes electronic data, e.g., from registers and/or memory to transform that electronic data into other electronic data that, e.g., may be stored in registers and/or memory.
  • the processor 104 takes the form of an embedded system or system-on-chip (SoC) device.
  • an SoC device provides the benefit of a single platform onto which the entire computing system can be integrated.
  • the SoC device may further include one or more of the image sensors 102 and 103 and a clock for precisely determining a current time.
  • the clock can provide valuable information input when determining whether a detected animal has been correctly identified.
  • data from the clock can be accessed to register timestamps of detected animal events.
  • a separate clock device may be included in apparatus 1000.
  • Information from the local clock on apparatus 1000 may be used by processor 104 in the classification of animals.
  • this clock information may be used by processor 104 as a measure of confidence or as an input to a classifier algorithm to avoid false positive detections that may otherwise occur.
  • processor 104 may be divided into different modules such as a vision processor functional module for performing computer vision processes and a device controller functional module for performing device control.
  • the functions of the vision processor functional module and device controller functional module may be realized as separate hardware such as microprocessors in conjunction with custom or specialized circuitry.
  • processor 104 is collectively realised as a heterogeneous computing system such as a big.LITTLE architecture in which a smaller, low power processor performs low level processing while a larger, more power-intensive processor performs higher level processing.
  • initial animal detection may be performed by the smaller processor, which may include a small image signal processor (ISP) engine, and subsequent high level animal classification may be performed by the larger processor, which includes a larger ISP engine. A sketch of this split follows below.
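A rough sketch of that split, assuming a numpy frame-difference detector as the small processor's workload and a placeholder classifier as the large processor's workload; names and thresholds are illustrative.

```python
# Hedged sketch of the big.LITTLE split: the cheap detector stands in for
# work done on the smaller processor/ISP engine, the classifier for the
# larger one. Threshold and interfaces are illustrative assumptions.
import numpy as np

def little_stage_detect(frame, background, threshold=12.0):
    """Low power detection: mean absolute difference against background."""
    diff = np.abs(frame.astype(np.float32) - background.astype(np.float32))
    return float(diff.mean()) > threshold

def big_stage_classify(frames, classifier):
    """Higher power classification, run only after a positive detection."""
    return classifier(frames)

# usage: escalate to the larger processor only when the small one fires
# if little_stage_detect(lo_res, background):
#     result = big_stage_classify(hi_res_burst, classifier)
```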
  • the vision processor functional module of processor 104 is configured to process the captured images to perform target animal detection; for example to perform classification based on shape, movement (e.g. speed), colour, size, temperature, thermal signature and other characteristics of objects detected within the imaged environment.
  • the vision processor module may utilize one or more image processing algorithms such as object tracking, edge detection, shape recognition, contour detection and spectral analysis.
  • the device controller functional module of processor 104 is configured to control the various devices of apparatus 1000 such as sensors 102, 103 and 105.
  • the device controller may control a timing, resolution and/or frame rate of image sensors 102 and 103 and may also control the selective illumination of one or more light sources to illuminate the environment being imaged (if the apparatus is fitted with active lighting).
  • the processor 104 is capable of executing a machine learning algorithm such as a neural network classifier 4000 to detect the presence of the target animal.
  • the algorithm 4000 may take a virtually unlimited number of inputs such as data from the imaging device, acoustic inputs, scent data, time data and GPS location data among other possibilities.
  • Figure 5 illustrates exemplary inputs (e.g. 4001) in the form of image sensor input, acoustic sensor input, scent data from a particle or chemical sensor, time data from a clock and GPS location data. Other information such as the time of day may also be input to algorithm 4000. It will be understood that a greater number of inputs will typically result in a higher probability of accurately detecting the target animal species.
  • the inputs are fed to nodes (e.g. 4003) of one or more hidden layers (a single hidden layer 4005 is shown in Figure 5).
  • the nodes represent weighted functions of the inputs wherein the weights of the functions can be varied based on training of the algorithm.
  • the outputs of the nodes are combined at an output 4007 to produce a determination of a presence and/or classification of an animal species (or a determination that no target animal is present).
  • Algorithm 4000 has preferably been trained on a dataset of images, sounds, videos and/or other characteristic data of the target animal or animals.
  • the algorithm 4000 may be static or dynamic to be able to be further trained to improve the classification.
  • the learning of algorithm 4000 may be supervised, semi-supervised or unsupervised.
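For concreteness, a minimal numpy rendering of the Figure 5 network: sensor-derived inputs, one hidden layer of weighted nodes, and a single output. Layer sizes, activation functions and the random initial weights are assumptions; in practice the weights would come from training as described above.

```python
# Minimal sketch of the single-hidden-layer classifier of Figure 5.
# Layer sizes and activations are assumptions; weights shown here are
# random placeholders standing in for trained values.
import numpy as np

rng = np.random.default_rng(0)

N_INPUTS = 5   # e.g. image, acoustic, scent, time and GPS features (4001)
N_HIDDEN = 8   # nodes (4003) of the hidden layer (4005); size assumed

W1 = rng.normal(size=(N_HIDDEN, N_INPUTS))  # would be learned in training
b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(size=N_HIDDEN)
b2 = 0.0

def classify(inputs):
    """Probability that a target animal is present (output 4007)."""
    hidden = np.tanh(W1 @ inputs + b1)      # weighted node functions
    logit = W2 @ hidden + b2
    return 1.0 / (1.0 + np.exp(-logit))     # sigmoid output in [0, 1]

# e.g. classify(np.zeros(N_INPUTS)) returns a probability between 0 and 1
```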
  • the housing 100 may be fabricated from a number of materials suitable for use in an outdoor setting.
  • the housing 100 may be formed from a rigid polymer material that includes UV stabilized polymers to withstand the sun.
  • a metallic material may be used for the housing 100 where greater longevity is required as it would be resistant to UV degradation.
  • a metallic housing 100 made from a material resistant to corrosion such as stainless steel or aluminium would be preferred.
  • a polymer housing is preferred where the housing contains wireless communications equipment.
  • the housing 100 may be formed of other rigid or semi-rigid materials such as wood.
  • the apparatus 1000 includes one or more sensors 102, 103, 105 contained within the housing 100 and positioned to monitor environment 2000.
  • the apparatus 1000 is powered by a battery 110.
  • the battery 110 may be of the rechargeable variety in which case a combination of a solar panel array or small wind turbine can be used to charge the battery. Alternatively, single use batteries may be used where they are periodically replaced when the apparatus is maintained.
  • apparatus 1000 may be connected to mains power and powered by the electricity grid.
  • the solar panel array is configured to convert light incident upon the solar panel array into electrical power, and to store the electrical power in the battery 110. For instance, the solar panel array converts sunshine during daytime into power in order to recharge the battery 110.
  • the solar panel array is located on an outer or top surface of the apparatus 1000. In other embodiments, the solar panel array is positioned remote from the apparatus 1000.
  • the battery 110 can be recharged by a generator or an external power source, can be a replaceable power source (e.g., a replaceable battery that is swapped out periodically), or can be itself located remotely from the apparatus 1000 (for instance, via power lines electrically coupling the battery 110 to the apparatus 1000).
  • Thermal image sensor 103 may utilize IR sensitive pixels to detect thermal characteristics of the target animal species 3000.
  • the thermal image sensors 103 are preferably calibrated to have a high sensitivity at temperatures corresponding to the target animal’s core temperature so as to accurately detect the temperature of the target animal 3000 and/or thermal characteristics of regions of the target animal (e.g. a heat map of the animal).
  • certain target animals 3000 may have well defined body temperatures and, as such, the detected temperature of the detected animal may be used as an input to processor 104 to classify whether the detected animal is a target animal species or not based at least in part on the temperature of the animal.
  • the detected temperature may be used to determine a health status of the target animal 3000. For instance, if a visual classification confirms the identity of the target animal 3000 as a given animal species but the detected temperature is outside the expected range for the animal species, this may be indicative that the target animal is unwell. Such health information can also be useful ecological data to obtain.
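A minimal sketch of that health check, assuming a calibrated temperature reading and an expected core-temperature range per species; the example range is an illustrative placeholder, not a veterinary reference.

```python
# Hedged sketch: flag a visually confirmed target animal whose detected
# temperature falls outside the species' expected range. The default range
# is an illustrative placeholder only.
def health_status(detected_temp_c, expected_range_c=(38.0, 39.5)):
    lo, hi = expected_range_c
    if lo <= detected_temp_c <= hi:
        return "normal"
    return "possibly unwell: temperature outside expected range"

# e.g. health_status(40.8) -> "possibly unwell: ..."
```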
  • visual characteristics of an animal are used to make a determination as to whether the detected animal is the target animal. These visual characteristics include, but are not limited to: a shape of the target animal 3000; colour and/or markings; size; one or more predefined movement characteristics such as the gait of the target animal 3000 over more than one image; a core or average temperature of the target animal 3000, or a temperature distribution across the target animal 3000 as detected by the IR sensor 103; the brightness or reflectivity of the target animal; and any distinct markings that the target animal 3000 may have.
  • the apparatus 1000 includes a GPS device 106 to allow for the determination of the location of the apparatus 1000.
  • the location of the apparatus 1000 may be used to provide important information about the location of the detected target species and the location of the apparatus to assist when it requires service or replacement among other things.
  • In addition to sensors that are sensitive to the visible and IR ranges, other embodiments include the use of an acoustic sensor 108.
  • the acoustic sensor 108 is calibrated to detect the acoustic characteristics of the environment and more specifically, any characteristic noises that the target animal species may make including mating calls and other characteristic sounds of the particular animal species of interest.
  • light level sensors may be used.
  • a humidity sensor may be used.
  • ambient temperature sensors may be used.
  • the apparatus includes a particle detector/chemical analyser 105 which is adapted to detect, in its broadest sense, scent characteristics of the environment and, more specifically, signature characteristics of the target animal 3000 such as pheromones indicative of a certain target animal. Other characteristics such as the animal's sex, health status or pregnancy status may be determined using scent detection.
  • the apparatus 1000 includes a wireless identifier configured to detect wireless signals and pre-stored identification data associated with one or more animals.
  • the wireless identifier may take the form of a Zigbee® based RF tag or Wireless Sensor Network (WSN) technology, which may provide long range, low power wireless tracking of the target animals 3000.
  • the wireless identifier may take the form of a Radio Frequency Identification (RFID) device.
  • the RFID device may be used to detect RFID signals in the environment such as those that may be present in the vicinity of an animal with an embedded RFID chip.
  • the RFID device makes use of active RFID tags, allowing tracking at ranges in the vicinity of hundreds of meters.
  • Local memory 107 storage on the apparatus 1000 allows images and other data related to a detection event to be stored and later retrieved, either manually or via a network, for analysis.
  • the memory 107 may take the form of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), and other equivalent memory or storage systems as should be readily apparent to those skilled in the art.
  • the software/firmware stored on each of the apparatus 1000 may be updated either manually by an operator in the field or more preferably, over the air (OTA) or generally via a network where multiple units can be updated at one time.
  • the apparatus 1000 is configured to operate in a plurality of different power modes.
  • apparatus 1000 may be configured to operate in either a low power mode, allowing for long term monitoring in a low power state, or a higher power active mode, which may be triggered when greater computing power is required.
  • the low power mode operates when the device is in its quiescent state awaiting the detection of a change in the environment such as the presence of a target animal 3000. Once a change in the environment is detected, the device may then operate in the high power mode where greater computing power and additional functions are utilized to determine whether the change in environment is a detection of the target animal 3000 or not. Control of which power mode the apparatus 1000 operates in is performed by processor 104 as described below.
  • the one or more image sensors may operate at a lower frame rate and/or resolution, consuming less processing power and memory and in turn consuming less electrical power.
  • the deactivation of multiple image sensors may be performed in the low power mode further reducing the power consumption in this mode.
  • the transition from low power to high power mode may be instigated by the use of a trigger signal which may take a number of forms such as a visual trigger such as the detection of a shape or movement in the images or a mechanical or electromechanical trigger associated with the apparatus.
  • Other triggers may include the detection of a predefined brightness or temperature in the images.
  • trigger signal is an acoustic trigger, where for instance, the trigger may activate when a particular sound indicative of the target animal species 3000 is detected.
  • one or more illumination devices are included in association with the apparatus 1000. These illumination devices may include one or more LEDs operating in the visible and/or infrared wavelength range. The illumination devices may be controlled by processor 104 to selectively illuminate parts of the environment to modify an animal's behaviour among other things.
  • the apparatus 1000 includes a communications module 303, which is adapted to communicate with other apparatus 1000 positioned in the environment and, additionally, to send and receive data to and from a remote server 306 at predefined time periods for the collection and retrieval of data, among other things.
  • the communications module 303 is adapted to communicate information such as the visual characteristics and/or presence of the target animal to server 306, which can, in turn, communicate with communications modules of other apparatus 1000.
  • the communications module 303 is further adapted to communicate environmental characteristics and/or the presence of a target animal 3000 to other apparatus 1000 via server 306. In some embodiments, communications module 303 is able to communicate with the communications devices of other apparatus directly, bypassing central server 306.
  • the communications module 303 can include a wireless receiver, transmitter, or transceiver hardware.
  • the communications module 303 may also be adapted to transmit other status data such as an energy level of the battery 110 (e.g. state of battery charge) or diagnostic codes in the event of a malfunction.
  • Firmware updates may be performed Over the Air (OTA) using the communications module 303.
  • a variety of wireless protocols may be used, with low power wireless protocols such as Bluetooth, BLE, ZigBee, 2G or 3G being some examples.
  • communications module 303 includes hardware for communicating between devices over a wired network such as USB, Ethernet, twisted pair or coaxial cables.
  • the communications module 303 is adapted to communicate the detected presence of a target animal 3000 to any of the other apparatuses 1000 allowing for the tracking of the target animal and the collection of information as to the target animal's 3000 movements.
  • the detected presence of the target animal 3000 at one of the apparatus 1000 may be communicated to other apparatus within the vicinity, causing them to switch from a lower power mode to a high power mode in anticipation of the target animal 3000 approaching other apparatus 1000 in the vicinity of the apparatus 1000 that initially detected the target animal 3000, as sketched below.
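A rough sketch of that wake-on-detection behaviour under assumed server and radio interfaces; the notification radius and helper names are illustrative.

```python
# Hedged sketch of propagating a detection to nearby apparatus so they
# switch from low power to high power mode. Interfaces are assumptions.
import math

def distance_m(a, b):
    """Approximate ground distance in metres between (lat, lon) points."""
    lat1, lon1 = a
    lat2, lon2 = b
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return 6371000.0 * math.hypot(x, y)

def on_target_detected(detecting_unit, all_units, server, radius_m=500.0):
    server.publish({"event": "target_detected",      # via remote server 306
                    "unit": detecting_unit.id,
                    "location": detecting_unit.location})
    for unit in all_units:
        if unit is detecting_unit:
            continue
        if distance_m(unit.location, detecting_unit.location) <= radius_m:
            unit.set_mode("high")   # pre-arm apparatus in the vicinity
```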
  • apparatus 1000 are stationary devices deployed at specific locations throughout environment 2000.
  • apparatus 1000 may be in the form of a drone or unmanned aerial vehicle (UAV), which can be controlled to move around environment 2000.
  • While apparatus 1000 can be used simply for detection of animals within the environment 2000, in some embodiments apparatus 1000 is advantageously capable of performing various actions in response to detection of a target animal. These responsive actions include:
  • Dispensing a compound such as a poison or medicament from a dispenser on the apparatus 1000;
  • Dispensing a poison compound onto the target animal (for pest animals); and/or
  • Dispensing food or water from a food dispenser on the apparatus 1000.
  • the apparatus further includes an actuation device 402 which is responsive to a sensor signal generated from the processor 104 for initiating a response based on the detection of the target animal 3000.
  • the actuation device 402 further includes a dispenser 404 which is adapted to dispense a compound 406 onto the target animal 3000.
  • the compound is a pharmaceutical substance for medicating the target animal in response to a disease state.
  • the pharmaceutical is a toxin.
  • the dosage of toxin supplied in the pharmaceutical is at least sufficient to incapacitate the target animal, and may cause death instantaneously or delayed relative to the ejection of the pharmaceutical, for instance via toxin-induced anoxia or other physiological effect.
  • the pharmaceutical is sodium fluoroacetate 1080 ("1080"), which is a well-known poison in Australia, and an effective dosage of 1080 may be between 5 mg and 12 mg, which may be supplied as 0.4 ml of 30 g/L 1080 concentrate.
  • 1080 has the advantage that it is present in a number of Australian native flora species and as such, Australian native animals tend to have a higher tolerance to it than introduced species such as feral cats. For instance, a predetermined dose of 1080 can euthanize a feral cat without harming other Australian native species.
  • the pharmaceutical is PAPP and the dosage of the pharmaceutical is between 100 mg and 300 mg.
  • the toxin supplied in the pharmaceutical is delivered within a volume of fluid between approximately 1 ml and 5 ml.
  • the pharmaceutical may be supplied in a viscous form.
  • the pharmaceutical includes a gel formulation.
  • the gel formulation has a consistent viscosity at a range of temperatures and pressures, and can beneficially improve the reliability of the speed, direction, and precision of the ejection of the pharmaceutical.
  • the pharmaceutical includes a grease formulation, or is administered as a spray.
  • a syringe or similar vessel is adapted (e.g., as part of the dispenser 404) to provide separate measured doses of the pharmaceutical for each ejection of the pharmaceutical by the targeting system.
  • a larger vessel (e.g., a canister or tank) provides a constant supply of the pharmaceutical to be applied or ejected in amounts corresponding to single doses per application/ejection.
  • different target animals may receive different doses of the pharmaceutical.
  • the dose of the pharmaceutical applied to a given target animal can be selected, for instance based on the type of the target animal 3000 detected by the apparatus 1000, as sketched below.
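A minimal sketch of dose selection by detected target type. The 1080 and PAPP dose ranges are quoted from the passages above; the species-to-compound mapping, the table structure and the midpoint rule are illustrative assumptions, not dosing guidance.

```python
# Hedged sketch of per-target dose selection. Dose ranges are taken from
# the text above; the species mapping and midpoint rule are assumptions.
DOSE_TABLE_MG = {
    # target type: (compound, min dose in mg, max dose in mg)
    "feral_cat": ("1080", 5, 12),   # effective 1080 dose per the text
    "fox": ("PAPP", 100, 300),      # PAPP dose range per the text
}

def select_dose(target_type):
    """Return (compound, dose_mg) for a classified target, else None."""
    entry = DOSE_TABLE_MG.get(target_type)
    if entry is None:
        return None                   # unknown target: do not dispense
    compound, lo, hi = entry
    return compound, (lo + hi) / 2.0  # midpoint of the range (assumption)
```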
  • the pharmaceutical is enclosed within a frangible membrane designed to rupture upon contact with the target animal 3000.
  • the frangible membrane contains the pharmaceutical and each membrane, which may be in the form of a capsule, pellet, ball, or the like, contains a distinct unit dose of the pharmaceutical.
  • the dispenser 404 can shoot or eject the frangible membrane at the target animal 3000, which ruptures upon contact with the target animal 3000, causing the pharmaceutical enclosed within to contact and/or stick to the coat of the target animal 3000.
  • the frangible membrane is shot at the target animal 3000 at a speed fast enough to ensure the frangible membrane ruptures, but at a speed slow enough to prevent significant pain from being caused to the target animal 3000.
  • the actuation device 402 includes a visual or acoustic (audio lure) stimulus which may be actuated in response to the detection of the target animal 3000.
  • the visual stimulus may include a LED light arrangement which is adapted to generate a pattern of light that is likely to attract or lure the target animal or alternatively scare non-target animals away from the apparatus.
  • An acoustic stimulus may be produced by a sound generator and fed through a loudspeaker mimicking noises characteristic of or attractive to the target animal 3000 which may aid in luring the target animal to a vicinity of the apparatus 1000.
  • the acoustic stimulus is programmable, enabling realistic sounds, including cat prey and mating calls for cats and foxes, to be broadcast at variable volumes and intervals to optimize the luring capacity of the apparatus 1000.
  • the acoustic stimulus can be configured to play at certain times of the day further improving the ability to lure target species.
  • Referring to FIG. 6, there is illustrated a flow chart depicting an exemplary method 6000 of operating apparatus 1000 for the detection of a target animal 3000.
  • the apparatus 1000 enters a low power or quiescent state where the environment 2000 is monitored at a basic level.
  • the low power state may be the default state that apparatus 1000 enters upon initialisation or after a predetermined period of no activity.
  • apparatus 1000 performs basic monitoring of the environment 2000 and only certain functions of apparatus 1000 will be activated.
  • some of the sensors may be deactivated, one or more of the image sensors may operate at a lower frame rate and/or resolution, illumination devices (if installed) may be deactivated or turned down and other functions deactivated.
  • processor 104 may be configured to perform only basic image processing algorithms such as low resolution object detection, brightness variations or edge detection that draw relatively low power.
  • sensors 102 and 103 may be deactivated and a low power sensor such as a motion sensor or acoustic sensor may be used to simply detect motion or sounds within the environment 2000.
  • a basic image classifier may be trained and executed on processor 104 to detect the normal background of environment 2000 being imaged by the image sensors. Minor changes across images, such as movement of branches or sunlight changes throughout the day, may be taken into account by this classifier to reduce the incidence of false triggers. This basic classifier can more accurately determine when an animal enters the environment scene being imaged, which would substantially change the normal background that is imaged. A sketch of such a background monitor follows below.
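A rough sketch of such a background monitor: an exponential running average absorbs slow changes (sunlight drift, swaying branches) while an abrupt deviation flags a candidate animal. The learning rate and threshold are illustrative assumptions.

```python
# Hedged sketch of a low power background-change monitor. Alpha and the
# trigger threshold are illustrative assumptions.
import numpy as np

class BackgroundMonitor:
    def __init__(self, first_frame, alpha=0.02, threshold=15.0):
        self.background = first_frame.astype(np.float32)
        self.alpha = alpha          # slow adaptation absorbs gradual change
        self.threshold = threshold  # mean-difference level that triggers

    def update(self, frame):
        frame = frame.astype(np.float32)
        changed = float(np.mean(np.abs(frame - self.background))) > self.threshold
        # fold the new frame into the background so lighting drift is absorbed
        self.background = (1 - self.alpha) * self.background + self.alpha * frame
        return changed  # True -> candidate trigger for a higher power mode
```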
  • where processor 104 includes multi-stage processor hardware such as in a big.LITTLE architecture, the smaller processor and ISP engine may be used to perform the low power mode processes of step 6001.
  • the low power mode remains active until, at step 6002, a trigger signal is detected by processor 104.
  • the trigger signal may include a trigger from a mechanical device such as a weight sensor, a detection of a shape or motion by one of the image sensors 102 and 103 or detection of a sound by acoustic sensor 105 or a separate motion sensor. Where a basic classifier is implemented by processor 104, the trigger may be upon detection of a change in the normal background that is imaged by image sensor 102 or 103.
  • the trigger signal may also be received from another nearby apparatus or server 306 via communications module 303. For example, if the nearby apparatus has detected a target animal in the environment, it may transmit a trigger signal to apparatus 1000 via server 306 and other nearby apparatus to wake them from their low power state.
  • processor 104 switches apparatus 1000 into a first higher power mode in which additional functionality is activated.
  • This mode, termed “Stage 1” in Figure 6, is configured to allow apparatus 1000 to detect whether an animal is present.
  • the level of processing and power consumption is somewhat higher than that of the low power mode in order to perform the detection.
  • the image sensors may be configured to image the environment at a higher resolution and/or higher frame rate, deactivated sensors may be activated and illumination devices (if installed) may be activated.
  • processor 104 may be configured to implement more comprehensive image processing algorithms such as shape recognition, spectral analysis and a machine learned classifier in order to determine whether or not an animal is present.
  • Stage 1 may include capturing one or more still frame (non video) images and performing image processing on those images. In other embodiments, Stage 1 may include capturing a low frame rate video sequence and performing image processing on that sequence of images.
  • image sensors 102 or 103 may be activated.
  • if the processor 104 registers the current time of day as daylight hours, visible image sensor 102 may be activated and IR sensor 103 deactivated.
  • if processor 104 registers the current time of day as being night time, visible image sensor 102 may be deactivated and IR sensor 103 activated.
  • maintaining low power consumption is still of primary importance as many false triggers may switch apparatus 1000 into Stage 1.
  • This may include processor 104 determining a confidence measure and, if the confidence measure is above a threshold value, designating that an animal has been detected.
  • a threshold confidence value might be 70%, 80%, 90% or 95%. This confidence value may be derived from the detection of known characteristics of animals in comparison to characteristics of general movement within the environment. In many cases, the detected movement will not be due to an animal but rather to motion within the environment. If no animal has been detected, the system returns to step 6001 and apparatus 1000 re-enters the low power mode. If an animal is detected, then system operation proceeds to step 6005, which includes a classification operation to classify the detected animal.
  • Animal detection at step 6004 may occur via a number of techniques including matching recorded data with data stored in a database. This includes basic shape or pattern recognition and comparison with a database of stored animal shapes, acoustic recognition of a stored animal call, thermal signature detection from IR images detected by IR sensor 103, and movement or motion detection amongst others.
  • where processor 104 includes multi-stage processor hardware such as in a big.LITTLE architecture, the larger processor and ISP engine may be used to perform the higher level processing of step 6004.
  • the period from commencement of Stage 1 (upon triggering) to the determination at step 6004 is preferably only a period of milliseconds, such as less than 100 milliseconds. In some embodiments, the period is less than 50 milliseconds or less than 10 milliseconds. This rapid detection is preferable so as to be able to quickly detect a fast moving animal within the field of view of image sensors 102 and 103. A sketch of this gated, time-budgeted decision follows below.
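A minimal sketch of that gated decision, combining one of the example confidence thresholds with the sub-100 millisecond budget; the scoring function is a placeholder assumption.

```python
# Hedged sketch of the Stage 1 animal/no-animal gate. The threshold is one
# of the example values above; score_fn is a placeholder for the cheap
# detector running on the device.
import time

CONFIDENCE_THRESHOLD = 0.90  # e.g. one of 70%, 80%, 90%, 95%
LATENCY_BUDGET_S = 0.100     # decision preferably within 100 ms

def stage1_decide(frame, score_fn):
    """Return True when an animal is detected with sufficient confidence."""
    t0 = time.monotonic()
    confidence = score_fn(frame)     # animal-vs-general-motion score, 0..1
    if time.monotonic() - t0 > LATENCY_BUDGET_S:
        print("warning: Stage 1 decision exceeded latency budget")
    # False -> return to step 6001 (low power); True -> step 6005 (Stage 2)
    return confidence >= CONFIDENCE_THRESHOLD
```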
  • apparatus 1000 does not know what type of animal has been detected; simply that an animal has been detected.
  • This form of low-end processing allows apparatus 1000 to operate at low power and to return to the low power mode if some trigger other than an animal, such as movement of a tree, switches the device into the Stage 1 mode.
  • apparatus 1000 may send a signal via communications module 303 to server 306 or directly to other nearby apparatus to alert them of the animal detection.
  • This alert may, for example, trigger those devices to switch from the low power mode into the Stage 1 or a higher power mode for detecting the animal.
  • apparatus 1000 is switched into a Stage 2 “classification” mode of operation in which a target animal classification process is performed by processor 104.
  • Stage 2 represents a more processor-intensive mode in which a higher level of processing occurs to classify the animal based on input received from the various sensors.
  • Additional devices such as sensors and illumination devices may be activated or switched into a different mode.
  • image sensors 102 and 103 may be activated into higher frame rate and/or higher resolution modes to better capture characteristics of the animal such as shape, physical appearance and/or behaviour, the size of the animal, walking patterns, gait, movement speed and characteristics such as the texture, skin/fur patterns and colour of the animal.
  • Stage 2 represents a higher power mode than Stage 1, which, in turn, draws a higher power than the low power mode of step 6001.
  • Stage 1 detection performs analysis on only a single image frame or a small number of images, while Stage 2 classification includes performing analysis on a sequence of video images. Analysis of video allows processor 104 to determine temporal characteristics of the animal such as movement, gait and behaviour.
  • Stage 2 classification includes a more comprehensive analysis of the data by processor 104.
  • Stage 2 classification includes performing a classifier algorithm such as that described above and illustrated in Figure 5.
  • processor 104 determines whether or not a target animal has been detected in the classification of step 6005. Like with step 6004, this decision may be based on a confidence measure produced with the classification. If the confidence of the detected animal being a target animal is greater than a threshold confidence value, then a decision is made that a target animal has been detected.
  • a threshold confidence value might be 70%, 80%, 90% or 95%.
  • the detection of a target animal at step 6006 may not be limited to a single animal but may include a group of target animals (e.g. fox, feral cat, endangered pygmy possum) that are of interest to be monitored.
  • a plurality of target animals may be stored in memory 107 and accessed by processor 104 for classification.
  • the target animal classification at step 6005 may be simply to detect a species of the target animal or it may be to detect a subset (e.g. male or adult animals only), cohort or individual target animal.
  • If, at step 6006, a target animal is not detected, then the process returns to step 6001 and apparatus 1000 again enters the low power mode. If, at step 6006, a target animal is detected, then the process proceeds to optional step 6007 in which further processing is performed to determine a specific individual target animal.
  • This step is designated as Stage 3 and is optional as Stage 2 may be sufficient to detect an individual target animal.
  • an individual animal of interest may have known physical markings that can be detected in the Stage 2 classification. However, in some embodiments, particularly where an individual animal is difficult to distinguish from other animals of a species, Stage 3 classification can provide a further classification to determine if the target animal is the individual animal of interest.
  • Stage 3 classification may include running a more advanced classifier algorithm such as a machine learnt algorithm trained based on images of the individual animal.
  • Stage 3 classification may be executed solely by processor 104 within apparatus 1000 (e.g. by the larger processor and ISP engine of a two-stage or heterogeneous processor system such as the big.LITTLE architecture) or may be executed wholly or in part by server 306 or another cloud server with higher processing power.
  • Stage 3 classification may also include receiving inputs from other nearby apparatus which may have detected the individual animal to consider movement patterns of the animal.
  • the Stage 3 classification at step 6007 determines a confidence value regarding a confidence that the individual animal has been detected. If the confidence value is greater than a confidence threshold, then, at step 6008, processor 104 determines that the individual target animal has been detected.
  • a threshold confidence value might be 70%, 80%, 90% or 95%. If, at step 6008, the confidence value is lower than the threshold confidence value, then apparatus 1000 is returned to a lower power state such as the low power mode of step 6001.
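By way of illustration only, the staged flow set out in the points above can be pictured as a simple state machine. In the sketch below, the helper functions (detect_animal, classify_target, classify_individual, respond) and the threshold values are hypothetical placeholders, not names or figures taken from the disclosure:

```python
# Illustrative sketch of the staged pipeline of Figure 6 (steps 6001-6009).
# All function names and threshold values are assumed, not prescribed.

STAGE1_THRESHOLD = 0.70  # animal present vs. background change
STAGE2_THRESHOLD = 0.90  # detected animal is a target animal
STAGE3_THRESHOLD = 0.90  # target animal is the individual of interest


def run_detection_cycle(sensors, detect_animal, classify_target,
                        classify_individual, respond):
    """One pass from a Stage 1 trigger back to the low power mode."""
    # Stage 1 "detection": cheap analysis of a single frame (steps 6003-6004).
    frame = sensors.capture_low_power_frame()
    if detect_animal(frame) < STAGE1_THRESHOLD:
        return "low_power"  # false trigger (e.g. moving tree), step 6001

    # Stage 2 "classification": video-sequence analysis (steps 6005-6006).
    clip = sensors.capture_high_rate_clip()
    target_confidence, species = classify_target(clip)
    if target_confidence < STAGE2_THRESHOLD:
        return "low_power"

    # Stage 3 (optional): individual identification (steps 6007-6008).
    if classify_individual(clip, species) < STAGE3_THRESHOLD:
        return "low_power"

    respond(species)  # responsive action (step 6009)
    return "low_power"
```

The point the sketch is meant to make is that each stage is strictly more expensive than the one before it, so false triggers are rejected as early, and therefore as cheaply, as possible.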
  • one or more responsive actions are taken by apparatus 1000, depending on the animal detected.
  • the responsive action might be to dispense a poison from a dispenser in apparatus 1000 as described above and also in Australian patent 2016302398 entitled “Device for Proximal Targeting of Animals”.
  • Example responsive actions taken at step 6009 include:
  • Triggering a trap and/or activating one or more additional sensors such as acoustic sensors or RFID tag sensors to capture more data about the individual animal.
  • apparatus may optionally perform a verification that the action was successful. This may include processing a short sequence of images after the action to observe an outcome.
  • image sensor 102 is controlled to capture a short sequence of images and processor 104 processes the images to visibly identify that the poison was administered to the animal (e.g. poison observed to land on the animal’s body).
  • processor 104 may issue a verification to server 306 and/or store the verification in memory 107.
  • communications module 303 is controlled to transmit the short sequence of images captured after the action to server 306 for analysis and verification by a human operator.
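The verification described in the last few points could, purely as a sketch, be realised as a short burst capture followed by a per-frame test. Here, verify_frame is a hypothetical predicate standing in for whatever image analysis is used (e.g. "poison observed to land on the animal's body"):

```python
def verify_action(image_sensor, verify_frame, n_frames=10):
    """Capture a short sequence after the responsive action and test it.

    image_sensor and verify_frame are illustrative stand-ins; the frames
    may also be stored in memory 107 or sent to server 306 for human review.
    """
    frames = [image_sensor.capture() for _ in range(n_frames)]
    verified = any(verify_frame(f) for f in frames)
    return verified, frames
```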
  • processor 104 may store relevant information in memory 107 and/or transmit information to server 306. This may include the detection of a particular animal or animal species for counting in a study, markings of the detected animal, movement patterns, direction of travel, gait, behaviour characteristics (e.g. injured), as well as biometric information such as age, size and gender. This information is valuable for the ongoing study of the ecosystem within environment 2000.
  • apparatus 1000 is able to operate in more than or fewer than four power modes.
  • system 3500 is able to collectively monitor a large area of environment 2000 that extends significantly beyond that of the field of view of a single sensor apparatus 1000.
  • apparatus 1000 may be deployed at spatially separated locations around environment 2000, particularly in locations where target animals are known or likely to be present.
  • at least a subset of the apparatus 1000 are fitted to drones so that they are mobile and can controllably move around environment 2000.
  • the apparatus 1000 are each preferably locally powered by on-board batteries and optionally supplemented with solar and/or wind turbine installations. However, in some embodiments, some or all of the apparatus 1000 may be powered by mains power. Further, each apparatus 1000 preferably communicates wirelessly with remote server 306 via communications module 303 and communicates only small amounts of data over short periods of time and at predetermined time periods so as to minimise power consumption of the apparatus 1000. However, in some embodiments, communications module 303 includes a wired connection such as USB, Ethernet or twisted pair cable to connect each apparatus 1000 with remote server 306 and/or between different apparatus via wired connections. The transmitted data may be compressed and encrypted by various data encryption algorithms known in the art. In other embodiments, the various apparatus 1000 are able to communicate directly with each other without communicating with server 306, such as in a mesh network.
  • server 306 acts as a central hub for collating data from each apparatus 1000 and performs system-level decision making such as which apparatus to switch into higher or lower power modes, which apparatus are malfunctioning, whether any apparatus needs to be relocated, serviced, or replaced.
  • Server 306 may also intermittently issue software updates to the various apparatus to, for example, update the classification algorithms to more accurately classify the current target animals or change the target animals to be classified.
  • Server 306 may also monitor power levels of the respective batteries of each apparatus 1000 and issue alerts if a battery needs replacement or if an apparatus is offline.
  • server 306 may also perform a higher level classification than that performed by apparatus 1000, such as the Stage 3 classification described above.
  • each apparatus may employ one or more static classifier algorithms which perform classification of target animals and feed these classifications and associated images, acoustic data and other data to server 306 for further processing.
  • Server 306 may employ a dynamic machine learning algorithm which continues to learn based on an updated training dataset fed by the data received from each apparatus 1000. This may be particularly useful when the system 3500 is aiming to detect individual animals having distinct markings or the like. In this situation, the apparatus 1000 may be used to detect the particular species of animal and server 306 performs a higher level analysis of the data to determine if the particular animal has been detected or not.
  • Allowing server 306 to perform a higher level classification enables the classification software and hardware used in the apparatus 1000 to be kept relatively simple. This leads to reduced cost and power consumption by the apparatus 1000 and a longer field operating lifetime.
  • server 306 may be powered by mains power due to its higher power consumption.
  • Server 306 may periodically issue software updates to the apparatus 1000 as the dynamic classification system used at the server is able to classify target animals more accurately and/or the target animals to be identified and monitored changes.
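As an assumed sketch of how such a dynamic, server-side classifier might be kept current (the names inbox, model and the batch size are invented for illustration), the server could accumulate labelled detections from the field and periodically refit:

```python
def server_training_loop(inbox, model, retrain_every=1000):
    """Refit the server's dynamic classifier on data from field apparatus.

    inbox yields (features, label) pairs derived from images, acoustic
    data and other data uploaded by each apparatus 1000; model is any
    classifier exposing a fit() method. All names are illustrative.
    """
    batch = []
    for features, label in inbox:
        batch.append((features, label))
        if len(batch) >= retrain_every:
            xs, ys = zip(*batch)
            model.fit(xs, ys)  # learn from the updated training dataset
            batch.clear()      # a real system would also retain history
    return model
```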
  • real-time or near real-time means a time frame that is in the order of milliseconds and preferably less than 100 milliseconds in order to be able to detect, image and classify a swiftly moving animal and perform an appropriate responsive action before the animal moves out of a responsive action zone proximal to the apparatus.
  • This rapid timing is important as animal management can be very time-sensitive.
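To make the timing constraint concrete with an illustrative calculation (the figures are assumed, not taken from the disclosure): an animal moving at 5 m/s crosses a 0.5 m wide responsive action zone in 0.5 / 5 = 0.1 s, i.e. 100 ms, so detection, classification and actuation must together complete within roughly that window for any action to reach the animal.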
  • efficient animal management requires identification of animals at multiple locations to be time-stamped and retrieved and centralized into a single database, where correlations between the detections can be made in order to understand the prevalence, movements and other statistics of the animal population.
  • System 3500 is able to achieve this using the centralised server 306 as the central management engine.
  • Server 306 may be cloud-based or include or communicate with a database that is cloud-based to alleviate the very real risks of data loss through transfer from local devices to more permanent data storage devices and during/following classification.
  • Server 306 may host or enable an interface or dashboard that is accessible by ecologists or other personnel to access and analyse the data gathered by system 3500.
  • the system may provide an efficient interface, able to serve the needs of several key user roles including:
  • server 306 may also host management software capable of accessing a database to collate detection event data from groups of apparatus in the same geographic area.
  • the management software may also include capability to upload and update sensor software and configurations using "over the air updates”.
  • the management software may provide the capability to remotely alert users of a particular category of animal detection event (for example a feral or domestic cat, or a particular species of wildlife, detected).
  • the management software may also provide the capability for members of the public to observe or be notified about the identity of the animals being detected (for example to notify them that their cat or dog is detected).
  • When there are undefined numbers of animals over a large geographic area, and potentially multiple user roles in the management of the animals, system 3500 described above provides various benefits and advantages, including:
  • A plurality of sensors spatially dispersed in the environment in order to detect more animals in a wider variety of locations, habitats and times.
  • Sensors that are both sensitive and specific (or accurate) in order to avoid false triggering/classifications cluttering the system, creating unnecessary costs, consuming battery life/power and memory storage and triggering erroneous actions.
  • Sensors or remote management software that are equipped to recognize individual animals that have been previously identified (for example detecting a specific individual tiger in a population of tigers).
  • Management software that includes a database to collate detection event data from groups of sensors in the same geographic area.
  • Management software that includes the means to upload and update sensor software and configurations using “over the air updates”.
  • Management software that offers the means to remotely alert users of a particular category of animal detection event (for example a feral or domestic cat, or a particular species of wildlife, detected).
  • Management software that offers the means for members of the public to observe or be notified about the identity of the animals being detected (for example to notify them that their cat or dog is detected).
  • the system-level operation of system 3500 of apparatus 1000 provides for the identification, monitoring and management of animals within a potentially wide geographic area with minimal manual intervention by field personnel.
  • the processes of system 3500 are largely automated and various management processes can be instigated automatically based on predefined rules and processes. Data can be centrally managed and monitored with the capability of field personnel to modify the rules and processes implemented by system 3500 such as to modify the management processes and/or vary the target animals being monitored.
  • the collation of the spatial information from the system of apparatus allows for accurate determination of whether there is one, or more than one, problematic animal within the environment. This level of information is not possible from a single sensor apparatus in the field.
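One simple, hedged illustration of how collated spatio-temporal detections support that determination: if two time-stamped detections are farther apart than a single animal could travel in the elapsed time, at least two animals must be present. The maximum speed below is an assumed parameter, not a disclosed value:

```python
def implies_two_animals(d1, d2, max_speed_mps=8.0):
    """True if two detections cannot plausibly be the same individual.

    Each detection is (t_seconds, x_metres, y_metres) in a shared frame.
    If the straight-line distance exceeds what one animal could cover in
    the elapsed time, the detections imply at least two distinct animals.
    """
    (t1, x1, y1), (t2, x2, y2) = d1, d2
    distance = ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
    return distance > max_speed_mps * abs(t2 - t1)


# 1000 m apart within 60 s exceeds 8 m/s x 60 s = 480 m -> two animals.
print(implies_two_animals((0, 0, 0), (60, 1000, 0)))  # True
```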
  • infrared refers to the general infrared area of the electromagnetic spectrum which includes near infrared, infrared and far infrared frequencies or light waves.
  • “controller” or “processor” may refer to any device or portion of a device that processes electronic data, e.g., from registers and/or memory to transform that electronic data into other electronic data that, e.g., may be stored in registers and/or memory.
  • a “computer” or a “computing machine” or a “computing platform” may include one or more processors.
  • any one of the terms comprising, comprised of or which comprises is an open term that means including at least the elements/features that follow, but not excluding others.
  • the term comprising, when used in the claims should not be interpreted as being limitative to the means or elements or steps listed thereafter.
  • the scope of the expression a device comprising A and B should not be limited to devices consisting only of elements A and B.
  • Any one of the terms including or which includes or that includes as used herein is also an open term that also means including at least the elements/features that follow the term, but not excluding others. Thus, including is synonymous with and means comprising.
  • “Coupled”, when used in the claims, should not be interpreted as being limited to direct connections only.
  • the terms “coupled” and “connected,” along with their derivatives, may be used. It should be understood that these terms are not intended as synonyms for each other.
  • the scope of the expression a device A coupled to a device B should not be limited to devices or systems wherein an output of device A is directly connected to an input of device B. It means that there exists a path between an output of A and an input of B which may be a path including other devices or means.
  • “Coupled” may mean that two or more elements are either in direct physical, electrical or optical contact, or that two or more elements are not in direct contact with each other but yet still co-operate or interact with each other.

Abstract

Described herein is an apparatus (1000) configured to identify a target animal (3000) in an environment (2000) in real-time. The apparatus (1000) including: a housing (100); one or more image sensors (102, 103) mounted on or within the housing (100) and configured to capture images of a region of the environment (2000); and a processor (104) contained within the housing (100) and configured to process the images to identify a target animal (3000) in real-time. The apparatus (1000) is configured to operate in a low power mode and a higher power mode. In the low power mode, the one or more image sensors (102, 103) are controlled to capture images in a low power mode and the processor (104) is configured to detect a change in the environment (2000) from the images and, in response, trigger the higher power mode. In the higher power mode, the one or more image sensors (102, 103) are controlled to capture images in a higher power mode than in the low power mode and the processor (104) is configured to identify a target animal (3000) in the environment (2000).

Description

A SYSTEM AND APPARATUS FOR ANIMAL MANAGEMENT
FIELD OF THE INVENTION
[0001] The present application relates to an apparatus adapted to detect and identify animals and/or animal species.
[0002] Embodiments of the present invention are particularly adapted for the detection, identification, monitoring and management of target animal populations and detection of individual animals within an environment being monitored. However, it will be appreciated that the invention is applicable in broader contexts and other applications.
BACKGROUND
[0003] In the field of ecology, it is important to accurately detect and identify animal species in their natural settings for the purposes of: i) Monitoring the distribution, abundance, and health of animal species and the success of conservation programs; ii) Detecting and identifying pests so that prompt and cost-effective control or management actions can be implemented; iii) Detecting particular individuals that may exhibit contagious disease, key reproductive stage or other attributes requiring immediate management, or enabling population and interaction assessments; iv) Monitoring behaviour of animals such as movement and breeding patterns, sick or injured animals and general populations of animals; v) Discriminating wildlife, pets and possibly certain individuals of particular species to safeguard them from, or target them for, automated control actions;
[0004] Some animals from each of the above categories are cryptic (or hidden) and hence only detectable occasionally and unpredictably, and typically with minimum interference and often at night without use of white lights or flash.
[0005] Presently, systems for detecting animal species include passive infrared (PIR) cameras, which detect heat signatures but are not well adapted to distinguish animals from changes in other, non-animal sources of heat such as moving shadows. Other systems for detecting animals involve physical tags or physical mechanisms such as trip-wires or pressure plates and other traps. These systems suffer from a number of deficiencies including but not limited to: the need to capture and tag animals; the inability to precisely and quickly identify a target animal species; and, in cases where the animal is identified, unwanted delays in identifying the target animal species, resulting in the target animal moving out of the area in which the detection occurs. This can be problematic if, for instance, an action such as administering a compound onto the target animal is required. Other problems associated with present systems include a generally low level of accuracy in distinguishing a target animal species, potentially resulting in an action being taken on the wrong species of animal, with adverse results for the animal involved.
[0006] One example of a target animal species which requires control is the feral cat ( Felis catus) which can have a profound impact on indigenous wildlife and require constant monitoring and control. Historically, traps have been used to control these feral cats with varying degrees of success. Being intelligent creatures, feral cats are quite cunning, and most feral cats are very familiar with their environments and any change such as the installation of a trap or other apparatus. The result is the triggering of a neophobic instinct within the feral cat causing it to avoid the new object, rendering the trap ineffective in these cases.
[0007] Furthermore, research has shown traditional baiting methods to be more effective against scavenger animals and less effective against the super predators such as some feral cats, which are less likely to be persuaded by food in a trap, rendering these methods largely ineffective.
[0008] More recently, sensors such as passive infra-red sensors and optical sensors have been employed to monitor animal species and have the advantage of remote sensing over wide areas. However, these systems suffer from a large number of false triggers due to moving objects like blowing vegetation and ambient temperatures approaching animal core temperatures. These false triggers, or lack of triggers when the ambient temperature is close to body temperature, give rise to incorrect tracking and administration of control actions, as well as increasing overall power consumption of the monitoring devices.
[0009] In remote applications, animal monitoring devices need to be self-powered. However, more advanced detection systems require higher power consumption and therefore have limited lifetimes before needing batteries replaced.
[0010] PCT Patent Application Publication WO 2020/037377 entitled “A Detection System” describes an animal detection system that switches from an idle state to an active state upon detection of motion by a motion sensor. However, in a natural environment such as a forest, this system would suffer from regular false triggers from movements in the environment including trees moving in the wind. These false triggers would result in rapid depletion of the local battery and require regular manual intervention.
[0011] Therefore, the present inventors have identified a need for improved monitoring devices that can quickly and accurately identify target animals, potentially down to the individual animal, while being robust and energy efficient enough to be situated in the field for long periods of time.
[0012] Furthermore, current animal identification systems are limited to identification of animals and basic responsive actions such as triggering a trap. Current systems do not provide sufficient information to permit a broader level management of animal populations and behaviour in an environment. In particular, prior art systems require manual analysis of data by field personnel to extract information on animal behaviour such as roaming patterns of individual animals.
[0013] Any discussion of the background art throughout the specification should in no way be considered as an admission that such art is widely known or forms part of common general knowledge in the field.
SUMMARY OF THE INVENTION
[0014] In accordance with a first aspect of the present invention, there is provided an apparatus configured to identify a target animal in an environment in real-time, the apparatus including: a housing; one or more image sensors mounted on or within the housing and configured to capture images of a region of the environment; and a processor contained within the housing and configured to process the images to identify a target animal in real-time; wherein the apparatus is configured to operate in a low power mode and a higher power mode; wherein, in the low power mode, the one or more image sensors are controlled to capture images in a low power mode and the processor is configured to detect a change in the environment from the images and, in response, trigger the higher power mode; and wherein, in the higher power mode, the one or more image sensors are controlled to capture images in a higher power mode than in the low power mode and the processor is configured to identify a target animal in the environment.
[0015] In some embodiments, in the low power mode, the one or more image sensors are configured to capture images at a lower resolution and/or a lower frame rate than in the high power mode.
[0016] In some embodiments, in the higher power mode, the processor is configured to classify a detected animal as a target animal.
[0017] In some embodiments, the apparatus is configured to operate in a second higher power mode in which the processor performs a classification to determine if a detected animal is a target animal. In the second higher power mode, the processor is preferably configured to classify the animal as an individual target animal.
[0018] In some embodiments, the apparatus includes a communications module configured to transmit one or more of the captured images to a remote server and wherein the remote server performs a classification to identify if a detected animal is a target animal.
[0019] In some embodiments, switching between the low power mode and higher power mode occurs within a period of less than 100 milliseconds.
[0020] In accordance with a second aspect of the present invention, there is provided an apparatus configured to perform real-time identification of an individual target animal in an environment, the apparatus including: a housing; one or more image sensors mounted on or within the housing and configured to capture one or more 2D or 3D images of the environment; a processor contained within the housing and configured to process the one or more images to identify the individual target animal in real time based on one or more detected visual characteristics in the one or more images and, in response, generate an identification data packet; and a communications module configured to communicate the identification data packet to a remote server which is in communication with other apparatus.
[0021] In some embodiments, the apparatus includes a GPS device configured to determine a location of the apparatus.
[0022] In some embodiments, at least one of the one or more image sensors is a thermal image sensor capable of imaging in the infrared wavelength range to detect thermal characteristics. The thermal image sensor may be calibrated to detect a temperature of the target animal such as an average temperature. In some embodiments, the processor is configured to use the temperature of the target animal and background to determine a health status of the target animal.
[0023] In some embodiments, the apparatus includes an acoustic sensor to detect acoustic characteristics within the environment.
[0024] In some embodiments, the processor is configured to detect a presence of or identify the target animal at least in part by one or more acoustic characteristics that match or closely match an acoustic characteristic indicative of the target animal.
[0025] In some embodiments, the apparatus includes a particle detector or chemical analyser configured to detect scent characteristics within the environment. The processor is preferably configured to detect a presence of or identify the target animal at least in part by determining one or more scent signatures of the target animal based on the detected scent characteristics.
[0026] Preferably, the apparatus includes a battery and the one or more image sensors and the processor are powered locally by the battery.
[0027] In some embodiments, the processor is implemented on a system-on-chip (SoC) device. In some embodiments, the processor and/or other hardware are incorporated into an embedded hardware system. In some embodiments, the one or more image sensors are also implemented on an SoC device.
[0028] In some embodiments, the processor includes a clock to determine the current time at the location.
[0029] In some embodiments, the processor is configured to execute a machine learned classifier to identify the target animal. In some embodiments, the processor is configured to execute a neural network classifier algorithm.
[0030] In some embodiments, the detected visual characteristics include a shape of the target animal. In some embodiments, the detected visual characteristics include one or more predefined movements or behavioural characteristics of the target animal over a plurality of images. In some embodiments, the detected visual characteristics include thermal characteristics of the target animal. In some embodiments, the detected visual characteristics include a brightness or reflectivity or absorbance characteristic of the target animal. In some embodiments, the detected visual characteristics include a distinct colour and/or marking of the target animal.
[0031] In some embodiments, the target animal includes a predefined animal species. In other embodiments, the target animal is an individual animal. The individual animal is a unique individual that can be distinguished from all other animals.
[0032] In some embodiments, the apparatus includes a wireless identifier configured to detect pre-stored data from a wireless tag associated with one or more animals.
[0033] In some embodiments, the apparatus is configured to operate in one of a low power mode or a high power mode.
[0034] In some embodiments, in the low power mode, the one or more image sensors are configured to capture images at a lower frame rate and/or a lower resolution. The high power mode is preferably activated by a trigger signal. The trigger signal may be based on a mechanical or electromechanical trigger associated with the apparatus. In some embodiments, the trigger signal is based on detection of one or more trigger visual characteristics in the one or more images. In some embodiments, the one or more trigger visual characteristics include movement characteristics in the images. In some embodiments, the one or more trigger visual characteristics include detection of a predefined shape or shapes in the images. In some embodiments, the one or more trigger visual characteristics include detection of a predefined temperature or brightness in the images. In some embodiments, the trigger signal is based on detection of one or more trigger acoustic characteristics detected by an acoustic sensor.
[0035] In some embodiments, in the low power mode, the one or more image sensors are deactivated. In some embodiments, in the low power mode, an acoustic sensor is configured to sense acoustic signals and the processor is configured to process the acoustic signals to identify one or more acoustic sounds indicative of an animal or a target animal.
[0036] In some embodiments, the apparatus includes an illumination device configured to selectively illuminate at least a part of the environment.
[0037] In some embodiments, the apparatus includes one or more of a light level sensor, humidity sensor and/or a temperature sensor.
[0038] In some embodiments, the apparatus includes an actuation device responsive to a sensor signal generated from the processor for initiating an action in response to identification of the target animal. In some embodiments, the actuation device includes a dispenser for dispensing a compound onto the target animal. In some embodiments, the actuation device includes a visual or acoustic stimulus actuated in response to the identification of the target animal.
[0039] In some embodiments, the apparatus includes a sound generator adapted to generate sounds to lure the target animal.
[0040] In some embodiments, the apparatus includes a visual lure to lure the target animal toward the apparatus.
[0041] In some embodiments, the communications module is further adapted to communicate the characteristics and/or the presence of a target animal to other nearby apparatus for detecting or identifying animals. In some embodiments, the communications device is adapted to send and receive data to a remote database at predetermined time periods.
[0042] In some embodiments, the apparatus is incorporated into a drone or UAV which is controllably moveable around the environment.
[0043] In accordance with a third aspect of the present invention, there is provided an actuation device including one or more actuators responsive to a sensor signal generated from an apparatus of the first or second aspect.
[0044] In accordance with a fourth aspect of the present invention, there is provided a system configured to detect a presence of a target animal in an environment, the system including a plurality of apparatuses of the second aspect, wherein the communications module is adapted to communicate the identification of a target animal to one or more others of the apparatuses either directly or via a remote server. In some embodiments, upon receipt of an identification of a target animal from a nearby apparatus, one or more other apparatus are controlled from a lower power mode into one or more high power modes.
[0045] In accordance with a fifth aspect of the present invention, there is provided an apparatus configured to detect a presence of a target animal in an environment in real-time, the apparatus including: a housing; one or more sensors mounted on or within the housing and configured to sense one or more characteristics of the environment; a processor contained within the housing and configured to process the one or more images and environment characteristics to detect a presence of a target animal in real time; and a communication device adapted to communicate the environment characteristics and/or the presence of a target animal to other nearby apparatus to switch the other nearby apparatus into a monitoring mode.
[0046] In some embodiments of the fifth aspect, the one or more sensors include one or more of image sensors, acoustic sensors, temperature sensors, particle sensors and/or chemical analysers.
[0047] In accordance with a sixth aspect of the present invention, there is provided an animal monitoring system for monitoring animals within an environment, the system including a plurality of animal monitoring apparatus positioned at spatially separated locations within the environment, each animal monitoring apparatus including: one or more image sensors mounted on or within the housing and configured to capture images of a region of the environment; and a processor contained within the housing and configured to process the images to identify a target animal in real-time; and a communications module for communicating the identification of a target animal to one or more others of the animal monitoring apparatus.
[0048] The system of the sixth aspect may include a server configured to communicate with the communications modules of the animal monitoring apparatus to receive the identification of a target animal from one of the animal monitoring apparatus.
[0049] In some embodiments of the sixth aspect, upon identification of a target animal from one of the animal monitoring apparatus, the server is configured to receive images from the animal monitoring apparatus and process those images to classify the identified target animal as a subgroup of the target animals or an individual target animal.
[0050] In some embodiments of the sixth aspect, upon identification of a target animal by one of the animal monitoring apparatus, the communications module of that apparatus sends a signal to one or more other animal monitoring apparatus to switch that apparatus from a low power mode into a higher power mode.
[0051] In some embodiments of the sixth aspect, one or more of the animal monitoring apparatus are incorporated onto a drone or UAV device.
[0052] In accordance with a seventh aspect of the present invention, there is provided an apparatus configured to detect a presence of a target animal in an environment in real-time, the apparatus including: a housing; one or more image sensors mounted on or within the housing and configured to capture images of a region of the environment; a processor contained within the housing and configured to process the images to detect a presence of an animal and/or classify an animal in real-time; and a communications device configured to transmit at least a subset of the captured images to a remote server upon detection of an animal by the processor, wherein the remote server is configured to process the images to classify the animal as a target animal.
[0053] In accordance with an eighth aspect of the present invention, there is provided an apparatus configured to identify a target animal in an environment in real-time, the apparatus including: a housing; one or more image sensors mounted on or within the housing and configured to capture images of a region of the environment; an acoustic sensor configured to sense acoustic signals; and a processor contained within the housing and configured to process the acoustic signals to detect acoustic signals indicative of an animal and process the images to identify a target animal in real-time; wherein the apparatus is configured to operate in a low power mode and a higher power mode; wherein, in the low power mode, the one or more image sensors are deactivated and the processor is configured to detect acoustic signals indicative of an animal and, in response, trigger the higher power mode; and wherein, in the higher power mode, the one or more image sensors are activated and controlled to capture images and the processor is configured to identify a target animal in the environment.
BRIEF DESCRIPTION OF THE FIGURES
[0054] Example embodiments of the disclosure will now be described, by way of example only, with reference to the accompanying drawings in which:
Figure 1 shows a schematic view of an apparatus configured to detect a presence of a target animal in an environment;
Figure 2 shows the apparatus in use in the field;
Figure 3 shows a system of the apparatus of Figure 1 and Figure 2 in use;
Figure 4 shows a dispensing device in accordance with an embodiment of the invention;
Figure 5 exemplifies a neural network for detecting the target animal; and
Figure 6 shows a flow chart depicting the process of detecting a target animal in accordance with an embodiment of the invention.
DESCRIPTION OF THE INVENTION
Sensor Apparatus
[0055] Referring initially to Figures 1 and 2, there is illustrated schematically an apparatus 1000 configured to detect the presence of a target animal (shown in Figure 2 as 3000) in an environment 2000. The apparatus 1000 allows for local and/or remote processing of data inputs such as image and acoustic data with the use of sensors 102, 103, 105. In the embodiment shown, the primary sensors include visible image sensors 102 and infrared image sensors 103 to image a region of an environment 2000 proximal to apparatus 1000. Optionally, additional sensors 105 may be used to augment the primary sensors and these additional sensors include particle detectors/chemical analyser, acoustic sensors, optical or ultrasonic rangefinder sensors and temperature sensors. This allows for an autonomous system that is able to detect a particular individual target animal 3000, animal species, cohort, sub-species or group of target animals utilising a number of sensory inputs as are detected by the sensors (102, 103, 105) and may be processed locally.
[0056] Image sensors 102 and 103 may include conventional two dimensional image sensors such as CMOS or CCD arrays, or may include more sophisticated sensors such as phase detect pixel sensors, stereo imaging systems, LIDAR systems, hyperspectral imagers, time of flight cameras, structured light systems and other systems capable of imaging a scene in three dimensions. These more sophisticated imaging devices are capable of extracting depth information from an imaged scene. Depth information allows the determination of distance to an object which, in turn can more accurately allow determination of an animal’s size and speed of movement.
[0057] Throughout this specification, the term “individual target animal” will be used. This term is intended to refer to a single unique individual animal that can be distinguished from all other animals via distinct characteristics such as physical markings, tags, gait and/or behaviour.
[0058] The ability to process the data locally allows for real time or near real time processing and evaluations of the data which may otherwise not be possible if the processing was performed remotely. The local processing may also minimize the use of network bandwidth for communications which tends to be limited (and expensive) in remote areas in which the apparatus 1000 is likely to be used. However, communication with a cloud server or other remote device to perform remote processing may be performed in some instances where higher processing power is required.
[0059] Referring to Figure 3, there is illustrated a system 3500 of apparatus 1000 which communicate wirelessly via a central remote server 306. Based on the outputs from the sensors (102, 103, 105), and in particular to the image sensors 102, one or more images are processed to identify the target animal 3000 and, in response to that identification, an identification data packet is generated and sent via a communications module 303 to the remote server 306. Server 306 may be a physical server or a cloud-based server in communication with one, many or all of the apparatus 1000 within system 3500.
[0060] The server 306 is in communication with a network of other apparatus 1000, which may be distributed at spatially separate locations across a geographic region being monitored. In this manner, the apparatus 1000 in the network are able to communicate with each other via server 306. Preferably, the identification data packet is a small set of data capable of alerting server 306 of a potential detection of a target animal. The small data size minimises the demands on the limited bandwidth and power available to communications module 303. The identification data packet may contain information such as the following (an illustrative sketch of one possible packet format is given after this list):
> A simple alert that a target animal has been detected;
> Information pertaining to the detected characteristics detected by the sensors (102, 103, 105);
> Timestamps or timing information about the detection event;
> Device specific data such as a unique identifier, location and current battery power level;
> Image data for images corresponding to the detected target animal; and
> Other sensor data such as acoustic signals for times when the target animal was detected.
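One way to picture such a packet, purely as an assumption about form (the disclosure does not prescribe field names or a wire format), is a small serialisable record:

```python
import json
import time
from dataclasses import asdict, dataclass


@dataclass
class IdentificationPacket:
    device_id: str         # unique identifier of the apparatus
    timestamp: float       # detection time, seconds since epoch
    latitude: float        # device location (cf. GPS device 106)
    longitude: float
    battery_pct: int       # current battery power level
    target_detected: bool  # simple alert flag
    confidence: float      # classifier confidence for the detection
    image_ref: str         # key for image data uploaded separately


packet = IdentificationPacket("AP-17", time.time(), -30.56, 138.72,
                              83, True, 0.94, "img/0042")
payload = json.dumps(asdict(packet))  # compact payload for module 303
print(payload)
```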
[0061] It will be appreciated from the description below that apparatus 1000 is capable of detecting a target animal 3000 in the form of an individual animal, an animal species, sub-species, cohort or genus, or a group of animals, species, sub-species or genera. The term “target animal” as used herein is intended to cover these different options.
[0062] The operation of system 3500 will be described in more detail below.
[0063] Referring again to Figure 1, the apparatus 1000 includes a protective housing 100 where the one or more sensors 102, 103, 105 are mounted on or within the housing 100. Each of the sensors 102, 103, 105 are operably connected to a memory 107 and processor 104 for processing of the data acquired from the sensors 102, 103, 105. By way of example, memory 107 may include random access memory (RAM), read-only memory (ROM) and/or electrically erasable programmable read-only memory (EEPROM). However, other equivalent memory or storage systems may be used as should be readily apparent to those skilled in the art.
[0064] It will be appreciated that the number and type of sensors would not be limited to the aforementioned, with a greater number of sensors resulting in more accurate detection of the target animal 3000 in its environment. Each sensor 102, 103, 105 is adapted to detect various characteristics of the target animal 3000 such as physical appearance and/or behaviour, the size of the target animal, walking patterns, gait, movement speed and characteristics such as the texture, skin/fur patterns and colour of the target animal 3000. Where audio sensors are used, audio patterns, animal call sounds and signatures may be captured.
[0065] Each of the image sensors 102 and 103 are configured to capture one or more images (in either 2D or 3D) of a region of the environment 2000. Each image sensor 102 and 103 may take a variety of forms ranging from a conventional CCD or CMOS visual spectrum image sensor, IR sensor, 3D camera or LIDAR sensor.
[0066] In one embodiment, a near-infrared 2D image sensor 103 with a global shutter paired with a high intensity pulsed near-infrared illumination source may be used. The illumination source is then "flashed" at the same time as the image sensor 102 global shutter in order to illuminate and capture images from a broad area of the local environment. This allows for a low power draw (around 10 mW or less) minimizing battery usage. However, in other embodiments, a passive system is used in which no illumination device is implemented to further reduce power consumption.
[0067] In particular, each image sensor (102, 103) is adapted to capture two or three dimensional images which, when processed by processor 104, can locally discriminate living creatures from non-living creatures. As is shown in Figure 1, the housing 100 contains a processor 104 which is operably connected to a memory 107 for storage of instructions and data. The processor 104 is configured, inter alia, to process one or more images and inputs from the sensors (102, 103) to detect the presence of the target animal 3000 in real time.
[0068] Figure 2 exemplifies the use of a single apparatus 1000 in use with a target animal 3000. However, it will be understood that more than one apparatus 1000 would typically be used to aid in the detection of a target animal 3000. A system of such apparatus 1000 is described below.
[0069] Processor 104 may be implemented as any form of computer processing device or portion of a device that processes electronic data, e.g., from registers and/or memory to transform that electronic data into other electronic data that, e.g., may be stored in registers and/or memory. In some embodiments, the processor 104 takes the form of an embedded system or system-on-chip (SoC) device. The SoC device provides the benefit of a single platform where the entire computing system can be integrated onto the SoC device. The SoC device may further include one or more of the image sensors 102 and 103 and a clock for precisely determining a current time. The clock can provide valuable information input when determining whether a detected animal has been correctly identified. Furthermore, data from the clock can be accessed to register timestamps of detected animal events. In some embodiments, a separate clock device may be included in apparatus 1000.
[0070] Information from the local clock on apparatus 1000 may be used by processor 104 in the classification of animals. By way of example, during daylight hours, nocturnal animals are less likely to be detected by the image sensors. This information can be used by processor 104 as a measure of confidence or an input to a classifier algorithm to avoid false positive detections that may occur.
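As a toy illustration of using the clock as a confidence input (the daytime prior of 0.5 is an invented value; a real system might instead feed the time directly into the classifier as described above), the raw confidence could be down-weighted for nocturnal species during daylight:

```python
from datetime import datetime


def adjust_confidence(raw_confidence, is_nocturnal, now=None):
    """Down-weight daytime detections of nocturnal species (illustrative)."""
    now = now or datetime.now()
    is_daylight = 6 <= now.hour < 18  # crude day/night split
    if is_nocturnal and is_daylight:
        return raw_confidence * 0.5   # assumed prior, tuned in practice
    return raw_confidence
```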
[0071 ] The functional operation of processor 104 may be divided into different modules such as a vision processor functional module for performing computer vision processes and a device controller functional module for performing device control. However, it will be appreciated that, in alternative embodiments, the functions of the vision processor functional module and device controller functional module may be realized as separate hardware such as microprocessors in conjunction with custom or specialized circuitry.
[0072] In some embodiments, processor 104 is collectively realised as a heterogeneous computing system such as a big.LITTLE architecture in which a smaller, low power processor performs low level processing while a larger, more power-intensive processor performs higher level processing. By way of example, initial animal detection may be performed by the smaller processor, which may include a small image signal processor (ISP) engine, and subsequent high level animal classification may be performed by the larger processor, which includes a larger ISP engine.
[0073] The vision processor functional module of processor 104 is configured to process the captured images to perform target animal detection; for example to perform classification based on shape, movement (e.g. speed), colour, size, temperature, thermal signature and other characteristics of objects detected within the imaged environment. To achieve this, the vision processor module may utilize one or more image processing algorithms such as object tracking, edge detection, shape recognition, contour detection and spectral analysis.
[0074] The device controller functional module of processor 104 is configured to control the various devices of apparatus 1000 such as sensors 102, 103 and 105. By way of example, the device controller may control a timing, resolution and/or frame rate of image sensors 102 and 103 and may also control the selective illumination of one or more light sources to illuminate the environment being imaged (if the apparatus is fitted with active lighting).
[0075] With reference to Figure 5, in some embodiments the processor 104 is capable of executing a machine learning algorithm such as a neural network classifier 4000 to detect the presence of the target animal. The algorithm 4000 may take a virtually unlimited number of inputs such as data from the imaging device, acoustic inputs, scent data, time data and GPS location data among other possibilities. Figure 5 illustrates exemplary inputs (e.g. 4001) in the form of image sensor input, acoustic sensor input, scent data from a particle or chemical sensor, time data from a clock and GPS location data. Other information such as a time of day may also be input to the algorithm 4000. It will be understood that a greater number of inputs will typically result in a higher probability of detecting the target animal species accurately. Utilising the neural network algorithm 4000, the inputs are fed to nodes (e.g. 4003) of one or more hidden layers (a single hidden layer 4005 is shown in Figure 5). The nodes represent weighted functions of the inputs wherein the weights of the functions can be varied based on training of the algorithm. The outputs of the nodes are combined at an output 4007 to produce a determination of a presence and/or classification of an animal species (or a determination that no target animal is present).
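A minimal numerical sketch of such a single-hidden-layer network follows. The feature sizes, class labels and random weights are all illustrative; in practice the weights come from the training described in the next paragraph:

```python
import numpy as np

rng = np.random.default_rng(0)

# A fused feature vector standing in for image, acoustic, scent, time
# and GPS inputs (cf. inputs 4001); the size 32 is arbitrary.
x = rng.normal(size=32)

# One hidden layer (cf. layer 4005) feeding an output (cf. 4007) over
# illustrative classes. Random weights stand in for trained ones.
W1, b1 = rng.normal(size=(16, 32)), np.zeros(16)
W2, b2 = rng.normal(size=(3, 16)), np.zeros(3)

hidden = np.tanh(W1 @ x + b1)                  # weighted node functions
logits = W2 @ hidden + b2
probs = np.exp(logits) / np.exp(logits).sum()  # class probabilities
print(dict(zip(["no target", "feral cat", "fox"], probs.round(3))))
```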
[0076] Algorithm 4000 has preferably been trained on a dataset of images, sounds, videos and/or other characteristic data of the target animal or animals. The algorithm 4000 may be static or dynamic to be able to be further trained to improve the classification. The learning of algorithm 4000 may be supervised, semi-supervised or unsupervised.
[0077] The housing 100 may be fabricated from a number of materials suitable for use in an outdoor setting. For instance, the housing 100 may be formed from a rigid polymer material that includes UV stabilized polymers to withstand the sun. Alternatively, a metallic material may be used for the housing 100 where greater longevity is required as it would be resistant to UV degradation. A metallic housing 100 made from a material resistant to corrosion such as stainless steel or aluminium would be preferred. However, the use of a polymer housing is preferred if the housing is adapted to contain wireless communications. In other embodiments, the housing 100 may be formed of other rigid or semi-rigid materials such as wood.
[0078] As is exemplified in Figure 1, the apparatus 1000 includes one or more sensors 102, 103, 105 contained within the housing 100 and positioned to monitor environment 2000. In the embodiment shown, the apparatus 1000 is powered by a battery 110. The battery 110 may be of the rechargeable variety in which case a combination of a solar panel array or small wind turbine can be used to charge the battery. Alternatively, single use batteries may be used where they are periodically replaced when the apparatus is maintained. In other embodiments, apparatus 1000 may be connected to mains power and powered by the electricity grid.
[0079] Where installed, the solar panel array is configured to convert light incident upon the solar panel array into electrical power, and to store the electrical power in the battery 110. For instance, the solar panel array converts sunshine during daytime into power in order to recharge the battery 110. In some embodiments, the solar panel array is located on an outer or top surface of the apparatus 1000. In other embodiments, the solar panel array is positioned remote from the apparatus 1000. It should be noted that in some embodiments, the battery 110 can be recharged by a generator or an external power source, can be a replaceable power source (e.g., a replaceable battery that is swapped out periodically), or can be itself located remotely from the apparatus 1000 (for instance, via power lines electrically coupling the battery 110 to the apparatus 1000).
[0080] Thermal image sensor 103 may utilize IR sensitive pixels to detect thermal characteristics of the target animal species 3000. The thermal image sensors 103 are preferably calibrated to have a high sensitivity at temperatures corresponding to the target animal’s core temperature so as to accurately detect the temperature of the target animal 3000 and/or thermal characteristics of regions of the target animal (e.g. a heat map of the animal).
[0081] For instance, certain target animals 3000 may have well defined body temperatures and, as such, the detected temperature of the detected animal may be used as an input to processor 104 to classify whether the detected animal is a target animal species or not based at least in part on the temperature of the animal.
[0082] Furthermore, once a target animal 3000 has been wholly or partially classified using other parameters such as visual characteristics, the detected temperature may be used to determine a health status of the target animal 3000. For instance, if a visual classification confirms the identity of the target animal 3000 as a given animal species but the detected temperature is outside the expected range for the animal species, this may be indicative that the target animal is unwell. Such health information can also be useful ecological data to obtain.
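A toy version of that health check might look as follows; the expected temperature band is an illustrative placeholder, not a disclosed specification:

```python
def health_status(measured_temp_c, expected_range_c=(38.0, 39.5)):
    """Flag a visually confirmed animal whose detected body temperature
    falls outside the species' expected core range (range illustrative)."""
    low, high = expected_range_c
    return "normal" if low <= measured_temp_c <= high else "possibly unwell"


print(health_status(40.2))  # -> "possibly unwell"
```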
[0083] In analysing the data from the image sensors 102, visual characteristics of an animal are used to make a determination as to whether the detected animal is the target animal. These visual characteristics include but are not limited to: a shape of the target animal 3000, colour and/or marking, size, one or more predefined movement characteristics such as gait of the target animal 3000 over more than one image, a core or average temperature of the target animal 3000 or a temperature distribution across the target animal 3000 as detected by the IR sensor 103, the brightness or reflectivity of the target animal and any distinct marking that the target animal 3000 may have.
[0084] In other embodiments, the apparatus 1000 includes a GPS device 106 to allow for the determination of the location of the apparatus 1000. The location of the apparatus 1000 may be used to provide important information about the location of the detected target species and the location of the apparatus to assist when it requires service or replacement among other things.
[0085] In addition to sensors that are sensitive to the visual and IR ranges, other embodiments include the use of an acoustic sensor 108. The acoustic sensor 108 is calibrated to detect the acoustic characteristics of the environment and more specifically, any characteristic noises that the target animal species may make including mating calls and other characteristic sounds of the particular animal species of interest.
[0086] In other embodiments, light level sensors, a humidity sensor or ambient temperature sensors (not shown) may be used.
[0087] In other embodiments, the apparatus includes a particle detector/chemical analyser 105 which is adapted to detect in its broadest sense, scent characteristics of the environment and more specifically, signature characteristics of the target animal 3000 such as pheromones indicative of a certain target animal. Other characteristics such as the animal's sex, health status or pregnancy status may be determined using the detection of scent.
[0088] In other embodiments, the apparatus 1000 includes a wireless identifier configured to detect wireless signals and pre-stored identification data associated with one or more animals. In some embodiments, the wireless identifier may take the form of a Zigbee® based RF tag or Wireless Sensor Network (WSN) technology, which may provide long range low power wireless tracking of the target animals 3000.
[0089] In one embodiment, the wireless identifier may take the form of a Radio Frequency Identification (RFID) device. The RFID device may be used to detect RFID signals in the environment such as those that may be present in the vicinity of an animal with an embedded RFID chip. Preferably, the RFID device makes use of active RFID tags allowing a range of tracking in the vicinity of hundreds of meters.
[0090] Local memory 107 storage on the apparatus 1000 allows for images and other data related to a detection event which may be later retrieved either manually or via a network for analysis. The memory 107 may take the form of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), and other equivalent memory or storage systems as should be readily apparent to those skilled in the art.
[0091] In other embodiments, the software/firmware stored on each of the apparatus 1000 may be updated either manually by an operator in the field or more preferably, over the air (OTA) or generally via a network where multiple units can be updated at one time.
[0092] In order to minimize power consumption, in some embodiments, the apparatus 1000 is configured to operate in a plurality of different power modes. By way of example, apparatus 1000 may be configured to operate in either a low power mode, allowing for long term monitoring in a low power state, or a higher power active mode, which may be triggered when greater computing power is required.
[0093] The low power mode operates when the device is in its quiescent state awaiting the detection of a change in the environment such as the presence of a target animal 3000. Once a change in the environment is detected, the device may then operate in the high power mode where greater computing power and additional functions are utilized to determine whether the change in environment is a detection of the target animal 3000 or not. Control of which power mode the apparatus 1000 operates in is performed by processor 104 as described below.
[0094] For instance, in the low power mode, less critical facilities may be shut down; for example, the one or more image sensors may operate at a lower frame rate and/or resolution, consuming less processing power and memory and, in turn, less electrical power. Alternatively, multiple image sensors may be deactivated in the low power mode, further reducing the power consumption in this mode. The transition from low power to high power mode may be instigated by a trigger signal, which may take a number of forms: a visual trigger, such as the detection of a shape or movement in the images, or a mechanical or electromechanical trigger associated with the apparatus. Other triggers may include the detection of a predefined brightness or temperature in the images.
[0095] Another form of trigger signal is an acoustic trigger, where for instance, the trigger may activate when a particular sound indicative of the target animal species 3000 is detected.
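By way of illustration only, the following minimal Python sketch models the trigger-driven switching between the low power and higher power modes described above. The names used (PowerMode, Trigger, PowerController) and the exact trigger set are assumptions made for this sketch, not elements defined by the disclosure.

    from enum import Enum, auto

    class PowerMode(Enum):
        LOW = auto()    # quiescent monitoring state (paragraph [0093])
        HIGH = auto()   # full sensing and classification state

    class Trigger(Enum):
        VISUAL = auto()              # shape or movement detected in the images
        MECHANICAL = auto()          # mechanical/electromechanical trigger, e.g. a weight sensor
        ACOUSTIC = auto()            # sound indicative of the target animal species
        BRIGHTNESS_OR_TEMP = auto()  # predefined brightness or temperature in the images

    class PowerController:
        """Hypothetical controller mirroring the mode switching performed by processor 104."""

        def __init__(self) -> None:
            self.mode = PowerMode.LOW

        def on_trigger(self, trigger: Trigger) -> None:
            # Any of the trigger signals described above moves the device out
            # of its quiescent state into the higher power mode.
            if self.mode is PowerMode.LOW:
                self.mode = PowerMode.HIGH

        def on_no_detection(self) -> None:
            # If the change in the environment turns out not to be a target
            # animal, the device returns to the low power quiescent state.
            self.mode = PowerMode.LOW

    controller = PowerController()
    controller.on_trigger(Trigger.ACOUSTIC)
    assert controller.mode is PowerMode.HIGH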
[0096] As light conditions can affect the activities of animals, in some embodiments, one or more illumination devices are included in association with the apparatus 1000. These illumination devices may include one or more LEDs operating in the visible and/or infrared wavelength range. The illumination devices may be controlled by processor 104 to selectively illuminate parts of the environment to modify an animal's behaviour among other things.
[0097] As is exemplified in Figure 3, the apparatus 1000 includes a communications module 303, which is adapted to communicate with other apparatus 1000 positioned in the environment and, additionally, to send data to and receive data from a remote server 306 at predefined time periods for the collection and retrieval of data, among other things. The communications module 303 is adapted to communicate information such as the visual characteristics and/or presence of the target animal to server 306, which can, in turn, communicate with the communications modules of other apparatus 1000. The communications module 303 is further adapted to communicate environmental characteristics and/or the presence of a target animal 3000 to other apparatus 1000 via server 306. In some embodiments, communications module 303 is able to communicate with the communications devices of other apparatus directly and bypass central server 306.
[0098] The communications module 303 can include a wireless receiver, transmitter, or transceiver hardware. The communications module 303 may also be adapted to transmit other status data such as an energy level of the battery 110 (e.g. state of battery charge) or diagnostic codes in the event of a malfunction. Firmware updates may be performed Over the Air (OTA) using the communications module 303. A variety of wireless protocols may be used, with low power wireless protocols such as Bluetooth, BLE, ZigBee, 2G or 3G being some examples. In other embodiments, communications module 303 includes hardware for communicating between devices over a wired network such as USB, Ethernet, twisted pair or coaxial cables.
[0099] In other embodiments, the communications module 303 is adapted to communicate the detected presence of a target animal 3000 to any of the other apparatuses 1000, allowing for the tracking of the target animal and the collection of information as to the target animal's 3000 movements. The detected presence of the target animal 3000 at one apparatus 1000 may be communicated to other apparatus within the vicinity, causing them to switch from a lower power mode to a high power mode in anticipation of the target animal 3000 approaching other apparatus 1000 in the vicinity of the apparatus 1000 that initially detected the target animal 3000.
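A minimal sketch of this inter-apparatus wake-up follows. The message fields and the JSON encoding are assumptions for illustration; the disclosure does not specify a wire format.

    import json
    import time

    def build_detection_message(apparatus_id: str, species: str,
                                location: tuple) -> str:
        """Hypothetical payload announcing a target-animal detection to server 306."""
        return json.dumps({
            "apparatus_id": apparatus_id,
            "species": species,
            "location": list(location),  # e.g. coordinates from GPS device 106
            "timestamp": time.time(),
            "action": "wake",            # request that nearby units leave the low power mode
        })

    def should_wake(message: str) -> bool:
        # A nearby apparatus receiving this message switches from a lower power
        # mode to a high power mode in anticipation of the approaching animal.
        return json.loads(message).get("action") == "wake"

    msg = build_detection_message("unit-12", "feral_cat", (-30.56, 138.72))
    assert should_wake(msg)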
[00100] In some embodiments, apparatus 1000 are stationary devices deployed at specific locations throughout environment 2000. However, in other embodiments, apparatus 1000 may be in the form of a drone or unmanned aerial vehicle (UAV), which can be controlled to move around environment 2000.
Example responsive actions
[00101] Although apparatus 1000 can be used simply for detection of animals within the environment 2000, in some embodiments, apparatus 1000 is advantageously capable of performing various actions in response to detection of a target animal (a minimal dispatch sketch, for illustration, follows this list). These responsive actions include:
• Recording the presence, times, locations, health, behaviour or movement patterns of the detected target animal and storing the detection in a database as part of a larger study of the target animal;
• Communicating the information to central server 306 or to other apparatus 1000 within a system 3500 of apparatuses;
• Dispensing a compound such as a poison (for pest animals) or medicament from a dispenser on the apparatus 1000;
• Dispensing food and/or water from a food dispenser on the apparatus 1000;
• Capturing one or more high resolution images or high frame rate videos of the detected animal for subsequent analysis;
• Alerting, activating or changing the state of other nearby apparatus of the detected target animal;
• Alerting server 306 of the presence of the individual animal;
• Logging the presence, location and time (e.g. timestamp) of the individual animal in memory 107 or server 306;
• Calculating a number of target animals detected from one or more sensors in a given timeframe;
• Alerting a pet monitoring service or owner of a pet (if a pet is identified);
• Alerting monitoring personnel;
• Activating a live feed of images from apparatus 1000 to monitoring personnel;
• Issuing an audio signal to scare, alert or attract the individual animal;
• Releasing a chemical compound that provides an olfactory lure to the animal;
• Activating or opening a live trap or kill trap; and/or
• Activating one or more additional sensors such as acoustic sensors or RFID tag sensors to capture more data about the individual animal.
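The dispatch sketch referred to above is set out below. It is a hypothetical illustration only: the classification categories, handler names and the mapping between them are assumptions showing one way processor 104 might select among the listed responsive actions.

    from typing import Callable

    def dispense_compound(animal_id: str) -> None:
        print(f"dispensing measured dose onto {animal_id}")  # pest animals only

    def log_detection(animal_id: str) -> None:
        print(f"logging presence/timestamp of {animal_id} to memory 107 / server 306")

    def alert_pet_owner(animal_id: str) -> None:
        print(f"alerting pet monitoring service or owner about {animal_id}")

    # Assumed mapping from classification category to responsive actions.
    ACTIONS: dict = {
        "pest": [dispense_compound, log_detection],
        "endangered": [log_detection],
        "pet": [alert_pet_owner, log_detection],
    }

    def respond(category: str, animal_id: str) -> None:
        for action in ACTIONS.get(category, [log_detection]):
            action(animal_id)

    respond("pest", "feral-cat-07")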
Dispensing of Compound
[00102] With reference to Figure 4, in other embodiments, the apparatus further includes an actuation device 402 which is responsive to a sensor signal generated from the processor 104 for initiating a response based on the detection of the target animal 3000. In one embodiment, the actuation device 402 further includes a dispenser 404 which is adapted to dispense a compound 406 onto the target animal 3000.
[00103] In some embodiments, the compound is a pharmaceutical substance for medicating the target animal in response to a disease state. In other embodiments, the pharmaceutical is a toxin; the dosage of toxin supplied in the pharmaceutical is at least sufficient to incapacitate the target animal, and may cause death either instantaneously or delayed relative to the ejection of the pharmaceutical, for instance via toxin-induced anoxia or another physiological effect.
[00104] In some embodiments, the pharmaceutical is sodium fluoroacetate ("1080"), which is a well-known poison in Australia, and an effective dosage of 1080 may be between 5 mg and 12 mg, which may be supplied as 0.4 ml of a 30 g/L 1080 concentrate.
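The stated figures are internally consistent, as the following check shows (30 g/L is equivalent to 30 mg/ml):

    \[ \mathrm{dose} = V \times c = 0.4\ \mathrm{ml} \times 30\ \mathrm{mg/ml} = 12\ \mathrm{mg} \]

which corresponds to the upper end of the 5 mg to 12 mg effective range.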
[00105] The use of 1080 has the advantage that it is present in a number of Australian native flora species and, as such, Australian native animals tend to have a higher tolerance to it than introduced species such as feral cats. For instance, a predetermined dose of 1080 can euthanize a feral cat without harming Australian native species.
[00106] In some embodiments, the pharmaceutical is PAPP and the dosage of the pharmaceutical is between 100 mg and 300 mg. In other embodiments, the toxin supplied in the pharmaceutical is delivered within a volume of fluid between approximately 1 ml and 5 ml. To increase the chances of the pharmaceutical adhering to the coat of the target animal 3000, the pharmaceutical may be supplied in a viscous form.
[00107] In one embodiment, the pharmaceutical includes a gel formulation. In some embodiments, the gel formulation has a consistent viscosity at a range of temperatures and pressures, and can beneficially improve the reliability of the speed, direction, and precision of the ejection of the pharmaceutical. In other embodiments, the pharmaceutical includes a grease formulation, or is administered as a spray.
[00108] In some embodiments, a syringe or similar vessel is adapted (e.g., as part of the dispenser 404) to provide separate measured doses of the pharmaceutical for each ejection of the pharmaceutical by the targeting system.
[00109] In some other embodiments, a larger vessel (e.g., a canister or tank) provides a constant supply of the pharmaceutical to be applied or ejected in amounts corresponding to single doses per application/ejection. In some embodiments, different target animals may receive different doses of the pharmaceutical. The dose of the pharmaceutical applied to a given target animal can be selected, for instance based on the type of the target animal 3000 detected by the apparatus 1000.
[00110] In some embodiments, the pharmaceutical is enclosed within a frangible membrane designed to rupture upon contact with the target animal 3000. The frangible membrane contains the pharmaceutical and each membrane, which may be in the form of a capsule, pellet, ball, or the like, contains a distinct unit dose of the pharmaceutical. In such embodiments, the dispenser 404 can shoot or eject the frangible membrane at the target animal 3000, which ruptures upon contact with the target animal 3000, causing the pharmaceutical enclosed within to contact and/or stick to the coat of the target animal 3000. In some embodiments, the frangible membrane is shot at the target animal 3000 at a speed fast enough to ensure the frangible membrane ruptures, but at a speed slow enough to prevent significant pain from being caused to the target animal 3000.
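By way of illustration, the following sketch encodes dose selection by detected animal type as contemplated in paragraph [00109]. The dose ranges are those disclosed above (1080: 5-12 mg; PAPP: 100-300 mg); the pairing of each toxin with an animal type and the midpoint dosing policy are assumptions made for the sketch.

    # Hypothetical dose-selection table (all values in milligrams).
    DOSE_TABLE_MG = {
        ("feral_cat", "1080"): (5, 12),    # effective 1080 dose disclosed above
        ("fox", "PAPP"): (100, 300),       # PAPP dose disclosed above; pairing assumed
    }

    def select_dose_mg(animal_type: str, toxin: str) -> float:
        """Return a dose within the disclosed range for the detected animal type."""
        low, high = DOSE_TABLE_MG[(animal_type, toxin)]
        return (low + high) / 2  # midpoint used as a placeholder dosing policy

    print(select_dose_mg("feral_cat", "1080"))  # 8.5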
[00111] In some embodiments, the actuation device 402 includes a visual or acoustic (audio lure) stimulus which may be actuated in response to the detection of the target animal 3000. The visual stimulus may include an LED light arrangement which is adapted to generate a pattern of light that is likely to attract or lure the target animal or, alternatively, scare non-target animals away from the apparatus.
[00112] An acoustic stimulus may be produced by a sound generator and fed through a loudspeaker, mimicking noises characteristic of or attractive to the target animal 3000, which may aid in luring the target animal to a vicinity of the apparatus 1000. The acoustic stimulus is programmable, enabling realistic sounds, including cat prey and mating calls for cats and foxes, to be broadcast at variable volumes and intervals to optimize the luring capacity of the apparatus 1000. The acoustic stimulus can be configured to play at certain times of the day, further improving the ability to lure target species.
Example operation
[00113] Referring to Figure 6, there is illustrated a flow chart depicting an exemplary method 6000 of operating apparatus 1000 for the detection of a target animal 3000. Initially, at step 6001, the apparatus 1000 enters a low power or quiescent state where the environment 2000 is monitored at a basic level. The low power state may be the default state that apparatus 1000 enters upon initialisation or after a predetermined period of no activity.
[00114] In this low power mode, apparatus 1000 performs basic monitoring of the environment 2000 and only certain functions of apparatus 1000 will be activated. By way of example, some of the sensors may be deactivated, one or more of the image sensors may operate at a lower frame rate and/or resolution, illumination devices (if installed) may be deactivated or turned down and other functions deactivated. Further, processor 104 may be configured to perform only basic image processing algorithms such as low resolution object detection, brightness variations or edge detection that draw relatively low power. Alternatively, sensors 102 and 103 may be deactivated and a low power sensor such as a motion sensor or acoustic sensor may be used to simply detect motion or sounds within the environment 2000.
[00115] In some embodiments, in the low power mode, only thermal image sensor 103 is activated and basic image processing is performed to detect thermal changes from the background environment. In some embodiments, in the low power mode a basic image classifier may be trained and executed on processor 104 to detect the normal background of environment 2000 being imaged by the image sensors. Minor changes across images, such as movement of branches or sunlight changes throughout the day, may be taken into account in this classifier to reduce the instance of false triggers. This basic classifier can more accurately determine when an animal enters the environment scene being imaged, which would substantially change the normal background that is imaged.
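A minimal sketch of the kind of low power background-change test described above follows. The mean-absolute-difference measure, threshold value and adaptation rate are assumptions for illustration, not the disclosed classifier.

    import numpy as np

    def background_changed(frame: np.ndarray, background: np.ndarray,
                           threshold: float = 12.0) -> bool:
        # Cheap low power mode check: has the imaged scene departed
        # substantially from the learned background?
        diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
        return float(diff.mean()) > threshold

    def update_background(background: np.ndarray, frame: np.ndarray,
                          alpha: float = 0.05) -> np.ndarray:
        # Slowly adapt the background so gradual changes (sunlight, swaying
        # branches) are absorbed rather than raising false triggers.
        return ((1 - alpha) * background + alpha * frame).astype(background.dtype)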
[00116] Where processor 104 includes multi-stage processor hardware, such as in a big.LITTLE architecture, the smaller processor and ISP engine may be used to perform the low power mode processes of step 6001.
[00117] The low power mode remains active until, at step 6002, a trigger signal is detected by processor 104. The trigger signal may include a trigger from a mechanical device such as a weight sensor, a detection of a shape or motion by one of the image sensors 102 and 103, or detection of a sound by acoustic sensor 108 or a separate motion sensor. Where a basic classifier is implemented by processor 104, the trigger may be upon detection of a change in the normal background that is imaged by image sensor 102 or 103. The trigger signal may also be received from another nearby apparatus or server 306 via communications module 303. For example, if a nearby apparatus has detected a target animal in the environment, it may transmit a trigger signal to apparatus 1000 and other nearby apparatus via server 306 to wake them from their low power state.
[00118] At step 6003, once a trigger signal is received, processor 104 switches apparatus 1000 into a first higher power mode in which additional functionality is activated. This mode, termed “Stage 1” in Figure 6, is configured to allow apparatus 1000 to detect whether an animal is present. Thus, the level of processing and power consumption is somewhat higher than that of the low power mode in order to perform the detection. By way of example, in the higher power mode of Stage 1, the image sensors may be configured to image the environment at a higher resolution and/or higher frame rate, deactivated sensors may be activated and illumination devices (if installed) may be activated. Further, processor 104 may be configured to implement more comprehensive image processing algorithms such as shape recognition, spectral analysis and a machine learned classifier in order to determine whether or not an animal is present.
[00119] In some embodiments, Stage 1 may include capturing one or more still frame (non-video) images and performing image processing on those images. In other embodiments, Stage 1 may include capturing a low frame rate video sequence and performing image processing on that sequence of images. Depending on the time of day, detected by the local clock, only one of image sensors 102 or 103 may be activated. By way of example, if processor 104 registers the current time of day as being daylight hours, visible image sensor 102 may be activated and IR sensor 103 deactivated. Alternatively, if processor 104 registers the current time of day as being night time, visible image sensor 102 may be deactivated and IR sensor 103 activated. In Stage 1, maintaining low power consumption is still of primary importance, as many false triggers may switch apparatus 1000 into Stage 1.
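A sketch of this time-of-day sensor selection follows; the 06:00-18:00 daylight window is an assumed placeholder, since the disclosure leaves the daylight determination to the local clock.

    from datetime import datetime

    def select_active_sensor(now: datetime,
                             day_start_hour: int = 6,
                             day_end_hour: int = 18) -> str:
        """Choose which image sensor to activate in Stage 1 from the local clock."""
        if day_start_hour <= now.hour < day_end_hour:
            return "visible_sensor_102"   # daylight: visible sensor on, IR sensor off
        return "ir_sensor_103"            # night: visible sensor off, IR sensor on

    print(select_active_sensor(datetime.now()))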
[00120] At step 6004, a determination is made of whether an animal has been detected or not from the processing of step 6003. This may include processor 104 determining a confidence measure and, if the confidence measure is above a threshold value, making a designation that an animal has been detected. By way of example, a threshold confidence value might be 70%, 80%, 90% or 95%. The confidence measure may be derived from the detection of known characteristics of animals, in comparison to characteristics of general movement within the environment. In many cases, the detected movement will not be due to an animal but rather motion within the environment. If no animal has been detected, the system returns to step 6001 and apparatus 1000 re-enters the low power mode. If an animal is detected, then system operation proceeds to step 6005, which includes a classification operation to classify the detected animal.
[00121] Animal detection at step 6004 may occur via a number of techniques including matching recorded data with data stored in a database. This includes basic shape or pattern recognition and comparison with a database of stored animal shapes, acoustic recognition of a stored animal call, thermal signature detection from IR images detected by IR sensor 103, and movement or motion detection amongst others.
[00122] Where processor 104 includes multi-stage processor hardware, such as in a big.LITTLE architecture, the larger processor and ISP engine may be used to perform the higher-level processing of step 6004.
[00123] The period from commencement of Stage 1 (upon triggering) to the determination at step 6004 is preferably only a period of milliseconds, such as less than 100 milliseconds. In some embodiments, the period is less than 50 milliseconds or less than 10 milliseconds. This rapid detection is preferable so as to be able to quickly detect a fast moving animal within the field of view of image sensors 102 and 103.
[00124] At the output of step 6004, apparatus 1000 does not know what type of animal has been detected, simply that an animal has been detected. This form of low-end processing allows apparatus 1000 to operate at low power and to return to the low power mode if some trigger other than an animal, such as movement of a tree, switches the device into the Stage 1 mode.
[00125] In some embodiments, upon detection of an animal at step 6004, apparatus 1000 may send a signal via communications module 303 to server 306 or directly to other nearby apparatus to alert them of the animal detection. This alert may, for example, trigger those devices to switch from the low power mode into the Stage 1 or a higher power mode for detecting the animal.
[00126] At step 6005, apparatus 1000 is switched into a Stage 2 “classification” mode of operation in which a target animal classification process is performed by processor 104. Stage 2 represents a more processor-intensive mode in which a higher level of processing occurs to classify the animal based on input received from the various sensors. Additional devices such as sensors and illumination devices may be activated or switched into a different mode. By way of example, image sensors 102 and 103 may be activated into higher frame rate and/or higher resolution modes to better capture characteristics of the animal such as shape, physical appearance and/or behaviour, the size of the animal, walking patterns, gait, movement speed and characteristics such as the texture, skin/fur patterns and colour of the animal. As such, Stage 2 represents a higher power mode than Stage 1, which, in turn, draws a higher power than the low power mode of step 6001.
[00127] In some embodiments, Stage 1 detection performs analysis on only a single image frame or a small number of images, while Stage 2 classification includes performing analysis on a sequence of video images. Analysis of video allows processor 104 to determine temporal characteristics of the animal such as movement gait and behaviour. In addition to additional or more comprehensive sensor input, Stage 2 classification includes a more comprehensive analysis of the data by processor 104. In some embodiments, Stage 2 classification includes performing a classifier algorithm such as that described above and illustrated in Figure 5.
[00128] At step 6006, processor 104 determines whether or not a target animal has been detected in the classification of step 6005. As with step 6004, this decision may be based on a confidence measure produced with the classification. If the confidence of the detected animal being a target animal is greater than a threshold confidence value, then a decision is made that a target animal has been detected. By way of example, a threshold confidence value might be 70%, 80%, 90% or 95%.
[00129] The detection of a target animal at step 6006 may not be limited to a single animal but may include a group of target animals (e.g. fox, feral cat, endangered pygmy possum) that are of interest to be monitored. A plurality of target animals may be stored in memory 107 and accessed by processor 104 for classification. Furthermore, the target animal classification at step 6005 may be simply to detect a species of the target animal or it may be to detect a subset (e.g. male or adult animals only), cohort or individual target animal.
[00130] If, at step 6006, a target animal is not detected, then the process returns to step 6001 and apparatus 1000 again enters the low power mode. If, at step 6006, a target animal is detected, then the process proceeds to optional step 6007 in which further processing is performed to determine a specific individual target animal. This step is designated as Stage 3 and is optional, as Stage 2 may be sufficient to detect an individual target animal. For example, an individual animal of interest may have known physical markings that can be detected in the Stage 2 classification. However, in some embodiments, particularly where an individual animal is difficult to distinguish from other animals of a species, Stage 3 classification can provide a further classification to determine if the target animal is the individual animal of interest. By way of example, a unique individual zebra or numbat may be detected by their specific arrangement of stripes. Similarly, spotted animals may be uniquely identified by the spot arrangements on their coats.
[00131] Stage 3 classification may include running a more advanced classifier algorithm such as a machine learnt algorithm trained on images of the individual animal. Stage 3 classification may be executed solely by processor 104 within apparatus 1000 (e.g. a larger processor and ISP engine of a two-stage or heterogeneous processor system such as in the big.LITTLE architecture) or may be executed wholly or in part by server 306 or another cloud server with higher processing power. Stage 3 classification may also include receiving inputs from other nearby apparatus which may have detected the individual animal, to consider movement patterns of the animal.
[00132] The Stage 3 classification at step 6007 determines a confidence value indicating how confidently the individual animal has been detected. If the confidence value is greater than a confidence threshold, then, at step 6008, processor 104 determines that the individual target animal has been detected. By way of example, a threshold confidence value might be 70%, 80%, 90% or 95%. If, at step 6008, the confidence value is lower than the threshold confidence value, then apparatus 1000 is returned to a lower power state such as the low power mode of step 6001.
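The staged escalation of method 6000 can be summarised in the following condensed sketch. The stage functions are placeholders for the increasingly expensive classifiers of steps 6003/6004, 6005/6006 and 6007/6008, and the single shared threshold is a simplifying assumption (the disclosure permits each stage its own threshold).

    CONFIDENCE_THRESHOLD = 0.9  # e.g. 70%, 80%, 90% or 95% per the disclosure

    def run_detection_pipeline(stage1, stage2, stage3, frames) -> str:
        """stage1/stage2/stage3 are callables returning a confidence in [0, 1]."""
        if stage1(frames) < CONFIDENCE_THRESHOLD:   # step 6004: any animal present?
            return "low_power"                      # step 6001: re-enter low power mode
        if stage2(frames) < CONFIDENCE_THRESHOLD:   # step 6006: is it a target animal?
            return "low_power"
        if stage3(frames) < CONFIDENCE_THRESHOLD:   # step 6008: the individual animal?
            return "low_power"
        return "respond"                            # step 6009: take responsive action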
[00133] Upon detection of an individual target animal at step 6008, at step 6009, one or more responsive actions are taken by apparatus 1000, depending on the animal detected. By way of example, if the individual animal detected is a problematic feral cat, the responsive action might be to dispense a poison from a dispenser in apparatus 1000 as described above and also in Australian patent 2016302398 entitled “Device for Proximal Targeting of Animals”. Example responsive actions taken at step 6009 include:
> Dispensing a poison compound onto the target animal (for pest animals);
> Dispensing food and/or water;
> Capturing one or more high resolution images or high frame rate videos of the detected animal for subsequent analysis;
> Alerting and/or activating other nearby apparatus of the individual animal;
> Alerting server 306 of the presence of the individual animal;
> Logging the presence, location and time (e.g. timestamp) of the individual animal in memory 107 or server 306;
> Alerting a pet monitoring service or owner of a pet (if a pet is identified);
> Alerting monitoring personnel;
> Activating a live feed of images from apparatus 1000 to monitoring personnel;
> Issuing an audio signal to scare, alert or attract the individual animal;
> Releasing a chemical compound that provides an olfactory lure to the animal;
> Deploy one or more drones or UAVs to an area where an animal was detected or is anticipated to be for monitoring by sensors on those drones/UAVs;
> Triggering a trap; and/or
> Activating one or more additional sensors such as acoustic sensors or RFID tag sensors to capture more data about the individual animal.
[00134] Finally, at step 6010, apparatus 1000 may optionally perform a verification that the action was successful. This may include processing a short sequence of images after the action to observe an outcome. By way of example, upon the dispensing of a poison, image sensor 102 is controlled to capture a short sequence of images and processor 104 processes the images to visibly identify that the poison was administered to the animal (e.g. poison observed to land on the animal’s body). In response, processor 104 may issue a verification to server 306 and/or store the verification in memory 107. In some embodiments, communications module 303 is controlled to transmit the short sequence of images captured after the action to server 306 for analysis and verification by a human operator.
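A sketch of the optional step 6010 verification follows; hit_detector, store and transmit are stand-ins for the image test applied by processor 104, storage to memory 107 and transmission to server 306 respectively, and their interfaces are assumptions.

    def verify_action(frames_after, hit_detector, store, transmit) -> bool:
        # Scan the short post-action image sequence for evidence that the
        # dispensed compound landed on the animal's body.
        verified = any(hit_detector(frame) for frame in frames_after)
        store({"action_verified": verified})  # record the outcome in memory 107
        transmit(frames_after)                # images to server 306 for human review
        return verified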
[00135] At or after each step of the detection and classification in method 6000, processor 104 may store relevant information in memory 107 and/or transmit information to server 306. This may include the detection of a particular animal or animal species for counting in a study, markings of the detected animal, movement patterns, direction of travel, gait, behaviour characteristics (e.g. injured), as well as biometric information such as age, size and gender. This information is valuable for the ongoing study of the ecosystem within environment 2000.
[00136] Although four different power modes are described above, it will be appreciated that apparatus 1000 is able to operate in more than or fewer than four power modes.
System level operation
[00137] Referring again to Figure 3, in operation, system 3500 is able to collectively monitor a large area of environment 2000 that extends significantly beyond that of the field of view of a single sensor apparatus 1000. In system 3500, apparatus 1000 may be deployed at spatially separated locations around environment 2000, particularly in locations where target animals are known or likely to be present. In some embodiments, at least a subset of the apparatus 1000 are fitted to drones so that they are mobile and can controllably move around environment 2000.
[00138] The apparatus 1000 are each preferably locally powered by on-board batteries, optionally supplemented with solar and/or wind turbine installations. However, in some embodiments, some or all of the apparatus 1000 may be powered by mains power. Further, each apparatus 1000 preferably communicates wirelessly with remote server 306 via communications module 303 and communicates only small amounts of data, over short periods of time and at predetermined time periods, so as to minimise power consumption of the apparatus 1000. However, in some embodiments, communications module 303 includes a wired connection such as USB, Ethernet or twisted pair cable to connect each apparatus 1000 with remote server 306 and/or with other apparatus via wired connections. The transmitted data may be compressed and encrypted by various data encryption algorithms known in the art. In other embodiments, the various apparatus 1000 are able to communicate directly with each other without communicating with server 306, such as in a mesh network.
[00139] In the illustrated embodiment of system 3500, server 306 acts as a central hub for collating data from each apparatus 1000 and performs system-level decision making, such as which apparatus to switch into higher or lower power modes, which apparatus are malfunctioning, and whether any apparatus needs to be relocated, serviced or replaced. Server 306 may also intermittently issue software updates to the various apparatus to, for example, update the classification algorithms to more accurately classify the current target animals or change the target animals to be classified. Server 306 may also monitor power levels of the respective batteries of each apparatus 1000 and issue alerts if a battery needs replacement or if an apparatus is offline.
[00140] In some embodiments, server 306 may also perform a higher level classification than that performed by apparatus 1000, such as the Stage 3 classification described above. By way of example, each apparatus may employ one or more static classifier algorithms which perform classification of target animals and feed these classifications and associated images, acoustic data and other data to server 306 for further processing. Server 306 may employ a dynamic machine learning algorithm which continues to learn based on an updated training dataset fed by the data received from each apparatus 1000. This may be particularly useful when the system 3500 is aiming to detect individual animals having distinct markings or the like. In this situation, the apparatus 1000 may be used to detect the particular species of animal and server 306 performs a higher level analysis of the data to determine whether the particular individual animal has been detected or not.
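The server-side learning loop described in this paragraph might be organised along the following lines; the retraining cadence and the callable interfaces are assumptions made for this sketch.

    def server_loop(receive_detection, training_set, retrain, push_update,
                    retrain_every: int = 1000) -> None:
        """Hypothetical loop at server 306: collate field detections, grow the
        training dataset, and periodically push updated classifiers back out."""
        seen = 0
        while True:
            event = receive_detection()        # images, acoustic data and local classification
            training_set.append(event)
            seen += 1
            if seen % retrain_every == 0:
                model = retrain(training_set)  # dynamic machine learning step
                push_update(model)             # OTA software/model update to apparatus 1000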
[00141] Employing server 306 to perform a higher level classification allows the classification software and hardware used in the apparatus 1000 to be kept relatively simple. This leads to reduced cost and power consumption by the apparatus 1000 and a longer field operating lifetime. In some embodiments, server 306 may be powered by mains power due to its higher power consumption. Server 306 may periodically issue software updates to the apparatus 1000 as the dynamic classification system used at the server becomes able to classify target animals more accurately and/or as the set of target animals to be identified and monitored changes.
[00142] The operation of system 3500 and apparatus 1000 preferably enables classification of a target animal in real-time or near real-time. In the context of the present invention, real-time or near real-time means a time frame that is in the order of milliseconds and preferably less than 100 milliseconds in order to be able to detect, image and classify a swiftly moving animal and perform an appropriate responsive action before the animal moves out of a responsive action zone proximal to the apparatus. This rapid timing is important as animal management can be very time-sensitive. In particular, efficient animal management requires identification of animals at multiple locations to be time-stamped and retrieved and centralized into a single database, where correlations between the detections can be made in order to understand the prevalence, movements and other statistics of the animal population. System 3500 is able to achieve this using the centralised server 306 as the central management engine. Server 306 may be cloud-based or include or communicate with a database that is cloud-based to alleviate the very real risks of data loss through transfer from local devices to more permanent data storage devices and during/following classification.
[00143] Server 306 may host or enable an interface or dashboard that is accessible by ecologists or other personnel to access and analyse the data gathered by system 3500. In environments where there are many thousands of animals, the system may provide an efficient interface, able to serve the needs of several key user roles including:
> the manager of the animal population;
> observers who wish to observe the manager and his/her actions;
> members of the public who may also be indirectly involved in the animal management process, for example as owners of domestic pets or citizen scientists assisting with research;
> ecological researchers who use the data to attempt to understand animal species;
> specialists who may wish/need to review and reclassify identifications or confirm them for official records; and
> product researchers who use the data to improve the software solution in the sensors or the management software.
[00144] In addition to hosting the dashboard, server 306 may also host management software capable of accessing a database to collate detection event data from groups of apparatus in the same geographic area. The management software may also include capability to upload and update sensor software and configurations using "over the air updates". The management software may provide the capability to remotely alert users of a particular category of animal detection event (for example a feral or domestic cat, or a particular species of wildlife, detected). The management software may also provide the capability for members of the public to observe or be notified about the identity of the animals being detected (for example to notify them that their cat or dog is detected).
[00145] When there are undefined numbers of animals over a large geographic area, and potentially multiple user roles in the management of the animals, system 3500 described above provides various benefits and advantages, including:
• Sensors that contain local processing and software able to accurately determine the animal species or an individual target animal. This enables rapid-response autonomous system actions for targeted species or individuals that would not otherwise be possible if the identification was performed remotely. Additionally, the local processing and software minimizes use of network bandwidth for communications, which are usually highly limited (and expensive) in remote areas.
• A plurality of sensors spatially dispersed in the environment in order to detect more animals in a wider variety of locations, habitats and times.
• Sensors that are both sensitive and specific (or accurate) in order to avoid false triggering/classifications cluttering the system, creating unnecessary costs, consuming battery life/power and memory storage and triggering erroneous actions.
• Being equipped with the means to photograph or film the animal in such a way as to allow for remote human observation and verification of the animal.
• The capability to avoid consumption of excess power by having one or more different power modes, each mode with a different power draw, with a standby or deep-sleep mode that enables long-term (or indefinite) operation in the environment without mains power.
• Capacity to harvest energy from the local environment via solar panels, a small scale wind turbine or the like in order to enable long-term (or indefinite) operation in the environment without mains power.
• Sensors that can wake up quickly in order to film or trap animals that are moving quickly (such as birds).
• Sensors that are equipped to locally discriminate living creatures from non-living objects, primarily to avoid false triggering by landscape features (including moving vegetation).
• Sensors that are equipped to classify the species of the animal quickly (within milliseconds), in order to trigger a trapping apparatus/management device nearby the sensor.
• Local storage capacity to record images and data related to a detection event for later retrieval, by network or manually by a worker.
• Capacity to connect to a network, for either upload, storage and classification of detection data, or download of new configurations and software to control the sensor, or both.
• The means to connect to a network to initiate management actions (playing lure, enabling trap etc) at adjacent apparatuses.
• Units equipped with audio, visual or olfactory lures or deterrents, able to be controlled by software, to attract or deter specific animal species to within operational range of the sensor.
• The means to detect proximity (and/or direction) of a tag device, e.g. to allow registered domestic animals to be discriminated from untagged individuals of the same species.
• The means to detect proximity of a tag device, to allow previously captured and tagged wild animals to be distinguished from other individuals of the same species.
• Sensors or remote management software that are equipped to recognize individual animals that have been previously identified (for example detecting a specific individual tiger in a population of tigers).
• Management software that includes a database to collate detection event data from groups of sensors in the same geographic area.
• Management software that includes the means to upload and update sensor software and configurations using “over the air updates”.
• Management software that offers the means to remotely alert users of a particular category of animal detection event (for example a feral or domestic cat, or a particular species of wildlife, detected).
• Management software that offers the means for members of the public to observe or be notified about the identity of the animals being detected (for example to notify them that their cat or dog is detected).
[00146] Overall, the system level operation of system 3500 of apparatus 1000 provides for the identification, monitoring and management of animals within a potentially wide geographic area with minimal manual intervention by field personnel. The processes of system 3500 are largely automated and various management processes can be instigated automatically based on predefined rules and processes. Data can be centrally managed and monitored, with the capability for field personnel to modify the rules and processes implemented by system 3500, such as to modify the management processes and/or vary the target animals being monitored. The collation of the spatial information from the system of apparatus allows for accurate determination of whether there is one or more than one problematic animal within the environment. This level of information is not possible from a single sensor apparatus in the field.
Interpretation
[00147] The term “infrared” is used throughout the description and specification. Within the scope of this specification, infrared refers to the general infrared area of the electromagnetic spectrum which includes near infrared, infrared and far infrared frequencies or light waves.
[00148] Unless specifically stated otherwise, as apparent from the following discussions, it is appreciated that throughout the specification discussions utilizing terms such as "processing," "computing," "calculating," "determining", "analyzing" or the like, refer to the action and/or processes of a computer or computing system, or similar electronic computing device, that manipulate and/or transform data represented as physical, such as electronic, quantities into other data similarly represented as physical quantities.
[00149] In a similar manner, the term “controller” or "processor" may refer to any device or portion of a device that processes electronic data, e.g., from registers and/or memory to transform that electronic data into other electronic data that, e.g., may be stored in registers and/or memory. A “computer” or a “computing machine” or a "computing platform" may include one or more processors.
[00150] Reference throughout this specification to “one embodiment”, “some embodiments” or “an embodiment” means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Thus, appearances of the phrases “in one embodiment”, “in some embodiments” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures or characteristics may be combined in any suitable manner, as would be apparent to one of ordinary skill in the art from this disclosure, in one or more embodiments.
[00151] As used herein, unless otherwise specified the use of the ordinal adjectives "first", "second", "third", etc., to describe a common object, merely indicate that different instances of like objects are being referred to, and are not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
[00152] In the claims below and the description herein, any one of the terms comprising, comprised of or which comprises is an open term that means including at least the elements/features that follow, but not excluding others. Thus, the term comprising, when used in the claims, should not be interpreted as being limitative to the means or elements or steps listed thereafter. For example, the scope of the expression a device comprising A and B should not be limited to devices consisting only of elements A and B. Any one of the terms including or which includes or that includes as used herein is also an open term that also means including at least the elements/features that follow the term, but not excluding others. Thus, including is synonymous with and means comprising.
[00153] It should be appreciated that in the above description of exemplary embodiments of the disclosure, various features of the disclosure are sometimes grouped together in a single embodiment, Fig., or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claims require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the Detailed Description are hereby expressly incorporated into this Detailed Description, with each claim standing on its own as a separate embodiment of this disclosure.
[00154] Furthermore, while some embodiments described herein include some but not other features included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the disclosure, and form different embodiments, as would be understood by those skilled in the art. For example, in the following claims, any of the claimed embodiments can be used in any combination.
[00155] In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the disclosure may be practiced without these specific details. In other instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
[00156] Similarly, it is to be noticed that the term coupled, when used in the claims, should not be interpreted as being limited to direct connections only. The terms "coupled" and "connected," along with their derivatives, may be used. It should be understood that these terms are not intended as synonyms for each other. Thus, the scope of the expression a device A coupled to a device B should not be limited to devices or systems wherein an output of device A is directly connected to an input of device B. It means that there exists a path between an output of A and an input of B which may be a path including other devices or means. "Coupled" may mean that two or more elements are either in direct physical, electrical or optical contact, or that two or more elements are not in direct contact with each other but yet still co-operate or interact with each other.
[00157] Embodiments described herein are intended to cover any adaptations or variations of the present invention. Although the present invention has been described and explained in terms of particular exemplary embodiments, one skilled in the art will realize that additional embodiments can be readily envisioned that are within the scope of the present invention.

Claims

What is claimed is:
1. An apparatus configured to identify a target animal in an environment in real-time, the apparatus including: a housing; one or more image sensors mounted on or within the housing and configured to capture images of a region of the environment; and a processor contained within the housing and configured to process the images to identify a target animal in real-time; wherein the apparatus is configured to operate in a low power mode and a higher power mode; wherein, in the low power mode, the one or more image sensors are controlled to capture images in a low power mode and the processor is configured to detect a change in the environment from the images and, in response, trigger the higher power mode; and wherein, in the higher power mode, the one or more image sensors are controlled to capture images in a higher power mode than in the low power mode and the processor is configured to identify a target animal in the environment.
2. The apparatus of claim 1 wherein, in the low power mode, the one or more image sensors are configured to capture images at a lower resolution and/or a lower frame rate than in the high power mode.
3. The apparatus of claim 1 or claim 2 wherein, in the higher power mode, the processor is configured to classify a detected animal as a target animal.
4. The apparatus of any one of the preceding claims configured to operate in a second higher power mode in which the processor performs a classification to determine if a detected animal is a target animal.
5. The apparatus of claim 4 wherein, in the second higher power mode, the processor is configured to classify the animal as an individual target animal.
6. The apparatus of claim 4 or claim 5 including a communications module configured to transmit one or more of the captured images to a remote server and wherein the remote server performs a classification to identify if a detected animal is a target animal.
7. The apparatus of any one of the preceding claims wherein switching between the low power mode and higher power mode occurs within a period of less than 100 milliseconds.
8. An apparatus configured to perform real-time identification of an individual target animal in an environment, the apparatus including: a housing; one or more image sensors mounted on or within the housing and configured to capture one or more 2D or 3D images of the environment; a processor contained within the housing and configured to process the one or more images to identify the individual target animal in real time based on one or more detected visual characteristics in the one or more images and, in response, generate an identification data packet; and a communications module configured to communicate the identification data packet to a remote server which is in communication with other apparatus.
9. The apparatus of any one of the preceding claims, including a GPS device configured to determine a location of the apparatus.
10. The apparatus of any one of the preceding claims, wherein at least one of the one or more image sensors is a thermal image sensor capable of imaging in the infrared wavelength range to detect thermal characteristics.
11. The apparatus of claim 10, wherein the thermal image sensor is calibrated to detect a temperature of the target animal.
12. The apparatus of claim 11, wherein the processor is configured to use the temperature of the target animal and background to determine a health status of the target animal.
13. The apparatus of any one of the preceding claims, including an acoustic sensor to detect acoustic characteristics within the environment.
14. The apparatus of claim 13 wherein the processor is configured to detect a presence of or identify the target animal at least in part by one or more acoustic characteristics that match or closely match an acoustic characteristic indicative of the target animal.
15. The apparatus of any one of the preceding claims, including a particle detector or chemical analyser configured to detect scent characteristics within the environment.
16. The apparatus of claim 15, wherein the processor is configured to detect a presence of or identify the target animal at least in part by determining one or more scent signatures of the target animal based on the detected scent characteristics.
17. The apparatus of any one of the preceding claims including a battery and wherein the one or more image sensors and the processor are powered locally by the battery.
18. The apparatus of any one of the preceding claims, wherein the processor is implemented on a system-on-chip (SoC) device.
19. The apparatus of claim 18 wherein the one or more image sensors are also implemented on the SoC device.
20. The apparatus of any one of the preceding claims wherein the processor includes a clock to determine the current time at the location.
21. The apparatus of any one of the preceding claims, wherein the processor is configured to execute a machine learned classifier to identify a target animal.
22. The apparatus of claim 21 wherein the processor is configured to execute a neural network classifier algorithm.
23. The apparatus of any one of the preceding claims wherein the detected visual characteristics include a shape of the target animal.
24. The apparatus of any one of the preceding claims wherein the detected visual characteristics include one or more predefined movements or behavioural characteristics of the target animal over a plurality of images.
25. The apparatus of any one of the preceding claims wherein the detected visual characteristics include thermal characteristics of the target animal.
26. The apparatus of any one of the preceding claims wherein the detected visual characteristics include a brightness or reflectivity or absorbance characteristic of the target animal.
27. The apparatus of any one of the preceding claims wherein the detected visual characteristics include a distinct colour and/or marking of the target animal.
28. The apparatus of any one of claims 1 to 7 wherein the target animal includes a predefined animal species.
29. The apparatus of any one of claims 1 to 7 wherein the target animal is an individual animal.
30. The apparatus of any one of the preceding claims including a wireless identifier configured to detect pre-stored data from a wireless tag associated with one or more animals.
31. The apparatus of claim 8 configured to operate in one of a low power mode or a high power mode.
32. The apparatus of claim 31 wherein, in the low power mode, the one or more image sensors are configured to capture images at a lower frame rate and/or a lower resolution.
33. The apparatus of claim 31 or claim 32 wherein the high power mode is activated by a trigger signal.
34. The apparatus of claim 33 wherein the trigger signal is based on a mechanical or electromechanical trigger associated with the apparatus.
35. The apparatus of claim 33 wherein the trigger signal is based on detection of one or more trigger visual characteristics in the one or more images.
36. The apparatus of claim 35 wherein the one or more trigger visual characteristics include movement characteristics in the images.
37. The apparatus of claim 35 or claim 36 wherein the one or more trigger visual characteristics include detection of a predefined shape or shapes in the images.
38. The apparatus of any one of claims 35 to 37 wherein the one or more trigger visual characteristics include detection of a predefined temperature or brightness in the images.
39. The apparatus of claim 33 wherein the trigger signal is based on detection of one or more trigger acoustic characteristics detected by an acoustic sensor.
40. The apparatus of claim 31 wherein, in the low power mode, the one or more image sensors are deactivated.
41 . The apparatus of any one of the preceding claims including an illumination device configured to selectively illuminate at least a part of the environment.
42. The apparatus of any one of the preceding claims including one or more of a light level sensor, humidity sensor and/or a temperature sensor.
43. The apparatus of any one of the preceding claims including an actuation device responsive to a sensor signal generated from the processor for initiating an action in response to identification of the target animal.
44. The apparatus of claim 43 wherein the actuation device includes a dispenser for dispensing a compound onto the target animal.
45. The apparatus of claim 43 wherein the actuation device includes a visual or acoustic stimulus actuated in response to the identification of the target animal.
46. The apparatus of any one of the preceding claims, including a sound generator adapted to generate sounds to lure the target animal.
47. The apparatus of any one of the preceding claims, including a visual lure to lure the target animal toward the apparatus.
48. The apparatus of claim 8 wherein the communications module is further adapted to communicate the characteristics and/or the presence of a target animal to other nearby apparatus for detecting or identifying animals.
49. The apparatus of claim 48 wherein the communications device is adapted to send and receive data to a remote database at predetermined time periods.
50. The apparatus of any one of the preceding claims incorporated into a drone or UAV which is controllably moveable around the environment.
51. An actuation device including one or more actuators responsive to a sensor signal generated from an apparatus of any one of the preceding claims.
52. A system configured to detect a presence of a target animal in an environment, the system including a plurality of apparatuses of claim 8, wherein the communications module is adapted to communicate the identification of a target animal to one or more others of the apparatuses either directly or via a remote server.
53. The system of claim 52 wherein, upon receipt of an identification of a target animal from a nearby apparatus, one or more other apparatus are controlled from a lower power mode into one or more high power modes.
54. An apparatus configured to detect a presence of a target animal in an environment in real time, the apparatus including: a housing; one or more sensors mounted on or within the housing and configured to sense one or more characteristics of the environment; a processor contained within the housing and configured to process the one or more images and environment characteristics to detect a presence of a target animal in real time; and a communication device adapted to communicate the environment characteristics and/or the presence of a target animal to other nearby apparatus to switch the other nearby apparatus into a monitoring mode.
55. The apparatus of claim 54 wherein the one or more sensors include one or more of image sensors, acoustic sensors, temperature sensors, particle sensors and/or chemical analysers.
56. An animal monitoring system for monitoring animals within an environment, the system including a plurality of animal monitoring apparatus positioned at spatially separated locations within the environment, each animal monitoring apparatus including: one or more image sensors mounted on or within the housing and configured to capture images of a region of the environment; and a processor contained within the housing and configured to process the images to identify a target animal in real-time; and a communications module for communicating the identification of a target animal to one or more others of the animal monitoring apparatus.
57. The system of claim 56 including a server configured to communicate with the communications modules of the animal monitoring apparatus to receive the detection of a target animal from one of the animal monitoring apparatus.
58. The system of claim 56 wherein, upon identification of a target animal from one of the animal monitoring apparatus, the server is configured to receive images from the animal monitoring apparatus and process those images to classify the identified target animal as a subgroup of the target animals or an individual target animal.
59. The system of any one of claims 56 to 58 wherein, upon identification of a target animal by one of the animal monitoring apparatus, the communications module of that apparatus sends a signal to one or more other animal monitoring apparatus to switch that apparatus from a low power mode into a higher power mode.
60. The system of any one of claims 56 to 59 wherein one or more of the animal monitoring apparatus are incorporated onto a drone or UAV device.
61. An apparatus configured to detect a presence of a target animal in an environment in real time, the apparatus including: a housing; one or more image sensors mounted on or within the housing and configured to capture images of a region of the environment; a processor contained within the housing and configured to process the images to detect a presence of an animal and/or classify an animal in real-time; and a communications device configured to transmit at least a subset of the captured images to a remote server upon detection of an animal by the processor, wherein the remote server is configured to process the images to classify the animal as a target animal.
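Claim 61 splits the pipeline: cheap on-board detection gates an upload, and the remote server performs the heavier target classification. A minimal sketch under assumed helpers; detect_animal, select_subset and upload_to_server are hypothetical, not disclosed code.

```python
# Illustrative edge-side gating for claim 61: only a subset of captured
# images is transmitted, and only after an animal is detected on-board.
def process_frames(frames, detect_animal, select_subset, upload_to_server):
    if any(detect_animal(frame) for frame in frames):
        upload_to_server(select_subset(frames))  # the server classifies the animal
```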
62. An apparatus configured to identify a target animal in an environment in real-time, the apparatus including: a housing; one or more image sensors mounted on or within the housing and configured to capture images of a region of the environment; an acoustic sensor configured to sense acoustic signals; and a processor contained within the housing and configured to process the acoustic signals to detect acoustic signals indicative of an animal and process the images to identify a target animal in real-time; wherein the apparatus is configured to operate in a low power mode and a higher power mode; wherein, in the low power mode, the one or more image sensors are deactivated and the processor is configured to detect acoustic signals indicative of an animal and, in response, trigger the higher power mode; and wherein, in the higher power mode, the one or more image sensors are activated and controlled to capture images and the processor is configured to identify a target animal in the environment.
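Claim 62 describes a duty-cycled state machine: acoustic sensing runs continuously in the low power mode, and image capture only runs once a sound indicative of an animal is heard. The sketch below assumes hardware interfaces (microphone, camera) and detector callables that the claim does not specify.

```python
# Hypothetical control loop for the two power modes of claim 62.
def run_apparatus(microphone, camera, sounds_like_animal, identify_target):
    low_power = True
    while True:
        if low_power:
            camera.power_off()                      # image sensors deactivated
            if sounds_like_animal(microphone.read()):
                low_power = False                   # trigger the higher power mode
        else:
            camera.power_on()                       # image sensors activated
            if identify_target(camera.capture()) is None:
                low_power = True                    # no target found: drop back
```

Dropping back to the low power mode after an unsuccessful identification is an assumption; the claim itself only recites the upward transition.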
PCT/AU2022/050627 2021-06-21 2022-06-21 A system and apparatus for animal management WO2022266705A1 (en)

Priority Applications (1)

Application Number: AU2022297009A (publication AU2022297009A1); Priority Date: 2021-06-21; Filing Date: 2022-06-21; Title: A system and apparatus for animal management

Applications Claiming Priority (2)

Application Number: AU2021901865; Priority Date: 2021-06-21
Application Number: AU2021901865A (publication AU2021901865A0); Priority Date: 2021-06-21; Title: A System and Apparatus for Animal Management

Publications (1)

Publication Number: WO2022266705A1 (en); Publication Date: 2022-12-29

Family

ID=84543784

Family Applications (1)

Application Number: PCT/AU2022/050627 (publication WO2022266705A1); Priority Date: 2021-06-21; Filing Date: 2022-06-21; Title: A system and apparatus for animal management

Country Status (2)

Country Link
AU (1) AU2022297009A1 (en)
WO (1) WO2022266705A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160026895A1 (en) * 2014-07-23 2016-01-28 Metecs, LLC Alerting system for automatically detecting, categorizing, and locating animals using computer aided image comparisons
US20160277688A1 (en) * 2015-03-18 2016-09-22 The Samuel Roberts Noble Foundation, Inc. Low-light trail camera
US20170079260A1 (en) * 2014-04-18 2017-03-23 Hogman-Outdoors, Llc Game Alert System
US20180000575A1 (en) * 2016-06-29 2018-01-04 International Business Machines Corporation Unmanned aerial vehicle-based system for livestock parasite amelioration
WO2020037377A1 (en) * 2018-08-24 2020-02-27 OutofBox Solutions Tech Pty Ltd A detection system
US20210176982A1 (en) * 2019-12-11 2021-06-17 Plano Molding Company, Llc Camera system and method for monitoring animal activity
US20210259235A1 (en) * 2020-02-24 2021-08-26 Sony Corporation Detection of animal intrusions and control of a repellent mechanism for detected animal intrusions
WO2022040366A1 (en) * 2020-08-18 2022-02-24 IntelliShot Holdings, Inc. Automated threat detection and deterrence apparatus

Also Published As

Publication Number: AU2022297009A1 (en); Publication Date: 2023-12-21

Similar Documents

Publication Publication Date Title
EP3466256B1 (en) Machine for capturing, counting and monitoring of insects
US10512260B2 (en) Method and apparatus for automated animal trapping
US11659826B2 (en) Detection of arthropods
US11617353B2 (en) Animal sensing system
Hadjur et al. Toward an intelligent and efficient beehive: A survey of precision beekeeping systems and services
US20190166823A1 (en) Selective Action Animal Trap
US20160277688A1 (en) Low-light trail camera
AU2016302398B2 (en) Device for proximal targeting of animals
CN102282570A (en) System and method for stereo-view multiple animal behavior characterization
EP3481190A1 (en) Pest deterrent system
US20220361471A1 (en) Intelligent insect trap and monitoring system
US20210329891A1 (en) Dynamic laser system reconfiguration for parasite control
US20230102968A1 (en) Selective Predator Incapacitation Device & Methods (SPID)
Janani et al. Human-Animal Conflict Analysis and Management-A Critical Survey
Sundaram et al. Integrated animal monitoring system with animal detection and classification capabilities: a review on image modality, techniques, applications, and challenges
WO2022266705A1 (en) A system and apparatus for animal management
Chiwamba et al. An application of machine learning algorithms in automated identification and capturing of fall armyworm (FAW) moths in the field
US20210315186A1 (en) Intelligent dual sensory species-specific recognition trigger system
KR100688243B1 (en) System for capturing wild animal using by remote control
US11557142B1 (en) Home wildlife deterrence
Pillewan et al. Review on design of smart domestic farming based on Internet of Things (IoT)
Lathesparan et al. Real-time animal detection and prevention system for crop fields
Wang Intelligent UAVs for Pest Bird Control in Vineyards
Bello An overview of animal behavioral adaptive frightening system
Bhusal Unmanned Aerial System (UAS) for Bird Damage Control in Wine Grapes

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (ref document number: 22826880; country of ref document: EP; kind code of ref document: A1)
WWE Wipo information: entry into national phase (ref document number: 806371, country: NZ; ref document number: 2022297009, country: AU; ref document number: AU2022297009, country: AU)
ENP Entry into the national phase (ref document number: 2022297009; country of ref document: AU; date of ref document: 20220621; kind code of ref document: A)
NENP Non-entry into the national phase (ref country code: DE)