WO2022217316A1 - Animal monitoring device - Google Patents

Animal monitoring device

Info

Publication number
WO2022217316A1
WO2022217316A1 (PCT/AU2022/050336)
Authority
WO
WIPO (PCT)
Prior art keywords
animal
movement
attribute
signal
sensor
Prior art date
Application number
PCT/AU2022/050336
Other languages
French (fr)
Inventor
Jeremy BOCKNEK
Tat-Ming TAM
Original Assignee
Alpha Vet Tech Holdings Pty Ltd
Priority date
Filing date
Publication date
Priority claimed from AU2021901107A external-priority patent/AU2021901107A0/en
Application filed by Alpha Vet Tech Holdings Pty Ltd filed Critical Alpha Vet Tech Holdings Pty Ltd
Publication of WO2022217316A1 publication Critical patent/WO2022217316A1/en

Classifications

    • A61B5/0022 Monitoring a patient using a global network, e.g. telephone networks, internet
    • A61B5/08 Detecting, measuring or recording devices for evaluating the respiratory organs
    • A01K11/006 Automatic identification systems for animals, e.g. electronic devices, transponders for animals
    • A01K29/005 Monitoring or measuring activity, e.g. detecting heat or mating
    • A61B5/0004 Remote monitoring of patients using telemetry, characterised by the type of physiological signal transmitted
    • A61B5/01 Measuring temperature of body parts; Diagnostic temperature sensing, e.g. for malignant or inflamed tissue
    • A61B5/021 Measuring pressure in heart or blood vessels
    • A61B5/024 Detecting, measuring or recording pulse rate or heart rate
    • A61B5/02438 Detecting, measuring or recording pulse rate or heart rate with portable devices, e.g. worn by the patient
    • A61B5/0816 Measuring devices for examining respiratory frequency
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1455 Measuring characteristics of blood in vivo using optical sensors, e.g. spectral photometrical oximeters
    • A61B5/14552 Details of sensors specially adapted therefor (optical measurement of blood gases)
    • A61B5/6822 Specially adapted to be attached to a specific body part: Neck
    • A61B5/6831 Means for maintaining contact with the body: Straps, bands or harnesses
    • A61B5/7267 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems, involving training the classification device
    • A61B5/743 Displaying an image simultaneously with additional graphical information, e.g. symbols, charts, function plots
    • G06N20/00 Machine learning
    • G06V40/15 Biometric patterns based on physiological signals, e.g. heartbeat, blood flow
    • G16H10/60 ICT specially adapted for the handling or processing of patient-specific data, e.g. for electronic patient records
    • G16H50/20 ICT specially adapted for computer-aided diagnosis, e.g. based on medical expert systems
    • A01K27/001 Leads or collars, e.g. for dogs: Collars
    • A61B2503/40 Evaluating a particular growth phase or type of persons or animals: Animals
    • A61B2560/0242 Operational features adapted to measure environmental factors, e.g. temperature, pollution
    • A61B2562/0219 Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • A61B2562/0233 Special features of optical sensors or probes classified in A61B5/00
    • A61B2562/06 Arrangements of multiple sensors of different types
    • A61B2562/16 Details of sensor housings or probes; Details of structural supports for sensors

Definitions

  • the present invention relates to a device and method for animal monitoring, and in particular for monitoring non-human animals.
  • an aspect of the present invention seeks to provide a system for monitoring an animal, including: at least one attribute sensor worn by the animal; at least one movement sensor worn by the animal; and, one or more electronic processing devices configured to: receive at least one attribute signal indicative of a biological attribute of the animal from the at least one attribute sensor; receive a movement signal from the at least one movement sensor; process the at least one attribute signal at least partially in accordance with the movement signal; and, generate at least one indicator at least partially indicative of the biological attribute based on the processed attribute signal.
  • the at least one attribute sensor is at least one of: worn on a neck region; worn on a tail of the animal; attached to a collar; and, attached to a tail piece.
  • the at least one movement sensor is at least one of: worn on a neck region; worn on a tail of the animal; attached to a collar; and, attached to a tail piece.
  • the tail piece includes: a body; and, an arm pivotally mounted to the body and movable between: an open position to allow the tail piece to be positioned on the tail; and, a closed position in which the tail is retained in the tail piece.
  • the tail piece includes a sensor in an underside of the body and a number of deformable fins on an underside of the arm configured to engage the tail and urge the tail into engagement with an underside of the body.
  • the at least one attribute signal includes a plurality of optical signals and wherein the one or more processing devices are configured to: process at least one optical signal of the plurality of optical signals by selecting an optical signal based on the movement signal; and, generate the at least one indicator based on the selected at least one optical signal.
  • each of the plurality of optical signals are measured at a different wavelength, and wherein the one or more processing devices are configured to select the at least one optical signal by selecting the optical signal with a wavelength that is least affected by movements.
  • the one or more processing devices are configured to process the at least one optical signal by selecting an optical signal with a wavelength that has at least one of: a predetermined depth of penetration; and, a predetermined tolerance to skin pigmentation.
  • the one or more processing devices are configured to: analyse the movement signals to determine at least one movement parameter indicative of at least one of: a type of movement; a direction of movement; a degree of movement; a movement frequency; a movement pattern; and, a pose; and, process the at least one attribute signal based on the at least one movement parameter.
  • the one or more processing devices are configured to analyse the movement signal using at least one of: adaptive filtering; fuzzy logic; autocorrelation; machine learning; and, pattern recognition.
  • the one or more processing devices are configured to process the at least one attribute signal by filtering the at least one attribute signal based on the movement parameter.
  • the one or more processing devices are configured to: analyse the at least one attribute signal based on the movement signal; and, generate at least one indicator based on the analysed signal.
  • the one or more processing devices are configured to process the at least one attribute signal by: determining one or more features derived from the animal; using the features and at least one computational model to determine the indicator, the at least one computational model being at least partially indicative of a relationship between different features and different attributes.
  • the one or more processing devices are configured to: apply machine learning to reference features derived from one or more reference animals having known attributes; and, apply machine learning to features derived from the animal.
  • the one or more processing devices are configured to: develop a generic model by applying machine learning to reference features derived from one or more reference animals having known attributes; and, modify a generic model to create a subject specific model by applying machine learning to features derived from the animal.
  • the at least one indicator includes: a heart rate of the animal; an oxygen level of the animal; and, a status of the animal.
  • the one or more processing devices are configured to generate a representation of the at least one indicator for display.
  • the representation includes at least one of: a numerical representation; a trend line; a scale; and, a meter gauge.
  • the one or more processing devices are configured to: determine information indicative of a physical parameter of the animal; and, process the at least one attribute signal at least partially in accordance with the physical parameter.
  • the physical parameter includes at least one of: an anaesthesia status; a medication record; a tail length; at least one tail circumference measurement, including a first measurement at a base of the tail and a second measurement approximately 5cm from the base of the tail; a skin colour where a pulse-oximetry sensor is located; a shaved status; a fur length or texture; a gender; and, a blood pressure including at least one of: SBP; DBP; and, MAP.
  • the information indicative of the physical parameter includes information determined from an image of the animal.
  • the one or more processing devices are configured to: determine information indicative of an environmental parameter of the animal; and, process the at least one attribute signal at least partially in accordance with the environmental parameter.
  • the environmental parameter includes at least one of: environmental pressure; air quality indicator; pollen; humidity; and, altitude.
  • the attribute sensor is at least one of: a pulse-oximetry sensor; a temperature sensor; a heart rate sensor; and, a respiration sensor.
  • the biological attribute includes at least one of: an oxygen level; a temperature; a heart rate; and, a respiration rate.
  • the movement sensor includes at least one of: an accelerometer; and, a gyroscope.
  • the one or more processing devices are at least one of: integrated into an external electronic device and wirelessly connected to the at least one attribute sensor and/or the at least one movement sensor; and, integrated into the collar and/or the tail piece.
  • the external electronic device is at least one of: a smart phone; a tablet; a desk-top computer; and, a handheld device.
  • an aspect of the present invention seeks to provide a method for monitoring an animal, the method including, in one or more electronic processing devices: receiving at least one attribute signal indicative of a biological attribute of the animal from at least one attribute sensor worn by the animal, receiving a movement signal from at least one movement sensor worn by the animal; processing the at least one attribute signal at least partially in accordance with the movement signal; and, generating at least one indicator at least partially indicative of the biological attribute based on the processed attribute signal.
  • the at least one attribute signal includes a plurality of optical signals and wherein the method includes, in the one or more electronic processing devices: processing at least one optical signal of the plurality of optical signals by selecting an optical signal based on the movement signal; and, generating the at least one indicator based on the selected at least one optical signal.
  • each of the plurality of optical signals is measured at a different wavelength;
  • the method includes, in the one or more electronic processing devices, selecting the at least one optical signal by selecting the optical signal with a wavelength that is least affected by movements.
  • the method includes, in the one or more electronic processing devices, processing the at least one optical signal by selecting an optical signal with a wavelength that has at least one of: a predetermined depth of penetration; and, a predetermined tolerance to skin pigmentation.
  • the method further includes, in the one or more electronic processing devices: analysing the movement signals to determine at least one movement parameter indicative of at least one of: a type of movement; a direction of movement; a degree of movement; a movement frequency; a movement pattern; and, a pose; and, processing the at least one attribute signal based on the at least one movement parameter.
  • the method further includes, in the one or more electronic processing devices, analysing the movement signals using at least one of: adaptive filtering; fuzzy logic; autocorrelation; machine learning; and, pattern recognition.
  • the method includes, in the one or more electronic processing devices, processing the at least one attribute signal by filtering the at least one attribute signal based on the movement parameter.
  • the method further includes, in the one or more electronic processing devices, processing the at least one attribute signal by: analysing the at least one attribute signal based on the movement signal; and, generating the at least one indicator based on the analysed signal.
  • the method further includes, in the one or more electronic processing devices, processing the at least one attribute signal by: determining one or more features derived from the animal; using the features and at least one computational model to determine the indicator, the at least one computational model being at least partially indicative of a relationship between different features and different attributes.
  • the method further includes, in the one or more electronic processing devices: applying machine learning to reference features derived from one or more reference animals having known attributes; and, applying machine learning to features derived from the animal.
  • the method further includes, in the one or more electronic processing devices: developing a generic model by applying machine learning to reference features derived from one or more reference animals having known attributes; and, modifying a generic model to create a subject specific model by applying machine learning to features derived from the animal.
  • the at least one indicator includes: a heart rate of the animal; an oxygen level of the animal; and, a status of the animal.
  • the method further includes, in the one or more electronic processing devices, generating a representation of the at least one indicator for display.
  • the representation includes at least one of: a numerical representation; a trend line; a scale; and, a meter gauge.
  • the method further includes, in the one or more electronic processing devices: determining information indicative of a physical parameter of the animal; and, processing the at least one attribute signal at least partially in accordance with the physical parameter.
  • the physical parameter includes at least one of: an anaesthesia status; a medication record; a tail length; at least one tail circumference measurement, including a first measurement at a base of the tail and a second measurement approximately 5cm from the base of the tail; a skin colour where a pulse-oximetry sensor is located; a shaved status; a fur length or texture; a gender; and, a blood pressure including at least one of: SBP; DBP; and, MAP.
  • the information indicative of the physical parameter includes information determined from an image of the animal.
  • the method further includes, in the one or more electronic processing devices: determining information indicative of an environmental parameter; and, processing the at least one attribute signal at least partially in accordance with the environmental parameter.
  • the environmental parameter includes at least one of: environmental pressure; air quality indicator; pollen; humidity; and, altitude.
  • the attribute sensor is at least one of: a pulse-oximetry sensor; a temperature sensor; a heart rate sensor; and, a respiration sensor.
  • the biological attribute includes at least one of: an oxygen level; a temperature; a heart rate; and, a respiration rate.
  • the movement sensor includes at least one of: an accelerometer; and, a gyroscope.
  • Figure 1 is a schematic diagram of a first example of a system for use in monitoring an animal;
  • Figure 2 is a flowchart of a first example of a method for use in monitoring an animal using the system of Figure 1;
  • Figure 3 is a schematic diagram of an example of a network architecture;
  • Figure 4 is a schematic diagram of an example of a sensor for use in monitoring an animal;
  • Figure 5 is a schematic diagram of a second example of a system for use in monitoring an animal;
  • Figures 6A and 6B are a flowchart of a second example of a method for use in monitoring an animal using the system of Figure 5;
  • Figure 7 is a flowchart of an example of a method for use in processing signals using the system of Figure 5;
  • Figures 8A and 8B are perspective views of a collar piece of a system for use in monitoring an animal;
  • Figures 9A and 9B are perspective views of a tail piece of a system for use in monitoring an animal;
  • Figure 10A is a schematic diagram of a sensor module in a tail piece;
  • Figure 10B is a schematic diagram of an optical sensor of the sensor module;
  • Figure 11 shows other exemplary arrangements of the optical sensors of the sensor module;
  • Figure 12 is a schematic diagram of the representations of the animal monitoring system;
  • Figure 13 is a schematic diagram of the representations of the animal monitoring system;
  • Figure 14 is a schematic diagram of the representations of the animal monitoring system;
  • Figure 15 is a schematic diagram of the representations of the animal monitoring system; and
  • Figure 16 is a schematic diagram of the representations of the animal monitoring system.
  • the system 100 includes at least one attribute sensor 111 worn by the animal, and one or more movement sensors 112 worn by the animal.
  • the attribute sensor could include a heart rate sensor, a temperature sensor and a pulse-oximeter.
  • the movement sensor could include an accelerometer, gyroscope, inertial measurement unit, or the like.
  • the at least one attribute sensor 111 and/or the at least one movement sensor 112 may be worn on a neck region of the animal and/or worn on a tail of the animal, and may be attached to a collar, and/or attached to a tail piece.
  • the system 100 further includes one or more electronic processing devices 120, which may form part of one or more processing systems.
  • the one or more electronic processing devices 120 can be any suitable processing device that is capable of processing attribute and/or movement signals, and could include a microprocessor, microchip processor, logic gate configuration, firmware optionally associated with implementing logic such as an FPGA (Field Programmable Gate Array), or any other electronic device, system or arrangement.
  • a single processing device could be used, alternatively multiple processing devices could be used, with processing being distributed between the processing devices. Accordingly, for ease of illustration the remaining description will refer to a processing device, but it will be appreciated that multiple processing devices could be used, with processing distributed between the devices as needed, and that reference to the singular encompasses the plural arrangement and vice versa.
  • the processing device 120 receives the at least one attribute signal indicative of a biological attribute of the animal from the at least one attribute sensor 111.
  • the electronic processing device 120 receives a movement signal from the at least one movement sensor 112.
  • the signals are typically received wirelessly, and may be received directly, or via an intermediate device, although this is not necessarily essential and alternatively the signals may be received via wired connections.
  • the signals may also undergo pre-processing, such as digitisation or the like, depending on the preferred implementation.
  • the processing device 120 processes the at least one attribute signal at least partially in accordance with the movement signal.
  • the nature of the processing will vary depending on the preferred implementation and could include selecting different attribute signals, or filtering or analysing signals.
  • the processing device 120 generates at least one indicator at least partially indicative of the biological attribute based on the processed attribute signal.
  • the indicator could be indicative of a wide range of different biological attributes, such as heart rate, oxygen level and/or temperature, and could be of any appropriate form, such as a numerical and/or graphical representation of the biological attribute.
  • the above described arrangement allows the biological attributes of the animal to be accurately monitored by processing the attribute signals while taking the movement of the animal into consideration.
  • this allows the animal to be monitored while the animal is active or conducting daily activities, whilst maintaining the accuracy of the resulting indicator(s) that are derived, so that veterinarians and/or nurses are not required to manually examine or take measurements of the animal’s biological conditions.
  • the system as shown in Figure 1 may include a tail piece or similar arrangements capable of securing the sensors on and optionally in engagement with a tail of the animal.
  • the tail piece may include a body and an arm pivotally mounted to the body. The arm is movable between an open position and a closed position. The open position allows the tail piece to be positioned on the tail, and in the closed position the tail is retained in the tail piece. In this arrangement, the tail piece can be easily secured to or removed from the tail.
  • the tail piece includes a sensor in an underside of the body, and a number of deformable fins on an underside of the arm configured to engage the tail and urge the tail into engagement with an underside of the body.
  • This allows the sensor on the tail piece to be in proximity to the skin of the tail, which optimises sensing, while the deformable fins allow the tail piece to grip the tail firmly without harming or irritating the animal, whilst reducing movement of the sensor relative to the skin, which in turn helps ensure reliable signal measurement.
  • the at least one attribute signal may include a plurality of optical signals, for example used to perform pulse oximetry sensing.
  • the processing device processes at least one optical signal of the plurality of optical signals by selecting an optical signal based on the movement signal.
  • the electronic processing device then generates the indicator based on the selected optical signal.
  • This arrangement increases the reliability of the attribute signals by providing more than one optical signal and allowing the processing device to select the appropriate signal(s) for further processing.
  • each of the plurality of optical signals may have a different wavelength, so that the processing device may select the optical signal(s) with a wavelength that is least affected by movements.
  • the optical signals may be red, green or orange lights, while the movement of the tail may be a fast wagging or a slow sweeping.
  • the attribute signals received from the green lights may be selected when the tail is wagging, as the wavelength of a green light is less affected by, or less sensitive to, movements.
  • this eliminates or reduces movement artefacts on the optical signal and/or allows artefacts to be removed through filtering, and thereby allows the system to provide more accurate measurements of the biological attributes of the animal without requiring the animal to be still.
  • this also allows the optical signals to be processed depending on the different types of movements that may be observed by the sensors worn on different parts of the animal.
  • the processing device may select the optical signal with a wavelength that has a predetermined depth of penetration and/or a predetermined tolerance to skin pigmentation. This allows the system to provide more accurate measurements depending on the physical characteristics of the animal, such as breed, colour, and hair length.
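  • As a minimal illustration of this selection step, the sketch below picks a channel based on the accelerometer activity and the recorded skin pigmentation. The channel names, the movement threshold and the rankings are hypothetical placeholders and not values disclosed in the specification.

```python
import numpy as np

# Hypothetical rankings (illustrative only; real rankings would be calibrated
# for the device and the animal).
MOTION_TOLERANCE = ["green", "red", "infrared"]    # most to least motion tolerant
PENETRATION_DEPTH = ["infrared", "red", "green"]   # deepest to shallowest

def select_optical_channel(movement_rms, dark_pigmentation, threshold=0.5):
    """Pick which LED channel to trust for this processing window.

    movement_rms      -- RMS of the accelerometer magnitude over the window
    dark_pigmentation -- True if the recorded skin colour at the sensor site is dark
    threshold         -- hypothetical RMS level above which motion dominates
    """
    if movement_rms > threshold:
        # Strong motion (e.g. a fast tail wag): prefer the most motion-tolerant channel.
        return MOTION_TOLERANCE[0]
    if dark_pigmentation:
        # Little motion but heavy pigmentation: prefer deeper-penetrating light.
        return PENETRATION_DEPTH[0]
    return "red"

# Example: a vigorous wag produces a high accelerometer RMS, so green is selected.
accel_window = np.random.default_rng(0).normal(0.0, 0.8, 200)
channel = select_optical_channel(np.sqrt(np.mean(accel_window ** 2)), dark_pigmentation=False)
```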
  • the processing device may further analyse the movement signals to determine movement parameter(s) and process the attribute signal based on the movement parameter(s).
  • the movement parameter(s) may be indicative of a type of movement, a degree of movement, a direction of movement, a movement frequency, a movement pattern, and/or a pose. Accordingly, the movement signals indicative of movement of the animal are analysed, so that the attribute signal may be processed in a more accurate manner based on the analysed information.
  • a high heart rate of the animal obtained from the attribute sensor can be processed differently when the animal movement indicates a lower or higher degree of movement. For example, a high heart rate whilst the animal is stationary may be used to trigger an alert indicative of an issue with the animal, whilst the same heart rate when the animal is moving may be considered normal and not require any action.
  • the processing device may analyse the movement signal using adaptive filtering, fuzzy logic, and/or autocorrelation. With this, the movement signals may be analysed based on a history of the same animal or a library of all data collected from multiple animals, so that the movement parameter(s) can be more accurately determined.
  • the processing device processes the attribute signal by filtering the attribute signal based on the movement parameter, to assist the further processing or analysing of the attribute signal by filtering out noise. For example, this could be used to filter out optical signals at a wavelength corresponding to a frequency of movement of the animal.
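  • One way to realise this filtering step is to estimate the dominant movement frequency from the accelerometer and then suppress that frequency in the optical signal. The sketch below assumes a fixed common sample rate and uses a standard IIR notch filter from SciPy; it is an illustration under those assumptions, not the specific filter used by the device.

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

FS = 50.0  # assumed common sample rate (Hz) for accelerometer and optical signals

def dominant_movement_frequency(accel, fs=FS):
    """Estimate the strongest movement frequency from an accelerometer trace."""
    accel = accel - np.mean(accel)
    spectrum = np.abs(np.fft.rfft(accel))
    freqs = np.fft.rfftfreq(len(accel), d=1.0 / fs)
    spectrum[freqs < 0.5] = 0.0          # ignore near-DC drift
    return freqs[np.argmax(spectrum)]

def suppress_movement_artefact(ppg, accel, fs=FS, q=5.0):
    """Notch out the dominant movement frequency from the optical (PPG) signal."""
    f_move = dominant_movement_frequency(accel, fs)
    if f_move <= 0.0 or f_move >= fs / 2:
        return ppg                        # nothing sensible to remove
    b, a = iirnotch(w0=f_move, Q=q, fs=fs)
    return filtfilt(b, a, ppg)
```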
  • the processing device may further analyse the attribute signal based on the movement signal, and generate at least one indicator based on the analysed signal. This allows the attribute signal, and the indicator generated from it, to be more precise, thereby eliminating or reducing false positives, where the indicator indicates an abnormal state but the animal is in a normal state, and false negatives, where the indicator indicates a normal state but the animal is in an abnormal state.
  • the processing device may process the attribute signal by determining features derived from the attribute signal, using the features and a computational model to determine the indicator.
  • the computational model is at least partially indicative of a relationship between different features and different attributes. This allows the relationship between the features and attributes to be established for the specific animal, or the specific type of animal and the indicator may be generated accordingly.
  • the processing device may apply machine learning to reference features derived from reference animals having known attributes, and apply machine learning to features derived from the animal. Additionally, the processing device may develop a generic model by applying machine learning to reference features derived from one or more reference animals having known attributes; and, modify a generic model to create a subject specific model by applying machine learning to features derived from the animal. Using machine learning can improve the accuracy of analysis and also easily and efficiently expand the complexity of the analysis.
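  • The two-stage modelling idea (a generic model later refined into a subject-specific one) can be sketched with an incremental learner as below. The feature layout, the heart-rate target and the choice of an SGD regressor are assumptions made for illustration, not the disclosed model.

```python
import numpy as np
from sklearn.linear_model import SGDRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical feature matrix: rows are processing windows, columns are features
# derived from the attribute and movement signals.
rng = np.random.default_rng(1)
X_reference = rng.normal(size=(500, 6))        # reference animals
y_reference = rng.normal(90, 15, size=500)     # known heart rates (bpm)

# 1. Generic model trained on reference animals with known attributes.
generic_model = make_pipeline(StandardScaler(), SGDRegressor(max_iter=1000, tol=1e-3))
generic_model.fit(X_reference, y_reference)

# 2. Subject-specific refinement: continue training on the subject's own data.
X_subject = rng.normal(size=(40, 6))
y_subject = rng.normal(95, 10, size=40)
scaler = generic_model.named_steps["standardscaler"]
regressor = generic_model.named_steps["sgdregressor"]
regressor.partial_fit(scaler.transform(X_subject), y_subject)

# The refined regressor is then used to derive the indicator for new windows.
predicted_bpm = regressor.predict(scaler.transform(rng.normal(size=(1, 6))))
```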
  • the indicator can be indicative of one or more different biological attributes, including but not limited to a heart rate of the animal, an oxygen level of the animal, and a status of the animal.
  • the processing device may further generate a representation of the at least one indicator for display.
  • the representation may be any one or the combinations of a numerical representation, a trend line, a scale, and/or a meter gauge. This arrangement allows the indicator to be easily and clearly presented or communicated to the owners, veterinarians and/or nurses.
  • the processing device may further determine information indicative of a physical parameter of the animal and process the at least one attribute signal at least partially in accordance with the physical parameter.
  • the physical parameter herein may include at least one of an anaesthesia status; a medication record; a tail length; at least one tail circumference measurement, including a first measurement at a base of the tail and a second measurement approximately 5cm from the base of the tail; a skin colour where a pulse-oximetry sensor is located; a shaved status; a fur length or texture; a gender; and, a blood pressure including at least one of: SBP; DBP; and, MAP.
  • the physical parameters may also be retrieved from a treatment history of the animal, or be input by the owner, veterinarians or nurses.
  • Knowledge of the physical parameters can in turn inform how the attribute signals are processed and/or analysed.
  • skin pigmentation can affect transmission of optical signals, so information regarding skin pigmentation can be used to select an optical signal with the deepest penetration in order to ensure blood oxygen levels are more accurately measured.
  • knowledge of an anaesthesia state can be used to control alerting, so for example a heart rate that might be acceptable when the animal is awake might be indicative of a problem when the animal is under anaesthetic.
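  • This context-dependent alerting (the same heart rate meaning different things while the animal is moving, resting or anaesthetised) reduces to a small rule table, sketched below. The numeric thresholds are invented placeholders; in practice they would depend on species, breed, size and the treating veterinarian's settings.

```python
def heart_rate_alert(bpm, moving, anaesthetised):
    """Return an alert level for a heart-rate reading given the animal's context.

    Thresholds are hypothetical examples only.
    """
    if anaesthetised:
        low, high = 60, 120       # tighter band expected under anaesthesia
    elif moving:
        low, high = 70, 180       # exercise can legitimately raise the rate
    else:
        low, high = 60, 140       # resting range
    if bpm < low:
        return "alert: low heart rate"
    if bpm > high:
        return "alert: high heart rate"
    return "normal"

# A rate of 160 bpm is flagged when stationary, but treated as normal while moving.
assert heart_rate_alert(160, moving=True, anaesthetised=False) == "normal"
assert heart_rate_alert(160, moving=False, anaesthetised=False).startswith("alert")
```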
  • the above-mentioned information indicative of the physical parameter may include information determined from an image of the animal.
  • Such physical parameter may be a skin colour, a shaved status, fur length or texture. In this way, the physical parameters may be easily provided or updated to the system for processing.
  • the processing device may determine information indicative of an environmental parameter of the animal, and process the attribute signal at least partially in accordance with the environmental parameter.
  • the environmental parameter may be an environmental pressure, air quality indicator, pollen, humidity, and/or altitude.
  • the attribute signals may differ under different environmental conditions, such as different pressure or humidity. This arrangement allows the attribute signals to be processed accordingly, thereby providing more accurate indicators or representations.
  • the attribute sensor may be at least one of a pulse-oximetry sensor; a temperature sensor, a heart rate sensor, and a respiration sensor.
  • the biological attribute may be at least one of an oxygen level, a temperature, a heart rate, and a respiration rate.
  • the movement sensor may be an accelerometer and/or a gyroscope.
  • the processing device may be integrated into an external electronic device and wirelessly connected to the attribute sensor and/or the movement sensor. Additionally or alternatively, the processing device may be integrated into the collar and/or the tail piece.
  • the external electronic device may be a smart phone, a tablet, a desk-top computer, and/or a handheld device with computing capabilities.
  • one or more servers 310 are provided coupled to one or more client devices 330, via one or more communications networks 340, such as the Internet, and/or a number of local area networks (LANs).
  • a number of sensors 320 are provided, with these optionally communicating directly with the servers 310 via the communications networks 340, or more typically, communicating with the client devices 330.
  • the client device 330 may be a smart phone, a tablet, a desk-top computer, and/or any handheld device with computing capabilities.
  • any number of servers 310, sensors 320 and client devices 330 could be provided, and the current representation is for the purpose of illustration only.
  • the configuration of the networks 340 is also for the purpose of example only, and in practice the servers 310, sensors 320 and client devices 330 can communicate via any appropriate mechanism, such as via wired or wireless connections, including, but not limited to mobile networks, private networks, such as 802.11 networks, the Internet, LANs, WANs, or the like, as well as via direct or point-to-point connections, such as Bluetooth, or the like.
  • whilst the servers 310 are shown as single entities, it will be appreciated they could include a number of servers distributed over a number of geographically separate locations, for example as part of a cloud based environment.
  • the sensor 320 includes at least one microprocessor 321, a memory 322, an optional input/output device 323, such as input buttons and/or a display, and an external interface 324.
  • the interfaces 324 may be of any form and can include a Universal Serial Bus (USB) port or Ethernet port, but more typically include a wireless transmitter, and in particular a short range wireless transmitter, such as Bluetooth, or the like.
  • the external interface 324 can be utilised for connecting the sensor 320 to processing systems, such as the servers 310 or client devices 330, or to a communications network 340, or the like.
  • whilst a single external interface 324 is shown, this is for the purpose of example only, and in practice multiple interfaces using various methods (e.g. Ethernet, serial, USB, wireless or the like) may be provided.
  • the sensor modules are configured for attribute and movement sensing.
  • the processor 321 receives the one or more attribute and movement signals from the sensor modules 311, 312, optionally storing these in the memory 322.
  • the processor 321 then processes the signals in accordance with instructions stored in the memory 322, for example in the form of software instructions and/or in accordance with input commands provided by a user via the I/O device 323, thereby generating the indicator.
  • the indicator can then be provided as an output, for example via the I/O device 323, or via the interface 324 to a remote processing device.
  • the attribute and/or movement signals may be preprocessed by the processor 321 and transmitted to the server 310 and/or the client device 330 for further processing and analysis.
  • this allows the client device 330 and/or server 310 to receive, process and analyse signals received from the sensors 320, to generate an indicator and to allow the indicator to be displayed via the client devices 330.
  • the processing could be performed on board the sensor 320, or could be performed remotely by a processing system such as the client device 330 and/or server 310, or could be distributed between the sensor and processing system, depending on the preferred implementation.
  • the system 500 includes a collar sensor 520a and a tail sensor 520b worn by the animal and wirelessly coupled to each other via Bluetooth.
  • the collar sensor 520a wirelessly connects to a client device 530 via a communications network 540, and/or directly, for example using a short range wireless communications protocol, allowing data from both the collar and tail sensors 520a, 520b to be uploaded for processing.
  • the collar sensor 520a measures an environmental temperature.
  • the tail sensor 520b in this example, includes a pulse oximeter and a temperature sensor.
  • each of the sensors 520a and 520b includes an accelerometer and/or a gyroscope, so that the measurements together can be processed to indicate a body position, body movement and/or acceleration of the animal.
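  • A minimal sketch of how the two inertial streams might be combined into a coarse pose estimate is shown below. The axis conventions, the angle thresholds and the assumption that gravity dominates the time-averaged accelerometer reading are illustrative only.

```python
import numpy as np

def coarse_pose(collar_accel, tail_accel):
    """Very coarse pose estimate from window-averaged accelerometer vectors (g units).

    Assumes the averaged vectors are dominated by gravity when the animal is not
    accelerating; thresholds and axis conventions are hypothetical.
    """
    def tilt_deg(vec):
        vec = np.asarray(vec, dtype=float)
        # angle between the sensor's z axis and the measured gravity direction
        cosang = vec[2] / (np.linalg.norm(vec) + 1e-9)
        return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

    collar_tilt = tilt_deg(collar_accel)
    tail_tilt = tilt_deg(tail_accel)
    if collar_tilt < 30 and tail_tilt < 30:
        return "standing"
    if collar_tilt > 60 and tail_tilt > 60:
        return "lying"
    return "transitional"

# Example: both sensors roughly aligned with gravity suggests the animal is standing.
print(coarse_pose([0.05, 0.1, 0.98], [0.0, 0.2, 0.95]))
```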
  • the sensor 520b measures a heart rate, an oxygen level and a body temperature of the animal with the pulse oximeter and the temperature sensor, and transmits the measurement signals to the collar sensor 520a.
  • the collar sensor 520a receives the measurement signals of the tail sensor 520b and transmits the measurement signals together with the environmental temperature measurement to the client device 530 for processing.
  • the transmission may be via the network 540 or directly to the client device 530, as described in the system of Figure 3.
  • the client device 530, in this example, is a tablet computer which receives the signals transmitted from the collar sensor 520a and processes the signals to generate an indication of the animal. The indication is further processed to generate representations for display with the client device 530.
  • attribute signals, such as signals from the pulse-oximeter and the temperature sensor of the tail sensor 520b, are received by the sensor 520a.
  • the movement signals from both sensors 520a, 520b are also received by the sensor 520a at step 605.
  • the signals are pre-processed by the microprocessor 321 of the sensors.
  • the attribute signals are filtered based on the movement signal.
  • the signals may also be pre-processed by sampling, amplifying or converting, so that the signals or data derived therefrom may be subsequently transmitted to a processor at a client device 530 or a server at step 615.
  • the processor determines the physical parameters of the animal.
  • the physical parameter may be manually input by a user and then retrieved from the server or a memory of the client device 530 as needed. Additionally and/or alternatively, the physical parameter may be stored in the memory 322 of the sensor(s) and transmitted to the processor together with the signals.
  • the physical parameter(s) may also be determined from an image provided.
  • the client device 530 includes a camera for capturing an image of the animal. The hair colour, skin colour, shaved status, fur length or texture may be determined by processing the image.
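  • A rough sketch of how a colour-related physical parameter might be estimated from such an image is given below. The region of interest, the luminance threshold and the plain NumPy averaging are assumptions for illustration, not the method disclosed in the specification.

```python
import numpy as np

def estimate_coat_parameters(image, roi):
    """Estimate simple colour parameters from an image of the animal.

    image -- H x W x 3 uint8 RGB array (e.g. from the client device camera)
    roi   -- (top, bottom, left, right) bounds of the region around the sensor site
    """
    top, bottom, left, right = roi
    patch = image[top:bottom, left:right].astype(float)
    mean_rgb = patch.reshape(-1, 3).mean(axis=0)
    luminance = 0.299 * mean_rgb[0] + 0.587 * mean_rgb[1] + 0.114 * mean_rgb[2]
    return {
        "mean_rgb": mean_rgb,
        "dark_pigmentation": bool(luminance < 90),   # hypothetical threshold
    }

# Example with a synthetic dark-brown patch standing in for a camera frame.
frame = np.full((120, 160, 3), (70, 45, 30), dtype=np.uint8)
params = estimate_coat_parameters(frame, roi=(40, 80, 60, 100))
```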
  • the environmental parameter(s), such as the environmental temperature measured by the sensor 520a, may also be transmitted to the processor, at step 625. Additionally and/or alternatively, the processor may determine the environmental parameters, such as humidity and altitude, by retrieving information from a server or other data source, such as a meteorological service, via the network 540.
  • the movement signals are analysed, which may include adaptive filtering, fuzzy logic, autocorrelation, machine learning and/or any other suitable pattern recognition algorithms.
  • the movement parameters are determined based on the analysed results, at the next step 635.
  • the movement parameter typically includes a type of movement; a direction of movement; a degree of movement; a movement frequency; a movement pattern; or the like. The movement could also be indicative of and/or used to derive a pose of the animal.
  • the attribute signals are processed at step 640.
  • the processing may include using techniques such as fuzzy logic, autocorrelation, machine learning and/or any other suitable pattern recognition. The process is explained in more detail with reference to Figure 7.
  • an indicator is generated at step 645.
  • the indicator may indicate the heart rate, an oxygen level and/or a status of the animal.
  • the indicator is then converted to a representation at step 650, and subsequently displayed at step 655.
  • the display may be a display of the client device 530. Example representations will be described in more detail with reference to Figures 12 to 16.
  • at step 700, a machine learning algorithm is applied to reference features derived from attribute signals measured for animals having known attributes. For example, optical signals from an animal having a known elevated heart rate can be used to train the model to identify when a subject animal has an elevated heart rate.
  • the reference animal may include the same subject animal, and/or animals of the same or a similar breed or species, optionally having similar conditions, and/or physical parameters, as this makes it more likely that the animal will respond in a manner similar to the subject.
  • a generic model is developed based on the reference animal and the known attributes.
  • the generic model may be developed to indicate an association between, for example, measured optical signals and a heart rate for medium-sized dogs.
  • the features derived from measured attribute signals of a subject animal are determined, for example by analysing attribute signals as described above.
  • the derived features for the subject animal are applied to the model and used to derive an indicator at step 740.
  • the subject specific model may indicate a heart rate of a poodle in relation to the heart rates of medium-sized dogs, or a heart rate of a post-surgery poodle in relation to the heart rates of post-surgery animals.
  • this can be used in training the generic model at step 745, for example to improve the generic model and/or modify the generic model to make a subject specific model, which can then be used in subsequent analysis.
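  • For the feature-derivation step described above, one plausible (hypothetical) choice is to summarise each optical-signal window with a handful of statistics that a trained model can map to an indicator. The sketch below uses SciPy peak detection under an assumed sample rate; the specific features are examples, not the features disclosed.

```python
import numpy as np
from scipy.signal import find_peaks

FS = 50.0  # assumed PPG sample rate in Hz

def window_features(ppg, fs=FS):
    """Summarise one optical-signal window as a small feature vector for the model."""
    ppg = np.asarray(ppg, dtype=float)
    ppg = ppg - ppg.mean()
    peaks, _ = find_peaks(ppg, distance=int(0.3 * fs))   # require >=0.3 s between beats
    if len(peaks) > 1:
        intervals = np.diff(peaks) / fs                  # seconds between detected peaks
        mean_ibi, std_ibi = intervals.mean(), intervals.std()
    else:
        mean_ibi, std_ibi = 0.0, 0.0
    return np.array([
        ppg.std(),                 # signal energy
        np.ptp(ppg),               # peak-to-peak amplitude
        float(len(peaks)),         # beats detected in the window
        mean_ibi,                  # mean inter-beat interval
        std_ibi,                   # inter-beat variability
    ])
```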
  • the sensor 520a worn on the neck of the animal is in the form of a collar piece, as shown in Figures 8A and 8B.
  • the collar piece 800 shown in Figures 8A and 8B includes a sensor housing 810 and a strap 820.
  • the sensor housing 810 is configured to be placed about a neck region of the animal, with the strap 820 surrounding the neck.
  • the sensor housing typically incorporates the electronics shown in Figure 4, including the processing device and one or more sensors.
  • the strap 820 is threaded through a rear side of the sensor housing 810, and the strap 820 may be elastic and/or adjustable in length for fitting animals of different sizes. It should be appreciated that this arrangement may also be fitted to any other suitable part of the animal, such as the waist or a leg, in addition to or instead of the neck region.
  • the sensor 520b worn on the tail of the animal is in the form of a tail piece, as shown in Figures 9A and 9B.
  • Figures 9A and 9B show a tail piece 900 for retaining the sensors to the tail of the animal.
  • the tail piece 900 includes a body 910 and a pair of arms 920 pivotally mounted to the body 910.
  • the arms 920 are movable between an open position and a closed position.
  • Figure 9A illustrates the arms 920 being in the open position
  • Figure 9B illustrates the arms being in the closed position.
  • the closed position allows the tail piece 900 to be secured to the tail, and in the open position the tail is released from the tail piece 900.
  • the tail piece 900 includes a sensor module 911 in an underside of the body 910 for sensing the oxygen level and temperature of the animal.
  • the sensor module 911 will be described in more detail with reference to Figures 10A and 10B.
  • the pair of arms 920 include a number of deformable fins 921 on undersides of the arms 920.
  • the fins 921 engage the tail when the tail piece 900 is in the closed position, and urge the tail into engagement with an underside of the body 910, so that the sensor module 911 is in close contact with the skin of the tail.
  • FIG 10A shows a schematic diagram of a sensor module in the tail piece 900.
  • the sensor module 1000 includes a plurality of optical sensors 1100, a photo detector 1200, a temperature sensor 1310 and an embedded temperature sensor 1320.
  • the temperature sensor 1310 may be a pyrometer that measures the temperature by thermal radiation. In one example, the temperature sensor 1310 measures the temperature of the skin of the animal.
  • the embedded temperature sensor 1320 may be one or more thermometers embedded in the array of optical sensors 1100 and measure the temperature by conduction. In one example, the embedded temperature sensor 1320 measures an internal temperature inside the device. In one example, an ambient temperature may also be measured.
  • the optical sensor 1100 emits light of a plurality of wavelengths, and the optical sensors 1100 are arranged in a pattern that surrounds the photo detector 1200.
  • Figure 10B shows a schematic diagram of the optical sensor 1100 in the sensor module 1000.
  • the optical sensor 1100 includes three LEDs 1110, 1120, 1130.
  • the LED 1110 is a green LED
  • the LED 1120 is a red LED
  • the LED 1130 is an infrared LED.
  • the optical sensor may include one or more LEDs of the same or different suitable wavelengths.
  • the photo detector 1200 reads the reflected light from the plurality of optical sensors 1100 and generates signals indicative of the biological attributes of the animal.
  • the reflected light from the green LEDs is the main contributor to indicating the heart rate of the animal, whereas the reflected light from the red LEDs is the main contributor to indicating an oxygen level or SpO2 of the animal.
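  • Conventional reflective pulse oximetry derives heart rate from the pulsatile (AC) component of a channel and SpO2 from a red/infrared "ratio of ratios". The sketch below shows that standard calculation under an assumed sample rate; the band limits and the linear calibration constants are placeholders that would be determined empirically for the device, and this is not necessarily the calculation claimed.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

FS = 50.0  # assumed sample rate (Hz)

def heart_rate_bpm(green, fs=FS):
    """Heart rate from the green channel: band-pass, then count pulse peaks."""
    b, a = butter(2, [0.7, 4.0], btype="bandpass", fs=fs)   # roughly 40-240 bpm band
    pulsatile = filtfilt(b, a, np.asarray(green, dtype=float))
    peaks, _ = find_peaks(pulsatile, distance=int(0.25 * fs))
    duration_min = len(green) / fs / 60.0
    return len(peaks) / duration_min if duration_min > 0 else 0.0

def spo2_percent(red, infrared):
    """SpO2 via the ratio-of-ratios with a generic (placeholder) calibration."""
    def ac_dc(x):
        x = np.asarray(x, dtype=float)
        return (x.max() - x.min()), x.mean()
    ac_r, dc_r = ac_dc(red)
    ac_ir, dc_ir = ac_dc(infrared)
    ratio = (ac_r / dc_r) / (ac_ir / dc_ir)
    return float(np.clip(110.0 - 25.0 * ratio, 0.0, 100.0))  # illustrative constants
```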
  • the optical sensors 1100 in the sensor module 1000 are arranged as shown in Figure 10A.
  • the arrangement is selected based on at least one of the following factors:
  • Figure 11 shows other arrangements of the optical sensors, the photo detector and the temperature sensors.
  • the arrangement may be selected based on at least one of the above factors.
  • Figure 12 shows a schematic diagram of the representations of the animal monitoring system.
  • the representations on a display include the report information, subject information and the attribute representations.
  • the report information typically includes date, time and duration of what the data represents, and may also include the medication and dosage in relation to the subject.
  • the subject information includes an image, name, owner, species, breed, age, gender and weight of the subject animal.
  • the attribute representations include an average heart rate, average SpO2, and average body temperature, and they are represented numerically and graphically.
  • the graphs represent the attribute timeline with key event notations, such as surgery start time and surgery end time.
  • the attribute representations may further include any alarms or alerts, which may be raised based on the processing of the attribute signals.
  • Figures 13-16 shows further examples of the representations of the animal monitoring system.
  • the representations may also include trend lines and gauges as shown in Figure 13.
  • the gauge may include a numerical value and a coloured scales which indicates a normal state, a warning state and a danger state with green, orange and red, respectively.
  • the representations in Figure 14 include the report information, subject information and the attribute representations.
  • the report information includes date, time and duration of what the data represents, and also includes the vet/nurse names, surgery duration and the surgery type performed.
  • the subject information includes an image of the subject, name, owner, age and weight of the subject animal.
  • the attribute representations include the high, the average and the low readings of the heart rate, Sp0 2 , and body temperature, and they are represented numerically and graphically against time.
  • Figure 15 shows the representations in a table including the medication information before, during and after the surgery.
  • Figure 16 shows another example of representations including report information, subject information and the attribute information.
  • the above described arrangements allow the biological attributes of the animal to be accurately monitored by processing the attribute signals with respect to the movement of the animal.
  • this further allows more information to be derived.
  • movement of the animal may be analysed with machine learning or the like, and the behaviour or status of the animal may be better understood.
  • the animal can be monitored while the animal is active or conducting daily activities system, and the accuracy of results as derived is maintained or improved.
  • This is also beneficial to the owner, veterinarians and/or nurses as less manually examinations or measurements are required.
  • the display also represents more accurate results and more useful information based on in-depth analysis of the animal’s biological conditions.

Abstract

The invention relates to a system for monitoring an animal, including: at least one attribute sensor worn by the animal; at least one movement sensor worn by the animal; and, one or more electronic processing devices configured to: receive at least one attribute signal indicative of a biological attribute of the animal from the at least one attribute sensor; receive a movement signal from the at least one movement sensor; process the at least one attribute signal at least partially in accordance with the movement signal; and, generate at least one indicator at least partially indicative of the biological attribute based on the processed attribute signal.

Description

ANIMAL MONITORING DEVICE
Background of the Invention
[0001] The present invention relates to a device and method for animal monitoring, and in particular for monitoring non-human animals.
Description of the Prior Art
[0002] The reference in this specification to any prior publication (or information derived from it), or to any matter which is known, is not, and should not be taken as an acknowledgment or admission or any form of suggestion that the prior publication (or information derived from it) or known matter forms part of the common general knowledge in the field of endeavour to which this specification relates.
[0003] It is known to monitor animals, including non-human animals, in order to detect changes in vital signs indicative of adverse reactions or changes in condition. Typically, the monitoring involves the use of a number of sensors worn by the animals. However, the sensors may misrepresent the actual condition of the animals due to inaccurate sensor readings.
[0004] In this regard, in order to accurately monitor the animals, veterinarians and/or nurses are required to manually assess the true condition of the animals, which is time consuming and costly.
Summary of the Present Invention
[0005] In one broad form an aspect of the present invention seeks to provide a system for monitoring an animal, including: at least one attribute sensor worn by the animal; at least one movement sensor worn by the animal; and, one or more electronic processing devices configured to: receive at least one attribute signal indicative of a biological attribute of the animal from the at least one attribute sensor; receive a movement signal from the at least one movement sensor; process the at least one attribute signal at least partially in accordance with the movement signal; and, generate at least one indicator at least partially indicative of the biological attribute based on the processed attribute signal. [0006] In one embodiment, the at least one attribute sensor is at least one of: worn on a neck region; worn on a tail of the animal; attached to a collar; and, attached to a tail piece.
[0007] In one embodiment, the at least one movement sensor is at least one of: worn on a neck region; worn on a tail of the animal; attached to a collar; and, attached to a tail piece.
[0008] In one embodiment, the tail piece includes: a body; and, an arm pivotally mounted to the body and movable between: an open position to allow the tail piece to be positioned on the tail; and, a closed position in which the tail is retained in the tail piece.
[0009] In one embodiment, the tail piece includes a sensor in an underside of the body and a number of deformable fins on an underside of the arm configured to engage the tail and urge the tail into engagement with an underside of the body.
[0010] In one embodiment, the at least one attribute signal includes a plurality of optical signals and wherein the one or more processing devices are configured to: process at least one optical signal of the plurality of optical signals by selecting an optical signal based on the movement signal; and, generate the at least one indicator based on the selected at least one optical signal.
[0011] In one embodiment, each of the plurality of optical signals are measured at a different wavelength, and wherein the one or more processing devices are configured to select the at least one optical signal by selecting the optical signal with a wavelength that is least affected by movements.
[0012] In one embodiment, the one or more processing devices are configured to process the at least one optical signal by selecting an optical signal with a wavelength that has at least one of: a predetermined depth of penetration; and, a predetermined tolerance to skin pigmentation.
[0013] In one embodiment, the one or more processing devices are configured to: analyse the movement signals to determine at least one movement parameter indicative of at least one of: a type of movement; a direction of movement; a degree of movement; a movement frequency; a movement pattern; and, a pose; and, process the at least one attribute signal based on the at least one movement parameter. [0014] In one embodiment, the one or more processing devices are configured to analyse the movement signal using at least one of: adaptive filtering; fuzzy logic; autocorrelation; machine learning; and, pattern recognition.
[0015] In one embodiment, the one or more processing devices are configured to process the at least one attribute signal by filtering the at least one attribute signal based on the movement parameter.
[0016] In one embodiment, the one or more processing devices are configured to: analyse the at least one attribute signal based on the movement signal; and, generate at least one indicator based on the analysed signal.
[0017] In one embodiment, the one or more processing devices are configured to process the at least one attribute signal by: determining one or more features derived from the animal; using the features and at least one computational model to determine the indicator, the at least one computational model being at least partially indicative of a relationship between different features and different attributes.
[0018] In one embodiment, the one or more processing devices are configured to: apply machine learning to reference features derived from one or more reference animals having known attributes; and, apply machine learning to features derived from the animal.
[0019] In one embodiment, the one or more processing devices are configured to: develop a generic model by applying machine learning to reference features derived from one or more reference animals having known attributes; and, modify a generic model to create a subject specific model by applying machine learning to features derived from the animal.
[0020] In one embodiment, the at least one indicator includes: a heart rate of the animal; an oxygen level of the animal; and, a status of the animal.
[0021] In one embodiment, the one or more processing devices are configured to generate a representation of the at least one indicator for display.
[0022] In one embodiment, the representation includes: a numerical representation; a trend line; a scale; and, a meter gauge. [0023] In one embodiment, the one or more processing devices are configured to: determine information indicative of a physical parameter of the animal; and, process the at least one attribute signal at least partially in accordance with the physical parameter.
[0024] In one embodiment, the physical parameter includes at least one of: an anaesthesia status; a medication record; a tail length; at least one tail circumference measurement, including a first measurement at a base of the tail and a second measurement approximately 5cm from the base of the tail; a skin colour where a pulse-oximetry sensor is located; a shaved status; a fur length or texture; a gender; and, a blood pressure including at least one of: SBP; DBP; and, MAP.
[0025] In one embodiment, the information indicative of the physical parameter includes information determined from an image of the animal.
[0026] In one embodiment, the one or more processing devices are configured to: determine information indicative of an environmental parameter of the animal; and, process the at least one attribute signal at least partially in accordance with the environmental parameter.
[0027] In one embodiment, the environmental parameter includes at least one of: environmental pressure; air quality indicator; pollen; humidity; and, altitude.
[0028] In one embodiment, the attribute sensor is at least one of: a pulse-oximetry sensor; a temperature sensor; a heart rate sensor; and, a respiration sensor.
[0029] In one embodiment, the biological attribute includes at least one of: an oxygen level; a temperature; a heart rate; and, a respiration rate.
[0030] In one embodiment, the movement sensor includes at least one of: an accelerometer; and, a gyroscope.
[0031] In one embodiment, the one or more processing devices are at least one of: integrated into an external electronic device and wirelessly connected to the at least one attribute sensor and/or the at least one movement sensor; and, integrated into the collar and/or the tail piece. [0032] In one embodiment, the external electronic device is at least one of: a smart phone; a tablet; a desk-top computer; and, a handheld device.
[0033] In one broad form an aspect of the present invention seeks to provide a method for monitoring an animal, the method including, in one or more electronic processing devices: receiving at least one attribute signal indicative of a biological attribute of the animal from at least one attribute sensor worn by the animal, receiving a movement signal from at least one movement sensor worn by the animal; processing the at least one attribute signal at least partially in accordance with the movement signal; and, generating at least one indicator at least partially indicative of the biological attribute based on the processed attribute signal.
[0034] In one embodiment, the at least one attribute signal includes a plurality of optical signals and wherein the method includes, in the one or more electronic processing devices: processing at least one optical signal of the plurality of optical signals by selecting an optical signal based on the movement signal; and, generating the at least one indicator based on the selected at least one optical signal.
[0035] In one embodiment, each of the plurality of optical signals are measured at a different wavelength, and wherein the method includes, in the one or more electronic processing devices, selecting the at least one optical signal by selecting the optical signal with a wavelength that is least affected by movements.
[0036] In one embodiment, the method includes, in the one or more electronic processing devices, processing the at least one optical signal by selecting an optical signal with a wavelength that has at least one of: a predetermined depth of penetration; and, a predetermined tolerance to skin pigmentation.
[0037] In one embodiment, the method further includes, in the one or more electronic processing devices: analysing the movement signals to determine at least one movement parameter indicative of at least one of: a type of movement; a direction of movement; a degree of movement; a movement frequency; a movement pattern; and, a pose; and, processing the at least one attribute signal based on the at least one movement parameter. [0038] In one embodiment, the method further includes, in the one or more electronic processing devices, analysing the movement signals using at least one of: adaptive filtering; fuzzy logic; autocorrelation; machine learning; and, pattern recognition.
[0039] In one embodiment, the method includes, in the one or more electronic processing devices, processing the at least one attribute signal by filtering the at least one attribute signal based on the movement parameter.
[0040] In one embodiment, the method further includes, in the one or more electronic processing devices, processing the at least one attribute signal by: analysing the at least one attribute signal based on the movement signal; and, generating the at least one indicator based on the analysed signal.
[0041] In one embodiment, the method further includes, in the one or more electronic processing devices, processing the at least one attribute signal by: determining one or more features derived from the animal; using the features and at least one computational model to determine the indicator, the at least one computational model being at least partially indicative of a relationship between different features and different attributes.
[0042] In one embodiment, the method further includes, in the one or more electronic processing devices: applying machine learning to reference features derived from one or more reference animals having known attributes; and, applying machine learning to features derived from the animal.
[0043] In one embodiment, the method further includes, in the one or more electronic processing devices: developing a generic model by applying machine learning to reference features derived from one or more reference animals having known attributes; and, modifying a generic model to create a subject specific model by applying machine learning to features derived from the animal.
[0044] In one embodiment, the at least one indicator includes: a heart rate of the animal; an oxygen level of the animal; and, a status of the animal. [0045] In one embodiment, the method further includes, in the one or more electronic processing devices, generating a representation of the at least one indicator for display.
[0046] In one embodiment, the representation includes: a numerical representation; a trend line; a scale; and, a meter gauge.
[0047] In one embodiment, the method further includes, in the one or more electronic processing devices: determining information indicative of a physical parameter of the animal; and, processing the at least one attribute signal at least partially in accordance with the physical parameter.
[0048] In one embodiment, the physical parameter includes at least one of: an anaesthesia status; a medication record; a tail length; at least one tail circumference measurement, including a first measurement at a base of the tail and a second measurement approximately 5cm from the base of the tail; a skin colour where a pulse-oximetry sensor is located; a shaved status; a fur length or texture; a gender; and, a blood pressure including at least one of: SBP; DBP; and, MAP.
[0049] In one embodiment, the information indicative of the physical parameter includes information determined from an image of the animal.
[0050] In one embodiment, the method further includes, in the one or more electronic processing devices: determining information indicative of an environmental parameter; and, processing the at least one attribute signal at least partially in accordance with the environmental parameter.
[0051] In one embodiment, the environmental parameter includes at least one of: environmental pressure; air quality indicator; pollen; humidity; and, altitude.
[0052] In one embodiment, the attribute sensor is at least one of: a pulse-oximetry sensor; a temperature sensor; a heart rate sensor; and, a respiration sensor.
[0053] In one embodiment, the biological attribute includes at least one of: an oxygen level; a temperature; a heart rate; and, a respiration rate. [0054] In one embodiment, the movement sensor includes at least one of: an accelerometer; and, a gyroscope.
[0055] It will be appreciated that the broad forms of the invention and their respective features can be used in conjunction and/or independently, and reference to separate broad forms is not intended to be limiting. Furthermore, it will be appreciated that features of the method can be performed using the system or apparatus and that features of the system or apparatus can be implemented using the method.
Brief Description of the Drawings
[0056] Various examples and embodiments of the present invention will now be described with reference to the accompanying drawings, in which: -
[0057] Figure 1 is a schematic diagram of a first example of a system for use in monitoring an animal;
[0058] Figure 2 is a flowchart of a first example of a method for use in monitoring an animal using the system of Figure 1;
[0059] Figure 3 is a schematic diagram of an example of a network architecture;
[0060] Figure 4 is a schematic diagram of an example of a sensor for use in monitoring an animal;
[0061] Figure 5 is a schematic diagram of a second example of a system for use in monitoring an animal;
[0062] Figures 6A and 6B are a flowchart of a second example of a method for use in monitoring an animal using the system of Figure 5;
[0063] Figure 7 is a flowchart of an example of a method for use in processing signals using the system of Figure 5;
[0064] Figures 8A and 8B are perspective views of a collar piece of a system for use in monitoring an animal; [0065] Figures 9A and 9B are perspective views of a tail piece of a system for use in monitoring an animal;
[0066] Figure 10A is a schematic diagram of a sensor module in a tail piece;
[0067] Figure 10B is a schematic diagram of an optical sensor of the sensor module;
[0068] Figure 11 shows other exemplary arrangements of the optical sensors of the sensor module;
[0069] Figure 12 is a schematic diagram of the representations of the animal monitoring system;
[0070] Figure 13 is a schematic diagram of the representations of the animal monitoring system;
[0071] Figure 14 is a schematic diagram of the representations of the animal monitoring system;
[0072] Figure 15 is a schematic diagram of the representations of the animal monitoring system; and,
[0073] Figure 16 is a schematic diagram of the representations of the animal monitoring system.
Detailed Description of the Preferred Embodiments
[0074] An example of a system for monitoring an animal will now be described with reference to Figure 1.
[0075] In this example, the system 100 includes at least one attribute sensor 111 worn by the animal, and one or more movement sensors 112 worn by the animal. The nature of the sensors will vary depending on the preferred implementation. For example, the attribute sensor could include a heart rate sensor, a temperature sensor and a pulse-oximeter, whilst the movement sensor could include an accelerometer, gyroscope, inertial measurement unit, or the like. The at least one attribute sensor 111 and/or the at least one movement sensor 112 may be worn on a neck region of the animal and/or worn on a tail of the animal, and may be attached to a collar, and/or attached to a tail piece.
[0076] The system 100 further includes one or more electronic processing devices 120, which may form part of one or more processing systems. The one or more electronic processing devices 120, can be any suitable processing device that is capable of processing attribute and/or movement signals, and could include a microprocessor, microchip processor, logic gate configuration, firmware optionally associated with implementing logic such as an FPGA (Field Programmable Gate Array), or any other electronic device, system or arrangement. Furthermore, whilst a single processing device could be used, alternatively multiple processing devices could be used, with processing being distributed between the processing devices. Accordingly, for ease of illustration the remaining description will refer to a processing device, but it will be appreciated that multiple processing devices could be used, with processing distributed between the devices as needed, and that reference to the singular encompasses the plural arrangement and vice versa.
[0077] An example of operation of the apparatus of Figure 1 will now be described with reference to Figure 2.
[0078] At step 200, the processing device 120 receives the at least one attribute signal indicative of a biological attribute of the animal from the at least one attribute sensor 111. At step 210, the electronic processing device 120 receives a movement signal from the at least one movement sensor 112. The signals are typically received wirelessly, and may be received directly, or via an intermediate device, although this is not necessarily essential and alternatively the signals may be received via wired connections. The signals may also undergo pre-processing, such as digitisation or the like, depending on the preferred implementation.
[0079] At step 220, the processing device 120 processes the at least one attribute signal at least partially in accordance with the movement signal. The nature of the processing will vary depending on the preferred implementation and could include selecting different attribute signals, or filtering or analysing signals. Subsequently, at step 230, the processing device 120 generates at least one indicator at least partially indicative of the biological attribute based on the processed attribute signal. The indicator could be indicative of a wide range of different biological attributes, such as heart rate, oxygen level and/or temperature, and could be of any appropriate form, such as a numerical and/or graphical representation of the biological attribute.
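By way of illustration only, the broad flow of steps 200 to 230 could be sketched as follows. The function, threshold and data names are hypothetical and are not taken from the specification; the sketch simply masks attribute samples acquired during heavy movement before deriving a simple indicator.

```python
import numpy as np

def generate_indicator(attribute_signal, movement_signal):
    """Minimal sketch of steps 200-230: process an attribute signal (e.g. a
    photoplethysmogram) in accordance with a movement signal (e.g. an
    accelerometer magnitude), then derive a simple indicator."""
    attribute_signal = np.asarray(attribute_signal, dtype=float)
    movement_signal = np.asarray(movement_signal, dtype=float)

    # Step 220: mask samples acquired while heavy movement was observed
    # (a crude form of motion-artefact rejection, assumed for illustration).
    movement_level = np.abs(movement_signal - movement_signal.mean())
    still_enough = movement_level < 2.0 * movement_level.std()
    usable = attribute_signal[still_enough]

    # Step 230: generate an indicator from the usable samples.
    return {
        "value": float(usable.mean()) if usable.size else None,
        "fraction_usable": float(still_enough.mean()),
    }

# Synthetic example: a 2 Hz pulse-like signal and a low level of movement.
rng = np.random.default_rng(0)
t = np.arange(500) / 50.0
ppg = 1.0 + 0.1 * np.sin(2 * np.pi * 2.0 * t) + 0.01 * rng.standard_normal(500)
accel = 0.05 * rng.standard_normal(500)
print(generate_indicator(ppg, accel))
```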
[0080] Accordingly, the above described arrangement allows the biological attributes of the animal to be accurately monitored by processing the attribute signals while taking the movement of the animal into consideration. Advantageously, this allows the animal to be monitored while the animal is active or conducting daily activities, whilst maintaining the accuracy of the resulting indicator(s) that are derived, so that veterinarians and/or nurses are not required to manually examine or take measurements of the animal’s biological conditions.
[0081] A number of further features will now be described.
[0082] The system as shown in Figure 1 may include a tail piece or similar arrangement capable of securing the sensors on, and optionally in engagement with, a tail of the animal. In one example, the tail piece may include a body and an arm pivotally mounted to the body. The arm is movable between an open position and a closed position. The open position allows the tail piece to be positioned on the tail, and in the closed position the tail is retained in the tail piece. In this arrangement, the tail piece can be easily secured to or removed from the tail.
[0083] Additionally, the tail piece includes a sensor in an underside of the body, and a number of deformable fins on an underside of the arm configured to engage the tail and urge the tail into engagement with an underside of the body. This allows the sensor on the tail piece to be in proximity to the skin of the tail, which optimises sensing, while the deformable fins allow the tail piece to grip the tail firmly without harming or irritating the animal, whilst reducing movement of the sensor relative to the skin, which in turn helps ensure reliable signal measurement.
[0084] In one example the at least one attribute signal may include a plurality of optical signals, for example used to perform pulse oximetry sensing. In this example, the processing device processes at least one optical signal of the plurality of optical signals by selecting an optical signal based on the movement signal. The electronic processing device then generates the indicator based on the selected optical signal. This arrangement increases the reliability of the attribute signals by providing more than one optical signal and allowing the processing device to select the appropriate signal(s) for further processing. [0085] Furthermore, each of the plurality of optical signals may have a different wavelength, so that the processing device may select the optical signal(s) with a wavelength that is least affected by movements. In one example, the optical signals may be red, green or orange lights, while the movement of the tail may be a fast wagging or a slow sweeping. In this example, the attribute signals received from green lights may be selected when the tail is wagging, as the wavelength of a green light is less affected by or less sensitive to movements. Advantageously, this eliminates or reduces movement artefacts on the optical signal and/or allows artefacts to be removed through filtering, and thereby allows the system to provide more accurate measurements of the biological attributes of the animal without requiring the animal to be still. Furthermore, this also allows the optical signals to be processed depending on the different types of movements that may be observed by the sensors worn on different parts of the animal.
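The wavelength-selection idea above could, purely for illustration, be sketched as follows; the channel names, relative movement sensitivities and movement threshold are assumptions made for the example and are not values given in the specification.

```python
# Hypothetical sketch: choose which optical channel to trust, given the amount
# of movement reported by the movement sensor. The ordering of channels by
# movement sensitivity is an assumption for illustration only.
MOVEMENT_SENSITIVITY = {"green": 0.2, "red": 0.6, "infrared": 0.8}

def select_optical_channel(channels, movement_magnitude, threshold=1.5):
    """channels: dict mapping wavelength name -> list of signal samples.
    movement_magnitude: e.g. RMS accelerometer value over the window."""
    if movement_magnitude > threshold:
        # Significant movement (e.g. tail wagging): prefer the channel
        # assumed to be least sensitive to motion artefact.
        name = min(channels, key=lambda c: MOVEMENT_SENSITIVITY.get(c, 1.0))
    else:
        # Animal relatively still: prefer the channel with the strongest
        # pulsatile (AC) component.
        name = max(channels, key=lambda c: max(channels[c]) - min(channels[c]))
    return name, channels[name]

channels = {"green": [1.0, 1.2, 1.1], "red": [0.9, 1.4, 1.0], "infrared": [1.1, 1.3, 1.2]}
print(select_optical_channel(channels, movement_magnitude=2.3)[0])  # -> 'green'
```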
[0086] Additionally, the processing device may select the optical signal with a wavelength that has a predetermined depth of penetration and/or a predetermined tolerance to skin pigmentation. This allows the system to provide more accurate measurements depending on the physical characteristics of the animal, such as breed, colour, and hair length.
[0087] The processing device may further analyse the movement signals to determine movement parameter(s) and process the attribute signal based on the movement parameter(s). The movement parameter(s) may be indicative of a type of movement, a degree of movement, a direction of movement, a movement frequency, a movement pattern, and/or a pose. Accordingly, the movement signals indicative of movement of the animal are analysed, so that the attribute signal may be processed in a more accurate manner based on the analysed information. In one example, a high heart rate of the animal obtained from the attribute sensor can be processed differently when the animal movement indicates a lower or higher degree of movement. For example, a high heart rate whilst the animal is stationary may be used to trigger an alert indicative of an issue with the animal, whilst the same heart rate when the animal is moving may be considered normal and not require any action.
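As a minimal, hypothetical sketch of that heart-rate example (the numeric limits below are placeholders rather than values from the specification):

```python
def heart_rate_alert(heart_rate_bpm, degree_of_movement):
    """Interpret the same heart rate differently depending on the degree of
    movement derived from the movement signal.
    degree_of_movement: 0.0 (stationary) .. 1.0 (vigorous activity)."""
    resting_limit = 120   # assumed upper limit at rest, for illustration only
    active_limit = 180    # assumed upper limit during vigorous activity
    # Interpolate the acceptable upper limit with the degree of movement.
    limit = resting_limit + (active_limit - resting_limit) * degree_of_movement
    if heart_rate_bpm > limit:
        return "alert: heart rate high for current activity level"
    return "normal for current activity level"

print(heart_rate_alert(150, degree_of_movement=0.0))  # elevated while stationary -> alert
print(heart_rate_alert(150, degree_of_movement=0.9))  # same rate while active -> normal
```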
[0088] Additionally and/or alternatively, the processing device may analyse the movement signal using adaptive filtering, fuzzy logic, and/or autocorrelation. With this, the movement signals may be analysed based on a history of the same animal or a library of all data collected from multiple animals, so that the movement parameter(s) can be more accurately determined. [0089] In one example, the processing device processes the attribute signal by filtering the attribute signal based on the movement parameter, to assist the further processing or analysing of the attribute signal by filtering out noise. For example, this could be used to filter out components of the optical signals at frequencies corresponding to the frequency of movement of the animal.
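One common way of realising such filtering is an adaptive noise canceller that uses the movement signal as a noise reference. The least-mean-squares (LMS) sketch below illustrates that general technique under assumed parameters and synthetic data; it is not presented as the specific filter used by the device.

```python
import numpy as np

def lms_cancel_motion(attribute, movement, taps=8, mu=0.01):
    """Sketch of an LMS adaptive noise canceller: estimate the motion artefact
    in the attribute signal from the movement signal and subtract it.
    attribute and movement are 1-D arrays of equal length."""
    attribute = np.asarray(attribute, dtype=float)
    movement = np.asarray(movement, dtype=float)
    w = np.zeros(taps)                 # adaptive filter weights
    cleaned = np.zeros_like(attribute)
    for n in range(len(attribute)):
        # Most recent `taps` movement samples form the reference vector.
        x = movement[max(0, n - taps + 1): n + 1][::-1]
        x = np.pad(x, (0, taps - len(x)))
        artefact_estimate = np.dot(w, x)
        error = attribute[n] - artefact_estimate   # error = cleaned sample
        w += 2 * mu * error * x                    # LMS weight update
        cleaned[n] = error
    return cleaned

# Synthetic example: a pulse-like signal corrupted by a tail-wag component.
fs = 50
t = np.arange(0, 10, 1 / fs)
motion = np.sin(2 * np.pi * 1.5 * t)               # wagging at 1.5 Hz
ppg = np.sin(2 * np.pi * 2.0 * t) + 0.8 * motion   # pulse at 2 Hz plus artefact
cleaned = lms_cancel_motion(ppg, motion)
print(np.round(np.corrcoef(cleaned, motion)[0, 1], 3))  # residual correlation with motion
```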
[0090] The processing device may further analyse the attribute signal based on the movement signal, and generate at least one indicator based on the analysed signal. This allows the attribute signal to be analysed, and the indicator to be generated, more precisely, and thereby eliminates or reduces false positives, where the indicator indicates an abnormal state but the animal is in a normal state, or false negatives, where the indicator indicates a normal state but the animal is in an abnormal state.
[0091] In order to perform better analysis of the attribute signal, the processing device may process the attribute signal by determining features derived from the attribute signal, using the features and a computational model to determine the indicator. In this case, the computational model is at least partially indicative of a relationship between different features and different attributes. This allows the relationship between the features and attributes to be established for the specific animal, or the specific type of animal and the indicator may be generated accordingly.
[0092] Furthermore, the processing device may apply machine learning to reference features derived from reference animals having known attributes, and apply machine learning to features derived from the animal. Additionally, the processing device may develop a generic model by applying machine learning to reference features derived from one or more reference animals having known attributes; and, modify a generic model to create a subject specific model by applying machine learning to features derived from the animal. Using machine learning can improve the accuracy of analysis and also easily and efficiently expand the complexity of the analysis.
[0093] The indicator can be indicative of one or more different biological attributes, including but not limited to a heart rate of the animal, an oxygen level of the animal, and a status of the animal. The processing device may further generate a representation of the at least one indicator for display. In one example, the representation may be any one or the combinations of a numerical representation, a trend line, a scale, and/or a meter gauge. This arrangement allows the indicator to be easily and clearly presented or communicated to the owners, veterinarians and/or nurses.
[0094] The processing device may further determine information indicative of a physical parameter of the animal and process the at least one attribute signal at least partially in accordance with the physical parameter. The physical parameter herein may include at least one of an anaesthesia status; a medication record; a tail length; at least one tail circumference measurement, including a first measurement at a base of the tail and a second measurement approximately 5cm from the base of the tail; a skin colour where a pulse-oximetry sensor is located; a shaved status; a fur length or texture; a gender; and, a blood pressure including at least one of: SBP; DBP; and, MAP. The physical parameters, such as an anaesthesia status and a medication record, may also be retrieved from a treatment history of the animal, or be input by the owner, veterinarians or nurses. Knowledge of the physical parameters can in turn inform how the attribute signals are processed and/or analysed. For example, skin pigmentation can affect transmission of optical signals, so information regarding skin pigmentation can be used to select an optical signal with the deepest penetration in order to ensure blood oxygen levels are more accurately measured. Similarly, knowledge of an anaesthesia state can be used to control alerting, so for example a heart rate that might be acceptable when the animal is awake might be indicative of a problem when the animal is under anaesthetic.
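To illustrate how such physical parameters could steer the processing, the sketch below picks an LED wavelength from a hypothetical pigmentation table and an alert band from the anaesthesia status; the table entries, key names and limits are assumptions made for the example only.

```python
# Hypothetical lookup tables for illustration: darker pigmentation favours
# longer wavelengths, which penetrate deeper into tissue.
WAVELENGTH_BY_PIGMENTATION = {"light": "green", "medium": "red", "dark": "infrared"}
HEART_RATE_LIMITS = {            # (low, high) in bpm, assumed values
    "awake": (60, 160),
    "anaesthetised": (50, 110),
}

def configure_processing(physical_parameters):
    """physical_parameters: dict with e.g. 'skin_pigmentation' and
    'anaesthesia_status' keys, as might be entered by a vet or derived
    from an image of the animal."""
    wavelength = WAVELENGTH_BY_PIGMENTATION.get(
        physical_parameters.get("skin_pigmentation", "medium"), "red")
    limits = (HEART_RATE_LIMITS["anaesthetised"]
              if physical_parameters.get("anaesthesia_status") == "under"
              else HEART_RATE_LIMITS["awake"])
    return {"preferred_wavelength": wavelength, "heart_rate_limits": limits}

print(configure_processing({"skin_pigmentation": "dark", "anaesthesia_status": "under"}))
```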
[0095] The above-mentioned information indicative of the physical parameter may include information determined from an image of the animal. Such physical parameter may be a skin colour, a shaved status, fur length or texture. In this way, the physical parameters may be easily provided or updated to the system for processing.
[0096] In addition, the processing device may determine information indicative of an environmental parameter of the animal, and process the attribute signal at least partially in accordance with the environmental parameter. The environmental parameter may be an environmental pressure, air quality indicator, pollen, humidity, and/or altitude. As the attribute signals may differ under different environmental conditions, such as different pressure or humidity, this arrangement allows the attribute signals to be processed accordingly, and thereby provides more accurate indicators or representations. [0097] To implement the above-mentioned features, the attribute sensor may be at least one of a pulse-oximetry sensor, a temperature sensor, a heart rate sensor, and a respiration sensor. The biological attribute may be at least one of an oxygen level, a temperature, a heart rate, and a respiration rate. The movement sensor may be an accelerometer and/or a gyroscope.
[0098] In one example, the processing device may be integrated into an external electronic device and wirelessly connected to the attribute sensor and/or the movement sensor. Additionally or alternatively, the processing device may be integrated into the collar and/or the tail piece. The external electronic device may be a smart phone, a tablet, a desk-top computer, and/or a handheld device with computing capabilities.
[0099] An example of a system will now be described in more detail with reference to Figure 3.
[0100] In this example, one or more servers 310 are provided, coupled to one or more client devices 330, via one or more communications networks 340, such as the Internet, and/or a number of local area networks (LANs). A number of sensors 320, as described above, are provided, with these optionally communicating directly with the servers 310 via the communications networks 340, or more typically, with these communicating with the client devices 330. The client device 330 may be a smart phone, a tablet, a desk-top computer, and/or any handheld device with computing capabilities.
[0101] Any number of servers 310, sensors 320 and client devices 330 could be provided, and the current representation is for the purpose of illustration only. The configuration of the networks 340 is also for the purpose of example only, and in practice the servers 310, sensors 320 and client devices 330 can communicate via any appropriate mechanism, such as via wired or wireless connections, including, but not limited to mobile networks, private networks, such as 802.11 networks, the Internet, LANs, WANs, or the like, as well as via direct or point-to-point connections, such as Bluetooth, or the like. Whilst the servers 310 are shown as single entities, it will be appreciated they could include a number of servers distributed over a number of geographically separate locations, for example as part of a cloud based environment. Thus, the above described arrangements are not essential and other suitable configurations could be used. [0102] An example of a sensor 320 is shown in Figure 4. In this example, the sensor 320 includes at least one microprocessor 321, a memory 322, an optional input/output device 323, such as input buttons and/or a display, and an external interface 324. The interfaces 324 may be of any form and can include a Universal Serial Bus (USB) port or Ethernet port, but more typically include a wireless transmitter, and in particular a short range wireless transmitter, such as Bluetooth, or the like. In this example, the external interface 324 can be utilised for connecting the sensor 320 to processing systems, such as the servers 310 or client devices 330, or to a communications network 340, or the like. Although a single external interface 324 is shown, this is for the purpose of example only, and in practice multiple interfaces using various methods (e.g. Ethernet, serial, USB, wireless or the like) may be provided.
[0103] In this example, the sensor modules are configured for attribute and movement sensing. In use, the processor 321 receives the one or more attribute and movement signals from the sensor modules 311, 312, optionally storing these in the memory 322. The processor 321 then processes the signals in accordance with instructions stored in the memory 322, for example in the form of software instructions and/or in accordance with input commands provided by a user via the I/O device 323, thereby generating the indicator. The indicator can then be provided as an output, for example via the I/O device 323, or via the interface 324 to a remote processing device. Additionally and/or alternatively, the attribute and/or movement signals may be preprocessed by the processor 321 and transmitted to the server 310 and/or the client device 330 for further processing and analysis. For example, this allows the client device 330 and/or server 310 to receive, process and analyse signals received from the sensors 320, and to generate an indicator, allowing the indicator to be displayed via the client devices 330. It will be appreciated from this that the processing could be performed on board the sensor 320, or could be performed remotely by a processing system such as the client device 330 and/or server 310, or could be distributed between the sensor and processing system, depending on the preferred implementation.
[0104] An example of a system will now be described in more detail with reference to Figure 5.
[0105] In this example, the system 500 includes a collar sensor 520a and a tail sensor 520b worn by the animal and wirelessly coupled to each other via Bluetooth. The collar sensor 520a wirelessly connects to a client device 530 via a communications network 540, and/or directly, for example using a short range wireless communications protocol, allowing data from both the collar and tail sensors 520a, 520b to be uploaded for processing.
[0106] The collar sensor 520a measures an environmental temperature. The tail sensor 520b, in this example, includes a pulse oximeter and a temperature sensor. Furthermore, each of the sensors 520a and 520b includes an accelerometer and/or a gyroscope, so that the measurements together can be processed to indicate a body position, body movement and/or acceleration of the animal.
[0107] In use, the sensor 520b measures a heart rate, an oxygen level and a body temperature of the animal with the pulse oximeter and the temperature sensor, and transmits the measurement signals to the collar sensor 520a. The collar sensor 520a receives the measurement signals of the tail sensor 520b and transmits the measurement signals together with the environmental temperature measurement to the client device 530 for processing. The transmission may be via the network 540 or directly to the client device 530, as described in the system of Figure 3. The client device 530, in this example, is a tablet computer which receives the signals transmitted from the collar sensor 520a and processes the signals to generate an indicator for the animal. The indicator is further processed to generate representations for display with the client device 530.
[0108] The above system may also be used to perform the method as shown in Figures 6A and 6B.
[0109] At step 600, attribute signals, such as signals from the pulse-oximeter and the temperature sensor of the tail sensor 520b, are received by the sensor 520a. The movement signals from both sensors 520a, 520b are also received by the sensor 520a at step 605. At step 610, the signals are pre-processed by the microprocessor 321 of the sensors. In this example, the attribute signals are filtered based on the movement signal. The signals may also be pre-processed by sampling, amplifying or converting, so that the signals or data derived therefrom may be subsequently transmitted to a processor at a client device 530 or a server at step 615.
[0110] At step 620, the processor determines the physical parameters of the animal. The physical parameter may be manually input by a user and then retrieved from the server or a memory of the client device 530 as needed. Additionally and/or alternatively, the physical parameter may be stored in the memory 322 of the sensor(s) and transmitted to the processor together with the signals. The physical parameter(s) may also be determined from an image provided. In one example, the client device 530 includes a camera for capturing an image of the animal. The hair colour, skin colour, shaved status, fur length or texture may be determined by processing the image.
[0111] The environmental parameter(s), such as the environmental temperature measured by the sensor 520a, may also be transmitted to the processor, at step 625. Additionally and/or alternatively, the processor may determine the environmental parameters, such as humidity and altitude, by retrieving information from a server or other data source, such as a meteorological service, via the network 540.
[0112] At step 630, the movement signals are analysed, which may include adaptive filtering, fuzzy logic, autocorrelation, machine learning and/or any other suitable pattern recognition algorithms. Upon analysing the movement signals, the movement parameters are determined based on the analysed results, at the next step 635. The movement parameter typically includes a type of movement; a direction of movement; a degree of movement; a movement frequency; a movement pattern; or the like. The movement could also be indicative of and/or used to derive a pose of the animal.
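As one possible illustration of steps 630 and 635, the sketch below derives a degree of movement and a dominant movement frequency from an accelerometer trace using autocorrelation, one of the techniques listed above; the sampling rate, window and peak threshold are assumed values, not parameters from the specification.

```python
import numpy as np

def movement_parameters(accel, fs=50.0):
    """Sketch: derive a degree of movement and a dominant movement frequency
    (e.g. a tail-wag rate) from an accelerometer magnitude trace sampled at
    fs Hz, using autocorrelation."""
    accel = np.asarray(accel, dtype=float)
    centred = accel - accel.mean()
    degree = float(np.sqrt(np.mean(centred ** 2)))   # RMS as degree of movement

    # Autocorrelation of the centred signal (positive lags only), normalised.
    ac = np.correlate(centred, centred, mode="full")[len(centred) - 1:]
    ac /= ac[0] if ac[0] else 1.0

    # First clear local maximum after lag 0 gives the movement period.
    lag = next((k for k in range(1, len(ac) - 1)
                if ac[k] > ac[k - 1] and ac[k] >= ac[k + 1] and ac[k] > 0.3), None)
    frequency = fs / lag if lag else 0.0
    return {"degree": degree, "frequency_hz": frequency}

t = np.arange(0, 4, 1 / 50)
wag = 0.6 * np.sin(2 * np.pi * 2.5 * t)              # simulated 2.5 Hz wag
print(movement_parameters(wag))                       # frequency_hz approx 2.5
```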
[0113] Based on the movement parameter, the attribute signals are processed at step 640. The processing may include using techniques such as fuzzy logic, autocorrelation, machine learning and/or any other suitable pattern recognition. The process is explained in more detail with reference to Figure 7.
[0114] According to the processed attribute signal, an indicator is generated at step 645. The indicator may indicate the heart rate, an oxygen level and/or a status of the animal. The indicator is then converted to a representation at step 650, and subsequently displayed at step 655. The display may be a display of the client device 530. Example representations will be described in more detail with reference to Figures 12 to 16.
[0115] The above process at step 640 is now described with reference to Figure 7. To process the attribute signal with machine learning, at step 700, a machine learning algorithm is applied to reference features derived from attribute signals measured for animals having known attributes. For example, optical signals from an animal having a known elevated heart rate can be used to train the model to identify when a subject animal has an elevated heart rate. The reference animals may include the same subject animal, and/or animals of the same or a similar breed or species, optionally having similar conditions and/or physical parameters, as this makes it more likely that the animal will respond in a manner similar to the subject. At step 710, a generic model is developed based on the reference animals and the known attributes. In one example, the generic model may be developed to indicate an association between, for example, measured optical signals and a heart rate for medium-sized dogs. At step 720, the features derived from measured attribute signals of a subject animal are determined, for example by analysing attribute signals as described above. Subsequently, at step 730, the derived features for the subject animal are applied to the model and used to derive an indicator at step 740. In one example, based on the analysis, the subject specific model may indicate a heart rate of a poodle in relation to the heart rates of medium-sized dogs, or a heart rate of a post-surgery poodle in relation to the heart rates of post-surgery animals. Furthermore, once the analysis is performed, this can be used in training the generic model at step 745, for example to improve the generic model and/or modify the generic model to make a subject specific model, which can then be used in subsequent analysis.
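A minimal sketch of this generic-then-subject-specific workflow, assuming a scikit-learn style regressor with incremental fitting and entirely synthetic features, is shown below; the feature layout, data and model choice are illustrative assumptions rather than the model actually used.

```python
import copy
import numpy as np
from sklearn.linear_model import SGDRegressor

rng = np.random.default_rng(1)

# Steps 700/710: reference features from reference animals with known heart
# rates are used to develop a generic model (features here are synthetic,
# e.g. per-wavelength AC/DC ratios).
reference_features = rng.normal(size=(200, 4))
true_weights = np.array([10.0, 5.0, -3.0, 2.0])
reference_heart_rate = 80 + reference_features @ true_weights + rng.normal(scale=2.0, size=200)
generic_model = SGDRegressor(max_iter=1000, tol=1e-3, random_state=0)
generic_model.fit(reference_features, reference_heart_rate)

# Steps 720-745: features measured from the subject animal refine a copy of
# the generic model into a subject-specific model via incremental updates.
subject_features = rng.normal(size=(20, 4))
subject_heart_rate = 95 + subject_features @ true_weights   # this subject runs higher
subject_model = copy.deepcopy(generic_model)
for _ in range(50):
    subject_model.partial_fit(subject_features, subject_heart_rate)

# Step 740: derive an indicator (predicted heart rate) for a new measurement.
new_features = rng.normal(size=(1, 4))
print(float(subject_model.predict(new_features)[0]))
```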
[0116] Specific examples of the physical construction of a collar and tail piece will now be described in further detail.
[0117] In this example, the sensor 520a worn about the neck of the animal is in the form of a collar piece, as shown in Figures 8A and 8B.
[0118] The collar piece 800 shown in Figures 8A and 8B includes a sensor housing 810 and a strap 820. The sensor housing 810 is configured to be placed about a neck region of the animal by securing the strap 820 around the neck. The sensor housing typically incorporates the electronics shown in Figure 4, including the processing device and one or more sensors. In this example, the strap 820 is threaded through a rear side of the sensor housing 810, and the strap 820 may be elastic and/or adjustable in length for fitting animals of different sizes. It should be appreciated that this arrangement may also be fitted to any other suitable part of the animal, such as the waist or a leg, in addition to or instead of the neck region. [0119] In this example, the sensor 520b worn on the tail of the animal is in the form of a tail piece, as shown in Figures 9A and 9B.
[0120] Figures 9A and 9B show a tail piece 900 for retaining the sensors on the tail of the animal. The tail piece 900 includes a body 910 and a pair of arms 920 pivotally mounted to the body 910. The arms 920 are movable between an open position and a closed position. Figure 9A illustrates the arms 920 in the open position, whereas Figure 9B illustrates the arms in the closed position. The closed position allows the tail piece 900 to be secured to the tail, and in the open position the tail is released from the tail piece 900.
[0121] The tail piece 900 includes a sensor module 911 in an underside of the body 910 for sensing the oxygen level and temperature of the animal. The sensor module 911 will be described in more detail with reference to Figures 10A and 10B.
[0122] The pair of arms 920 include a number of deformable fins 921 on undersides of the arms 920. The fins 921 engage the tail, when the tail piece 900 is in the closed position, and urge the tail into engagement with an underside of the body 910, so that the sensor 911 is in close contact with the skin of the tail.
[0123] Figure 10A shows a schematic diagram of a sensor module in the tail piece 900. The sensor module 1000 includes a plurality of optical sensors 1100, a photo detector 1200, a temperature sensor 1310 and an embedded temperature sensor 1320. The temperature sensor 1310 may be a pyrometer that measures the temperature by thermal radiation. In one example, the temperature sensor 1310 measures the temperature of the skin of the animal. The embedded temperature sensor 1320 may be one or more thermometers embedded in the array of optical sensors 1100 and measures the temperature by conduction. In one example, the embedded temperature sensor 1320 measures an internal temperature inside the device. In one example, an ambient temperature may also be measured. In this example, the optical sensor 1100 emits light of a plurality of wavelengths, and the optical sensors 1100 are arranged in a pattern that surrounds the photo detector 1200. Figure 10B shows a schematic diagram of the optical sensor 1100 in the sensor module 1000. The optical sensor 1100 includes three LEDs 1110, 1120, 1130. In this example, the LED 1110 is a green LED, the LED 1120 is a red LED and the LED 1130 is an infrared LED. It should be appreciated that the optical sensor may include one or more LEDs of the same or different suitable wavelengths.
[0124] In this example, the photo detector 1200 reads the reflected light from the plurality of optical sensors 1100 and generates signals indicative of the biological attributes of the animal. The reflected light from the green LEDs is the main contributor for indicating the heart rate of the animal, whereas the reflected light from the red LEDs is the main contributor for indicating an oxygen level or SpO2 of the animal.
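For context, conventional reflectance pulse oximetry derives heart rate from the pulsatile component of a channel such as the green channel, and blood oxygen saturation from the ratio of the red and infrared AC/DC ratios. The sketch below follows that textbook approach with an assumed generic calibration line and synthetic signals; it is not presented as the device's actual algorithm.

```python
import numpy as np

def heart_rate_bpm(green, fs=50.0):
    """Estimate heart rate from the dominant frequency of the green channel."""
    green = np.asarray(green, dtype=float) - np.mean(green)
    spectrum = np.abs(np.fft.rfft(green))
    freqs = np.fft.rfftfreq(len(green), d=1.0 / fs)
    band = (freqs > 0.5) & (freqs < 5.0)           # plausible pulse band, 30-300 bpm
    return 60.0 * freqs[band][np.argmax(spectrum[band])]

def spo2_percent(red, infrared):
    """Textbook ratio-of-ratios estimate; calibration constants are assumed."""
    def ac_dc(x):
        x = np.asarray(x, dtype=float)
        return (x.max() - x.min()) / np.mean(x)
    r = ac_dc(red) / ac_dc(infrared)
    return 110.0 - 25.0 * r                         # generic empirical calibration line

# Synthetic 8-second window sampled at 50 Hz with a 2 Hz (120 bpm) pulse.
t = np.arange(0, 8, 1 / 50)
green = 1.0 + 0.05 * np.sin(2 * np.pi * 2.0 * t)
red = 2.0 + 0.02 * np.sin(2 * np.pi * 2.0 * t)
infrared = 2.0 + 0.04 * np.sin(2 * np.pi * 2.0 * t)
print(round(heart_rate_bpm(green)), round(spo2_percent(red, infrared), 1))
```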
[0125] As above, the optical sensors 1100 in the sensor module 1000 are arranged as shown in Figure 10A. The arrangement is selected based on at least one of the following factors:
• the skin colour of the animal
• the fur type of the animal
• potential movements of the animal during measurement
• wavelengths of the LEDs
• modulation frequency of the sensor signals
• power levels of the LEDs
• distance between the LEDs and the photo detector
[0126] Figure 11 shows other arrangements of the optical sensors, the photo detector and the temperature sensors. The arrangement may be selected based on at least one of the above factors.
[0127] Figure 12 shows a schematic diagram of the representations of the animal monitoring system.
[0128] As described above, the representations on a display, in this example, include the report information, subject information and the attribute representations. The report information typically includes the date, time and duration of what the data represents, and may also include the medication and dosage in relation to the subject. The subject information includes an image, name, owner, species, breed, age, gender and weight of the subject animal. The attribute representations include an average heart rate, average SpO2, and average body temperature, and they are represented numerically and graphically. In this example, the graphs represent the attribute timeline with key event notations, such as surgery start time and surgery end time. The attribute representations may further include any alarms or alerts, which may be raised based on the processing of the attribute signals.
[0129] Figures 13-16 show further examples of the representations of the animal monitoring system.
[0130] In a similar manner, the representations may also include trend lines and gauges as shown in Figure 13. The gauge may include a numerical value and a coloured scale which indicates a normal state, a warning state and a danger state with green, orange and red, respectively.
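A trivial, hypothetical sketch of that colour-coded gauge logic (the band boundaries below are assumed for illustration only):

```python
def gauge_state(value, normal_range, warning_margin):
    """Map a reading onto the green/orange/red states of the gauge.
    normal_range: (low, high); warning_margin: width of the orange band."""
    low, high = normal_range
    if low <= value <= high:
        return "normal (green)"
    if low - warning_margin <= value <= high + warning_margin:
        return "warning (orange)"
    return "danger (red)"

# Assumed canine body-temperature bands in degrees Celsius, for illustration only.
print(gauge_state(38.5, normal_range=(37.5, 39.2), warning_margin=0.5))  # green
print(gauge_state(40.2, normal_range=(37.5, 39.2), warning_margin=0.5))  # red
```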
[0131] Similarly to the report shown in Figure 12, the representations in Figure 14 include the report information, subject information and the attribute representations. The report information includes the date, time and duration of what the data represents, and also includes the vet/nurse names, surgery duration and the surgery type performed. The subject information includes an image of the subject, name, owner, age and weight of the subject animal. The attribute representations include the high, the average and the low readings of the heart rate, SpO2, and body temperature, and they are represented numerically and graphically against time. Figure 15 shows the representations in a table including the medication information before, during and after the surgery. Figure 16 shows another example of representations including report information, subject information and the attribute information.
[0132] Accordingly, the above described arrangements allow the biological attributes of the animal to be accurately monitored by processing the attribute signals with respect to the movement of the animal. Advantageously, this further allows more information to be derived. For example, movement of the animal may be analysed with machine learning or the like, and the behaviour or status of the animal may be better understood. As such, the animal can be monitored while the animal is active or conducting daily activities, and the accuracy of the results so derived is maintained or improved. This is also beneficial to the owner, veterinarians and/or nurses as fewer manual examinations or measurements are required. The display also represents more accurate results and more useful information based on in-depth analysis of the animal’s biological conditions.
[0133] Throughout this specification and claims which follow, unless the context requires otherwise, the word “comprise”, and variations such as “comprises” or “comprising”, will be understood to imply the inclusion of a stated integer or group of integers or steps but not the exclusion of any other integer or group of integers. As used herein and unless otherwise stated, the term "approximately" means ±20%.
[0134] It must be noted that, as used in the specification and the appended claims, the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to “a support” includes a plurality of supports. In this specification and in the claims that follow, reference will be made to a number of terms that shall be defined to have the following meanings unless a contrary intention is apparent.
[0135] It will of course be realised that whilst the above has been given by way of an illustrative example of this invention, all such and other modifications and variations hereto, as would be apparent to persons skilled in the art, are deemed to fall within the broad scope and ambit of this invention as is herein set forth.

Claims

THE CLAIMS DEFINING THE INVENTION ARE AS FOLLOWS:
1) A system for monitoring an animal, including: a) at least one attribute sensor worn by the animal; b) at least one movement sensor worn by the animal; and, c) one or more electronic processing devices configured to: i) receive at least one attribute signal indicative of a biological attribute of the animal from the at least one attribute sensor; ii) receive a movement signal from the at least one movement sensor; iii) process the at least one attribute signal at least partially in accordance with the movement signal; and, iv) generate at least one indicator at least partially indicative of the biological attribute based on the processed attribute signal.
2) A system according to claim 1, wherein the at least one attribute sensor is at least one of: a) worn on a neck region; b) worn on a tail of the animal; c) attached to a collar; and, d) attached to a tail piece.
3) A system according to claim 1 or 2, wherein the at least one movement sensor is at least one of: a) worn on a neck region; b) worn on a tail of the animal; c) attached to a collar; and, d) attached to a tail piece.
4) A system according to claim 2 or 3, wherein the tail piece includes: a) a body; and, b) an arm pivotally mounted to the body and movable between: i) an open position to allow the tail piece to be positioned on the tail; and, ii) a closed position in which the tail is retained in the tail piece.
5) A system according to any one of claims 2-4, wherein the tail piece includes a sensor in an underside of the body and a number of deformable fins on an underside of the arm configured to engage the tail and urge the tail into engagement with an underside of the body.
6) A system according to any one of claims 1-5, wherein the at least one attribute signal includes a plurality of optical signals and wherein the one or more processing devices are configured to: a) process at least one optical signal of the plurality of optical signals by selecting an optical signal based on the movement signal; and, b) generate the at least one indicator based on the selected at least one optical signal.
7) A system according to claim 6, wherein each of the plurality of optical signals are measured at a different wavelength, and wherein the one or more processing devices are configured to select the at least one optical signal by selecting the optical signal with a wavelength that is least affected by movements.
8) A system according to claim 6, wherein the one or more processing devices are configured to process the at least one optical signal by selecting an optical signal with a wavelength that has at least one of: i) a predetermined depth of penetration; and, ii) a predetermined tolerance to skin pigmentation.
9) A system according to any one of claims 1-8, wherein the one or more processing devices are configured to: a) analyse the movement signals to determine at least one movement parameter indicative of at least one of: i) a type of movement; ii) a direction of movement; iii) a degree of movement; iv) a movement frequency; v) a movement pattern; and, vi) a pose; and, b) process the at least one attribute signal based on the at least one movement parameter.
10) A system according to claim 9, wherein the one or more processing devices are configured to analyse the movement signal using at least one of: a) adaptive filtering; b) fuzzy logic; c) autocorrelation; d) machine learning; and, e) pattern recognition.
11) A system according to any one of claims 1-10, wherein the one or more processing devices are configured to process the at least one attribute signal by filtering the at least one attribute signal based on the movement parameter.
12) A system according to any one of claims 1-11, wherein the one or more processing devices are configured to: a) analyse the at least one attribute signal based on the movement signal; and, b) generate at least one indicator based on the analysed signal.
13) A system according to any one of claims 1-12, wherein the one or more processing devices are configured to process the at least one attribute signal by: a) determining one or more features derived from the animal; b) using the features and at least one computational model to determine the indicator, the at least one computational model being at least partially indicative of a relationship between different features and different attributes.
14) A system according to claim 13, wherein the one or more processing devices are configured to: a) apply machine learning to reference features derived from one or more reference animals having known attributes; and, b) apply machine learning to features derived from the animal.
15) A system according to claim 13 or 14, wherein the one or more processing devices are configured to: a) develop a generic model by applying machine learning to reference features derived from one or more reference animals having known attributes; and, b) modify a generic model to create a subject specific model by applying machine learning to features derived from the animal.
16) A system according to any one of claims 1-15, wherein the at least one indicator includes: a) a heart rate of the animal; b) an oxygen level of the animal; and, c) a status of the animal.
17) A system according to any one of claims 1-16, wherein the one or more processing devices are configured to generate a representation of the at least one indicator for display.
18) A system according to claim 17, wherein the representation includes: a) a numerical representation; b) a trend line; c) a scale; and, d) a meter gauge.
19) A system according to any one of claims 1-18, wherein the one or more processing devices are configured to: a) determine information indicative of a physical parameter of the animal; and, b) process the at least one attribute signal at least partially in accordance with the physical parameter.
20) A system according to claim 19, wherein the physical parameter includes at least one of: a) an anaesthesia status; b) a medication record; c) a tail length; d) at least one tail circumference measurement, including a first measurement at a base of the tail and a second measurement approximately 5cm from the base of the tail; e) a skin colour where a pulse-oximetry sensor is located; f) a shaved status; g) a fur length or texture; h) a gender; and, i) a blood pressure including at least one of: i) SBP; ii) DBP; and, iii) MAP.
21) A system according to claim 19 or 20, wherein the information indicative of the physical parameter includes information determined from an image of the animal.
22) A system according to any one of claims 1-21, wherein the one or more processing devices are configured to: a) determine information indicative of an environmental parameter of the animal; and, b) process the at least one attribute signal at least partially in accordance with the environmental parameter.
23) A system according to claim 22, wherein the environmental parameter includes at least one of: a) environmental pressure; b) air quality indicator; c) pollen; d) humidity; and, e) altitude.
24) A system according to any one of claims 1-23, wherein the attribute sensor is at least one of: a) a pulse-oximetry sensor; b) a temperature sensor; c) a heart rate sensor; and, d) a respiration sensor.
25) A system according to any one of claims 1-24, wherein the biological attribute includes at least one of: a) an oxygen level; b) a temperature; c) a heart rate; and, d) a respiration rate.
26) A system according to any one of claims 1-25, wherein the movement sensor includes at least one of: a) an accelerometer; and, b) a gyroscope.
27) A system according to any one of claims 1-26, wherein the one or more processing devices are at least one of: a) integrated into an external electronic device and wirelessly connected to the at least one attribute sensor and/or the at least one movement sensor; and, b) integrated into the collar and/or the tail piece.
28) A system according to claim 27, wherein the external electronic device is at least one of: a) a smart phone; b) a tablet; c) a desk-top computer; and, d) a handheld device.
29) A method for monitoring an animal, the method including, in one or more electronic processing devices: a) receiving at least one attribute signal indicative of a biological attribute of the animal from at least one attribute sensor worn by the animal; b) receiving a movement signal from at least one movement sensor worn by the animal; c) processing the at least one attribute signal at least partially in accordance with the movement signal; and, d) generating at least one indicator at least partially indicative of the biological attribute based on the processed attribute signal.
30) A method according to claim 29, wherein the at least one attribute signal includes a plurality of optical signals and wherein the method includes, in the one or more electronic processing devices: a) processing at least one optical signal of the plurality of optical signals by selecting an optical signal based on the movement signal; and, b) generating the at least one indicator based on the selected at least one optical signal.
31) A method according to claim 30, wherein each of the plurality of optical signals is measured at a different wavelength, and wherein the method includes, in the one or more electronic processing devices, selecting the at least one optical signal by selecting the optical signal with a wavelength that is least affected by movement.
32) A method according to claim 30, wherein the method includes, in the one or more electronic processing devices, processing the at least one optical signal by selecting an optical signal with a wavelength that has at least one of: a) a predetermined depth of penetration; and, b) a predetermined tolerance to skin pigmentation.
33) A method according to any one of claims 29-32, wherein the method further includes, in the one or more electronic processing devices: a) analysing the movement signal to determine at least one movement parameter indicative of at least one of: i) a type of movement; ii) a direction of movement; iii) a degree of movement; iv) a movement frequency; v) a movement pattern; and, vi) a pose; and, b) processing the at least one attribute signal based on the at least one movement parameter.
34) A method according to claim 33, wherein the method further includes, in the one or more electronic processing devices, analysing the movement signal using at least one of: a) adaptive filtering; b) fuzzy logic; c) autocorrelation; d) machine learning; and, e) pattern recognition.
35) A method according to any one of claims 29-34, wherein the method includes, in the one or more electronic processing devices, processing the at least one attribute signal by filtering the at least one attribute signal based on the movement parameter.
36) A method according to any one of claims 29-35, wherein the method further includes, in the one or more electronic processing devices, processing the at least one attribute signal by: a) analysing the at least one attribute signal based on the movement signal; and, b) generating the at least one indicator based on the analysed signal.
37) A method according to any one of claims 29-36, wherein the method further includes, in the one or more electronic processing devices, processing the at least one attribute signal by: a) determining one or more features derived from the animal; b) using the features and at least one computational model to determine the indicator, the at least one computational model being at least partially indicative of a relationship between different features and different attributes.
38) A method according to claim 37, wherein the method further includes, in the one or more electronic processing devices: a) applying machine learning to reference features derived from one or more reference animals having known attributes; and, b) applying machine learning to features derived from the animal.
39) A method according to claim 37 or 38, wherein the method further includes, in the one or more electronic processing devices: a) developing a generic model by applying machine learning to reference features derived from one or more reference animals having known attributes; and, b) modifying a generic model to create a subject specific model by applying machine learning to features derived from the animal.
40) A method according to any one of claims 29-39, wherein the at least one indicator includes: a) a heart rate of the animal; b) an oxygen level of the animal; and, c) a status of the animal.
41) A method according to any one of claims 29-40, wherein the method further includes, in the one or more electronic processing devices, generating a representation of the at least one indicator for display.
42) A method according to claim 41, wherein the representation includes: a) a numerical representation; b) a trend line; c) a scale; and, d) a meter gauge.
43) A method according to any one of claims 29-42, wherein the method further includes, in the one or more electronic processing devices: a) determining information indicative of a physical parameter of the animal; and, b) processing the at least one attribute signal at least partially in accordance with the physical parameter.
44) A method according to claim 43, wherein the physical parameter includes at least one of: a) an anaesthesia status; b) a medication record; c) a tail length; d) at least one tail circumference measurement, including a first measurement at a base of the tail and a second measurement approximately 5cm from the base of the tail; e) a skin colour where a pulse-oximetry sensor is located; f) a shaved status; g) a fur length or texture; h) a gender; and, i) a blood pressure including at least one of: i) SBP; ii) DBP; and, iii) MAP.
45) A method according to claim 43 or 44, wherein the information indicative of the physical parameter includes information determined from an image of the animal.
46) A method according to any one of claims 29-45, wherein the method further includes, in the one or more electronic processing devices: a) determining information indicative of an environmental parameter; and, b) processing the at least one attribute signal at least partially in accordance with the environmental parameter.
47) A method according to claim 46, wherein the environmental parameter includes at least one of: a) environmental pressure; b) air quality indicator; c) pollen; d) humidity; and, e) altitude.
48) A method according to any one of claims 29-47, wherein the attribute sensor is at least one of: a) a pulse-oximetry sensor; b) a temperature sensor; c) a heart rate sensor; and, d) a respiration sensor.
49) A method according to any one of claims 29-48, wherein the biological attribute includes at least one of: a) an oxygen level; b) a temperature; c) a heart rate; and, d) a respiration rate.
50) A method according to any one of claims 29-49, wherein the movement sensor includes at least one of: a) an accelerometer; and, b) a gyroscope.
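To make the motion-compensated processing recited in claims 1, 6-11 and 29-35 concrete, the following is a minimal, non-authoritative sketch in Python, assuming a pulse-oximetry (photoplethysmography) attribute sensor and an accelerometer movement sensor. All names, thresholds and filter parameters (for example movement_level, select_least_affected_wavelength and the LMS step size mu) are hypothetical illustrations and are not the implementation disclosed in the specification.

```python
import numpy as np


def movement_level(accel: np.ndarray) -> float:
    """Crude movement parameter: standard deviation of the accelerometer
    magnitude over the analysis window (a 'degree of movement', claim 9)."""
    magnitude = np.linalg.norm(accel, axis=1)
    return float(np.std(magnitude))


def select_least_affected_wavelength(optical: dict, accel: np.ndarray) -> int:
    """Pick the wavelength whose optical signal correlates least with the
    movement signal (claims 6-7): lower correlation, less motion corruption."""
    magnitude = np.linalg.norm(accel, axis=1)
    scores = {
        wavelength: abs(np.corrcoef(signal, magnitude)[0, 1])
        for wavelength, signal in optical.items()
    }
    return min(scores, key=scores.get)


def adaptive_filter(signal: np.ndarray, reference: np.ndarray,
                    mu: float = 0.01, taps: int = 8) -> np.ndarray:
    """LMS adaptive filter using the movement signal as a noise reference
    (claims 10-11): subtracts the motion-correlated component."""
    w = np.zeros(taps)
    out = signal.astype(float).copy()
    for n in range(taps, len(signal)):
        x = reference[n - taps:n][::-1]
        noise_estimate = w @ x
        error = signal[n] - noise_estimate   # motion-cleaned sample
        w += 2.0 * mu * error * x            # LMS weight update
        out[n] = error
    return out


def heart_rate_indicator(ppg: np.ndarray, fs: float) -> float:
    """Rough heart-rate indicator (beats per minute) from the dominant
    spectral peak of the cleaned photoplethysmogram (claim 16)."""
    spectrum = np.abs(np.fft.rfft(ppg - ppg.mean()))
    freqs = np.fft.rfftfreq(len(ppg), d=1.0 / fs)
    band = (freqs > 0.5) & (freqs < 5.0)     # roughly 30-300 bpm
    return float(freqs[band][np.argmax(spectrum[band])] * 60.0)
```

In practice the movement parameter could also gate these steps, for example only switching to the least motion-correlated wavelength when movement_level exceeds a threshold; that threshold is an assumption, not something the claims specify.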
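Claims 13-15 and 37-39 recite developing a generic computational model from reference animals with known attributes and adapting it into a subject-specific model. The sketch below, under the same caveats as above, shows one possible realisation using a ridge-regression generic model fine-tuned on the subject's own data; the feature matrices, learning rate and step count are all assumptions.

```python
import numpy as np


def fit_generic_model(X_ref: np.ndarray, y_ref: np.ndarray,
                      ridge: float = 1.0) -> np.ndarray:
    """Fit a generic linear model on reference-animal features and known
    attributes (claims 14-15, 38-39) via closed-form ridge regression."""
    Xb = np.hstack([X_ref, np.ones((len(X_ref), 1))])   # append bias column
    A = Xb.T @ Xb + ridge * np.eye(Xb.shape[1])
    return np.linalg.solve(A, Xb.T @ y_ref)


def adapt_to_subject(w_generic: np.ndarray, X_subj: np.ndarray,
                     y_subj: np.ndarray, lr: float = 1e-3,
                     steps: int = 200) -> np.ndarray:
    """Start from the generic weights and take a few gradient steps on the
    subject's own data to obtain a subject-specific model."""
    Xb = np.hstack([X_subj, np.ones((len(X_subj), 1))])
    w = w_generic.copy()
    for _ in range(steps):
        grad = Xb.T @ (Xb @ w - y_subj) / len(Xb)
        w -= lr * grad
    return w


def predict_indicator(w: np.ndarray, features: np.ndarray) -> float:
    """Map a feature vector for the monitored animal to an indicator value."""
    return float(np.append(features, 1.0) @ w)
```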
PCT/AU2022/050336 2021-04-15 2022-04-14 Animal monitoring device WO2022217316A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2021901107 2021-04-15
AU2021901107A AU2021901107A0 (en) 2021-04-15 Animal monitoring device

Publications (1)

Publication Number Publication Date
WO2022217316A1 (en) 2022-10-20

Family

ID=83639408

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2022/050336 WO2022217316A1 (en) 2021-04-15 2022-04-14 Animal monitoring device

Country Status (1)

Country Link
WO (1) WO2022217316A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009076325A2 (en) * 2007-04-11 2009-06-18 Starr Life Sciences Corp. Noninvasive photoplethysmographic sensor platform for mobile animals
US20100298660A1 (en) * 2009-05-20 2010-11-25 Triage Wireless, Inc. Body-worn device and associated system for alarms/alerts based on vital signs and motion; also describes specific monitors that include barcode scanner and different user interfaces for nurse, patient, etc.
US9538729B2 (en) * 2014-04-08 2017-01-10 Medisim, Ltd. Cattle monitoring for illness
US20180333244A1 (en) * 2017-05-19 2018-11-22 Maxim Integrated Products, Inc. Physiological condition determination system
WO2019195267A1 (en) * 2018-04-02 2019-10-10 Barati Zeinab Methods and systems for near infrared spectroscopy

Similar Documents

Publication Publication Date Title
US10898136B2 (en) Monitoring device for animals
EP2713853B1 (en) Fever detection apparatus
CN104305972B (en) Multi-parameter monitoring based on intelligent watch with it is health management system arranged
JP6101878B1 (en) Diagnostic equipment
CN107092806A (en) It is a kind of towards the intelligentized information fusion of old man's household and method for early warning
US10813593B2 (en) Using visual context to timely trigger measuring physiological parameters
CN107847161A (en) Generate the designator of the situation of patient
CN112244765B (en) Method, device and system for detecting brain temporary abnormal state
CN115695734A (en) Infrared thermal imaging protection monitoring method, device, equipment, system and medium
WO2022217316A1 (en) Animal monitoring device
CN112790762A (en) Wearable glucometer and method for detecting blood glucose concentration
CN214965491U (en) Intelligent security inspection all-in-one machine before post
KR20170017989A (en) Biological information measurement Necklace
CN105595973A (en) Sleeping abnormity warning device
CN208808459U (en) A kind of skin physiology instrument for measuring index
JP2023527080A (en) Means and methods for accurately predicting, warning, and thereby avoiding sports injuries
KR20160109098A (en) Biological information measurement Necklace
Pravin et al. Machine learning and IoT-based automatic health monitoring system
Saravanan et al. Design and implementation of patient monitoring system based on IoT using oxygen saturation
US20220096008A1 (en) System and method of smart health monitoring
US20230034055A1 (en) Monitoring Of Physiological Data In Animals
TWI772689B (en) Non-contact physiological signal measuring device
Malar et al. Implementation of an Intelligent Neonatal Monitoring System Using Raspberry Pi
CA3094908A1 (en) Smart wearable health monitor system
CN217471931U (en) Data acquisition device, medical acquisition equipment and medical data processing system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22787165

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22787165

Country of ref document: EP

Kind code of ref document: A1